no code implementations • 12 Oct 2023 • Lapo Frati, Neil Traft, Jeff Clune, Nick Cheney
We show that our zapping procedure results in improved transfer accuracy and/or more rapid adaptation in both standard fine-tuning and continual learning settings, while being simple to implement and computationally efficient.
1 code implementation • 4 Jul 2023 • Csenge Petak, Lapo Frati, Melissa H. Pespeni, Nick Cheney
In contrast, conservative bet-hedgers produce offspring that all share an intermediate phenotype relative to the specialists.
no code implementations • 18 Aug 2021 • Joshua Powers, Ryan Grindle, Lapo Frati, Josh Bongard
To date, efforts to combat catastrophic interference have focused on novel neural architectures or training methods, with recent emphasis on policies whose initial settings facilitate training in new environments.
5 code implementations • 21 Feb 2020 • Shawn Beaulieu, Lapo Frati, Thomas Miconi, Joel Lehman, Kenneth O. Stanley, Jeff Clune, Nick Cheney
Continual lifelong learning requires an agent or model to learn many sequentially ordered tasks, building on previous knowledge without catastrophically forgetting it.
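The failure mode this line describes can be seen in a minimal toy sketch (hypothetical illustration only, not the method of the paper above): a single scalar weight is fit by SGD on one task, then on a second conflicting task, after which its error on the first task is large again.

```python
# Toy illustration of catastrophic forgetting in sequential training.
# Hypothetical example: a one-parameter linear model w*x trained with SGD,
# first on task A (y = 2x), then on task B (y = -2x).

def mse(w, data):
    # Mean squared error of the model w*x on a dataset of (x, y) pairs.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def sgd(w, data, lr=0.1, steps=100):
    # Plain stochastic gradient descent on squared error.
    for _ in range(steps):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x  # gradient of (w*x - y)^2 w.r.t. w
    return w

task_a = [(1.0, 2.0), (2.0, 4.0)]    # task A: y = 2x
task_b = [(1.0, -2.0), (2.0, -4.0)]  # task B: y = -2x

w = sgd(0.0, task_a)
loss_a_before = mse(w, task_a)  # near zero: task A is learned

w = sgd(w, task_b)
loss_a_after = mse(w, task_a)   # large: training on B overwrote A

print(loss_a_before, loss_a_after)
```

Because both tasks share the single parameter, fitting task B drives the weight away from the task-A solution entirely; continual-learning methods aim to retain performance on earlier tasks while still learning new ones.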
no code implementations • 15 Oct 2019 • Joshua Powers, Ryan Grindle, Sam Kriegman, Lapo Frati, Nick Cheney, Josh Bongard
Catastrophic forgetting continues to severely restrict the learnability of controllers suitable for multiple task environments.