On biologically-plausible meta-learning
Abstract:
Humans can discover and exploit shared structure in a problem domain to improve their learning performance, to the point of being able to learn from very limited amounts of data. The theory of meta-learning hypothesizes that such fast learning is supported by slower learning processes that unfold over many problem instances. While a number of artificial meta-learning algorithms have been proposed, the biological mechanisms that could support this form of learning remain largely unknown. Here, we present a biologically plausible meta-learning rule in which synaptic changes are buffered and contrasted across more than one problem before being consolidated. Our rule is theoretically justified and, unlike standard machine learning methods, it requires neither reversing learning trajectories in time nor evaluating second-order derivatives, two operations that are difficult to conceive of in neural circuits. Experiments show that our meta-learning rule enables deep neural network models to learn new tasks from few labeled examples. We conclude by discussing a systems-level model in which the hippocampus acts as an instructor that prescribes auxiliary learning problems to the cortex. Our theory suggests that the concerted action of hippocampus and cortex may allow meta-learning to be implemented with a simple synaptic plasticity rule.
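The abstract does not spell out the rule itself, so purely as a point of reference, the sketch below illustrates the general idea of buffering per-task synaptic changes and consolidating them into slow "meta" weights without second-order derivatives or backward replay of learning trajectories. It follows a first-order, Reptile-style update (Nichol et al., 2018) rather than the speaker's actual rule; the toy regression task, the tiny network, and all hyperparameters are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(0)

    def make_task():
        """A toy regression task: y = a * sin(x + b), with task-specific a, b."""
        a, b = rng.uniform(0.5, 2.0), rng.uniform(0, np.pi)
        x = rng.uniform(-3, 3, size=(20, 1))
        return x, a * np.sin(x + b)

    def inner_update(w, x, y, lr=0.01, steps=20):
        """Task-specific ('fast') learning: a few plain gradient steps on a tiny net."""
        w = {k: v.copy() for k, v in w.items()}
        for _ in range(steps):
            h = np.tanh(x @ w["W1"])                 # hidden activity
            err = h @ w["W2"] - y                    # prediction error
            grad_W2 = h.T @ err / len(x)
            grad_W1 = x.T @ ((err @ w["W2"].T) * (1 - h**2)) / len(x)
            w["W2"] -= lr * grad_W2
            w["W1"] -= lr * grad_W1
        return w

    # Slow ('meta') parameters shared across tasks.
    meta_w = {"W1": rng.normal(0, 0.5, (1, 32)), "W2": rng.normal(0, 0.5, (32, 1))}

    meta_lr, n_tasks = 0.1, 1000
    for _ in range(n_tasks):
        x, y = make_task()
        adapted = inner_update(meta_w, x, y)
        # Buffer the per-task weight change and consolidate a fraction of it into
        # the slow weights: a first-order meta-update that needs neither
        # backpropagation through the learning trajectory nor second derivatives.
        for k in meta_w:
            meta_w[k] += meta_lr * (adapted[k] - meta_w[k])

In this kind of scheme the outer loop only ever sees the difference between adapted and initial weights, which is what makes it cheap; how the talk's rule relates the buffered changes across problems (the "contrast" step) is what the abstract leaves to the presentation.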
Date:
7 April 2022, 11:00 (Thursday, -2nd week, Trinity 2022)
Venue:
Lecture Theatre
Speaker:
Joao Sacramento (ETH Zurich)
Organising department:
Medical Sciences Division
Organiser:
Dr Rafal Bogacz (University of Oxford)
Organiser contact email address:
rafal.bogacz@ndcn.ox.ac.uk
Part of:
Oxford Neurotheory Forum
Booking required?:
Not required
Audience:
Members of the University only
Editor:
Rui Costa