The dominant theoretical framework for long-term memory is attractor neural networks (ANNs), in which information is encoded by neuronal ensembles and stored by Hebbian synaptic modifications. I will address the issue of memory recall in the absence of memory-specific retrieval cues, as in free recall experiments. I will develop an associative model of recall in which each retrieved memory item triggers the recall of the next. This model can be cast in the language of random graph theory, and universal laws of recall can be derived that broadly account for the results of both classical and more recent free recall experiments.
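To make the idea concrete, here is a minimal toy sketch of such an associative recall process. The specific transition rule (move to the most similar item, excluding the one just recalled, over a random similarity matrix) and all function names are my own illustrative assumptions, not the speaker's exact model; the point is only that deterministic associative transitions on a random graph eventually enter a cycle, limiting the number of items retrieved.

```python
import random

def recall_walk(M, seed=0):
    """Toy associative free recall over M items.

    Assumed rule (illustrative, not the speaker's exact model):
    each recalled item triggers the item most similar to it,
    excluding the immediately preceding one. Because the
    (previous, current) state space is finite and transitions are
    deterministic, the walk must revisit a state and thus cycles,
    so only a subset of the M items is ever recalled.
    """
    rng = random.Random(seed)
    # Random symmetric similarity matrix (edge weights of a random graph).
    sim = [[0.0] * M for _ in range(M)]
    for i in range(M):
        for j in range(i + 1, M):
            sim[i][j] = sim[j][i] = rng.random()

    recalled = []
    prev, cur = -1, rng.randrange(M)
    seen_states = set()
    # Stop once a (prev, cur) transition repeats: the walk has cycled.
    while (prev, cur) not in seen_states:
        seen_states.add((prev, cur))
        if cur not in recalled:
            recalled.append(cur)
        # Next item: most similar to the current one, excluding the previous.
        nxt = max((j for j in range(M) if j != cur and j != prev),
                  key=lambda j: sim[cur][j])
        prev, cur = cur, nxt
    return len(recalled)
```

Running `recall_walk` for a given list length returns the number of distinct items retrieved before the walk cycles, which is typically well below the list length for large lists.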
Misha Tsodyks is a professor in the Department of Neurobiology at the Weizmann Institute of Science and a visiting professor at the Center for Neural Theory at Columbia University. He received his master's and PhD degrees from the Landau Institute of Theoretical Physics in Moscow and then briefly held a research position in theoretical neuroscience at the Institute of Higher Nervous Activity of the Russian Academy of Sciences. After moving to Israel in 1990, he did a postdoctoral fellowship at the Hebrew University and then at the Salk Institute. He became a faculty member in the Department of Neurobiology of the Weizmann Institute in 1995. His research focuses on mathematical and computational modelling of information processing in the brain.