In the neuron models discussed so far each synapse is characterized by a single constant parameter w_ij that determines the amplitude of the postsynaptic response to an incoming action potential. Electrophysiological experiments, however, show that the response amplitude is not fixed but can change over time. Appropriate stimulation paradigms can systematically induce changes of the postsynaptic response that last for hours or days. If the stimulation paradigm leads to a persistent increase of the synaptic transmission efficacy, the effect is called long-term potentiation of synapses, or LTP for short. If the result is a decrease of the synaptic efficacy, it is called long-term depression (LTD). These persistent changes are thought to be the neuronal correlate of `learning' and `memory'.
In the formal theory of neural networks the weight w_ij of a connection from neuron j to i is considered as a parameter that can be adjusted so as to optimize the performance of a network for a given task. The process of parameter adaptation is called learning and the procedure for adjusting the weights is referred to as a learning rule. Here learning is meant in its widest sense. It may refer to synaptic changes during development just as well as to the specific changes necessary to memorize a visual pattern or to learn a motor task. There are many different learning rules, and we cannot cover them all in this book. In this chapter we consider the simplest set of rules, viz., synaptic changes that are driven by correlated activity of pre- and postsynaptic neurons. This class of learning rules can be motivated by Hebb's principle and is therefore often called `Hebbian learning'.
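The idea of a correlation-driven weight update can be sketched in code. The following is a minimal illustration, not the rule developed in this chapter: it assumes a simple rate-based form Δw_ij = η ν_i ν_j, where ν_j is the presynaptic and ν_i the postsynaptic firing rate and η is a small learning rate.

```python
import numpy as np

def hebbian_update(w, nu_post, nu_pre, eta=0.01):
    """One step of a rate-based Hebbian rule (illustrative assumption):
    delta w_ij = eta * nu_i * nu_j, i.e. each weight grows in proportion
    to the product of post- and presynaptic activity."""
    return w + eta * np.outer(nu_post, nu_pre)

# Example: 2 postsynaptic neurons (i), 3 presynaptic neurons (j).
w = np.zeros((2, 3))
nu_pre = np.array([1.0, 0.0, 0.5])   # presynaptic firing rates nu_j
nu_post = np.array([0.5, 1.0])       # postsynaptic firing rates nu_i
w = hebbian_update(w, nu_post, nu_pre)
# A weight w_ij increases only when neuron j and neuron i are both active;
# a silent presynaptic neuron (nu_j = 0) leaves its synapses unchanged.
```

Note that this naive rule only strengthens weights; capturing LTD, and keeping the weights bounded, requires the refinements discussed later in the chapter.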
© Cambridge University Press