Adaline, (Optimal) Perceptron and Backpropagation


Single-layer neural networks can be trained using various learning algorithms. The best-known algorithms are the Adaline, Perceptron and Backpropagation algorithms for supervised learning. The first two are specific to single-layer neural networks while the third can be generalized to multi-layer perceptrons.
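The perceptron and Adaline rules differ only in where the error is measured: the perceptron corrects the thresholded output, while Adaline takes a gradient step on the raw linear output. A minimal sketch of the two update rules (names such as `w`, `x`, `t`, and `eta` are assumed here, not taken from the applet):

```python
def step(a):
    """Threshold activation used by the perceptron."""
    return 1 if a >= 0 else 0

def perceptron_update(w, x, t, eta=0.1):
    """Perceptron rule: corrects only when the thresholded output is wrong."""
    y = step(sum(wi * xi for wi, xi in zip(w, x)))
    return [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]

def adaline_update(w, x, t, eta=0.1):
    """Adaline (delta/LMS) rule: gradient step on the squared error of the
    raw linear output, before thresholding."""
    a = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * (t - a) * xi for wi, xi in zip(w, x)]
```

Because Adaline always moves the weights (even on correctly classified points), it tends toward a least-squares fit, whereas the perceptron stops changing once every point is on the right side.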


The applet was written by Olivier Michel. This page written by Alix Herrmann.
The optimal perceptron was added by Tobias Denninger.


Let's consider a single-layer neural network with b inputs and c outputs:
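In such a network, each of the c output units computes a weighted sum of the b inputs plus a bias, passed through an activation function. A minimal sketch of this forward pass (the variable names are assumptions, not the applet's internals):

```python
def forward(x, W, biases, f):
    """Single-layer forward pass.
    x: the b inputs; W: c rows of b weights each;
    biases: c bias terms; f: activation function."""
    return [f(sum(wij * xi for wij, xi in zip(row, x)) + bj)
            for row, bj in zip(W, biases)]

def step(a):
    """Threshold activation: fires when the weighted sum is non-negative."""
    return 1 if a >= 0 else 0
```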


Click on each topic to learn more.  Then scroll down to the applet.


This applet allows you to compare the different learning algorithms.  The network implemented here has two inputs and a single output neuron.  In this tutorial, you will train it to classify 2-dimensional data points into two categories.
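The applet's setup can be sketched in a few lines: two linearly separable 2-D clusters and a perceptron with two input weights and a bias, trained until no point is misclassified. This is an assumed reconstruction for illustration, not the applet's own code:

```python
import random

random.seed(0)
# Two well-separated Gaussian clusters: red is class 1, blue is class 0.
red  = [(random.gauss(2, 0.3), random.gauss(2, 0.3)) for _ in range(10)]
blue = [(random.gauss(-2, 0.3), random.gauss(-2, 0.3)) for _ in range(10)]
data = [(p, 1) for p in red] + [(p, 0) for p in blue]

w = [0.0, 0.0, 0.0]          # two input weights plus a bias weight
eta = 0.1

for epoch in range(100):
    errors = 0
    for (x1, x2), t in data:
        y = 1 if w[0] * x1 + w[1] * x2 + w[2] >= 0 else 0
        if y != t:
            errors += 1
            w[0] += eta * (t - y) * x1
            w[1] += eta * (t - y) * x2
            w[2] += eta * (t - y)
    if errors == 0:          # converged: every point classified correctly
        break
```

On linearly separable data like this, the perceptron convergence theorem guarantees the loop terminates after finitely many corrections.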

Click here to see the instructions.  You may find it helpful to open a separate browser window for the instructions, so you can view them at the same time as the applet window.


  1. Ideal case:  Place 10 red points (class 1) and 10 blue points (class 0) in two similar, distinct, and linearly separable clusters.
  2. Different cluster dispersions:  Place 20 red points (class 1) in a very narrow cluster (strongly correlated points) and 5 blue points (class 0) in a very wide cluster, in such a way that the classes are linearly separable.
  3. Imperfectly separable case:  Place 10 red points (class 1) and 10 blue points (class 0) in two similar, linearly separable clusters. Then place an additional blue point inside the red cluster, so that the data are no longer perfectly separable.
  4. For which kind of problem is the Adaline algorithm the best?
  5. For which kind of problem is the Backpropagation algorithm the best?
  6. For which kind of problem is the Perceptron algorithm the best?
  7. For which kind of problem is the Pocket algorithm the best?
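For the imperfectly separable case in exercise 3, the plain perceptron never settles down, because some point is always misclassified. The pocket algorithm addresses this by keeping ("pocketing") the best weight vector seen so far. A sketch under assumed names, not the applet's implementation:

```python
import random

def classify(w, x1, x2):
    """Thresholded output of a 2-input perceptron with bias w[2]."""
    return 1 if w[0] * x1 + w[1] * x2 + w[2] >= 0 else 0

def count_errors(w, data):
    """Number of misclassified points under weights w."""
    return sum(1 for (x1, x2), t in data if classify(w, x1, x2) != t)

def pocket_train(data, steps=200, eta=0.1, seed=0):
    """Perceptron steps on random examples, remembering the best weights."""
    random.seed(seed)
    w = [0.0, 0.0, 0.0]
    pocket, pocket_errors = list(w), count_errors(w, data)
    for _ in range(steps):
        (x1, x2), t = random.choice(data)       # pick a random example
        y = classify(w, x1, x2)
        if y != t:                              # ordinary perceptron step
            w[0] += eta * (t - y) * x1
            w[1] += eta * (t - y) * x2
            w[2] += eta * (t - y)
        e = count_errors(w, data)
        if e < pocket_errors:                   # keep the best weights seen
            pocket, pocket_errors = list(w), e
    return pocket, pocket_errors
```

Even when no weight vector classifies every point correctly, the pocketed weights are the best the run has found, which is why this algorithm suits the imperfectly separable case.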