Multi-Layer Perceptron [-1,1]


This applet is very similar to the Multi-Layer Perceptron [0,1] applet. The only difference is the output range of the units: it is [-1,1] in this applet, whereas it was [0,1] in the first one.
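The two output ranges correspond to two closely related activation functions. A minimal sketch (the function names are illustrative, not taken from the applet's source):

```python
import math

def logistic(x):
    # Output in (0, 1) -- the kind of unit used in the [0,1] applet
    return 1.0 / (1.0 + math.exp(-x))

def tanh_unit(x):
    # Output in (-1, 1) -- the kind of unit used here; tanh is a
    # shifted and rescaled logistic: tanh(x) = 2*logistic(2x) - 1
    return math.tanh(x)
```

Because tanh is just a rescaled logistic, both networks can represent the same decision boundaries; the difference shows up mainly in learning speed and in how the targets are coded.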


The original applet was written by Olivier Michel.


To change the structure of the multi-layer perceptron:
  1. change the values H1, H2 and H3 corresponding to the number of units in the first, second and third hidden layers. If H3 is equal to 0, only two hidden layers are created; if both H3 and H2 are equal to 0, a single hidden layer is created; and if H1, H2 and H3 are all zero, no hidden layer is created, which corresponds to a single-layer perceptron.
  2. click on the Init button to build the requested structure and initialize the weights.
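The rule for turning H1, H2 and H3 into a network structure can be sketched as follows (this is a hypothetical helper, not code from the applet; hidden layers with 0 units are simply omitted):

```python
def layer_sizes(n_in, h1, h2, h3, n_out):
    """Return the list of layer sizes built by Init:
    hidden layers whose unit count is 0 are dropped."""
    hidden = [h for h in (h1, h2, h3) if h > 0]
    return [n_in] + hidden + [n_out]

# H1=4, H2=H3=0 gives one hidden layer of 4 units
print(layer_sizes(2, 4, 0, 0, 1))   # [2, 4, 1]
# all three zero gives a single-layer perceptron
print(layer_sizes(2, 0, 0, 0, 1))   # [2, 1]
```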




  1. Repeat the same simulation as before. Set a cluster of red points (1.0) in the center, surrounded by blue points. Which network structure and which momentum / learning rate combination can solve such a problem? How does this perceptron compare with the [0,1] one?
  2. Define a simple XOR problem and take a network with two hidden neurons (one hidden layer). Repeat several learning runs with different initial conditions. How often does the network find a solution? How does the performance compare with that of the [0,1] network?
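The XOR experiment in exercise 2 can be reproduced outside the applet. The sketch below trains a two-hidden-unit tanh network with backpropagation plus momentum over many random initializations and counts how often XOR is solved; the learning rate, momentum and epoch count are illustrative choices, not the applet's defaults:

```python
import numpy as np

# XOR in [-1,1] coding, matching this applet's output range
X = np.array([[-1., -1.], [-1., 1.], [1., -1.], [1., 1.]])
T = np.array([[-1.], [1.], [1.], [-1.]])

def train_once(rng, epochs=4000, eta=0.1, mu=0.9):
    """One learning run from a random initialization; returns True
    if all four patterns end up on the correct side of zero."""
    W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)
    vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
    vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        Y = np.tanh(H @ W2 + b2)          # network output
        dY = (Y - T) * (1 - Y**2)         # output delta (tanh derivative)
        dH = (dY @ W2.T) * (1 - H**2)     # hidden-layer delta
        # batch gradient descent with momentum
        vW2 = mu * vW2 - eta * (H.T @ dY); W2 += vW2
        vb2 = mu * vb2 - eta * dY.sum(0);  b2 += vb2
        vW1 = mu * vW1 - eta * (X.T @ dH); W1 += vW1
        vb1 = mu * vb1 - eta * dH.sum(0);  b1 += vb1
    Y = np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2)
    return bool(np.all(np.sign(Y) == np.sign(T)))

rng = np.random.default_rng(1)
successes = sum(train_once(rng) for _ in range(20))
print(f"{successes}/20 runs solved XOR")
```

With only two hidden units some runs get stuck in local minima, so the success fraction is typically well below 20/20; repeating the count with [0,1]-coded units gives the comparison asked for in the exercise.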