Exclusive Or

Example: The Obligatory Xor Function

Neural Network

This example is the neural network equivalent of Hello World. First load the packages required for the example.

<< FannGraph`
<< Graphics`

The neural network created here has two input neurons, three hidden neurons, and one output neuron. The classic incremental backpropagation algorithm is used for training.

NeuralNet[net, Layers -> {2, 3, 1}, TrainingAlgorithm -> BackpropagationIncremental] ;
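FannGraph wraps the FANN C library, so the training loop itself stays hidden behind Train. Purely as an illustration, the same 2-3-1 topology with incremental (per-pattern) backpropagation can be sketched in plain Python; the sigmoid activations, learning rate, and initial weight range below are assumptions for the sketch, not FannGraph's actual defaults.

```python
import math, random

# A plain-Python sketch of the setup above: a 2-3-1 network trained on
# XOR with incremental (per-pattern) backpropagation.  The layer sizes,
# epoch count, and MSE target mirror the notebook; the rest is assumed.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
# w_hidden[j] = [weight from input 0, weight from input 1, bias] of hidden unit j
w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(3)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(4)]  # 3 hidden weights + bias

xordata = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    o = sigmoid(sum(w_out[j] * h[j] for j in range(3)) + w_out[3])
    return h, o

lr = 0.7          # assumed learning rate
mse = []
for epoch in range(1000):
    err = 0.0
    for x, t in xordata:
        h, o = forward(x)
        err += (t - o) ** 2
        # output-layer delta, then backpropagate it to the hidden layer
        d_o = (t - o) * o * (1 - o)
        d_h = [d_o * w_out[j] * h[j] * (1 - h[j]) for j in range(3)]
        # incremental training: weights change after every single pattern
        for j in range(3):
            w_out[j] += lr * d_o * h[j]
            w_hidden[j][0] += lr * d_h[j] * x[0]
            w_hidden[j][1] += lr * d_h[j] * x[1]
            w_hidden[j][2] += lr * d_h[j]
        w_out[3] += lr * d_o
    mse.append(err / 4)
    if mse[-1] < 0.001:   # same stopping criterion as the notebook
        break

print([round(forward(x)[1], 3) for x, _ in xordata])
```

Incremental (also called online) backpropagation updates the weights after every pattern, in contrast to batch training, which accumulates the gradient over the whole data set before each update.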

Data

The data is the truth table for the exclusive-or function, with False replaced by zero and True replaced by one.

xordata = {{{False, False}, {False}},
           {{False, True},  {True}},
           {{True,  False}, {True}},
           {{True,  True},  {False}}} /. {False -> 0, True -> 1} ;

Training

The weights are initialized to help the neural network converge on a solution, and the network is then trained for up to 1000 epochs or until the mean square error falls below 0.001.

InitializeWeights[net, xordata] ;
mse = Train[net, xordata, Epochs -> 1000, MeanSquareError -> 0.001] ;

Visualization

The neural network and the mean square error can be visualized.

{ShowGraph[NeuralNetGraph[net]],
 LogListPlot[mse, PlotJoined -> True, PlotStyle -> {Hue[1]}]} ;

[Graphics: the neural network graph (Xor_6.gif) and the log plot of the mean square error (Xor_7.gif)]

Results

The network's Execute function is mapped over the input values from the truth table.

xorvalues = Map[Execute[net, #] &, xordata[[All, 1]]]

{{0.0120124}, {0.914179}, {0.911585}, {-0.00335202}}

Thresholding the output values at 0.5 gives the boolean value for each of the inputs.

Map[# > .5 &, Flatten[xorvalues]]

{False, True, True, False}
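The same thresholding step can be checked in plain Python, using the output values printed above and the notebook's 0.5 cutoff:

```python
# Raw network outputs from the run above; values near 0 or 1 are mapped
# back to booleans by comparing against the 0.5 cutoff.
xorvalues = [0.0120124, 0.914179, 0.911585, -0.00335202]
booleans = [v > 0.5 for v in xorvalues]
print(booleans)  # [False, True, True, False]
```

Note that the outputs are only approximately 0 and 1 (one is even slightly negative), which is why the thresholding step is needed to recover clean boolean values.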

Credits

This notebook is part of Fann for Mathematica © 2004 freegoldbar (http://www.geocities.com/freegoldbar/).

Discard[All] ;


Created by freegoldbar  (September 29, 2004)
