Associative Memory

Example: Simulating Bidirectional Associative Memory with a Neural Network

Description

This example shows how a feedforward backpropagation neural network can be used to simulate bidirectional associative memory (BAM). BAM is a simplified model of long-term memory as found in humans and other animals.

Empirical data suggest that human memory has at least three stages: immediate, short-term, and long-term. Immediate memory seems to last little more than a second. Short-term memory lasts about 15–30 seconds; short-term retention makes information available long enough to be rehearsed, and if rehearsed, the information can be transferred to long-term storage. Once committed to long-term memory, the results of learning tend to endure, but they can be lost or degraded when specific parts of the brain are injured or removed, and they are vulnerable to interference from additional learning. Long-term memory can last for years or even a lifetime. The tendency to recall words that sound the same or have similar meanings indicates that information is clustered. This supports the theory that retrieval from long-term memory first locates stored data in some kind of associative network and then selects an item within that system. In contrast, the internal memory of a computer is divided into locations, each with a unique address; if computer memory is disturbed, specific items are lost without affecting other stored items.

Some characteristics of long-term memory, such as associative correction, interference, and fault tolerance, can be demonstrated in a BAM even though it is a very simple model. Autoassociative memory (the Hopfield model) can be simulated with a BAM by using the same data for both the input and the output of the neural network.
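The associative correction mentioned above can be made concrete with Kosko's original matrix formulation of a BAM, which the neural-network simulation in this notebook approximates. The following is a minimal standalone Python sketch, not part of the notebook: the bipolar pattern pairs are invented for illustration, and the function names are not from any package.

```python
# Minimal sketch of Kosko's matrix BAM with bipolar {-1, +1} patterns.
# The weight matrix is the sum of outer products of the stored pairs;
# recall thresholds the matrix product back to bipolar values.

def outer(a, b):
    return [[ai * bj for bj in b] for ai in a]

def add(m1, m2):
    return [[x + y for x, y in zip(r1, r2)] for r1, r2 in zip(m1, m2)]

def sign(v):
    return [1 if x >= 0 else -1 for x in v]

def recall_forward(W, a):
    # b_j = sign(sum_i a_i W_ij): associate input a to its stored partner
    return sign([sum(a[i] * W[i][j] for i in range(len(a)))
                 for j in range(len(W[0]))])

def recall_backward(W, b):
    # the same weights run in reverse: associate b back to a
    return sign([sum(W[i][j] * b[j] for j in range(len(b)))
                 for i in range(len(W))])

# Two hypothetical pattern pairs (encoded names and numbers would look like this).
pairs = [([1, -1, 1, -1], [1, 1, -1]),
         ([-1, 1, -1, 1], [-1, 1, 1])]

# Kosko's learning rule: W = sum over pairs of the outer product of a and b.
W = [[0] * 3 for _ in range(4)]
for a, b in pairs:
    W = add(W, outer(a, b))

print(recall_forward(W, [1, -1, 1, -1]))  # exact input -> stored partner [1, 1, -1]
print(recall_forward(W, [1, -1, 1, 1]))   # noisy input is corrected to the same partner
```

Because recall works in both directions through the same weights, either member of a pair retrieves the other, which is the behaviour the feedforward network below is trained to reproduce.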

Helper Functions

A small set of helper functions is defined in the initialization cell below to assist in the example. First, since a neural network operates on numerical data, functions are defined to convert strings to and from bits. Then a function is defined to combine and convert the data to the form required for neural network training. Next, a lookup function is defined that retrieves data by associating from a given input to an output and back again to an input. Finally, functions to alter weights in the neural network are defined so that memory degradation can be explored.
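The string-to-bits conversion can be sketched as follows. This is a hedged Python illustration of the idea, not the package's implementation; the function names, the null-padding scheme, and the 8-bits-per-character encoding used here are assumptions based on the description in this notebook.

```python
# Each character becomes 8 bits; strings are padded to maxlen characters so
# that every pattern presented to the network has the same length.

def to_bits(s, maxlen):
    s = s.ljust(maxlen, "\0")          # pad so all patterns are equal length
    bits = []
    for ch in s:
        bits.extend(int(b) for b in format(ord(ch), "08b"))
    return bits

def from_bits(bits):
    chars = []
    for i in range(0, len(bits), 8):
        code = int("".join(str(b) for b in bits[i:i + 8]), 2)
        if code:                        # drop the padding characters
            chars.append(chr(code))
    return "".join(chars)

print(from_bits(to_bits("Tony", 6)))   # round-trips to "Tony"
```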

Helper Functions Implementation

Phonebook Example

To investigate the properties of associative memory, a small list of names and hypothetical phone numbers is defined. Some names and numbers are chosen to be similar in order to verify that the net can distinguish between them. The longest string in the data is found so that the number of input bits needed for the neural net can be calculated. The textual data is converted to binary data, with 8 bits for each character, in the NeuralNetData form.

namelist = {"Tony", "Tory", "Dory", "Leo", "George", "Paul", "Peter"} ;
numberlist = {"121212", "121112", "121211", "343434", "565656", "787878", "909090"} ;
maxlen = Max[StringLength /@ Join[namelist, numberlist]] ;
data = BidirectionalAssociativeData[namelist, numberlist, maxlen] ;

The BAM neural net for this example can be created without any hidden layers; to include a hidden layer, remove the comment around it. Sigmoid activation functions are used since the binary data has values of zero and one.

NeuralNet[bam, NeuralNetTypeLayer, Layers {8 maxlen, (*8 maxlen, *)8 maxlen},  ... tivationFunctionOutputLayerSigmoid, TrainingAlgorithmBackpropagationIncremental] ;

The BAM is trained on the phone list, which is reminiscent of rehearsing the list.

ListPlot[Train[bam, data, Epochs1000], PlotJoinedTrue, PlotStyle {Hue[1]}, PlotRangeAll] ;

[Graphics:HTMLFiles/AssociativeMemory_5.gif]

The phonebook associations stored in the neural net can be verified.

{BidirectionalAssociativeLookup[bam, namelist], BidirectionalAssociativeLookup[bam, numberlist]}

{{{Tony, 121212}, {Tory, 121112}, {Dory, 121211}, {Leo, 343434}, {George, 565656}, {Paul, 787878}, {Peter, 909090}}, {{121212, Tony}, {121112, Tory}, {121211, Dory}, {343434, Leo}, {565656, George}, {787878, Paul}, {909090, Peter}}}

Partial and incorrect data are corrected to some degree by the BAM neural net.

BidirectionalAssociativeLookup[bam, {"Dony", "Pater", "le0", "787"}]

{{Dory, 121211}, {Peter, 909090}, {Leo, 343434}, {787878, Paul}}
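The bidirectional lookup just shown can be sketched conceptually: the input is associated forward to its partner, and the partner can be associated back, which also yields the corrected form of the input. The following standalone Python sketch stands in for the trained network with nearest-Hamming matching over a small hypothetical phone book; the names and matching rule are illustrative, not the package's implementation.

```python
# Three illustrative pairs; the BAM accepts either a name or a number as input.
pairs = {"Dory": "121211", "Peter": "909090", "Leo": "343434"}

def hamming(a, b):
    # compare space-padded strings character by character
    n = max(len(a), len(b))
    return sum(x != y for x, y in zip(a.ljust(n), b.ljust(n)))

def lookup(query):
    # search both directions: find the stored pair nearest to the query
    # on either its name side or its number side
    name, number = min(pairs.items(),
                       key=lambda p: min(hamming(query, p[0]),
                                         hamming(query, p[1])))
    # return (corrected query, associated partner)
    if hamming(query, name) <= hamming(query, number):
        return (name, number)
    return (number, name)

print(lookup("Dori"))    # ('Dory', '121211') -- misspelled name is corrected
print(lookup("90909"))   # ('909090', 'Peter') -- partial number is completed
```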

Memory Properties

Different properties of a BAM, such as its storage limit and its ability to associate from incomplete data, can be explored. To explore the saturation limit of the BAM, a larger number of short strings (short strings use fewer neurons) can be generated with the CharacterRange function, and the interference and capacity with and without hidden layers can be compared for different training algorithms. Humans do not seem to exhibit such capacity limitations, probably because of the vast number of neurons in the brain. Here the ability to operate with partially damaged weights is explored: a function that resets a random portion of the weights in the neural network is used to reset 5% of them. Repeating the following cells shows how the memory degrades as more weights are reset. Retraining the BAM quickly restores the lost information.
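The weight-damage experiment can be sketched in isolation. This Python illustration (not the package's SetRandomWeights, which operates inside the Fann network) resets a random fraction of a plain weight matrix to a given value:

```python
import random

def damage(W, fraction, value=0.0, rng=None):
    # reset a random fraction of the weights in matrix W to `value`
    rng = rng or random.Random(0)
    positions = [(i, j) for i in range(len(W)) for j in range(len(W[0]))]
    for i, j in rng.sample(positions, int(fraction * len(positions))):
        W[i][j] = value
    return W

W = [[1.0] * 10 for _ in range(10)]        # 100 weights, all nonzero
damage(W, 0.25)
zeroed = sum(1 for row in W for w in row if w == 0.0)
print(f"{zeroed}% of weights have been reset.")  # 25 of 100 weights
```

Repeated 5% resets accumulate roughly as in the notebook, although successive calls can hit weights that were already reset, so the damaged fraction grows slightly more slowly than the sum of the fractions.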

{Test[bam, data], SetRandomWeights[bam, 0.05, 0] ; Test[bam, data]}

{0.324515, 0.390936}

{BidirectionalAssociativeLookup[bam, namelist], BidirectionalAssociativeLookup[bam, numberlist]}

{{{Tony, 121210}, {Tory, 121110}, {Dory, 121211}, {Leo, 343634}, {Weorg`, 56=454}, {Paul, 7878 ...  {121110, Tory}, {121211, Dory}, {343634, Leo}, {56=454, Ferg}, {787878, Paul}, {909090, Peter}}}

Round[Count[bam[Weights],  ... ] / bam[TotalConnectionCount] * 100] "% of weights have been reset."

25 % of weights have been reset.

If the network is presented with multiple choices, say someone has two or three phone numbers, then the net averages the answers and the mean square error does not converge to zero. As implemented, the FromBits function only sets bits in its output when their activation values are above 0.5. Depending on the application, it might be possible to add extra information to the input to make it unique, for example a trailing number; the lookup could then try the different possibilities and return a list of those with sufficiently high activation values. Another option, when the output bits represent unique answers, is to take their activation values as probabilities for each answer.
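The two decoding strategies just described can be sketched as follows. This is an illustrative Python fragment, not the notebook's FromBits implementation; the function names and the simple normalisation used for the probability-like scores are assumptions.

```python
def threshold_bits(activations, cutoff=0.5):
    # FromBits-style decoding: a bit is set only when its activation
    # exceeds the cutoff
    return [1 if a > cutoff else 0 for a in activations]

def as_probabilities(activations):
    # when each output bit encodes a mutually exclusive answer, the
    # activations can be normalised into probability-like scores
    total = sum(activations)
    return [a / total for a in activations] if total else activations

acts = [0.9, 0.1, 0.6, 0.4]
print(threshold_bits(acts))     # [1, 0, 1, 0]
print(as_probabilities(acts))   # scores summing to 1, highest for 0.9
```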

Credits

The idea for the phonebook example comes from a similar example in 'Heteroassociative Memory' by K. Kutza, 1996, with reference to B. Kosko, 'Bidirectional Associative Memories', IEEE Transactions on Systems, Man, and Cybernetics, 18, pp. 49-60, 1988. The description of biological memory is condensed from Encyclopædia Britannica. This notebook is part of Fann for Mathematica © 2004 freegoldbar (http://www.geocities.com/freegoldbar/).

Discard[All] ;


Created by freegoldbar  (September 16, 2004)
