SOAR-FAQ -- MUME (Multi-Module
Neural Computing Environment) is a simulation environment for multi-module neural computing. It
provides an object-oriented facility for simulating and training multiple nets with various
architectures and learning algorithms. The object-oriented structure makes it simple to add
new network classes and new learning algorithms. MUME includes a library of network
architectures including feedforward, simple recurrent, and continuously running recurrent neural
networks. Each architecture is supported by a variety of learning algorithms, including backprop,
weight perturbation, node perturbation, and simulated annealing. MUME can be used for large-scale
neural network simulations, as it provides support for learning in multi-net environments. It also
provides pre- and post-processing facilities, and non-neural computing modules (decision trees,
etc.) can be included in applications.

MUME is being developed at the Machine
Intelligence Group at Sydney University Electrical Engineering. The software is written in C and is
being used on Sun and DEC workstations. Efforts are underway to port it to the Fujitsu VP2200
vector processor using the VCC vectorising C compiler, HP 9000/700, SGI workstations, DEC
Alphas, and PC DOS (with DJGCC). MUME is available to research institutions on a
media/doc/postage cost arrangement after signing a license agreement. The license agreement is
available by anonymous ftp from mickey.sedal.su.oz.au:/pub/license.ps [129.78.24.170]. An
overview of MUME is available from the same machine as /pub/mume-overview.ps.Z. It is also
available free for MSDOS by anonymous ftp from brutus.ee.su.oz.au:/pub/MUME-0.5-DOS.zip.

For further information, write to Marwan Jabri, SEDAL, Sydney University Electrical Engineering,
NSW 2006 Australia; call +61-2-692-2240; fax +61-2-660-1228; or send email to Marwan Jabri.
To be added to the mailing list, send email to [email protected].