Figure 5.1 presents an example source listing in which the standard backpropagation method is applied to the well-known XOR problem. The following steps can be identified. First the library is initialized (sprinit) and the learning set is loaded into memory (load_dataset). Next, a feed-forward network is created with 2 inputs, 4 units in the hidden layer, and one output (create_ff_net(1L, 3, NetVect)), and the initial weights are set to random values between -0.01 and 0.01 (unif_rand_net(0.01, NetPtr)).
In the next part the actual training is performed. The backpropagation routines must be initialized before training (bp_init(NetPtr)). In the innermost loop the network is trained for 10 cycles (bp_learn(NetPtr, LearningSet, 1.0, 0.9, 10L, BpOptions)), after which its performance is evaluated (net_perf(NetPtr, LearningSet, &ErrorStruct, 0.1, SIMPLE_PERF)). Training stops when all training patterns are classified correctly (ErrorStruct.AllCorrect == TRUE) or when the number of training sweeps exceeds 1,000,000. The memory allocated by the backpropagation initialization is then freed (bp_free(NetPtr)) and the network is saved to disk (fprintf_network(stream, NetPtr, (HISTU | HISTT | HISTW))). Finally, all remaining data is cleared (delete_net and delete_dataset are called) and the program exits (sprexit(EXIT_SILENT)).
Figure 5.1: Backpropagation program in SPRANNLIB.
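The call sequence described above can be collected into the following sketch (C-style pseudocode: the variable names, types, and the loop counter are our assumptions, reconstructed only from the calls quoted in the text; the actual program is the listing of figure 5.1):

```
sprinit();                                /* initialize the library        */
LearningSet = load_dataset(...);          /* load the XOR patterns         */
NetPtr = create_ff_net(1L, 3, NetVect);   /* 2 inputs, 4 hidden, 1 output  */
unif_rand_net(0.01, NetPtr);              /* weights in [-0.01, 0.01]      */

bp_init(NetPtr);
do {
    /* 10 training cycles, then a performance check */
    bp_learn(NetPtr, LearningSet, 1.0, 0.9, 10L, BpOptions);
    net_perf(NetPtr, LearningSet, &ErrorStruct, 0.1, SIMPLE_PERF);
    Sweeps += 10;                         /* assumed sweep counter         */
} while (ErrorStruct.AllCorrect != TRUE && Sweeps < 1000000L);
bp_free(NetPtr);

fprintf_network(stream, NetPtr, (HISTU | HISTT | HISTW));
delete_net(NetPtr);
delete_dataset(LearningSet);
sprexit(EXIT_SILENT);
```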
The output of the example program is given in the box above. Execution of this program took approximately 0.6 seconds on a Sun 4/330, including all file I/O. The final saved network took 3705 bytes in normal mode and 1642 bytes in compressed format; some of this overhead is due to the general nature of the network representation in the disk file.