The Complete SPRLIB & ANNLIB

learn_mlnet

- perform a number of learning cycles on a maximum likelihood network

SYNOPSIS

int learn_mlnet (net, dset, cycles, etap, alphap, etas, alphas, options)

ARGUMENTS

NET *net The NET to be trained.
DATASET *dset The DATASET on which the network is to be trained.
long cycles The number of cycles for which the network should be trained.
double etap The learning rate for learning the positions of the kernels.
double alphap The momentum constant for learning the positions of the kernels.
double etas The learning rate for learning the variances of the kernels.
double alphas The momentum constant for learning the variances of the kernels.
long options Flags - see FUNCTION below.

RETURNS

TRUE if an error occurred, FALSE otherwise.

FUNCTION

This routine trains a maximum likelihood network. It first checks the network: NetFlag should be MLNET (see NET-flags), SystemDataRWFlag should be set to BP_INIT_DONE (i.e., bp_init should have been called), and NumInputs should be equal to dset->NumInputs. Furthermore, the dataset should be a LEARNSET and a LABELSET (see DATASET-flags), the values etap and etas should lie between 0 and 1000 (bounds included), and the values alphap and alphas between -1.0 and 1.0 (bounds excluded).

After the checks, the network is trained for cycles cycles. In each cycle the network is evaluated, using the eval_ff_net function, for every SAMPLE in dset whose SampleFlag is set to SAMPLE_ENABLED (see SAMPLE-flags); the adapt_mlnet function is then called to adapt the network. If the BPBATCH flag is set in options, the weight changes are not processed immediately but summed in ML_STRUCT.Accum_Delta, by calling adapt_mlnet without the BPUPDATE flag for all but the last sample in the dataset. The options flag is a combination of the HIST-flags and BP-flags. Finally, net->ModificationDate is updated using the time_stamp function.

The learning rule used can be found in: J.D. van Setten, ``Een Klassificator voor een adaptief lerend systeem'' (A classifier for an adaptive learning system), Ph.D. thesis, Delft University of Technology, 1992.
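The fragment below is a minimal sketch of a batch-mode call, not taken from the library documentation itself: it assumes that net and dset have already been prepared (see NOTE and the example following it), that BPBATCH is the BP-flag named above, and that the numeric learning parameters (chosen within the documented ranges) are purely illustrative.

    /* Sketch only: `net' must already have passed bp_init and `dset' must be
       an enabled LEARNSET/LABELSET; all numeric values are illustrative. */
    static int train_batch (NET *net, DATASET *dset)
    {
        /* 50 cycles, position eta/alpha = 0.05/0.9, variance eta/alpha =
           0.005/0.9; BPBATCH accumulates the deltas in ML_STRUCT.Accum_Delta
           until the last enabled sample of each cycle. */
        return learn_mlnet (net, dset, 50L, 0.05, 0.9, 0.005, 0.9, (long) BPBATCH);
    }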

NOTE

This routine is to be used in conjunction with the normal back-propagation routines: the usual order in a program is to call bp_init, followed by one or more calls to learn_mlnet. The function bp_free should be called afterwards to free the memory used by the ML_STRUCT.
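The following is a minimal sketch of that order, not taken from the library documentation itself: the header name and the helper functions load_net and load_dataset are hypothetical stand-ins for however the NET and DATASET are obtained, the exact argument lists of bp_init and bp_free should be checked against their own manual pages, and the learning parameters are illustrative.

    #include "nnet.h"   /* hypothetical header name; include the actual SPRLIB/ANNLIB headers */
    #include <stdio.h>

    int main (void)
    {
        NET     *net  = load_net ("mlnet.net");       /* hypothetical loader */
        DATASET *dset = load_dataset ("train.dat");   /* hypothetical loader */

        if (bp_init (net))            /* allocate the ML_STRUCT (argument list assumed) */
            return 1;

        /* Train for 100 cycles with illustrative etap/alphap/etas/alphas values
           inside the documented ranges; options 0L selects the default behaviour. */
        if (learn_mlnet (net, dset, 100L, 0.1, 0.5, 0.01, 0.5, 0L))
            fprintf (stderr, "learn_mlnet reported an error\n");

        bp_free (net);                /* release the ML_STRUCT (argument list assumed) */
        return 0;
    }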

SEE ALSO

create_mlnet, init_mlnet
