The Complete SPRLIB & ANNLIB

init_mlnet

- initialize a maximum likelihood network

SYNOPSIS

int init_mlnet (net, theta_init, dset)

ARGUMENTS

NET *net A pointer to a maximum likelihood NET.
double theta_init The initial value of the variance of the kernels. From this value, the amplitude of the kernels is deduced (i.e., the volumes are normalized).
DATASET *dset A pointer to the training set.

RETURNS

TRUE in case of an error, FALSE in case of success.
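
EXAMPLE

The fragment below is a minimal, hedged illustration of the calling convention; it assumes a maximum likelihood NET and a suitable training DATASET have already been obtained elsewhere (for instance via create_mlnet and a dataset-loading routine, whose signatures are not shown here), that the library headers declaring NET, DATASET and init_mlnet are included, and that 1.0 is an acceptable initial kernel variance.

    #include <stdio.h>
    /* plus the SPRLIB/ANNLIB headers that declare NET, DATASET and init_mlnet */

    NET     *net;    /* assumed: an MLNET, e.g. created with create_mlnet      */
    DATASET *dset;   /* assumed: a LEARNSET or LABELSET with training samples  */

    /* ... obtain net and dset here ... */

    /* theta_init = 1.0 lies in the required range (0, 1000). */
    if (init_mlnet (net, 1.0, dset)) {
        /* TRUE is returned on error: wrong NET/DATASET type, mismatching
           NumInputs, or fewer samples than hidden units.                  */
        fprintf (stderr, "init_mlnet: initialization failed\n");
    }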

FUNCTION

This function initializes the values of the WEIGHT structures between the input and the hidden layer of a maximum likelihood network. Since each hidden unit represents a kernel, the weights between these layers can be initialized using datapoints as an initial guess for the kernel centers. The weights are therefore set to the input values of randomly chosen samples from the training set. First, some prerequisites are checked: dset must be either a LEARNSET or a LABELSET (see DATASET-flags); net must be an MLNET (see NET-flags); the NumInputs of net must match the NumInputs of dset; the number of hidden units in net must be less than or equal to the number of samples in dset; and theta_init must lie between 0 and 1000 (exclusive). The routine then traverses all hidden units and assigns values to their incoming weights: each weight is initialized with the input values of an appropriately chosen sample. The biases (theta's) of the hidden units are set to theta_init, after which the weight value w of each of the unit's outlinks is normalized so that w * unit->NumInLinks * (sqrt (2 * PI) * unit->Theta.Value.Weight)^(net->NumInputs) = 1.
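
The following is a simplified, stand-alone sketch of the initialization idea described above, not the library implementation: it uses plain arrays instead of the library's NET/UNIT/WEIGHT structures, and all names other than the field names NumInputs, NumInLinks and Theta.Value.Weight quoted in the text are assumptions for illustration only.

    /* Illustrative sketch of the init_mlnet initialization loop.
       Compile with -lm; uses plain arrays in place of NET/UNIT/WEIGHT. */
    #include <math.h>
    #include <stdlib.h>

    /* samples: num_samples x num_inputs training inputs
       centers: num_hidden  x num_inputs kernel centres (input-to-hidden weights)
       thetas:  per-hidden-unit bias (initial kernel variance)
       out_w:   per-hidden-unit outgoing weight, normalized to unit kernel volume */
    static void
    init_ml_sketch (const double *samples, int num_samples, int num_inputs,
                    double *centers, double *thetas, double *out_w,
                    int num_hidden, double theta_init, int num_inlinks)
    {
        int h, i;

        for (h = 0; h < num_hidden; h++) {
            /* Pick a random training sample as the initial kernel centre. */
            int s = rand () % num_samples;
            for (i = 0; i < num_inputs; i++)
                centers[h * num_inputs + i] = samples[s * num_inputs + i];

            /* The bias (theta) of the hidden unit is the initial variance. */
            thetas[h] = theta_init;

            /* Normalize the outgoing weight w so that
               w * NumInLinks * (sqrt(2*PI) * theta)^NumInputs = 1. */
            out_w[h] = 1.0 /
                (num_inlinks * pow (sqrt (2.0 * M_PI) * theta_init, num_inputs));
        }
    }

Note that this sketch draws samples with replacement; the library routine may choose samples differently (see the NOTE below on class coverage).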

NOTE

The dataset dset should always contain at least one sample for each class. If not, the routine will end up in an infinite loop.

SEE ALSO

create_mlnet, learn_mlnet
