The Complete SPRLIB & ANNLIB
Index
A
- ActEucDist
- activation function
- ActInprod
- activation function
- ActSqEucDist
- activation function
- adapt_mlnet
- perform one learning cycle on a maximum likelihood network
- adapt_wavnet
- adapt a wavelet network after presentation of one learning sample
- add_classerror
- add a pattern to the error estimation routines using Fisher's linear discriminant
- add_mean_var
- add a pattern to the summations needed for mean and variance
- add_unit_ff
- add a new unit to a layer
- addq_classerror
- add a pattern to the error estimation routines using a quadratic discriminant
- all_scatter_dataset
- determine the scatter matrices for a dataset
B
- best_matching_unit
- find the unit with the smallest output value in a Kohonen net
- BP-flags
- backpropagation option flags
- bp_adapt_net
- perform one learning cycle using the backpropagation rule
- bp_adapt_unit
- adapt a unit according to the generalized delta rule
- bp_adapt_weights
- adapt weights marked ``changed'' according to the accumulated deltas
- bp_free
- free a network that was generated by bp_init
- bp_init
- initialize and set up a network for training with the backpropagation learning rule
- bp_learn
- perform a number of learning cycles with the backpropagation learning algorithm
- bp_learn_pocket
- train a network using the pocket algorithm
- brent
- parabolic interpolation and Brent's method in one dimension
C
D
E
F
G
H
I
- INIT-flags
- flags used in the SPRANNLIB initialization function
- init_full_fname
- initialize the filename expansion mechanism
- init_mlnet
- initialize a maximum likelihood network
- init_wavnet
- initialize a wavelet network
- IO-variables
- variables supporting network I/O functions
- IOFORMAT
- constant indicating the network I/O-format version number
- ioset_to_labelset
- convert an IOSET to a LABELSET
- isodata_basic
- the ISODATA clustering algorithm as defined by Duda and Hart (k-means clustering)
J
- jacobi
- compute the eigenvalues and eigenvectors of a real symmetric matrix
- join_net
- construct a new network out of a number of networks
K
- khighley_dataset
- generate samples according to the generalized Highleyman dataset
- khighley_overlap
- compute the overlap of two classes for the Highleyman set
- kl_coordinates
- compute the Karhunen-Loeve transform
- knn_classify
- classify a new pattern using the k-nearest neighbour classifier
- knn_free
- freeing routine for the k-nearest neighbour classifier
- knn_init
- initialization routine for the k-nearest neighbour classifier
- knn_loo_perf
- k-nearest neighbour classifier performance determination using the leave one out method
- knn_majority
- assign a class label according to the k-nearest neighbour rule
- KNNELM
- local structure for the k-nearest neighbour classifier
- koh_adapt_net
- adapt a net's weights according to the Kohonen learning rule
- koh_adapt_unit
- adapt a unit's weights according to the Kohonen learning rule
- koh_draw_free
- clear all data used in drawing Kohonen maps
- koh_draw_init
- initialize the variables for drawing a Kohonen map
- koh_learn
- perform a number of learning cycles with the Kohonen learning rule
- koh_test_net
- construct a two-dimensional Kohonen network with two inputs
- koh_weight_draw
- draw a Kohonen map
- KOHONEN-DRAW-flags
- option flags for Kohonen weight drawing
- KOHONEN-DRAW-variables
- variables used for graphical display of Kohonen type networks
- KOHONEN-flags
- option flags for Kohonen learning
- kvar_dataset
- generate samples according to the generalized var dataset
- kvar_overlap
- compute the overlap of two classes for the varset
L
- labelset_to_ioset
- convert a LABELSET to an IOSET
- LAYER-flags
- flags to indicate the type of a layer
- LAYER
- layer datatype
- LAYER_RECIPE
- simplified description of a layer of a shared weights network
- learn_bfgs_ffnet
- apply the BFGS optimization method to train a feedforward neural network
- learn_cgdes_ffnet
- apply the conjugate gradient descent method to a feedforward neural network
- learn_marquardt_ffnet
- train a feedforward or radial-basis function network with the dataset pointed to by dset for steps iterations, using the Marquardt optimization method
- learn_mlnet
- perform a number of learning cycles on a maximum likelihood network
- learn_ptron
- train a perceptron with the loaded training samples
- learn_ptron_free
- free the space allocated for perceptron learning
- learn_ptron_init
- initialization routine for perceptron training
- learn_wavnet
- train a wavelet network on a dataset
- LIBRARY-flags
- various constants influencing library operation
- LINK-flags
- flags to indicate the type of a link
- LINK
- link (unit connection) datatype
- linmin
- line minimization
- load_dataset
- read a (possibly) compressed dataset from a file
- load_network
- load a (compressed) network
- log_lhood_mlnet
- determine the log-likelihood function of all samples in a dataset
- loo_min_dist_perf
- compute the performance of the minimum distance classifier using the leave-one-out method
- lubksb
- solve a set of n linear equations
- ludcmp
- LU decomposition of a matrix
M
- mahal_multiclass_dataset
- determine the Mahalanobis coefficients for all classes in the set
- make_matrix3D
- construct a transformation matrix to display three-dimensional data
- make_source_ffnet
- write a stand-alone C program containing a network evaluation routine
- make_spcost_matrix
- compute the shortest-path distance matrix between all nodes in a SOM
- malloc_dataset
- allocate a DATASET structure
- malloc_layer
- allocate a LAYER structure from a pool
- malloc_link
- allocate a LINK structure from a pool
- malloc_map
- allocate a MAP structure from a pool
- malloc_monitor
- an SPRANNLIB replacement for the malloc routine
- malloc_net
- allocate a NET structure from a pool
- malloc_ptron
- allocate space for a perceptron and initialize
- malloc_sample
- allocate a SAMPLE structure
- malloc_unit
- allocate a UNIT structure from a pool
- malloc_unit_value
- allocate a UNIT_VALUE structure from a pool
- malloc_weight
- allocate a WEIGHT structure from a pool
- malloc_weight_value
- allocate a WEIGHT_VALUE structure from a pool
- MAP
- unit group datatype
- MAP_RECIPE
- simplified description of a map of a shared weights network
- matcopy
- copy matrix A to B
- matinv
- invert a matrix A into matrix B
- matinvd
- invert a matrix A into matrix B and return the determinant
- matmult
- multiply two matrices and store result in a new matrix
- matrix
- a Numerical Recipes in C datatype
- mattransp
- transpose matrix A to At
- matvecmult
- multiply a matrix with a vector and store the result in a new vector
- mean_var_dataset
- estimate the mean and variance of a class in a data set
- meanset_error
- compute the classification error on the meanset using a linear discriminant
- memory_monitor
- return the total amount of dynamically allocated memory
- MESSAGE-functions
- front-ends to the sprmessage function
- ML_STRUCT
- a local structure for maximum likelihood learning
- MNBRAK
- search for a bracketed minimum of a function in the downhill direction
- MONITOR-ALLOCATION-functions
- sprlib memory allocation routines replacing the original C-functions
N
P
- PAO-variables
- local variables used for Pao clustering
- PARZELM
- local structure for Parzen classification
- parzen_best_s
- determine the best smoothing value for a class
- parzen_classify
- two class Parzen classifier
- parzen_dataset_class
- classify a sample, given a dataset, using the Parzen estimator
- parzen_dset_best_s
- compute the optimal smoothing parameter for a class
- parzen_init
- initialization for the Parzen classifier
- parzen_mclass
- pattern classification using the Parzen classifier
- parzen_probability
- determine the probability of a new pattern using the Parzen classifier
- PERF-flags
- network performance calculation option flags
- perf_free
- free the performance testing facilities of a learning set
- perf_init
- initialize the performance testing facilities of a learning set
- perf_mlnet
- auxiliary function to determine the performance of a maximum likelihood network
- pn_adapt_net
- perform one learning cycle with the pseudo-Newton variation of backpropagation
- pn_adapt_unit
- adapt a unit according to the pseudo-Newton variation on the generalized delta rule
- pn_adapt_weights
- adapt weights marked ``changed'' according to the accumulated (pseudo-Newton) deltas
- pn_free
- free a network that was generated by pn_init
- pn_init
- initialize and set up a network for training with the pseudo-Newton backpropagation learning rule
- pn_learn
- perform a number of learning cycles with the pseudo-Newton learning algorithm
- pr_cum_dbl_exp
- compute the cumulative (integrated) probability function of the double exponential distribution
- pr_cum_normal
- compute the cumulative probability distribution function of a standardized normal variable
- pr_dbl_exp
- the probability density function for the double exponential distribution
- pr_error
- compute the classification error for a one-dimensional classifier
- pr_error_multi
- compute the classification error for a classifier
- pr_k_normal
- compute the probability density for a general k-dimensional normal distribution
- pr_normal
- compute the probability density of a standardized normal random variable
- PTRON-flags
- flags for perceptron learning
- PTRON
- a structure for perceptron learning
R
S
- SAMANN-flags
- SAMANN option flags
- samann_adapt_net
- perform one learning cycle using the SAMANN rule
- samann_adapt_unit
- adapt a unit according to SAMANN learning rule
- samann_free
- free a network that was generated by samann_init
- samann_init
- initialize and set up a network for training with the SAMANN backpropagation learning rule
- samann_learn
- perform a number of learning cycles with the SAMANN learning algorithm
- samann_learn_sample
- perform one learning cycle with the SAMANN learning algorithm
- samann_scale_dataset
- scale a dataset to make it suitable for SAMANN backpropagation training
- SAMMON-flags
- flags for Sammon map routines
- SAMMON-MAP-flags
- flags for Sammon map injection/extraction routines
- sammon_constant
- query the Sammon constant
- sammon_dstress
- calculate the derivative of Sammon's stress measure
- sammon_exit
- free memory allocated by sammon_init
- sammon_extract_map
- extract the current Sammon mapping in the form of a DATASET
- sammon_init
- initialize the Sammon mapping routines
- sammon_inject_map
- replace the current Sammon mapping by a DATASET
- sammon_rand
- initialize a Sammon map with random values
- sammon_solve
- minimize Sammon's stress measure
- sammon_stress
- calculate Sammon's stress measure
- sammon_triangulate_set
- map a dataset using an existing Sammon mapping
- SAMPLE-flags
- enabling or disabling a data sample
- SAMPLE
- structure holding information about a specific data point
- sample_distance
- compute the Euclidean distance between two SPRANNLIB samples
- sample_mse
- calculate the MSE of one sample and the corresponding network output
- sample_perf
- compute the performance of a network on a sample
- scale_output_dataset
- scale one output element of an IOSET dataset
- scawi_ffnet
- initialize a feedforward neural net using the SCAWI method
- set_auto_priority
- install a signal handler which can renice the process
- set_user_signal
- set the user-definable signal SIGUSR1 to a user-defined function
- SHAREDNET_RECIPE
- simplified description of a shared weights network
- som_goodness
- compute the goodness-of-fit between a trained SOM and a dataset
- SOUND-flags
- some flags for playing sound samples
- SOUND-variables
- some variables used in the sound sample routines
- sound_sample
- play a sound sample to the audio device
- sprexit
- terminate the use of SPRANNLIB
- sprinit
- initialize SPRANNLIB and print version and copyright information
- sprmessage
- a general stdout error message / warning function
- STATISTICS
- data structure holding all statistics about a data set
- status_layer
- print a LAYER structure's contents to a stream
- status_link
- print a LINK structure's contents to a stream
- status_map
- print a MAP structure's contents to a stream
- status_monitor
- print a memory allocation status report
- status_net
- print a NET structure's contents to a stream
- status_total
- print a network's contents to a stream
- status_unit
- print a UNIT structure's contents to a stream
- status_weight
- print a WEIGHT structure's contents to a stream
- std_mean_dataset
- generate samples according to the mean dataset
- std_var_dataset
- generate samples according to the var dataset
- store_hist_all_thetas
- store the current values of the thetas in the linked lists of their history
- store_hist_all_units
- store the current unit values in the linked list of their history
- store_hist_all_weights
- store the current weight values in the linked lists of their history
- store_hist_theta
- store the current value of theta in a newly allocated structure in the linked list of values
- store_hist_unit
- store the current state of a unit in a newly allocated structure in the linked list of unit states
- store_hist_weight
- store the current weight value in a newly allocated structure in the linked list of weight values
- subspace_dataset
- create a new dataset from another dataset with a higher dimension
- svdcmp
- perform a singular value decomposition of a matrix
- system_data
- a union used to store external data
- system_time
- get information about used CPU time
T
U
- unif_fanin_rand_net
- randomize a network's weights with uniformly distributed samples
- unif_jogg_net
- randomly jog a network's weights with uniformly distributed samples
- unif_rand_net
- randomize a network's weights with uniformly distributed samples
- UNIT-flags
- flags to indicate the type of a unit
- UNIT
- unit datatype
- UNIT_VALUE
- unit value datatype
V
W
This index was automatically generated by api2html on Thu Mar 5 09:00:00 MET DST 1998