Constant Index
E
 ERRORFUNC_LINEAR, FANNCSharp.ErrorFunction
 ERRORFUNC_TANH, FANNCSharp.ErrorFunction
F
 FANN_COS_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_ELLIOT, FANNCSharp.ActivationFunction
 FANN_ELLIOT_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_GAUSSIAN, FANNCSharp.ActivationFunction
 FANN_GAUSSIAN_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_LINEAR, FANNCSharp.ActivationFunction
 FANN_LINEAR_PIECE, FANNCSharp.ActivationFunction
 FANN_LINEAR_PIECE_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_SIGMOID, FANNCSharp.ActivationFunction
 FANN_SIGMOID_STEPWISE, FANNCSharp.ActivationFunction
 FANN_SIGMOID_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_SIN_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_THRESHOLD, FANNCSharp.ActivationFunction
 FANN_THRESHOLD_SYMMETRIC, FANNCSharp.ActivationFunction
 FANN_TRAIN_SARPROP, FANNCSharp.TrainingAlgorithm
L
 LAYER, FANNCSharp.NetworkType
S
 SHORTCUT, FANNCSharp.NetworkType
 STOPFUNC_BIT, FANNCSharp.StopFunction
 STOPFUNC_MSE, FANNCSharp.StopFunction
T
 TRAIN_BATCH, FANNCSharp.TrainingAlgorithm
 TRAIN_INCREMENTAL, FANNCSharp.TrainingAlgorithm
 TRAIN_QUICKPROP, FANNCSharp.TrainingAlgorithm
 TRAIN_RPROP, FANNCSharp.TrainingAlgorithm
Standard linear error function.
Tanh error function, usually better than the linear error function but may require a lower learning rate.
Periodic cosine activation function.
Fast (sigmoid-like) activation function defined by David Elliott.
Fast (symmetric sigmoid-like) activation function defined by David Elliott.
Gaussian activation function.
Symmetric Gaussian activation function.
Linear activation function.
Bounded linear activation function.
Symmetric bounded linear activation function.
Sigmoid activation function.
Stepwise linear approximation to sigmoid.
Symmetric sigmoid activation function, aka. tanh.
Periodic sine activation function.
Threshold activation function.
Symmetric threshold activation function.
The SARPROP algorithm: a simulated annealing enhancement to resilient back propagation.  http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8197&rep=rep1&type=pdf
Each layer only has connections to the next layer.
Each layer has connections to all following layers.
Stop criterion is the number of bits that fail.
Stop criterion is the mean square error (MSE) value.
Standard backpropagation algorithm, where the weights are updated after calculating the mean square error for the whole training set.
Standard backpropagation algorithm, where the weights are updated after each training pattern.
A more advanced batch training algorithm (Quickprop) which achieves good results for many problems.
A more advanced batch training algorithm (RPROP) which achieves good results for many problems.
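The constants above are typically passed to a FANNCSharp network during setup. A minimal sketch, assuming the FANNCSharp.Float.NeuralNet wrapper and its documented members (the constructor signature, property names, and `SetActivationFunction*` methods are taken from the FANNCSharp documentation, not verified here; a native fannfloat library must be available at runtime):

```csharp
using FANNCSharp;
using FANNCSharp.Float;

class EnumUsageSketch
{
    static void Main()
    {
        // A fully connected (LAYER) network with layer sizes 2, 3, 1.
        using (NeuralNet net = new NeuralNet(NetworkType.LAYER, 3, 2, 3, 1))
        {
            // Training algorithm, error function, and stop criterion
            // are plain properties taking the enums indexed above.
            net.TrainingAlgorithm = TrainingAlgorithm.TRAIN_RPROP;
            net.TrainErrorFunction = ErrorFunction.ERRORFUNC_TANH;
            net.TrainStopFunction = StopFunction.STOPFUNC_MSE;

            // Activation functions are set per layer group.
            net.SetActivationFunctionHidden(ActivationFunction.FANN_SIGMOID_SYMMETRIC);
            net.SetActivationFunctionOutput(ActivationFunction.FANN_SIGMOID_SYMMETRIC);
        }
    }
}
```

A SHORTCUT network would be built the same way with `NetworkType.SHORTCUT`, giving each layer connections to all following layers.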