- All Known Implementing Classes:
- ActivationBiPolar, ActivationBipolarSteepenedSigmoid, ActivationClippedLinear, ActivationCompetitive, ActivationElliott, ActivationElliottSymmetric, ActivationGaussian, ActivationLinear, ActivationLOG, ActivationRamp, ActivationSigmoid, ActivationSIN, ActivationSoftMax, ActivationSteepenedSigmoid, ActivationStep, ActivationTANH
public interface ActivationFunction extends Serializable, Cloneable
This interface allows various activation functions to be used with the neural network. Activation functions are applied to the output from each layer of a neural network; they scale that output into the desired range. Methods are provided both to apply the activation function and to take its derivative. Some training algorithms, particularly backpropagation, require that the derivative of the activation function be available. Not all activation functions support derivatives: if you implement an activation function that is not differentiable, its derivativeFunction method should throw an exception. Non-differentiable activation functions are perfectly valid; they simply cannot be used with every training algorithm.
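As a hedged illustration of this contract, here is a minimal standalone sketch using the standard logistic sigmoid. It mirrors the method shapes described on this page but does not import the actual Encog interface; a real class would declare "implements ActivationFunction" and also provide the remaining methods (clone, params, factory code).

```java
// Sketch of an activation function following the contract described above.
// Standalone for illustration; not the real Encog implementation.
public class SketchSigmoid {

    // Apply f(x) = 1 / (1 + e^-x) in place over d[start .. start+size-1].
    public void activationFunction(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = 1.0 / (1.0 + Math.exp(-d[i]));
        }
    }

    // For the sigmoid, the derivative can be computed entirely from the
    // post-activation value "a": f'(x) = a * (1 - a).
    public double derivativeFunction(double b, double a) {
        return a * (1.0 - a);
    }

    // The sigmoid is differentiable everywhere, so training algorithms
    // such as backpropagation can use it.
    public boolean hasDerivative() {
        return true;
    }
}
```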
Methods:
- void activationFunction(double[] d, int start, int size): Implements the activation function.
- double derivativeFunction(double b, double a): Calculates the derivative.
- void setParam(int index, double value): Sets one of the params for this activation function.
void activationFunction(double[] d, int start, int size)
Implements the activation function. The array is modified in place according to the activation function being used. See the class description for more specific information on this type of activation function.
d - The input array to the activation function.
start - The starting index.
size - The number of values to calculate.
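The start/size pair lets a caller transform only part of a larger array, for example the slots belonging to a single layer inside a flat network array. A sketch of that slice contract, using a simple hard-limit step as the example activation (the helper name is illustrative, not from Encog):

```java
// Illustrates the in-place slice contract of activationFunction:
// only elements d[start] .. d[start+size-1] are modified.
public class SliceDemo {
    // A simple hard-limit step used as the example activation:
    // f(x) = 1 if x >= 0, else 0.
    static void stepActivation(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = (d[i] >= 0.0) ? 1.0 : 0.0;
        }
    }

    public static void main(String[] args) {
        double[] d = {-5.0, -1.0, 2.0, 7.0};
        stepActivation(d, 1, 2); // transform only indices 1 and 2
        // d is now {-5.0, 0.0, 1.0, 7.0}: the ends are untouched.
        System.out.println(java.util.Arrays.toString(d));
    }
}
```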
double derivativeFunction(double b, double a)
Calculate the derivative. For performance reasons two numbers are provided. The value "b" is the number whose derivative we want to calculate; the value "a" is what the activation function returned when presented with "b". Both are passed because some of the most common activation functions compute their derivative from the result of the activation function itself, and it would be wasteful to calculate that result twice. Not all derivatives are calculated this way, however. By providing both the value before the activation function is applied ("b") and the value after it is applied ("a"), an implementation can use whichever is more efficient.
b - The number to calculate the derivative of, the number "before" the activation function was applied.
a - The number "after" an activation function has been applied.
- The derivative.
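A short sketch of why both arguments are useful (standalone helpers for illustration, not the actual Encog classes): the sigmoid's derivative depends only on the post-activation value "a", so reusing "a" avoids recomputing the exponential, while a sine activation's derivative depends only on the pre-activation value "b".

```java
// Why derivativeFunction receives both "b" (pre-activation) and "a"
// (post-activation): different functions prefer different inputs.
public class DerivativeDemo {
    // Sigmoid: f'(b) = a * (1 - a). Needs only "a", so the forward
    // result is reused instead of recomputing exp().
    static double sigmoidDerivative(double b, double a) {
        return a * (1.0 - a);
    }

    // Sine: f'(b) = cos(b). Needs only "b"; "a" = sin(b) is no help here.
    static double sinDerivative(double b, double a) {
        return Math.cos(b);
    }
}
```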
- Return true if this function has a derivative.
- The params for this activation function.
void setParam(int index, double value)Set one of the params for this activation function.
index - The index of the param to set.
value - The value to set.
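Several of the implementing classes listed above expose tunable parameters (slopes, thresholds, and so on). A sketch of how a params array and setParam might interact, using a scaled linear function f(x) = slope * x; the field layout and names here are illustrative, not copied from a real Encog class:

```java
// Sketch of a parameterized activation: f(x) = slope * x, with the
// slope stored in a params array that setParam(index, value) updates.
public class ParamDemo {
    private final double[] params = {1.0}; // params[0] = slope

    public void setParam(int index, double value) {
        params[index] = value;
    }

    public double[] getParams() {
        return params;
    }

    // Apply f(x) = params[0] * x in place over the given slice.
    public void activationFunction(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = params[0] * d[i];
        }
    }
}
```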
- The names of the parameters.
- A cloned copy of this activation function.
- The string for the Encog factory code. Return null if you do not wish to support creating your activation function through a factory.
SCaVis 2.0 © jWork.ORG