org.encog.engine.network.activation

Interface ActivationFunction

  • All Superinterfaces:
    Cloneable, Serializable
  • All Known Implementing Classes:
    ActivationBiPolar, ActivationCompetitive, ActivationGaussian, ActivationLinear, ActivationLOG, ActivationRamp, ActivationSigmoid, ActivationSIN, ActivationSoftMax, ActivationStep, ActivationTANH


    public interface ActivationFunction extends Serializable, Cloneable
    This interface allows various activation functions to be used with the neural network. Activation functions are applied to the output from each layer of a neural network and scale that output into the desired range. Methods are provided both to process the activation function and to compute its derivative. Some training algorithms, particularly backpropagation, require that the derivative of the activation function be available. Not all activation functions support derivatives. If you implement an activation function that is not derivable, an exception should be thrown inside the derivativeFunction method implementation. Non-derivable activation functions are perfectly valid; they simply cannot be used with every training algorithm.
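    For illustration, a minimal sketch of a non-derivable implementation is shown below. The class name and exception type are assumptions for the sketch, and only the methods documented on this page are implemented; the real interface may declare additional methods (such as clone()).

        import org.encog.engine.network.activation.ActivationFunction;

        // Hypothetical step activation with no usable derivative.
        public class ActivationHardStep implements ActivationFunction {

            @Override
            public void activationFunction(final double[] d, final int start, final int size) {
                // Threshold each value in d[start .. start + size - 1] in place.
                for (int i = start; i < start + size; i++) {
                    d[i] = (d[i] >= 0.0) ? 1.0 : 0.0;
                }
            }

            @Override
            public double derivativeFunction(final double b, final double a) {
                // Not derivable: signal this rather than return a bogus value.
                throw new UnsupportedOperationException(
                        "This activation function has no derivative.");
            }

            @Override
            public boolean hasDerivative() {
                return false;
            }

            @Override
            public double[] getParams() {
                return new double[0]; // no tunable parameters
            }

            @Override
            public void setParam(final int index, final double value) {
                // no parameters to set
            }

            @Override
            public String[] getParamNames() {
                return new String[0];
            }
        }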
    • Method Detail

      • activationFunction

        void activationFunction(double[] d, int start, int size)
        Implements the activation function. The array is modified according to the activation function being used. See the class description for more specific information on this type of activation function.
        Parameters:
        d - The input array to the activation function.
        start - The starting index.
        size - The number of values to calculate.
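        As a sketch, a logistic-sigmoid implementation of this method might process the slice in place as follows (illustrative only, not the actual ActivationSigmoid source):

            @Override
            public void activationFunction(final double[] d, final int start, final int size) {
                // Apply the logistic sigmoid to d[start .. start + size - 1] in place.
                for (int i = start; i < start + size; i++) {
                    d[i] = 1.0 / (1.0 + Math.exp(-d[i]));
                }
            }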
      • derivativeFunction

        double derivativeFunction(double b, double a)
        Calculate the derivative. For performance reasons, two numbers are provided. The value "b" is the number whose derivative we want to calculate, and the value "a" is the value returned by the activation function when presented with "b". Two values are passed because some of the most common activation functions can express their derivative in terms of the activation result, and it would be wasteful to compute that result twice. Not all derivatives are calculated this way, however. By providing both the value before the activation function is applied ("b") and the value after it is applied ("a"), an implementation can use whichever value is most efficient.
        Parameters:
        b - The number to calculate the derivative of; the value "before" the activation function was applied.
        a - The number "after" an activation function has been applied.
        Returns:
        The derivative.
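        For the logistic sigmoid, for example, the derivative can be written entirely in terms of the activated value, which is exactly why "a" is passed in. A sketch (illustrative only):

            @Override
            public double derivativeFunction(final double b, final double a) {
                // Sigmoid derivative: s'(x) = s(x) * (1 - s(x)).
                // "a" already holds s(b), so Math.exp need not be called again.
                return a * (1.0 - a);
            }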
      • hasDerivative

        boolean hasDerivative()
        Returns:
        True if this function has a derivative.
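        A caller that needs derivatives can use this method as a guard before derivative-based training; a hypothetical sketch:

            // Hypothetical guard in derivative-based training code.
            if (!activation.hasDerivative()) {
                throw new IllegalArgumentException(
                        "This trainer requires a derivable activation function: "
                        + activation.getClass().getSimpleName());
            }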
      • getParams

        double[] getParams()
        Returns:
        The params for this activation function.
      • setParam

        void setParam(int index, double value)
        Set one of the params for this activation function.
        Parameters:
        index - The index of the param to set.
        value - The value to set.
      • getParamNames

        String[] getParamNames()
        Returns:
        The names of the parameters.
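        Taken together, getParams, setParam, and getParamNames let generic code inspect and tune an activation function without knowing its concrete type. A sketch of a hypothetical single-parameter (slope) activation, followed by a generic caller:

            // Inside a hypothetical parameterized activation class:
            private final double[] params = { 1.0 }; // params[0] = slope

            @Override
            public double[] getParams() {
                return this.params;
            }

            @Override
            public void setParam(final int index, final double value) {
                this.params[index] = value;
            }

            @Override
            public String[] getParamNames() {
                return new String[] { "slope" };
            }

            // Generic caller: list and adjust parameters by name.
            String[] names = fn.getParamNames();
            for (int i = 0; i < names.length; i++) {
                System.out.println(names[i] + " = " + fn.getParams()[i]);
                fn.setParam(i, fn.getParams()[i] * 2.0); // e.g. double the slope
            }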
