API documentation of the 'org.encog.engine.network.activation.ActivationFunction' Java class
org.encog.engine.network.activation

Interface ActivationFunction

  • All Superinterfaces:
    Cloneable, Serializable
  • All Known Implementing Classes:
    ActivationBiPolar, ActivationCompetitive, ActivationGaussian, ActivationLinear, ActivationLOG, ActivationRamp, ActivationSigmoid, ActivationSIN, ActivationSoftMax, ActivationStep, ActivationTANH


    public interface ActivationFunction extends Serializable, Cloneable
    This interface allows various activation functions to be used with the neural network. Activation functions are applied to the output from each layer of a neural network and scale that output into the desired range. Methods are provided both to process the activation function and to compute its derivative. Some training algorithms, particularly back propagation, require that it be possible to take the derivative of the activation function. Not all activation functions support derivatives: if you implement an activation function that is not derivable, an exception should be thrown inside the derivativeFunction method implementation. Non-derivable activation functions are perfectly valid; they simply cannot be used with every training algorithm.
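
    As a rough illustration of this contract, the sketch below implements a hypothetical non-derivable threshold activation. Only the derivativeFunction method is named on this page; the remaining method names and signatures (activationFunction, hasDerivative, getParams, getParamNames, setParam, getFactoryCode, clone) are assumptions based on the Encog 3 interface and should be verified against the full API documentation.

        import org.encog.engine.network.activation.ActivationFunction;

        // Hypothetical example class; the overridden signatures are assumed
        // from the Encog 3 interface, not taken from this page.
        public class ActivationThreshold implements ActivationFunction {

            // Apply the threshold in place to 'size' values starting at 'start'.
            @Override
            public void activationFunction(final double[] d, final int start, final int size) {
                for (int i = start; i < start + size; i++) {
                    d[i] = (d[i] >= 0.0) ? 1.0 : 0.0;
                }
            }

            // A hard threshold is not derivable, so, as the description above
            // suggests, the implementation throws rather than returning a value.
            @Override
            public double derivativeFunction(final double b, final double a) {
                throw new UnsupportedOperationException(
                        "ActivationThreshold has no derivative.");
            }

            // Lets derivative-based trainers such as back propagation reject
            // this function up front instead of calling derivativeFunction.
            @Override
            public boolean hasDerivative() {
                return false;
            }

            // This activation function has no tunable parameters.
            @Override
            public double[] getParams() {
                return new double[0];
            }

            @Override
            public String[] getParamNames() {
                return new String[0];
            }

            @Override
            public void setParam(final int index, final double value) {
                // Nothing to set.
            }

            @Override
            public String getFactoryCode() {
                return null; // Not registered with an Encog factory in this sketch.
            }

            @Override
            public ActivationFunction clone() {
                return new ActivationThreshold();
            }
        }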

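    A short usage sketch with ActivationSigmoid, one of the implementing classes listed above. The in-place activationFunction(double[] d, int start, int size) call, the hasDerivative check, and the (before, after) argument order of derivativeFunction are likewise assumptions based on the Encog 3 interface; confirm them against the full documentation.

        import org.encog.engine.network.activation.ActivationFunction;
        import org.encog.engine.network.activation.ActivationSigmoid;

        public class ActivationDemo {
            public static void main(final String[] args) {
                final ActivationFunction fn = new ActivationSigmoid();

                // The activation is applied in place to a slice of the array,
                // scaling each raw layer output into the sigmoid's (0, 1) range.
                final double[] values = { -2.0, 0.0, 2.0 };
                fn.activationFunction(values, 0, values.length);
                for (final double v : values) {
                    System.out.println(v);
                }

                // Derivative-based training (e.g. back propagation) should
                // check hasDerivative() before asking for the derivative.
                if (fn.hasDerivative()) {
                    // Assumed argument order: b = value before activation,
                    // a = value after activation; for the sigmoid the result
                    // is a * (1 - a).
                    final double derivative = fn.derivativeFunction(2.0, values[2]);
                    System.out.println(derivative);
                }
            }
        }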