Class ResilientPropagation

  • All Implemented Interfaces:
    MLTrain, BatchSize, Train, MultiThreadable

    public class ResilientPropagation extends Propagation
    One problem with the backpropagation algorithm is that the magnitude of the partial derivative is usually too large or too small. Further, the learning rate is a single value for the entire neural network. The resilient propagation learning algorithm instead uses a separate update value (similar to a learning rate) for every neuron connection, and these update values are determined automatically, unlike the learning rate of the backpropagation algorithm. For most training situations, we suggest that the resilient propagation algorithm (this class) be used for training.
    There are a total of three parameters that may be provided to the resilient training algorithm. Defaults are provided for each, and in nearly all cases these defaults are acceptable. This makes the resilient propagation algorithm one of the easiest and most efficient training algorithms available. The optional parameters are:
    • zeroTolerance - How close a number must be to zero to be considered zero. The default is 0.00000000000000001.
    • initialUpdate - The initial update value for each matrix value. The default is 0.1.
    • maxStep - The largest amount that an update value can step. The default is 50.
    Usually you will not need to set these, and you should use the constructor that does not require them.
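To make the role of the three parameters concrete, here is a minimal standalone sketch of the per-weight RPROP update rule described above. This is hypothetical illustration code, not the Encog implementation: the class name `RpropSketch` and the growth/shrink factors 1.2 and 0.5 (standard values from the RPROP literature) are assumptions, while `ZERO_TOLERANCE`, `INITIAL_UPDATE`, and `MAX_STEP` use the defaults stated in this documentation.

```java
import java.util.Arrays;

// Hypothetical sketch of the per-connection RPROP update rule.
// Each weight carries its own update value, grown while the gradient
// keeps its sign and shrunk when the sign flips.
class RpropSketch {
    static final double ZERO_TOLERANCE = 1e-17; // default zeroTolerance
    static final double INITIAL_UPDATE = 0.1;   // default initialUpdate
    static final double MAX_STEP = 50;          // default maxStep
    static final double ETA_PLUS = 1.2;         // assumed growth factor
    static final double ETA_MINUS = 0.5;        // assumed shrink factor

    final double[] updateValues;
    final double[] lastGradient;

    RpropSketch(int weightCount) {
        updateValues = new double[weightCount];
        lastGradient = new double[weightCount];
        Arrays.fill(updateValues, INITIAL_UPDATE);
    }

    // Sign with a tolerance: values within zeroTolerance count as zero.
    static int sign(double v) {
        if (Math.abs(v) < ZERO_TOLERANCE) return 0;
        return v > 0 ? 1 : -1;
    }

    // Returns the amount to change one weight by, given its current gradient.
    double updateWeight(double gradient, int i) {
        int change = sign(gradient * lastGradient[i]);
        double delta;
        if (change > 0) {
            // Same gradient sign as last step: accelerate, capped at maxStep.
            updateValues[i] = Math.min(updateValues[i] * ETA_PLUS, MAX_STEP);
            delta = sign(gradient) * updateValues[i];
            lastGradient[i] = gradient;
        } else if (change < 0) {
            // Sign flipped: a minimum was overshot, so shrink the step
            // and skip this update (simplified backtracking).
            updateValues[i] = updateValues[i] * ETA_MINUS;
            delta = 0;
            lastGradient[i] = 0; // avoid shrinking twice in a row
        } else {
            // First step, or a gradient within the zero tolerance.
            delta = sign(gradient) * updateValues[i];
            lastGradient[i] = gradient;
        }
        return delta;
    }
}
```

Note how only the sign of the gradient is used to choose the step direction, which is why RPROP is insensitive to the magnitude problem described above.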
    • Constructor Detail

      • ResilientPropagation

        public ResilientPropagation(ContainsFlat network, MLDataSet training)
        Construct an RPROP trainer using the defaults for all training parameters. Usually this is the constructor to use, as the resilient training algorithm is designed so that the default parameters are acceptable for nearly all problems.
        network - The network to train.
        training - The training data to use.
      • ResilientPropagation

        public ResilientPropagation(ContainsFlat network, MLDataSet training, double initialUpdate, double maxStep)
        Construct a resilient training object, allowing the training parameters to be specified. Usually the default parameters are acceptable for the resilient training algorithm, so you should normally use the other constructor, which uses the default values.
        network - The network to train.
        training - The training set to use.
        initialUpdate - The initial update values; the amount to which all the deltas are initially set.
        maxStep - The maximum that a delta can reach.
    • Method Detail

      • canContinue

        public boolean canContinue()
        True, as RPROP can continue.
      • isValidResume

        public boolean isValidResume(TrainingContinuation state)
        Determine if the specified continuation object is valid to resume with.
        state - The continuation object to check.
        True if the specified continuation object is valid for this training method and network.
      • pause

        public TrainingContinuation pause()
        Pause the training.
        A training continuation object to continue with.
      • resume

        public void resume(TrainingContinuation state)
        Resume training.
        state - The training state to return to.
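The pause/resume contract above can be sketched as follows. This is hypothetical illustration code, not the Encog API: the names `TrainerSketch`, `pause`, and `resume` mirror the methods documented here, and the assumption (consistent with the RPROP algorithm) is that the continuation state consists of the per-weight update values and last gradients.

```java
// Hypothetical sketch of the pause/resume pattern: pause() snapshots the
// trainer's per-weight state, and resume() restores it so a second trainer
// continues exactly where the first left off.
class TrainerSketch {
    double[] updateValues = {0.1, 0.1};
    double[] lastGradients = {0.0, 0.0};

    // Capture the state needed to continue training later.
    double[][] pause() {
        return new double[][] { updateValues.clone(), lastGradients.clone() };
    }

    // Restore a previously captured state.
    void resume(double[][] state) {
        updateValues = state[0].clone();
        lastGradients = state[1].clone();
    }
}
```

Because the snapshot is a plain data object, it can be serialized and used to resume training in a later session, which is the point of `TrainingContinuation`.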
      • setRPROPType

        public void setRPROPType(RPROPType t)
        Set the type of RPROP to use. The default is RPROPp (RPROP+), the classic RPROP algorithm.
        t - The type.
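The RPROP variants differ mainly in how they react when the gradient changes sign. The sketch below contrasts that step for three variants as commonly described in the RPROP literature; the names `Variant`, `SignFlipSketch`, and `onSignFlip` are hypothetical and not part of the Encog API.

```java
// Hypothetical sketch of how RPROP variants handle a gradient sign flip:
//   RPROPp (RPROP+)  : always revert the previous weight change
//   iRPROPp (iRPROP+): revert only if the training error increased
//   RPROPm (RPROP-)  : never revert, only shrink the update value
enum Variant { RPROPp, iRPROPp, RPROPm }

class SignFlipSketch {
    // Returns the corrective weight change applied after a sign flip.
    static double onSignFlip(Variant v, double lastDelta, boolean errorIncreased) {
        switch (v) {
            case RPROPp:  return -lastDelta;                      // always backtrack
            case iRPROPp: return errorIncreased ? -lastDelta : 0; // conditional backtrack
            default:      return 0;                               // RPROP-: no backtracking
        }
    }
}
```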
      • getRPROPType

        public RPROPType getRPROPType()
        The type of RPROP used.
      • initOthers

        public void initOthers()
        Perform training method specific init.
        Specified by:
        initOthers in class Propagation
      • updateWeight

        public double updateWeight(double[] gradients, double[] lastGradient, int index)
        Calculate the amount to change the weight by.
        Specified by:
        updateWeight in class Propagation
        gradients - The gradients.
        lastGradient - The last gradients.
        index - The index to update.
        The amount to change the weight by.
      • updateWeightPlus

        public double updateWeightPlus(double[] gradients, double[] lastGradient, int index)
        Calculate the weight change using the RPROP+ update rule.
      • updateWeightMinus

        public double updateWeightMinus(double[] gradients, double[] lastGradient, int index)
        Calculate the weight change using the RPROP- update rule.
      • updateiWeightPlus

        public double updateiWeightPlus(double[] gradients, double[] lastGradient, int index)
        Calculate the weight change using the iRPROP+ update rule.
      • updateiWeightMinus

        public double updateiWeightMinus(double[] gradients, double[] lastGradient, int index)
        Calculate the weight change using the iRPROP- update rule.
      • getUpdateValues

        public double[] getUpdateValues()
        The RPROP update values.
      • setBatchSize

        public void setBatchSize(int theBatchSize)
        Batch sizes other than 0 (full-batch training) are not supported.
        Specified by:
        setBatchSize in interface BatchSize
        setBatchSize in class Propagation
        theBatchSize - The batch size.

SCaVis 2.0 © jWork.ORG