public class Backpropagation extends Propagation implements Momentum, LearningRate

This class implements the backpropagation training algorithm for feedforward neural networks. It is used in the same manner as any other training class that implements the Train interface. Backpropagation is a common neural network training algorithm. It works by analyzing the error of the neural network's output. Each output-layer neuron's contribution to this error, according to its weight, is determined; these weights are then adjusted to minimize the error. The process continues working its way backwards through the layers of the neural network.

This implementation of the backpropagation algorithm uses both momentum and a learning rate. The learning rate specifies the degree to which the weight matrices will be modified through each iteration. The momentum specifies how much the previous learning iteration affects the current one; to use no momentum at all, specify zero.

One primary problem with backpropagation is that the magnitude of the partial derivative is often detrimental to the training of the neural network. The other propagation methods, Manhattan and Resilient, address this issue in different ways. In general, the resilient propagation technique is suggested over backpropagation for most Encog training tasks.
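As an illustration of typical usage, the following sketch trains a small XOR network with this class. It assumes Encog 3.x is on the classpath; the supporting classes (BasicNetwork, BasicLayer, ActivationSigmoid, BasicMLDataSet) come from Encog itself, not from this page, and the learning rate and momentum values are illustrative only.

```java
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
import org.encog.neural.networks.training.propagation.back.Backpropagation;

public class BackpropExample {
    public static void main(String[] args) {
        double[][] input = { {0, 0}, {1, 0}, {0, 1}, {1, 1} };
        double[][] ideal = { {0}, {1}, {1}, {0} };

        // Build a simple 2-3-1 feedforward network.
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 2));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
        network.getStructure().finalizeStructure();
        network.reset();

        MLDataSet training = new BasicMLDataSet(input, ideal);

        // Learning rate 0.7 and momentum 0.3 are illustrative values.
        Backpropagation train = new Backpropagation(network, training, 0.7, 0.3);

        int epoch = 0;
        do {
            train.iteration();
            epoch++;
        } while (train.getError() > 0.01 && epoch < 5000);
        train.finishTraining();
        System.out.println("Epochs: " + epoch + ", error: " + train.getError());
    }
}
```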
Fields

LAST_DELTA - The resume key for backpropagation.
Constructors

Backpropagation(ContainsFlat network, MLDataSet training) - Create a class to train using backpropagation.
Backpropagation(ContainsFlat network, MLDataSet training, double theLearnRate, double theMomentum) - Create a class to train using backpropagation with the specified learning rate and momentum.
Methods

initOthers() - Perform training-method-specific initialization.
isValidResume(TrainingContinuation state) - Determine if the specified continuation object is valid to resume with.
pause() - Pause the training.
resume(TrainingContinuation state) - Resume training.
setLearningRate(double rate) - Set the learning rate; this value is essentially a percent.
setMomentum(double m) - Set the momentum for training.
updateWeight(double gradients, double lastGradient, int index) - Update a weight.
Methods inherited from class org.encog.neural.networks.training.propagation.Propagation
calculateGradients, finishTraining, fixFlatSpot, getBatchSize, getCurrentFlatNetwork, getLastGradient, getMethod, getThreadCount, iteration, iteration, report, rollIteration, setBatchSize, setErrorFunction, setThreadCount
Methods inherited from class org.encog.ml.train.BasicTraining
addStrategy, getError, getImplementationType, getIteration, getStrategies, getTraining, isTrainingDone, postIteration, preIteration, setError, setIteration, setTraining
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
public Backpropagation(ContainsFlat network, MLDataSet training) - Create a class to train using backpropagation. An automatic learning rate and momentum are used, and training runs on the CPU.

network - The network that is to be trained.
training - The training data to be used for backpropagation.
public Backpropagation(ContainsFlat network, MLDataSet training, double theLearnRate, double theMomentum) - Create a class to train using backpropagation with the specified learning rate and momentum.

network - The network that is to be trained.
training - The training set.
theLearnRate - The rate at which the weight matrix will be adjusted based on learning.
theMomentum - The influence that the previous iteration's training deltas will have on the current iteration.
public boolean canContinue()
public double getLastDelta()
- The last delta values.
public double getLearningRate()
public double getMomentum()
public boolean isValidResume(TrainingContinuation state) - Determine if the specified continuation object is valid to resume with.

state - The continuation object to check.
- True if the specified continuation object is valid for this training method and network.
public TrainingContinuation pause() - Pause the training.
public void resume(TrainingContinuation state) - Resume training.
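The pause/isValidResume/resume contract can be sketched with a self-contained stand-in. The ContinuationState and PauseResumeSketch classes below are hypothetical illustrations, not Encog classes: pause() captures the last weight deltas under the LAST_DELTA key, isValidResume() checks that a continuation object carries compatible state, and resume() restores it.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal stand-in for a continuation object: a keyed bag of
// training state. Encog stores the last weight deltas under a LAST_DELTA key.
class ContinuationState {
    static final String LAST_DELTA = "LAST_DELTA";
    private final Map<String, Object> contents = new HashMap<>();
    void put(String key, Object value) { contents.put(key, value); }
    Object get(String key) { return contents.get(key); }
    boolean has(String key) { return contents.containsKey(key); }
}

// Illustrative sketch of the pause/isValidResume/resume round trip.
class PauseResumeSketch {
    private double[] lastDelta = {0.1, -0.2, 0.05}; // one delta per weight

    ContinuationState pause() {
        ContinuationState state = new ContinuationState();
        state.put(ContinuationState.LAST_DELTA, lastDelta.clone());
        return state;
    }

    boolean isValidResume(ContinuationState state) {
        // Valid only if the state carries the per-weight delta array
        // and its length matches this network's weight count.
        return state.has(ContinuationState.LAST_DELTA)
            && ((double[]) state.get(ContinuationState.LAST_DELTA)).length == lastDelta.length;
    }

    void resume(ContinuationState state) {
        lastDelta = (double[]) state.get(ContinuationState.LAST_DELTA);
    }
}
```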
public void setLearningRate(double rate) - Set the learning rate; this value is essentially a percent. It is the degree to which the gradients are applied to the weight matrix to allow learning.
public void setMomentum(double m) - Set the momentum for training. This is the degree to which changes from the previous training iteration will affect this training iteration. This can be useful for overcoming local minima.
public double updateWeight(double gradients, double lastGradient, int index) - Update a weight.
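As described in the class overview, each weight update combines the scaled gradient with a fraction of the previous iteration's delta: delta = learningRate * gradient + momentum * lastDelta. A self-contained sketch of that classic rule follows; the class and field names are illustrative, not Encog source.

```java
// Minimal sketch of a backpropagation weight update with momentum.
// learningRate scales the gradient; momentum carries over part of the
// previous iteration's delta for the same weight. Illustrative only.
class BackpropUpdateSketch {
    final double learningRate;
    final double momentum;
    final double[] lastDelta; // one entry per weight

    BackpropUpdateSketch(double learningRate, double momentum, int weightCount) {
        this.learningRate = learningRate;
        this.momentum = momentum;
        this.lastDelta = new double[weightCount];
    }

    // Returns the delta to add to weight `index` for this iteration.
    double updateWeight(double gradient, int index) {
        double delta = learningRate * gradient + momentum * lastDelta[index];
        lastDelta[index] = delta;
        return delta;
    }
}
```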
SCaVis 2.0 © jWork.ORG