**org.encog.neural.networks.training.propagation.manhattan.ManhattanPropagation** Java class

org.encog.neural.networks.training.propagation.manhattan

## Class ManhattanPropagation

- java.lang.Object
- org.encog.ml.train.BasicTraining
- org.encog.neural.networks.training.propagation.Propagation
- org.encog.neural.networks.training.propagation.manhattan.ManhattanPropagation

- All Implemented Interfaces:
- MLTrain, LearningRate, Train

public class ManhattanPropagation extends Propagation implements LearningRate

One problem with the backpropagation technique is that the magnitude of the partial derivative may be too large or too small. The Manhattan update algorithm addresses this by using the partial derivative only to indicate the sign of the weight update; the actual amount added to or subtracted from each weight is a simple constant. This constant must be tuned to the type of neural network being trained; in general, start with a higher constant and decrease it as needed. The Manhattan update algorithm can be thought of as a simplified version of the resilient propagation (RPROP) algorithm, which uses more sophisticated techniques to determine each update value.
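The update rule described above can be sketched in a few lines. This is an illustrative, self-contained example, not the Encog implementation; the class and method names (`ManhattanStep`, `update`) are invented for this sketch:

```java
public class ManhattanStep {
    /**
     * Applies one Manhattan update to a single weight: only the sign of the
     * gradient is used, and the step size is a fixed constant (the learning
     * rate). A zero gradient leaves the weight unchanged.
     */
    public static double update(double weight, double gradient, double stepSize) {
        return weight - Math.signum(gradient) * stepSize;
    }

    public static void main(String[] args) {
        // A huge and a tiny positive gradient produce the same step size.
        System.out.println(update(0.5, 12.7, 0.01));
        System.out.println(update(0.5, 0.003, 0.01));
        // A negative gradient moves the weight in the opposite direction.
        System.out.println(update(0.5, -5.0, 0.01));
    }
}
```

Because every update has the same magnitude, the choice of the constant matters much more than in standard backpropagation, which is why the class implements the `LearningRate` interface to let that constant be adjusted during training.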
