Evaluation
Before applying our multilayer perceptron to the analysis of fluctuations in currency exchange rates, let's get acquainted with some of the key learning parameters introduced in the first section.
Execution profile
Let's look at the convergence of the training of the multilayer perceptron. The Monitor trait (refer to the Training section under Helper classes in the Appendix) collects and displays some execution parameters. We chose to extract the convergence profile of the multilayer perceptron using the difference between the backpropagation errors of two consecutive episodes (or epochs).
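The Monitor trait itself is described in the Appendix rather than here. As a rough illustration of the idea, the following hypothetical sketch (the name `Monitor`, and the methods `collect` and `profile`, are illustrative assumptions, not the book's actual API) accumulates the cumulative error of each epoch and derives the convergence profile as the difference between consecutive errors:

```scala
// Hypothetical sketch of an error-monitoring trait; the actual Monitor
// trait in the Appendix may differ in names and structure.
trait Monitor {
  private val errors = scala.collection.mutable.ArrayBuffer.empty[Double]

  // Record the cumulative backpropagation error at the end of an epoch
  def collect(err: Double): Unit = errors += err

  // Convergence profile: difference of errors between consecutive epochs
  def profile: Seq[Double] =
    errors.toSeq.sliding(2).collect { case Seq(prev, cur) => prev - cur }.toSeq
}
```

A shrinking, positive profile indicates that training is converging; values near zero suggest the error has plateaued.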
The test profiles the convergence of the MLP using a learning rate η = 0.03 and a momentum factor α = 0.3 for a multilayer perceptron with two input values, one hidden layer with three nodes, and one output value. The test relies on synthetically generated random values:
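The test setup can be sketched as a small, self-contained program. Note that this is an illustrative implementation under stated assumptions, not the book's code: the object name `ConvergenceProfile`, the method `train`, the synthetic target function, and the seed are all hypothetical, while the configuration (η = 0.03, α = 0.3, a 2-3-1 topology, random inputs) follows the text:

```scala
import scala.util.Random

// Hypothetical 2-3-1 MLP trained with backpropagation and momentum;
// returns the cumulative squared error per epoch so the convergence
// profile (delta between consecutive epochs) can be inspected.
object ConvergenceProfile {
  val eta = 0.03     // learning rate, as in the text
  val alpha = 0.3    // momentum factor, as in the text
  val nHidden = 3    // one hidden layer with three nodes

  private def sigmoid(x: Double): Double = 1.0 / (1.0 + math.exp(-x))

  def train(epochs: Int = 50, seed: Long = 42L): Seq[Double] = {
    val rnd = new Random(seed)
    // Synthetic training set: two random inputs, target from an
    // arbitrary (assumed) smooth function of the inputs
    val data = Seq.fill(200) {
      val x = Array(rnd.nextDouble(), rnd.nextDouble())
      (x, sigmoid(0.7 * x(0) - 0.4 * x(1)))
    }
    // Weights incl. bias: hidden (3 x 3) and output (1 x 4)
    val wH  = Array.fill(nHidden, 3)(rnd.nextDouble() * 0.5 - 0.25)
    val wO  = Array.fill(nHidden + 1)(rnd.nextDouble() * 0.5 - 0.25)
    val dWH = Array.fill(nHidden, 3)(0.0)  // previous updates (momentum)
    val dWO = Array.fill(nHidden + 1)(0.0)

    (1 to epochs).map { _ =>
      var sumSq = 0.0
      data.foreach { case (x, target) =>
        // Forward pass
        val h = Array.tabulate(nHidden)(j =>
          sigmoid(wH(j)(0) * x(0) + wH(j)(1) * x(1) + wH(j)(2)))
        val out = sigmoid((0 until nHidden).map(j => wO(j) * h(j)).sum + wO(nHidden))
        val err = target - out
        sumSq += err * err
        // Backward pass with momentum: dw = eta * gradient + alpha * dw_prev
        val dOut = err * out * (1.0 - out)
        for (j <- 0 until nHidden) {
          val dHj = dOut * wO(j) * h(j) * (1.0 - h(j))
          dWO(j) = eta * dOut * h(j) + alpha * dWO(j); wO(j) += dWO(j)
          for (k <- 0 until 2) {
            dWH(j)(k) = eta * dHj * x(k) + alpha * dWH(j)(k); wH(j)(k) += dWH(j)(k)
          }
          dWH(j)(2) = eta * dHj + alpha * dWH(j)(2); wH(j)(2) += dWH(j)(2)
        }
        dWO(nHidden) = eta * dOut + alpha * dWO(nHidden); wO(nHidden) += dWO(nHidden)
      }
      sumSq  // cumulative error for this epoch
    }
  }
}
```

Printing `errors(n-1) - errors(n)` for consecutive epochs reproduces the kind of convergence profile the Monitor trait displays.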
Impact of learning rate
The purpose of the first exercise...