1. Ward Nets
There are five Ward Nets, distinguished by the number of inputs they take: Ward2, Ward3, Ward4, Ward5, and Ward6.
All Ward Nets share the parameters common to every net: scaling, inputs, and weights. They also take two additional parameters, h1 and h2. These are integer values of either 0 or 1 that tell the neural net which activation functions to use in the hidden neurons (refer to "Neural Network Architectures" for explanations of these terms). You can let the optimizer find the best value of each, or fix them yourself: to let the optimizer search, set the optimization range to 0 to 1; to fix a value, set the range to either 0 to 0 or 1 to 1. A setting of 0 selects the hyperbolic tangent, and a setting of 1 selects the Gaussian activation function.
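The h1/h2 selection can be pictured as a simple dispatch on the integer flag. This is only an illustrative sketch: the exact Gaussian formula used internally is not documented here, so the one below is an assumption.

```python
import math

def activation(x, h):
    """Activation selected by an integer flag, mirroring h1/h2:
    0 -> hyperbolic tangent, 1 -> Gaussian (illustrative formula)."""
    if h == 0:
        return math.tanh(x)       # output in (-1, 1)
    elif h == 1:
        return math.exp(-x * x)   # Gaussian bump; exact form is an assumption
    raise ValueError("h must be 0 or 1")
```

Setting the optimization range to 0–1 lets the optimizer try both branches; pinning it to 0–0 or 1–1 fixes the choice.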
2. Jump Nets
There are five nets with jump connections and cross connections. They are named Jump2, Jump3, Jump4, Jump5, and Jump6.
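A jump connection routes the raw inputs past the hidden layer so that the output neuron sees both. The forward pass below is a toy sketch under that assumption; the weights and layer sizes are hypothetical, not the product's actual internals.

```python
import math

def jump_net(inputs, w_hidden, w_out, bias_out=0.0):
    """Toy forward pass with a jump connection: the output neuron sees
    both the hidden-layer activations and the raw inputs (hypothetical
    weights; tanh used throughout for illustration)."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    # Jump connection: raw inputs are concatenated with hidden outputs
    combined = list(inputs) + hidden
    return math.tanh(bias_out + sum(w * x for w, x in zip(w_out, combined)))
```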
3. Recurrent Nets
There are five recurrent nets, distinguished by the number of inputs they take: Recur2, Recur3, Recur4, Recur5, and Recur6.
4. Sparse Nets
There are three sparse nets made to accept 8, 10, and 12 inputs respectively. They are named Sparse8, Sparse10, and Sparse12.
5. Conditional Versions
We have implemented modified versions of all of the above nets. Each conditional version is named like the original, except that a "C" is appended to the name; for example, Jump3C is the conditional version of Jump3. The "C," or conditional, version of each architecture produces a value of either true or false rather than a value between –1 and 1, so it can be used in a Trading Strategy without first being inserted into an A>B indicator. The C version was created from the original architecture by inserting it as A in the A>B indicator with B set to 0.