Having said all that, the neural network itself is very easy to build. First, we define the neural network as a struct:
type convnet struct {
	g                  *gorgonia.ExprGraph
	w0, w1, w2, w3, w4 *gorgonia.Node // weights; the number at the end indicates which layer it's used for
	d0, d1, d2, d3     float64        // dropout probabilities
	out                *gorgonia.Node
	outVal             gorgonia.Value
}
Here, we define a neural network with four layers. A convnet layer is similar to a linear layer in many ways; it can, for example, be written as an equation:
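The equation itself did not survive extraction. A plausible reconstruction, consistent with the note below that dropout and max-pooling belong to the same layer (the activation $\sigma$ and the exact ordering of operations are my assumptions), is:

$$\mathrm{layer}_i(x) = \mathrm{Dropout}_{d_i}\!\left(\mathrm{MaxPool}\!\left(\sigma\!\left(\mathrm{conv2d}(x, w_i)\right)\right)\right)$$

That is, each layer convolves its input with that layer's weights $w_i$, applies a nonlinearity, max-pools the result, and finally applies dropout with probability $d_i$.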
Note that in this specific example, I consider dropout and max-pooling to be part of the same layer. Much of the literature treats them as separate layers. I personally do not see the necessity of treating them separately: after all, everything is just a mathematical equation, and composing functions comes...
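To make the "everything is just function composition" point concrete, here is a stdlib-only sketch. The `layer` type, `compose`, and the toy ops are my own illustrative inventions, not Gorgonia API; `scale` merely stands in for a real convolution or linear map:

```go
package main

import "fmt"

// layer is any function from activations to activations.
type layer func([]float64) []float64

// compose chains layers left to right: compose(f, g)(x) == g(f(x)).
func compose(layers ...layer) layer {
	return func(x []float64) []float64 {
		for _, l := range layers {
			x = l(x)
		}
		return x
	}
}

// relu is a toy activation: max(0, v), elementwise.
func relu(x []float64) []float64 {
	out := make([]float64, len(x))
	for i, v := range x {
		if v > 0 {
			out[i] = v
		}
	}
	return out
}

// scale returns a layer that multiplies every element by k,
// standing in for a real convolution or fully connected layer.
func scale(k float64) layer {
	return func(x []float64) []float64 {
		out := make([]float64, len(x))
		for i, v := range x {
			out[i] = k * v
		}
		return out
	}
}

func main() {
	// A "network" is just the composition of its layers.
	net := compose(scale(2), relu, scale(3))
	fmt.Println(net([]float64{-1, 0, 2})) // -1 is zeroed by relu; 2 -> 2*2*3 = 12
}
```

Whether dropout and max-pooling live inside one composed function or are written as separate functions changes nothing about the mathematics; it is purely a bookkeeping choice.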