The fundamental understanding that neural networks are mathematical expressions leads to very simple implementations of neural networks. Recall from the previous chapter that a neural network can be written like this:
func affine(weights [][]float64, inputs []float64) []float64 {
	return activation(matVecMul(weights, inputs))
}
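To make the snippet above runnable on its own, here is one minimal sketch of the two helpers it calls. The names matVecMul and activation come from the snippet; the choice of sigmoid as the activation is an assumption for illustration:

```go
package main

import (
	"fmt"
	"math"
)

// matVecMul computes the matrix-vector product Wx.
func matVecMul(weights [][]float64, inputs []float64) []float64 {
	out := make([]float64, len(weights))
	for i, row := range weights {
		for j, w := range row {
			out[i] += w * inputs[j]
		}
	}
	return out
}

// activation applies the sigmoid function elementwise (an assumed choice).
func activation(xs []float64) []float64 {
	out := make([]float64, len(xs))
	for i, x := range xs {
		out[i] = 1 / (1 + math.Exp(-x))
	}
	return out
}

func affine(weights [][]float64, inputs []float64) []float64 {
	return activation(matVecMul(weights, inputs))
}

func main() {
	W := [][]float64{{1, 0}, {0, 1}}
	x := []float64{0, 0}
	// Identity weights and zero input: sigmoid(0) = 0.5 in every slot.
	fmt.Println(affine(W, x)) // [0.5 0.5]
}
```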
If we rewrite the code as a mathematical equation, we can write a neural network like this:

f(x) = σ(Wx)

A side note: σ(Wx), with x as a column vector, is the same as σ(xWᵀ), with x as a row vector. The Gorgonia code below uses the row-vector convention, stacking N input rows into a single matrix, which is why its weight matrix has shape (inputs, outputs).
We can simply write it out using Gorgonia, like this:
import (
	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

// Float is the data type used throughout: float64.
var Float tensor.Dtype = tensor.Float64

func main() {
	g := G.NewGraph()
	// x holds the inputs: N rows (one per example), 728 columns each.
	x := G.NewMatrix(g, Float, G.WithName("x"), G.WithShape(N, 728))
	// w holds the weights, initialized uniformly.
	w := G.NewMatrix(g, Float, G.WithName("w"), G.WithShape(728, 800),
		G.WithInit(G.Uniform(1.0)))
	b := G.NewMatrix(g, Float, G.WithName...