For the neural network itself, we wrap the layers together and add methods to forward an input through the complete network and to predict the class corresponding to the output vector. After the layer implementation, the following code should be self-explanatory:
import numpy as np
from layer import FullyConnectedLayer

def sigmoid(x):  # Apply the sigmoid function to the elements of x.
    return 1 / (1 + np.exp(-x))

class SimpleNetwork(object):
    """A simple fully-connected NN.
    Args:
        num_inputs (int): The input vector size / number of input values.
        num_outputs (int): The output vector size.
        hidden_layers_sizes (list): A list of sizes for each hidden layer to be added to the network.
    Attributes:
        layers (list): The list of layers forming this simple network.
    """

    def __init__(self, num_inputs, num_outputs, hidden_layers_sizes=(64, 32)):
        super().__init__()
        # We build the list...
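        # NOTE: The rest of this class is a sketch completing the constructor
        # and the forward/predict methods described above. It assumes the
        # FullyConnectedLayer(num_inputs, layer_size, activation_function)
        # constructor and its forward() method from the layer implementation.
        layer_sizes = [num_inputs, *hidden_layers_sizes, num_outputs]
        self.layers = [
            FullyConnectedLayer(layer_sizes[i], layer_sizes[i + 1], sigmoid)
            for i in range(len(layer_sizes) - 1)]

    def forward(self, x):
        """Forward the input vector `x` through all the layers."""
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def predict(self, x):
        """Compute the output for `x` and return the index of the largest
        output value, i.e., the predicted class."""
        estimations = self.forward(x)
        best_class = np.argmax(estimations)
        return best_class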