For this example, let's build our own architecture from scratch. The network will combine the following kinds of layers:
- Conv2d
- MaxPool2d
- Rectified linear unit (ReLU)
- View (the tensor.view reshape used to flatten the feature maps)
- Linear
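To make these concrete, here is a minimal standalone sketch (separate from the network we are about to build, and assuming a single 28 x 28 grayscale image as input) that passes a dummy tensor through one instance of each layer type, with the resulting shapes noted in the comments:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 1, 28, 28)           # one 28x28 grayscale image: [batch, channels, height, width]

conv = nn.Conv2d(1, 10, kernel_size=5)  # 1 input channel -> 10 feature maps
x = conv(x)                             # [1, 10, 24, 24]: a 5x5 kernel shrinks 28 -> 24

x = F.max_pool2d(x, 2)                  # [1, 10, 12, 12]: keep the max of each 2x2 region
x = F.relu(x)                           # same shape; negative activations clamped to zero

x = x.view(x.size(0), -1)               # [1, 1440]: flatten 10 * 12 * 12 values per image

fc = nn.Linear(1440, 10)                # fully connected layer mapping 1440 features to 10 outputs
x = fc(x)
print(x.shape)                          # torch.Size([1, 10])
```

The full network below chains these same operations, just with two convolution-and-pooling stages instead of one.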
Let's look at a pictorial representation of the architecture we are going to implement:
Let's implement this architecture in PyTorch and then walk through what each individual layer does:
```python
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)   # 1 input channel -> 10 feature maps
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)  # 10 -> 20 feature maps
        self.conv2_drop = nn.Dropout2d()               # drops entire feature maps for regularization
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)                   # 10 output classes

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)                            # flatten the feature maps for the linear layers
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)
```
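As a quick sanity check before walking through the layers, the following sketch feeds a dummy batch through the network and prints the output shape. It assumes 28 x 28 single-channel inputs (as in MNIST); under that assumption, the 320 in nn.Linear(320, 50) is the 20 channels x 4 x 4 spatial positions left after the two convolution-and-pooling stages:

```python
import torch

model = Net()
dummy = torch.randn(64, 1, 28, 28)   # a batch of 64 fake 28x28 grayscale images
output = model(dummy)
print(output.shape)                  # torch.Size([64, 10]): one value per class for each image
```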