Simplifying implementations of common architectures via the torch.nn module
You have already seen some examples of building a feedforward NN model (for instance, a multilayer perceptron) and defining a sequence of layers using the nn.Module class. Before we take a deeper dive into nn.Module, let's briefly look at another approach for defining those layers via nn.Sequential.
Implementing models based on nn.Sequential
With nn.Sequential (https://pytorch.org/docs/master/generated/torch.nn.Sequential.html#sequential), the layers stored inside the model are connected in a cascaded way. In the following example, we will build a model with two densely (fully) connected layers:
>>> import torch.nn as nn
>>> model = nn.Sequential(
...     nn.Linear(4, 16),
...     nn.ReLU(),
...     nn.Linear(16, 32),
...     nn.ReLU()
... )
>>> model
Sequential(
  (0): Linear(in_features=4, out_features=16, bias=True)
  (1): ReLU()
  (2): Linear(in_features=16, out_features=32, bias=True)
  (3): ReLU()
)
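To see the cascaded behavior in action, we can pass a batch of inputs through the model; each layer's output feeds directly into the next. The following sketch (the batch size of 8 and the random input are arbitrary choices for illustration) verifies that a batch of 4-feature inputs comes out with 32 features, matching the last Linear layer:

```python
import torch
import torch.nn as nn

# Rebuild the same two-hidden-layer model from above
model = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Linear(16, 32),
    nn.ReLU()
)

# A hypothetical batch of 8 examples, each with 4 input features
x = torch.rand(8, 4)
out = model(x)

print(out.shape)  # torch.Size([8, 32])
```

Note that because the final layer is a ReLU, every value in the output tensor is non-negative.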