NN building blocks
In the torch.nn package, you'll find tons of predefined classes providing the basic functionality blocks. All of them are designed with practice in mind: for example, they support minibatches, have sane default values, and their weights are properly initialized. All modules follow the callable convention, which means that an instance of any class can act as a function when applied to its arguments. For example, the Linear class implements a feed-forward layer with an optional bias:
>>> import torch
>>> import torch.nn as nn
>>> l = nn.Linear(2, 5)
>>> v = torch.FloatTensor([1, 2])
>>> l(v)
tensor([ 0.1975, 0.1639, 1.1130, -0.2376, -0.7873])
Here, we created a randomly initialized feed-forward layer with two inputs and five outputs and applied it to our float tensor. All classes in the torch.nn package inherit from the nn.Module base class, which you can use to implement your own higher-level NN blocks. We'll see how you...
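To make the idea of subclassing nn.Module concrete, here is a minimal sketch of a custom block. The class name OurModule, the hidden size of 16, and the specific layers chosen are illustrative assumptions, not fixed by the text; the point is that registering submodules as attributes lets nn.Module track their parameters, and defining forward() makes the instance callable like any other layer:

```python
import torch
import torch.nn as nn


class OurModule(nn.Module):
    """A small custom NN block built by subclassing nn.Module (illustrative)."""

    def __init__(self, num_inputs, num_outputs, dropout_prob=0.3):
        super().__init__()
        # Submodules assigned as attributes are registered automatically,
        # so their weights appear in self.parameters() for the optimizer.
        self.pipe = nn.Sequential(
            nn.Linear(num_inputs, 16),
            nn.ReLU(),
            nn.Linear(16, num_outputs),
            nn.Dropout(p=dropout_prob),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        # Invoked when the instance is applied as a function: net(x)
        return self.pipe(x)


net = OurModule(num_inputs=2, num_outputs=3)
out = net(torch.FloatTensor([[1.0, 2.0]]))
print(out.shape)  # torch.Size([1, 3])
```

Because Softmax is the last stage, each output row is a probability distribution over the three outputs, which is a common pattern for classification heads.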