The need for tensors
Usually, input data is represented with multi-dimensional arrays:
- Images have three dimensions: the number of channels, the width, and the height of the image
- Sounds and time series have one dimension: the duration
- Natural language sequences can be represented by two-dimensional arrays: the duration and the alphabet length or the vocabulary length
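These shapes can be illustrated with NumPy arrays, which Theano's tensors mirror. The concrete sizes below (a 32x32 image, one second of 16 kHz audio, a 5,000-word vocabulary) are illustrative assumptions, not fixed conventions:

```python
import numpy as np

# A 32x32 RGB image: (channels, width, height)
image = np.zeros((3, 32, 32))

# One second of audio sampled at 16 kHz: (duration,)
sound = np.zeros((16000,))

# A 10-word sentence, one-hot encoded over a 5,000-word vocabulary:
# (duration, vocabulary length)
sentence = np.zeros((10, 5000))

print(image.ndim, sound.ndim, sentence.ndim)  # 3 1 2
```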
We'll see more examples of input data arrays in future chapters.
In Theano, multi-dimensional arrays are implemented with an abstraction class named tensor, which offers many more transformations than the traditional arrays of a computer language such as Python.
At each stage of a neural net, computations such as matrix multiplications involve multiple operations on these multi-dimensional arrays.
Classical arrays in programming languages do not have enough built-in functionality to quickly and adequately address multi-dimensional computations and manipulations.
Computations on multi-dimensional arrays have a long history of optimizations, with tons of libraries and hardware. One of the most important speed gains has come from the massively parallel architecture of the GPU, which can compute on a large number of cores, from a few hundred to a few thousand.
Compared to a traditional CPU, for example a quad-core, 12-core, or 32-core engine, the gains with a GPU can range from 5x to 100x, even though part of the code is still executed on the CPU (data loading, driving the GPU, and outputting results). The main bottleneck when using a GPU is usually the transfer of data between CPU memory and GPU memory, but still, when well programmed, the GPU helps bring a significant increase in speed of an order of magnitude. Getting results in days rather than months, or in hours rather than days, is an undeniable benefit for experimentation.
The Theano engine has been designed to address the challenges of multi-dimensional arrays and architecture abstraction from the beginning.
There is another undeniable benefit of Theano for scientific computation: the automatic differentiation of functions of multi-dimensional arrays, a well-suited feature for model parameter inference via objective function minimization. Such a feature facilitates experimentation by relieving the pain of computing derivatives, which might not be very complicated, but are prone to many errors.