Simplifying the Model
Have you heard of parsimony? In the context of model estimation, parsimony means keeping a model as simple as possible. This principle rests on the assumption that complex models (models with a larger number of parameters) tend to overfit the training data, reducing their capacity to generalize and make good predictions.
Beyond improving generalization, simplifying a neural network has two main benefits: it reduces training time and makes the model feasible to run in resource-constrained environments. One approach to simplifying a model is to reduce the number of parameters in the network through pruning and compression techniques.
In this chapter, we show how to simplify a model by reducing the number of parameters of the neural network without sacrificing its quality.
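Before diving in, a minimal, framework-agnostic sketch may help fix the core idea of pruning: remove the parameters that contribute least to the model's output. The function below (a hypothetical helper, not part of any library) implements magnitude pruning on a flat list of weights, zeroing out the fraction with the smallest absolute values.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest magnitude.

    This is a toy illustration: real frameworks prune tensors in place
    and typically apply a mask so the zeros survive further training.
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # Find the magnitude threshold below which weights are removed.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.02, 0.5, 0.01, -0.7, 0.03]
pruned = prune_by_magnitude(weights, sparsity=0.5)
print(pruned)  # half of the weights are zeroed out
```

With `sparsity=0.5`, the three smallest-magnitude weights (0.01, -0.02, and 0.03) become zero while the larger ones are kept, so the pruned list is `[0.9, 0.0, 0.5, 0.0, -0.7, 0.0]`. The chapter's techniques build on this intuition at the level of whole layers and models.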
Here is what you will learn as part of this chapter:
- The key benefits of simplifying a model
- The concept and techniques of model pruning and compression...