Now that you know what optimization is, it's time to explore some of the methods used in practice. We will not cover the entire field of optimization, as that would require a book of its own; instead, we will focus on the essential optimization methods that are applicable to deep learning.
Exploring the various optimization methods
Least squares
Least squares is a subclass of convex optimization. It is classified as having no constraints and takes the following form:

$$\text{minimize } f(x) = \lVert Ax - b \rVert_2^2 = \sum_{i=1}^{k} \left(a_i^T x - b_i\right)^2$$

Here, $A \in \mathbb{R}^{k \times n}$, $a_i^T$ are the rows of $A$, and $x \in \mathbb{R}^n$ is our optimization variable.

We can also express the solution as a set of linear equations of the form $(A^T A)x = A^T b$. Therefore, $x = (A^T A)^{-1} A^T b$.
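To make the formulation concrete, here is a minimal NumPy sketch. The matrix $A$, the vector $b$, and the noise level are illustrative assumptions, not values from the text; the sketch solves a small least squares problem both through the normal equations and with NumPy's built-in solver:

```python
import numpy as np

# Illustrative problem: A, x_true, and the noise level are made up for this example.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))                 # k = 20 measurements, n = 3 variables
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.1 * rng.normal(size=20)   # noisy observations

# Closed-form solution via the normal equations: (A^T A) x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# The same solution via NumPy's least squares routine (numerically more stable)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_normal)
print(x_lstsq)
```

Both approaches recover an estimate close to `x_true`; in practice, `np.linalg.lstsq` (or a QR/SVD-based solver) is preferred over explicitly forming $A^T A$, which can be ill-conditioned.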
The problem of least squares is closely related to maximum likelihood estimation: minimizing the squared residuals yields the same estimate of $x$ as maximizing the likelihood when the measurements are assumed to be corrupted by independent Gaussian noise.