Understanding eigenvalues, eigenvectors, and orthogonal bases
In this section, we will explore the mathematical concepts behind PCA: eigenvalues, eigenvectors, and orthogonal bases. We will also learn how to compute the eigenvalues and eigenvectors of a given matrix.
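Before working through the math by hand, it helps to see what the end result looks like numerically. The following sketch uses NumPy's `np.linalg.eig` on a small, hypothetical 2x2 symmetric matrix (not a matrix from this chapter's dataset) to find its eigenvalues and eigenvectors, and checks the defining property that each eigenvector is only scaled, not rotated, by the matrix:

```python
import numpy as np

# A small symmetric example matrix (hypothetical, for illustration only)
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# The defining property: A @ v equals lambda * v for each pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

# Because A is symmetric, its eigenvectors are orthogonal to each
# other; together they form an orthogonal basis of the plane
v1, v2 = eigenvectors.T
assert np.isclose(v1 @ v2, 0.0)

print("eigenvalues:", eigenvalues)
```

The orthogonality check at the end previews why symmetric matrices matter for PCA: the covariance matrix of a dataset is symmetric, so its eigenvectors supply an orthogonal basis onto which the data can be projected.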
Many real-world machine learning problems involve working with a large number of feature variables, sometimes in the millions. This not only makes the data harder to store because of its sheer size but also slows the training of machine learning models, making it harder to find an optimal solution. In addition, a model trained on many features is more likely to overfit the data. This problem is often referred to as the curse of dimensionality in the field of machine learning.
A solution to this curse of dimensionality is to reduce the dimensionality of datasets that have many feature variables. Let's try to understand this concept with the help of an example dataset: pizza.csv. This...