Summary
You’ve made it this far; well done! The effort will be worth it. Along with random variables and probability distributions, linear algebra is one of the core math building blocks for all data science algorithms.
Vectors are a natural way to represent data, and matrices are a natural way to encode transformations that act on that data. And it is those transformations that are a core part of what a data scientist does – shaping, aggregating, and manipulating data. Explanations of matrix algebra are often dry, hiding what the matrices are doing. We have tried to correct that in this chapter. Along the way, we have learned the following:
- How to calculate inner and outer products of pairs of vectors
- How to do matrix multiplication
- How a matrix represents a transformation
- The inverse and identity matrices
- The two core matrix decomposition methods: the eigen-decomposition and the SVD
- How to calculate the trace and determinant of a square matrix
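As a compact recap, the operations listed above can be sketched in a few lines of NumPy. This is an illustrative example with made-up matrices, not code from the chapter:

```python
import numpy as np

# Two vectors
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])

# Inner product (a scalar) and outer product (a matrix)
inner = u @ v            # 1*3 + 2*4 = 11
outer = np.outer(u, v)   # a 2x2 matrix of all pairwise products

# Matrix multiplication
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = A @ B

# A matrix as a transformation: A scales the x-axis by 2 and the y-axis by 3
transformed = A @ u      # array([2., 6.])

# Inverse and identity: multiplying A by its inverse gives the identity
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))

# The two core decompositions: eigen-decomposition and SVD
eigvals, eigvecs = np.linalg.eig(A)
U, S, Vt = np.linalg.svd(A)

# Trace and determinant of a square matrix
tr = np.trace(A)         # 2 + 3 = 5
det = np.linalg.det(A)   # 2*3 - 0*0 = 6
```

Each line maps directly onto one of the bullets above, which makes NumPy a convenient playground for checking your understanding of the chapter.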