Manifold learning
In Chapter 3, Introduction to Semi-Supervised Classification, we discussed the manifold assumption, which states that high-dimensional data normally lies on low-dimensional manifolds. This is not a theorem, but in many real cases the assumption proves correct, and it justifies the use of non-linear dimensionality reduction algorithms that would otherwise be hard to motivate. In this section, we're going to analyze some of these algorithms. They are all implemented in scikit-learn, so it's easy to try them on complex datasets.
Isomap
Isomap is one of the simplest algorithms, and it's based on the idea of reducing dimensionality while trying to preserve the geodesic distances (the lengths of the shortest paths between pairs of points on the manifold) measured on the original manifold where the input data lies. The algorithm works in three steps. The first operation is a k-nearest neighbors (KNN) search and the construction of the following...
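Before going into the details, the overall procedure can be sketched with scikit-learn's Isomap implementation. This is a minimal example, not the book's own code; the Swiss-roll dataset and the parameter values (n_neighbors=10, n_components=2) are illustrative assumptions.

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Generate 1,000 points lying on a 3D Swiss-roll manifold
X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=1000)

# Isomap: build a KNN graph, approximate geodesic distances with
# shortest paths on that graph, then embed into 2 dimensions
isomap = Isomap(n_neighbors=10, n_components=2)
X_embedded = isomap.fit_transform(X)

print(X_embedded.shape)  # (1000, 2)
```

The n_neighbors parameter controls the KNN graph used to approximate the geodesic distances: too small and the graph may become disconnected; too large and shortcuts through the ambient space can destroy the manifold structure.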