In comparison to supervised learning, unsupervised learning does not need a dataset of labeled examples during the training phase; labels are only needed during the testing phase, when we want to evaluate the performance of the model.
The purpose of unsupervised learning is to discover natural partitions in the training set. What does this mean? Think about the MNIST dataset: it has 10 classes, and we know this because every example carries a label in the [0, 9] range. An unsupervised learning algorithm, instead, has to discover that there are 10 distinct groups of objects inside the dataset, and it does this by looking at the examples alone, without prior knowledge of the labels.
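To make this concrete, here is a minimal sketch of the idea using scikit-learn; the choice of K-Means as the clustering algorithm, the small MNIST-like load_digits dataset, and the adjusted Rand index as the evaluation metric are all illustrative assumptions, not a prescribed method. Note how the labels y are never shown to the algorithm during training and appear only in the final evaluation step.

```python
# Cluster digit images without using labels during training, then use the
# labels only afterwards to evaluate how well the discovered groups match
# the true classes.
from sklearn.datasets import load_digits
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Load a small MNIST-like dataset: 8x8 grayscale digit images, 10 classes.
digits = load_digits()
X, y = digits.data, digits.target  # y is kept aside for evaluation only

# Ask K-Means to discover 10 groups; the algorithm never sees y.
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(X)

# Evaluation phase: compare the discovered clusters with the true labels.
# A score close to 1.0 means the clusters align well with the 10 digit classes.
print("Adjusted Rand Index:", adjusted_rand_score(y, cluster_ids))
```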
It is clear that unsupervised learning algorithms are more challenging than supervised ones, since they cannot rely on label information; instead, they have to discover features...