In the preceding section, we looked at the perceptron, a simple neural network with no hidden layers. The perceptron was found to have serious limitations: in 1969, Marvin Minsky and Seymour Papert published research concluding that a single-layer perceptron is incapable of learning many seemingly simple functions.
In fact, they showed that it cannot represent a logical function as simple as XOR, because XOR's outputs are not linearly separable: no single straight-line decision boundary puts the inputs (0, 1) and (1, 0) on one side and (0, 0) and (1, 1) on the other. This result contributed to a decline of interest in machine learning in general, and neural networks in particular, and ushered in an era now known as the AI winter. Many researchers stopped taking AI seriously, believing it incapable of solving complex problems.
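The XOR limitation is easy to see empirically. The following sketch (an illustration, not part of the original text) defines a single-unit perceptron with a step activation and brute-forces a grid of weights and biases; no combination classifies all four XOR inputs correctly, whereas the same search easily finds weights for AND. The grid search only illustrates the point; the impossibility itself follows from a short algebraic argument about linear separability.

```python
import itertools

def perceptron(x1, x2, w1, w2, b):
    # Single-unit perceptron: step activation on a weighted sum.
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

def solutions(truth_table, grid):
    # Return all (w1, w2, b) on the grid that reproduce the truth table.
    return [
        (w1, w2, b)
        for w1, w2, b in itertools.product(grid, repeat=3)
        if all(perceptron(x1, x2, w1, w2, b) == y
               for (x1, x2), y in truth_table.items())
    ]

grid = [i / 4 for i in range(-8, 9)]  # weights from -2.0 to 2.0 in 0.25 steps

print(len(solutions(XOR, grid)))  # 0: no linear boundary separates XOR
print(len(solutions(AND, grid)) > 0)  # True: AND is linearly separable
```

The algebraic argument is one line: satisfying XOR would require b ≤ 0, b + w1 > 0, b + w2 > 0, and b + w1 + w2 ≤ 0, and summing the middle two inequalities contradicts the other two.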
One of the primary reasons for the so-called AI winter was the limited hardware available at the time. Either the necessary computing...