3. Neural Networks
We learned about perceptrons in the previous chapter, and there is both good news and bad news. The good news is that perceptrons have the potential to represent complicated functions: in principle, a perceptron can even represent the complicated processes performed by a computer, as described in the previous chapter. The bad news is that the weights still have to be set by hand; a person must work out the appropriate weights so that the perceptron produces the expected outputs for the given inputs. In the previous chapter, we consulted the truth tables of the AND and OR gates to determine the appropriate weights manually, as recapped in the sketch below.
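To make the "bad news" concrete, here is a minimal sketch of the previous chapter's approach in Python. The particular values (weights 0.5 and 0.5, biases -0.7 and -0.2) are one hand-chosen assignment that happens to satisfy the AND and OR truth tables; nothing in the perceptron itself finds them for us.

import numpy as np

def AND(x1, x2):
    # Weights and bias chosen by hand to satisfy the AND truth table.
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])
    b = -0.7
    return 1 if np.sum(w * x) + b > 0 else 0

def OR(x1, x2):
    # Same weights as AND; only the hand-chosen bias differs.
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])
    b = -0.2
    return 1 if np.sum(w * x) + b > 0 else 0

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, AND(x1, x2), OR(x1, x2))

Every new gate requires the same manual search for suitable parameters, which quickly becomes impractical as the desired function grows more complex.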
Neural networks exist to address this bad news. More specifically, one important property of neural networks is that they can learn appropriate weight parameters from data automatically. This chapter provides an overview of neural networks and focuses on what distinguishes them from perceptrons. The next chapter will describe how neural networks learn weight parameters from data.