Over the last decade, neural networks have been at the forefront of machine learning research and applications. Deep neural networks (DNNs), transfer learning, and the availability of computationally efficient GPUs have driven significant progress in image recognition, speech recognition, and even text generation. In this chapter, we will concentrate on the basic neural network model, the perceptron, and on the multilayer perceptron (MLP), a fully connected layered architecture of artificial neurons. The chapter will include the following recipes:
- Activation functions
- Single-layer perceptron
- Calculating gradients with the backpropagation algorithm
- MNIST classifier using MLP
- Function approximation using MLP--predicting Boston house prices
- Tuning hyperparameters
- Higher-level APIs--Keras
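Before diving into the recipes, it may help to see the perceptron idea in miniature. The sketch below (a minimal NumPy illustration, not code from any recipe in this chapter) shows a single artificial neuron: a weighted sum of inputs plus a bias, passed through a step activation. The weights and bias are hand-picked assumptions chosen so the neuron implements logical AND.

```python
import numpy as np

def step(z):
    # Heaviside step activation: outputs 1 if z >= 0, else 0
    return np.where(z >= 0, 1, 0)

def perceptron(x, w, b):
    # A single artificial neuron: weighted sum of inputs plus bias,
    # passed through the step activation
    return step(np.dot(x, w) + b)

# Hypothetical weights and bias chosen so the neuron computes logical AND
w = np.array([1.0, 1.0])
b = -1.5

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, int(perceptron(np.array(x, dtype=float), w, b)))
```

The later recipes replace the step function with differentiable activations (the first recipe above) so that gradients can be computed by backpropagation, and stack many such neurons into fully connected layers to form an MLP.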