Anatomy of deep neural networks
Neural networks are collections of neurons interconnected through layers that can learn the mapping between inputs and outputs from the provided training data. By stacking multiple hidden layers with non-linear activation functions, they can model complex patterns in datasets where traditional ML algorithms fail. The concept of a neural network was first put forth by Warren McCulloch and Walter Pitts in 1943, who described a neural network as a collection of connected nodes (https://www.cambridge.org/core/journals/journal-of-symbolic-logic/article/abs/warren-s-mcculloch-and-walter-pitts-a-logical-calculus-of-the-ideas-immanent-in-nervous-activity-bulletin-of-mathematical-biophysics-vol-5-1943-pp-115133/7DFDC43EC1E5BD05E9DA85E1C41A01BD). Here, nodes represent artificial neurons, the functional units of neural networks, which we will discuss in the following section.
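To make the idea of interconnected neurons and non-linear activations concrete, here is a minimal sketch of a forward pass through a tiny network in plain Python. All names (`neuron`, `forward`) and the toy weights are illustrative assumptions, not part of any library; the sigmoid is just one example of a non-linear activation.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of its inputs plus a bias,
    passed through a non-linear activation (sigmoid here)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, layers):
    """Feed an input through a stack of layers; each layer is a list of
    (weights, bias) pairs, one per neuron in that layer."""
    for layer in layers:
        x = [neuron(x, w, b) for w, b in layer]
    return x

# Toy network: 2 inputs -> 2 hidden neurons -> 1 output neuron.
# The weights are arbitrary; in practice they are learned from data.
layers = [
    [([0.5, -0.6], 0.1), ([0.8, 0.2], -0.3)],  # hidden layer
    [([1.0, -1.0], 0.0)],                      # output layer
]
output = forward([1.0, 0.5], layers)
```

Each layer's outputs become the next layer's inputs, which is exactly the interconnection through layers described above; without the non-linear activation, the stacked layers would collapse into a single linear map.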
The fundamental unit of...