Perceptrons—a brain in a function
The simplest neural network architecture, the perceptron, was inspired by biological research into the basis of mental processing and by early attempts to represent the function of the brain with mathematical formulae. In this section, we will cover some of this early research and how it inspired what is now the field of deep learning and generative AI.
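Before turning to that history, a minimal sketch may help ground what a perceptron-style threshold logic unit (TLU) actually computes: a weighted sum of its inputs passed through a step function. The weights and threshold below are illustrative values chosen for this example, not taken from the text.

```python
import numpy as np

def tlu(inputs, weights, threshold):
    """Threshold logic unit: output 1 if the weighted sum of the
    inputs meets or exceeds the threshold, otherwise output 0."""
    return int(np.dot(inputs, weights) >= threshold)

# Illustrative example: with these weights and threshold, the TLU
# behaves like a logical AND gate over two binary inputs.
weights = np.array([1.0, 1.0])
threshold = 2.0
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, tlu(np.array(x), weights, threshold))
```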
From tissues to TLUs
The recent popularity of AI algorithms might give the false impression that this field is new. Many recent models are based on discoveries made decades ago that have been reinvigorated by the massive computational resources available in the cloud and by customized hardware for parallel matrix computations, such as Graphics Processing Units (GPUs), Tensor Processing Units (TPUs), and Field Programmable Gate Arrays (FPGAs). If we consider research on neural networks to include their biological inspiration as well as computational theory, this field is over a hundred years old.