Foundation Models
Advanced large multipurpose transformer models represent such a paradigm change that they require a new name to describe them: Foundation Models. Accordingly, Stanford University created the Center for Research on Foundation Models (CRFM). In August 2021, the CRFM published a two-hundred-page paper (see the References section) written by over one hundred scientists and professionals: On the Opportunities and Risks of Foundation Models.
Foundation Models were not created by academia but by the big tech industry. Google researchers introduced the transformer architecture, which led to Google BERT, LaMDA, PaLM 2, and more. Microsoft partnered with OpenAI, whose GPT-4 powers ChatGPT, with more models to come.
Big tech had to find a better model to cope with the exponentially growing petabytes of data flowing into their data centers. Transformers were thus born out of necessity.
Let’s consider the evolution of LLMs to understand the need for industrialized AI models.
Transformers...