Preface
So, you want to work with foundation models? That is an excellent place to begin! Many of us in the machine learning community have followed these curious creatures for years, from their earliest emergence in the first days of the Transformer models, to their expansion into computer vision, to the near-ubiquitous presence of text generation and interactive dialogue we see in the world today.
But where do foundation models come from? How do they work? What makes them tick, and when should you pretrain and fine-tune them? How can you eke out performance gains on your datasets and applications? How many accelerators do you need? What does an end-to-end application look like, and how can you use foundation models to master this new surge of interest in generative AI?
These pages hope to provide answers to these very important questions. As you are no doubt aware, the pace of innovation in this space is truly breathtaking, with more foundation models coming online every day from both open-source and proprietary model vendors. To grapple with this reality, I’ve tried to focus on the most important conceptual fundamentals throughout the book. This means your careful study here should pay off for at least a few years to come.
In terms of practical applications and guidance, I’ve overwhelmingly focused on cloud computing options available through AWS, especially Amazon SageMaker. I’ve happily spent more than five years at AWS and enjoy sharing all of my knowledge and experience with you! Please do note that all thoughts and opinions shared in this book are my own and do not represent those of Amazon.
The following chapters focus on concepts, not code. This is because software changes rapidly, while fundamentals change very slowly. In the repository that accompanies the book, you’ll find links to my go-to resources for all of the key topics mentioned throughout these fifteen chapters, which you can use right away to get hands-on with everything you’re learning here. Starting July 1, 2023, you’ll also find in the repository a set of new pretraining and fine-tuning examples from yours truly to complement all of the topics.
You might find this hard to believe, but in my early twenties I wasn’t actually coding: I was exploring the life of a Buddhist monastic. I spent five years living at a meditation retreat center in Arizona, the Garchen Institute. During this time, I learned how to meditate, focus my mind, watch my emotions, and develop virtuous habits. Years later, after my master’s degree at the University of Chicago, and now at Amazon, I can see that these traits are extremely useful in today’s world as well!
I mention this so that you can take heart. Machine learning, artificial intelligence, cloud computing, economics, application development: none of these topics is straightforward. But if you apply yourself, if you really stretch your mind to consider the core foundations of the topics at hand, if you keep coming back to the challenge again and again, there’s truly nothing you can’t do. That is the beauty of humanity! And if a meditating yogi straight from the deep silence of a retreat hut can eventually learn what it takes to pretrain and fine-tune foundation models, then so can you!
With that in mind, let’s learn more about the book itself!
Note
Most of the concepts mentioned here will be accompanied by scripting examples in the repository starting July 1, 2023. However, to get you started even earlier, you can find a list of resources in the repository today, with links to useful hands-on examples hosted elsewhere.