Chapter 1: What Is Cloud Computing?
Cloud computing has become the default approach for designing, building, and deploying Information Technology (IT) applications for businesses across the globe. In the past, you would host the entire infrastructure yourself, hire a team of developers, and design every component and process required to build your applications. This approach not only ate into the bottom line, but it also often failed to follow best practices and left little room for flexibility or innovation.
Understanding cloud computing has become vital for IT professionals worldwide who want to stay relevant and progress in their careers. You can no longer deliver old-school solutions to your clients; it is simply not cost-effective in today's fast-paced IT world.
In addition, architecting solutions for the cloud comes with its own challenges, such as security considerations and network connectivity. This makes it crucial to upskill so that you can gain a deep understanding of how to build resilient, scalable, and reliable solutions that can be hosted in the cloud.
In this chapter, we introduce you to the concept of cloud computing, what it includes, and the key advantages of moving to the cloud. We also discuss the various cloud computing models, as well as deployment options for the cloud. Understanding the key differences between the models and deployment options and their use cases and benefits is fundamental to formulating an effective cloud-adoption strategy for your business.
We also provide a high-level overview of virtualization, the principal ingredient that has made cloud computing possible.
This chapter covers the following topics:
- What is cloud computing?
- Exploring the basics of virtualization
- Exploring cloud computing models
- Understanding cloud deployment models