Working with Kubernetes
We have already launched our example microservices in Docker containers, and we have even used automated CI/CD pipelines to run them on the local machine. You may, however, be asking an important question: how can we organize our environment at a larger scale, and in production, where we have to run multiple containers across multiple machines? This is exactly what we have to do when implementing microservices in accordance with the idea of cloud-native development, and many challenges remain at this point. Assuming that we have many microservices, each launched in multiple instances, there will be plenty of containers to manage. Manually starting the correct containers at the correct time, handling storage, scaling up or down, and dealing with failures would be a nightmare. Fortunately, there are platforms that help in clustering and orchestrating Docker containers at scale. Currently...
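To give a taste of what such orchestration looks like, here is a minimal sketch of a Kubernetes Deployment manifest. The service name, image, replica count, and port are illustrative assumptions, not taken from our example services; the point is that we declare the desired state and let the platform handle scheduling, scaling, and restarts:

```yaml
# Illustrative sketch: ask Kubernetes to keep three replicas of a
# hypothetical microservice image running at all times.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
spec:
  replicas: 3                        # scale up or down by changing this number
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: example/service:1.0   # hypothetical image name
          ports:
            - containerPort: 8080
```

Applied to a cluster with `kubectl apply -f deployment.yaml`, a manifest like this has Kubernetes place the containers across the available machines and replace any instance that fails, which covers exactly the manual-management chores described above.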