
Tech Guides - Virtualization

3 Articles

Are containers the end of virtual machines?

Vijin Boricha
13 Jun 2018
5 min read
For quite some time now, virtual machines (VMs) have enjoyed a lot of traction. The main reason for this trend was that the IT industry became convinced that, instead of filling a huge room with servers, it is better to deploy all of your workloads on a single piece of hardware. There is no doubt that virtual machines have succeeded: they save a lot of cost, work well, and make failovers easier. When containers were introduced they received similar attention, and they have recently become even more popular among IT organisations. There is a set of considerable reasons for this buzz: containers are highly scalable, easy to use, portable, faster to start, and cost effective. Containers also reduce management headaches, as they share a common operating system; with this kind of flexibility it is much easier to fix bugs, apply update patches, and make other alterations. All in all, containers are lightweight and more portable than virtual machines. If all of this is true, are virtual machines going extinct? To answer that, you will have to dive into the complexities of both worlds.

How do virtual machines work?

A virtual machine is an individual operating system installed on top of your usual operating system, implemented through software emulation and hardware virtualization. Usually multiple virtual machines run on a single server: the physical machine remains the same, but each virtual environment runs a completely separate service. Consider an Ubuntu server running as a VM; you can use it to install all or any of the services you need. If your deployment needs a set of software to handle web applications, you provide all the necessary services to your application. If a requirement for an additional service suddenly appears while all your physical resources are already occupied, all you need to do is install the new service on the guest virtual machine, and you can relax.
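To make that workflow concrete, here is a minimal sketch of defining such an Ubuntu guest with Vagrant. This is an illustrative setup, not from the article: it assumes Vagrant with the VirtualBox provider is installed, and the box name, resources, and nginx provisioning step are all examples.

```ruby
# Vagrantfile: one Ubuntu guest acting as a web-server VM.
# Bring it up with `vagrant up`; add services later with `vagrant provision`.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/jammy64"                 # guest operating system image
  config.vm.network "forwarded_port", guest: 80, host: 8080

  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048                               # a full OS needs real RAM...
    vb.cpus   = 2                                  # ...and its own CPU share
  end

  # Install the services the application needs inside the guest
  config.vm.provision "shell",
    inline: "apt-get update && apt-get install -y nginx"
end
```

Adding the "additional service" the article mentions is just another provision line, installed inside the same guest.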
Advantages of using virtual machines

- Multiple OS environments can run simultaneously on the same physical machine
- Easy to maintain, highly available, convenient recovery, and application provisioning
- Virtual machines tend to be more secure than containers
- Operating system flexibility on VMs is better than that of containers

Disadvantages of using virtual machines

- Simultaneously running virtual machines may show unstable performance, depending on the workload the other running virtual machines place on the system
- Hardware accessibility becomes quite difficult with virtual machines
- Virtual machines are heavy, taking up several gigabytes each

How do containers work?

You can think of containers as lightweight, executable packages that provide everything an application needs to run and function as desired. A container usually sits on top of a physical server and its host OS, allowing applications to run reliably in different environments by abstracting away the operating system and physical infrastructure. So where VMs depend entirely on dedicated hardware, containers require significantly less of it and do the job with ease and efficiency. Suppose you want to deploy multiple web servers quickly: containers make that easier, because each one packages a single service and therefore needs far less hardware than a virtual machine. The benefits of using containers do not end there. Docker, a popular container solution, can cluster a group of Docker Engines so that they are managed as a single virtual system. If you are looking to deploy apps at scale with fewer failovers, containers should be your first preference.
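As a sketch of that "single virtual system" idea, a Docker Swarm stack file can declare several replicas of a web service in a few lines. The service name, image, and ports here are illustrative, and the file assumes a swarm has already been initialised with `docker swarm init`:

```yaml
# stack.yml - deploy with: docker stack deploy -c stack.yml web
version: "3.8"
services:
  web:
    image: nginx:alpine        # one service per container
    deploy:
      replicas: 3              # swarm spreads these across the cluster's nodes
      restart_policy:
        condition: on-failure  # failed replicas are rescheduled automatically
    ports:
      - "8080:80"
```

The cluster then behaves as one system: you address the stack, and swarm decides which engine each replica runs on.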
Advantages of using containers

- You can always add more computing workload to the same server, as containers consume fewer resources
- Servers can load more containers than virtual machines, as containers are usually measured in megabytes
- Containers make it easier to allocate resources to processes, which helps run your applications in different environments
- Containers are cost-effective solutions that help decrease both operating and development costs
- Bug tracking and testing are easier with containers, as there is no difference between running your application locally, on test servers, or in production
- Development, testing, and deployment time all decrease with containers

Disadvantages of using containers

- Since containers share the kernel and other components of the host operating system, they are more vulnerable, and a compromise can impact the security of other containers as well
- Lack of operating system flexibility: every time you want to run a container on a different operating system, you need to start a new server

Now, coming to the original question: are containers worth it? Will they eliminate virtualization entirely? After reading this article you have probably already guessed the answer by weighing the advantages and disadvantages of each platform. In virtual machines, the hardware is virtualized to run multiple operating system instances. If you need a complete platform that can provide multiple services, virtual machines are your answer, as virtualization is considered a mature and secure technology. If you are looking for high scalability, agility, speed, a light footprint, and portability, all of that comes under one hood: containers. With this standardised unit of software, you can stay ahead of the competition. If you still have concerns over security, and over how a vulnerable kernel can jeopardize an entire cluster, then DevSecOps is your knight in shining armor. The whole idea of DevSecOps is to bring operations and development together with security functions.
In a nutshell, everyone involved in a software development life cycle is responsible for security.

Read next:

- Kubernetes Containerd 1.1 integration is now generally available
- Top 7 DevOps tools in 2018
- What's new in Docker Enterprise Edition 2.0?


The key differences between Kubernetes and Docker Swarm

Richard Gall
02 Apr 2018
4 min read
The orchestration war between Kubernetes and Docker Swarm appears to be over. Back in October, Docker announced that its Enterprise Edition could be integrated with Kubernetes. This move was widely seen as the Docker team conceding to Kubernetes' dominance as an orchestration tool. But Docker Swarm nevertheless remains popular; it doesn't look like it's about to fall off the face of the earth. So what is the difference between Kubernetes and Docker Swarm? And why should you choose one over the other?

To start with, it's worth saying that both container orchestration tools have a lot in common. Both let you run a cluster of containers, allowing you to increase the scale of your container deployments significantly without cloning yourself to mess about with the Docker CLI (although, as you'll see, you could argue that one is more suited to scalability than the other). Ultimately, you'll need to view the various features and key differences between Docker Swarm and Kubernetes in terms of what you want to achieve. Do you want to get up and running quickly? Are you looking to deploy containers on a huge scale? Here's a brief but useful comparison of Kubernetes and Docker Swarm. It should help you decide which container orchestration tool you should be using.

Docker Swarm is easier to use than Kubernetes

One of the main reasons you'd choose Docker Swarm over Kubernetes is that it has a much gentler learning curve. As popular as it is, Kubernetes is regarded by many developers as complex, and many people complain that it is difficult to configure. Docker Swarm, meanwhile, is actually pretty simple and much more accessible to less experienced programmers. And if you need a container orchestration solution now, simplicity is likely to be an important factor in your decision making.

...But Docker Swarm isn't as customizable

Although ease of use is definitely one thing Docker Swarm has over Kubernetes, it also means there's less you can actually do with it.
Yes, it gets you up and running, but if you want to do something a little different, you can't. You can configure Kubernetes in a much more tailored way than Docker Swarm. While the learning curve is steeper, the possibilities and opportunities open to you will be far greater.

Kubernetes gives you auto-scaling - Docker Swarm doesn't

When it comes to scalability, it's a close race. Both tools are able to run around 30,000 containers on 1,000 nodes, which is impressive. However, when it comes to auto-scaling, Kubernetes wins, because Docker Swarm doesn't offer that functionality out of the box.

Monitoring container deployments is easier with Kubernetes

This is where Kubernetes has the edge: it has built-in monitoring and logging solutions, whereas with Docker Swarm you'll have to use third-party applications. That isn't necessarily a huge problem, but it does make life ever so slightly more difficult. Whether that extra difficulty outweighs Kubernetes' steeper learning curve, however, is another matter...

Is Kubernetes or Docker Swarm better?

Clearly, Kubernetes is a more advanced tool than Docker Swarm. That's one of the reasons why the Docker team backed down and opened up their enterprise tool for integration with Kubernetes. Kubernetes is simply the software that's defining container orchestration. And that's fine - Docker has cemented its position within the stack of technologies that support software automation and deployment. It's time to let someone else take on the challenge of orchestration.

But although Kubernetes is the more 'advanced' tool, that doesn't mean you should overlook Docker Swarm. If you want to begin deploying container clusters without the need for specific configurations, then don't allow yourself to be seduced by something shinier, something ostensibly more popular. As with everything else in software development, understand and define what job needs to be done - then choose the right tool for the job.
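To illustrate the auto-scaling point, here is a minimal sketch of a Kubernetes HorizontalPodAutoscaler. The deployment name and thresholds are illustrative, not from the article, and the manifest assumes a Deployment named `web` already exists and the cluster's metrics server is running:

```yaml
# hpa.yml - apply with: kubectl apply -f hpa.yml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10            # Kubernetes adds or removes pods within this range
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # scale out when average CPU passes 70%
```

Docker Swarm, by contrast, only changes a service's replica count when you re-run the deploy or update command yourself; nothing watches load and adjusts it for you out of the box.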


Docker has turned us all into sysadmins

Richard Gall
29 Dec 2015
5 min read
Docker has been one of my favorite software stories of the last couple of years. On the face of it, it should be pretty boring. Containerization isn't, after all, as revolutionary as most of the hype around Docker would have you believe. What's actually happened is that Docker has refined the concept and found a really clear way of communicating the idea.

Deploying applications and managing your infrastructure doesn't sound immediately 'sexy'. After all, it was data scientist that was proclaimed the sexiest job of the twenty-first century; sysadmins hardly got an honorable mention. But Docker has, amazingly, changed all that. It's started to make sysadmins sexy... And why should we be surprised? If a sysadmin's role is all about delivering software, managing infrastructure, maintaining it, and making sure it performs for the people using it, it's vital (if not obviously sexy). A decade ago, when software architectures were apparently immutable and much more rigid, the idea of administration wasn't quite so crucial. But now, in a world of mobile and cloud, where technology is about mobility as much as it is about stability (in the past, tech glued us to desktops; now it encourages us to work in the park), sysadmins are crucial.

Tools like Docker are central to this. By letting us isolate and package applications in their component pieces, we can start using software in a way that's infinitely more agile and efficient. Where once the focus was on making sure software was simply 'there', waiting for us to use it, it's now something that actively invites invention, reconfiguration and exploration. Docker's importance to the 'API economy' (which you're going to be hearing a lot more about in 2016) only serves to underline its significance to modern software. Not only does it provide 'a convenient way to package API-provisioning applications', but it also 'makes the composition of API-providing applications more programmatic', as this article on InfoWorld has it.
Essentially, it's a tool that unlocks and spreads value. Can we, then, say the same about the humble sysadmin? Well, yes - it's clear that administering systems is no longer a matter of simple organization, a question of robust management, but a business-critical role that can be the difference between success and failure. What this paradigm shift really means, however, is that we've all become sysadmins. Whatever role we're working in, we're deeply conscious of the importance of delivery and collaboration. It's not something we expect other people to do; it's something that we know is crucial. And it's for that reason that I love Docker - it's being used across the tech world, a gravitational pull bringing together disparate job roles in a way that's going to become more and more prominent over the next 12 months. Let's take a look at just two of the areas in which Docker is going to have a huge impact.

Docker in web development

Web development is one field where Docker has already taken hold. It's changing the typical web development workflow, arguably making web developers more productive. If you build in a single container on your PC, that container can then be deployed and managed anywhere. It also gives you options: you can build different services in different containers, or you can build a full-stack application in a single container (although Docker purists might say you shouldn't). In a nutshell, it's this ability to separate an application into its component parts that underlines why microservices are fundamental to the API economy: different 'bits' - the services - can be used and shared between different organizations. Fundamentally, though, Docker bridges the difficult gap between development and deployment. Instead of having to worry about what happens after deployment, when you build inside a container you can be confident that it's going to work - wherever you deploy it.
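As a concrete sketch of that "build once, run anywhere" workflow, a web service can be packaged with a short Dockerfile. The base image, app layout, and entry point here are illustrative assumptions, not from the article:

```dockerfile
# Build with: docker build -t my-web-app .
# Run with:   docker run -p 8080:3000 my-web-app
FROM node:lts-alpine            # everything the app needs travels with it
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev           # install exactly the locked dependencies
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]       # same entry point locally, in test, in production
```

The image built on your PC is byte-for-byte what runs on the server, which is precisely why the development-to-deployment gap narrows.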
With Docker, delivering your product is easier (essentially, it helps developers manage the 'ops' bit of DevOps in a simpler way than tackling the methodology in full), which means you can focus on the specific process of development and on optimizing your products.

Docker in data science

Docker's place within data science isn't quite as clearly defined or as fully realised as it is in web development. But it's easy to see why it would be so useful to anyone working with data. What I like is that with Docker, you really get back to the 'science' of data science - it's the software version of working in a sterile, controlled environment. This post provides great insight into just how useful Docker is for data - admittedly it wasn't something I had thought that much about, but once you do, it's clear just how simple it is. As the author puts it: 'You can package up a model in a Docker container, go have that run on some data and return some results - quickly. If you change the model, you can know that other people will be able to replicate the results because of the containerization of the model.'

Wherever Docker rears its head, it's clearly a tool that can be used by everyone. However you identify - web developer, data scientist, or anything else for that matter - it's worth exploring and learning how to apply Docker to your problems and projects. Indeed, the huge range of Docker use cases is possibly one of the main reasons that Docker is such an impressive story - the fact that there are thousands of other stories all circulating around it. Maybe it's time to try it and find out what it can do for you?