Virtualization in Cloud Computing

Virtualization is one of the technologies that has revolutionized the way computing infrastructure is maintained and rented as a public utility.

You may be surprised to learn that virtualization dates back to the early 1970s, when IBM introduced it to virtualize its mainframe systems through hardware virtualization, a mode of service delivery that remains popular today.

Let’s start with a fundamental question:

What is virtualization?

Red Hat’s documentation defines it well:

Virtualization is a technology that allows you to create multiple simulated environments or dedicated resources from a single, physical hardware system. Software called a hypervisor connects directly to that hardware and allows you to split 1 system into separate, distinct, and secure environments known as virtual machines (VMs). These VMs rely on the hypervisor’s ability to separate the machine’s resources from the hardware and distribute them appropriately.
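That last point, the hypervisor separating a machine’s resources and distributing them among VMs, can be illustrated with a toy model. The sketch below is our own simplified example (the `Hypervisor` and `VM` names are illustrative, not any real hypervisor API): it only tracks the capacity bookkeeping, not the actual hardware isolation that real hypervisors such as KVM or Xen enforce.

```python
# Toy model of a hypervisor partitioning one host's resources among VMs.
# Illustrative only: real hypervisors enforce isolation in hardware and
# software; this sketch just does the capacity accounting.

class Hypervisor:
    def __init__(self, cpus, ram_gb):
        self.free_cpus = cpus    # physical CPU cores still unallocated
        self.free_ram = ram_gb   # physical RAM (GB) still unallocated
        self.vms = {}            # name -> (cpus, ram_gb) of each VM

    def create_vm(self, name, cpus, ram_gb):
        # This simple model refuses to overcommit physical resources.
        if cpus > self.free_cpus or ram_gb > self.free_ram:
            raise RuntimeError(f"not enough resources for {name}")
        self.free_cpus -= cpus
        self.free_ram -= ram_gb
        self.vms[name] = (cpus, ram_gb)

host = Hypervisor(cpus=16, ram_gb=64)
host.create_vm("web", cpus=4, ram_gb=8)
host.create_vm("db", cpus=8, ram_gb=32)
print(host.free_cpus, host.free_ram)  # 4 cores and 24 GB left for more VMs
```

Each VM sees only the slice it was given, while the hypervisor keeps the global view of what remains free on the physical machine.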

Virtualization thus allows web hosting and data center service providers to make the best use of their hardware investments while keeping operational costs down.

Thanks to this efficiency, service providers can pass the savings on to their users and customers through low-cost services, lowering the barrier to entry for new customers.

Virtualization technology delivers several key benefits:

  • Increased performance through optimized sharing of compute capacity.
  • Better use of under- and over-utilized hardware and software resources.
  • Reduced indirect carbon emissions.
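To make the utilization point concrete: before virtualization, each workload typically got its own dedicated server, most of which sat idle. With VMs, a provider can consolidate many workloads onto fewer physical hosts. A minimal first-fit packing sketch (our own toy example, not any production scheduler):

```python
# First-fit consolidation: pack VM CPU demands onto as few hosts as
# possible. A naive one-workload-per-server layout would need
# len(demands) machines instead.

def consolidate(demands, host_capacity):
    hosts = []  # remaining free capacity of each host already in use
    for d in demands:
        for i, free in enumerate(hosts):
            if d <= free:
                hosts[i] -= d  # fits on an existing host
                break
        else:
            hosts.append(host_capacity - d)  # power on a new host
    return len(hosts)

# Eight workloads that would each have idled a dedicated 16-core server:
demands = [2, 4, 1, 8, 3, 2, 6, 4]
print(consolidate(demands, host_capacity=16))  # prints 2: 2 hosts, not 8
```

Fewer powered-on machines for the same workloads is also where the reduced carbon footprint comes from.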

Virtualization technology has a long history of development and has gone through various phases of research, development, and implementation. In fact, several research projects have been conducted to support the simulation of cloud-based virtualization systems. These simulators model real-world cloud behavior so that researchers can develop state-of-the-art optimization policies.
