It could save your business money and your data in case of disaster. Ian Murphy explains how to get started with virtualisation.

Reduce the costs of power and cooling; improve the use of assets; do more with less hardware. These have become the underlying goals of IT departments
over the past few years, and one of the primary technologies for achieving these goals is virtualisation. So what is virtualisation and why is it so important for businesses?
Virtualisation is a range of technologies that allow you to run operating systems and applications in their own virtual machine (VM). Each VM is a separate environment, so any problems with an application are contained within it, leaving other VMs completely unaffected. This containment means you can deploy multiple VMs on the same physical box and so reduce the number of computers your business needs. This obviously reduces your hardware expenditure, as well as the space needed in the datacentre, and generates more value for money from the hardware you own.
Another benefit of VMs that’s often overlooked is that they consist of only a few files. Provided you shut down the applications and the VM properly, you can copy these files to another machine or back them up to tape quickly and easily. This provides a simpler, more robust and reliable backup and disaster recovery option than conventional backup approaches.
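Because a powered-off VM is just a handful of files, backing one up can be as simple as archiving a directory. The sketch below illustrates the idea with made-up file names in the style of a VMware guest (a `.vmx` configuration file and a `.vmdk` virtual disk); the paths and names are hypothetical, and in practice you'd shut the VM down cleanly first.

```shell
# Hypothetical example: the directory layout and file names below are
# illustrative, not taken from any particular product. A real VM must be
# shut down (or snapshotted) before its files are copied.
VM_DIR="/tmp/vms/webserver"
mkdir -p "$VM_DIR"
touch "$VM_DIR/webserver.vmx" "$VM_DIR/webserver.vmdk"   # config + virtual disk

# With the VM powered off, the whole machine is just these files,
# so a single archive captures it for backup or disaster recovery:
tar czf /tmp/webserver-backup.tar.gz -C /tmp/vms webserver

# List the archive contents to confirm what was captured:
tar tzf /tmp/webserver-backup.tar.gz
```

Restoring to another physical machine is the same operation in reverse: extract the archive and register the VM with the hypervisor on the new host.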
We’ll explain the different types of virtualisation and discuss their real-world pros and cons, to help you decide whether virtualisation will suit your organisation.
There are two primary approaches to machine virtualisation. The first is hardware virtualisation, where the VM sits almost directly on the physical hardware; the second is hosted virtualisation, in which the VM sits on top of a host operating system. Whichever you choose, each VM contains its own operating system along with the application.
In hardware virtualisation, little more than a thin layer of software, known as a hypervisor, sits on top of the physical hardware. The VMs run on this layer and have full access to the underlying hardware and its resources. Whenever a VM needs a change in resources (such as more memory or access to another processor), this can be made in real-time without having to stop and restart the VM.
The key advantage here is flexibility, and the fact that each VM has access to the native hardware. This ensures no resources are wasted by a host operating system hogging them for itself. On the flip side, the hardware must be supported by the hypervisor: this is not only restrictive, but it also makes the underlying computer more expensive. Thankfully, the situation is improving as more vendors enter the market.
IBM and VMware are the two biggest vendors in this space, but Microsoft is working on its own, long-awaited hypervisor solution. All three vendors see virtualisation as being essential to the next wave of hardware and operating systems. They aren’t alone. Intel and AMD are both enhancing support for virtualisation in forthcoming processors.
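If you're wondering whether your own machine already has these processor extensions, on Linux the CPU flags give the answer. The sketch below checks for the `vmx` flag (Intel's VT-x) or the `svm` flag (AMD's AMD-V) in `/proc/cpuinfo`; the check is a common technique rather than anything vendor-specific.

```shell
# On Linux, /proc/cpuinfo lists the CPU's feature flags:
#   vmx = Intel VT-x, svm = AMD-V
if grep -Eq 'vmx|svm' /proc/cpuinfo; then
    echo "Hardware virtualisation extensions present"
else
    echo "No VT-x/AMD-V flags found"
fi
```

Note that some systems ship with these extensions disabled in the BIOS, so a missing flag doesn't always mean the processor lacks the feature.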
[Caption: Virtualisation allows IT managers to map resources effectively.]