
Virtually Perfect? Back-up and recovery strategies for a brave new virtual world

With the wave of virtualisation sweeping across the business IT infrastructure, Mark Galpin, Product Marketing Manager of Quantum, encourages IT managers to embrace the advantages of virtualisation after fully considering the impact on the back-up and recovery infrastructure.

There can be no doubt that virtualisation is the technology trend of the moment. Google the term and more than 30m links offering expertise in the area appear in milliseconds, and this is not just more technology hype: the trend is having a real impact on the business IT landscape. Drivers for virtualisation range from hardware, power and space savings through to increased manageability and data protection. Analyst group Forrester reports that 23 per cent of European firms are today using server virtualisation, with a further 12 per cent piloting the process as a means of reducing costs. IDC also predicts that virtualised servers will account for 15 per cent of total server shipments in 2010, up from 5 per cent in 2005. And with the recent flotation of virtualisation leader VMware at a market value of £9 billion, many investors as well as IT experts are betting their business on this trend becoming accepted everyday best practice.

Virtualisation brings benefits
Virtualisation has brought us new ways of doing things from managing desktop operating systems to consolidating servers. What's also interesting is that virtualisation has become a conceptual issue - a way to deconstruct fixed and relatively inflexible architectures and reassemble them into dynamic, flexible and scalable infrastructures.

Today’s powerful x86 computer hardware was originally designed to run only a single operating system and a single application, but virtualisation breaks that bond, making it possible to run multiple operating systems and multiple applications on the same computer at the same time, increasing the utilisation and flexibility of hardware.

In essence, virtualisation lets you transform hardware into software to create a fully functional virtual machine that can run its own operating system and applications just like a “real” computer.

Multiple virtual machines share hardware resources without interfering with each other so that you can safely run several operating systems and applications at the same time on a single computer.

The VMware approach to virtualisation inserts a thin layer of software directly on the computer hardware or on a host operating system. This software layer creates virtual machines and contains a virtual machine monitor or “hypervisor” that allocates hardware resources dynamically and transparently so that multiple operating systems can run concurrently on a single physical computer without even knowing it.

However, virtualising a single physical computer is just the beginning. A robust virtualisation platform can scale across hundreds of interconnected physical computers and storage devices to form an entire virtual infrastructure.

By decoupling the entire software environment from its underlying hardware infrastructure, virtualisation enables the aggregation of multiple servers, storage infrastructure and networks into shared pools of resources that can be delivered dynamically, securely and reliably to applications as needed. This pioneering approach enables organisations to build a computing infrastructure with high levels of utilisation, availability, automation and flexibility using building blocks of inexpensive industry-standard servers.

But benefits can come with initial increased complexity
One of the great strengths of virtualisation is its apparent simplicity and its ability to simplify and increase flexibility within the IT infrastructure. However, some important lessons are emerging from early adopters' experience. IT managers looking to unleash virtualisation technology in their production networks should anticipate a major overhaul of their management strategies as well, because as virtualisation adds flexibility and mobility to server resources, it also increases the complexity of the environment in which the technology lives. Virtualisation requires new thinking and new ways of being managed, particularly around back-up and recovery of storage in a virtualised environment.

Virtual servers have different management needs and have capabilities that many traditional tools cannot cope with. They can disappear by being suspended or be deleted entirely, and they can move around and assume new physical addresses.
As a result, some existing infrastructures need to become more compatible with virtual machines in areas such as back-up and recovery.

Many of the virtualisation deployments to date have been implemented on application or file servers where unstructured data is the key information. In these environments, VMware tools for back-up and recovery work well. Copies of the virtual machine images can be taken once a week, moved out to a proxy server and then saved onto tape in a traditional manner.
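
The weekly protection flow described above can be sketched in outline. This is a hedged illustration only: the file paths, the staging ("proxy") directory and the tar archive standing in for tape are invented for the example, and no real VMware or Quantum API is used.

```python
# Hypothetical sketch of the weekly image-protection flow: copy virtual
# machine images to a staging ("proxy") area, then archive the staged copies
# in one pass, as a tape job would. Names are illustrative assumptions.
import shutil
import tarfile
from pathlib import Path

def stage_images(image_dir: Path, proxy_dir: Path) -> list[Path]:
    """Copy each .vmdk image to the proxy area; return the staged paths."""
    proxy_dir.mkdir(parents=True, exist_ok=True)
    staged = []
    for image in sorted(image_dir.glob("*.vmdk")):
        dest = proxy_dir / image.name
        shutil.copy2(image, dest)   # point-in-time copy; the VM keeps running
        staged.append(dest)
    return staged

def archive_staged(staged: list[Path], archive_path: Path) -> Path:
    """Write the staged copies to a single archive (a stand-in for tape)."""
    with tarfile.open(archive_path, "w") as tar:
        for path in staged:
            tar.add(path, arcname=path.name)
    return archive_path
```

The point of the two-step design is the one made in the article: the copy to the proxy area is quick and decoupled from the slower, sequential write to the tape-like target.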

Real returns are available through virtualising structured data
But the real returns on investment from virtualisation will come from its ability to virtualise the structured data of key applications such as Oracle, SQL or Exchange. Many of these areas have been avoided to date because of the complexity of protecting these critical business applications in a virtualised environment. The standard VMware replication tools take a point-in-time snapshot image and do not provide a consistent state for recovery and rebuild of structured data.

The answer for critical applications where recovery times need to be seconds rather than hours is to build expensive highly available configurations. This solves system or site loss risks but protection is still required against data corruption, accidentally deleted data and virus attack. Less critical systems also need to be protected and data sets retained for compliance and regulatory purposes. In most data centres, traditional backup and recovery will be performing these functions today using familiar software tools that integrate with the database and tape or disk targets for the data.

So the obvious solution is to continue to back everything up as before. But in a virtualised environment the increased load on the network infrastructure would quickly become unbearable, with machines grinding to a halt and applications groaning. Tape systems, with their high bandwidths and intolerance of small data streams, are also unsuitable as targets, since more flexibility is needed to schedule back-ups to multiple devices.

The answer is disk-based back-up appliances
With structured data, the answer is to use new disk-based back-up appliances to protect data. Using a Quantum DXi solution, for example, businesses can combine enterprise disk back-up features with data de-duplication and replication technology to provide data centre protection and anchor a comprehensive data protection strategy for virtualised environments.

DXi solutions bring a number of additional benefits. Just as they are useful for storing structured data, they are also effective at storing virtual machine disk format (VMDK) images and unstructured data, meaning users can benefit from a single point of data management. A further benefit of storing VMDK images on de-duplicated disk is that VMDK images are very much alike and so achieve an exceptionally high de-duplication ratio. This means much larger volumes of data can be stored on limited disk space.
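
Why near-identical VMDK images de-duplicate so well can be seen in a minimal sketch of block-level de-duplication, in which identical fixed-size chunks are stored once and referenced by hash. The 4 KB chunk size and in-memory store are illustrative assumptions, not how a DXi appliance is actually implemented.

```python
# Minimal block-level de-duplication sketch: images that share a common base
# (as cloned VMDKs do) share most of their chunks, so the store holds far
# less than the raw total. Chunk size and store layout are assumptions.
import hashlib

CHUNK = 4096

def dedupe(images: dict[str, bytes]):
    """Split each image into chunks; keep one copy of each unique chunk."""
    store: dict[str, bytes] = {}        # hash -> chunk data (stored once)
    recipes: dict[str, list[str]] = {}  # image -> ordered list of chunk hashes
    for name, data in images.items():
        hashes = []
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)
            hashes.append(digest)
        recipes[name] = hashes
    return recipes, store

def restore(name, recipes, store) -> bytes:
    """Rebuild an image from its recipe of chunk hashes."""
    return b"".join(store[h] for h in recipes[name])
```

Two images built from the same base OS image differ only in their last few chunks, so the shared chunks are stored once, which is exactly the effect the article describes.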

The DXi range leverages Quantum’s patented data de-duplication technology to dramatically increase the role that disk can play in the protection of critical data. With the DXi solutions, users can retain 10 to 50 times more back-up data on fast recovery disk than with conventional arrays.

With remote replication of back-up data also providing automated disaster recovery protection, DXi users can transmit back-up data from one or many remote sites, each equipped with any DXi-Series model, to a central, secure location, reducing or eliminating media handling. DXi-Series replication is asynchronous and automated, and it operates as a background process.
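
Asynchronous, background replication of this general shape can be sketched as follows. The in-memory "sites" and the queue-plus-worker design are assumptions made for the example; they stand in for the appliances and say nothing about Quantum's actual implementation.

```python
# Sketch of asynchronous replication: writes complete locally at once, and a
# background worker ships them to the replica later. The dicts stand in for
# local and remote appliances; this is illustrative, not a vendor API.
import queue
import threading

class AsyncReplicator:
    def __init__(self):
        self.local: dict[str, bytes] = {}
        self.replica: dict[str, bytes] = {}
        self._q: queue.Queue = queue.Queue()
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def write(self, key: str, data: bytes) -> None:
        self.local[key] = data    # acknowledged immediately
        self._q.put((key, data))  # replication happens in the background

    def _drain(self) -> None:
        while True:
            key, data = self._q.get()
            self.replica[key] = data
            self._q.task_done()

    def wait_synced(self) -> None:
        self._q.join()            # for demos/tests: block until caught up
```

The design choice mirrors the article's point: because replication is asynchronous, the back-up job is never held up by the WAN link to the central site.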

Whether businesses are looking to increase return on investment from their virtualisation implementations or planning a virtualised environment, the lessons are clear. To make the most of this latest technological innovation, IT managers must plan their recovery and back-up strategies to cater for the virtual new world.

Quantum is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com

Quantum Corp. is the leading global storage company specializing in backup, recovery and archive. Combining focused expertise, customer-driven innovation, and platform independence, Quantum provides a comprehensive, integrated range of disk, tape, and software solutions supported by a world-class sales and service organization. As a long-standing and trusted partner, the company works closely with a broad network of resellers, OEMs and other suppliers to meet customers’ evolving data protection needs. Quantum Corp. www.quantum.com.

Source: StoragePR

Survey finds 38% of Organisations have already Gone Virtual

Results reveal flexibility, cost and availability are key motivating factors for adopting virtualised infrastructures

Worcester, UK – 2nd September 2008 – Double-Take Software announced the results of a survey that it ran in collaboration with the organisers of Storage Expo 2008, the UK’s definitive event for data storage, information and content management, which takes place in London on 15th and 16th October. Key findings include overwhelming interest in virtualisation, with 38% of respondents having already virtualised their production infrastructure and 52% having plans to do so. Increased infrastructure flexibility was cited as the main benefit of adopting virtualisation (48%), followed by cost reduction (21%). However, beyond these core factors, improved disaster recovery is widely considered a key additional benefit of virtualisation, with 82% stating that it is a significant feature.

Natalie Booth, event manager for Storage Expo 2008, commented, “Virtualisation has seen dramatic adoption rates in recent years. From our discussions with CIOs and analysts we expect this trend to accelerate. We have included a series of seminars on virtualisation in the education programme of Storage Expo 2008 to help organisations explore its benefits. In the keynote on Improving Asset Utilisation with Virtualisation, John Abbott, Chief Analyst from The 451 Group, will lead a panel of speakers from Oxford University, the Wellcome Trust, the British Horse Racing Authority and Liverpool Women's NHS Foundation Trust. Other virtualisation presentations include ‘Storage Virtualisation, does it meet today's business needs?’ and ‘How to help you reduce spend by 50% on Storage for Virtualised Servers’.”

Ian Masters, UK sales and marketing director at Double-Take Software, said, “We could have predicted that reduced costs would appear as a major factor in the adoption of virtualised infrastructures. However, the high interest expressed in the flexibility of virtualised infrastructures was less predictable and is very significant. It demonstrates that organisations are unhappy with the constraints native to traditional physical infrastructures and are actively seeking ways that can make the management of their infrastructure more fluid and open to innovation.”

Although 82% of respondents believe that improved disaster recovery is a key additional benefit of virtualisation, Double-Take Software maintains that it is not an automatic by-product. All instances of virtualised systems rely on the physical hardware on which shared data is stored. If physical systems are not adequately protected and fail, then virtualised systems will also fail. Data replication technology from Double-Take Software ensures that any virtualised infrastructure, or mixed environment, is comprehensively protected, and also provides organisations with further flexibility, allowing them to dynamically manage their virtualised infrastructure.

Double-Take business continuity deployment allows businesses to move IT systems anytime, anywhere, for whatever purpose, without interfering with ongoing operations. Double-Take copies data in real time from one server, or one site, to another to create a complete duplicate on a live backup system to provide very high levels of data protection and availability. Combining Double-Take with virtualisation provides the ability to protect and dynamically manage the entire infrastructure. Whether recovering from a disaster, simplifying routine server maintenance or migrating whole data centres, Double-Take creates a dynamic infrastructure that ensures effective business continuity planning as well as allowing data centre managers to innovate in how they manage their entire infrastructure.

The survey polled 59 UK IT and business professionals with an interest in storage issues from across industry sectors and was conducted by Storage Expo, which takes place at Olympia, London, on 15th and 16th October, www.storage-expo.com.

Double-Take® Software is a leading provider of affordable software for recoverability, including continuous data replication, application availability and system state protection. Double-Take Software products and services enable customers to protect and recover business-critical data and applications such as Microsoft Exchange, SQL, and SharePoint in both physical and virtual environments. With its unparalleled partner programs, technical support, and professional services, Double-Take Software is the solution of choice for more than ten thousand customers worldwide, from SMEs to the Fortune 500. Information about Double-Take Software's products and services can be found at www.doubletake.com.

© Double-Take Software.

Source: StoragePR

Virtualisation is here to stay

By Kevin Bailey, Director Product Marketing EMEA, Symantec

Virtualisation has now been around for a number of years, yet many businesses are still not implementing virtualisation engines in a strategic way.

IT managers face increased pressure to meet the changing demands of the IT infrastructure: supporting a growing number of applications on desktops from deployment through to retirement, as well as coping with the daily deluge of software conflicts, can be a daunting task.

The increasing use of software virtualisation technology will ease this pressure by eliminating conflicts, allowing organisations to deploy and fix applications quickly, and ultimately reducing support costs and improving application reliability. However, virtualisation on its own is not the solution, but just one component of an end-to-end management solution. Virtualisation engines should be planned alongside other IT strategies, such as business continuity, disaster recovery and general availability procedures, so that IT managers can integrate them holistically into their IT environment.

One of the challenges IT managers are facing is that most current systems management tools deployed to monitor the enterprise IT infrastructure are not always built with virtualisation in mind. A common complaint is that configuration database management tools do not work properly in dynamic virtual environments.

Users commonly experience problems with their applications slowing down and PCs failing to reboot as their systems get older and more littered with applications. Magnify this problem by a thousand users and it’s clear to see how productivity within an organisation could suffer and how quickly this could become an expensive problem. These problems occur when users install new software or application updates that share common resources and code, resulting in conflicts, application failure or the reintroduction of security holes that were previously patched.

With this in mind, IT managers should seriously consider taking a look at software virtualisation technology which enables desktop applications to be run as virtual software packages, allowing users to switch applications on and off instantaneously to eliminate any conflicts. Applications can then be reinstalled remotely without adversely affecting the base Windows configuration. By simply switching an application on or off without needing to reboot, a user can keep their PC’s capacity under control as well as maximise its performance and resilience.

The technology works by deploying the software to a part of the file system that is normally hidden from Windows. As a result, the resources that are used by applications like Microsoft Word are isolated from the operating system or other applications that might have conflicting drivers.
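
The hidden-layer idea can be modelled as a simple lookup order: each application's files live in a private layer, active layers are consulted before the base system, and switching a layer off instantly restores the base view. The class, file paths and layer names below are invented for the example; this is a conceptual sketch, not Symantec's implementation.

```python
# Conceptual sketch of layered software virtualisation: each application's
# resources sit in a private layer, and file lookups consult active layers
# before the base system, so conflicting apps never see each other's copies.
class LayeredFS:
    def __init__(self, base: dict[str, str]):
        self.base = base                              # the base Windows files
        self.layers: dict[str, dict[str, str]] = {}   # app -> its private files
        self.active: list[str] = []

    def add_layer(self, app, files):
        self.layers[app] = files

    def activate(self, app):
        self.active.append(app)       # "switch on" an application, no reboot

    def deactivate(self, app):
        self.active.remove(app)       # "switch off"; the base view returns

    def resolve(self, path):
        # The most recently activated layer wins; fall back to the base files.
        for app in reversed(self.active):
            if path in self.layers[app]:
                return self.layers[app][path]
        return self.base.get(path)
```

Rollback to an old version, as described later in the article, falls out of the same mechanism: deactivate the new version's layer and activate the old one.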

IT managers can also use software virtualisation technology when testing and rolling out new versions of an application. Performing a successful upgrade to a business critical application is essential, but there is always a risk attached to changing or upgrading a package. If the application doesn’t work properly for some reason, the management team will not be interested in understanding why; they will just expect the application to be working again quickly.

Virtualisation technology can resolve upgrading issues by allowing users to simply roll back to the old version so they can continue working. This gives IT managers time to repair the damaged application before making the new package available again. In addition to this, virtualisation allows users to host multiple versions of an application on the same system giving them sufficient time to become familiar and comfortable with the new features of the package before they feel confident to move away from the old version.

Even though virtualisation has been on the radar for some time, many IT managers still do not understand the technology and how it will change the way software is managed in the future. However, once more IT managers start looking at software virtualisation and begin to see the true value of the technology, it will only be a matter of time before IT infrastructures become completely virtualised. Organisations shouldn’t make the mistake of turning a blind eye to virtualisation: it is here to stay and will be used by many IT departments in their quest to standardise IT infrastructures and achieve financial efficiencies.

Symantec (UK) Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com

Source: StoragePR

Beating the data deluge with storage virtualisation

As data volumes explode, businesses face the daunting prospect of unmanageable storage growth. Steve Murphy, UK Managing Director for Hitachi Data Systems reveals how organisations can use storage virtualisation to consolidate their systems, increase utilisation and efficiency and reduce costs.

Virtualisation is a technique rather than a specific technology, applied to areas as varied as servers, storage, applications, desktops and networks, and as a result it is often poorly understood. This article attempts to bring clarity to how virtualisation is applied to storage systems and the benefits it can deliver.

Fundamentally, virtualisation aims to abstract software from hardware, making the former independent of the latter and shielding it from the complexity of underlying hardware resources.

Storage virtualisation typically performs two functions: it makes many storage systems look like one, simplifying the management of storage resources; and, in some cases, it provides partitioning, so that one storage system appears as many, isolating applications that need to be kept separate.

Across most industries, data volumes are spiralling out of control as the amount of information and content we create grows exponentially. Data storage within most organisations is increasing by about 60% annually. This means that organisations have to increase their capital and operational expenditure in areas such as IT staff, power, cooling and even data centre space.

Traditionally, organisations have tried to deal with growing data volumes by buying more disks. However, many organisations are finding that their storage infrastructures are becoming unmanageable, while the utilisation of these systems is unacceptably low, often running at 25-30%.

Another challenge is that while data volumes grow, IT managers still need to meet the same demands: supporting new and existing applications and users so that the business remains competitive, managing risk and ensuring business continuity and maintaining compliance with government regulations and specific industry standards.

There is a strong argument for organisations to stop buying more capacity and, instead, look for ways to consolidate their existing estate and increase utilisation and reduce costs. Storage virtualisation is an increasingly popular way for organisations to address these challenges.

Storage virtualisation aims to ‘melt’ groups of heterogeneous storage systems into a common pool of storage resources. Vendors have adopted a range of methods to achieve this. One technique is to let the server handle storage virtualisation, although as the server is removed from the storage system and has other functions to manage, performance can suffer.

One of the most widely used approaches is to use the intelligent storage controller as the virtualisation engine. By installing an intelligent storage controller in front of their infrastructure, companies can aggregate existing storage systems and virtualise the services provided to host applications such as data protection, replication, authorisation and monitoring. This offers advantages such as simplified management, increased utilisation of storage resources, seamless migration across tiers of storage, lowered interoperability barriers and better integration of common functionality.
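
A toy model of the controller-based approach may help: the "controller" presents a single pooled block address space and maps each virtual block onto whichever backend array currently has the most free capacity. The device names, capacities and placement policy below are assumptions for the sketch, not how any vendor's controller actually works.

```python
# Toy model of controller-based storage virtualisation: hosts see one pooled
# block address space; the controller maps virtual blocks onto heterogeneous
# backend arrays. Names, capacities and the placement policy are illustrative.
class VirtualisingController:
    def __init__(self, backends: dict[str, int]):
        # backend name -> capacity in blocks
        self.free = {name: list(range(cap)) for name, cap in backends.items()}
        self.map: dict[int, tuple[str, int]] = {}  # virtual block -> (device, block)
        self.data: dict[tuple[str, int], bytes] = {}

    def write(self, vblock: int, payload: bytes) -> None:
        if vblock not in self.map:
            # Simple policy: place new blocks on the backend with most free space.
            device = max(self.free, key=lambda d: len(self.free[d]))
            self.map[vblock] = (device, self.free[device].pop(0))
        self.data[self.map[vblock]] = payload

    def read(self, vblock: int) -> bytes:
        # Hosts address virtual blocks; the controller resolves the real location.
        return self.data[self.map[vblock]]
```

Because hosts only ever see the virtual address space, the controller is free to migrate blocks between arrays or tiers behind the scenes, which is what enables the seamless migration and consolidation benefits listed above.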

Virtualisation brings about cost reductions and efficiencies, by reducing the need for additional software applications and licences, the need for additional hardware (which in turn means lower power, cooling and space costs) and also labour costs and resources required to manage spiralling data volumes. Typically, administrators can manage from three to 10 times more storage capacity once virtualisation is implemented.

Storage virtualisation also allows organisations to consolidate and utilise existing storage assets, extending their shelf life so they continue to deliver value. Organisations can also consolidate their management and storage services, using a single standard interface to manage storage, archive and backup functions.

Storage virtualisation allows organisations to consolidate systems and increase utilisation, significantly cutting the power required to both operate and cool their data centres. This reduces energy costs, which makes good business sense from an environmental and cost saving perspective.

Hitachi Data Systems is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com

Source: StoragePR