Black Screens For Unauthorized Copies of Windows XP

Anyone without an original copy will see a black screen

by Michael Smith

Microsoft has a new trick in store for all those Windows XP users who do not pass the so-called Windows Genuine Advantage (WGA) test. Instead of the endlessly repeating pop-up, users of a supposedly pirated copy of the operating system will now see a black desktop background and a watermark at the bottom of the screen. With this new approach Microsoft apparently hopes to persuade users of illegal and pirated copies – though we all know that not all copies that fail "Genuine Windows Validation" are in fact illegal and pirated – to buy an original version.

It would appear that the previous methods are no longer thought sufficient by the developers in Redmond. Hence they have decided on a newer, tougher approach to deal with those who fail the "Genuine Windows Validation" test. Primarily this is aimed at users of pirated versions of the operating system, to encourage, or rather force, them to buy an original copy.

Should the user not heed the messages about the failed "Genuine Windows Validation", then his or her desktop background will be switched to an all-black one every hour. The “watermark” at the bottom of the screen, the “Persistent Desktop Notification”, will furthermore remain permanently.

Users, so the WGA Blog states, can change the background back to however they wish it to appear, but after an hour the black background will return.

Well, so much for Windows and privacy and all that. When Microsoft recently accused Google, so to speak, of privacy violations against users, it surely was a case of the pot calling the kettle black.

We all know, I am sure, of incidents, as was the case not so long ago with Vista, where "Genuine Windows Validation" frequently gets it wrong.

In addition, a “cracker tool” has been around for a while which is able to fool "Genuine Windows Validation" into believing, to some extent, that a copy whose license has expired, for instance, is a valid, genuine one. OK, I know that this is, theoretically, illegal but...

This new scheme out of Redmond shall primarily be used against illegally copied Windows XP Professional. The reason given for this, according to the WGA Blog, is that XP Professional is one of the most frequently copied versions of Windows.

The issue here, in my opinion, is not so much illegally copied versions but the many secondhand PCs in use that were bought with XP Pro already installed on them, and whose licenses, held by the companies and organizations that were the previous owners, have since expired. Hence the WGA validator sees them as illegal copies.

The only way out of this failed "Genuine Windows Validation" issue that is known at present is to actually purchase a genuine, certified, boxed copy of XP Pro. And whoever really wants to act according to Microsoft's wishes goes and buys a copy of Vista instead.

Here now comes an interesting thought, methinks: could this all be an attempt by the people in Redmond to “force” users to migrate to Vista, so that they, Microsoft, can, as they have wanted to, remove all support from XP?

I shall leave you with this little food for thought...

© M Smith, August 2008

Risk and Opportunity of “In The Cloud Computing”

by Michael Smith

I have written about my take on “In The Cloud Computing” before, and while it can be a useful idea for data storage away from a PC, and even for working from a host PC, it also has many drawbacks, not the least of them being security.

Some of those risks concern identity management. Most cloud services rely on simple password authentication, and authorisation occurs on an application-by-application basis. The outcome will inevitably be multiple login experiences with different IDs and passwords. This creates an additional burden for user account management and auditing within the organisation.

But even for a lone web worker this can be a security issue, both as to the strength of the passwords used and, especially, as to the possible lack of any encryption.
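A rough, illustrative way to compare password strength is to estimate entropy from length and character pool. The sketch below is only that, a sketch; real strength checkers also penalise dictionary words and keyboard patterns, which this does not:

```python
import math
import string

def estimate_entropy_bits(password: str) -> float:
    """Crude upper bound on password entropy: length * log2(pool size).

    The pool is the union of character classes actually used. This
    deliberately ignores dictionary words and patterns, so treat the
    result as optimistic.
    """
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)  # 32 printable symbols
    return len(password) * math.log2(pool) if pool else 0.0

weak = estimate_entropy_bits("password")        # lowercase only: pool of 26
strong = estimate_entropy_bits("T7!kq#9Zp$wL")  # all four classes: pool of 94
print(f"{weak:.0f} bits vs {strong:.0f} bits")
```

By even this crude measure the second password is roughly twice as strong as the first, and with simple password authentication that difference is left entirely in the user's hands.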

The other issue is that of service reliability and availability. One must keep in mind that a loss of availability might not always be down to the service provider: routing problems, cables being cut, and other unfortunate incidents could all result in the systems being inaccessible.

On the other hand, how do we know that those services will always be available to us? I have had personal experience of losing access to my bookmarks stored online with My Web 2.0 from Yahoo: for about two months the service refused to actually log me in, while all the other Yahoo services showed me logged in with no problem. The issue resolved itself after that time, without any explanation from Yahoo. So, what if that happens with your data, especially if you have no copies of that data stored elsewhere, whether online or offline? It could cause you and your business untold damage if you were unable to access important data stored somewhere in the cloud.

If you use “in the cloud” computing, especially as regards keeping data there for collaboration and for your own use when on the move, ensure that you also have the same data in another form, readily available, should problems occur with the online system and service.
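Keeping such a local duplicate need not be complicated. A minimal sketch, assuming nothing more than a local folder to mirror your cloud-held files into, re-copies a file only when its SHA-256 digest differs, so a silently corrupted duplicate is refreshed too:

```python
import hashlib
import shutil
from pathlib import Path

def mirror(src_dir: str, dst_dir: str) -> list[str]:
    """One-way mirror of src_dir into dst_dir.

    Uses SHA-256 digests rather than timestamps to decide whether a
    file needs re-copying. Returns the relative paths that were
    (re)copied on this run.
    """
    def digest(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if not target.exists() or digest(target) != digest(f):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # preserves timestamps as well
            copied.append(str(f.relative_to(src)))
    return copied
```

Run after each working session, or from a scheduler, this takes most of the sting out of an outage like the one described above.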

As with so many of these things, including hand-held PCs, I am still someone who needs to be entirely convinced of their reliability. The same goes for Software as a Service (SaaS), whether provided by Microsoft, Google or anyone else.

While I personally make use of some of the online Web 2.0 services, such as Google Calendar, Google Documents and also, though only as a printout of a small diary planner, iScribe, I would still rather have my stuff offline as well.

Data protection laws could be another minefield when it comes to “in the cloud computing”. If you are a company with globally dispersed offices, collating customer information from each of your regions of operation and then sharing it across your business from a cloud service based in America, which data protection laws apply?

And there is more that one could, and probably should, discuss at length. Being aware of the risks is always good, because it means one can plan contingencies. In the case of cloud computing, business sees the opportunity, and that apparently makes it all worthwhile, because, whether we like it or not, this is where we are all going in the end. Personally I hope not, but...

One of those opportunities is scalability: the way that services can scale up or down depending on requirements. This means, in other words, that you theoretically pay only for the processing time and disk space that you need at any given time, if you are not using a free service anyway. While the latter may work for the “lone wolf” web worker, it may not be (legally) possible for an enterprise of any size.

Just about anything can be offered as a cloud-based service.

Cloud computing is more than just SaaS; everything as a service (XaaS) would by now be a more appropriate way to describe it.

The opportunities for collaboration are the most interesting, as described on the CloudSecurity blog, where it states: “Forward thinking companies use collaboration technologies to melt away the physical distance between disparate offices, remote workers and suppliers.”

But, for all the benefits that “In the Cloud Computing” may bring us, especially in the way of working from home and such like, the question that needs addressing, more than anything, I should guess, is how to manage the risks.

“In the cloud computing” also reduces the environmental footprint of the company and of individual workers: those working for corporations may not have to commute to the office at all, or only for, say, one or two days a week, and lone road warriors of the web-worker kind can work from anywhere where there is a PC, without even needing to lug about a laptop. But the security aspect of the data stored in the cloud remains, and then there is the accessibility aspect: what happens when the service goes down? Having experienced the latter personally, I do use some online services of the “in the cloud” department, but I do not want to rely on them solely.

How many of us who use one or another aspect of “in the cloud computing” have not experienced some breakdown or other of the services we use, if only for a short while? I am sure most of us have, especially with regard to web-based email services, and are not most email services, theoretically, web based too, seeing as most are accessible as webmail as well? I have had problems with Hotmail, with Yahoo, and others. Fingers crossed, so far Gmail has been doing well, though I access it primarily with an email client on the PC.

In my opinion, though I stress that this is my opinion as a user, “in the cloud computing” still has a way to go, especially as to security and reliability. Its time for proper use may come but, personally, I would rather remain on terra firma with my stuff, at least as a duplicate, on a hard disk.

© M Smith (Veshengro), August 2008

Ubucon 2008 - German-area Ubuntu Users Conference 2008

Nearly every week friends of Ubuntu meet at different locations for user meetings in order to share information and to exchange findings and new developments.

Now, from October 17 to October 19, 2008, in addition to all those weekly or fortnightly meetings, the second Ubuntu Users Conference in the German-speaking realm will be held. This year the Ubuntu Users Conference (Ubucon) will take place in Göttingen, in the buildings of the Georg-August-University. Last year around 300 friends of the Ubuntu operating system, and of the Ubuntu-derived ones, Kubuntu and Edubuntu, met at the conference in Krefeld.

At the weekend after the planned release of the latest version of Ubuntu, namely Version 8.10, to be called "Intrepid Ibex", users and developers of Ubuntu from countries such as Germany, Austria and Switzerland will meet to exchange their experiences of, and opinions on, this popular Linux distribution.

It is probably a given that the new release of Ubuntu will be at the center of the conference. Ubuntu shall henceforth also be usable on smaller monitors, such as those of subnotebooks. The power management has also been revised, and many software updates and improvements are planned for Version 8.10.

The first German-language Ubucon was held from October 20 to 21, 2007 at the Hochschule Niederrhein in Krefeld. Upon the invitation of the German-speaking Ubuntu community and of Ubuntu Deutschland e. V., more than 300 users and developers of Ubuntu, Kubuntu and Edubuntu from Germany, the Netherlands and Switzerland met at the campus of the Fachhochschule.

Sponsors of the Ubucon 2008 are, amongst others, IBM and SerNet.

The official Ubucon 2008 homepage can be found at

Michael Smith (Veshengro)

Technorati - A Reliable Benchmarking Tool?

by Michael Smith (Veshengro)

Can Technorati be trusted as a yardstick for blogs and the like?

The reliability of Technorati as a benchmark of how well a blog (or the like) is performing must be called into question here, as again and again Technorati is having serious, more or less ongoing, problems, or, as they would term it, “the Technorati monster escapes”. To say that Technorati is about as reliable as flipping a coin might be a little unfair, but things are getting worse rather than better, and there are continuous issues that never seem to get properly resolved. All that users keep getting are excuses, excuses and more excuses.

I doubt that those of us who write more or less professionally on blogs have the time to try to ping the things manually again and again and still get no update of posts, of authority or of grade. I certainly do not have the time to spend goodness knows how long, and how often, on the Technorati site pinging the system manually, despite the fact that auto-ping is installed on my sites.
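For what it is worth, the ping itself is a simple, standardised XML-RPC call, weblogUpdates.ping, so scripting it is trivial. A sketch follows; the endpoint URL is the one Technorati publicised at the time and is an assumption on my part:

```python
import xmlrpc.client

# Assumed Technorati ping endpoint at the time of writing; may change.
PING_URL = "http://rpc.technorati.com/rpc/ping"

def ping(blog_name: str, blog_url: str, endpoint: str = PING_URL):
    """Send a standard weblogUpdates.ping call to a ping server.

    By convention the response is a struct with 'flerror' (true on
    failure) and a human-readable 'message'.
    """
    server = xmlrpc.client.ServerProxy(endpoint)
    return server.weblogUpdates.ping(blog_name, blog_url)

# The XML that would go over the wire can be inspected without any
# network access at all:
payload = xmlrpc.client.dumps(
    ("My Blog", "http://example.com/"), methodname="weblogUpdates.ping"
)
print(payload)
```

Whether the ping is then actually honoured is, of course, exactly the problem described above.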

Hence, can Technorati really be seen as a valid guide and yardstick of how well or badly a blog or website is doing and where it stands as to readers and such? The answer here, I think, from what is being experienced by everyone who uses Technorati to index their blogs, must be a categorical no. Every time those issues happen, updates and ratings go to pot, and more often than not, even when a site has been manually re-indexed by the admins of Technorati, the ratings are not restored. This has happened to my sites more than once now, and hence I can but say that Technorati can only be seen as a very loose guide and not as a real benchmark.

It can be used, maybe, to some degree, as a guide to how a site or blog may be performing, especially with regard to articles being cross-posted and such, but that is, in my opinion and especially in my experience, about all.

For many of us professional and semi-professional bloggers, advertising revenue and the like depend, unfortunately, on the Technorati ratings, as many potential advertisers, and companies that one approaches for product samples, use Technorati as a guide. The same is true for gaining media accreditation to many conferences, shows, and trade and consumer fairs. Many of those who decide who may gain entry to such events on a press badge go by the ratings of a blogger's site on Technorati, and the constant problems with that site can cause us more than a little trouble, of that I am sure; they have for me.

Hopefully, one of these days, maybe, just maybe, the Technorati site will actually work for more than a week without those ongoing issues, and the ratings can actually be trusted. Presently, however, this is hardly the case.

© M Smith (Veshengro), August 2008

Another serious case of data loss in Britain

by Michael Smith (Veshengro)

Home Office loses USB memory stick with data of about 100,000 criminals

The continuing data security breaches and losses of data and of laptops containing secret information must, by now, have become an embarrassment to the British government, or at least they should have. It is high time that heads rolled but, alas, that is hardly going to happen.

How, pray, does anyone put data such as that which has just been lost onto a small USB memory stick unencrypted?

Apparently the private sector contractor working for the British Home Office – the British Ministry of the Interior, so to speak – took the data, which was, so we are told, originally encrypted, decrypted it, and then simply stuck it onto an unsecured memory stick. This is not just stupid or incompetent, though both attributes certainly apply; it is criminal negligence.

As Keith Vaz, Labour MP and chairman of the home affairs select committee, said: “If you hand out memory sticks almost like confetti to companies and ask them to do research for you, then you have to be absolutely certain that the company concerned has put in practice procedures which will be just as robust as the procedures that I hope the government has followed.”

But it is not just private sector contractors to the government that have such a lackadaisical attitude to data security; the government's own departments are, normally, the direct culprits.

If one does need and want to use portable devices such as USB memory sticks, then they should at least be hardware encrypted – please note: I said hardware encrypted – and with very strong credentials. There is no excuse not to use such devices. They no longer cost the earth either, so it certainly should not be a question of cost.
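Hardware encryption is done inside the device's own controller and cannot be reproduced in a few lines, but the effect can be illustrated: whatever is read off the stick is noise without the key. A deliberately simple software sketch using a one-time pad (real devices use AES-256, and the sample record below is invented):

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the data with an equally long random key.

    Illustrative only. The key must be as long as the data, used
    once, and never stored next to the ciphertext.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# An invented record of the kind that should never travel in clear:
record = b"name,nino,offence\nJ. Doe,QQ123456C,theft\n"
ct, key = encrypt(record)
assert ct != record                # what a finder of the stick sees
assert decrypt(ct, key) == record  # recoverable only with the key
```

With a hardware-encrypted device the equivalent of the key never leaves the stick's controller, which is precisely why it is preferable to software schemes that leave keys lying about on the PC.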

If the information given to me can be believed, then the reason, for instance, that the data the HMRC office sent by courier to London a while back was unencrypted on the CDs that were subsequently lost was that the two departments do not have the same encryption program. So while we were told that a junior clerk had simply copied the data onto the disks and sent them out, the reasons, apparently, are different.

The data should already have been encrypted, period, when it was downloaded onto the CDs in that instance. Why is unencrypted data held on those computers in the first place? The data held on the computer systems of whichever government department should already be encrypted and would hence, when copied to CD or whatever, still be in code. But apparently this is not the case.

A spokeswoman for the Home Office said in a public statement that the data was in the hands of a private contractor, and was downloaded onto a USB memory stick, because the outside company was to conduct a study into how to improve the prosecution of offenders. Further information as to how the stick came to be lost, however, was not given.

It might be better if the British government began conducting a proper study into how to avoid loss of data from government departments, for presently there seems to be a sieve in operation and no safeguards in place whatsoever. This is not only scandalous; it is criminal.

Shadow Home Secretary Dominic Grieve said that there had been a "massive failure of duty", and I do not think one can add much to that. Except, perhaps, that it is time the minister responsible for the Home Office tendered his or her resignation. I say his or her as I cannot remember whether it is presently a man or a woman in charge there. People come and go there too often, in general, and that culture, too, probably has a lot to do with things going missing.

© M Smith (Veshengro), August 2008

MXI Security and Diino put digital online protection at the tip of your finger

Stealth MXP biometric USB device provides authentication to unlock access to Web-based storage and file sharing services

Stockholm, Sweden and Montreal, Canada - August 21, 2008 - Diino, a provider of secure Web-based file storage, access, and distribution, and MXI Security, the leader in superior managed portable security solutions, today announced that they have signed an agreement to offer their customers a unique combination of strong authentication and ease-of-use for protecting access to data stored online on their Diino desktop.

“MXI Security’s mission is to provide companies and governments with state-of-the-art security solutions to protect portable data and applications for any computing environment. We believe that biometric multifactor authentication sets the security bar high against identity theft or data loss while preserving ease-of-use for end-users,” says Lawrence Reusing, CEO of MXI Security. “Our partnership with Diino is an important step toward making sure our customers can benefit from greater mobility without compromising their digital security.”

By simply swiping their finger on MXI Security’s Stealth MXP Biometric USB device, customers will automatically and securely access their Diino account where they can safely and easily store, access, share, and backup files of any size, from any location. In addition to featuring the most secure authentication with FIPS 140-2 Level 2 validation, the Stealth MXP device offers transparent on-board hardware encryption, allowing customers to securely store data that will be made available or retrieved from their Diino Internet Desktop. MXI Security’s Stealth MXP devices are trusted by the most demanding customers and have been adopted by government agencies worldwide to protect access to their most sensitive data, applications, and systems.

"Diino allows you to store, access, share, and backup your digital documents while maintaining a wall of privacy between your information and the Internet. In combination with MXI Security’s solutions, we will create a product that is extremely appealing to customers that demand high security and flexibility when handling information," says Jan Nilsson, CEO of Diino. “We offer online services to more than one million customers worldwide, and as a market leader we’re looking to offer value-added services that will protect our customers’ privacy and online identity”.

MXI Security, a division of Memory Experts International Inc., leads the way in providing superior managed portable security solutions designed to meet the highest security and privacy standards of even the most demanding customers.

MXI Security solutions combine the power of strong user authentication, digital identity and data encryption to protect access to sensitive information and systems. Easy to manage and transparent to the end user, MXI Security solutions enable organizations to satisfy multiple security needs with a single device, facilitating greater mobility without compromising security.

For more information please visit

Diino AB, with offices in Atlanta, London, Mexico City, and Stockholm, has over the past years developed storage and file sharing technologies. Diino AB is one of the leading providers of secure online storage and file sharing technologies.

Diino provides an easy, powerful, cost-effective and secure way for individuals and businesses to store, access, share and publish files. Diino AB and its U.S.-based subsidiary Diino Inc. are owned by Novestra AB. Diino is currently top-ranked amongst online back-up providers worldwide by

Source: MXI Press Center

Canonical Joins The Linux Foundation

Commercial sponsor of Ubuntu ® looks to support cross-industry collaboration and promotion to fuel Linux growth

The Linux Foundation, the nonprofit organization dedicated to accelerating the growth of Linux, on August 18, 2008, announced that Canonical, the power, so to speak, behind Ubuntu and its success, has become a member of the Foundation.

Canonical is the commercial sponsor of Ubuntu, a popular version of the Linux operating system, and supports a wide range of other open source projects including Bazaar, Storm and Upstart. Ubuntu has become a popular choice for the server and desktop as well as for the rapidly emerging areas of netbooks and mobile Internet devices.

Matt Zimmerman is the CTO of the Ubuntu project at Canonical; he chairs the Ubuntu Technical Board and leads all engineering efforts for the distribution.

“The Linux Foundation occupies a critical, non-commercial function in the use and popularization of Linux around the world. We've always seen the Linux Foundation's value and are pleased to now become an official member and support its activities. We look forward to working with them to continue the march of Linux in all areas of computing,” said Matt Zimmerman, Ubuntu program manager and CTO, Canonical.

Ubuntu community members have been active participants in a variety of workgroups at the Foundation, including the Linux Standard Base, Desktop Architects and Driver Backporting groups. With Canonical's support, user interests for both commercial and community versions of Ubuntu will be represented.

“Canonical is an important new member for the Linux Foundation,” said Jim Zemlin, executive director of The Linux Foundation. “Matt and his team have created an exciting distribution that has taken the world by storm. They have rallied the cause of cross-industry, cross-community collaboration for years. We are extremely pleased to work even more closely with Canonical as we push Linux to the next stage of growth.”

The Linux Foundation is a nonprofit consortium dedicated to fostering the growth of Linux. Founded in 2007, the Linux Foundation sponsors the work of Linux creator Linus Torvalds and is supported by leading Linux and open source companies and developers from around the world. The Linux Foundation promotes, protects and standardizes Linux by providing unified resources and services needed for open source to successfully compete with closed platforms. For more information, please visit

Source: Linux Foundation Press Center

British Justice Ministry loses 45,000 sets of data

by Michael Smith (Veshengro)

If losing data were an Olympic discipline, Britain would surely top the league table for gold medals. A shame that this is not something to write home about, really. But it has nigh on become a sporting discipline of the authorities in Britain to lose sensitive data of their citizens. And then the people are supposed to trust them with the data for a national biometric ID card. Methinks not.

Once again a department of the British government – in this case the Justice Ministry – has lost sensitive data, thousands of sets of it, without the faintest idea as to where those sets currently reside.

If ordinary businesses treated data in such a lackadaisical manner, they would find themselves prosecuted – and rightly so – by this very same government. When the government itself, however, treats data that is even more sensitive than “mere” credit card details in such a manner, nothing seems to happen at all.

Slowly but surely the reliability of all government departments in the UK is being called into question, seeing how one scandal chases the next, and more often than not such a scandal has to do with the loss of sensitive data of the people.

The Justice Ministry has now joined the long line of UK government departments that are incapable of securely storing, retaining and protecting the sensitive data of the people of the British Isles with which they have been entrusted. Around 45,000 sets of data have been lost by the said ministry, and those include dates of birth, national insurance numbers, extracts from criminal records, as well as, in many cases, bank details. Worst of all, 30,000 of the people thus affected have not even been informed by the authorities that their details have been lost in such a way, because the department reckoned that the loss of such data – despite the fact that it is, so we understand, unencrypted, as per usual with the British government – did not pose any risks to those whose details have been lost. Oh really? This government is getting more and more incompetent, and it really expects people to trust it with information. They really do not live on this planet, I am sure.

This all points, yet again, to the apparent fact that the British authorities seem to have absolutely no interest in proper protection of the personal data of its citizens.

Not so long ago thousands of new blank passports – the kind with the chip – were stolen, and the people were told, once again, that there was no problem and that those passports could never be used. Well, tell that to the hackers who have managed to get into the chips and are thus able to insert any data they desire.

The British government, including its intelligence services and defence ministry, must hold the world record in data loss, at least in the loss of unsecured data.

Apparently, with reference to the first reported one of these incidents, when millions of sets of data of child benefit recipients went missing, the reason those two CDs full of data were not encrypted is that the two departments have different encryption programs and neither can read the other's. HELP!!!

Oh well, maybe one day we will find all those sets of data again, somewhere. Let us then just hope that no one in the meantime makes use of the material on those disks and laptops for criminal or terrorist ends. I think praying might be in order here, to whichever deity the reader may choose.

© M Smith (Veshengro), August 2008

VMware Joins The Linux Foundation

VMware joins leading Linux consortium to address increasing adoption of virtualization and cloud computing with Linux

The Linux Foundation, the nonprofit organization dedicated to accelerating the growth of Linux, announced on August 6, 2008, that VMware has become a member of the Foundation. The company joins existing Linux Foundation members and technology leaders such as Adobe, AMD, Dell, Fujitsu, Google, Hitachi, HP, IBM, Intel, Motorola, NEC, Novell, Oracle and Red Hat, among others.

“A growing number of organizations run their Linux environments on VMware virtualization, and the Linux Foundation gives us a collaborative forum to effectively address the needs of our customers,” said Dan Chu, vice president of emerging products and solutions at VMware. “We are delighted to become a member of The Linux Foundation and look forward to making future contributions to the Linux community.”

According to research firm IDC, revenue for the virtual machine software market may increase by more than four times from 2006-2011 to reach $4.8 billion by 2011*. As adoption of Linux expands as a result of its natural position as a platform for next-generation computing in the cloud and in virtualized environments, companies such as VMware are looking to The Linux Foundation as the forum for collaboration.

VMware's participation in the Linux community includes the contribution of the Virtual Machine Interface (VMI), a paravirtualization interface as an open specification, and subsequent collaboration with the Linux kernel community and others in the development of a source-level paravirtualization interface (paravirt-ops) for the Linux kernel. In 2007, VMware announced the release of its Open Virtual Machine Tools, the open source implementation of VMware Tools, and the creation of the open-vm-tools project to enable community participation.

“Linux is a natural platform for virtualization and cloud computing. VMware is obviously a leader in that field and a leading ISV who has embraced the Linux platform,” said Jim Zemlin, executive director of The Linux Foundation. “We're excited to have VMware as our newest member.”

VMware has led the industry for the last decade in the breadth of operating systems supported by VMware virtualization, including all major Linux operating systems. VMware will work with the Linux Foundation and its members to address the increasing number of Linux users who are working with High Performance Computing (HPC), managed desktops, Web 2.0 technologies, and Software as a Service (SaaS) in virtualized environments.

The Linux Foundation is a nonprofit consortium dedicated to fostering the growth of Linux. Founded in 2007, the Linux Foundation sponsors the work of Linux creator Linus Torvalds and is supported by leading Linux and open source companies and developers from around the world. The Linux Foundation promotes, protects and standardizes Linux by providing unified resources and services needed for open source to successfully compete with closed platforms. For more information, please visit

Source: Linux Foundation Press Center

Havering Sixth Form College safeguards students’ files with Evault InfoStage®

By Phil Evans, Vice President of Sales Northern Europe at EVault

“It just does its thing, it’s simple, straightforward and stress-free.”

Hornchurch, UK-based Havering Sixth Form College offers a wide variety of GCSE, A-Level and vocational qualifications to students from the local area. Its mission: be an outstanding provider of full-time education for 16-19 year olds, embracing diversity and creating excellent opportunities for all.

Havering SFC required a better way to manage backed-up data for its 2,000 students, many of whom relied on it when the files on their own PCs were lost or corrupted. This was critical to meeting deadlines, and to protecting files vital for public examinations.

Ready for Faster Recovery

The College’s legacy backup system was built on tape technology. This created problems. Backup to tape was slow, errors were frequent, and IT staff spent too much time managing the backup and recovery process.

Havering SFC students regularly needed files restored. As a result, IT staff spent two or three hours each day changing tapes on an eight-tape rack, manually looking for data. Speed was vital, but the process of returning files to students often took hours, if not days.

“With so many students relying on the network, it makes a real difference to their education if we can return saved work to them so they get their essays finished on time,” says Pat McConalogue, Assistant Principal of Havering Sixth Form College.

IT staff found that more and more of their time was taken up manually searching, indexing, and moving tape cartridges as they looked for backed-up data. A single search could take several hours, and the volume of requests meant that other important IT tasks were pushed aside.

Eventually the college decided to upgrade to a modern, faster, disk-based solution.

Discovering the Value of Evault

McConalogue says: “We liked EVault initially because the TCO over three or four years would work out more cost-effective than the competitors’. The initial capital cost would be recouped with lower running costs and less cost in terms of man-hours managing the backup system.”

Deploying the solution took five days. Two servers were installed—one onsite, one offsite—by EVault partner SSIL Computer Services.

SSIL’s Michael Manster commented: “We initially ran a month-long trial backing up one file server, one SQL server and one Exchange server. Havering SFC immediately saw the time saving and ease of accessibility and decided to go with EVault.”

Rapid Recovery and Better Backups with EVault InfoStage

Havering SFC also noticed a startling improvement in performance. Manster described the benefit: “Our engineer has handed responsibility for this to Havering SFC’s IT staff. He doesn’t spend time changing tapes over as the backup is automated. Data retrieval is so quick and easy that the help desk staff now manage it all. And since deploying EVault, no students have complained about data loss.”

With EVault, restores are much quicker, there is no need to change tapes, and students notice a difference in performance. They used to receive their data the day following a retrieval request. Now it happens near instantaneously.

Assistant Principal Pat McConalogue says, “EVault is a good product, it just sits there and works. There is no need to change tapes, to index them, there is no physical intervention needed at all. It just does its thing.”

“It is simple, straightforward, and stress-free,” concludes McConalogue. “Archiving the current year’s student data after they leave will be simple too. We will put that data on the EVault system and store it safely offsite.”

Seagate Services is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

The Seagate Services Group, a division of Seagate Technology, specializes in storage management solutions which protect and manage businesses’ most valuable information. Built on the most advanced technologies, SaaS, managed services, and licensed software offerings are delivered, deployed, and managed in flexible ways to suit small to large businesses’ unique needs. The Seagate Services Group comprises three business units: Data Protection (EVault) for comprehensive data software and outsourced protection solutions for backup, recovery, archiving and business continuity; Recovery Services (Seagate Recovery Services) for data recovery, data migration and collection services; and E-Discovery (MetaLINCS and EVault Insight) for enterprise software and hosted solutions for intelligent E-Discovery. For more information, please visit

Source: StoragePR

Are solid state hard drives an eco-friendly option?

by Michael Smith (Veshengro)

Despite the technological barriers still to be overcome, leaps in reliability and battery life may make solid state drives a better environmental option for computer storage, if only slightly.

The question as to whether those drives are a more environmentally friendly option comes with the news that Dell is now offering such storage drives in its consumer-level M1330 and M1530 laptops.

Solid-state drives, so-called “SSDs”, are an alternative to platter-spinning hard disk drives, aka “HDDs”. The latter are the part of your computer that you may want to throw out of the window from a rather great height after it crashes and you lose all your life’s work. SSDs are said to be more mechanically reliable because there are fewer moving parts – a claim which, personally, I still do not believe, and the reason for that will become clear further down in this article. SSDs are also more energy efficient, typically adding about 20 minutes more battery life to your laptop compared to HDDs. It is also reckoned that SSDs are generally speedier, though operating systems have yet to take full advantage of them.

The technology for the solid state hard drives is still very much in its infancy and would appear to be experiencing growing pains.

Price is also especially prohibitive.

Dell’s new 128GB SSD upgrade will set you back $450 – no small chunk of money by any measure – and in some cases it will double the cost of the laptop to start with.

Drive capacity is another issue.

Dell's laptop hard disk options, presently, have a maximum of 320GB. Solid state drives, so far, only go up to 128GB, which may not be enough for some users.

Solid state drives have not as yet been touted as an environmentally friendly alternative to the traditional hard disk drives. Many companies, however, are already marketing the performance, efficiency, and reliability aspects of these solid state devices.

From the price point they are still rather out of the range of most budgets, and whether or not they are the environmentally friendly option, reliability is claimed to be higher – but that has yet to be proven to me. If USB flash drives are anything to go by, they are not. So far I have managed to completely and irretrievably crash three ordinary USB drives within the last couple of months, and have furthermore had one hardware-encrypted one go bad on me.

Not a very good result, methinks, and this certainly does not inspire much confidence in me as to solid-state hard drives for computers. One day, maybe, they will be ready but it does not seem to be the case as yet. As far as I am concerned manufacturers can claim all they want about reliability of solid state drives being better than those of the old kind of hard drive. When I can see one of those in action working for 5 to 10 years without problems then, and only then, will I believe that they are better than standard hard disk drives.

So, for the time being, as far as I am concerned, and regardless of the fact that they may draw less power, they are not yet ready to replace normal HDDs. How many does one want to have to replace in the lifetime of a PC or laptop?

As I said, one day, maybe, but not presently, and this is not just because of price. Reliability is the issue here.

© M Smith (Veshengro), August 2008


Dealing With Problems Inherent To Expanded Environments

By Owen Cole, Technical Director, UK&I

Enterprises, faced with enormous and growing volumes of data washing over them, have a storage problem. IT departments, which are supposed to manage this data cost-effectively, add more storage. What they often end up with is a storage infrastructure that does not allow for different types of data or applications: a costly set-up tailored to the highest common denominator.

All data is not created equal.

Or at least, it shouldn’t be treated as such. When environments grow, more servers are added, more engineers, more network components… and ultimately you spend money and time backing up information you don’t need to back up. If an organization has 10TB of storage at a cost of $200,000 and experiences 100% annual growth, its storage-related purchases will soar from $200,000 in the first year to $400,000 in the second, $800,000 in the third, and so on.

This is untenable. To address this wasted spend, data needs to be looked at in a different way: not as some amorphous, indistinguishable blob, but from the standpoint that different values can be apportioned to data depending on its characteristics. In short – a tiered approach.

Most data stored is not critical, and therefore does not warrant expensive tier 1 storage. Older data is generally less relevant and changes less often. In most companies, over two thirds of files haven’t been altered in the last three months. Personal music, photo or video files and e-mail archives take up large amounts of space, but are far from critical to company profitability.

If not all data has equal relevance to the business, then not all data need reside on the same type of storage. Given that there are wildly varying performance, availability and cost points associated with different types of storage, the logical conclusion is that there are tremendous efficiencies and cost savings open to IT teams. If the organization described above purchased a different class of storage at half the cost, and was able to move 70% of its data to this second tier, its storage-related purchases would drop from $200,000 in the first year to $130,000, and from $400,000 in the second year to only $260,000. Adding more tiers would result in further savings.
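The arithmetic in this example can be checked with a short script. The figures ($200,000 for 10TB, 100% annual growth, a second tier at half price holding 70% of the data) are the article’s illustrative numbers, not real pricing:

```python
# Worked example of the tiering arithmetic above. All figures are the
# article's illustrative numbers, not real storage pricing.

def single_tier_cost(base_cost, growth_rate, year):
    """Storage spend in a given year with one tier (year 1 = base_cost)."""
    return base_cost * (1 + growth_rate) ** (year - 1)

def tiered_cost(base_cost, growth_rate, year,
                tier2_fraction=0.7, tier2_price_ratio=0.5):
    """Spend when tier2_fraction of the data sits on storage costing
    tier2_price_ratio of the tier 1 price."""
    total = single_tier_cost(base_cost, growth_rate, year)
    tier1_spend = total * (1 - tier2_fraction)
    tier2_spend = total * tier2_fraction * tier2_price_ratio
    return tier1_spend + tier2_spend

print(round(single_tier_cost(200_000, 1.0, 2)))  # 400000 untiered, year 2
print(round(tiered_cost(200_000, 1.0, 1)))       # 130000 (vs 200,000 untiered)
print(round(tiered_cost(200_000, 1.0, 2)))       # 260000 (vs 400,000 untiered)
```

With 70% of the data on the cheaper tier, each year’s spend is 65% of the untiered figure, which is exactly the $130,000 and $260,000 the article quotes.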

Decoupling the data

The challenge in being able to realize these cost savings lies in the ability to:

1. Identify different types of data,

2. place them on appropriate tiers of storage,

3. manage this relationship over time, and

4. do all this without
a. impacting client access to the data or
b. increasing management costs.

The technology that has these abilities is called file virtualisation. Simply put, it decouples client access to files from the physical location of the files. Clients are no longer required to possess explicit knowledge of the physical location of the data. In fact, clients are not aware that data is moving between different storage tiers and their access to data is never disrupted.

File virtualisation solutions classify data based on flexible criteria such as age or type, place the data on the correct storage tier and automate the movement of data throughout its lifecycle based on policies. For example, a company might create a policy that says:

  • All new files are placed on tier 1.
  • Files that have not changed in 90 days are moved to tier 2.
  • Files that have not changed in 1 year are moved to tier 3.
  • If any file that has been moved to tier 2 or 3 changes, return it to tier 1.
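A policy like the one above boils down to a simple age-based rule. The sketch below is a minimal illustration, not any vendor’s actual file-virtualisation API; note that the last rule falls out automatically, since a changed file gets a fresh modification time and is reassigned to tier 1:

```python
# Minimal sketch of the example tiering policy above; function and tier
# numbering are illustrative, not a product API.
import datetime as dt

def choose_tier(last_modified, now):
    """Return the storage tier (1, 2 or 3) for a file: new files on tier 1,
    unchanged for 90 days -> tier 2, unchanged for a year -> tier 3."""
    age = now - last_modified
    if age >= dt.timedelta(days=365):
        return 3
    if age >= dt.timedelta(days=90):
        return 2
    return 1

now = dt.datetime(2008, 8, 1)
print(choose_tier(dt.datetime(2008, 7, 20), now))  # 1: recently changed
print(choose_tier(dt.datetime(2008, 3, 1), now))   # 2: untouched for 90+ days
print(choose_tier(dt.datetime(2007, 1, 1), now))   # 3: untouched for a year
```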

Backup pain

Along with rising costs, burgeoning amounts of data increase backup pain. As the amount of data escalates, so too does the length of time it takes to complete a backup. When backups run the risk of exceeding allocated backup windows, IT departments are faced with an unwelcome choice: either failing to meet service level agreements to the business or, worse yet, not adequately protecting the data.

But the fact is much of the data being backed up is the same data that has been captured in previous backups. This is true for almost every company. Since only a small fraction of data is actually changing, the vast majority does not need to be continually backed up. Non-essential content like personal music libraries or e-mail archives probably doesn’t need to be backed up either. The bottom line is that many companies are spending time and money backing up information they do not need to.

Because file virtualisation can track data that is not changing and manage it accordingly, the amount of redundant data that is backed up on a regular basis is dramatically reduced, thereby condensing the time taken to do backups. Recently a large media company used a file virtualisation solution to move data that had not changed in a month out of the primary backup data set, and saw its backup times drop from 36 hours to less than one hour.
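Trimming the primary backup set to recently changed files, as in the media-company example, can be sketched as a walk over the file tree with a modification-time cutoff. The 30-day window mirrors the example; the function name is illustrative, and real products track changes via virtualisation metadata rather than a directory walk:

```python
# Illustrative sketch: select only recently modified files for the primary
# backup set, leaving older, unchanged files to the archive tier.
import os
import time

def primary_backup_set(root, max_age_days=30):
    """Yield paths modified within the cutoff window; files untouched for
    longer are assumed to be covered by the archive tier and are skipped."""
    cutoff = time.time() - max_age_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) >= cutoff:
                yield path
```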

Reducing the amount of data in the primary backup data set also cuts the costs associated with the backup infrastructure (which usually requires far more capacity than the primary data). Less data to backup means less tape, fewer tape libraries, less virtual tape, lower licensing costs, and reduced fees associated with offsite storage.

File virtualisation, then, enables an automated tiering solution that is seamless to clients and provides dramatic cost and efficiency benefits. Elements to consider when looking at the possible solutions on the market could include:
  1. The ideal solution will work with the storage environment you have today as well as providing flexible options for the future. The solution should not lock you into a specific storage technology or force you to change the infrastructure you already have.
  2. Look for a solution that will meet not only your current scale needs, but accommodate your future growth. Solutions that require links or stub files can be difficult to remove, and often come with performance penalties and scale limitations.
  3. A solution that manages information at the file level rather than the directory level will give you greater flexibility and provide the greatest business value.
  4. The most effective tiering solutions will manage data in real time. Placing a file on the right storage device when it’s created is more effective than having to search for it and move it after the fact.

F5 Networks is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

SAN deployment at Roke Manor Research improves storage management, increases security and reduces overall IT costs

Owned by Siemens and based in Romsey, Hampshire, Roke Manor Research is an innovative solutions provider and contract R&D specialist. It pioneers developments in electronic sensors, networks, and communications technology, providing products and services to Siemens businesses, Government departments and commercial customers. The company employs 478 people. Its turnover for the financial year ending 30 September 2007 was £42.55 million.

The challenge

As a recognised centre of excellence for research and development, Roke manages around 350 projects every year, each with its own diverse set of IT requirements, which can change during the project.

“We are often working in the dark,” explains Tony Barrett, network consultant, Roke. “Storage requirements in particular are extremely difficult to predict at the start of a project, making allocating space difficult.”

The centre has seen a significant year-on-year growth in storage demands, and predicts this will increase by 60 - 70% in 2008/2009. Roke has also recently been advised by parent company Siemens that all data must now be kept indefinitely. In order to meet these demands the IT team began researching the feasibility of a Storage Area Network (SAN).

“Like lots of others we were stuck with a legacy storage infrastructure that didn’t have the scalability the organisation needed,” explains Barrett.

Roke began by looking at an upgrade of its existing direct-attached storage (DAS) solution (from Dell) but soon realised that it would need a SAN-based solution to manage the ever-increasing levels of data efficiently, reduce energy costs, shrink its ecological footprint and provide a flexible infrastructure to respond to the organisation’s needs.

The Roke team carried out an intensive one-year evaluation process that included looking at offerings from Dell/EMC, Sun Storagetek, Hitachi Data Systems and Compellent.

The solution

Having worked with ICT integrator Fordway on a variety of projects since 2004, the Roke IT team decided to enlist the company’s expertise in helping to choose and integrate the most appropriate SAN solution.

“Roke needed a solution that would help them minimise administration and management-related costs,” explains Richard Blanford, managing director, Fordway. “Where Compellent differs from other vendor offerings in the market is that it eliminates the complexity and associated risks of the traditional hardware and third-party software mix. Compellent’s Storage Centre Software Suite already has the core functionality in place: be it data replication, thin provisioning or data instant replay.”
Blanford continues: “By choosing the ‘all-in-one’ Compellent solution Roke also avoids the cost and time implications of implementing any future changes, and as the organisation grows it can add a second SAN simply by buying the license to enable replication.”

Compellent’s ability to bespoke-build the system, ensuring exactly the right specifications to meet Roke’s requirements, also played an important part in the decision-making process.

“It gave us additional confidence that it was exactly the right solution for us and our needs,” confirms Barrett.

The results

Taking into consideration time saved from administration related to server management, energy savings from reducing the number of servers in use, the improvements in up-time and network speed, and the physical space made available at the site, Roke predicts IT-related cost savings in the region of £40,000 over the next three years.

“We will now be able to reduce the number of Windows servers we operate by half, from 50 to 25,” comments Barrett. “The end result will be a greener IT operation and a significant reduction in energy costs - at a time when these are seeing unprecedented increases in the UK.”

Users at Roke have also already provided positive feedback, reporting quicker access to project-critical data.

In addition to the cost benefits Roke found that the SAN enabled it to better manage the varying levels of access to information, which can often be highly sensitive or restricted depending on the type of project.

Next steps

In Barrett’s words, “The SAN has been a catalyst for change. We now have a variety of possible IT developments that we’re looking into, many of which we would not have considered – or realised were possible – before we had the SAN in place.”

Part of the team’s plans include a second Compellent SAN for disaster recovery purposes, the objective being to add additional layers of resilience to the IT infrastructure of the organisation.

Fordway Solutions Ltd is exhibiting at Storage Expo 2008, the UK's definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

How mature is your Archive?

By Steve Tongish, Marketing Director EMEA at Plasmon

A recent report from analysts the Aberdeen Group found that 59 percent of respondents do not have an e-discovery or message archiving strategy in place. The worrying implication is that the majority of businesses today are not in a position to meet even the most basic IT governance requirements, let alone access the value residing in their data assets. It is clear that many organisations continue to rely on their backup infrastructure to provide data archiving capabilities but a backup is no substitute for a carefully considered strategy.

In order to help organisations accurately assess the status of their current archive and plan future strategies, Plasmon has developed the Business Archive Management assessment. This model defines five stages in the development of a mature archive. In addition, it identifies four key business processes that, when put in place, lead to an increasingly sophisticated archive environment. The five developmental stages and four business processes can be visualized in a simple matrix that can be used to chart the status and evolution of an archive.

The Five Archive Stages

The developmental stages of an archive environment begin with Chaos where the concept of archiving does not even exist and extend to the fifth stage of Business Value where archiving is totally embedded within the business process and the value of archive data is fully exploited. In between these two extremes the archive environment evolves, becoming Reactive to risk and regulations, moving into the Business Sustaining stage which centralises the archive process and then to a Proactive state that incorporates the wider needs of the organisation. These definitions enable organisations to assess their current state, quantify risks and highlight additional benefits.

  • Chaos
  • Reactive
  • Business Sustaining
  • Proactive
  • Business Value

The Four Key Business Processes

In order to evolve from one stage of archive development to the next, key business processes must be brought to bear. These include both Business and Archive Planning considerations in combination with Archive Operations designed to meet specific regulatory obligations, control risk and establish an overall code of best practice. Another area of critical importance is Infrastructure Management which selects specific software and hardware solutions and defines their long-term maintenance.

  • Business Planning
  • Archive Plan and Design
  • Archive Operations
  • Infrastructure Management

The first step in the archive assessment process for an organisation is to determine where its current archive strategy is positioned within the five archive stages. With this established, it can concentrate its efforts on the four business processes essential to evolving the archive environment. The process has been designed specifically to help a company advance from a less mature to a more sophisticated archive stage in order to capitalize on the business and financial benefits of a truly integrated archive strategy.

To take an online Business Archive Management assessment or download a more detailed white paper on the subject, visit the Plasmon website

Plasmon Data Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

SanDisk Cruzer Enterprise USB Stick – Broken!!!

by Michael Smith (Veshengro)

Earlier this year I reviewed the SanDisk Cruzer Enterprise hardware-encrypted USB drive.

At the time of my review it did what it said on the box, and the only real negative comment I made was that it had no support for Linux – something I understood when it was given to me for review. The only problem is that it did not work for very long.

Having used the drive on and off to carry data – and am I glad there was nothing on it at the time – I came to plug it in the other day and it refused to recognize the partition. While it initializes up to a point, I then receive the on-screen message: “Cruzer Enterprise requires two free drive letters. One free drive is available at G:\ but an additional drive is not available. Cruzer Enterprise cannot operate until an additional drive is available and will shut down”.

The problem, however, is that even freeing a drive letter does not help: a drive letter, in this case “H”, has already been allocated for the partition, and it shows perfectly well in “My Computer”.

This is not a case, as so often happens with Windows, of Windows failing to recognize the dual-drive stick; on the contrary, Windows does recognize it – so it would seem – only the Cruzer software itself does not recognize that the drive has been mounted. Other dual-drive sticks, be it Blockmaster's “Safestick” or MXI's “Stealth Drive”, work just fine.

It would appear that something has happened to SanDisk's Cruzer stick, and as to the how, why and wherefore, that might be something the manufacturer would like to address and comment on.

I have to say that I recently had a small problem with the “Safestick” from Blockmaster as well, in that it did not want to recognize my password – then again, I might have thought I was entering the right password when I was not – and I had to reset the drive. (In the case of Blockmaster's “Safestick” a reset only wipes the drive but does not destroy it, as other manufacturers always claim a reset will do to their drives.) Either way, a crash like this in a secure USB stick is NOT a good idea.

It would appear that SanDisk still has some way to go as to the safety and security of the Cruzer Enterprise USB drive. Or am I the only one who has managed to crash one so far?

Not that that would surprise me, for I have managed to crash a few other USB drives in recent months – but those were cheap give-aways from companies, not drives that cost quite a bit of money.

But, as always with flash drives of any kind: do yourself a favor and back them up somewhere if you carry sensitive and important data on them, so you can restore the stuff.

© M Smith (Veshengro), August 2008

What good is a Data Backup if your PC won’t work!

By Brian Blanchard, Sales Director – EMEA, CMS Products Inc.

We are hearing more stories these days about people losing portable storage devices or laptop PCs, or having them stolen. As devices get smaller, they become easier targets for theft and easier to mislay. The strains of transporting a PC, the current trend of automatic software upgrades, and the risk from viruses and malware all mean the PC is open to problems during travel. More people are travelling and taking their IT with them, which is a potential problem for both the business and the individual. Reliance on the PC during travel means you have to consider the worst.

How can you cope with the loss of a PC, CD/DVD, USB stick or external storage device? What do you do if your PC fails to start up or run correctly? How would you continue working if you lost the use of your PC whilst travelling? Don’t think a data backup will be enough protection – it won’t.

There are two considerations here. First, if your storage device or PC is lost or stolen, it is important that the recipient of your device is unable to access the contents. Second, you need access to your data, programs and PC very soon afterwards in order to ensure the continuation of your business or private use. Being on the move means this is often difficult to achieve.

A solution to the first point is to make sure that the information on your device is encrypted, whilst a solution to the second is to have a spare system disk which is ready to be installed inside your PC or inside a similar one if your PC was stolen. Knowing that you have a backup of your data in the office won’t help you recover whilst travelling. Not everyone has an IT department waiting for their call either. Even if they do, the chances are that the IT department will not be able to get them working again in minutes – typically it is days.

This spare system disk will not only have a copy of your data files, but will have all programs and settings needed to operate your PC. It is not a backup but a fully ready disk waiting to run your PC with all your applications and data on it. Just think how long it would take to set-up a new PC the same as the one you’ve just lost.

A good analogy is the spare wheel in the back of your car which is the solution to a puncture that gets you back on the road again quickly.
A mobile worker with a spare system disk has the solution to a PC system problem. Imagine you are driving to a meeting and get a puncture. Luckily, most of us can either change the wheel or call roadside assistance to do it. However, if you don’t have a spare in the back, even roadside assistance cannot help you. It is off to the local tyre specialist – when the low-loader eventually arrives to transport you and the car. That will certainly delay you and you could end up losing valuable business.

With a spare system disk in your bag, if anything does go wrong with your PC’s system disk or the software gets corrupted, you are only a few minutes away from working again. If you are not comfortable changing the disks – just call into the local Computer store and get them to do it. In fact now it is possible to start-up and run the PC from the spare system disk attached to the USB port, so no screwdriver or technical skill is required to recover. Corporations around the world are finding that by deploying spare system disk solutions to their mobile workers, up-time is increased and IT costs reduced. Usually ROI is measured in weeks not years.

Protecting your PC and storage devices from unauthorised access can be done through encryption. Software is available that creates encrypted areas, known as Vaults, on the storage device, where decryption is only possible after entering the correct pass-phrase. The same encryption can also be found on full-disk-encrypted solutions, where the external storage disk is attached to the USB port and a screen prompts for the pass-phrase. Again the storage device is permanently encrypted; decryption occurs in memory, not on the disk surface, and only after correct pass-phrase entry.

A USB stick could be encrypted with Vault software, data files loaded onto it, and the stick sent to another part of the world. The recipient would get an e-mail with the pass-phrase and be able to download software which uses the pass-phrase to decrypt the sensitive data on the stick. If the stick is intercepted on its way, the data remains secure.
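The vault approach rests on deriving the encryption key from the pass-phrase rather than storing it on the device. The products named in this article do not publish their internals, so the sketch below only illustrates the key-derivation step, using the standard PBKDF2 function from Python’s library; a real vault would feed the derived key to a cipher such as AES:

```python
# Illustration of pass-phrase key derivation (PBKDF2), the step that lets a
# vault recompute its key at unlock time instead of storing it on the device.
# This is a concept sketch, not the scheme used by any product named above.
import hashlib
import os

def derive_vault_key(passphrase, salt, iterations=100_000):
    """Derive a 256-bit key; the salt is stored alongside the vault, the key
    never is -- it is recomputed from the pass-phrase at each unlock."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"),
                               salt, iterations)

salt = os.urandom(16)                       # random per-vault salt
key = derive_vault_key("open sesame", salt)
assert key == derive_vault_key("open sesame", salt)   # right phrase: same key
assert key != derive_vault_key("open sesame!", salt)  # wrong phrase: no key
```

Because the key exists only while the correct pass-phrase is entered, interception of the stick alone yields nothing usable, which is the property the scenario above depends on.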

Ideally you need both a spare system disk and encryption. Hard disk drives are now available that have encryption chips on them so that the system disk in your PC will be automatically encrypted once given an access pass-phrase. Similarly the spare system disk can also be hardware-encrypted. Although encrypted, the drives still have the ability to start-up the PC so maintaining the benefits of carrying a spare system drive but with the added benefit of being secure from unauthorised access.

The hardware-encryption which is now appearing on Hard Disk Drives does not appear to impact cost – in fact within a couple of years probably all hard drives will have encryption chips built-in.

So don’t just buy a data backup solution for your desktop or laptop PC; buy a solution which creates a ready-to-use spare system disk and has encryption. After all, what use is a data backup if your PC won’t work!

PST Europe Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

EVault Gets Arriva London Out Of Storage Jam

By Phil Evans, Vice President of Sales Northern Europe at EVault

Streamlined disaster recovery process and improved resource management the top results of online backup implementation

Arriva London is one of the biggest bus operators in London, providing more than 19 per cent of the capital’s bus services under contract to Transport for London (TfL). Contracts involve routes, vehicles and timetables specified by TfL and are closely monitored to ensure services are of the highest quality. Arriva London operates from nine garages in north London, and from six garages in south London.

In total, Arriva London’s buses cover more than 50 million miles each year, providing nearly 330 million passenger journeys. Arriva London is part of the Arriva Plc group.

Arriva Plc is one of the largest providers of passenger transport in Europe. Its buses and trains provide more than a billion passenger journeys a year. It has a major presence in over ten European countries and its revenue has increased by 140 per cent over four years to £752.3m.

Arriva London’s head office in Wood Green stores and manages all of the scheduling, rostering, HR and payroll for London. As the company grew so did the demand for data, prompting Arriva London to re-examine its data management and protection processes. Arriva London’s legacy backup system was built on tape technology. This created problems, as tape backup was slow, errors were frequent, and IT staff spent too much time managing the backup and recovery process.

The situation was made worse by the fact that as new servers were installed the tapes they required changed, which meant that IT staff had to deal with a number of different tapes, of different sizes, stored in different locations.

Alan Ricot, IT Manager at Arriva London, commented: “The situation was getting to the point where it was becoming hard to manage, staff were frustrated by the lack of uniformity and we needed a change. We have some twenty servers and five or six different models of tape drives, each requiring a different tape format.”

Recognising that replication software is often expensive and bandwidth-intensive, Alan decided to investigate the market further. Following a data storage seminar, hosted by EVault’s platinum reseller BSG, Alan was instantly impressed by EVault’s solution. After reviewing its website he decided that the disk-based backup solution could be an ideal fit, as it eliminated the need for physical tape backup.

Alan commented: “I contacted EVault in January after initially being introduced to them through their reseller BSG. After discussing the solution further with an EVault representative I decided to go ahead with the thirty-day free trial, using three of our servers as a test bed. We continued the trial until March and following a proposal to Alan Sewell (Arriva Finance Director), decided to migrate all of the remaining servers from tape storage to online disk-based storage.”

Alan continued: “The implementation itself took just a couple of weeks; once I was shown how to install one it was pretty easy, so I had most of the servers up and running by April. EVault then returned to check the solution was running efficiently and trained the remainder of the team.”

Arriva London now has a single, centralised backup system that can be managed with ease from one terminal, delivering end-to-end disaster recovery. The company no longer has to worry about the management, retrieval and storage of physical tape, which has led to greater efficiencies and space savings across the whole IT department. The online backup is much faster, and Arriva London has been able to eliminate the time previously spent on tape management and minimise the administrative errors inherent in this process.

Alan commented: “As soon as the solution was implemented the benefits were instantly apparent. Staff can now retrieve data instantly, just by searching on the screen; they no longer have to spend time looking through hundreds of physical tapes of different shapes and sizes. The automated nature of the online backup solution now means that the management of backup and restore only takes a few mouse-clicks, which has had a dramatic effect on my staff and efficiency.”

Alan concluded: “We’ve been so impressed with the system that we will be looking to install a second vault at our Edmonton depot as part of our disaster recovery plan.”

Seagate Services is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

The Seagate Services Group, a division of Seagate Technology (NYSE:STX), specialises in storage management solutions which protect and manage businesses’ most valuable information. Built on the most advanced technologies, its SaaS, managed services and licensed software offerings are delivered, deployed and managed in flexible ways to suit the unique needs of small to large businesses. The Seagate Services Group comprises three business units: Data Protection (EVault), for comprehensive data software and outsourced protection solutions for backup, recovery, archiving and business continuity; Recovery Services (Seagate Recovery Services), for data recovery, data migration and collection services; and E-Discovery (MetaLINCS and EVault Insight), for enterprise software and hosted solutions for intelligent e-discovery. For more information, please visit

Source: StoragePR

Virtualisation is here to stay

By Kevin Bailey, Director Product Marketing EMEA, Symantec

Virtualisation has now been around for a number of years, yet many businesses are still not implementing virtualisation engines in a strategic way.

IT managers face increased pressure to meet the changing demands of the IT infrastructure: supporting a growing number of desktop applications from deployment through to retirement, as well as coping with the daily deluge of software conflicts, can be a daunting task.

The increased use of software virtualisation technology will ease this pressure by eliminating conflicts, allowing organisations to deploy and fix applications quickly, and ultimately reduce support costs and improve application reliability. However, virtualisation is not the whole solution, but just one component of an end-to-end management solution. Virtualisation engines should be planned alongside other IT strategies, such as business continuity, disaster recovery and general availability procedures, so that IT managers can integrate them holistically into their IT infrastructure.

One of the challenges IT managers are facing is that most current systems management tools deployed to monitor the enterprise IT infrastructure are not always built with virtualisation in mind. A common complaint is that configuration database management tools do not work properly in dynamic virtual environments.

Users commonly experience problems with their applications slowing down and PCs failing to reboot as their systems get older and more littered with applications. Magnify this problem by a thousand users and it’s clear to see how productivity within an organisation could suffer and how quickly this could become an expensive problem. These problems occur when users install new software or application updates which share common resources and code, resulting in conflicts, application failure or the reintroduction of security holes that were previously patched.

With this in mind, IT managers should seriously consider taking a look at software virtualisation technology which enables desktop applications to be run as virtual software packages, allowing users to switch applications on and off instantaneously to eliminate any conflicts. Applications can then be reinstalled remotely without adversely affecting the base Windows configuration. By simply switching an application on or off without needing to reboot, a user can keep their PC’s capacity under control as well as maximise its performance and resilience.

The technology works by deploying the software to a part of the file system that is normally hidden from Windows. As a result, the resources that are used by applications like Microsoft Word are isolated from the operating system or other applications that might have conflicting drivers.
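The layering described above can be reduced to a toy model: each application’s files live in an isolated layer that is overlaid on the base file system only while the layer is switched on. The `VirtualDesktop` class and file paths below are invented purely for illustration; real products implement this at the filesystem-driver level, not in application code.

```python
# Toy model of layered application virtualisation (illustrative only).
# Each application lives in its own "layer"; what the user sees is the
# base system with all *active* layers overlaid on top of it.

class VirtualDesktop:
    def __init__(self, base_files):
        self.base = dict(base_files)   # files belonging to the OS itself
        self.layers = {}               # app name -> {path: contents}
        self.active = set()

    def install_layer(self, app, files):
        """Capture an application's files in an isolated layer."""
        self.layers[app] = dict(files)

    def switch_on(self, app):
        self.active.add(app)

    def switch_off(self, app):
        # No uninstall, no reboot: the layer is simply hidden again.
        self.active.discard(app)

    def resolve(self, path):
        """What the user (and the OS) would see at this path."""
        for app in self.active:        # active layers take precedence
            if path in self.layers[app]:
                return self.layers[app][path]
        return self.base.get(path)

desktop = VirtualDesktop({r"C:\shared.dll": "v1 (OS copy)"})
desktop.install_layer("WordProcessor", {r"C:\shared.dll": "v2 (app copy)"})

print(desktop.resolve(r"C:\shared.dll"))   # v1: layer installed but off
desktop.switch_on("WordProcessor")
print(desktop.resolve(r"C:\shared.dll"))   # v2: conflicting copy isolated
desktop.switch_off("WordProcessor")
print(desktop.resolve(r"C:\shared.dll"))   # v1: base system untouched
```

Switching the layer off restores the base configuration instantly, which is the property that makes conflict resolution and rollback cheap.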

IT managers can also use software virtualisation technology when testing and rolling out new versions of an application. Performing a successful upgrade to a business-critical application is essential, but there is always a risk attached to changing or upgrading a package. If the application doesn’t work properly for some reason, the management team will not be interested in understanding why; they will just expect the application to be working again quickly.

Virtualisation technology can resolve upgrading issues by allowing users to simply roll back to the old version so they can continue working. This gives IT managers time to repair the damaged application before making the new package available again. In addition to this, virtualisation allows users to host multiple versions of an application on the same system giving them sufficient time to become familiar and comfortable with the new features of the package before they feel confident to move away from the old version.

Even though virtualisation has been on the radar for some time, many IT managers still do not understand the technology and how it will change the way software is managed in the future. However, as more IT managers start looking at software virtualisation and begin to see the true value of the technology, it will only be a matter of time before IT infrastructures become completely virtualised. Organisations shouldn’t make the mistake of turning a blind eye to virtualisation: it is here to stay and will be used by many IT departments in their quest to standardise IT infrastructures and achieve financial efficiencies.

Symantec (UK) Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

Think again about your Disaster Recovery strategy

By Martin Brown, Senior Director of Enterprise Sales UK, Symantec

Secondary sites for business continuity and disaster recovery have been part of the enterprise computing equation for decades, but only recently has their importance commanded top-line visibility. The UK floods in the summer of 2006 and, before that, Hurricane Katrina in August 2005 gave business executives and IT managers a collective shock: these natural disasters severely impacted IT infrastructure and network facilities, affecting the primary and backup IT facilities of many organisations. The wholesale confusion that ensued in many cities brought into sharp focus the need for comprehensive business continuity plans, incorporating secondary data centre sites located far enough away to be untouched by a disaster affecting the primary site. However, many IT organisations believe that secondary data centres are inherently expensive and impractical for all but the largest enterprises.

Today, new technologies are available to help enterprises better achieve business continuity amid the chaos and devastation that natural and man-made disasters leave in their wake. A disaster recovery (DR) strategy can also be an extension of the local high availability (HA) solution the organisation already has in place, addressing causes of downtime, such as user error, that most IT managers rarely think about when devising their DR plan. Automated solutions for configuration management, clustering, provisioning and server virtualisation are available now, making secondary data centres a practical option that can operate in a streamlined, cost-effective manner. In addition to automating failover and recovery functions, these same tools can help administrators meet stringent system availability requirements by minimising downtime.

The following approaches can help enterprise IT organisations implement robust high availability and disaster recovery strategies that maximise system availability for day-to-day operations.

Solve problems faster: Traditionally, one of the key challenges in executing timely disaster recovery was a delay in alerting IT staff to an outage, and subsequent problem diagnosis. Advanced clustering technology notification and reporting capabilities can pinpoint when an outage occurs, and immediately notify administrators of a problem. Clustering technology then takes immediate action by starting up applications at the secondary data centres and connecting users to the new data centre. Administrators can then use configuration management tools to diagnose the cause of the downtime.

Automate recovery processes: For many organisations, system recovery is a manual process. Pressure builds on administrators as time, revenue, and customer loyalty slip away, and the potential for human error rises. An automated approach, such as high availability clustering, eliminates vast amounts of downtime compared to a traditional manual recovery process. If a system fails in the primary data centre, the software can restart the application automatically on another server.
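The automated restart that HA clustering performs can be sketched in a few lines. The `Node` and `failover` names below are invented for illustration; real cluster managers add health probes, quorum and fencing, but the core decision is this simple:

```python
# Minimal sketch of automated failover, the idea behind HA clustering
# (hypothetical names; not any particular vendor's API).

class Node:
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.running = set()       # applications hosted on this node

    def start(self, app):
        self.running.add(app)

def failover(primary, secondary, app):
    """If the primary is down, restart the application on the
    secondary node and alert the operator automatically."""
    if primary.healthy and app in primary.running:
        return primary             # nothing to do
    secondary.start(app)
    print(f"ALERT: {primary.name} down; {app} restarted on {secondary.name}")
    return secondary

primary = Node("datacentre-A")
secondary = Node("datacentre-B")
primary.start("payroll")

primary.healthy = False                     # simulated outage
active = failover(primary, secondary, "payroll")
print(f"{active.name} now serves payroll")  # the secondary takes over
```

The point of automation is that this decision runs in software within seconds, rather than waiting for an administrator to be paged, diagnose the outage and restart the application by hand.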

Test your DR plan: Recent studies have shown that few companies test their DR plans on a regular basis, and as a result, most companies have little faith that their DR plans will work when needed. Companies have been reluctant to conduct DR testing because testing often involves bringing down production systems and mobilising a large segment of the work force, forcing employees to work during inconvenient hours such as weekends or nights. With automated failover capabilities, IT organisations can test recovery procedures using a copy of the production data – without interrupting production, corrupting the data, or risking problems upon restarting a production application. This capability means that tests can be run during business hours instead of over the weekend, reducing staff overtime.

Extract value from secondary sites: For most enterprise IT organisations, secondary sites are viewed strictly as cost centres, sitting idle much of the time. New advances in server provisioning software allow more value to be extracted from secondary sites, enabling them to be used for test development, quality assurance, or even less critical applications. If a disaster strikes and the primary data centre goes down, administrators can use provisioning software to automatically reprovision server resources to match the production environment. With the flexibility to dynamically reconfigure and reallocate resources, the secondary site becomes a resource that can be used for multiple purposes the majority of the time, but can be quickly reverted to its backup designation when needed.

Across industries, enterprise IT organisations have been made highly aware of the devastating impact that natural and man-made disasters can have on business continuity. Secondary data centres have long been viewed as expensive and impractical for all but the largest enterprises, but innovative new high availability and disaster recovery software now allows IT organisations to consider a secondary site as a viable choice.

Now, more than ever, the time is right for enterprise IT organisations to rethink DR strategies and enhance their business continuity plans with a secondary data centre.

Symantec (UK) Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

Survey finds Disaster Recovery is the Top Issue for Organisations

London, 7th August: Research by Storage Expo has found that the top data storage challenge facing organisations is how to implement the right disaster recovery strategy (83%). The second most important issue of concern to the 252 organisations covered in the research was ensuring data security for the business (82%), and the third was the management of increasing volumes of data (81%). Centralising data access was a concern for 79% of organisations, closely followed by how to store data cost-effectively (77%). Ensuring compliance with the latest data storage legislation and knowing what regulations to comply with are of concern to 68% of organisations. Achieving interoperability across existing storage solutions was an issue for 60% of organisations.

Storage Expo 2008, at the National Hall, Olympia on the 15th and 16th October, is the UK’s definitive event for data storage, information and content management. Providing the opportunity to compare the most comprehensive range of solutions and services from all the leading suppliers, the show features over 100 of the world’s top storage vendors and an extensive, cutting-edge free education programme with over 62 experts speaking, including sessions that will address the latest issues on how to tackle data growth and disaster recovery.

The education programme for 2008 has been expanded to reflect the needs of today’s data storage and information management experts as they become as concerned with information and data management as they are with storage capability, scalability and infrastructure. With strategic and technical analysis, case studies and storage management reviews, this year’s programme will reveal expert knowledge of how information management can increase both storage efficiency and information utilisation for business applications.

The free to attend keynote sessions at Storage Expo bring together the industry's leading independent experts, analysts and end-users from high profile corporations and take an in-depth look at some of the latest trends in data storage today.

The sessions will address the latest business advantages that the sound application of storage strategy can deliver for an organisation. Storage practice has traditionally been driven by the push of regulation, legislation and business continuity rather than the pull of sound business practice. The impetus is changing from push to pull due to the increasing emphasis on business efficiency, utilisation of intelligence, process management and productivity benefits.

The keynotes at Storage Expo reflect the change in the data storage and information management landscape; here is a selection of what’s on offer at the show:

With data storage volumes still growing at over 50% per annum, the need for efficient Storage architecture has never been greater. Jon Collins, Service Director, Freeform Dynamics chairs a panel on Architecting for Efficiency and Effectiveness with senior executives from EMC, Pillar Data, Double-Take Software, NetApp, HP and HDS.

John Abbott, Chief Analyst, The 451 Group chairs a panel on Improving Asset Utilisation with Virtualisation with Dr Zafar Chaudry, Director of Information Management and Technology, Liverpool Women’s NHS Foundation Trust and Steven Shaw, IT Manager, British Horseracing Authority.

Simon Robinson, Research Director, The 451 Group leads a panel on Reducing your data footprint with De-duplication with Steve Bruck, Infrastructure Architect, Associated Newspapers Ltd and Simon Spence, CIO, CB Richard Ellis.

Graeme Hackland, IT Manager, Renault F1 Team will give an insight into modular architecture and growing with clustered storage. The keynote on protecting your data with back-up strategies has case studies and debate from Dylan Matthias, Unix and Storage Manager, Britannia Building Society and Graeme Walker, Senior Systems Engineer, Carlsberg looking at the rationale and technology behind what are the best back-up solutions in the world. Probably!

For more on the education programme visit Storage Expo, the UK’s definitive event for data storage, information and content management, that provides you with the opportunity to compare the most comprehensive range of solutions and services from all the leading suppliers. For further information on Storage Expo, or to register, please visit

Source: StoragePR

Beating the data deluge with storage virtualisation

As data volumes explode, businesses face the daunting prospect of unmanageable storage growth. Steve Murphy, UK Managing Director for Hitachi Data Systems reveals how organisations can use storage virtualisation to consolidate their systems, increase utilisation and efficiency and reduce costs.

As virtualisation is a technique rather than a specific technology, applied to areas such as servers, storage, applications, desktops and networks, it is often poorly understood. This article attempts to bring clarity to how virtualisation is applied to storage systems and the benefits it can deliver.

Fundamentally, virtualisation aims to abstract software from hardware, making the former independent of the latter and shielding it from the complexity of underlying hardware resources.

Storage virtualisation typically performs two functions. It makes many storage systems look like one, simplifying the management of storage resources, and in some cases it provides partitioning, so that one storage system appears as many, providing security for applications that need to be separated.
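A minimal sketch can make those two functions concrete: aggregation (many arrays presented as one pool) and partitioning (one pool carved into isolated virtual arrays). The `StoragePool` class and array names below are invented for illustration; real virtualisation engines operate on block addresses and I/O paths, not simple capacity totals.

```python
# Sketch of the two faces of storage virtualisation (illustrative only):
# many systems presented as one, and one system presented as many.

class StoragePool:
    """Aggregates heterogeneous arrays behind a single capacity view."""
    def __init__(self):
        self.arrays = {}                  # array name -> capacity in GB

    def add_array(self, name, capacity_gb):
        self.arrays[name] = capacity_gb

    def total_capacity(self):
        return sum(self.arrays.values())  # one pool, many backends

    def partition(self, sizes):
        """Carve the pool into isolated virtual arrays, one per
        application that needs to be kept separate."""
        if sum(sizes.values()) > self.total_capacity():
            raise ValueError("partitions exceed pooled capacity")
        return dict(sizes)

pool = StoragePool()
pool.add_array("legacy-array", 2000)      # heterogeneous backends...
pool.add_array("new-array", 8000)
print(pool.total_capacity())              # ...managed as one 10000 GB pool

vparts = pool.partition({"payroll": 3000, "email": 4000})
print(vparts)                             # one pool presented as many
```

Administrators manage the pool, not the individual arrays, which is where the management and utilisation gains described below come from.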

Across most industries, data volumes are spiralling out of control as the amount of information and content we create grows exponentially. Data storage within most organisations is increasing by about 60% annually. This means that organisations have to increase their capital and operational expenditure in areas such as IT staff, power, cooling and even data centre space.

Traditionally, organisations have tried to deal with growing data volumes by buying more disks. However, many organisations are finding that their storage infrastructures are becoming unmanageable, while utilisation of these systems is unacceptably low, often running at 25-30%.

Another challenge is that while data volumes grow, IT managers still need to meet the same demands: supporting new and existing applications and users so that the business remains competitive, managing risk and ensuring business continuity and maintaining compliance with government regulations and specific industry standards.

There is a strong argument for organisations to stop buying more capacity and, instead, look for ways to consolidate their existing estate, increase utilisation and reduce costs. Storage virtualisation is an increasingly popular way for organisations to address these challenges.

Storage virtualisation aims to ‘melt’ groups of heterogeneous storage systems into a common pool of storage resources. Vendors have adopted a range of methods to achieve this. One technique is to let the server handle storage virtualisation, although as the server is removed from the storage system and has other functions to manage, performance can suffer.

One of the most widely used approaches is to use the intelligent storage controller as the virtualisation engine. By installing an intelligent storage controller in front of their infrastructure, companies can aggregate existing storage systems and virtualise the services provided to host applications such as data protection, replication, authorisation and monitoring. This offers advantages such as simplified management, increased utilisation of storage resources, seamless migration across tiers of storage, lowered interoperability barriers and better integration of common functionality.

Virtualisation brings about cost reductions and efficiencies, by reducing the need for additional software applications and licences, the need for additional hardware (which in turn means lower power, cooling and space costs) and also labour costs and resources required to manage spiralling data volumes. Typically, administrators can manage from three to 10 times more storage capacity once virtualisation is implemented.

Storage virtualisation also allows organisations to consolidate and utilise existing storage assets, extending their shelf life so they continue to deliver value. Organisations can also consolidate their management and storage services, using a single standard interface to manage storage, archive and backup functions.

Storage virtualisation allows organisations to consolidate systems and increase utilisation, significantly cutting the power required to both operate and cool their data centres. This reduces energy costs, which makes good business sense from an environmental and cost saving perspective.

Hitachi Data Systems is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008.

Source: StoragePR

Infosecurity Adviser says there are greater intrusions to fear than the DNA database

London, 4th August 2008 - Mike Barwise, from Infosecurity Adviser, the online forum run by the Infosecurity Europe team, has revealed he is less concerned about the privacy issues that the National DNA Database creates than about other planned government databases.

"The media seems preoccupied at the moment about people's DNA being stored centrally, but the reality is that the database is really a one-dimensional invasion of citizens' privacy," he said.

"Two-dimensional databases, such as the planned telecommunications database of the numbers that people call from their landlines and mobile phones, are much more worrying," he added.

According to Barwise, when you factor in the time element to the planned government telecommunications database and add in location-based data from the cellular carriers, you create a three-dimensional view of the person concerned.

"Not only do you have the numbers called and the locations called from, but you have a time-based diary from which you can extrapolate their movements," he explained.

In his online blog, Barwise notes that a sample of 30 ordinary UK citizens was assembled back in January to debate the pros and cons of the national DNA database.

"This has been a highly charged subject for years, not least due to the progressive extension of the scope of the database, culminating in recent proposals to include young children who might offend in the future - or indeed everyone in the country," he said, adding that the issue arouses strong emotions.

The proposed telecommunications database, however, he adds, would disclose your circle of business and social contacts, as well as your Web browsing habits.

This would, says Barwise, reveal vast amounts of information about your lifestyle.

Small wonder, then, that Barwise says that, against this backdrop, concerns about a DNA database start to pale into insignificance...

For more on Mike Barwise's comments:

For more on Infosecurity Adviser:

Source: InfosecurityPR