50% Increase in Removable Media Capacity Moves ProDefence to the Head of the Pack
London—October 16, 2008—ProDefence, a UK distributor of eSecurity products and services across the UK and Ireland, announced today that it will launch at Storage Expo a 1.5TB removable SATA disk for use with the removable disk backup systems it distributes from Idealstor, a leading manufacturer of removable disk-to-disk backup solutions. The announcement extends Idealstor’s lead in removable disk media capacity, which ProDefence claims further strengthens the argument for using removable disk media in place of tape for backup and disaster recovery.
ProDefence distributes Idealstor’s products in the UK and Ireland. Idealstor’s removable disk backup systems are designed to replace or augment tape backup and are available for organizations of all sizes, in configurations ranging from one removable drive bay up to eight, with removable disk capacities of up to 12TB per system. Unlike most disk-to-disk backup systems on the market today, which are designed simply for storing data, Idealstor uses a combination of hardware and software to make the disks removable so they can be used for offsite storage and disaster recovery. Idealstor’s removable media is unique in that it uses non-proprietary 3.5” SATA2 drives as backup media. These drives are available as a kit from ProDefence that includes a ruggedized removable disk caddy and a protective carrying case.
“This announcement is especially important for the SMB range of products,” says Ross Holmes, Sales Manager of ProDefence, the UK-based distributor of Idealstor products. “Our partners have been selling Idealstor backup systems for a few years now and have had a number of wins with the Teralyte product, which targets the SMB market. Most used to sell LTO-3 drives to these customers because of the capacity LTO offered. They have since switched to selling the Idealstor Teralyte because it not only offers much faster transfer rates and more reliable backups, but its native capacity is nearly four times that of LTO-3 and twice that of LTO-4. Now, with 1.5TB drives available, this product is even more compelling, as our partners can offer up to 1.5TB of removable media for much less than a branded LTO drive.”
“We are very excited to announce that we have successfully tested and certified the 1.5TB SATA2 drives for use with our removable disk backup systems,” said Ben Ginster, channel marketing manager at Idealstor. “This increase in capacity offers a major benefit to our existing and potential clients: without having to upgrade their backup system, they gain 50% more capacity. Existing clients can simply purchase larger disks as their data grows, rather than having to upgrade the entire drive and the backup media, as they would with tape.”
ProDefence is a distributor of eSecurity products and services throughout the UK and Ireland, offering the highest level of account management and technical support within the security channel. www.prodefence.co.uk
Idealstor manufactures removable/ejectable disk backup systems that are designed to augment or completely replace tape as backup and offsite storage media. The Idealstor Backup Appliance has been on the market for over 5 years, offering a fast, reliable and portable alternative to tape-based backup systems. Each Idealstor system uses industry-standard SATA disks as the target for backup data and as offsite media. Systems range from one removable drive up to eight and can be used by a range of businesses, from SMBs to corporate data centers.
Source: StoragePR
<>
Green IT still a priority despite Credit Crunch
London, 14th October – A survey of 513 organisations conducted by Storage Expo has found that, even though IT budgets are getting tighter because of the credit crunch, 70% said Green IT and efficiency remain a priority provided they also save money, while 4% declared it a priority even if there were no cost savings. One in ten organisations is no longer pursuing Green IT because of budget cutbacks, and 4% said Green IT was never a priority. Finally, a few respondents (2%) said they were so worried about their jobs that they could not even think about it.
Natalie Booth, Event Manager for Storage Expo 2008 says, “In today’s uncertain economic environment, with energy usage and prices increasing at a rapid rate, finding ways to reduce power consumption whilst maintaining the growth of one’s business is a high priority. In the Energy Efficiency Zone at Storage Expo, sponsored by IBM, visitors will be able to hear from leading authorities and technical experts on the best way to increase energy efficiency, save money and stay green. The credit crunch has made a number of businesses rethink their IT strategies and budgets. However, contrary to what most companies think, green IT and beating the credit crunch can go hand in hand if the right strategy is used.”
Talks at the Energy Efficiency Zone include:
- Energy Efficient Storage Systems by Mick Walker, IBM STG Green Computing Consultant, IBM
- Green Enterprise Storage Reinvented by Eyal Zimran, Global Director of Marketing and Alliances
- Archive Green by David Longson, IBM Storage and Data Services
With two days of stimulating and thought-provoking seminars that reflect the needs of today’s data storage professionals and information management experts, Storage Expo 2008 gives you the chance to improve and update your storage and information management strategies.
For more information on all seminar sessions, visit Storage Expo, the UK’s definitive event for data storage, information and content management, which provides visitors with the opportunity to compare the most comprehensive range of data storage solutions from all the leading suppliers whilst addressing today’s key issues.
Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at Olympia, London from 15-16 October 2008. For more information, visit www.storage-expo.com
Source: StoragePR
<>
Atlanta Technology demonstrates hosted storage services at Storage Expo
Storage Expo, 15th - 16th October 2008, London, Olympia – Stand 130
London, UK, 10 October 2008: Atlanta Technology, hosted technology specialists, is showcasing its range of storage virtualisation services at this year’s Storage Expo. During the show, Atlanta will be explaining how IT Directors, Managing Directors and CTOs can reduce capital expenditure on IT hardware and minimise administration costs whilst optimising storage capacity and performance via a hosted service model.
Atlanta Technology, which is sharing a stand with disaster recovery partner FalconStor, aims to demonstrate the benefits that storage and server consolidation can deliver to businesses of all sizes. Atlanta has many years’ experience designing, installing and supporting converged networks that are able to scale to meet the strategic needs of a business as it grows. During the show, the team will provide case studies of existing client implementations that demonstrate the time, cost and resource savings businesses have achieved as a result.
Simon Kelson, managing director, Atlanta Technology said: “In today’s market conditions, directors and managers are looking to identify cost savings to help protect cash flow. Hosted IT services, including storage and server virtualisation, are ideal for businesses that require guaranteed service levels and the ability to scale when needed, yet they do not carry the burden of heavy capital expenditure, because they become a monthly operational cost. So not only can businesses reap financial savings, but by opting for hosted services they can also aim to increase service levels whilst improving business continuity and disaster recovery plans.”
In 1996, the founders of Atlanta Technology began to build their business vision for managed services. Atlanta is a new breed of customer-focused IT partner that combines high levels of technical expertise with an in-depth understanding of customer needs. Today Atlanta is a trusted IT partner offering strategic advice and scalable, cost-effective IT solutions to small businesses and small to medium enterprises. Atlanta has developed a compelling remote services offering that removes the burden of managing expensive in-house IT resources, allowing customers to focus on core business issues. Its key competencies include Hosted Disaster Recovery Services, Hosted Servers, and self-hosted Server & Storage Virtualisation.
To find out more about Atlanta Technology’s range of services, visit stand 130 at Storage Expo on 15th - 16th October 2008, London, Olympia, or visit www.atlantatechnology.co.uk.
Source: Atlanta Technology
<>
Poor Data Classification can cost companies millions
London, UK, 10th October 2008 – A recent survey conducted by Storage Expo found that the main reason companies classify data is access control (67%), followed by retention control (21%) and retrieval and discovery (12%). Access control may be the key reason to classify data; however, Alan Pelz-Sharpe, Principal at CMS Watch, believes companies should place more importance on the impact of retrieval and discovery, with costs in this area reaching £1,000,000 per terabyte.
He says, “Typically 80% of mail data consists of duplication. Yet any search tool has to treat each piece of data equally, thus slowing down the process and pushing discovery costs through the roof.”
He goes on to add, “We estimate that the cost of 1GB of storage is about 10p; however, legal discovery on 1GB of storage would cost at least £1,000. So storing everything may seem cheap on the one hand, but it can become very expensive should something go wrong.”
Pelz-Sharpe will chair a keynote session on the subject of ‘Email Management and Archive – How to Spend Wisely’ on the 16th of October at 10:30am at Storage Expo 2008.
Theresa Regali, Principal, CMS Watch, says, “Increasingly, data classification is determined based on the intended use of data, rather than simply its subject matter or source. Classification is vital to ensure data doesn’t fall into the wrong hands and security protocols are met, and to facilitate enterprise-wide search, retrieval and discovery.”
Theresa Regali will chair a keynote session on ‘Data Classification: Can anyone really do it’ at Storage Expo on the 15th of October at 1pm. This session explores the criteria and policies that should be in place to assure coherent classification with respect to information value as it passes through its lifecycle. Key challenges addressed include managing information lifecycle value, meeting retention, discovery and recovery needs and determining appropriate classification schemes.
Speakers at the session include:
- Bob Plumridge, Member of the Board of Directors, SNIA Europe.
- Edward Wood, Director of Information Services, House of Commons Library.
Sessions that focus on Data Classification and Email Management include:
- Email Management beyond Archiving by Ken Hughes, CTO, C2C.
- The problem with archived data and how to solve it by Alec Bruce, Solutions Manager, Hitachi Data Systems.
- Information Lifecycle Management by Shahbaz Ali, CEO and Founder, TARMIN.
Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at Olympia, London from 15-16 October 2008. Register free at www.storage-expo.com
Source: StoragePR
<>
EuroSOX – Time for a new approach to compliance
By Jürgen Obermann, CEO of GFT inboxx GmbH
The 5th September 2008 marked the deadline for European organisations to transpose two new directives – the Statutory Audit Directive and the Company Reporting Directive – into domestic law. Commonly referred to as EuroSOX, this latest initiative is the European Commission's eighth guideline for the protection of shareholders, brought in with the aim of ensuring the reliability of annual accounts and consolidated financial accounts of companies, in the wake of recent high profile corporate fraud cases, such as the Parmalat scandal.
Despite the publicity around the introduction of EuroSOX proclaiming the drastic requirements expected of IT, there is surprisingly little in the EU guidelines about the concrete IT requirements organisations must meet to become compliant. This suggests that the current hype regarding 'EuroSOX compliance in IT' has been somewhat exaggerated. After all, companies operating globally have already had to abide by the International Financial Reporting Standards (IFRS) or the United States Generally Accepted Accounting Principles (US GAAP) if they wish to adhere to international legal regulations.
The impact on IT
Aside from the obvious changes necessary in IT, EuroSOX will additionally lead to some indirect IT requirements. These ultimately derive from requirements that qualified auditors have to meet, though they are mainly general requirements regarding the quality of systems, processes and data management that have already been prescribed for years – for example under Basel II.
In implementing EuroSOX, companies should not look on this as just another compliance regulation to be abided by, but rather as an advantageous tool which should be used to encourage greater business transparency.
Best practice approach to EuroSOX
As far as EuroSOX and other compliance rulings are concerned, IT departments should not try to interpret individual regulations and laws such as EuroSOX and Basel II themselves, but should instead concentrate on a holistic approach. Recent research commissioned by GFT inboxx underlines why: it found that 94% of IT managers in Europe have insufficient knowledge of the legal requirements regarding the archiving of e-mails.
IT departments must concentrate on their core tasks. They are not in a position to tackle the legal details of individual laws. This is a job for legally trained and specially qualified expert staff. By concentrating on the combined, generic requirements of all compliance guidelines, IT departments can tackle the issues at a higher level.
The requirements that an IT department should meet can be roughly divided into three basic tasks, although these are not mutually exclusive:
1. Generic best-practice data management and data handling – making sure that a consistent approach is taken across the board.
2. Long-term safeguarding and processing of all information. Preparation for possible disturbances (disaster recovery), secure long-term archiving of all information and ensuring access at all times within the parameters of storage times are of the utmost importance in this context.
3. Transparency, which is above all facilitated by the creation of powerful search functions and analytical methods covering all information in the company.
The first task is very much open to interpretation and is broad in nature. In the event of any doubt, any weak points coming to light as a result of audits and inspections can be resolved in this context. Items two and three, however, are clear and not open to interpretation. An email document either exists or it doesn't. Either a powerful overall search is possible, or it is not. Inspections will therefore concentrate on these points, so in the short term there is a need for action from the IT department in this respect.
Recommendations for IT departments
1. Do not tackle individual legal regulations such as EuroSOX – leave the interpretation to the specialist departments.
2. Don’t take a siloed approach. Instead, concentrate on implementing the common requirements for all compliance guidelines:
a. Transparency of IT processes;
b. Audit-proof long-term archiving and planning for disaster recovery;
c. Creation of an overall search and analysis platform to facilitate e‑Discovery.
3. In the short term, focus on (b) and (c). They are rigorous requirements that cannot be avoided.
4. Use this as an opportunity to create a business case for other IT projects.
GFT inboxx is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
62% of companies use Data De-duplication to chase away Storage Inefficiency
London, UK, 6th October 2008 - Research by Storage Expo has found that 62% of companies use de-duplication to increase efficiency and reduce storage requirements and costs. A further 27% intend to implement de-duplication technology within the next 12 months. At the other end of the spectrum, 4% do not intend to join the bandwagon, and 6% claimed that de-duplication was not delivering what they expected.
Simon Robinson, Research Director, The 451Group, commented, “Data De-duplication has emerged as one of the most talked about technologies in storage, and IT departments are actually embracing it, suggesting that De-duplication can provide real value and return on investment to businesses that deploy it.”
Natalie Booth, Event Manager for Storage Expo 2008 said, “Data de-duplication is a relatively new method of reducing storage needs and boosting efficiency by eliminating the inherent data redundancies that exist in many traditional storage and data protection processes, such as backup. Heralded as one of the most exciting technologies in the storage market, de-duplication has left organizations very excited about its possibilities.”
Simon Robinson will be chairing a seminar on data de-duplication called ‘Reducing your Data Footprint with De-duplication’ on the 15th of October at 3:45pm. The seminar will be focussing on organizations that have implemented de-duplication technology and discussing their experiences of whether the reality really does live up to the hype. Key challenges addressed at the seminar will include reducing storage requirements, increasing efficiency, reducing energy requirements and reducing asset redundancies.
The other speakers in the keynote are:
- Steve Bruck, Infrastructure Architect, Associated Newspapers Ltd
- Simon Spence, CIO, CB Richard Ellis
With two days of stimulating and thought provoking keynotes and seminars that reflect the needs of today’s data storage professionals and information management experts, Storage Expo 2008 gives visitors the chance to improve and update their storage strategies.
Other Seminars that focus on De-Duplication include:
- De-duplication Explained by Barnaby Skivington, Senior Consultant, Data Domain
- Backup-to-Disk with De-duplication. Lowering Costs, Improving Protection, and Simplifying Management by Sean Livingstone, Technology Business Consultant, EMC
- Storage Management for the 21st Century by Jonathan Kamminga, Global Solutions Architect, Dell
- The Reinvention of Storage by Eyal Zimran, Senior Director of Marketing and Alliances, IBM
Source: StoragePR
<>
ABSplus for when the worst happens
Your PC does not work… now what do you do?
By B.Blanchard, CMS Products Inc
Introduction:
New working practices have led to more employees working from home and more mobile workers carrying laptop PCs. In both cases the PC has become a vital tool in the business process and the individual is working without the benefit of IT co-workers on hand to provide a quick response when problems occur.
One of the key methods of business communication is now e-mail, which means the PC becomes the main link to the rest of the organisation as well as to suppliers and customers. The PC is used for many business functions, from diary to typewriter and calculator, but what if your PC fails to work properly – what do you do? There is no time to send it off to the IT group for repair; you need it now. Self-help may be the only solution feasible in the time available.
The Problems:
Your PC problem could be one of several types.
PCs today have become much more reliable and laptops have become small enough to be treated like appliances. We forget the technology inside is both advanced and vulnerable, being susceptible to shock and other environmental issues. This can lead to failure of the system’s hard drive – one of the more common hardware failures.
Today there are thousands of programs available for the PC, and any one PC will have a combination of these programs according to personal preference and the requirements of the owner’s work. This can sometimes lead to problems, particularly after a software change or update.
Viruses and malware threaten our PCs every day, and even a protected PC can succumb to the threat, leaving the PC in a dangerous state. Not only is its operation affected, but it can be difficult to eradicate the problem, which can lead to further infection. A good, clean replacement system is required.
A Solution is needed
What is needed is a solution that you can use when any of the above occurs. It must be quick and easy to implement in order to get your PC running properly again.
There are many backup solutions on the market today, but all they really do is automate the copying of files from the data area to another disk. They may not even access all areas of the disk, so files such as e-mail stores may not be copied. The backup created would be of no use in recovering the use of your PC.
Some Backup solutions take the approach of making an image backup of the system disk. These usually blindly copy information from each part of the system disk to an image backup device like a CD/DVD or tape where the information is not stored in its native format and therefore has to be completely restored back to the system disk before it can be used. This takes a long time and furthermore, each backup requires all the information to be copied again. It is not possible to copy only the changes. When processes take a long time they are often forgotten or ignored by users.
Industry analysts would agree that the fastest way to recover a PC is with a replacement system drive. When trouble occurs, recovery is achieved by starting the PC from the replacement system disk. An example of this kind of solution is the CMS Products ABSplus, where the BounceBack Ultimate software creates the replacement system drive and keeps it up to date by backing up just the changes. When recovery is required, you restart the PC from the ABSplus disk on the USB port.
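To make the change-only idea concrete, here is a minimal sketch in Python (an illustration only, not the BounceBack Ultimate implementation; the paths and the size-and-timestamp comparison rule are assumptions) of how a backup tool can copy just the files that are new or modified since the previous run, which is why repeat backups finish far faster than a full image copy.

```python
import os
import shutil

def incremental_sync(source_root, backup_root):
    """Copy only files that are new or have changed since the last backup."""
    for dirpath, _dirnames, filenames in os.walk(source_root):
        relative = os.path.relpath(dirpath, source_root)
        target_dir = os.path.join(backup_root, relative)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            # Skip files whose size and modification time are unchanged.
            if os.path.exists(dst):
                s, d = os.stat(src), os.stat(dst)
                if s.st_size == d.st_size and int(s.st_mtime) <= int(d.st_mtime):
                    continue
            shutil.copy2(src, dst)  # copy2 preserves timestamps

# Example (assumed paths): keep a backup drive mounted at /mnt/backup current.
# incremental_sync("/home/user", "/mnt/backup/user")
```

A real replacement-drive product goes further, also copying the operating system, boot records and partition layout so that the backup disk is itself bootable.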
The required solution is one which quickly recovers use of the laptop or desktop, and it must be quick and easy to use, since there may be no IT assistance readily available to the user. It should be able to password-protect access to the information and offer the user the chance to encrypt sensitive data. Further solutions that protect data during times when it is exposed and vulnerable, such as during travel, should also be available and capable of being integrated.
Furthermore, this solution should have the ability to encrypt those files deemed to be sensitive by the user or company. When the PC is recovered, these encrypted files should be automatically restored to their decrypted state on the hard drive after the user enters the correct password.
Conclusions:
Companies that want to reduce their IT costs and keep their users productive deploy solutions which generate replacement system drives. Return on investment can be measured in weeks rather than years and is easily calculated from known business parameters.
In the case of a Laptop PC, the replacement solution is in the form of a small palm-sized hard drive which can be carried with the laptop. For a desktop PC, the solution could sit inside the PC itself, ready to be accessed when trouble strikes.
Self-help IT products mean less time and distraction for the IT department and the users can help themselves when they need to, without delay.
CMS Products Inc. is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
Intelligent Information Management for the Real World
By Richard Grondin, VP R&D and Product Deployment, SAND Technology
Intelligent Information Management (IIM) is not really a new concept. What is new is the scope and volume of enterprise data that needs to be managed, and the more stringent regulations concerning data governance by which organizations must abide. Now more than ever, enterprise data assets need to be managed carefully to protect data access, immutability, privacy, monitoring capabilities, auditability, and business value over the complete information lifecycle. This is precisely what IIM is about, and to be successful it needs to be implemented with a focus on the data itself instead of on the specific ways it will be used. This requires a paradigm shift at the enterprise level: a realignment of IT architectures from an application-centric to a data-centric approach. Business needs are changing quickly, and IT architectures should be able to satisfy them within reasonable timeframes and at acceptable cost, all the while protecting enterprise data assets.
The Corporate Information Factory model, developed by Bill Inmon, is currently in use by a variety of organizations. Typically, the starting point for a Data Warehouse implementation is a business requirement for specific reports. The data architects then identify what information is available, how to get access to it and the level of transformation required to produce those reports (this is the data preparation phase). While this approach has brought significant benefits for many enterprises, it also has some weaknesses – the most important being that it covers only the data associated with a specific need at a specific point in time. For this reason, such an approach could be termed application-centric.
Now, new legal regulations are putting increased pressure on organizations to retain, and maintain access to, a greater variety of data. Data not associated with any specific business requirements must now be kept around “just in case” it is needed, and at the same time data governance policies need to be introduced. The easiest way to respond to these new data requirements would be to store the information on tape, but then the problem becomes how to get access to it. For this reason, some organizations have opted to transform their data warehouses into de facto storage devices. However, a side effect of this approach is that DBA teams are under increasing pressure to maintain the various Service Level Agreements (SLAs) that have already been established, since keeping more and more data in the warehouse while maintaining good performance can be a difficult proposition. This is one of the reasons there are so many RDBMS products currently available on the market.
An Intelligent Information Management implementation can help organizations to overcome these new challenges without going through multiple “revolutions” in their current data architecture. IIM can help satisfy data governance requirements and at the same time improve “data agility” for efficient delivery of Business Intelligence value. This type of implementation requires a shared vision and best practices program supported by a flexible data-centric architecture, along with an iterative transition process to review the various data requirements in the organization.
Many organizations have already deployed multiple data warehouses, data marts and cubes to satisfy their business intelligence requirements. Much has been invested in such deployments, and to protect this investment, IIM has to be implemented as an evolution of the infrastructure currently in place rather than a revolution requiring everything to be rebuilt from the ground up. Typically, the first step taken by organizations implementing IIM best practices is to introduce an Information Lifecycle Management (ILM) infrastructure.
SAND Technology is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
Keep Your Data Lean… and Green
By Peter Olding, General Manager, SAND Technology
What keeps you awake at night? Is it having to go to your Board of Directors or CIO tomorrow and explain why your organisation cannot hit agreed service levels around your storage infrastructure or perhaps beg for more capital expenditure to increase your existing infrastructure footprint? If so, read on…
We all know that the challenge of retaining and maintaining the vast volumes of data required for business and regulatory purposes is one that faces every modern IT organisation. Increasingly, this puts pressure on IT budgets and data centre infrastructure, and it is compounded by rising operational and energy costs.
So in the current economic and environmental climate, should we really be buying and consuming more storage? Somewhat controversially, but perhaps timely, I will discuss a way to buy less storage - or do more with your existing infrastructure.
By way of example, let’s consider 100 terabytes (TB) of structured data with varying requirements such as complex analysis, long-term data retention and archiving. These may be new requirements or extra load on an existing data warehouse. Either way, you’re faced with a number of questions and decisions, and you will no doubt be bombarded by vendors and consultants offering best advice – or their version of best advice – which isn’t usually cheap. So you’re looking at an expensive system with some expensive consulting. Or are you?
Very few people tend to look at these requirements from the perspective of the data or consider a data-centric architecture, as we are so used to deploying application-centric architectures. In simplistic terms you’re looking at a large pool of data. And you’re not entirely sure what it’s going to be used for except that you have to store, manage and retrieve it. In essence, this large pool of data is a corporate information memory – a primary repository of enterprise data which will enable your company to face the challenges mentioned earlier.
There are a number of approaches to deploying a corporate information memory architecture but in essence it comprises a mix of software, solutions and techniques which, when combined, enables companies to deliver significant commercial, competitive and environmental advantage.
By combining deduplication, compression and indexing techniques, this 100 TB of data can be held in under 2 TB. That’s right – 2 TB – or a 98% reduction. Not only that, but it is optimised for fast querying and retrieval, requires little administration and can work with Oracle, SQL Server, DB2, SAS, Business Objects and so on.
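How such reductions are achieved will vary, and SAND’s exact techniques are not described here; the sketch below is a generic Python illustration of the underlying principle, combining chunk-level deduplication (hashing blocks so each unique block is stored only once) with compression. The chunking scheme and the sample data are invented for the example.

```python
import hashlib
import zlib

def dedupe_and_compress(chunks):
    """Store each unique chunk once, compressed; duplicates become references."""
    store = {}        # chunk hash -> compressed bytes (stored once)
    references = []   # the logical data, expressed as a list of chunk hashes
    raw_size = 0
    for chunk in chunks:
        raw_size += len(chunk)
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(chunk)
        references.append(digest)
    stored_size = sum(len(blob) for blob in store.values())
    return store, references, 1 - stored_size / raw_size

# Highly repetitive sample data deduplicates and compresses dramatically.
blocks = [b"customer-record-template " * 64] * 1000 + [b"unique-row-%d" % i for i in range(10)]
_, _, reduction = dedupe_and_compress(blocks)
print(f"space saved: {reduction:.2%}")
```

Real structured data is rarely this repetitive, so the ratio actually achieved depends on the data profile; the point is simply that redundancy, once identified, does not need to be stored or queried twice.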
To demonstrate the benefits of this approach, consider an organisation that retains and analyses huge volumes of retail data in an ever changing, very demanding and thin margin industry. They are constantly striving to introduce operational improvements across the business. As a result of deploying a corporate information memory architecture, they reduced their storage infrastructure for their analysts by a factor of thirty which resulted in:
- Lower administration costs enabling IT staff to creatively support the business instead of constantly fire fighting
- Improved operational efficiency to over deliver on service levels for operations and querying
- Increased productivity of their end user analysts by a factor of three
For years the IT industry has been obsessed with size – who has the largest data centres, largest data warehouses, largest servers and so on. Isn’t it about time we got clever, applied innovative solutions to problems and bragged instead about who has the smallest data centre footprint and the shortest batch windows?
By the way - what keeps me awake at night? It’s the ducks that live at the end of my garden performing duck karaoke well into the night.
Peter rejoined SAND in 2006, bringing a wealth of commercial experience in data analysis and retention. Prior to this, he was instrumental, as a Business Development Director, in the early success of ClarityBlue, an analytic CRM provider which spun off from SAND in the UK. Before that, he was an Account Director at WhiteCross Systems (Kognitio), having moved from a consulting role with British Telecom where he was involved in a number of data-centric projects.
Peter has a Bachelor’s degree in Geological Sciences and a Master’s degree in Information Technology, and spent his early career as a data analyst on the rigs in Africa, the Middle East and Scandinavia. He started a PhD in genetic algorithms until the lure of industry became too great.
He feeds ducks in his spare time.
SAND Technology is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
Over 60% of companies rate consolidation as main objective for Virtualisation
London, UK, 1st October 2008 - Recent research carried out by Storage Expo on 362 companies found that the main objective for implementing virtualisation was server consolidation (62%), closely followed by new management capabilities (30%). A small percentage (6%) rated availability as an objective, while only 2% had no plans to implement virtualisation.
According to Natalie Booth, Event Manager for Storage Expo 2008, “Virtualisation has seen dramatic adoption by companies in recent years and the qualitative benefits of flexibility, recoverability and assurance are well known. Virtualised server infrastructure is a powerful approach to lower costs, improve manageability, and dramatically increase utilisation.”
John Abbott, Chief Analyst, The 451 Group sums up Virtualisation as “proving to be a catalyst for introducing or revitalizing related technologies. It is easier to move virtual resources around a datacentre (or multiple datacentres) in response to demand, to deploy new resources more rapidly, and to redeploy them once they are no longer required. And it is also easier than in the past to integrate surrounding tools (such as monitoring, billing and chargeback) with virtualised resources.”
John Abbott will chair a keynote programme on ‘Improving Asset Utilisation with Virtualisation’ at Storage Expo 2008 on the 15th of October at 2:15pm. The programme explores the benefits of separating the physical configuration from the application and how virtualisation can deliver benefits to a company across all electronic assets in the global network. Key challenges addressed in this session include virtualisation for servers and other electronic assets, understanding the financial implications of implementing virtualisation and optimising current assets.
Speakers include:
- Dr Zafar Chaudry, Director of Information Management and Technology, Liverpool Women's NHS Foundation trust
- Richard Gough, IT Operations Manager, The Wellcome Trust
- Jon Hutchings, Senior Systems Engineer, Network Systems Management Services, Oxford University
- Steven Shaw, ICT Manager, British Horseracing Authority
Sessions that focus on Virtualisation include:
- Virtualisation, Consolidation and Application-Aware Storage - The New Mandates for Datacentre Efficiency by Adrian Groeneveld, Director of Product Marketing EMEA, Pillar Data Systems
- Storage Virtualisation, does it meet today's business needs by Simon Brassington, Enterprise Product Manager, HP
- Virtualising the SMB; Disaster Recovery That Does not Break the Bank by Ed Harnish, Vice President of Marketing, Acronis UK
- Virtual Machine Business Continuity and Disaster Recovery by Luke Reed, Consulting Systems Engineer, NetApp
- How to help you reduce spend by 50% on Storage for Virtualised Servers by Rich Fenton, Consulting Systems Engineer, NetApp
Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at Olympia, London from 15-16 October 2008. Register free at www.storage-expo.com
Source: StoragePR
<>
Facing up to information overload
Recent HP research shows that the average UK employee now spends up to five weeks a year looking for lost computer files and data. Stephen Watson, Product Marketing Manager for HP StorageWorks in the UK looks at the causes of the ‘information overload’ and suggests some possible cures.
Information is a gift and a curse. It lies at the heart of today’s knowledge economy, allowing organisations to create wealth through its distribution and manipulation. No company could function without it. But in recent years the spiralling growth in the volume of information has left many organisations drowning in data and lacking an effective means of managing it.
The loss in productivity caused by data overload is reaching crisis proportions. Research firm Basex recently chose ‘information overload’ as its Problem of the Year for 2008, estimating that the annual loss to the US economy alone is $650 billion and warning that failure to solve the problem will lead to “reduced productivity and throttled innovation.”
Scaling up comes at a cost
The issue of information growth has often been tackled by simply adding new storage server capacity, bringing with it serious financial implications. HP, for example, estimates that for every $1 spent on capacity, management of that capacity adds a further $3.
The costs associated with a kilowatt of electricity are also rising significantly. According to the Uptime Institute, the current three-year cost of powering and cooling servers is around one-and-a-half times the cost of purchasing the server hardware.
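Taken together, those two ratios mean the purchase price is only a fraction of the real bill. The short Python calculation below is a rough sketch that simply applies the ratios quoted above to an assumed $100,000 hardware purchase; actual figures will differ from site to site.

```python
def three_year_storage_tco(hardware_cost,
                           management_multiplier=3.0,  # ~$3 of management per $1 of capacity (HP estimate)
                           power_multiplier=1.5):      # power/cooling ~1.5x hardware over 3 years (Uptime Institute)
    """Rough three-year cost of ownership built from the two ratios above."""
    management = hardware_cost * management_multiplier
    power_and_cooling = hardware_cost * power_multiplier
    return hardware_cost + management + power_and_cooling

# An assumed $100,000 of new capacity implies roughly $550,000 over three years.
print(three_year_storage_tco(100_000))
```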
Scaling up in response to data overload also has environmental costs. The Environmental Protection Agency estimates that the total power consumed by servers in the US amounted to 1.2 percent of total electricity in 2005, double the 2000 figure, with IT hardware and data representing the biggest national contributor to carbon emissions.
Information clutter
Some of the reasons for the massive increase in corporate data have been well cited, including the growth of e-mail and other forms of electronic communications. Perhaps more surprisingly, a recent cross-sector survey of 350 UK IT Managers commissioned by HP, revealed that up to 45% of staff store digital content on corporate networks, placing extra strain on storage requirements.
But it is not just the explosion in the volume of data that is causing an information strain – it’s the way it is organised. Nearly two thirds (62 percent) of respondents to the same survey said information was often duplicated within their organisation because it was difficult to find.
This information clutter has a negative knock-on effect for IT departments. Increasingly, IT managers are spending a disproportionate amount of their time fielding requests from employees trying to find basic information, rather than focusing on improving infrastructure or on more strategic IT projects.
Regulatory headaches
Companies now face the added challenge of storing data in line with regulatory requirements. When first introduced, these highly complex compliance regulations created immense pressure on organisations because the specific requirements for data management were frequently not fully understood.
The rapid introduction of policies addressing data authentication, data capture, and its distribution, often led to ‘keep-or-dump’ on-the-fly decision making. Erring on the side of caution, many companies have responded by taking the short-term decision to store everything.
In the absence of effective business intelligence tools, they now find they are unable to retrieve data on demand, leading to potential issues with breaches of compliance, decisions based on poor information, and lost business opportunities. This makes independent audits difficult and time consuming to support.
Just as organisations were coming to terms with existing corporate governance legislation such as Basel II, a new set of directives is now causing widespread concern for business and IT professionals charged with the responsibility of implementing effective information management policies.
The imminent arrival of EuroSOX, a set of European Union directives on corporate governance due to start being passed into law by member states this summer, will push information management systems to new levels of sophistication.
Grasping the information nettle
Although technology is causing information overload, many forward thinking companies realise it can also offer ways to combat it. By grappling with the issue through an effective information management strategy, companies are looking to improve customer satisfaction, increase operational efficiency, and enhance overall business performance.
Next generation solutions can help ensure that information isn’t simply ‘siloed’ into ever-growing storage farms by providing data storage infrastructure and the back-up software that goes with it, complemented with business intelligence and data warehousing tools.
Using industry-leading business intelligence solutions companies can derive more value from their data, manage information more efficiently, mitigate risks and assure compliance with regulatory mandates. They can also accelerate business growth by making better, faster decisions.
Putting information to work
Information is critical for every business and with the right strategy and tools in place, its management can be simplified to make it easier to create, locate and analyse corporate data. This in turn allows staff to find the right data quickly so they can devote their time to carrying out more valuable tasks.
By embracing next generation management information tools organisations can help employees to take informed decisions and reduce data duplication, making information work harder for them – rather than having to work hard to find the right information.
HP StorageWorks is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
CMS Products Introduces the EasyEncrypt Upgrade Kit
Hardware-encrypted Disk Drive protects Laptop PC data from unauthorized access
Hook, UK, September 2008 – CMS Products, Inc. (www.cmsproducts.com), a leader in data security, backup, disaster recovery and content management technologies, announces availability of its EasyEncrypt Upgrade Kit for laptop PCs with SATA hard drives.
The EasyEncrypt Upgrade Kit allows laptop computer users to quickly and easily replace existing, non-encrypted system disk drives with a state-of-the-art, secure, hardware-encrypted disk drive using AES 128-bit encryption and strong password support, so data is always safe and secure. EasyEncrypt supports BIOS-level ATA password locking for pre-boot authorization – during installation you create and enter your password.
EasyEncrypt comes with all of the hardware required to connect its encrypted disk drive to your laptop PC via the USB interface. The BounceBack transfer software transfers the PC’s Windows operating system, applications, data and personal settings to the EasyEncrypt disk drive, encrypting the files as they are written onto the disk. The new encrypted disk drive can then physically replace the non-encrypted disk drive within the laptop; when you then power on your PC, it will prompt you for your password.
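EasyEncrypt performs the encryption inside the drive hardware, so the following is only a software analogue of the encrypt-as-you-copy idea rather than the product’s mechanism; the library, the password-to-key derivation and the paths are all assumptions made for illustration, and the sketch requires the third-party Python ‘cryptography’ package.

```python
import base64
import hashlib
import os
from cryptography.fernet import Fernet  # AES-128-CBC with integrity checking

def key_from_password(password: str) -> bytes:
    # Illustration only: a real product would use a proper KDF (e.g. PBKDF2 with a salt).
    return base64.urlsafe_b64encode(hashlib.sha256(password.encode()).digest())

def copy_and_encrypt(src_root: str, dst_root: str, password: str) -> None:
    """Walk the source tree and write an encrypted copy of every file."""
    cipher = Fernet(key_from_password(password))
    for dirpath, _dirs, files in os.walk(src_root):
        out_dir = os.path.join(dst_root, os.path.relpath(dirpath, src_root))
        os.makedirs(out_dir, exist_ok=True)
        for name in files:
            with open(os.path.join(dirpath, name), "rb") as fh:
                data = fh.read()
            with open(os.path.join(out_dir, name), "wb") as fh:
                fh.write(cipher.encrypt(data))
```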
CMS Products Inc. will be presenting at Storage Expo, Olympia, London on 15th & 16th October 2008 on stand 490.
EasyEncrypt is available through resellers and at www.cmsproducts.com, and prices start from £199 incl. VAT.
CMS Products has sold more than two million units of software while installing more than four million complete storage solutions in 90-plus countries. The complete line of product offerings includes automatic backup solutions for both portable and desktop computers, RAID systems, backup and disaster recovery software, media management software, laptop hard drive upgrades and data transfer kits and high capacity desktop hard drives.
CMS Products, Inc., Velocity Series, BounceBack and QuickRestore are trademarks or registered trademarks of CMS Products, Inc. Any other product names are trademarks or registered trademarks of their respective companies.
For further information on CMS Products please visit: http://www.cmsproducts.com
Source: StorageExpo
<>
CMS announces BounceBack Ultimate Recovery/Backup software
BounceBack™ Ultimate Recovery & Backup Software to be launched at Storage Expo London
Major Changes Implemented – Failsafe recovery from USB port / Hardware Encrypted backup drives supported / CDP improvements
Hook, UK, September 2008 – CMS Products, Inc., (www.cmsproducts.com) a leader in data security, backup, disaster recovery and content management technologies will launch its newest disaster recovery software, BounceBack Ultimate at Storage Expo London.
Several new features enhance CMS Products’ reputation as a major player in providing professional-grade backup and recovery solutions to corporate clients.
“The most important aspect of a Backup/Recovery system for anyone with a PC is the ability to recover quickly and easily from a disaster”, said Brian Blanchard, Sales Director-EMEA at CMS Products Inc. “Recovery is especially important to the employee who is based remotely from the corporate IT group and to the mobile worker whose laptop PC is a vital tool in the business. With BounceBack Ultimate, recovery is as quick as re-starting the PC from the BounceBack drive attached to the USB port. Everything will be there, where you expect it, including your data files.”
Important BounceBack features include:
- Easy installation, with the first backup creating a spare system disk on the backup drive, including all partitions, the operating system, applications and data files.
- Recovery is simple - just re-start the PC from the backup drive and you are working again in a few minutes.
- Continuous Data Protection (CDP) will back up your files in the background while you continue to work, or you can simply plug the backup drive into the USB port to launch an incremental backup.
BounceBack Ultimate supports Windows Vista, XP and 2000 operating systems and will be available from leading resellers or from the company’s web site, www.cmsproducts.eu. Beginning in October it will ship with ABSplus automatic backup solutions.
CMS Products Inc. will be present at the Storage Expo show in Olympia, London on 15th & 16th October 2008 on stand 490 and will be discussing its range of Backup & Recovery and secure mobile storage products.
“Don’t buy just any backup software; insist on one that creates a ready-to-use replacement for your PC’s system disk, and get data protection and the ability to recover your PC in minutes,” said Blanchard.
CMS Products has sold more than two million units of software while installing more than four million complete storage solutions in 90-plus countries. The complete line of product offerings includes automatic backup solutions for both portable and desktop computers, RAID systems, backup and disaster recovery software, media management software, notebook hard drive upgrades and data transfer kits and high capacity desktop hard drives.
CMS Products, Inc., Velocity Series, BounceBack and QuickRestore are trademarks or registered trademarks of CMS Products, Inc. Any other product names are trademarks or registered trademarks of their respective companies.
For further information on CMS Products please visit: http://www.cmsproducts.com
Source: StorageExpo
<>
Effectiveness and Efficiency drive Storage into the Clouds
London, UK, 25th September 2008 - A survey of 875 organisations by Storage Expo has found that the main driver of their current storage policy is storage effectiveness (60%), necessitated by the need for reliability, scalability and access speed. The second most important driver was storage efficiency (33%), resulting from the need to balance cost against capability. The least popular driver was green criteria (7%).
Jon Collins, Service Director, Freeform Dynamics commented, “This is quite fascinating, and confirms a trend that we have seen in other studies: that organisations are prioritizing effectiveness over efficiency when it comes to setting policy and making purchasing decisions.”
With data storage volumes still growing at over 50% per annum, the need for effective and efficient Storage architecture has never been greater. In keeping with the current demands Storage Expo 2008 brings together an exciting and informative portfolio of Storage seminars that take an in-depth look at some of the latest trends in data storage and information management today.
According to Claire Sellick, Event Director for Storage Expo 2008, “One of our keynote sessions, on the 15th at 11:30am and led by Jon Collins, includes senior executives from six of the leading storage companies in the world discussing their interpretation of the drive for efficient architecture and exploring the hype around cloud computing and emerging technologies that may radically change the way business operates. Key challenges addressed in the session will include reducing storage costs, growing storage with your needs and understanding the rise of cloud computing, amongst other things.”
Speakers at the session include:-
- Adam Thew, Storageworks Director for UK&I, HP
- Adrian Groeneveld, Director of Product Marketing EMEA, Pillar Data Systems
- Ian Masters, UK Sales and Marketing Director, Double-Take Software
- Johannes Kunz, Senior Director for Solutions Marketing and Business Development EMEA, Hitachi Data Systems
- John Rollason, Product Marketing Manager EMEA, NetApp
- Mark Kenealy, Director Technology Solutions, EMC2
The seminar will take a detailed look at cloud computing as it stands today and what the future roll out of cloud technologies will mean for effective and efficient storage. Included in this discussion will be the often overlooked aspects of cloud computing such as disaster recovery, data security and who will be held accountable for data loss.
Storage Expo 2008 at the National Hall, Olympia on the 15th and 16th October, is the UK’s definitive event for data storage, information and content management. Providing the opportunity to compare the most comprehensive range of solutions and services from all the leading suppliers, the show features over 100 of the world’s top storage vendors and an extensive, cutting-edge free education programme with over 62 experts speaking, including sessions that will address the latest issues on how to tackle data growth and disaster recovery.
The education programme for 2008 has been expanded to reflect the needs of today’s data storage and information management experts as they become as concerned with information and data management as they are with storage capability, scalability and infrastructure. With strategic and technical analysis, case studies and storage management reviews, this year’s programme will reveal expert knowledge of how information management can increase both storage efficiency and information utilisation for business application.
For more information or to register free to attend please visit www.storage-expo.com
Source: StoragePR
<>
Providing a Service is all about the client... yeah right!
By Guus Leeuw, President & CEO, ITPassion Ltd
There are two types of storage services that can be provided to organisations: in-sourcing and out-sourcing.
With in-sourcing, the client would receive, say, a storage administration team that then works on the premises and with the equipment of that client. The business model behind this type of service is quite easy to set up and to sell. Setting an average cost, to be paid per team member per month, is easily done, as one only has to look at the salary ranges of these people to figure out how much the client should be paying for each of them. As Junior Administrators are likely to be less expensive than a Senior Administrator, an average price covering all grades between Junior and Principal is easily arrived at.
The client side of this is that one would expect a good balance in skills, ranging from junior people to Principals. However, it is fairly easy to provide a lot more junior people than a good balance would suggest, and thus under-deliver on the quality of service.
The result of such a scheme would be that the client is over-charged for the service that it receives. After a while the client becomes unhappy with the service and starts looking for a different organisation to provide more of the same. Meanwhile, the storage provider gets a nice bonus for under-delivering on the quality and is doing well financially.
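The arithmetic behind that bonus is straightforward. The sketch below uses invented day rates and headcounts purely to illustrate how the same blended price can hide very different delivery costs depending on the seniority mix.

```python
DAY_COST = {"junior": 250, "senior": 400, "principal": 550}  # assumed internal daily costs

def provider_margin(blended_day_rate, team_mix):
    """team_mix maps grade to headcount; returns the provider's margin on the team."""
    headcount = sum(team_mix.values())
    revenue = blended_day_rate * headcount
    cost = sum(DAY_COST[grade] * count for grade, count in team_mix.items())
    return (revenue - cost) / revenue

# The same blended rate of 500 per day, costed against two different mixes.
balanced     = {"junior": 2, "senior": 2, "principal": 2}
junior_heavy = {"junior": 4, "senior": 1, "principal": 1}
print(f"balanced mix margin:     {provider_margin(500, balanced):.0%}")      # 20%
print(f"junior-heavy mix margin: {provider_margin(500, junior_heavy):.0%}")  # 35%
```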
The difficulty in this scenario is to find and maintain the right balance, for the sake of the client. The interests of both parties are essentially conflicting: the service provider wants to reduce cost, whereas the client wants to improve quality of service. Often this conflict of interest is not understood by the client, who assumes that the service provider will do its utmost to provide a good service – and the service provider is eager to keep it that way.
It would be a good thing if the service provider cared more about long-term profit by making clients happy, for the only good client is a happy client.
Out-sourcing is a lot trickier to set up from a business model perspective. Several factors play a role: the cost of the data centre, electricity, cooling, equipment and staff must all be reflected if the price of 1TB of storage is actually to match the cost the service provider incurs in providing and managing that terabyte of storage.
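A minimal sketch of such a cost model follows. Every input figure is an assumption, and a real model would add licences, migration, growth and the provider’s margin, but it shows why a per-terabyte price is hard to judge without seeing the calculation behind it.

```python
def monthly_cost_per_tb(capacity_tb,
                        datacentre_rent=8_000,    # per month, assumed
                        power_and_cooling=5_000,  # per month, assumed
                        equipment_capex=600_000,  # assumed, amortised over 48 months
                        staff_cost=12_000):       # per month, assumed
    """Very rough monthly cost to the provider of one managed terabyte."""
    monthly_total = (datacentre_rent
                     + power_and_cooling
                     + equipment_capex / 48
                     + staff_cost)
    return monthly_total / capacity_tb

# A provider running 500 TB of usable capacity on these assumptions:
print(f"cost per TB per month: {monthly_cost_per_tb(500):.2f}")  # 75.00
```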
There are several ways to make sure one can over-charge a client; the most obvious is to hide the business model and the calculations that resulted in the price of that terabyte of storage. Unless faced with a procurement team that has already done the cost calculations for its own organisation, not many clients understand the business model behind storage service providers.
Another easy way to reduce one’s costs, from a provider perspective, is to utilise low-cost labour. Low-cost labour is often also less experienced. Again, here is a conflict of interest: the client wants good-quality storage services, whereas the provider wants to reduce the cost behind its business model.
In reality, providing a service to a client is about making a profit from that client. The question that the client should ask and answer for himself is: how much of a profit do I want the service provider to make? Only when the answer is understood should one go about selecting a service provider.
Guus Leeuw jr. studied Software Development at the Polytechnic High School of Information & Communication Technology in Enschede, the Netherlands. Soon after gaining his degree he was hired by EMC Germany to aid internal software development. Guus subsequently travelled and worked across Europe before setting up his own software and storage company, ITPassion, in 2007.
ITPassion Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
Low cost automatic backup and remote data duplication solution for SMBs
By Ernesto Soria-Garcia, VP Sales IDS-Enterprise
When looking into products that address the daily task of making sure our everyday, all-important documents and databases are safely secured away, we find a host of different options and specialized products that do so with proficiency.
However, if we look closely at the issues concerning data safety, and at solutions for restoring data after accidental deletion or the unfortunate loss of a laptop under whatever circumstance or ‘force majeure’, as they say, it is clear that not all issues are covered by one solution, and that we are often obliged to build up a Meccano-like assembly of products and manage how they fit together.
In most small and medium businesses, not to mention micro enterprises or independent consultancies, addressing these issues ourselves becomes almost impossible, as we are not necessarily knowledgeable enough to feel confident having a crack at it, especially with the heavy burden of responsibility that goes with managing such solutions.
So often we throw up our hands and decide that we will survive without, or accept a partial solution, or of course look for a third party to take care of this aspect of our business on our behalf. Often the more secure or complex the solution, the more costly it is, and the more dependent we become on the third party we have called to our aid.
Linux based mini servers can offer SMBs and micro enterprises a solution that backs up data from the PCs in the office to a locally based unit and then duplicates it over existing ADSL lines to a second, identical unit of their choice for disaster recovery, all automatically, extremely fast and at a surprisingly low cost. The server can integrate the most advanced enterprise-class technology yet allow SMBs and micro enterprises with no more IT skills than the general PC user to set it up and run it, providing for the first time truly private and fully confidential outsourcing of the duplication of one's data. The intelligent server can monitor the whole process with security checks and counter-checks, reported to the user as well as to the manager/owner of the SMB.
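As an illustration of this kind of local-to-remote duplication, here is a minimal sketch of how a scheduled job on the local unit might push only changed files to the remote unit over an SSH-encrypted link, in the spirit of the rsync/ssh approach mentioned later in this article. It assumes rsync and key-based SSH login are already set up, and the host name and paths are purely hypothetical rather than taken from any particular product.

#!/usr/bin/env python3
# Minimal sketch: mirror the local backup store to a remote unit over SSH.
# Assumes rsync and ssh are installed and that key-based SSH login to the
# remote unit has already been configured. Host name and paths are hypothetical.
import subprocess
import sys

LOCAL_STORE = "/srv/backup/"                 # data gathered from the office PCs
REMOTE_UNIT = "backup@remote-unit.example"   # the second unit at the DR location
REMOTE_STORE = "/srv/backup/"

def duplicate() -> int:
    # Send only changed files, compressed, through an encrypted SSH tunnel.
    cmd = [
        "rsync",
        "-az",          # archive mode, compress during transfer
        "--delete",     # keep the remote copy an exact mirror of the local store
        "-e", "ssh",    # transport everything over SSH
        LOCAL_STORE,
        REMOTE_UNIT + ":" + REMOTE_STORE,
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    sys.exit(duplicate())

Run from cron or a similar scheduler, a job like this transmits only the modifications since the previous run, which is what keeps the traffic over the ADSL line small.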
Outlined below is what every enterprise, whether a conglomerate or an SMB, needs in order to secure its data backup, together with the procedures and techniques needed to restore the data and files. The IT industry broadly defines the building blocks of a fully fledged solution as follows:
For backing up data on the company site, making copies of it for physical transport to a remote site, or duplicating and transferring data over networks to a recipient unit elsewhere, including the media for remote storage:
- Backup software to manage the daily ‘gathering’ of data from all ‘producers’ of data, or clients
- The hardware (PC, Server) that orchestrates this collection
- Local office storage media such as tape drives, disk drives to store the data collected
- Software or procedures to create a copy or replication of the backed up data to local tapes or disks that are physically transported daily to the remote (outsourced) site
- Hardware (PC/Server) to orchestrate this process and send the data to additional remote hardware (PC/Server), storage disk or tape drives and their media
- Network infrastructure, whether SAN, LANS, or Internet ADSL
- A high-level IT engineer to put it all together, administer it and monitor it
By covering all of this in one fully integrated, purpose-made software and hardware product, Linux servers offer SMBs and micro enterprises a huge opportunity to equip themselves once and for all with a solution that brings extremely affordable, enterprise-level security to their data.
The items on a typical end user’s wish list for a backup and remote duplication solution are commonly identified as:
- All inclusive with high performance. IDSbox integrates all under its Linux OS, and local office data modifications and updates only are transmitted to the remote IDSbox, therefore avoiding any clogs in the ADSL lines, and providing extremely high performance synchronization of local and remote (outsourced) backup.
- Completeness and integrity of data. Various monitoring and control checks are carried out at directory and file level, amongst others
- Robust. A robust system made of a metal chassis with a reinforced PVC casing
- Low operating cost. An electronic temperature and activity control lowers energy consumption to a meagre 9 Watts; could not be greener!
- Easy to install and administrate. Linux OS has been adapted to not require human intervention during its operation. Any programming or planning is done via a web browser.
- Provider of the highest security levels. The local and remote servers, as well as the disks containing the backed up data, are mutually interchangeable. Data is transferred via encrypted tunnels; validation and authentication certificates, password controls and rsync/ssh protocol controls are used.
- Independence. No additional services or third party products are needed so there are no hidden surprises. Only existing ADSL lines are used.
- Easy data backup definitions and restoration procedures. The software should provide a very easy way to define (simple drag and drop!) the files and folders one wants permanently secured by backup and remote duplication. Backup times and frequencies should also be automated.
- Flexibility in data outsourcing possibilities. Should the company be a micro enterprise with a couple of PCs only, then only a single server will be needed at the remote location to back up data from the PCs.
- Confidentiality. Access to backed-up data should be provided through personal passwords only. This is different from NAS servers, which share data.
- Scalability. The data backup and duplication capacity can be upgraded simply by changing the server disk sizes, e.g. from 320 GB to 1 TB, or by adding an external disk of up to 1 TB. Tape drives and disks for snapshot functions can be attached.
- Accompanying services such as warranty extensions, hot lines. Various warranties and extensions should be provided by the technology supplier, to provide maximum peace of mind.
- Cost effective. Linux servers can provide the most economical solution on the market to back up data locally and duplicate it remotely. This type of solution can be purchased for as little as £1,200 for an entry-level twin-box solution consisting of two mini Linux servers with 320 GB of disk capacity each, plus all the software necessary to back up as many as 20 PCs or servers locally, duplicate the data, create the secure encrypted transmission tunnel via the ADSL line, and monitor and report to the SMB manager and individual users. Other solutions on the market that can carry out this service, whether hardware based, software based, or both, can cost 20 to 50 times more.
In conclusion, for the first time an extremely affordable solution is available that integrates all the components required to back up data on a company site as well as to create and maintain a permanently updated copy physically elsewhere via ADSL, and it can be set up and programmed by a typical PC user. An SMB can now physically possess and control not only its local but also its outsourced data, without requiring a third party. Many SMB owners have expressed strong interest in this, as their data now remains fully confidential, and in the case of a major disaster or data loss they can recover the data themselves using the extremely simple and intuitive software provided.
IDS-Enterprise are exhibiting at Storage Expo 2008 the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15 - 16 October 2008 www.storage-expo.com
Source: StoragePR
<>
Nearline and Archiving in the Data Warehouse: What's the Difference?
By: Arthur Ritchie - Chairman and CEO at SAND
In recent years, data warehouses have begun to increase radically in size. To maintain acceptable performance in the face of this "data explosion", several techniques have been introduced. These include pre-building aggregates and Key Performance Indicators (KPIs) from large amounts of detailed transaction data, and indexing as many columns as possible in order to speed up query processing.
As data warehouses continue to grow, however, the time required to do all the necessary preprocessing of data increases to the point where these tasks can no longer be performed in the available "batch windows" when the warehouse is not being accessed by users. So, trade-offs need to be made. Doing less preprocessing work reduces the required time, but also means that queries that depend on aggregates, KPIs or additional indexes may take an inordinately long time to run, and may also severely degrade performance for other users as the system attempts to do the processing "on the fly". This impasse leads to two possible choices: either stop providing the analytic functionality – making the system less valuable and users more frustrated – or "put the database on a diet" by moving some of the data it contains to another location.
Putting the Database "on a Diet"
Both Nearline and Archiving solutions can help trim down an over-expanded database: the database can be made much smaller by implementing an Information Lifecycle Management (ILM) approach, removing unused or infrequently used detailed transactional data from the online database and storing it elsewhere. When the database is smaller, it will perform better and be capable of supporting a wider variety of user needs. Aggregates and KPIs will be built from a much smaller amount of detailed transaction data. Additionally, column indexing will be more practicable as there will be fewer rows per column to be indexed.
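As a rough illustration of this ILM approach, the sketch below moves detailed transaction rows older than a cut-off date out of the online database into a separate nearline store and then removes them from the online table. It uses SQLite and an invented table purely for illustration; real warehouse and nearline products (SAP BI, SAND/DNA and so on) work very differently in detail.

# Sketch only: relocate old detail rows from the online database to a nearline store.
# Assumes the online database already contains a 'sales_detail' table (hypothetical schema).
import sqlite3

CUTOFF = "2007-01-01"   # rows with a transaction date before this move to nearline

online = sqlite3.connect("online_warehouse.db")
nearline = sqlite3.connect("nearline_store.db")

# Create the nearline copy of the detailed transaction table if it does not exist yet.
nearline.execute(
    "CREATE TABLE IF NOT EXISTS sales_detail "
    "(txn_id INTEGER, txn_date TEXT, customer_id INTEGER, amount REAL)"
)

# Copy the old detail rows across, then trim them from the online database.
rows = online.execute(
    "SELECT txn_id, txn_date, customer_id, amount "
    "FROM sales_detail WHERE txn_date < ?", (CUTOFF,)
).fetchall()
nearline.executemany("INSERT INTO sales_detail VALUES (?, ?, ?, ?)", rows)
online.execute("DELETE FROM sales_detail WHERE txn_date < ?", (CUTOFF,))

nearline.commit()
online.commit()
print("Moved", len(rows), "detail rows to the nearline store")

The point of the sketch is only the shape of the operation: the online database shrinks, while the detail remains queryable in a secondary store rather than being deleted outright.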
The Key Differences between Archiving and Nearlining in a Data Warehouse
It is important to stress the differences between archiving warehouse data (using products from Open Text, Princeton Softech and so on) and storing it nearline (using SAND/DNA). Since both types of product are used to hold data that has been moved out of the main "online" system, it is unclear to some why one would need to be implemented if the other is in place. To clarify this question and make it easier to discuss why one or the other type of system (or both) might be required in a given situation, the major differences between nearline data and archived data are outlined below.
Archive
Normally, the concept of electronic archiving focuses on the preservation of documents or data in a form that has some sort of certifiable integrity (for example, conformity to legal requirements), is immune to unauthorized access and tampering, and is easily subject to certain record management operations within a defined process – for example, automatic deletion after a certain period, or retrieval when requested by an auditor. The archive is in fact a kind of operational system for processing documents/data that are no longer in active use.
The notion of archiving has traditionally focused on unstructured data in the form of documents, but similar concepts can be applied to structured data in the warehouse. An archive for SAP BI, for example, would preserve warehouse data that is no longer needed for analytical use but which needs to be kept around because it may be required by auditors, as would be the case if SAP BI data were used as the basis for financial statements. The archive data does not need to be directly accessible to the user community, just locatable and retrievable in case it is required for inspection or verification – not for analysis in the usual sense. In fact, because much of the data that needs to be preserved in the archive is fairly sensitive (for example, detailed financial data), the ability to access it may need to be strictly regulated.
While many vendors of archiving solutions stress the performance benefits of reducing the amount of data in the online database, accessing the archived data is a complicated and relatively slow process, since it will need to be located and then restored into the online database. For this reason, it is unrealistic to expect archived data to be usable for analysis/reporting purposes.
Nearline
In the Information Lifecycle Management approach, the nearline repository holds data that is used less frequently than the "hottest" most current data but is still potentially useful for analysis or for constructing new or revised analytic objects for the warehouse.
While the exact proportion of nearline to online data will vary, the amount of "less frequently used" data that needs to be kept available is normally quite large. Moving this out of the main database greatly reduces the pressure on the online database and enables continued performance of standard database operations within available time windows, even in the face of the explosive data growth that many organizations are currently facing.
Thus, the archiving requirements described above do not apply to a nearline product such as SAND/DNA, which is designed to reduce the size of the online warehouse database, while at the same time keeping the data more or less transparently accessible to end users who may need to use it for analysis, for rebuilding KPI's and so on.
In Brief
Why a Nearline Product is not an Archive
Nearline products do:
- Make older data easily accessible to end users for enhanced analysis/reporting
- Offer very good performance in delivering data to end users - typically not more than 1.x times slower than online, with little or no impact on online users
- Allow greater amounts of relatively recent data to be moved out of the online system
Nearline products do not:
- Offer methods for ensuring the compliance of data with regulations
- Feature any special built-in security regime beyond the read-only status of the data
- Take care of operational processes on data, such as enforcement of retention periods, automatic deletion and so on
Archiving products do:
- Provide controlled storage of older data that will probably not be accessed except in special circumstances
- Enforce organizational policies with regard to data retention
- Ensure compliance
- Limit access to sensitive data
Archiving products do not:
- Make data easily accessible to users for analysis or reporting
- Offer fast performance in restoring data
- Store relatively recent data that may be required for analytics/reporting
Source: StoragePR
<>
10 Criteria to Selecting the Right Enterprise Business Continuity Software
By Jerome M Wendt DCIG, LLC
The pressures to implement business continuity software that can span the enterprise and recover application servers grow with each passing day. Disasters come in every form and shape, from regional disasters (earthquakes, floods, lightning strikes) to terrorist attacks to brown-outs to someone accidentally unplugging the wrong server.
Adding to the complexity, the number of application servers and virtual machines is on the rise while IT headcounts are flat or shrinking. Despite these real-world situations, companies often still buy business continuity software based on the centralized or stand-alone computing models that everyone started abandoning over a decade ago.
Distributed computing is now almost universally used for hosting mission-critical applications in all companies. However, business continuity software that can easily recover and restore data in distributed environments is still based on 10-year-old models. This puts businesses in a situation where they end up purchasing business continuity software that can only recover a subset of their application data.
Organizations now need a new set of criteria that accounts for the complexities of distributed systems environments. Today’s business continuity software must be truly enterprise and distributed in its design. Here are 10 features that companies now need to identify when selecting business continuity software so it meets the needs of their enterprise distributed environment:
- Heterogeneous server and storage support. In distributed environments, companies generally have multiple operating systems and storage systems from several different hardware vendors. Companies want the flexibility to recover applications running on any of these operating systems while using whatever storage they have available at the DR site to do the recovery. Many business continuity solutions require the same configuration (host software, network appliance, storage system) at the production and DR sites; new enterprise business continuity software intended for distributed environments should not.
- Accounts for differences in performance. A major reason that companies implement specific business continuity solutions for specific applications is how those solutions handle high numbers of write I/Os. High performance (i.e. high write I/O) applications put much different demands on business continuity software than application servers with infrequent write I/Os. To scale in enterprise distributed environments, the business continuity software needs to provide options to scale under either type of application load.
- Manages replication over WAN links. Replicating all production data to the target site is great until the network connection becomes congested or breaks. Enterprise business continuity software needs to monitor these WAN connections, provide logical starting and stopping points if the connection is interrupted, and resume replication without losing data or negatively impacting the application it is protecting.
- Multiple ways to replicate data. Not every application server needs all of its data replicated. Some application servers need only select files or directories replicated while other application servers need all data on one or more volumes replicated to ensure the recoverability of the system. Enterprise business continuity software should give companies the flexibility to replicate data at whatever layer – block or file – that the application server requires.
- Application integration. Replicating data without any knowledge of what application is using the data or how it is using the data represents a substantial risk when it comes time to recover the application. Recovering applications such as Microsoft Exchange, SharePoint or SQL Server that keep multiple files open at the same time can result in inconsistent and unrecoverable copies of data at the DR site. Business continuity software must integrate with these applications such that it provides consistently recoverable images at the DR site.
- Provides multiple recovery points. A problem with a number of existing business continuity solutions is that they provide only one recovery point – the one right before the disaster occurred. However, disasters are rarely that neat and tidy. Sometimes companies are not even aware a disaster has occurred until hours afterwards (think database corruption or the wrong file being loaded). Business continuity software needs to provide multiple recovery points so companies can roll back to a point in time right before the disaster occurred, as well as give them multiple options to recover the data (see the retention sketch after this list).
- Introduces little or no overhead on the host server. Putting agents on host servers provides a number of benefits – application awareness, capture of all write I/Os, and even a choice as to where the replication (block or file) of the data will occur. However, if using an agent on the server consumes so many resources that the application cannot run, it negates the point of using the business continuity software in the first place.
- Replicates data at different points in the network (host, network or storage system). Getting agents onto every corporate server is rarely an option. Whether it is because of corporate service level agreements (SLAs), ignorance about the presence of new virtual machines or just good old-fashioned corporate politics, agents are useful but not available in every situation. In this case, the business continuity software should also provide options to capture data at either the network or the storage system level.
- Centrally managed. Enterprise business continuity software needs to monitor and manage where it is installed in the enterprise, what applications it is protecting, how much data it is replicating and the flow of replication from the production to DR sites. It also should provide a console from which administrators can manage recoveries anywhere in the enterprise.
- Scales to manage replication for tens, hundreds or even thousands of servers. Enterprise companies sometimes fail to realize just how many application servers they actually have in their organization. Tens of servers is a given in even most small organizations, with hundreds or even thousands of servers more common than not in any large company. The business continuity software should include an architecture that scales to account for this number of servers without breaking the replication processes or the bank.
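To make the "multiple recovery points" criterion above a little more concrete, here is a small, purely hypothetical sketch of a retention policy that keeps every recovery point from the last 24 hours plus the newest point of each day for the last 30 days. Real business continuity products implement far richer policies; this only illustrates the idea of being able to roll back to more than one point in time.

# Sketch only: decide which point-in-time recovery points to retain.
from datetime import datetime, timedelta

def recovery_points_to_keep(points, now=None):
    # points: iterable of datetime objects, one per replicated recovery point.
    # Policy (illustrative): keep everything from the last 24 hours, plus the
    # newest point of each calendar day for the last 30 days.
    now = now or datetime.now()
    keep = set()
    newest_per_day = {}
    for p in sorted(points):
        if now - p <= timedelta(hours=24):
            keep.add(p)                    # recent points: keep them all
        elif now - p <= timedelta(days=30):
            newest_per_day[p.date()] = p   # older points: keep the last one per day
    keep.update(newest_per_day.values())
    return sorted(keep)

# Example: hourly recovery points taken over the last five days.
history = [datetime.now() - timedelta(hours=h) for h in range(5 * 24)]
print(len(recovery_points_to_keep(history)), "recovery points retained")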
InMage is exhibiting at Storage Expo 2008 the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15 - 16 October 2008 www.storage-expo.com
Source: StoragePR
<>
Top Tips for Email Management and Archiving
By Dave Hunt, CEO of C2C
Introduction: with only 20% of companies demonstrating good control of email management, Dave comments on the state of email management and archiving and notes what resellers can do to position themselves as protectors of companies’ most used and most valuable communication method.
Just how bad does it get?
Though around 30% of organisations have some form of archiving in place, most would consider that this does not constitute adequate control. A recent survey by C2C found that 65% of respondents had set mailbox capacity limits, meaning in effect that end users were responsible for managing their own mailboxes. In practice, this self-regulation probably results in significant lost productivity and constitutes a poor strategy for managing and discovering data. In this article, we consider the top five questions being asked by resellers interested in recommending email management:
1. Is Email control a management or archive issue?
It is a management issue, and archiving is part of the solution. Resellers should look for a solution that identifies unnecessary emails, handles attachments and provides automated quota management, as part of a strategic ‘cradle to grave’ approach to managing email. It isn’t a case of archiving email merely to reduce the live storage footprint, but of a well thought-out strategy, designed hand-in-hand with the customer, that aids productivity and time management and that can be implemented by an IT department simply and economically.
2. What is the biggest problem for email management – storage costs, ‘loss’ of information or compliance issues?
All of these are problems. Some will cost your customers on a daily basis; others could result in huge fines or liability. Failure to preserve email properly could have many consequences, including brand damage, high third-party costs to review or search for data, court sanctions, or even instructions to a jury that it may view a defendant’s failure to produce data as evidence of culpability.
3. What guidelines should be in place for mailbox quotas – and how can these be made more business friendly?
Most specialists in email management agree that mailbox quotas are a bad idea. The only use would be a quota for automatic archiving, whereby, on reaching a specific mailbox threshold, email is archived automatically (and invisibly to the user) until a lower threshold is reached. Our C2C survey also found that those who self-manage email to stay within quotas frequently delete messages, delete attachments, and/or create a PST file. The over-reliance on PST files as a means to offload email creates several challenges when companies must meet legal requirements, since PST files do not have a uniform location and cannot be searched centrally for content with traditional technologies. Resellers can explain that reliance on PST files is poor practice.
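The threshold-driven archiving described above can be sketched in a few lines: when a mailbox exceeds an upper limit, the oldest messages are moved to the archive until the mailbox drops back below a lower limit. The mailbox structure and thresholds below are invented for illustration and do not describe any particular vendor’s product.

# Sketch only: quota-driven automatic archiving of the oldest messages.
from datetime import datetime

UPPER_LIMIT_MB = 500   # archiving starts once the mailbox exceeds this size
LOWER_LIMIT_MB = 350   # and stops once the mailbox is back below this size

def archive_oldest(mailbox, archive):
    # mailbox: list of dicts with 'sent' (datetime) and 'size_mb' keys (hypothetical).
    # archive: list that receives the messages moved out of the mailbox.
    def total_size():
        return sum(msg["size_mb"] for msg in mailbox)

    if total_size() <= UPPER_LIMIT_MB:
        return                              # quota not reached, nothing to do
    for msg in sorted(mailbox, key=lambda m: m["sent"]):
        if total_size() <= LOWER_LIMIT_MB:
            break
        mailbox.remove(msg)                 # invisible to the user: the message
        archive.append(msg)                 # is now served from the archive instead

# Example: ten 60 MB messages push the mailbox over the 500 MB threshold.
inbox = [{"sent": datetime(2008, month, 1), "size_mb": 60} for month in range(1, 11)]
archive = []
archive_oldest(inbox, archive)
print(len(inbox), "messages left in the mailbox,", len(archive), "archived")

Because the archiving happens between two thresholds rather than at a hard cap, the user never hits a quota wall, and the archived messages remain centrally searchable instead of disappearing into scattered PST files.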
4. Once retention schedules and compliance have been met, does the email need to be destroyed – and if so, how should resellers’ recommend companies go about this?
In some instances it is necessary to delete emails once the retention period has passed, in others it is only an option. Deletion also depends on the industry type, for instance, does it have to be guaranteed destruction, such as to US DoD standards, or is a simple removal of the email sufficient?
5. What would your top tips be for email management?
Resellers that wish to add true value should consider the whole picture of email data management, from the instant an email is sent to the time it is finally destroyed.
C2C is exhibiting at Storage Expo 2008 the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15 - 16 October 2008 www.storage-expo.com
Source: StoragePR
<>
The Great Green Collision
By Simon Pamplin, SE Manager, UK & Ireland, Brocade
Among all the challenges CIOs and IT administrators currently face, two historical trends are on a collision course. Firstly, the growth in data processing is generating ever-increasing demand for servers, storage arrays and the infrastructure needed to support those devices. According to IDC projections, the total volume of corporate data worldwide will be nearly a zettabyte (one billion terabytes) by 2010. This sets off a spiral of events in the data centre: the growth in data means that more hardware is required; more hardware in turn leads to larger data centres; these larger data centres require more power and an upsurge in the cooling needed to sustain continuous operations.
But this chain of events cannot continue unchecked. The limited availability and increasing cost of energy worldwide is undermining the energy utilities’ ability to supply reliable power. Several factors are contributing to this trend and pointing to an impending conflict between projected supply and demand. Because all modern enterprises depend on information technology, IT organizations must be able to align energy consumption with energy availability and simultaneously accommodate data growth as part of a viable IT strategy. Using today’s technology, organizations can build sophisticated data centres in a cost-effective manner. By selecting products that use energy efficiently, CIOs can not only help their businesses by reducing costs and running their IT infrastructure efficiently, but also contribute to the global reduction in energy usage.
In terms of business benefit, being able to reduce the running costs of storage is a no-brainer. The cost obviously depends on a host of factors, from how many copies you take of the data, to how it is backed up and onto what medium, to how long you need to retain it and how efficient the devices are, but probably the most relevant figure is the cost of the electricity to run it all. Recent research by the European Commission shows that UK industries pay an average of €10.78 per 100 kWh of electricity used. By selecting the most efficient storage components, savings of thousands of Euros per device per year can be made when compared with less efficient products.
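To put that electricity figure into perspective, the quick calculation below estimates what a single device drawing a constant load costs to run for a year at the quoted rate, and what swapping it for a more efficient model might save. The wattages are invented purely for illustration.

# Sketch only: annual electricity cost of a device at the quoted EU rate.
EUR_PER_KWH = 10.78 / 100   # European Commission figure quoted above: 10.78 euros per 100 kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost_eur(watts):
    # Convert a constant power draw in watts to an annual electricity cost in euros.
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * EUR_PER_KWH

older_array = 5000      # hypothetical power draw of a less efficient storage device, in watts
efficient_array = 3000  # hypothetical power draw of a more efficient equivalent

saving = annual_cost_eur(older_array) - annual_cost_eur(efficient_array)
print("Annual saving per device: about", round(saving), "euros")

With these invented figures the difference is roughly 1,900 euros per device per year, which is the order of magnitude the article has in mind; the exact saving depends entirely on the real power draw of the devices being compared.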
The same business benefits come from applying a green storage policy. The more efficient you are, the fewer resources you consume; resources cost money and they also contribute to the global issue of resource usage. The less power needed to run a device, the less heat it generates; the less heat it generates, the less cooling is required; and reduced cooling requires less power. Typically the reduced power consumption comes from reducing the number of components in the device, allowing for higher density per square foot. This in turn means more available space in the data centre and less need for expansion, which is another way to lower company overheads.
Going green means that organizations must re-examine all aspects of their IT operations including facilities, people, and infrastructure—so they can proactively implement best-practice strategies and identify areas where they can achieve greater power efficiencies.
As the historical trend of data growth collides with the availability, or lack, of power and space for data centres, the IT industry needs to wake up to the fact that a green solution is the only solution for the future.
Brocade is exhibiting at Storage Expo 2008 the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15 - 16 October 2008 www.storage-expo.com
Source: StoragePR
<>