Intelligent Information Management for the Real World
By Richard Grondin, VP R&D and Product Deployment, SAND Technology
Intelligent Information Management (IIM) is not really a new concept. What is new is the scope and volume of enterprise data that needs to be managed, and the more stringent data governance regulations by which organizations must abide. Now more than ever, enterprise data assets need to be managed carefully to protect data access, immutability, privacy, monitoring capabilities, auditability, and business value over the complete information lifecycle. This is precisely what IIM is about, and to be successful it needs to be implemented with a focus on the data itself rather than on the specific ways it will be used. This requires a paradigm shift at the enterprise level: a realignment of IT architectures from an application-centric to a data-centric approach. Business needs are changing quickly, and IT architectures should be able to satisfy them within reasonable timeframes and at acceptable costs, all the while protecting enterprise data assets.
The Corporate Information Factory model, developed by Bill Inmon, is currently in use by a variety of organizations. Typically, the starting point for a Data Warehouse implementation is a business requirement for specific reports. The data architects then identify what information is available, how to get access to it and the level of transformation required to produce those reports (this is the data preparation phase). While this approach has brought significant benefits for many enterprises, it also has some weaknesses – the most important being that it covers only the data associated with a specific need at a specific point in time. For this reason, such an approach could be termed application-centric.
Now, new legal regulations are putting increased pressure on organizations to retain, and maintain access to, a greater variety of data. Data not associated with any specific business requirements must now be kept around “just in case” it is needed, and at the same time data governance policies need to be introduced. The easiest way to respond to these new data requirements would be to store the information on tape, but then the problem becomes how to get access to it. For this reason, some organizations have opted to transform their data warehouses into de facto storage devices. However, a side effect of this approach is that DBA teams are under increasing pressure to maintain the various Service Level Agreements (SLAs) that have already been established, since keeping more and more data in the warehouse while maintaining good performance can be a difficult proposition. This is one of the reasons there are so many RDBMS products currently available on the market.
An Intelligent Information Management implementation can help organizations to overcome these new challenges without going through multiple “revolutions” in their current data architecture. IIM can help satisfy data governance requirements and at the same time improve “data agility” for efficient delivery of Business Intelligence value. This type of implementation requires a shared vision and best practices program supported by a flexible data-centric architecture, along with an iterative transition process to review the various data requirements in the organization.
Many organizations have already deployed multiple data warehouses, data marts and cubes to satisfy their business intelligence requirements. Much has been invested in such deployments, and to protect this investment, IIM has to be implemented as an evolution of the infrastructure currently in place rather than a revolution requiring everything to be rebuilt from the ground up. Typically, the first step taken by organizations implementing IIM best practices is to introduce an Information Lifecycle Management (ILM) infrastructure.
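In practice, an ILM infrastructure begins with simple, data-centric placement rules: each data set is classified by attributes of the data itself (age, access frequency, retention obligations) and assigned to an appropriate storage tier. The short Python sketch below is only a minimal illustration of that idea; the tier names, age thresholds and DataSet structure are hypothetical assumptions, not a description of SAND's products or of any specific ILM implementation.

from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical tiers and age thresholds; a real ILM policy would also
# weigh access frequency, retention rules and legal-hold status.
TIERS = [
    (90,   "online"),    # touched within ~90 days: keep on fast storage
    (365,  "nearline"),  # up to a year old: cheaper, slower storage
    (None, "archive"),   # older still: compressed, immutable archive
]

@dataclass
class DataSet:
    name: str
    last_accessed: date

def assign_tier(ds: DataSet, today: date) -> str:
    """Pick a tier from attributes of the data alone (data-centric),
    not from the application that happens to consume it."""
    age_days = (today - ds.last_accessed).days
    for limit, tier in TIERS:
        if limit is None or age_days <= limit:
            return tier
    return "archive"

if __name__ == "__main__":
    today = date(2008, 10, 1)
    catalogue = [
        DataSet("pos_transactions_2008q3", today - timedelta(days=20)),
        DataSet("pos_transactions_2007", today - timedelta(days=300)),
        DataSet("call_detail_2005", today - timedelta(days=1100)),
    ]
    for ds in catalogue:
        print(f"{ds.name}: {assign_tier(ds, today)} tier")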
SAND Technology is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
The death of the (open) Internet
A leading US telecommunications company has once again predicted the death of the Internet. But could there be an ulterior motive at play here, rather than a straightforward telling of the truth?
The Internet will, so we are told by Jim Cicconi, VP of legislative affairs for US telecoms giant AT&T, reach its maximum data capacity by the year 2010 and will then grind, more or less, to a halt.
“The online content surge is at the centre of the most dramatic changes affecting the Internet today,” said Cicconi at the Westminster eForum on Web 2.0 in London in May 2008. He added, “Eight hours of video are loaded onto YouTube every minute. Very soon everything will become High Definition [HD], and HD is seven to 10 times more bandwidth hungry than a typical video. Video will be 80% of all traffic by 2010, up from 30% today.”
Experts agree that, as with any form of infrastructure, investment in the Internet must increase in line with demand. But some are now questioning the motives of telecommunications giants such as AT&T in predicting the ‘death of the Internet’, and this questioning should, in my view, come as no surprise.
AT&T and fellow US telecoms company Comcast are vocal opponents of the proposed legislation protecting ‘net neutrality’, which would bar them from charging content providers a premium for preferential traffic routing and improved service quality – a potentially lucrative business. Doing so contravenes guidelines from the Federal Communications Commission (FCC), but the pressure to back these up in law is mounting.
Hence one can well understand why they keep predicting the end of the Internet as we know it: they are looking for a way to make even more money than they already do from the users of the World Wide Web. Corporate greed, yet again; and in pursuit of it they would rather make the Internet a closed affair, to some extent at least, than actually improve the service so that people could use and enjoy the Web more.
If AT&T’s predictions come true, they might provide US senators with an incentive to oppose such legislation: as the Internet approaches capacity, it makes economic sense to allow big business right of way on the network, especially if they are prepared to pay for it.
But with regards to the specific prediction that the Internet will reach capacity by 2010, the corroborating evidence comes solely from a study published in 2007 by Nemertes Research and backed by an organisation called the Internet Innovation Alliance (IIA). The IIA is a telecommunications lobbying group that warns of a coming ‘exaflood’ – a catastrophic explosion of data that kills the Internet – and whose members include none other than AT&T.
Jan Dawson of telecommunications analyst company Ovum believes AT&T’s comments relate to its position on net neutrality:
The underlying point of these remarks is sound – massive investment in the network capacity of the Internet will be needed in coming years. But AT&T is in the minority in suggesting that a new business model is required to fund it all.
It’s worth noting that Cicconi is responsible at AT&T for regulatory issues, not network investment, making it more likely that this is further positioning in the net neutrality debate.
Michael Holloway of the Open Rights Group believes that technology developments will pre-empt the death of the Internet:
Doomsayers periodically claim the net is reaching maximum capacity. This is a poor argument for network filtering or preferential traffic routing because, as has historically been the case, new technologies will ensure capacity stays ahead of traffic growth. History also shows that network monopolists such as AT&T are uncomfortable with the disruption and financial cost of keeping the net up to speed.
At all costs, we who use the Internet, for whatever purpose, must ensure that it remains free, or at least as free (and that freedom is already limited) as it is today. Freer still would be better. The only way to achieve that, I think, would be greater use of Open Source software (and hardware).
When I say above that freedom on the Net, and our use of it, is to a degree limited, we must recognize that the biggest switches and routers are not owned by the telecoms companies but belong to the US military which, regardless of what some might like to claim, in reality owns, or at least once owned, the Internet. It was military computer communications networks that laid the foundation for the Internet as we now know it, and when Internet traffic slows to a trickle and behaves like thick molasses, we should check what is going on in the field of military or national (US) security operations. Nine times out of ten, when the Web slows to a crawl, about half of all capacity has been taken over by the military and security agencies.
However, while that may happen, the Internet is presently relatively free. If the likes of AT&T and Comcast, and others of their ilk, get their way, aided by hints about improving security, e.g. the protection of children and suchlike, we will see a restricted Internet; one that is no longer free for all to use.
Already we are seeing attempts to strip bloggers of the protection they are given in some countries as “ordinary” journalists, and to allow governments and other agencies to have blogs removed whose political content they do not like. I am not talking about China here, or even Russia, but about the United States and other so-called free and democratic countries. The powers that be are scared of the blogger, of the citizen journalist.
The Internet must remain free, and we must ensure that it stays thus.
© M Smith (Veshengro), September 2008
<>
Keep Your Data Lean… and Green
By Peter Olding, General Manager, SAND Technology
What keeps you awake at night? Is it having to go to your Board of Directors or CIO tomorrow and explain why your organisation cannot hit agreed service levels around your storage infrastructure or perhaps beg for more capital expenditure to increase your existing infrastructure footprint? If so, read on…
We all know that the challenge of retaining and maintaining the vast volumes of data required for business and regulatory purposes is one that faces every modern IT organisation. Increasingly, this puts pressure on IT budgets and data centre infrastructure, and is compounded by rising operational and energy costs.
So in the current economic and environmental climate, should we really be buying and consuming more storage? Somewhat controversially, but perhaps in a timely manner, I will discuss a way to buy less storage, or to do more with your existing infrastructure.
By way of example, let’s consider 100 terabytes (TB) of structured data with varying requirements such as complex analysis, long-term data retention and archiving. These may be new requirements, or an extra load on an existing data warehouse. Either way, you’re faced with a number of questions and decisions, and you will no doubt be bombarded by vendors and consultants offering best advice, or their version of best advice, which isn’t usually cheap! So you’re looking at an expensive system with some expensive consulting. Or are you?
Very few people tend to look at these requirements from the perspective of the data, or consider a data-centric architecture, because we are so used to deploying application-centric architectures. In simplistic terms, you’re looking at a large pool of data, and you’re not entirely sure what it’s going to be used for, except that you have to store, manage and retrieve it. In essence, this large pool of data is a corporate information memory: a primary repository of enterprise data that will enable your company to face the challenges mentioned earlier.
There are a number of approaches to deploying a corporate information memory architecture but in essence it comprises a mix of software, solutions and techniques which, when combined, enables companies to deliver significant commercial, competitive and environmental advantage.
By way of example, by combining deduplication, compression and indexing techniques, this 100 TB of data can be held in under 2 TB. That’s right: 2 TB, or a 98% reduction. Not only that, but it is optimised for fast querying and retrieval, requires little administration, and can work with Oracle, SQL Server, DB2, SAS, Business Objects and so on.
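The article does not spell out the exact techniques behind that figure, but the general principle is easy to illustrate: chunk-level deduplication stores each distinct chunk once, compression shrinks what remains, and a small index preserves the ability to reconstruct the original stream. The Python sketch below is a toy illustration under those assumptions only; the sample data, chunking scheme and the ratio it prints are not a claim about SAND’s technology or any particular product.

import hashlib
import zlib

def dedup_and_compress(chunks):
    """Store each distinct chunk once (deduplication), compressed, and
    keep an ordered index of chunk hashes so the original stream can be
    rebuilt without re-storing duplicates."""
    store = {}   # sha256 digest -> compressed unique chunk
    index = []   # one digest per original chunk, in order
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(chunk, 9)
        index.append(digest)
    return store, index

def restore(store, index):
    """Rebuild the original byte stream from the unique chunks."""
    return b"".join(zlib.decompress(store[d]) for d in index)

if __name__ == "__main__":
    # Highly repetitive rows, as warehouse feeds often are, dedup well.
    chunks = [b"2008-09-30|STORE-042|SKU-1001|1|9.99\n"] * 5000
    chunks += [b"2008-09-30|STORE-042|SKU-%04d|1|9.99\n" % i for i in range(500)]
    store, index = dedup_and_compress(chunks)
    assert restore(store, index) == b"".join(chunks)
    raw = sum(len(c) for c in chunks)
    kept = sum(len(c) for c in store.values())
    print(f"raw {raw:,} bytes -> stored {kept:,} bytes "
          f"({100 * (1 - kept / raw):.1f}% smaller; index not counted)")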
To demonstrate the benefits of this approach, consider an organisation that retains and analyses huge volumes of retail data in an ever-changing, very demanding and thin-margin industry, and is constantly striving to introduce operational improvements across the business. As a result of deploying a corporate information memory architecture, it reduced the storage infrastructure for its analysts by a factor of thirty, which resulted in:
- Lower administration costs, enabling IT staff to support the business creatively instead of constantly firefighting
- Improved operational efficiency, allowing it to over-deliver on service levels for operations and querying
- A threefold increase in the productivity of its end-user analysts
For years the IT industry has been obsessed with size: who has the largest data centres, largest data warehouses, largest servers and so on. Isn’t it about time we got clever, applied innovative solutions to problems, and bragged about who has the smallest data centre footprint and the shortest batch windows?
By the way - what keeps me awake at night? It’s the ducks that live at the end of my garden performing duck karaoke well into the night.
Peter rejoined SAND in 2006, bringing a wealth of commercial experience in data analysis and retention. Prior to this, he was instrumental, as a Business Development Director, in the early success of ClarityBlue, which spun off from SAND in the UK as an analytic CRM provider. Before that, he was an Account Director at WhiteCross Systems (Kognitio), having moved from a consulting role with British Telecom where he was involved in a number of data-centric projects.
Peter has a Bachelor’s degree in Geological Sciences and a Master’s degree in Information Technology, and spent his early career as a data analyst on the rigs in Africa, the Middle East and Scandinavia. He started a PhD in genetic algorithms before the lure of industry proved too great.
He feeds ducks in his spare time.
SAND Technology is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>
RSA® Conference Europe Announces Full Line-Up of Keynote Speakers for Annual Information Security Event in October
Keynote Speakers include Richard Thomas, Information Commissioner; Baroness Neville-Jones, Shadow Security Minister; and Bruce Schneier
LONDON, UK, Oct 01, 2008: RSA® Conference, the world’s leading information security conference group, today announced its complete line-up of keynote speakers for the 9th annual RSA Conference Europe. The Conference, taking place from 27th-29th October 2008 at ExCeL London, UK, is set to bring together international industry experts, senior executives, security professionals, developers, architects and vendors – and will deliver its educational content through keynotes, special interest groups, and more than 70 sessions across 10 tracks.
Continuing to feature keynote panels addressing the hottest topics in security, RSA Conference Europe will include what is expected to be a highly-controversial session examining ‘Online Privacy and the World of Behavioural Targeting: Challenges and Options’ on the main stage on Monday, 27th October. In this moderated panel, representatives from industry and government including Peter Hustinx, European Data Protection Supervisor (EDPS), Ari Schwartz, CEO at Center for Democracy and Technology, Michael Spadea, Privacy Counsel at Barclays LLC and Paul Goad, MD at NebuAd Ltd. will discuss online privacy with regard to 'traditional' online advertising (e.g., through use of tools such as Google's AdWords and AdSense), and how a newer move to behavioural targeting may impact online privacy and security.
Other Keynote Speakers include:
- Richard Thomas, Information Commissioner, Information Commissioner's Office, UK
- Baroness Neville-Jones, Shadow Security Minister, UK
- Bruce Schneier, Security Technologist, and CTO, BT Counterpane
- Art Coviello, Executive Vice President, EMC and President, RSA, The Security Division of EMC
- Ken Silva, CTO, VeriSign
Conference highlights also include:
- 70+ sessions focused on the Business of Security, Developers & Applications, Governance, Hosts, Hot Topics, Networks, Professional Development, Research & Threats, Security Services and Sponsor Case Studies
- The Exhibition, featuring new product innovations from over 40 world-class service and technology vendors, including Fortify Software, Integralis, MXI Security, Omada A/S, Optenet, phion AG, SanDisk, Tripwire, Trusted Computing Group, VeriSign, RSA, The Security Division of EMC and WinMagic Inc.
- 20+ Special Interest Groups (SIGs) offering interactive and hands-on debating forums
- Unparalleled networking opportunities with 1,400 security professionals
“RSA Conference Europe will continue to encourage thought-provoking discussions on issues that affect our industry in EMEA as well as around the world,” said Linda Lynch, RSA Conference Europe Manager. “Not surprisingly, data security will be one of the most discussed topics at this year’s Conference following a number of high-profile security breaches in the last year in Europe, and in particular in the UK. We’ll have a number of sessions, panels and keynotes focusing on this critical topic, presented by representatives from the European Commission, Symantec, Field Fisher Waterhouse LLP, The Ponemon Institute and others.”
"There are a number of security events in Europe that stand out, but the RSA Conference is the most prominent,” said Mike Davies, Director of Identity and Authentication Services at VeriSign. “In addition to technology leaders sharing their thinking and vision for the industry, the conference provides excellent opportunities to network with peers and prospects alike."
This year’s RSA Conference theme is built around Alan Turing – the British cryptographer, mathematician, logician, philosopher and biologist – and will celebrate his legacy and contribution towards digital computers today. Experts and historians agree that Turing had a deeper understanding of the vast potential of computer science than anyone in his era, and is often considered the father of modern computer science.
To celebrate his life and works, RSA Conference Europe has teamed up with Bletchley Park’s Education Department and will be showcasing a number of cipher and Enigma machines at the event.
RSA Conference is helping drive the security agenda worldwide with annual events in the U.S., Europe and Japan. Throughout its history, RSA Conference has consistently attracted the world’s best and brightest in the field, creating opportunities for Conference attendees to learn about IT security’s most important issues through first-hand interactions with peers, luminaries and both emerging and established companies. As the IT security field continues to grow in importance and influence, RSA Conference plays an integral role in keeping security professionals across the globe connected and educated. For more information and Conference dates, visit www.rsaconference.com
Full details about registration are available at http://www.rsaconference.com/2008/Europe/Registration.aspx
Source: AxiCom
<>
Over 60% of companies rate consolidation as main objective for Virtualisation
London, UK, 1st October 2008 - Recent research carried out by Storage Expo on 362 companies found that the main objective for implementing virtualisation was server consolidation (62%), closely followed by new management capabilities (30%). A small percentage (6%) rated availability as an objective, while only 2% had no plans to implement virtualisation.
According to Natalie Booth, Event Manager for Storage Expo 2008, “Virtualisation has seen dramatic adoption by companies in recent years and the qualitative benefits of flexibility, recoverability and assurance are well known. Virtualised server infrastructure is a powerful approach to lower costs, improve manageability, and dramatically increase utilisation.”
John Abbott, Chief Analyst, The 451 Group sums up Virtualisation as “proving to be a catalyst for introducing or revitalizing related technologies. It is easier to move virtual resources around a datacentre (or multiple datacentres) in response to demand, to deploy new resources more rapidly, and to redeploy them once they are no longer required. And it is also easier than in the past to integrate surrounding tools (such as monitoring, billing and chargeback) with virtualised resources.”
John Abbott will chair a keynote programme on ‘Improving Asset Utilisation with Virtualisation’ at Storage Expo 2008 on the 15th of October at 2:15pm. The programme explores the benefits of separating the physical configuration from the application, and how virtualisation can deliver benefits to a company across all electronic assets in the global network. Key challenges addressed in this session include virtualisation for servers and other electronic assets, understanding the financial implications of implementing virtualisation, and optimising current assets.
Speakers include:
- Dr Zafar Chaudry, Director of Information Management and Technology, Liverpool Women's NHS Foundation Trust
- Richard Gough, IT Operations Manager, The Wellcome Trust
- Jon Hutchings, Senior Systems Engineer, Network Systems Management Services, Oxford University
- Steven Shaw, ICT Manager, British Horseracing Authority
Sessions that focus on Virtualisation include:
- Virtualisation, Consolidation and Application-Aware Storage - The New Mandates for Datacentre Efficiency by Adrian Groeneveld, Director of Product Marketing EMEA, Pillar Data Systems
- Storage Virtualisation: does it meet today's business needs? by Simon Brassington, Enterprise Product Manager, HP
- Virtualising the SMB; Disaster Recovery That Does not Break the Bank by Ed Harnish, Vice President of Marketing, Acronis UK
- Virtual Machine Business Continuity and Disaster Recovery by Luke Reed, Consulting Systems Engineer, NetApp
- How to help you reduce spend by 50% on Storage for Virtualised Servers by Rich Fenton, Consulting Systems Engineer, NetApp
Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at Olympia, London from 15-16 October 2008. Register free to visit at www.storage-expo.com
Source: StoragePR
<>
Hurricane Ike caused large Internet outage
by Michael Smith
Hurricane Ike caused damage running into the billions of US dollars. While the huge catastrophe that was expected did not occur, more than twenty people lost their lives. Aside from the environmental damage and the damage to the countryside, the storm also caused considerable damage to the country's infrastructure.
According to estimates by insurance experts, the costs resulting from the damage caused by Hurricane Ike will run to about US$18 billion, and not the smallest part of this sum comes from the damage caused to the infrastructure of the US. The hurricane caused the biggest Internet outage in more than five years.
Responsible for the Internet outages were mostly the power outages caused by the storm damaging power lines. The states that bore the brunt of this damage were Pennsylvania, Ohio and Texas. The company Renesys, which monitors Internet connections in the United States, has provided the relevant data from its observations. The talk is of an Internet outage on a scale not seen since at least 2003.
At the height of the outages, more than 400 IP networks were without a working connection. Among the affected companies was Time Warner Cable Inc. in Ohio, which supplies private users with Internet and cable connections. Also affected was NASA's Johnson Space Center in Houston.
The extent of this was quite obvious from contacts I personally have in the USA who were unable to access the Net and who have had problems for weeks afterwards; some still do. The truth is that, even at the end of September 2008, some people I know have only an intermittent connection to the Internet as a result of the damage caused by that hurricane.
© M Smith (Veshengro), September 2008
<>
Facing up to information overload
Recent HP research shows that the average UK employee now spends up to five weeks a year looking for lost computer files and data. Stephen Watson, Product Marketing Manager for HP StorageWorks in the UK looks at the causes of the ‘information overload’ and suggests some possible cures.
Information is a gift and a curse. It lies at the heart of today’s knowledge economy, allowing organisations to create wealth through its distribution and manipulation. No company could function without it. But in recent years the spiralling growth in the volume of information has left many organisations drowning in data and lacking an effective means of managing it.
The loss in productivity caused by data overload is reaching crisis proportions. Research firm Basex recently chose ‘information overload’ as its Problem of the Year for 2008, estimating the annual loss to the US economy alone at $650 billion and warning that failure to solve the problem will lead to “reduced productivity and throttled innovation.”
Scaling up comes at a cost
The issue of information growth has often been tackled by simply adding new storage capacity, which brings with it serious financial implications. HP, for example, estimates that for every $1 spent on capacity, management of that capacity adds a further $3.
The costs associated with a kilowatt of electricity are also rising significantly. According to the Uptime Institute the current three-year cost of powering and cooling servers is around one-and-a-half times the cost of purchasing server hardware.
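Taken together, those two figures mean the purchase price of hardware is only a fraction of what it actually costs to own. The back-of-the-envelope Python sketch below assumes a purely hypothetical $100,000 hardware spend, and loosely applies the server power-and-cooling multiple to the storage estate as well; only the 3:1 management ratio and the roughly 1.5x three-year power-and-cooling figure come from the sources quoted above.

# Illustrative only: the $100,000 hardware figure is a hypothetical
# assumption; the ratios are the HP and Uptime Institute figures
# quoted in the article.
capacity_spend = 100_000                   # hardware purchase
management_spend = 3 * capacity_spend      # ~$3 of management per $1 of capacity
power_cooling_3yr = 1.5 * capacity_spend   # ~1.5x hardware cost over three years

total_3yr = capacity_spend + management_spend + power_cooling_3yr
print(f"hardware:         ${capacity_spend:>9,}")
print(f"management:       ${management_spend:>9,}")
print(f"power & cooling:  ${power_cooling_3yr:>9,.0f}")
print(f"three-year total: ${total_3yr:>9,.0f} "
      f"({total_3yr / capacity_spend:.1f}x the purchase price)")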
Scaling up in response to data overload also has environmental costs. The Environmental Protection Agency estimates that the total power consumed by servers in the US amounted to 1.2 percent of total electricity in 2005, double the figure when compared with 2000, with IT hardware and data representing the biggest national contributor to carbon emissions.
Information clutter
Some of the reasons for the massive increase in corporate data have been well cited, including the growth of e-mail and other forms of electronic communication. Perhaps more surprisingly, a recent cross-sector survey of 350 UK IT managers commissioned by HP revealed that up to 45% of staff store digital content on corporate networks, placing extra strain on storage requirements.
But it is not just the explosion in the volume of data that is causing an information strain – it’s the way it is organised. Nearly two thirds (62 percent) of respondents to the same survey said information was often duplicated within their organisation because it was difficult to find.
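A crude but effective way to see how much of that duplication is literal copying is to hash file contents and group identical files. The Python sketch below is a hypothetical illustration only: the /srv/corporate-share path is an assumption, and a real scan of a large share would hash in blocks and skip very large files rather than read everything into memory.

import hashlib
import os
from collections import defaultdict

def find_duplicate_files(root):
    """Group files under `root` by a hash of their contents so that
    byte-identical copies scattered across a share can be spotted."""
    by_digest = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # unreadable file: skip rather than abort the scan
            by_digest[digest].append(path)
    return {d: paths for d, paths in by_digest.items() if len(paths) > 1}

if __name__ == "__main__":
    # Hypothetical share path; point this at a real network mount.
    for digest, paths in find_duplicate_files("/srv/corporate-share").items():
        print(f"{len(paths)} identical copies ({digest[:12]}...):")
        for p in paths:
            print("    " + p)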
This information clutter has a negative knock-on effect for IT departments. Increasingly, IT managers are spending a disproportionate amount of their time fielding requests from employees trying to find basic information, rather than focusing on improving infrastructure or on more strategic IT projects.
Regulatory headaches
Companies now face the added challenge of storing data in line with regulatory requirements. When first introduced, these highly complex compliance regulations created immense pressure on organisations because the specific requirements for data management were frequently not fully understood.
The rapid introduction of policies addressing data authentication, data capture, and its distribution, often led to ‘keep-or-dump’ on-the-fly decision making. Erring on the side of caution, many companies have responded by taking the short-term decision to store everything.
In the absence of effective business intelligence tools, they now find they are unable to retrieve data on demand, leading to potential issues with breaches of compliance, decisions based on poor information, and lost business opportunities. This makes independent audits difficult and time consuming to support.
Just as organisations were coming to terms with existing corporate governance legislation such as Basel II, a new set of directives is now causing widespread concern for business and IT professionals charged with the responsibility of implementing effective information management policies.
The imminent arrival of EuroSOX, a set of European Union directives on corporate governance due to start being passed into law by member states this summer, will push information management systems to new levels of sophistication.
Grasping the information nettle
Although technology is causing information overload, many forward thinking companies realise it can also offer ways to combat it. By grappling with the issue through an effective information management strategy, companies are looking to improve customer satisfaction, increase operational efficiency, and enhance overall business performance.
Next generation solutions can help ensure that information isn’t simply ‘siloed’ into ever-growing storage farms by providing data storage infrastructure and the back-up software that goes with it, complemented with business intelligence and data warehousing tools.
Using industry-leading business intelligence solutions, companies can derive more value from their data, manage information more efficiently, mitigate risks and assure compliance with regulatory mandates. They can also accelerate business growth by making better, faster decisions.
Putting information to work
Information is critical for every business and with the right strategy and tools in place, its management can be simplified to make it easier to create, locate and analyse corporate data. This in turn allows staff to find the right data quickly so they can devote their time to carrying out more valuable tasks.
By embracing next generation management information tools organisations can help employees to take informed decisions and reduce data duplication, making information work harder for them – rather than having to work hard to find the right information.
HP StorageWorks is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London from 15-16 October 2008. www.storage-expo.com
Source: StoragePR
<>