Top Tips for Email Management and Archiving

By Dave Hunt, CEO of C2C

Introduction: with only 20% of companies demonstrating good control of email management, Dave comments on the state of email management and archiving and notes what resellers can do to position themselves as protectors of companies’ most used and most valuable communication method.

Just how bad does it get?

Though around 30% of organisations have some form of archiving in place, most would agree that this alone does not constitute adequate control. A recent survey by C2C found that 65% of respondents had set mailbox capacity limits, meaning, in effect, that end users were responsible for managing their own mailboxes. In practice, this self-regulation probably results in significant lost productivity and constitutes a poor strategy for managing and discovering data. In this article, we consider the top five questions being asked by resellers interested in recommending email management:

1. Is email control a management or an archiving issue?

It is a management issue, and archiving is part of the solution. Resellers should identify a solution that weeds out unnecessary emails, handles attachments and provides automated quota management, as part of a strategic ‘cradle to grave’ approach to email. It isn’t a case of archiving email merely to reduce the live storage footprint, but of a well thought-out strategy, designed hand-in-hand with the customer, that aids productivity and time management and that an IT department can implement simply and economically.

2. What is the biggest problem for email management – storage costs, ‘loss’ of information or compliance issues?

All of these are problems. Some will cost your customers on a daily basis; others could result in huge fines or liability. Failure to preserve email properly could have many consequences, including brand damage, high third-party costs to review or search for data, court sanctions, or even instructions to a jury that it may view a defendant’s failure to produce data as evidence of culpability.

3. What guidelines should be in place for mailbox quotas – and how can these be made more business friendly?

Most specialists in email management agree that mailbox quotas are a bad idea. The only sensible use is a quota for automatic archiving, whereby, on reaching a specific mailbox threshold, email is archived automatically (and invisibly to the user) until a lower threshold is reached. Our C2C survey also found that those who self-manage email to stay within quotas frequently delete messages, delete attachments and/or create a PST file. This over-reliance on PST files as a means to offload email creates several challenges when companies must meet legal requirements, since PST files have no uniform location and cannot be searched centrally for content with traditional technologies. Resellers can explain that reliance on PST files is poor practice.
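To make the two-threshold approach concrete, here is a minimal sketch in Python. It is a hypothetical illustration rather than any vendor’s implementation: the mailbox and archive_store objects, their methods and the watermark values are all assumed for the example.

    # Quota-driven automatic archiving with two thresholds (hysteresis):
    # start archiving when the mailbox passes the high watermark, stop
    # once it has dropped below the low watermark.
    HIGH_WATERMARK_MB = 450
    LOW_WATERMARK_MB = 300

    def auto_archive(mailbox, archive_store):
        if mailbox.size_mb() < HIGH_WATERMARK_MB:
            return  # under quota, nothing to do
        # Move oldest messages first, invisibly to the end user.
        for message in sorted(mailbox.messages, key=lambda m: m.received):
            if mailbox.size_mb() <= LOW_WATERMARK_MB:
                break  # safely back under the lower threshold
            archive_store.add(message)  # centrally indexed and searchable
            mailbox.remove(message)

The gap between the two thresholds stops the archiver from kicking in again every time a single new message arrives.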

4. Once retention schedules and compliance requirements have been met, does the email need to be destroyed – and if so, how should resellers recommend companies go about this?

In some instances it is necessary to delete emails once the retention period has passed; in others it is only an option. How to delete also depends on the industry: does it have to be guaranteed destruction, such as to US DoD standards, or is a simple removal of the email sufficient?
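The difference matters in practice: simple removal merely drops the reference to the message, while guaranteed destruction overwrites the data itself. Below is a minimal, hypothetical Python sketch of the overwrite-before-delete idea behind multi-pass destruction standards. Note that on SSDs, journaling file systems and snapshotting archive stores, overwriting in place does not guarantee destruction, which is why certified tools exist.

    import os

    def overwrite_and_delete(path, passes=3):
        # Overwrite the file's contents with random data several times,
        # then remove the directory entry.
        length = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(length))
                f.flush()
                os.fsync(f.fileno())  # force each pass onto the disk
        os.remove(path)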

5. What would your top tips be for email management?

Resellers that wish to add true value should consider the whole picture of email data management, from the instant an email is sent to the time it is finally destroyed.

C2C is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com

Source: StoragePR
<>

New Open Source Site for IFAT

by Michael Smith

World Fair Trade Day will, for 2009, have a major new multilingual, interactive, multimedia website.

Ethical communications agency Host Universal has built the site on behalf of the International Fair Trade Association (IFAT), using the open source content management system Joomla!

Users will be able to register and manage their profiles, plan events, post videos, take part in discussion forums and comment on blog and news articles, as well as post content to their favourite social networking sites.

Plans are in place to extend the social networking capabilities of the site over the next few months, to allow users to link to each other, form groups and exchange private messages as well as invite friends to the site and send e-cards. The site will help organisations to plan and manage their World Fair Trade Day events on 9 May 2009.

The Joomla! site uses a variety of social networking extensions and runs on Apache on a Xen virtual private server with Debian as the operating system. The use of open source software has allowed the site to be developed rapidly and to a tight budget.

Open source software is a great way to go not just for sites such as this but for ethical businesses and charities in general – on desktops, servers and laptops, and in their operations as a whole.

Open source software carries no licence fee, whether for operating systems such as Ubuntu Linux or for applications such as OpenOffice.org and other software.

While it is true that we should not turn to open source software primarily because, in the main, it is free, the savings nevertheless matter, for individuals and organisations alike, and especially for the third sector, the charity and non-profit world: money that would otherwise go on licences for proprietary software can be put to much better use in places of need.

© M Smith (Veshengro), September 2008
<>

Yet another USB stick lost by British government agency

by Michael Smith

London, September 15, 2008: The Home Office has this evening announced that one of its police forces – believed to be the West Midlands force – has lost a memory stick containing sensitive data.

While the Home Office refuses to comment on the contents of the memory stick, it is believed to contain sensitive information on anti-terrorist operations. Just when you thought it could not get any worse, apparently it can. One of these days, as I said in the previous piece, the luck will run out and someone is going to use the data against this country and its people.

The more we hear of such losses, the more a sense of déjà vu overcomes me, and the more I want to scream at these people to at least use hardware-encrypted sticks on which to carry the data. It is NOT rocket science and they are easily available. I have reviewed a few of them by now.
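Hardware encryption is the proper answer, but even encrypting the data in software before it ever touches the stick would be a huge improvement over plain files. A minimal sketch, assuming the third-party Python cryptography package and hypothetical file names:

    # pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # keep the key OFF the stick
    fernet = Fernet(key)

    with open("operations.doc", "rb") as f:
        ciphertext = fernet.encrypt(f.read())  # authenticated encryption

    with open("/media/usb/operations.doc.enc", "wb") as f:
        f.write(ciphertext)  # only ciphertext travels on the stick

Lose the stick and all the finder gets is ciphertext; the key, managed separately, is what must be protected.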

Maybe someone should point the government agencies in the direction of this journal and the reviews done by yours truly. They might just get the idea, but then again, they may not. I am beginning to lose confidence that this government can actually learn how to deal with sensitive data of any kind.

© M Smith (Veshengro), September 2008
<>

The umpteenth case of data loss – due to wild party?

by Michael Smith

How blasé can any government actually be about sensitive data? It would appear that the British government is trying to provide an answer to that question, for there is no other way, I am sure, to explain this kind of careless sloppiness, which by now borders on criminal negligence, in the handling of sensitive information and data.

This time it was the travel plans, destinations and other details of seventy soldiers of the 3rd Battalion, the Yorkshire Regiment.

The data in question was stored on a USB stick – the 120th such stick lost by the British government and its agents since 2004.

The stick was lost, and later found, in the nightclub “The Club” in Newquay, Cornwall. How it got there and how it was lost is still unknown. Maybe it would be a good idea to ask which member of staff had the grandiose idea of carrying a stick with such sensitive data into a nightclub.

The loss of this stick, and of the unencrypted data on it, could have been rather dangerous for the soldiers concerned, especially in light of the ever-cited danger from terrorists. In fact, it is rather amazing that to this very day none of the lost data has, apparently, fallen into the wrong hands and been misused – despite the fact that in most instances the data was not encrypted in any way, shape or form.

To any terrorist, the stick with the soldiers’ data would have been a goldmine. On the USB stick were the exact times of travel, the destinations, all routes and the planned accommodation. It is only good fortune that it did not get into the wrong hands and was found by an honest person. One day, however, the luck will run out; of that we can be sure.

In its statement, the Ministry of Defence called this merely an “unhappy incident”. To call this attitude blasé is probably an understatement. How can any government or government agency take such an attitude to such sensitive data? It is especially hard to understand now that hardware-encrypted USB sticks no longer cost a fortune; and regardless of cost, such data, if stored on USB sticks at all, should – nay, must – be put onto sticks encrypted to the highest standard. Otherwise the day will come when the luck finally runs out and such found data is used against this country and us, the people of this country.

© M Smith (Veshengro), September 2008
<>

The Great Green Collision

By Simon Pamplin, SE Manager UK & Ireland, Brocade

Among all the challenges CIOs and IT administrators currently face, two historical trends are on a collision course. First, the growth in data processing is generating ever-increasing demand for servers, storage arrays and the infrastructure needed to support those devices. According to IDC projections, the total volume of corporate data worldwide will approach a zettabyte (one billion terabytes) by 2010. This sets off a vicious circle in the data centre: the growth in data means that more hardware is required; more hardware in turn leads to larger data centres; and these larger data centres require more power and an upsurge in the cooling needed to sustain continuous operations.

But this chain of events cannot continue unchecked. The limited availability and increasing cost of energy worldwide are undermining the utilities’ ability to supply reliable power. Several factors are contributing to this trend and pointing to an impending conflict between projected supply and demand. Because all modern enterprises depend on information technology, IT organizations must be able to align energy consumption with energy availability while simultaneously accommodating data growth as part of a viable IT strategy. Using today’s technology, organizations can build sophisticated data centres cost-effectively. By selecting products that use energy efficiently, CIOs can not only help their businesses by reducing costs and running their IT infrastructure efficiently, but also contribute to the global reduction in energy usage.

In terms of business benefit, reducing the running costs of storage is a no-brainer. The cost obviously depends on a host of factors – how many copies you take of the data, how it is backed up and onto what medium, how long you need to retain it, how efficient the devices are – but probably the most relevant figure is the cost of electricity to run it all. Recent research by the European Commission shows that UK industries pay an average of €10.78 per 100 kWh of electricity used. By selecting the most efficient storage components, savings of thousands of euros per device per year can be made compared with less efficient products.
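As a back-of-envelope check on that claim, a short Python sketch; the device power draws are hypothetical examples, while the tariff is the European Commission figure quoted above:

    # EUR 10.78 per 100 kWh = EUR 0.1078 per kWh
    PRICE_PER_KWH_EUR = 10.78 / 100

    def annual_cost_eur(power_kw):
        # A device running 24 hours a day, 365 days a year.
        return power_kw * 24 * 365 * PRICE_PER_KWH_EUR

    print(round(annual_cost_eur(5.0)))  # 5.0 kW array: ~4722 EUR/year
    print(round(annual_cost_eur(3.5)))  # 3.5 kW array: ~3305 EUR/year

On these assumed figures, the more efficient array saves roughly €1,400 a year per device, before any savings on the cooling load are counted.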

The same business benefits come from applying a green storage policy. The more efficient you are, the fewer resources you consume; resources cost money, and they also contribute to the global issue of resource usage. The less power needed to run a device, the less heat it generates; the less heat it generates, the less cooling is required; and reduced cooling requires less power. Typically the reduced power consumption comes from reducing the number of components in the device, which also allows for higher density per square foot. This in turn means more available space in the data centre and less need for expansion – another way to lower company overheads.

Going green means that organizations must re-examine all aspects of their IT operations including facilities, people, and infrastructure—so they can proactively implement best-practice strategies and identify areas where they can achieve greater power efficiencies.

As the historical trend of data growth collides with the limited availability of power and space for data centres, the IT industry needs to wake up to the fact that a green solution is the only solution for the future.

Brocade is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com

Source: StoragePR
<>

Is London on the brink of a data crunch?

By David Galton-Fenzi, Group Sales Director, Zycko

It’s London’s turn next, and as preparations for the 2012 Olympics get into full swing, the city finds itself on the verge of a data storage crisis as Canary Wharf’s ability to accommodate new data centres reaches capacity.

In an effort to ensure that London’s latest turn at hosting the Olympics is a success, the electricity requirements of building for the Games are taking precedence. This is compounded by an ongoing increase in the number of servers required by City firms and the corresponding need for more powerful cooling systems. In fact, in the run-up to 2012, power demand in the Docklands area alone is expected to rise by around 90 per cent. Taken in the context of a government commitment to reducing CO2 emissions, it is clear that City firms are going to need to re-examine their data storage options.

Of course the option of moving away from the City is out of the question for most of the companies that will be affected, but some will be able to take advantage of locating all or part of their data centres remotely. This has traditionally been accepted as good practice for a secondary site as part of a disaster recovery strategy, but relocating the primary data centre away from the primary business location remains a step too far for many corporate cultures.

The solution lies not in finding more space for data centres, but rather in making existing centres more efficient; creating a 21st century data centre that will evolve to become a complete ecosystem where change in one area inevitably impacts what happens in another.

A key driving force behind this will be virtualisation – allowing server, storage and networking capacity to be deployed and redeployed at the touch of a mouse button. While the total virtual data centre as a living, ever-changing ecosystem may still be just around the corner, all the component parts are here today, and efficient data storage will be at its core. If the correct data does not arrive where it is needed, the data centre’s productivity is impacted.

Fortunately, virtual storage has been in existence for some time and has developed over multiple generations to become a stable foundation technology, on which more recent advances such as virtual servers can be built. Companies such as LSI’s StoreAge have been providing storage virtualisation for over a decade, with the added benefit of using this virtualised environment to allow the seamless and transparent movement of data between storage devices.

However, the main challenge remains the sheer volume of data being produced and stored at every level, from the home user right through to the very largest organisations in the world. This data deluge shows no signs of stopping and trying to limit the flow has implications for workplace processes and requires a significant change in corporate culture. So instead it is often simply labeled as a corporate overhead and left for the data centre technologists to handle.

In response to this, a number of technologies have evolved to assist the data centre. These address the issue in different ways, each tailored to a different process in the data storage environment.

The first of these is data compression, where the amount of disk space needed to store data is reduced. Companies such as Storwize already provide real-time, transparent storage compression that optimises primary (disk) storage capacity in the data centre.
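As a toy illustration of why compression pays – of the principle only, not of any vendor’s product – Python’s standard library shows the kind of ratio that repetitive business data can achieve:

    import zlib

    # Repetitive data (logs, database pages, documents) compresses well.
    original = b"2008-09-15 10:00:01 GET /index.html 200\n" * 1000
    compressed = zlib.compress(original, 6)

    print(len(original))    # 40000 bytes as the application sees them
    print(len(compressed))  # a few hundred bytes on disk
    assert zlib.decompress(compressed) == original  # lossless round trip

Transparent products do this between the application and the disk, so the host sees the original bytes while the array stores the compressed form.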

The second industry initiative is data de-duplication. While data compression has been focused on primary storage, data de-duplication is directed at the backup and archiving arena. The idea is to store only one copy of any file in the system, with duplicate copies deleted and pointers left behind to direct any data access to the principal copy, thus avoiding unnecessary consumption of storage capacity. Within this arena, companies such as ExaGrid and Diligent offer two different approaches that provide even greater granularity in their de-duplication solutions.
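The mechanism is easy to sketch: key each unique piece of content by a hash of its bytes, store it once, and record pointers for duplicates. A minimal, hypothetical illustration in Python – real products de-duplicate at block or sub-file granularity, which is where the vendors above differ:

    import hashlib

    class DedupStore:
        def __init__(self):
            self.blocks = {}  # content hash -> bytes, stored once
            self.files = {}   # file name -> pointer (the hash)

        def put(self, name, data):
            digest = hashlib.sha256(data).hexdigest()
            self.blocks.setdefault(digest, data)  # keep only if new
            self.files[name] = digest             # duplicate = pointer

        def get(self, name):
            return self.blocks[self.files[name]]

    store = DedupStore()
    store.put("report_v1.doc", b"quarterly figures ...")
    store.put("copy_of_report.doc", b"quarterly figures ...")
    print(len(store.blocks))  # 1 -- the second copy cost no capacity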

The reason why this powerful tool is directed at the backup and archiving environment rather than primary storage is simple – it inevitably creates some overhead on the process of accessing data. Primary storage is much more about speed and performance than efficient capacity usage, making any operational overhead unwelcome.

The future data centre is an ecosystem that will require architecting at a macro level. This in turn will depend on sub-systems such as data storage being able to integrate and provide an effective and efficient service. These storage sub-systems of the data centre ecosystem are in turn dependent on point technologies working at a micro level, providing them with the tools to deliver what is needed.

The lesson to be learned from all this is that the future for both data centre professionals and their suppliers lies in an integration of skills, enabling them to move back and forth between the micro and macro environments, if the integrated 21st century data centre is to reach its full potential and allow us to avoid a data crunch ahead of 2012.

Zycko is a value-add distributor of best-in-class convergent IT infrastructure solutions through a channel of resellers, systems integrators and service providers.

Zycko is privately held and has been profitable since inception in 2000, when the company’s original charter was to market data networking accessories to resellers as a wholesale distributor. Zycko now employs 250 staff, serving over 3,000 resellers around the world from thirteen offices on four continents. The company enjoys an annual turnover of more than $180 million.

Zycko’s provision of best-in-class IT products and logistics management is supported by true value-add professional services - such as pre-sales expertise, technical support, custom configuration, an industry leading accredited training program, and marketing support. These vital services and support enable our customers to quickly deliver profits and invest in new market opportunities, allowing them to differentiate in a crowded market. Zycko is the channel partner of choice.

Zycko’s strategic partner base includes world-class companies such as Avago, Asigra, Diligent, Edgewater, Epicenter, Hitachi Data Systems, Intransa, Isilon, LSI, OnStor, Polycom, Powerdsine, QLogic, Riverbed, Usystems.

Zycko is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com

Source: StoragePR
<>

Brits lack email security awareness – says a Cisco report

by Michael Smith

British workers are more likely to open dodgy-looking emails than those of any other developed nation except China, according to new research by Cisco on the habits of corporate workers.

I must say that this finding comes as no surprise to this writer, as I have seen it happen; but the Brits are not alone here, of that I am sure. There are enough workers in the United States who will do the same, and who also fall for spam of the “pass this on to 100 of your friends or your PC will blow up” variety.

While only 25% of US workers, 23% of French workers and 28% of Japanese workers admitted opening suspicious emails, the figure in the UK rose to 45%. Only the Chinese, at 54%, showed a higher level of curiosity.

Maybe the Brits were simply a little more honest than workers from other countries when answering the questions.

But although the Brits like to see the message text, they are better disciplined when it comes to opening unsafe attachments or going to websites of dubious origin. Only 3% admitted doing so – far fewer than most other countries. In Japan, 14% opened attachments, followed by India (11%), China (8%), Germany (6%) and Australia (5%). Only 2% of US workers admitted opening attachments or suspicious URLs.

The problem is that not all dangerous URLs look suspicious. The advice must be not to open attachments or follow links unless we know who sent them and that they are safe; if need be, check back with the supposed sender.

The Cisco research marks the second year the company has surveyed attitudes in 10 industrial countries, questioning 100 IT decision-makers and 100 remote workers (end-users) in each country.

The survey also found an increase in workers using their work computers for personal use, such as shopping. In the UK, 43% of respondents said their company had no objection to them doing so.

It seems also that the lines between work and home computers are blurring, with a greater proportion of remote workers using personal devices to access work files, and work devices to access personal files than they did in 2006. That trend seems to be strongest in China and the US.

Because of this blurring trend, devices such as the MXI Secure Stealth MXP loaded with the MojoPac Enterprise desktop environment are so important nowadays. People use their own PCs to connect to the networks of their companies and organizations and could inadvertently introduce the gods only know what into the system. A secure desktop on a USB drive is the answer here. But I digress.

There is some risky behaviour about. We have more remote workers, and we are blurring the lines between personal and corporate assets. And with Web 2.0, everyone has hopped on the bandwagon of socialising with people around the world.

A lot of people at work feel comfortable because they believe their PCs are locked down tightly. With the threat vectors changing, however, we need to take a fresh look at how to tackle such threats. This means that users have to be trained properly and that proper procedures and devices must be used to make things as secure as possible.

Hackers from around the world are starting to use stealth tactics to get into networks and steal intellectual property. Who wants to pay millions in research and development when they can simply steal the information?

Often, poor security procedures allow hackers to penetrate networks; once inside, they escalate their privileges until they are, in effect, an unpaid systems administrator. They then grab the corporate data and remove it very slowly, piece by piece, so that no one even knows they have been there.

Technology and procedures can only do so much. A cultural and behavioural change needs to be brought about in the organisation so that people understand the implications and their own vulnerabilities.

People have to be made aware of the potential repercussions of any mistake or a "moment's lack of thought".

For compliance purposes organisations also need to be able to show they provide users with adequate training and information, so that they can prove good practice in the event of a security breach.

© M Smith (Veshengro), September 2008
<>