by Michael Smith (Veshengro)
Open Source software is the biggest open secret in the IT world.
Open Source software is free, secure, and supported by some of the world's largest software and hardware companies. Prominent examples include OpenOffice, a complete office suite; Firefox, a secure web browser; and Thunderbird, an email and calendar manager. Companies committed to open source include Dell, Hewlett-Packard, Sun Microsystems and Novell.
Operating systems like Linux and BSD are safe and secure to use, though some are easier than others. The one I currently find best in this regard is Ubuntu Linux.
At least half of the Internet runs on Open Source software, and many large government departments and companies use it, including Lloyds TSB, eBay, NASA, BT, and Bristol and Birmingham councils, to name only a few.
The French Gendarmerie, at local and national level, along with other public bodies, has switched over to Open Source software, in some cases retaining Windows as the operating system and in other cases ditching Windows for Linux.
Elsewhere in Europe we are seeing the same trend: governments, local and national, adopting open source applications such as OpenOffice, Firefox and Thunderbird, either while retaining the Windows OS or by migrating entirely to Open Source, operating system and all.
The city of Vienna migrated all of its computers, from desktops to servers, over to Linux, as, I understand, did the city of Munich.
They all do it for more than one reason, though the main one is, obviously, cost. Open Source software can be installed on as many machines as wanted without paying the likes of Microsoft a separate license fee for virtually each and every machine on which a program is installed. Another reason is security, as the majority of Open Source software is highly resistant to viruses and hackers.
At a hacker convention in the United States early in 2008, contestants cracked Apple's latest Mac operating system in a few minutes flat; they took a little longer to compromise Windows Vista, and after seven days they still had not managed to get into Ubuntu Linux. That, I believe, shows something.
Open Source software is also potentially green. Why green and environmentally friendly? No, this is not greenwashing in the making by this writer. Open Source software is potentially green because with it, older PCs can be rescued and kept running almost indefinitely. While Vista requires a fast processor, huge amounts of memory and loads of hard disk space, Ubuntu, for instance, comes on a normal 700MB CD, and not even all of that is used by the software. Some Linux operating systems have an even smaller footprint, so to speak: distributions such as Puppy or Damn Small Linux (DSL) can be installed in about 50MB and can, literally, be run from a thumb drive. However, having said that, on Windows, for some reason, OpenOffice above version 2.0 seems to need a huge amount of RAM, causing problems unless one considerably upgrades the memory of the PC, which is, if one can, a good idea anyway.
If memory is low, I have found, then OpenOffice 2.3, for instance, is very memory-hungry; it froze my PC again and again, and I switched back to OpenOffice 2.0. That is the same version that comes bundled with the Ubuntu Dapper Drake release that sits on my Linux PC, and it is, like the operating system itself, very stable and happy and keeps just chugging along nicely.
It would appear, from what I keep seeing, that the old Dapper Drake release of Ubuntu, even though there have been a fair number of newer versions since, is still the most widely used Ubuntu distribution. I certainly must say that for my writing work I love it, simply because it does as it is “told”, so to speak, and gets on with it.
Join the revolution, the eco-friendly computer software revolution, and get some freedom onto your PC.
© M Smith (Veshengro), June 2008
by Michael Smith (Veshengro)
Brian Chess, Founder and Chief Scientist, Fortify Software
Judging by the number of public breaches that we keep hearing about, it looks like the bad guys are far outrunning the good guys. We know it’s a big problem because as a company we get called in to sort out the problems most often once the horse has bolted.
In June of this year, section 6.6 of the PCI Data Security Standard (DSS) becomes mandatory in the US. Will things change? From a UK perspective it will be interesting to see whether it makes a change for the better. Online merchants that process credit card payments will either have to conduct a code review of their applications or install an application-layer firewall. The standard offers a choice, but there really isn't any choice at all. If an organization is going to successfully protect its data, it needs to aim at preventing a breach, not at passing an audit. This means, first, finding and fixing the vulnerabilities in your software; second, building security into the development process; and third, protecting your applications once they are deployed.
Hannaford Bros, a supermarket chain based in New England, USA, passed a PCI audit and then got hacked. They lost 4.2 million credit and debit card numbers, which has led to 1,800 cases of fraud to date. Over the last two years, as the PCI standards have slowly been implemented, the number of data breaches has increased from 158 incidents in 2005 to 443 incidents in 2007, for a total of 212 million records. So judging by this, you’ll see the bad guys are still very much in the lead. And that’s why PCI keeps evolving. But, in order to win this battle, companies must invest in security, not just in compliance.
In the spring of 2005, someone broke into a Web application for the Assignment Management System of the United States Air Force. They stole 33,000 personal records. The USAF responded to their breach with a multi-million dollar effort to identify and eliminate their security holes. This initiative incorporated a heavy reliance on source code analysis, in order to fix the problems at the root cause, as well as targeted investments in application firewalls, web application scanning tools, and database firewalls. The key to their approach was having the right motivation. They didn’t launch this initiative to pass an audit. They did it to ensure their software was secure. The result has been a comprehensive and dedicated deployment. As software drives nearly every military activity today, we can all be a little more comfortable knowing they have the right approach to deal with the threat.
The PCI council knows that analyzing the code early is the right thing to do, as they stress the importance of building security into the development process. All of the following quotes come from the PCI council, and they all emphasize the importance of the code.
- “…it is recommended that reviews and scans also be performed as early as possible in the development process.” (1)
- “Tools should be made available to software developers and integrated into their development suite as much as practical.” (1)
- “The reviews or assessments should be incorporated into the SDLC and performed prior to the application’s being deployed into the production environment.” (1)
- “Develop all web applications based on secure coding guidelines such as the Open Web Application Security Project guidelines.” (2)
- “Review custom application code to identify coding vulnerabilities.” (2)
- “Cover prevention of common coding vulnerabilities in software development processes.” (2)
(2) Payment Card Industry (PCI) Data Security Standard, Version 1.1. September, 2006
Bottom line – build security in. If you want to have the best chance of passing a PCI audit, AND preventing a breach, fix the code first, and then monitor it in real-time.
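The PCI quotes above stress reviewing custom code for common coding vulnerabilities. As a hedged illustration of the kind of flaw such a review hunts for, the sketch below contrasts a SQL-injection-prone query with its parameterized fix; the table, function names and data are invented for the example, not taken from any PCI document.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: attacker-controlled input is spliced into the SQL text.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % username
    ).fetchall()

def find_user_safe(conn, username):
    # Fixed: a parameterized query keeps data out of the SQL grammar.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "' OR '1'='1"
print(find_user_unsafe(conn, payload))  # the injection matches every row
print(find_user_safe(conn, payload))    # the same input matches nothing
```

A source code analyzer of the kind discussed here flags the first pattern automatically; the fix is a one-line change, which is why finding such flaws early in the SDLC is so much cheaper than after deployment.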
PCI Section 6.6 is a productive step forward and encourages companies to do just this, but as with many standards, companies can interpret the mandates in many ways. A bad interpretation and a weak implementation will mean a false sense of security. Passing a PCI compliance audit is necessary, but compliance alone does not protect your company from a breach. So get ahead of the bad guys and put your efforts into ensuring your applications are secure; that way you'll be out there taking the lead.
Monday 30th June 2008 - The team behind the popular Infosecurity Europe show - held in London every spring - has launched an online interactive security forum for the infosecurity industry, with advice, forums, blogs, career paths, an "ask the experts" panel, Q&A and other resources for everyone involved in the challenges of information security. The key difference from other sites is that it is not about the latest news: it is a community where all the content is created by, and for the benefit of, the global infosecurity community.
"The Infosecurity Adviser portal contains a wide variety of resources, including top-quality blogs, all of which are designed to keep computer users up to date on current and future events in the IT security industry. Registered users of the site can get involved by posing questions to members of the “Ask the Expert” panel or on the forum, adding a product review of a technology they have used, or commenting on any of the content created by the community," said Claire Sellick, event director of Infosecurity Europe.
The most active bloggers at the moment are members of the Information Security Awareness Forum (ISAF) board, including Dr David King of the ISSA, chair of the ISAF; Peter Wenham, CISSP, from the Communications Management Association; Andy Jones, CISSP, from the Information Security Forum; Jon Collins, service director with analyst firm Freeform Dynamics; and Chris Potter, a partner with PricewaterhouseCoopers. The two top-rated blogs at the moment are “Top down awareness” by ISAF blog team member Peter Wenham and “Security awareness - the next generation” by Chris Potter of PricewaterhouseCoopers.
"We also have an exclusive "Ask the Experts" section of the site where users can get free advice from industry experts," Sellick added.
According to Sellick, thanks to the support of the Information Security Awareness Forum and a number of other IT security bodies, the Infosecurity Adviser portal can offer all types of computer users information and resources that will keep them informed on the many aspects of information security they need to know about.
"The site's crisp and concise manner, together with regular updates from a flotilla of industry experts, means that we expect the portal to become a must-visit resource on the Web in a short space of time," she said.
Infosecurity Adviser is supported by the Information Security Awareness Forum (ISAF). Dr David King, ISSA UK and chair of the Information Security Awareness Forum, said, "The new Infosecurity Adviser portal will help to bring expertise and advice to those who have questions around information security. This in turn will help to promote security awareness. The Information Security Awareness Forum supports this initiative and welcomes the bringing together of different elements of the industry through the portal mechanism. The awareness forum is also supporting the portal through its blog, which is available on the portal website."
Raj Samani, ISSA-UK VP of Communications, said: “Sometimes you can be left with problems which Google simply cannot answer! It is therefore refreshing to see something out there which can provide practical help with problems which can sometimes seem impossible to deal with on your own.”
“The IT security industry is an industry in transition. For this reason as much as any, it’s great to have a place where industry experts and security professionals on the front line can have a clear and open exchange of views. It’s both useful in its own right, and it helps move the debate forward,” said Jon Collins, service director with analyst firm Freeform Dynamics.
"In my experience, the information security community comes up with some really good questions. I'm looking forward to the online community being a great way for us all to share experience and get to the answers!" said Chris Potter, partner at PricewaterhouseCoopers.
For more on the Infosecurity Adviser portal:
Optimizing Application Performance and Disaster Recovery in a Virtual Server Environment
By Jeff Aaron, Director, Product Marketing, Silver Peak Systems
In a virtualized environment, IT managers need to pay careful attention to the impact that Wide Area Network (WAN) performance has on application performance. When virtual servers are placed in centralized locations, limited bandwidth, high latency, and packet loss on the WAN can degrade application performance for end users. At the same time, the WAN can present a major obstacle to data protection and disaster recovery in these environments when large virtual images must be replicated between geographically dispersed locations. For all of these reasons, WAN optimization has become a key enabler for strategic virtualization projects.
This paper discusses in more detail the challenges of virtualization across the WAN, and identifies common WAN optimization techniques that improve virtual application performance while also improving the backup and recovery of virtual machines.
There are many reasons why enterprises turn to virtualization as a way of consolidating application servers and databases. While hardware and management cost savings are typically the most recognized benefits, performance, scalability, and security can also be primary drivers.
Virtual machines suffer all the same performance challenges as physical servers when accessed across a WAN. More specifically, the following WAN characteristics can all adversely impact the performance of centrally hosted virtual applications:
- Limited bandwidth. Depending on the volume of data being accessed and transferred across the WAN, bandwidth can be a huge concern in a virtualized environment. WAN speeds typically function at a fraction of LAN speeds, which creates a natural bottleneck that can adversely impact the performance of many virtual applications.
- High latency. You cannot break the laws of physics. It takes time to physically communicate from one location to another, and this can be further exacerbated by “chatty” communication protocols, such as the Transmission Control Protocol (TCP). The impact that latency has on performance depends on the type of application being hosted centrally.
- Packet loss. As enterprises move increasingly to MultiProtocol Label Switching (MPLS) and Internet Protocol Virtual Private Networks (IP VPNs), packet loss is becoming a bigger and bigger problem. These networks are oversubscribed by the carriers, which can result in packets being dropped or delivered out of order during times of heavy congestion. Packet delivery issues such as these are especially problematic when high data volumes must be sustained across the WAN.
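The latency point above has a simple arithmetic core: a single TCP connection cannot move data faster than its window size divided by the round-trip time, no matter how much bandwidth the link has. The back-of-the-envelope sketch below illustrates this; the window and RTT figures are illustrative assumptions, not measurements from any particular network.

```python
def tcp_max_throughput_mbps(window_bytes, rtt_ms):
    """Upper bound on a single TCP flow: window size / round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000.0) / 1_000_000

# A classic 64 KB window over a 100 ms transcontinental link caps one
# flow at roughly 5 Mbps, even on a gigabit pipe:
print(round(tcp_max_throughput_mbps(65536, 100), 1))

# The same window over a 5 ms metro link allows roughly 105 Mbps:
print(round(tcp_max_throughput_mbps(65536, 5), 1))
```

This is why simply buying more bandwidth often does nothing for a centralized virtual application: the round trips, not the pipe, are the bottleneck.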
Virtualization technology can make it easier and more cost effective to implement disaster recovery. Instead of requiring a 1:1 mapping between physical hosts and targets, which effectively doubles infrastructure costs, virtualization allows a single physical server to act as a recovery point for many virtual machines. This limits the amount of hardware required for data protection and recovery. In addition, this simplifies the disaster recovery process by eliminating the need to manage disparate servers with disparate operating systems.
VMware and other virtual solutions have snapshot capabilities to regularly replicate changes to target virtual machines. In the event of an outage, the replicated virtual machine can be started as a backup device with the most recent data. In addition, numerous third-party solutions provide real-time replication of virtual machines to target devices for maximum data protection.
Both the replication and snapshot processes can generate an enormous amount of traffic, which can create a challenge when the process takes place across the WAN. In addition, both leverage TCP for transport, which adds latency and can prevent backup tasks from being completed within their allocated windows. This results in database synchronization issues and missed Recovery Point Objectives (RPOs). Lastly, many replication processes require high sustained data throughput that cannot be disrupted until the entire replication process is complete. If the flow of data is disturbed, as can occur when packets are dropped or delivered out of order across the WAN, effective throughput will never exceed 1 or 2 Mbps regardless of how much bandwidth is actually available. This can bring a replication process to its knees.
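The "1 or 2 Mbps" ceiling quoted above can be approximated with the well-known Mathis formula, which bounds a loss-limited TCP flow by segment size, round-trip time and loss rate. The inputs below (standard 1460-byte segments, a 100 ms WAN round trip, 1% loss) are illustrative assumptions:

```python
import math

def mathis_throughput_mbps(mss_bytes, rtt_ms, loss_rate):
    """Approximate ceiling of a loss-limited TCP flow (Mathis formula)."""
    bytes_per_sec = (mss_bytes / (rtt_ms / 1000.0)) * (1.22 / math.sqrt(loss_rate))
    return bytes_per_sec * 8 / 1_000_000

# 1% loss on a 100 ms path caps a single flow at about 1.4 Mbps,
# regardless of how much bandwidth is provisioned:
print(round(mathis_throughput_mbps(1460, 100, 0.01), 2))
```

Cutting the effective loss rate, which is exactly what forward error correction does, raises this ceiling dramatically, since throughput scales with one over the square root of loss.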
WAN Acceleration: A Virtual “Must Have”
WAN acceleration addresses the common bandwidth, latency and loss issues that can hamper server centralization and data protection plans, making it an essential component for enterprise virtualization. More specifically, WAN acceleration provides the following benefits in virtual environments:
- Improve data transfer times: WAN deduplication is a new technology that has moved to the forefront of the WAN acceleration space. It works by delivering duplicate data from local data stores instead of resending it across the WAN. WAN deduplication can have an enormous impact on data transfer times, which means better perceived performance for virtual applications. In addition, faster data transfers improve the performance and reliability of replication/recovery processes.
- Maximize WAN efficiency: WAN deduplication can reduce as much as 99% of WAN traffic by eliminating the transfer of duplicate information. With byte level granularity, repetitive patterns can be detected within a single transfer, across separate transfers, and across different virtual applications. In this respect, WAN deduplication complements deduplication that might already be taking place in the host or replication software.
- In addition, advanced header and payload compression techniques can be used to reduce the amount of WAN bandwidth consumed by virtual applications when accessed across the WAN, even when the information is not repetitive.
- Reduce packet loss and errors. WAN acceleration can reduce the impact of both packet loss and jitter that occur when router links are oversubscribed and drop or re-order packets (as is common with shared IP networks, such as MPLS and IP VPNs). Adaptive Forward Error Correction (FEC), for example, rebuilds lost packets at the far end of a WAN link in real time, while Packet Order Correction (POC) re-sequences packets in real time. Both techniques eliminate the need for retransmissions, which can lead to poor application performance and failed replication processes.
- Increase geographic distances. By reducing the impact of latency, enterprises can extend the distance between users and data, enabling virtual servers (and the disaster recovery sites that back them up) to be located anywhere in the world.
- Protect virtual traffic. Many WAN acceleration devices use encryption to protect network traffic sent across the WAN. This adds an extra element of security for enterprises concerned about unauthorized access to data stored on virtual machines.
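The WAN deduplication described in the first two bullets boils down to fingerprinting chunks of a stream and sending the bytes only on first sight; thereafter a short reference suffices. The toy sketch below illustrates the general technique; the chunk size and wire format are invented for the example, not any vendor's implementation.

```python
import hashlib

CHUNK = 4096  # illustrative fixed chunk size

def dedup_stream(data, store):
    """Return the 'wire' form: full bytes for new chunks, refs for repeats."""
    wire = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in store:
            wire.append(("ref", digest))          # duplicate: fingerprint only
        else:
            store[digest] = chunk
            wire.append(("data", digest, chunk))  # first sighting: send bytes
    return wire

store = {}
payload = b"A" * 8192 + b"B" * 4096 + b"A" * 4096  # half the chunks repeat
wire = dedup_stream(payload, store)
sent = sum(len(item[2]) for item in wire if item[0] == "data")
print(f"{len(payload)} bytes reduced to {sent} bytes on the wire")
```

Real appliances use content-aware variable chunking and byte-level matching rather than fixed blocks, which is how repeated patterns are caught across separate transfers and different applications.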
Just as bandwidth, latency, and loss can hamper the performance of applications running on physical servers, the same is true of virtual servers. In addition, just as these WAN challenges can hamper backup/replication processes between physical hosts, they can also impact the performance and reliability of data protection in a virtual environment. As a result, WAN acceleration is strategic to many server virtualization initiatives.
Silver Peak Systems Ltd is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com
Source: Eskenzi PR Ltd
Chris James, marketing director for EMEA at Overland Storage
Data de-duplication technology is certainly the hottest thing in the world of storage right now, but what impact is it likely to have on the way IT managers buy storage in the future? With ever-increasing focus on business continuity and a raft of renewed environmental concerns, could data de-duplication be the answer the market has been waiting for?
The great fix-all?
We have been working with the technology for several years now, but it is still new in terms of commercial products, which have only emerged within the last three years. The real buy-in started about a year ago, in line with market demands to reduce physical disk space, prompted in many cases by environmental and budgetary concerns.
The advantages of using data de-duplication technology are extensive. It facilitates more efficient usage of disk capacity, allowing for improved retention periods and recovery speeds. When used alongside a backup solution, the technology lowers the amount of required disk or tape space as it avoids storing duplicate copies of the same file, ultimately enabling IT managers to save money on hardware spending.
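The saving described above comes from detecting identical content by hash before storing it, so a file backed up from many machines occupies disk only once. A minimal file-level sketch of the idea, with invented class and path names:

```python
import hashlib

class BackupStore:
    """Toy dedup backup store: identical file contents are kept once."""

    def __init__(self):
        self.blobs = {}    # content hash -> file bytes (stored once)
        self.catalog = {}  # backed-up path -> content hash

    def backup(self, path, contents):
        digest = hashlib.sha256(contents).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = contents  # new content: store it
        self.catalog[path] = digest        # a duplicate costs only an entry

    def stored_bytes(self):
        return sum(len(b) for b in self.blobs.values())

store = BackupStore()
report = b"Q2 figures..." * 100
store.backup("/alice/report.doc", report)
store.backup("/bob/copy of report.doc", report)  # same file, second user
print(store.stored_bytes(), "bytes on disk for", len(store.catalog), "files")
```

Commercial products refine this with sub-file (block- or byte-level) matching, so even partially edited files share most of their storage.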
What does this mean for other storage technologies?
We have seen a reduction in backing up to tape, with backup to disk on the rise. One of the main reasons for the move to disk for backups is the additional reliability it offers. While tape backups can fail overnight and delay the entire process, disk offers increased efficiency, extra protection and more effective processes.
Having said this, the two media are increasingly used in conjunction, with most customers buying disk in addition to tape. We are seeing a move towards customers using disk for backup and tape to hold copies of data that do not need to be up and running and fully accessible 100 per cent of the time.
The role of tape has therefore had to adapt to this shift. Since tape needs little power to retain data, it is an ideal medium for archiving, which is becoming its primary function. In an archive, tape is approximately 100 times more energy efficient than disk, which makes it the best choice for those concerned with environmental issues and reducing power consumption.
More than ever before, we are seeing our customers begin to adopt a tiered data protection strategy, in line with business continuity policies and efforts to be “greener”. The next step is to extend a company’s evolving tiered strategy by adding data de-duplication for easier access, faster restores and longer-term, near-line retention of backup data.
The remote backup space has also changed dramatically, as more and more companies now encompass remote offices. As a consequence, data de-duplication technology will have to expand in the near future to cover transmission over wide-area networks.
We understand that one size does not fit all, so the introduction of this technology has added yet another level of protection for customers seeking simplified, affordable long-term data retention on disk. Data de-duplication is a game-changing technology and we are excited about the changes we are seeing in the market.
Overland Storage is also exhibiting at Storage Expo 2008 at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com
Source: Eskenzi PR Ltd
Solcara, the market-leading provider of software for the control, management and searching of digital information, has partnered with ISYS Search to extend its SolSearch offering. SolSearch Indexer will deliver a powerful, comprehensive and cost effective enterprise search solution, which combines market leading federated search technology with an advanced indexing capability.
Solcara provides software that connects people to the information that matters, when it matters most. Solcara streamlines access to critical business content, ensuring that users can deliver an exceptional and timely service to their customers and stakeholders in all situations.
SolSearch Indexer provides a strong functional administrator interface, and can extract entities and concepts when indexing documents, web sites and databases. These entities and concepts can be used to support more efficient navigation of indexed content. For users this means a complete and proven enterprise search solution that can be quickly deployed and deliver superior results.
Rob Martin, Managing Director of Solcara, said:
“SolSearch has just got better. By adding the new indexing module to our federated search technology, we will be providing existing and new clients with an even more powerful integrated search solution. The new indexing module will enable clients to quickly, and cost effectively, apply advanced indexing techniques to content that is not already effectively indexed.”
Powered by ISYS Search, SolSearch Indexer can index content from document management systems (such as Interwoven), RDBMS systems (including SQL Server and Oracle), file systems, websites and email systems.
ISYS has been deployed by thousands of organizations worldwide, operating in a variety of vertical markets.
For more information on Solcara products and services, visit: http://www.solcara.com
For more information on ISYS visit: http://www.isys-search.com/
Source: College Hill Associates Ltd
by Michael Smith (Veshengro)
Dump those flying toasters and endlessly looping slide shows. They may be doing more harm than good.
In today's world the screen saver is no longer needed, and it has not been needed for many years now. Our monitors can no longer end up with so-called screen burn, or burn-in, as the old green and grey phosphor ones once did. So why do you still use a screen saver?
You do not still use a screen saver, do you?
All I can say is: kill that stupid screen saver. In the “good old days” of tube monitors, and here I must add the mono-colour monitors and the early multi-colour ones, screen savers such as those unforgettable flying toasters were invented to prevent burn-in, a permanent shadow branded into the phosphors of your monitor by a static image of, say, a spreadsheet that had been left on the screen all weekend.
Today's flat-screen LCD monitors don't burn in, and neither do the more modern CRT tubes, any less than, say, eight years old. So if you still have flying toasters or an endlessly looping slide show of your adorable niece and nephew, you're behind the times. When you're not sitting in front of your monitor it should be off, off, off. Either turn it off physically by means of that button on the monitor (it is there somewhere, I promise) or by means of the power-save function of the PC. Either way, the screen should be off, as in black and nothing happening.
Telstra, the biggest telephone company in Australia, has removed all the corporate screen savers from the 36,000 computers in its offices. What actually took them that long? I mean, they still had “corporate screen savers”?
So, now that they are gone, what will happen? The change, it is claimed, will cut tons of CO2, the equivalent of taking 140 cars off the road for a year.
Obviously, Telstra's figures only add up if everyone turns the monitor off whenever they are not sitting in front of it and actually doing some work.
So, let's follow Telstra's example. Let those flying toasters crash and burn.
I have been turning off monitors, either manually or by means of the power control of the PC, for years already. I have never been able to warm to screen savers of any kind; I have found them stupid in the extreme, a drain on resources, and at times the cause of the PC freezing up and crashing.
While a CRT monitor took a while to come fully back to life, with today's LCD monitors there is no time lost even if one simply turns the monitor off physically, at the switch.
So, lose that screen saver and do your bit for green IT.
M Smith (Veshengro), June 2008
New Retention Lock Software Option Enables Active Archive Protection for IT Governance
SANTA CLARA, Calif., June 23, 2008: Data Domain® (NASDAQ: DDUP), the leading provider of deduplication storage systems, today announced the Data Domain Retention Lock software option, the industry’s first software to allow file locking for IT regulatory governance with high-throughput inline deduplication and maximum flexibility for trusted operators. With Data Domain Retention Lock software, IT administrators can now store deduplicated files in an unalterable state for a specified length of time. This introduction of WORM (write once, read many) for active-archive, high-performance deduplication storage allows enterprises to implement a wide range of corporate IT governance policies that require data to be retained unchanged for fixed periods of time before it is removed.
Unlike other CAS and NAS archive and compliance storage products, many of which offer no deduplication and weak throughput, Retention Lock is based on Data Domain’s high-speed inline deduplication storage. This means, where other active archive storage products provide a silo of specialized storage, Data Domain continues to expand its consolidative effect, enabling a consistent storage system for the nearline storage tier, encompassing archiving, remote office backup, datacenter backup, and lower-tiered file storage.
Data Domain Retention Lock software enforces per-file locking with a set retention period. During this period, users cannot change or delete the files, but trusted operators can manage files and space as required. This level of compliance protection is designed to support regulations that focus on protection from inadvertent or malicious data modification by storage users. While some industry-specific regulations require immutability guarantees that assume even IT administrators cannot be trusted, many others accept that files may need to be deleted under court order, e.g. to protect identity information or to remove other inappropriate confidential information. Under these governance regulations, data still needs to be held with retention enforcement, but administrators need file-level policy flexibility.
With Data Domain Retention Lock software, privileged administrators can reconfigure retention periods or security settings on a per-file basis, providing the flexibility required to deal with changing security and retention policies. Retention Lock uses existing industry standard interfaces for NAS-based locking, enabling rapid certification with third-party software vendors. Retention Lock may be applied to any file on a Data Domain system regardless of whether it is from a backup, archive, or other nearline application. Retention Locked files may also be replicated using standard Data Domain replication for disaster recovery or WAN vaulting, and all locking policies will be retained across links.
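The per-file policy described here can be modelled in a few lines: a write sets an expiry, ordinary deletes are refused until it passes, and a privileged operator may extend the period or override a delete (as under a court order). This is a toy model of the behaviour as described, not Data Domain's implementation.

```python
import time

class RetentionStore:
    """Toy per-file retention lock: files are undeletable until expiry."""

    def __init__(self):
        self.files = {}  # name -> (contents, lock expiry as epoch seconds)

    def write(self, name, contents, retain_seconds):
        self.files[name] = (contents, time.time() + retain_seconds)

    def delete(self, name, admin=False):
        contents, expiry = self.files[name]
        if time.time() < expiry and not admin:
            raise PermissionError(f"{name} is retention-locked")
        del self.files[name]

    def extend(self, name, extra_seconds):
        contents, expiry = self.files[name]
        self.files[name] = (contents, expiry + extra_seconds)

store = RetentionStore()
store.write("audit.log", b"...", retain_seconds=3600)
try:
    store.delete("audit.log")          # refused during the retention window
except PermissionError as exc:
    print(exc)
store.delete("audit.log", admin=True)  # privileged override, e.g. court order
```

The distinction the article draws is visible in the two delete paths: strict WORM systems have no `admin` override at all, while governance-oriented locking deliberately leaves one for trusted operators.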
According to recently published research by the Enterprise Strategy Group, organizations will archive nearly 200,000 Petabytes of information over the next five years. “Compliance with recordkeeping and legal preservation mandates, both of which require retention management solutions, are two of the biggest drivers for capacity growth,” said Brian Babineau, Senior Analyst with Enterprise Strategy Group. “Data Domain has already proven it can help customers manage capacity in nearline storage environments which has a direct impact on space, power, and cooling. With the release of the Data Domain OS 4.5 and the Retention Lock product, Data Domain further extends these benefits to information that is retained for compliance purposes. The addition means that customers can now lock down archive and backup data within the same system that delivers the benefits of de-duplication.”
The Department of Water in Perth, Australia is leveraging Retention Lock to prevent inadvertent or malicious alteration of critical company data that must be retained to satisfy government policies. “By leveraging Data Domain with Retention Lock to store disk-based backups and data archives, we can now rest assured that our critical data is retained securely on disk for pre-defined time periods, at a fraction of capacity required to store it on traditional disk storage systems,” says Yong Leong, Manager, ICT Infrastructure. “Retention Lock also offers the flexibility required to react to changing security requirements and policies, reducing our cost to manage our environment.”
Data Domain Retention Lock software integrates with archiving solutions from industry-leading providers, including EMC, CommVault, Symantec, AXS-One and DataGlobal.
“Where most deduplication products can address either backup or archive storage, Data Domain can do both, enabling users to consolidate around a single architecture,” said Brian Biles, VP of Product Management at Data Domain. “Retention Lock provides extraordinary operator flexibility for archiving operations, so it can be used with backup data, email archives, database archives or unstructured data archives with equal ease, all at high deduplication throughput.”
The Data Domain Retention Lock is available immediately as a software license for all Data Domain systems running Data Domain OS 4.5.
A Data Domain webcast entitled “Nearline Storage Update,” which includes a discussion of Data Domain Retention Lock software, is available now at http://www.datadomain.com/retention-lock-webcast.html and in the Resources section of the Data Domain website.
The risk of data loss cannot be isolated to one type of data, one type of channel for leakage, or one mode of end-user behavior.
A recent Aberdeen Group survey of senior-level, enterprise IT and security executives revealed that organizations with deployed data leakage protection (DLP) and encryption technologies experience 92 percent fewer data loss incidents. The Utimaco sponsored Aberdeen report entitled "Data Loss Prevention: Little Leaks Sink the Ship", published on June 19, 2008, validates Utimaco's vision for a multi-layered security architecture centered on industry-leading DLP and encryption technologies.
For further information contact Utimaco Safeware
by Ian Masters, UK sales and marketing director at Double-Take Software
Many organisations are adopting virtualisation technologies in their data centre to secure the benefits of increased hardware utilisation, reduced power consumption and simplified management. The reliability of this new infrastructure is likely to be of critical importance but what is the best way to protect virtual servers and keep them highly available?
A virtual infrastructure has a single point of failure: shared disk space. An organisation that relies on tape to protect this environment will struggle to provide the infrastructure with the protection and availability it requires, as it can take days to restore virtual systems from tape, if it is possible at all. Some virtualisation products come equipped with snapshot-based technology that sends data in periodic chunks, but the flexibility of this technology is limited and as a result it does not provide the protection, availability and disaster recovery that a business-critical virtual infrastructure warrants. No matter which virtualisation vendor's solutions are deployed, independent data replication products provide availability of virtual infrastructures far more effectively than tape, greatly increase native protection and give data centre managers a very useful management tool.
If an organisation is already using an independent data replication solution within its business continuity plan, it may be flexible enough to be used within virtual infrastructures. Data centre managers are likely to maintain a variety of hardware on which they host virtual servers, so the high availability solution needs the flexibility to work in any hardware environment. Host-based replication is an asynchronous technology that replicates at the server level, streaming changes in real time as they occur and applying them on target servers in the order the operations occurred. Host-based replication is hardware agnostic and therefore ideal for heterogeneous environments, giving it the flexibility required to protect typical virtual infrastructures. It has the additional benefit of providing data centre managers with a simple-to-use virtual infrastructure migration and management tool.
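The ordered, asynchronous streaming of changes described above can be sketched as a toy model (class and method names are illustrative, not any vendor's API): writes complete locally first, are journalled in order, and are later applied to the target in that same order.

```python
from collections import deque

class HostReplicator:
    """Toy sketch of host-based asynchronous replication: changes are
    captured in order at the source and applied in the same order at
    the target, independent of the hardware underneath."""
    def __init__(self):
        self.journal = deque()   # ordered change journal
        self.target = {}         # stand-in for the target server's disk

    def write(self, source, path, data):
        source[path] = data                 # local write completes first...
        self.journal.append((path, data))   # ...then the change is queued

    def drain(self):
        # asynchronously stream queued changes to the target, in order
        while self.journal:
            path, data = self.journal.popleft()
            self.target[path] = data

source = {}
r = HostReplicator()
r.write(source, "/vm/disk0", b"block-1")
r.write(source, "/vm/disk0", b"block-2")
r.drain()
```

Because the journal preserves operation order, the target always converges on a consistent image, which is what makes the replica usable for failover rather than just backup.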
Many organisations already have a disaster recovery facility or satellite office where they send backup copies of data for disaster recovery. Having a live duplicate of the virtual infrastructure within those locations provides the ultimate level of protection and recovery in the event of substantial site disaster. Host-based replication technologies are able to replicate over any distance so provide organisations managing virtual infrastructures with the best possible protection for business-critical physical and virtual environments.
Virtualising servers is only the first step in modernising a data centre to take advantage of the benefits on offer. Virtual infrastructures are business-critical, so organisations need to make sure they are highly available. Deploying an appropriate data replication technology is the only strategy that will provide the protection required. Host-based data replication products not only provide high availability but can also help data centre managers better maintain virtual systems through the ability to provision, convert and migrate systems both near and far.
Source: Eskenzi PR Ltd.
By Philip Crocker, Director of EMEA Marketing, Isilon Systems.
Many organisations are facing a tremendous increase in the amounts of data needed to conduct everyday business. The growth of unstructured data such as video, audio, image, research data, and other large digital files is pushing the bounds of traditional storage systems. Into the breach come Clustered Storage and Storage Virtualisation to potentially offer solutions to help meet the challenge of larger data storage requirements.
So what is Clustered Storage?
Modern storage clustering has gone way past simple failover between a pair of redundant server/controller heads and disks. Today, most advanced systems tend to be Distributed Clustered Storage. This architecture is a networked storage system that allows users to add self-contained nodes to continually expand the cluster. Each node contains processing power, cache, terabytes of disk storage, and front- and back-end interconnects normally based on Gigabit Ethernet, InfiniBand or Fibre Channel.
Each node also has a suite of built-in applications to deliver the three layers of traditional storage architectures; file system, volume manager and RAID. This creates an intelligent fully symmetrical file system that spans all nodes within a cluster. This allows each node to serve any data file to any client irrespective of where the data physically resides on the cluster.
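That location-independence can be illustrated with a minimal sketch: if every node applies the same deterministic placement function, any node can work out where a file lives without a central lookup table. This is a simplification of a real cluster's striped layout, and the hash scheme below is purely illustrative:

```python
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]

def owner(path, nodes=NODES):
    """Deterministic hash placement: every node runs the same function,
    so any node can locate any file and serve it to any client.
    (Illustrative only; real clusters stripe files across many nodes.)"""
    digest = hashlib.sha256(path.encode()).digest()
    return nodes[int.from_bytes(digest[:8], "big") % len(nodes)]
```

Adding a node in a real system also triggers rebalancing of existing data, which this sketch omits.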
Advantages of clustered storage:
- Ability to scale performance and capacity independently
- High availability, remaining operational even with multiple simultaneous drive or node failures
- Highly automated management functions, including load balancing, drive rebuilds and data replication
Disadvantages of clustered storage:
- No independent interoperability standards between vendors
- Not suited to highly structured “small” files like email or database transactions
- Unable to virtualise existing NAS or SAN hardware
Storage virtualisation is essentially an aggregation technology that presents a single “virtual” storage infrastructure normally derived from different storage end points. These solutions vary greatly and are often a combination of software applications, appliances and switches to create a single namespace of storage that appears to management systems as one large pool of data. Typically, these solutions enable “synthetic trees” that encompass several NAS servers or storage devices. Most virtualisation solutions can control laying out a file (striping data) across disk volumes to a specific silo but often not across the silos that make up the virtualisation. The ability to categorise data makes virtualisation suitable for information lifecycle projects, as it will often allow data movement between tiers of storage with limited client interruption.
Advantages of storage virtualisation:
- Ability to tie together multiple storage vendors' products under a single virtual namespace
- Prolongs the life of otherwise redundant storage platforms
- Reduces management overhead compared to traditional “islands” of storage
Disadvantages of storage virtualisation:
- Limited scalability for both capacity and performance
- Limitations on largest file size and largest single namespace, imposed by the underlying NAS silos
- Increased complexity, especially during drive or controller failures
Both technologies aim to reduce the cost of managing data, and both succeed in many respects. As an analogy, Clustered Storage is building an infrastructure from the ground up to deal with large files and huge volumes of data, whereas Storage Virtualisation is a short-term response to the problem of spiralling data management, using tools to help alleviate the burden until organisations can pause and create a lasting fix.
The factors involved in each potential customer’s circumstances are too varied to make a clear-cut “winner” but a simple rule can be extrapolated for the casual observer. If an organisation wants to build a multi-terabyte capacity storage pool suitable for large capacity data files then clustered storage is a better option. If an organisation has invested in a lot of NAS and SAN hardware that is complex and expensive to manage then Storage Virtualisation offers the most immediate benefits. Depending on the situation, each technology can allow the customer to significantly improve their ability to manage data storage.
Isilon Systems is exhibiting at Storage Expo 2008, the UK’s definitive event for data storage, information and content management. Now in its 8th year, the show features a comprehensive FREE education programme and over 100 exhibitors at the National Hall, Olympia, London, from 15-16 October 2008. www.storage-expo.com
Source: Eskenzi PR Ltd.
by David Hobson, Managing Director of Global Secure Systems (GSS)
In its 2006 annual report for the fiscal year ended 27 January 2007, T.J. Maxx recorded a pre-tax charge of approximately $5 million for costs incurred in connection with the computer intrusion it formally disclosed in January 2007. This charge covers actual costs incurred to investigate and contain the breach, strengthen its computer security and systems, and communicate with customers, as well as technical, legal, and other fees. $5 million may suggest that it got off lightly but is this just the tip of the iceberg? What are the hidden costs of a security breach? What will be the final figure? This article aims to examine the hidden expense of a data breach, both the tangible and intangible costs. It concludes with a ‘top ten tips’ to prevent being the next headline grabber.
IT security in the early 1990s was relatively simple. Data was stored on mainframes, access control was limited and the need to share data was minimal. Today the rules have changed: more data needs to be shared, access to data is required from almost anywhere, and the need to secure that data has grown through regulation and legislation. The user population is much more technical now, and the Internet boom has enabled an increasing number of people to cause more trouble than ever. Most organisations acknowledge that a security breach will result in financial expense to the business.
It’s going to cost how much!
Firstly, there are the direct and easily correlated costs such as replacing any lost or stolen devices; investing in, or strengthening existing, IT security; and if necessary strengthening the building’s physical security.
In August 2007, Monster had to take action when it discovered that con artists had mined contact information from the CVs of 1.3 million people, and possibly many more, as Monster has since confirmed that this was not an isolated incident. Files were stolen not only from Monster.com but from USAJobs.gov, the federal-government career-listing service operated by Monster. Monster has said it will have to spend at least $80 million on upgrades to its site, which will include security changes. Among them are closer monitoring of the site and limits on the way its data can be accessed.
It doesn’t stop there
Some costs are harder to pin down including contacting those whose records may have been exposed, credit monitoring for those affected, and even the possibility of subsequent legal action taken by people who have suffered a financial loss as a direct result of their records being exploited.
HMRC, which in December had two CDs containing 25 million child benefit records go astray in its internal post system, wrote to each person whose personal details were at risk. Tallying this up, there is the physical cost of the paper and envelopes, printing the letters, addressing the envelopes and postage, plus the harder-to-estimate cost of employees' time to draft the letter and physically perform the mail-out.
Customer lawsuits can cause serious headaches for businesses that go far beyond the reputation-slaying negative headlines. Aside from the actual monetary damages, lawsuits often leave companies on the hook for additional training, systems upgrades or -- in the case of a data breach -- credit monitoring for those affected.
In the case of TJ Maxx’s massive security breach, the company revealed that all affected customers were offered credit monitoring at its expense. It also disclosed that it has agreed to pay up to $24 million in a settlement with MasterCard, and it might not stop there. It further confirmed that it has had to budget for various litigation and claims that have been, or may be, asserted against it or its acquiring banks on behalf of customers, banks and/or card companies seeking damages allegedly arising out of the Computer Intrusion.
In another instance, the Information Commissioner’s Office (ICO) found Marks & Spencer in breach of the Data Protection Act in January this year, following the theft in April last year of an unencrypted laptop containing the personal information of 26,000 M&S employees. As a result, the ICO ordered Marks & Spencer to ensure all the laptop hard drives it uses were fully encrypted by April 2008, facing further prosecution if it failed to comply, although M&S has appealed against this decision and a final outcome is yet to be decided. Other tangible costs Marks & Spencer faced were writing to all 26,000 employees affected and the cost of its offer of free credit checks. But what is the hidden cost? How many employees' loyalty will have been damaged by this incident? We all recognise the cost of recruitment and training.
In 2007, the UK's largest building society, Nationwide, received a fine of nearly £1m from the Financial Services Authority after the theft of an employee's laptop unearthed security flaws which could have put its 11 million customers at risk. In the first action taken by the City regulator over such systems-and-controls issues, Nationwide had faced a £1.4m penalty but was given a reduced fine of £980,000 because of its cooperation.
It runs deeper still
So what other concealed costs are there?
There is bound to be an impact on share price, even if only temporarily, as stakeholders react to the news.
There is the lost marketing investment when a brand is damaged, which is a key impact that UK Boardrooms should be concerned about. This is closely followed by the recovery costs in the form of future/increased marketing budgets to regain market position, rebuild reputation, etc. Imagine the continuing damage if the company’s communications can no longer be trusted. IKEA fell victim earlier this year when a hole in its website security allowed hackers and phishers access to its ‘contact IKEA’ function enabling them to send bulk outbound mail via its email servers. The potential damage to the company's reputation and possibility of email blacklisting could be significant.
There is the cost of customer erosion, especially where the breach has compromised credit card details, as in the case of Cotton Traders. Apacs has called the recent hacking attack on the Cotton Traders website a “serious” breach, saying the hackers could use the stolen card details for fraud. The clothing company has so far refused to say how many people have been affected, and has tried to alleviate continuing fears by confirming that customer credit card data is now encrypted on its website, but could this prove too little, too late?
There could even be the risk of employees jumping ship as internal morale dives; staff feel their loyalty is compromised when the company they work for makes headline news for the wrong reasons. Filling vacancies is a costly exercise.
There is even the reality that those unaffected and uninvolved will still end up footing the bill. Again the HMRC data loss can provide a perfect example of this. The Chancellor of the Exchequer at the time of the breach, Alistair Darling, confirmed that banks were having to monitor all 7.25 million bank accounts whose details were on the discs. Although the cost for this monitoring has not been revealed the banks will make sure that they recoup the expense from someone! So either the tax payer, or everyone with a bank account, is going to cover this charge.
This article shows that data loss is not an insignificant issue. Information assurance is business-critical and, for many organisations, the data they own is their key asset, so why are so many failing to treat it as such? Failing to do so opens the corporate purse with no guarantee that it will ever be closed again. TJ Maxx itself summed it up when it said in its statement: “Beyond this charge [$5 million], we do not have enough information to reasonably estimate losses we may incur arising from the Computer Intrusion.”
Top Ten Tips to Preventing a Breach:
- Management sets the tone for the organisation by its own behaviour. As such, good information practices are obligatory for all stakeholders, not just employees.
- Be proactive – management should deal with information assurance issues proactively, rather than reactively as information assurance is far more cost effective in a preventative rather than a remedial context.
- Information assurance is a business issue, not something extra for IT to handle. IT simply does not have the resources and/or authority to drive information assurance best practices through their organisations.
- Understand that information assurance is an ongoing process, not an annual event just before the auditors arrive.
- Information assurance is everyone’s job and as such investments in training and awareness programs for all employees are critical.
- Management should set out the company’s expectations with respect to information assurance in clear, accessible policies.
- The process for dealing with information security incidents should be defined in straightforward and unambiguous procedures.
- Investments need to be made in technology that will result in the secure transport and processing of information by the company’s information technology assets.
- Suitable best practices should be identified and implemented, rather than ad hoc approaches.
- Expert advice should be sought and used at all times to advise and oversee efforts in respect of information assurance from an experienced and objective third-party perspective.
Source: Eskenzi PR Ltd.
Solcara’s Spotlight product has been adopted by Glasgow City Council to provide its press office with an efficient, fast, accurate and now indispensable media service.
Solcara, the market-leading provider of software for the control, management and searching of digital information, sees Spotlight consistently deployed by communications teams in need of protecting and enhancing their reputation. Spotlight ensures the efficient and auditable collection, management and distribution of time sensitive information.
Chris Starrs, PR Manager for Glasgow City Council, said: “Glasgow City Council's Public Relations department is one of the busiest press offices in Scotland and rarely a day goes by that we are not in the national media. This product has given us a way to respond efficiently and consistently to media calls. It allows press officers to be on top of a whole range of issues which are critical to our reputation, particularly when called out-of-hours. It's a fantastic tool and staff now wonder how they ever coped without it.”
Rob Martin, Managing Director of Solcara, said: “Solcara Spotlight has been deployed in many of the UK’s largest press offices, and we are delighted to partner with Glasgow City Council. Solcara is already well placed in Scotland with Edinburgh City Council, Argyll & Bute Council and Inverclyde Council on board, making Glasgow City Council an attractive option.”
For more information on SolSearch and the suite of Solcara products, visit: http://www.solcara.com
Source: College Hill
Solcara, the market-leading provider of software for the control, management and searching of digital information, has announced the appointment of three senior sales managers for Financial Services, Legal & Professional Services and the Commercial Sector.
Commenting on the expansion of Solcara’s sales force Rob Martin, Managing Director, said:
“Solcara has reached a stage in its development where we are confident in accelerating our growth plans and extending our client base in these critical business sectors. Through senior hires, new product development, and the growth of our channel sales partnership, we are taking the necessary steps to consolidate and strengthen our leadership position”.
The appointments are:
- Tony Nicholls, Financial Services
- Mark Harding, Legal and Professional Services
- Ian Walbank, Commercial Sector
Source: College Hill Associates
by Michael Smith (Veshengro)
More than two thirds of web-based malware is now found on legitimate web sites, according to a report by security supplier ScanSafe. This represents a 407% increase compared with May of last year, May 2007.
According to a senior security researcher at ScanSafe, hackers have moved away from direct attacks like social engineering to focus on indirect attacks that use trusted brand names.
Just because you are accessing a well-known site, you definitely cannot and should not assume that the site is safe. At this present moment, thousands of legitimate web sites are being compromised on a daily basis.
According to the report there has also been a 220% increase in the different kinds of web-based malware in the past year.
According to ScanSafe, authentication-bypass and password-stealing malware has grown the fastest, with an 855% increase, which puts sensitive corporate data at serious risk.
Since October last year there have been hundreds of thousands of mainly China-based attacks, in which hackers passed malicious code to visitors on completely legitimate websites.
The computers of visitors to those sites are infected when they are redirected to malicious servers using a code injection method based on the database query language SQL, better known as SQL injection.
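The mechanism can be illustrated generically (using Python's built-in sqlite3; the table and values below are hypothetical, not from any of the attacked sites): a query built by string concatenation will execute attacker-supplied SQL, such as appending a malicious script tag to every page, while a parameterised query treats the very same input as plain data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER, body TEXT)")
conn.execute("INSERT INTO pages VALUES (1, 'hello')")

# A crafted "page id" that smuggles in a second SQL statement,
# defacing every stored page with a script tag:
malicious = "1; UPDATE pages SET body = body || '<script src=evil></script>'"

def vulnerable(page_id):
    # string concatenation: every statement in the input runs - injection
    conn.executescript("SELECT body FROM pages WHERE id = " + page_id)

def safe(page_id):
    # parameterised query: the input is bound as data, never parsed as SQL
    return conn.execute(
        "SELECT body FROM pages WHERE id = ?", (page_id,)
    ).fetchone()
```

Parameterised queries (plus input validation) are the standard defence against exactly the kind of mass compromise the report describes.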
A number of legitimate websites have thus been attacked in the USA and the United Kingdom, amongst them Wal-Mart's website in the USA, as well as the websites of the Royal Statistical Society, the National Media Museum, Skills for Care, and a number of businesses in the UK.
Unlike in the past, so it would appear, a much larger number of malicious networks and servers were used in these recent attacks. Whether this means that the attacker or attackers have changed tactics, or whether we are seeing a copycat, is still not clear at this moment.
The one thing this might point to, though, is a government-sponsored attack. Rumor has it, though that is rumor from serious professionals, that many of the attacks from China are in fact coming from the security services and military in that country. Maybe we are seeing an attempt to find weaknesses in the systems in order to attack much more sensitive places next.
© M Smith (Veshengro), June 2008
Another blackmailing virus. Oh, how lovely - NOT
by Michael Smith (Veshengro)
Security software firm Kaspersky Lab has reported a new and dangerous blackmailing virus and is alerting computer users everywhere about a new variant of Gpcode, a dangerous encryptor virus.
The Virus.Win32.Gpcode.ak malware encrypts users' files with various extensions, including .doc, .txt, .pdf, .xls, .jpg, .png, .cpp, .h and more, using an RSA encryption algorithm with a 1024-bit key, and that is a lot.
Kaspersky Lab itself added a virus signature to block Virus.Win32.Gpcode.ak in early June 2008.
Kaspersky Lab says it has succeeded in thwarting previous variants of Gpcode by cracking the private key held by the attackers.
However, the author of the new Gpcode variant has taken two years to improve the virus: previous errors have been fixed and the key has been lengthened to 1024 bits from the original 660, which was crackable.
So far, it would appear that Kaspersky has been unable to decrypt files encrypted by Gpcode.ak, since the key is 1024 bits long and no errors have yet been found in the implementation. That means, according to Kaspersky, that presently the only way to decrypt the encrypted files is to use the private key which, unfortunately, only the author of the virus has.
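Why that matters comes down to RSA's asymmetry: encrypting needs only the public key, but reversing it needs the private exponent, which only the key's creator holds (unless the modulus can be factored). A toy illustration with textbook-sized primes, nothing like the virus's actual 1024-bit key:

```python
# Toy RSA: tiny primes purely to show the public/private asymmetry.
# Factoring n = 3233 is trivial; factoring a 1024-bit n is not.
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent (shipped with the malware)
d = pow(e, -1, phi)          # private exponent; computing it needs p and q

def encrypt(m):
    # anyone holding (e, n) can do this - including the virus
    return pow(m, e, n)

def decrypt(c):
    # only the holder of d can undo it
    return pow(c, d, n)
```

Kaspersky's earlier successes against Gpcode came from implementation errors and the short 660-bit key; with a clean 1024-bit implementation, that route is closed.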
After Gpcode.ak encrypts files on the victim's machine, it changes the extension of these files to ._CRYPT, and places a text file named !_READ_ME_!.txt in the same folder.
In the text file the criminal tells the victims that the file has been encrypted and offers to sell them a decryptor: "Your files are encrypted with RSA-1024 algorithm. To recovery your files you need to buy our decryptor. To buy decrypting tool contact us at: ********@yahoo.com"
Kaspersky Lab is still working on a way to recover data that has been encrypted without having to use the criminal's decryptor. Let's hope that they will do so and in addition to that that those criminals get caught.
In addition to that, what can one do? Even the best anti-virus software is, and always will be, one step behind the virus writers and criminals.
Personal vigilance as to where one goes and what email one opens is important; more often than not it is users' bad email habits that bring them those lovely viruses and Trojans. However, Trojans sometimes come packaged in different ways, even where the user would not suspect them.
See my article “Viruses and Trojans in Trusted Downloads” on how easy it is to have such things reach your computer.
© M Smith (Veshengro), June 2008
by Michael Smith (Veshengro)
Recently – though I never gave it a thought before and had never had any anti-virus program check it – BitDefender v10 FREE found a Trojan in an ISO image held on my external hard drive.
The ISO was for the OPEN CD 7.04 (both the ISO and the burned CD have now been destroyed) and came via a direct download from the official website.
This could only mean one of two things, I believe: either there is, or was at the time of my download, an infection on the site, or a Trojan is, according to BitDefender, embedded in one of the programs on the CD.
Although few of us will ever think this necessary, I would suggest – and I shall follow my own advice this time for a change – that everyone always save any download to the desktop and then check it for hidden dangers and pitfalls prior to actually installing anything on the PC. Anything that cannot be saved to disk but wants to force an install should be considered suspect, even if it comes from a supposedly reputable source, and left well alone.
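One way to put that advice into practice, alongside a virus scan, is to verify the checksum most projects publish next to their ISOs; a mismatch reveals a tampered or corrupted download before anything is burned or installed. A short, generic Python sketch (the published value to compare against comes from the project's own site):

```python
import hashlib

def sha256_of(path, chunk=1 << 20):
    """Hash a downloaded file in 1 MB chunks so even a large ISO
    never needs to fit in memory; compare the returned hex digest
    with the checksum published by the download's source."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()
```

This does not replace scanning (a site can publish a checksum for an already-infected file, as may have happened here), but it does catch in-transit tampering.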
I am glad to say that I never actually installed anything from that particular OPEN CD – though I love OPEN CD in general – and therefore never actually had the Trojan let loose on my system.
Let the user beware!
© M Smith (Veshengro), June 2008
London, UK, 9th June 2008 - The Information Security Awareness Forum (ISAF), the cross-industry initiative founded by the ISSA-UK to raise awareness of information security, has formally opened its Web site.
Located at www.theisaf.org, the site seeks to act as a resource that will, over time, develop into a focal point for IT security education, news and other relevant information from the Forum.
Launched in February of this year, the ISAF is backed by a number of key organisations, including the ISSA, ISACA, GetSafeOnline, (ISC)², ASIS International, the British Computer Society, Infosecurity Europe and the Institute of Information Security Professionals.
Announcing the opening of the site, the ISAF's chairperson, Dr David King, said that it will help members, as well as the industry generally, pool their expertise and help co-ordinate the Forum's development.
"The Information Security Awareness Forum has been formed to coordinate and build on existing work and initiatives, to improve their overall effectiveness, and ultimately to increase the level of security awareness that will help us all," he said.
"Our new Web site will act as the foundation stone to help us achieve these aims," he added.
Martin Smith MBE, BSc, FSyI, the chairman and founder of the Security Awareness Special Interest Group, supported the opening of the new site, saying that his group strongly recommends the use of the new Forum pages as a first port of call.
"It serves equally well those individuals seeking security awareness knowledge for themselves and their families, and managers of businesses of all sizes and all sectors looking for advice and guidance about how to protect their data from accidental or deliberate disclosure," he said.
Several other leading organisations have voiced their support for the opening of the new ISAF Web site, including the BCS, the Jericho Forum and the NCC:
“The National Computing Centre's members rely on its ability to quickly direct them to trusted best practice. www.theisaf.org provides a highly relevant link in the information chain.”
Danny Dresner, NCC
“Since its inception in 2005, GetSafeOnline.org has been working in partnership with the UK Government, law enforcement and the private sector to raise awareness of internet security issues amongst consumers and micro-businesses. We have always believed that a collaborative approach is the only way to effectively tackle online safety issues – an area that is not only complex, but also relevant to individuals and organizations in different ways. We applaud the initiative to extend this approach through the new Information Security Awareness Forum website."
Tony Neate, Managing Director, Get Safe Online, www.getsafeonline.org
“The new www.theisaf.org website is a great initiative to help improve awareness of infosecurity issues and by coordinating the activities and resources of all the member organisations enables individuals and organisations to quickly find succinct advice to help them. The Information Security Awareness Forum also has a blog on Infosecurity Adviser www.infosecurityadviser.com which is another example of how the forum's members are fulfilling their common aim of improving infosecurity awareness across the entire industry.”
Claire Sellick, Event Director, Infosecurity Europe 2008
“ISSA-UK is delighted with the progress that ISAF has made since its formation as an ISSA-UK Advisory Board initiative in September 07. ISSA-UK congratulates ISAF on the launch of its new website which we strongly believe will support the continued growth and development of Information Security awareness across organisations. It will also provide individuals with a central repository of knowledge and a first point of contact for those seeking help and guidance. This new portal will enable those seeking help to locate good, impartial advice from the leading security organisations, working together in the forum, to communicate awareness to a wider audience.“
Geoff Harris, President of ISSA-UK
“The National e-Crime Prevention Centre welcomes all efforts to protect the UK from electronic crime and the ISAF Web site is an additional and useful site for advice and guidance. Encouraging people and businesses to take action on the available advice is key to reducing the harm to individuals and the economy.”
Ken Rabey, Project Director, National e-Crime Prevention Centre
“Given ISACA’s long-held belief in the importance of educating both institutions and individuals on information security we are confident that the resources on the Information Security Awareness Forum website will help to improve awareness. Having a single website to locate the huge amount of valuable information available from all the member associations is an extremely useful feature.”
Lynn Lawton, CISA, FCA, FIIA, PIIA, International President of ISACA
"ASIS UK Chapter 208 is delighted to support the launch of the ISAF's Web site and encourages all those who want to work together with other security organisations to visit and contribute to the various activities located on the Web pages."
James Willison, Convergence Lead, ASIS UK, Chapter 208
"The IET is pleased to be a member of the Information Security Awareness Forum and believes that the new ISAF Web site will provide a valuable mine of information for both individuals and organisations. We support the development of a co-ordinated approach to the provision of advice and guidance on all matters to do with information security"
Margaret Smith, Member of the IT Sector Panel, The IET
“This coming together of ICT professional bodies, trade associations and interest groups to promote awareness is most welcome and deserves every support from suppliers, users and the many government departments and agencies with responsibilities for the safety and security of those using their systems.”
Philip Virgo, Secretary General, EURIM
“EEMA welcomes the ISAF website initiative, which will increase awareness of online security issues. EEMA is also honoured to be a member and bring a European perspective to the ISAF; time and resources are a scarce commodity in this day and age, and co-ordination in the security space is essential if we are to face up to the issues and challenges of online crime.”
Roger Dean, Executive Director, EEMA
"The BCS is pleased to be a member of the Information Security Awareness Forum and hopes that the endeavours through the new ISAF Web site will signpost both individuals and organisations to resources that they should be aware of both personally and professionally. This is certainly a resource that our 62,000+ members should find useful ongoing.”
Andrea Simmons, CISSP, CISM, MBCS CITP, M.Inst.ISP, BCS Consultant Security Forum Manager
"The CMA, as an early supporter of the Information Security Awareness Forum, fully supports ISAF's pragmatic initiatives to promote industry wide collaboration and particularly welcomes the new ISAF web site (www.theisaf.org). This web site should become the destination (or portal) of choice for people, be they the man or woman in the street or a company Manager, seeking advice and guidance on how to secure information in this electronic and ever more inter-connected world."
Peter Wenham CISSP MICAF CLAS, Director, CMA
"The Jericho Forum welcomes the Information Security Awareness Forum's practical initiatives to promote collaboration between groups working in this crucial area. Collaboration is an essential part of our vision to allow seamless and secure communications between businesses, suppliers and customers across an open, Internet-driven, networked world."
Andrew Yeomans, member of the Jericho Forum board of management
Additional Background Information about ISAF Members
A number of professional bodies and organisations involved in information security have come together to form the Information Security Awareness Forum to coordinate and build on existing work and initiatives, to improve their overall effectiveness, and ultimately to raise the level of security awareness in the UK, helping to protect us all:
The Information Systems Security Association UK Chapter (ISSA-UK) provides educational forums, publications and peer interaction opportunities that enhance the knowledge, skill and professional growth of its members. ISSA-UK is a founding member and primary supporter of ISAF.
The British Computer Society (BCS) is the UK's chartered professional body for those working in IT, promoting the study and practice of computing and supporting professionals across the industry.
The Communications Management Association (CMA) is the UK’s premier independent membership body for professionals and organisations focused on exploiting communications, networking and ICT, for business advantage.
The Cybersecurity Knowledge Transfer Network provides a single focal point for UK cyber-security expertise, runs special interest groups and organises events.
EURIM brings together politicians, officials and industry to help improve the quality of policy formation, consultation and implementation.
Get Safe Online is sponsored by the British Government and leading businesses to provide free, objective advice.
The Institute of Information Security Professionals (IISP) is setting the standard for professionalism in information security, speaking with an independent and authoritative voice.
The Information Technologists' Company is made up of senior IT professionals who have joined the Company in order to give something back to the IT sector and the wider community.
The Information Assurance Advisory Council (IAAC) aims to work for the creation of a safe and secure Information Society. It is a unique, not-for-profit body with high-level support from government and industry, backed by world-class research expertise.
The Institution of Engineering and Technology (IET) provides a global knowledge network to facilitate the exchange of ideas and promote the positive role of science, engineering and technology in the world.
The Information Security Forum (ISF) delivers practical guidance and solutions to overcome wide-ranging security challenges impacting business information today.
The Information Systems Audit and Control Association (ISACA) is a recognised worldwide leader in information technology (IT) governance, control, security and assurance.
ASIS International is dedicated to increasing the effectiveness and productivity of security professionals by developing educational programs and materials.
Infosecurity Europe addresses today’s strategic and technical issues in an unrivalled education programme and showcases the most diverse range of new and innovative products and services from over 300 of the top suppliers on the show floor.
(ISC)² is the globally recognised Gold Standard for certifying information security professionals throughout their careers.
The Jericho Forum is an international IT security thought-leadership group dedicated to defining ways to deliver effective IT security solutions.
The International Underwriting Association of London (IUA) is the world's largest representative organisation for international and wholesale insurance and reinsurance companies.
The Security Awareness Special Interest Group (SASIG) is a subscription-free quarterly networking forum open to those who have an interest in, or a responsibility for, raising awareness about security within their organisations.
The National Computing Centre (NCC) has pioneered a methodology for managing the 'human vulnerabilities' in information systems.
The National e-Crime Prevention Centre (NeCPC) is a multidisciplinary and multi-agency network and currently a virtual centre of excellence in e-Crime prevention and enterprise security.
The Police Central E-Crime Unit is a centre of excellence in regard to computer and cyber crime committed under the Computer Misuse Act 1990, notably hacking, maliciously creating and spreading viruses and counterfeit software.
The organisation, EEMA – the European association for e-identity and security – brings together over 135 member organisations (and over 1,500 employees of member organisations) in a neutral environment for education and networking purposes.
For further information visit www.theisaf.org
WorkLight Announces Strategic Relationship with Microsoft to Bridge Microsoft Office SharePoint Server 2007 with Consumer Web 2.0
New product to allow employees to use Windows Live, iGoogle, Facebook, and other popular Internet tools to interface with Microsoft’s enterprise collaboration suite
BOSTON – June 9, 2008 – WorkLight™ Inc. today launched a new product that allows employees, for the first time, to securely access and update Microsoft Office SharePoint Server 2007 information using popular Internet consumer tools. Called WorkLight for SharePoint, the new offering combines the consumer web experience with Office SharePoint Server 2007 to deliver secure, enterprise-class collaboration through the easy-to-use interfaces of familiar tools like Windows Live, iGoogle, Facebook, and others.
This announcement, made at the Enterprise 2.0 Conference in Boston as part of a new strategic relationship between WorkLight and Microsoft, marks yet another milestone in the introduction of consumer Web 2.0 technologies into the corporate world.
WorkLight for SharePoint addresses both enterprise and employee needs by combining the collaborative capabilities of Office SharePoint Server 2007 with popular consumer social networks such as Facebook, personalized homepage gadgets, desktop widgets, RSS, mobile devices and more. Companies can now take advantage of SharePoint Server 2007’s collaboration, portal, enterprise search, enterprise content management, business intelligence and business process capabilities using popular consumer web-based gadgets, the same type of gadgets employees use in their personal lives.
“By bridging SharePoint's social computing and enterprise search capabilities with popular consumer Web 2.0 services, employees and enterprises benefit from the best of both worlds," said Deb Bannon, senior product manager for Microsoft’s SharePoint Server Partner Group. "WorkLight's latest offering enhances SharePoint's highly popular collaboration platform by extending its reach to familiar consumer tools such as Windows Live and others. We are excited to work together with WorkLight in allowing employees and partners to do business with Web 2.0 securely."
With WorkLight for SharePoint, companies and employees are able to:
- Access SharePoint Server 2007 information such as document updates and contact information via consumer web-based and desktop gadgets, running on a wide variety of platforms – e.g. Windows Live, Vista Sidebar, iGoogle, Netvibes, etc., or from within Facebook.
- Receive notifications originating from SharePoint Server 2007 on mobile devices such as iPhone and BlackBerry.
- View and revise enterprise application data from secure gadgets running as SharePoint Server 2007 Web Parts, enabling employees to perform daily tasks like time reporting, task acknowledgement and purchase approvals right from their SharePoint Server 2007 dashboard.
- Search for colleagues and expertise in the organization across both SharePoint Server 2007 and Facebook, using WorkLight’s secure Facebook application, WorkBook.
WorkLight will be demonstrating WorkLight for SharePoint at the Enterprise 2.0 Conference in Boston, June 9-12, 2008 (booth # 618). To learn more about WorkLight for SharePoint, visit www.myworklight.com/sharepoint.
WorkLight™ Inc. develops and markets a line of server products that allow organizations to do more business securely using popular consumer Web 2.0 tools and technologies like iGoogle, Windows Live, Netvibes, and Facebook. Through WorkLight, employees, channels, partners, and consumers connect to protected enterprise data (and to each other) using Web 2.0 services.
WorkLight is a venture-backed company with offices in Yakum, Israel and Boston. WorkLight has received prestigious industry accolades including being selected as a Winner of the Red Herring 100 Europe 2008 Award, named as one of the "Five Enterprise 2.0 Startups to Watch," by InformationWeek magazine, selected as part of CIO Magazine's Web 2.0 Product Suite and singled out with an honourable mention as one of Computerworld's "10 Cool Cutting-edge Technologies on the Horizon." For more information, visit www.myWorkLight.com.
Following a successful trial, law firm Mishcon de Reya has deployed Solcara’s KnowHow and SolSearch software products within its UK business.
Mishcon chose Solcara over other products due to its desire for effective knowledge management and enhanced internal communications, and its need for a solution that fitted in with its existing IT systems.
Caren Gestetner, the Partner at Mishcon de Reya with responsibility for Knowledge Management commented on the deal: “As a rapidly growing law firm providing a diverse range of legal services to businesses and individuals, effective knowledge sharing is key to our continued success. We were looking for a knowledge management platform which would facilitate retrieval of knowledge and enhance internal communications in line with our growth. We were attracted to Solcara for its federated search capability, enabling fee earners to retrieve materials through a single search of internal and external materials. Since our new intranet launched, feedback within the firm has been universally positive, with both lawyers and support staff finding it easy to use.”
Solcara’s MD Rob Martin added:
“Knowledge management is of vital importance to legal firms, given the volume of information passing their way. Any firm that wants to be at the cutting edge and have an advantage over its competition these days needs access to the best tools for the job. In the world of law, speedy access to resources can make all the difference, and Solcara’s KnowHow and SolSearch give distinct advantages over competitors.”
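Federated search of the kind Gestetner describes, one query fanned out to several internal and external collections with the hits merged into a single ranked list, can be sketched roughly as follows. The source names, scores and merge logic here are illustrative assumptions, not Solcara's actual SolSearch implementation.

```python
# Minimal sketch of federated search: run one query against every
# registered source, tag each hit with its origin, and merge the
# results into a single relevance-ranked list.

def federated_search(query, sources):
    """Query every source and return one merged, score-ranked hit list.

    `sources` maps a source name to a callable that takes the query
    string and returns a list of hit dicts, each with a "score" key.
    """
    merged = []
    for name, search in sources.items():
        for hit in search(query):
            merged.append({"source": name, **hit})
    # Rank the combined hits by each source's own relevance score.
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)


# Two stand-in sources: an internal document store and an external index.
internal = lambda q: [{"title": "Client engagement letter", "score": 0.9}]
external = lambda q: [{"title": "Case law digest", "score": 0.7}]

results = federated_search("engagement",
                           {"internal": internal, "external": external})
```

A production system would also have to normalise scores across sources, since each backend ranks on its own scale; the simple sort above assumes comparable scores.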