Now incorporating 112 Review, Community Safety Review & Military Review

Monitoring for effective data protection

By Geoff Sweeney, CTO, Tier 3 www.tier-3.com

Recent security breaches in both the private and public sector have highlighted the need for organisations to ensure personal information is processed and stored securely. Ever-growing collections of personal data, more remote access and the prevalence of crimes such as identity theft all create vulnerabilities. It is essential that effective data protection policies and practices are in place, combined with vigilance and strong governance at all levels in all organisations, to ensure data protection is taken seriously.

Individuals expect the Data Protection Act to safeguard their information. At the same time, information security is increasingly at risk. As part of its new data protection strategy launched in March 2008, the UK’s privacy watchdog, the Information Commissioner’s Office (ICO), disclosed its plans to promote the importance of appropriate security, to use its regulatory powers against organisations that neglect their responsibilities in this area and to help individuals to protect their own information.

In May this was reinforced when the Criminal Justice and Immigration Act received Royal Assent, creating tough new sanctions for the ICO. This new legislation gives the ICO the power to impose substantial fines on organisations that deliberately or recklessly commit serious breaches of the Data Protection Act, and represents a step up from the ICO's previous power to simply issue enforcement notices.

This isn’t necessarily the end of the changes, and there may be more regulation to come. Towards the end of May, the European Network and Information Security Agency (ENISA) called for laws tougher than those in the US to force companies to reveal when their computer systems have been breached. In its General Report 2007, the EU's top security body said governments, businesses and consumers are still underestimating the scope of the IT security problem, in part because of the lack of transparency when breaches occur, and that mandatory disclosure of security breaches would be a step towards raising recognition of the seriousness of security threats. In the US, two laws force organisations to publish details of security breaches. One is the California Breach Law (SB1386), which requires organisations doing business in California to tell customers about possible security breaches; similar laws are planned for other states. The second is Sarbanes-Oxley, which obliges executives to keep informed about material aspects of their business, including security breaches.

Whether mandatory disclosure of information security breaches will ultimately be adopted in the UK is not yet known, but advances in IT have clearly made the collection, storage and sharing of all sorts of information easier and available to a wider population. Undoubtedly these advances have enhanced services across many sectors, but they have also increased the challenge of managing and protecting information. The vulnerability of data protection is evidenced almost daily, with costly data leakage incidents regularly impacting individuals and the organisations charged with the custody of their sensitive information.

The connectivity of WANs and the internet means that there are now few barriers to sharing information. The consequence, however, is that organisations can quickly lose control of who is sharing the information, where it is going and whether it is being used appropriately.

With this in mind, the best way for organisations to meet their data protection obligations is to understand the information flows and uses within their business environment. A systematic, risk-based approach is increasingly important: one that matches the organisation's data monitoring and protection capabilities to the risks associated with the loss of information, based on its sensitivity or value and the likely impact of its loss on the individual and the organisation. Security policies, processes and technology are all part of the operational risk management process of identifying, monitoring and controlling information security breaches which may cause highly public exposure for your organisation and its stakeholders.

Increasingly, with the massive data volumes involved, this risk management loop requires the integration of skilled operational staff and competent technology to provide appropriate monitoring and control to ensure the use and movement of confidential information is within policy and adequately protected.

The good news in all this is that the security management process shouldn’t be too onerous and indeed should be part of the overall IT security effort. Technology is available which readily monitors who is accessing information, when and for what purpose. Using data protection systems which employ behavioural analysis, an organisation can easily distinguish between legitimate use of its confidential information and inappropriate usage. One of the most damaging breaches occurs when an authorised user who has “legitimate” access to sensitive information either accidentally or maliciously chooses to misuse or leak that information. A behavioural analysis based security system can detect unexpected or risky data movement even where other systems can’t.

By recording the movement and use of information, a behavioural analysis based security system establishes a profile that incorporates the characteristics of normal system use. By constantly monitoring and profiling user and system activity, the system immediately recognises when information is accessed, changed or shared in an unusual or uncharacteristic manner and alerts the accountable manager for remediation and evidentiary audit purposes. Specific business and policy rules can complement the system to enable early warning of any specific forbidden or unacceptable practices, e.g. theft or fraud.
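The profiling idea described above can be illustrated with a deliberately simple sketch. This is not Tier 3's implementation, just a toy example of the general technique: build a per-user baseline of normal activity, then flag activity that deviates sharply from that baseline. The class name, threshold and z-score test are all illustrative assumptions.

```python
from collections import defaultdict
from statistics import mean, stdev

class AccessProfiler:
    """Toy behavioural profile: flags a user whose daily access count
    deviates sharply from that user's own historical baseline."""

    def __init__(self, threshold=3.0):
        self.history = defaultdict(list)  # user -> past daily access counts
        self.threshold = threshold        # z-score above which we alert

    def record_day(self, user, count):
        self.history[user].append(count)

    def is_anomalous(self, user, count):
        past = self.history[user]
        if len(past) < 5:                 # not enough baseline yet
            return False
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            return count != mu
        return (count - mu) / sigma > self.threshold

profiler = AccessProfiler()
for c in [10, 12, 11, 9, 13, 10, 12]:   # a week of typical usage
    profiler.record_day("alice", c)

print(profiler.is_anomalous("alice", 11))   # within baseline -> False
print(profiler.is_anomalous("alice", 400))  # sudden bulk access -> True
```

A production system would profile far richer features than a daily count (time of day, destination, file sensitivity), but the principle is the same: the alert is triggered by deviation from learned behaviour rather than by a static signature.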

The scale of the task of protecting stored and transmitted sensitive information is undoubtedly growing. The problem for organisations, however, is that their responsibility for information assurance remains unchanged; given the intrinsic risk of storing and sharing data, information owners continue to need ongoing visibility of who is accessing it, for what purpose and where they are taking it. Behavioural-based security monitoring technology gives any organisation the ability to continuously manage and report the status of access to and usage of confidential information.

Source: Eskenzi PR

How to multiply the uses of your Business Continuity Infrastructure

Ian Master, sales and marketing director UK, Double-Take Software

Introduction
Business continuity (BC) infrastructure is typically thought of as a means to get data offsite. IT managers don’t necessarily realise that the BC infrastructure they are building can provide much more. A well-thought-through deployment can provide the ability to move information technology systems anytime, anywhere, for whatever purpose, without interfering with ongoing operations. Whether recovering from a disaster, simplifying routine server maintenance or even migrating whole data centres, a good deployment can provide a dynamic infrastructure that ensures effective business continuity planning as well as making the data centre manager’s life a whole lot easier.

A day in the life of a data centre manager
To state the obvious, data centre managers don’t spend their working lives exclusively worrying about large-scale disasters. Their day-to-day experience is more likely to include managing smaller business continuity and infrastructure issues. How can they maintain full service when they know a shared disk is starting to malfunction and needs to be swapped out? How can they replace a physical server because it is no longer performing optimally? What happens when entire clusters of servers need to be moved because the nodes lack disk or processing ability? What happens if the entire data centre needs to be moved to a different location?

Building a dynamic infrastructure
Data replication solutions, which copy data in real time from one server to another to create a complete duplicate on a live backup system, provide very high levels of data protection and availability. However, data replication is just that; it only protects an application’s data, not the application itself. In the event of a disaster, system administrators will have to hope that all of the application backups are valid and can be restored, because if not, they’ll have no choice but to find the installation disks and sometimes even that isn’t an option. To overcome this, the more sophisticated data replication solutions provide byte-level replication for application system states so that administrators have the ability to provision an entire server at the touch of a button and keep business critical applications up and running.
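The core mechanism behind the replication described above can be sketched in a few lines. This is not Double-Take's implementation; it is a hedged illustration of block-level differencing, the general technique by which a replica is kept current by transmitting only the regions that changed rather than recopying everything. The block size and function names are illustrative assumptions.

```python
import hashlib

BLOCK = 4096  # compare and resend at block granularity

def changed_blocks(src: bytes, dst: bytes):
    """Yield (offset, data) for each block of src that differs from dst."""
    for off in range(0, len(src), BLOCK):
        s = src[off:off + BLOCK]
        d = dst[off:off + BLOCK]
        if hashlib.sha256(s).digest() != hashlib.sha256(d).digest():
            yield off, s

def replicate(src: bytes, dst: bytearray) -> int:
    """Apply only the changed blocks to the replica; return bytes sent."""
    if len(dst) != len(src):  # resize replica to match the primary
        dst[:] = dst[:len(src)] + bytes(max(0, len(src) - len(dst)))
    sent = 0
    for off, data in changed_blocks(src, bytes(dst)):
        dst[off:off + len(data)] = data
        sent += len(data)
    return sent

# Primary and replica agree except for the final 4 KB block.
primary = b"A" * 8192 + b"B" * 4096
replica = bytearray(b"A" * 8192 + b"X" * 4096)
sent = replicate(primary, replica)
print(sent)                          # only the one differing block is resent
print(bytes(replica) == primary)     # replica now matches the primary
```

Real products intercept writes continuously at the byte level rather than scanning after the fact, and replicate application system state as well as data, but the bandwidth saving comes from the same principle: ship the delta, not the dataset.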

Another tool used to reduce hardware costs and manage infrastructure more flexibly is virtualisation. Virtualisation provides data centre managers with the ability to move servers “dynamically” to a different virtual machine where more processing power or disk space may be available. However, the process of moving virtual machines is limited to the virtual infrastructure, and sometimes only to the same physical server where the technology is hosted. By combining data replication that moves both data and the application system state with virtualisation, WAN accelerators, operational monitoring and security tools, you now have the ability to protect and dynamically manage your entire data centre, regardless of the situation.

Dynamic infrastructure in operation
Using host-based replication allows you to replicate data and operating systems, independent of hardware and in real time, while systems are still in production. Administrators are able to replicate from a physical to a virtual environment or vice versa, physical-to-physical or virtual-to-virtual, all while end users are accessing the data.
Data centre managers are using dynamic infrastructures to move entire data centres without end users even being aware, easing operational management as well as meeting the most stringent business continuity requirements. If a server is in need of maintenance, the data centre manager isn’t committed to a 2.00 am Sunday change control window just to tweak a configuration setting or perform a reboot. The operation of that server is dynamically moved to another without interruption, allowing the technician to take as long as needed to perform maintenance or repair that server. Maybe the part from the vendor won’t be available for 10 days? Operations continue uninterrupted, and the maintenance window can be scheduled whenever it is convenient.

Conclusion
If you have the ability to move systems anywhere, anytime, for whatever reason, without interruption to users, you have just exceeded a rather large piece of your company’s business continuity requirements and, more importantly, maximised data centre uptime. Dynamic infrastructures are providing the ability to restore business operations after a disaster not only to a functional level but also to the level of service that your end users expect, as well as providing the ability to seamlessly manage data centre operations.

Source: StoragePR