
The new economics of data: An imperative for change


A dramatic change in business requirements, coupled with the regulatory upheaval embodied by GDPR, is taking IT organisations to a tipping point: they must take a far more proactive approach to understanding and managing their data.

The current state of data in the enterprise

For years, encouraged by falling storage costs, IT has taken a relatively low-touch approach to the management of data, with most organisations taking the view that keeping data is less expensive than managing it proactively on the basis of business value or legal and regulatory need. This was mirrored by a relatively low-key approach to data privacy regulations.

Over time, IT has created numerous copies of data for protection and governance purposes, and the volume of copies used to gain insights from data has grown in parallel. IDC research on copy data management shows that 45-60% of total storage capacity consists of 'copy data', whilst 82% of the organisations surveyed hold at least 10 copies of each database.
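To make those IDC proportions concrete, a back-of-envelope calculation (using a hypothetical 500 TB estate, not a figure from the research) shows how copy data can come to dominate capacity:

```python
# Illustrative estimate based on the IDC copy-data figures above.
# The 500 TB estate size is a made-up example, not survey data.

def copy_data_estimate(total_capacity_tb: float, copy_share: float) -> dict:
    """Split total storage capacity into copy data and production data."""
    copy_tb = total_capacity_tb * copy_share
    return {
        "copy_data_tb": copy_tb,
        "production_tb": total_capacity_tb - copy_tb,
    }

# Midpoint of IDC's 45-60% copy-data range applied to a 500 TB estate:
print(copy_data_estimate(500, copy_share=0.525))
# {'copy_data_tb': 262.5, 'production_tb': 237.5}
```

At the midpoint of the IDC range, copy data in this sketch outweighs the production data it duplicates.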

Test, development, business continuity, operational recovery and analytics have all spawned multiple copies of data, each with its own discrete set of supporting infrastructure. Of late, the shift to digital business has increased the requirement for the business to inspire high levels of customer trust combined with a regular supply of fresh business insights. For IT, that means continuous service delivery, which relies on copies of data for failover and recovery, and data lakes, which combine numerous sources of data for analytics.

Traditionally, IT has taken the position that consolidating data operations (backup/recovery, archive and snapshot management) is too costly and risky. That argument is much harder to make when the data management requirements of GDPR and the demands of digital business are taken together. Today, for many organisations, the cost and risk of not acting may well be higher than the risk of acting.


GDPR changes the rules

Most organisations now have high volumes of loosely managed data that includes a large number of copies and replicas. Unstructured data (such as files and email) is particularly problematic because, unlike databases, it is not indexed. This makes both the data classification and the privacy impact assessments required for GDPR extremely challenging to conduct.

GDPR’s strict breach notification rule means that, within 72 hours, organisations must determine the nature of a breach, its scale, who has been affected and how it occurred. A rising tide of cyber-attacks means the risk of a breach is significant and increasing, and tighter data management is an essential part of any plan to remediate those risks.

Rebalancing for scale

The phenomenon of large volumes of relatively loosely managed data in the data centre also presents a barrier to scaling digital projects and initiatives. Gartner research on “2018 CIO Agenda: Mastering the New Job of the CIO” shows that once digital initiatives have been delivered, but before they can be ‘harvested’, they need to scale.

The report goes on to identify resources as one of the largest barriers to scale, cited by 23% of respondents, with reallocation of resources as part of IT portfolio management being largely focused on infrastructure and data centre; 30% of organisations expect to reduce investment there. Reducing unnecessary copies of data, and the infrastructure used to support them, can contribute significantly to IT’s ability to rebalance the portfolio towards BI/analytics and cloud services, the most popular investment areas to shift to.

Mind the gap

The increasing importance of data and data-related skills needs to be taken into account in planning. A recent survey of IT staff shows that data management and analysis are among the top scoring priorities. The survey also shows that whilst data is changing how IT departments function, two-thirds of respondents felt insufficiently prepared for key data challenges.

GDPR as a catalyst for change

Instead of treating GDPR as an investment with little in the way of a return other than compliance, it is more fruitful to build a plan that addresses the demands of GDPR whilst simultaneously equipping the company with a better set of data capabilities. As one CIO said at Gartner Symposium 2017: “I do not see GDPR as a problem, I see it as a catalyst for change”.

Taking a shared services approach to data across the spectrum of business requirements (protection, governance and use), rather than tackling them discretely, is likely to result in a set of capabilities that will serve the organisation more broadly and assist with digital transformation.

Below is a simple five-point plan for that shared services approach, using GDPR as the catalyst for change.

1. Take the GDPR requirements for data management as both an imperative and a catalyst for change: Minimisation, retention, accuracy and integrity/confidentiality are well understood in data management, albeit viewed there through a slightly different lens. These are also four of the six data protection principles of GDPR; the other two are lawfulness, fairness and transparency, and purpose limitation. As part of the GDPR plan, you can calculate the cost savings that may result from the tighter data management regime and better data management capabilities that GDPR requires.

2. Plan to consolidate data operations as part of the GDPR plan: As with the previous step, seek and calculate the savings that can be obtained from the consolidation of backup/recovery, archive/managed retention, snapshot and replica management, which can be significant.

3. Introduce a formal copy data strategy as part of GDPR planning: This should include an element designed to reduce current volumes of copy data and its supporting infrastructure, focused both on GDPR best practice and on reducing data centre, storage and infrastructure costs. Calculate the cost savings that may result from copy data reduction. As copy data volumes have proliferated to the point where they often exceed production volume, there are significant economies to explore.
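The savings calculation in step 3 can be sketched as simple arithmetic. The volume, reduction target and per-terabyte cost below are illustrative assumptions, not figures from the article:

```python
# Hypothetical savings model for copy data reduction (step 3).
# All three inputs are example assumptions for illustration only.

def copy_reduction_savings(copy_tb: float, reduction: float,
                           cost_per_tb_year: float) -> float:
    """Annual storage cost avoided by eliminating a fraction of copy data."""
    return copy_tb * reduction * cost_per_tb_year

# 260 TB of copy data, a 40% reduction target, £200 per TB per year:
print(copy_reduction_savings(260, reduction=0.40, cost_per_tb_year=200))
# 20800.0
```

The same function can be re-run with an organisation's own capacity and cost figures to size the opportunity.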

4. Review the additional use cases that can be added post consolidation to achieve economies of scale and scope with your data, particularly in deriving value from it along with its protection and governance:

  • Test and development
  • End user search
  • End user self-service recovery
  • Analytics – operational, risk and business

Seek and calculate the savings and potential contributions to revenue that can be obtained from the additional data platform use cases. Enabling self-service search and recovery, for example, reduces the internal IT burden by up to 47%.

5. Model the projected cost savings over time and align with your portfolio rebalancing. This will help shift the IT investment from the data centre towards BI/analytics, data science, cloud services, digital marketing and other investments that are geared to transforming the business and growing revenue.
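The multi-year modelling in step 5 can be sketched with a ramp of realised savings. The run-rate saving and ramp fractions below are made-up inputs for illustration:

```python
# A minimal multi-year savings model (step 5). The full-run-rate saving
# and the year-by-year ramp fractions are hypothetical assumptions.

def cumulative_savings(annual_saving: float, ramp: list) -> list:
    """Cumulative savings where ramp[i] is the fraction realised in year i."""
    total, out = 0.0, []
    for fraction in ramp:
        total += annual_saving * fraction
        out.append(total)
    return out

# £100k full-run-rate saving, realised at 30%/70%/100% over three years:
print(cumulative_savings(100_000, [0.3, 0.7, 1.0]))
# [30000.0, 100000.0, 200000.0]
```

The cumulative figures give the budget available for reallocation towards BI/analytics, data science and cloud services in each planning year.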

GDPR can provide a catalyst for change, resulting in a new set of data capabilities vital to securing the future of the organisation. Taking GDPR as the starting point and working through a plan designed to serve the wider needs of the organisation makes good sense. By encompassing the use of data as well as its protection and governance, such a plan will stand the IT organisation in good stead as it supports the shift to digital business.

Nigel Williams, Senior Director of Marketing for EMEA at Commvault
