COMPUTER BUSINESS REVIEW

Critical. Authoritative. Strategic.


CBR is proudly produced & published
by Technews
www.technews.co.za
Issue Date: October 2006

Dishing the dirt


Data quality is more a business issue than a technology issue. Madan Sheina explains why pushing it onto the boardroom agenda is key.

We have all heard of the embarrassing business gaffes: Argos selling a TV/DVD combination for £0,49 ($0,89) online instead of its retail price of £350 ($635), or an Indiana county wrongly recording a $150 000 house as a $400m mansion and subsequently being forced to cut its budget by $3,1m to account for the grossly overstated tax revenues.
Ignoring dirty data - inaccurate, incomplete, redundant, conflicting or outdated information - costs companies serious money. Industry estimates put the cost of data quality issues to US firms at up to $1,5 trn per year. The damage can stem from something as innocuous as triplicate credit card pitches mailed to existing customers, or from misguided incentive programs, failed data warehousing and business intelligence projects, unrealistic business plans or budgets, and lost customers.
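The triplicate-mailing example comes down to duplicate records that simple matching can catch. The Python sketch below shows the kind of key-normalisation matching that commercial data quality tools automate with far greater sophistication; the field names and normalisation rules are illustrative assumptions, not any vendor's actual logic.

    import re

    def normalise(record):
        # Collapse a record to a comparable key: lower-case, strip punctuation and spaces
        key = f"{record['name']} {record['postcode']}"
        return re.sub(r"[^a-z0-9]", "", key.lower())

    def find_duplicates(records):
        # Group records whose normalised keys collide
        groups = {}
        for rec in records:
            groups.setdefault(normalise(rec), []).append(rec)
        return [g for g in groups.values() if len(g) > 1]

    customers = [
        {"name": "J. Smith", "postcode": "SW1A 1AA"},
        {"name": "j smith",  "postcode": "sw1a1aa"},   # same customer, different formatting
        {"name": "A. Jones", "postcode": "EC1A 1BB"},
    ]
    print(find_duplicates(customers))  # the two Smith records collide

Real tools go further, with fuzzy matching to catch misspellings that exact keys miss, but the principle is the same.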
Cleaning up dirty data can be an equally costly endeavour. A recent survey of information workers in Europe by Harris Interactive shows that respondents spend as much as 30% of their week verifying the accuracy and quality of their data. The productivity loss translates to a cost of over $400 000 per week by some estimates.
Bad data is not new: it has always been a fact of corporate IT life. But the urgency in dealing with the problem has never been greater. The sheer proliferation of data spread across a fragmented and often siloed IT infrastructure, coupled with increased regulatory compliance (Sarbanes-Oxley and Basel II) and customer data integration initiatives, means that organisations are now being pressured to deliver better and richer data quality at an enterprise-wide level.
Forrester Research estimates that spending on data quality software and services will surpass $1bn by 2008. That is a lucrative opportunity for data quality software vendors, and there is an abundance of data quality technologies on the market today to automate the clean-up. But there is no single appliance that can rid a company of poor data in one fell swoop.
Many companies are learning the hard way that a tactical, reactive approach to dealing with bad data on a project-by-project basis is not good enough. What is needed is a more strategic investment in information quality that is backed by a lasting enterprise-wide commitment.
Corporate discipline
Data quality can appear so overwhelming that many businesses do not know where to start, and put off even trying. While creating a scalable, yet not overly complex, technology configuration can be a challenge, Ed Wrazen, vice president of international marketing at Trillium Software, argues that "data quality is not a technical problem as much as it is a business issue."
“[Data quality] requires companies to treat ‘quality’ as a core component of the information lifecycle.” - Ed Wrazen, Trillium Software vice president of international marketing
That might seem strange coming from a data quality software vendor. But he is right. Software automation is an important piece of the data quality puzzle. But it can only go so far. "Above all it requires companies to treat 'quality' as a core component of the information lifecycle, just as they would for the products and services they deliver to their customers," Wrazen says.
Equally, if not more, important is establishing the right strategic foundation. Jane Griffin, a technology integration principal at Deloitte Consulting, believes that data quality is a corporate discipline that companies have to infuse at all levels of the organisation. "No large-scale initiative, IT or otherwise, can succeed without addressing the human and cultural factors," she says.
Jane Griffin, Deloitte Consulting technology integration principal
To implement data quality successfully requires tearing down both mental and organisational roadblocks. "It is a mindset thing; getting companies and departments out of a state of denial is the first step," says Wrazen.
Griffin argues that tackling data quality as a process issue is also key. The problem she sees is that too many companies are trying to fix the data when the processes to ensure data quality are the real issue. "Data quality is owned by business and managed by business processes," she says.
Nigel Turner, head of the data quality program for UK telecoms company BT, concurs with this view and sees a symbiotic relationship between the two. "Poor processes produce bad data. Poor data produces inefficient processes."
Part of the problem stems from the approach being used. "A lot of what we see is focused on tactical fixes, typically targeting an isolated application," says Laurie Mascott, CEO of data quality company Datanomic. "It is like having dirty windows; you do not clean them every day but only when they get too bad. The same is true of data. The question is when they will get dirty again."
But with corporate information becoming more multifaceted and serving a variety of business purposes in a number of operational, analytic and reporting applications, often simultaneously, Wrazen says that companies "need to think more broadly than delivering data just for a particular [application] silo. The 'single customer view' that businesses often want obscures the complexity of business purposes that such data must serve."
Flexible integration
Companies need to take a smarter enterprise-wide approach. "That requires data quality tools to be flexible enough to be embedded as part of any business process," Mascott says. It is also where technologies such as master data management (MDM) come into play, according to Patrick Connolly, product marketing manager at IBM's information integration solutions unit.
"MDM has been a lightning rod for raising awareness that data quality programs are part of broader CRM, ERP and enterprise integration efforts, not just name and address cleansing." He sees many companies embracing data quality "for the simple reason that their big, expensive IT project does not work".
Wrazen believes that companies have to set up pervasive and ongoing processes to ensure that data is treated as a valuable business asset across the enterprise.
"The old 'fix it and forget' process applied to mailing lists does not cut it anymore. Data quality has to become a practice where the solutions are applied in many different ways in order to build a cohesive and strong foundation."
BT's Turner concurs and likes to think of cleansing and enterprise data quality as synonymous. "At BT we do not do any data cleansing unless it is part of a continuous process."
Data quality processes are usually absent within a company because there is no obvious person to take responsibility for data quality issues across the entire enterprise. "Ownership of the problem and doing something about it can be a sticky issue," says Griffin.
One of the biggest obstacles is the business side of the organisation viewing data quality as an IT problem. "Most organisations have relinquished the responsibility for data quality to IT," she says.
Who is responsible?
A recent Gartner survey of 600 executives reinforces this view: nearly 50% think that IT is responsible for their organisations' data quality.
The problem, according to Wrazen, is that IT does not necessarily understand the issues involved. "Historically, IT has only cared about keeping the systems' lights on and not the quality of information content that flows through them."
But that is misleading, according to Griffin. "Rather than looking at data quality as a chore, a business has to accept the fact that it has prime responsibility for data quality and manage data as another business asset," says Griffin. "And while IT may not own the business processes that create dirty data, it can make the business case to change those processes to improve data quality."
Despite the powerful business case in its favour, data quality can seem so overwhelming that it is hard to get the corporate-wide buy-in needed for an effective data quality improvement program.
According to Rebecca Clayton, director of marketing at data cleansing specialist QAS, fewer than 35% of companies treat data quality as a board-level responsibility. "Often this responsibility is spread across the organisation even when there is a pressing need for a corporate data strategy at board level."
"You cannot do this without the right level of executive sponsorship that sets the tone by promoting information quality as a strategic imperative," says Griffin. This means setting up broader governance programs across the enterprise to instill corporate discipline and best practices on how information can be managed and used across the organisation.
Data governance requires establishing formal sets of business processes and policies to ensure that data is handled in a consistent way, including standard definitions for data elements and categories that are applied globally across the organisation, as well as metrics for measuring data quality.
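To make the metrics idea concrete, here is a minimal Python sketch of the kind of scores a governance programme might track over a customer table; the fields, rules and dimensions (completeness and validity) are illustrative assumptions rather than a prescribed standard.

    import re

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def quality_metrics(records):
        # Two simple data quality dimensions: completeness and validity
        total = len(records)
        complete = sum(1 for r in records if all(r.get(f) for f in ("name", "email")))
        valid = sum(1 for r in records if EMAIL_RE.match(r.get("email", "")))
        return {"completeness": complete / total,  # all required fields populated
                "validity": valid / total}         # email address is well-formed

    rows = [
        {"name": "J. Smith", "email": "j.smith@example.com"},
        {"name": "", "email": "not-an-email"},
    ]
    print(quality_metrics(rows))  # {'completeness': 0.5, 'validity': 0.5}

Tracked over time, even crude scores like these give a governance committee a way to quantify deviations from the standard and to see whether process changes are working.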
"The governance committees take responsibility for the definition of standards, understand deviations from the standard and put in place programs to change behaviour and processes," Griffin says.
Since data quality is everyone's problem, data governance should cut vertically across the chain of organisational command and across discrete project and functional areas.
An important element is to build in accountability by appointing individuals as both stewards and champions of data quality processes in their domains. The goal is to encourage individuals and departments to think and behave differently with regard to data quality, and to reinforce the importance of their data quality responsibilities on an ongoing basis.
While every stakeholder in a company has a role to play in data quality, Griffin singles out the collaboration between influential decision makers in the office of finance and CIOs as being especially critical.
"Data quality issues have a significant financial impact. And since finance departments are the aggregators and interpreters of most operational data they have a critical knowledge about business and analytical requirements," he says. But forging such a relationship can be tricky.
"There can be a lot of finger pointing here. CIOs sometimes forget they have the word information in their title and are preoccupied with applications and technologies."
"Data quality does not happen overnight. It is a long journey and the costs can mount up," says Griffin, who has worked with clients over several months just to get the initiative up and running. "I have seen projects where clients are spending $20m implementing SAP and $10m on data quality and master data management over several years."
Wrazen also acknowledges that data quality does not come cheaply, "but the cost of not doing it can be just as, if not more, expensive." There are big savings to be made in the long run, as BT found when it embarked on its data quality initiative nearly a decade ago. "Businesses have to be in it for the long haul," says Turner, who has used his internal experience with over 50 data quality projects at BT to set up an external consultancy business.
Weighing the evidence
While acknowledging that data quality is a top-down driven initiative, Turner also warns companies "not to get too strategic, too quickly" and try to implement it as a 'big bang'. Rather, he suggests companies prove the ROI on smaller line-of-business CRM and business intelligence projects. "You can then use that to build up a compelling business case for data quality and get the necessary buy-in at the highest levels of the organisation."
CBR opinion
Improving data quality is a tall order for companies and cannot be done simply by implementing off-the-shelf software. Data quality does not come cheaply, but the cost of not improving it can be much worse for the business. Piecemeal data quality technology implementations and projects can only go so far. Companies first need to win the necessary political, cultural and financial buy-in, articulate data quality's importance to the business, and lay a lasting strategic foundation for enterprise data quality. Above all, a commitment at all levels of the organisation to treat data like any quality product or service is essential to success.

