Critical. Authoritative. Strategic.


CBR is proudly produced & published
by Technews
Issue Date: October 2006

Poor data can cost billions

1 October 2006

Getting data under control has become a corporate imperative of the highest order. This is so because of three factors:
* The sheer volume of data generated and stored by organisations, and its exponential annual growth.

* The variable quality of this data.

* The cost of managing it.
The Data Warehousing Institute estimates that poor data quality costs US businesses $611 billion annually in excess postage alone. Worse still, research released independently by Gartner, Forrester Research and BMI-T indicates that the average worker spends more than 30% of their productive time searching for information, at a cost to the business. Compliance requirements further aggravate the consequences of inferior data. Penalties for non-compliance, whether financial or legal, can be severe.
Organisations strive for data that is accurate, clean, reliable and consistent, a single version of the truth, by using a combination of solutions that perform data cleansing and extraction, transformation and loading (ETL), among other functions.
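The cleansing and ETL work described above can be pictured as a simple pipeline: pull records from a source, cleanse and normalise them, then load them so each entity has only one record. The following is a minimal illustrative sketch; every name and cleansing rule in it is hypothetical, not a reference to any particular product.

```python
# Illustrative extract-transform-load sketch; all names and rules are hypothetical.

def extract(rows):
    """Pull raw records from a source system (here, just a list)."""
    return list(rows)

def transform(rows):
    """Cleanse: drop records without an ID, trim whitespace, normalise case."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # unusable record, rejected by cleansing
        cleaned.append({
            "id": row["id"],
            "name": row.get("name", "").strip().title(),
        })
    return cleaned

def load(rows, warehouse):
    """Upsert by ID so the target holds a single version of each record."""
    for row in rows:
        warehouse[row["id"]] = row
    return warehouse

warehouse = {}
raw = [
    {"id": "C1", "name": "  acme trading  "},
    {"id": None, "name": "orphan record"},   # rejected: no ID
    {"id": "C1", "name": "Acme Trading"},    # duplicate collapses on load
]
load(transform(extract(raw)), warehouse)
```

After the run, the warehouse holds one cleansed record for customer C1: the duplicate and the orphan record have been reconciled away, which is the effect the combined cleansing and ETL solutions aim for.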
Often disparate, these systems are costly from several perspectives. Maintenance costs are high because each system is implemented with its own service level agreements, and each requires specialised skills, which increases human resource costs. The result is a technology infrastructure that is complex, burdensome, inflexible and often only partly effective.
For this reason, organisations are standardising. In so doing, they are able to integrate transaction processing and analytical systems, support expanding user bases and reduce supplies, maintenance and licensing requirements. All of this helps reduce costs.
To unlock data's true potential, organisations must apply master data management. IDC's research reveals this is best undertaken by the organisation via a policy hub. The hub collects master data from analytical and transactional systems and manages it centrally.
Essentially, master data management is a collaborative platform on which to coordinate decisions on master data reconciliation and rationalisation. It synchronises business performance management master data, such as business dimensions, reporting structures, hierarchies, attributes and business rules, across distributed data warehouses, data marts, analytic applications and transaction systems; applies changes established in a central server to each application; and performs version control and change monitoring at the central policy hub.
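The hub behaviour described above, one authoritative copy of the master data, versioned changes, and synchronisation pushed out to subscribing systems, can be sketched in a few lines. This is a hedged illustration only; the class, method and key names are all hypothetical and stand in for whatever transactional and analytical systems an organisation actually runs.

```python
# Illustrative "policy hub" sketch: central master data, version-controlled
# changes, pushed to every subscribing system. All names are hypothetical.

class PolicyHub:
    def __init__(self):
        self.master = {}       # key -> current master value
        self.history = []      # (version, key, old_value, new_value) change log
        self.subscribers = []  # downstream systems kept in sync

    def subscribe(self, system):
        """Register a downstream system (modelled here as a dict)."""
        self.subscribers.append(system)

    def update(self, key, value):
        """Change master data centrally, record the version, push everywhere."""
        old = self.master.get(key)
        self.master[key] = value
        self.history.append((len(self.history) + 1, key, old, value))
        for system in self.subscribers:
            system[key] = value  # synchronise the change to each application

warehouse, crm = {}, {}
hub = PolicyHub()
hub.subscribe(warehouse)
hub.subscribe(crm)
hub.update("region/EMEA", "Europe, Middle East & Africa")
hub.update("region/EMEA", "EMEA")
```

Because every change flows through the hub, the warehouse and CRM always agree on the current value, and the history log supports the version control and change monitoring the article describes.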
The organisation is empowered by this ability to centrally change business information. Managing data centrally means that information pushed to the numerous systems in operation is trustworthy and ultimately usable.
However, master data management goes beyond concerns of data quality. It also encompasses standardisation and accountability: the right business owners must be responsible for their data and for the master data derived from it.
Clean, good quality data enables people in an organisation to make quicker, better decisions. It enables the better use of resources and greatly improves customer service. Business analyses can be done more accurately and speedily, and forecasts can be projected with greater confidence.
For more information contact Tanya Furniss, Intellient, +27 (0) 11 607 8200.
