As if there has not been enough time and money spent on cleansing and integrating data, along comes master data management promising to deliver a final solution to corporate data problems.
Data is a fact of life in corporations the world over. Unfortunately, so is the fact that most organisations have several large data stores in different parts of the company, often in different formats, which are difficult to collate and use as a single data source for business intelligence (BI), CRM or other requirements.
Beyond the technical problems involved in getting a single view of corporate data, there is also the political problem of telling someone their empire does not own the data in its databases and that it must be available to whoever needs it. There have been numerous attempts to ensure data integrity over the past few years, but all the industry really came up with was a way to collate and clean data, at regular intervals, for a specific function.
Enter master data management (MDM). MDM is touted as the final step, a way in which companies can always ensure they are accessing the latest data no matter which application they are using. But, as with all things technology related, things are not that simple.
Ana Moreira, a systems analyst for Edge Evolve, says: "MDM acts as a central repository for master data. It integrates the data in order to synchronise data across varied system architectures and business functions."
In this definition, master data is data that is always correct, avoiding the embarrassment of approaching customers with the wrong information.
Organisations struggle to reconcile master data, especially when it is maintained in multiple systems - this also leads to data integrity being lost. Moreira highlights the need to consolidate the various databases as the main factor that brought MDM to the forefront. "With MDM it is now possible to share data by propagating the data, with a central repository. When reporting is done, relevant information from the repository can be used. The advantage of MDM is that it can be implemented on existing systems and enhance reporting."
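The consolidation Moreira describes can be sketched in a few lines. This is a minimal, illustrative example, not any vendor's implementation: records for the same customer arrive from two hypothetical source systems, and a simple field-level survivorship rule builds the "golden record" that the central repository would propagate. All system names, fields and values are assumptions for the sketch.

```python
# Master-data consolidation sketch: the most recently updated,
# non-empty value for each field survives into the golden record.
from datetime import date

crm_record = {"id": "C-1001", "name": "J. Smith",
              "email": "jsmith@old.example",
              "updated": date(2022, 3, 1), "source": "CRM"}
billing_record = {"id": "C-1001", "name": "John Smith",
                  "email": "john.smith@example.com",
                  "updated": date(2023, 6, 15), "source": "Billing"}

def golden_record(records):
    """Field-level survivorship: later, non-empty values win."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value:
                merged[field] = value
    return merged

master = golden_record([crm_record, billing_record])
```

Real MDM products apply far richer survivorship rules (trusted-source rankings, match/merge scoring), but the principle of resolving many source records into one authoritative one is the same.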
Estelle de Beer, practice manager at the BI Practice, says there is confusion in the market when it comes to MDM because almost everyone has their own definition of the term. De Beer says MDM is about creating and maintaining a single source of master data for an organisation, from which all applications source their information. Logically, MDM should form part of the integration layer in organisations, through which data 'flows' to ensure one version of the truth is always maintained.
"MDM is an essential component of system integration if we look at the four areas of EIM, namely data quality, data integration, master data management and metadata management," says Christo Nel, services solution manager at Cognos South Africa. "If we look at the four components of EIM, MDM fine-tunes the processes that make information delivery sleeker and more efficient. Once implemented, it eliminates the ongoing process of trying to establish which version of multiple data sites is accurate, thus lowering the ongoing effort to obtain consistent information."
The question of how to implement MDM is a tough one, but one that must be addressed. De Beer says having access to accurate information is more important than ever today as companies find it harder to keep customers. Exposing the poor quality of information your company has on its customers is no way to engender loyalty.
Marianne Vosloo, MD of iCentric, adds that in the BI field particularly, data quality is critical. How can management make decisions based on the intelligence their analytical applications deliver if the original data was incorrect? She adds that MDM is not a silver bullet that will automatically fix dirty data, but it will assist in keeping data quality high over the long term. MDM is therefore not a tool that will replace current data quality tools, but one that will supplement them - as long as the appropriate time, resources and money are made available, which is often a problem.
"It is recognised that with front end BI applications, data quality is always an issue," adds Nel. "A business can have many solutions but if there is no data quality, the systems do not deliver their inherent or intended benefits."
Technology, processes and people
A recent Metadata study found that 80% of companies have no centralised data strategy in place. In parallel, many software vendors are trying to fill this void with packaged MDM systems - a problem looking for a victim. Nicholas van Zeggeren, VP, EMEA Regional Markets at Informatica, says any successful MDM programme requires three components: technology, processes and people.
"These three components make MDM-in-a-box a poisoned chalice that IT departments worth their salt will not go near," Van Zeggeren adds. "Fragmented, outdated and incorrect data can have a severe impact on the business. Data is typically stored in isolated and often broken silos. Greater data complexity and fragmentation is added through the requirements of the different lines of business and, to an even greater degree, by merger and acquisition activities. Business processes traverse the silos but break down when the silos are not synchronised on the same data elements and when certain data elements are only available in some of the silos."
MDM may well close the loop in enterprise data management, he explains, but without reliable, clean data, no amount of process will deliver the business value sought through an MDM initiative.
He also notes that data quality and effective data integration are critical elements of MDM and yet are often trivialised or overlooked. "Many companies either have no data quality and data integration standards or are using hard-coded routines to manage constantly changing business rules and the data generated by them. An effective MDM programme that is likely to touch every data point in the business is dependent on having the right data delivered to the right place at the right time."
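The alternative to the hard-coded routines Van Zeggeren warns against is to externalise quality rules as configuration, so that changing business rules do not require changing pipeline code. A minimal sketch of the idea, with illustrative rule names and fields (not any particular product's API):

```python
# Data-quality rules held as data, not hard-coded checks: each rule is
# a (field, predicate, message) entry the business can maintain.
import re

RULES = [
    ("email", lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
     "invalid email address"),
    ("postcode", lambda v: bool(v and v.strip()),
     "missing postcode"),
]

def validate(record):
    """Return the list of rule violations for one record."""
    return [msg for field, ok, msg in RULES if not ok(record.get(field))]
```

Swapping a rule in or out is then a change to the `RULES` table rather than to every integration job that touches the data - the separation of rules from plumbing that commercial data-quality tools provide at much larger scale.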
Both De Beer and Vosloo agree that quality is a key component of MDM, given that IDC statistics show US companies lose in the region of $616 million annually due to problems related to poor-quality data.
Looking ahead, Attie Taljaard, senior sales consulting manager, applications, at Oracle SA, says in an SOA world we will see MDM become a service that provides the right data to all applications requiring information. Until then, however, companies are going to be stuck with synchronising data across various best-of-breed systems to ensure every application operates with the correct information.
He notes there is no simple tool or application one installs for MDM. It is a process that requires a data model, supported by integration services and synchronisation functionality, not to mention a set of data quality tools. Most important of all, on top of this sits a set of business processes to make the data work for the business and its customers.
MDM is no quick solution to the data quality problem. It is, however, something more companies are showing interest in given the poor state of data in almost every company. Will MDM be the final solution in terms of allowing companies to rely on their data? Probably not, but it will go a long way to improving the errors made far too often because of incorrect information.
$150 million MDM project
UK insurer Prudential spent about $150 million on an MDM project that delivered a large operational customer database providing a single touchpoint for over 13 million customers across numerous products, services and channels. The database is kept current and integrates information from a range of disparate operational systems, with rapid access to data on multiple mainframe systems. Information is delivered quickly, without massive system overhead, because the underlying technology identifies and moves only the small fraction of data that has changed within a given period.
Jeremy Gray, head of architecture at Prudential’s IT division, PruTech, says the programme is projected to achieve an annualised return on investment (ROI) of $48 million over five years “by enhancing our customer responsiveness and satisfaction”. He adds: “Our Data Integration Competency Centre is helping us gain an holistic view of customer information across our business – including data residing on mainframes – while ensuring we can scale our integration efforts as data volumes continue to grow.”