Business Intelligence Blog from arcplan
27 June 2012

Evaluating Data Quality Improvement Technologies – Part I (MDM)


The big data phenomenon has driven many organizations not only to expand their analytics initiatives, but also to focus on improving data quality in order to make reliable decisions. After all, what good is a lot of bad data? Aberdeen says that when data is “unhealthy,” it’s “one of the root causes of inefficient, time-consuming business processes [and] inaccurate decisions.”[i]

So what exactly have companies been doing to manage their data and improve data quality? Some have implemented enterprise-scale master data management (MDM) systems to centralize information, and others have implemented extract, transform, and load (ETL) tools to profile and cleanse their data. The size of your company (and your IT budget) may dictate your options for managing your data, but there’s always a way to ensure data quality. Let’s evaluate some of the options out there. Today we’ll tackle MDM.
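(A quick aside on what “profile and cleanse” means in practice, since we’ll come back to it: profiling measures how complete and consistent your data is, and cleansing fixes what the profiling finds. Here’s a minimal Python sketch of both steps; the customer records are invented for illustration, and real ETL tools automate these checks across millions of rows.)

```python
# A hypothetical sketch of profiling and cleansing; the sample
# customer records below are invented for illustration only.
import pandas as pd

customers = pd.DataFrame([
    {"id": 1, "name": "Acme Corp", "email": "info@acme.com"},
    {"id": 2, "name": "acme corp", "email": "info@acme.com"},  # same firm, different casing
    {"id": 3, "name": "Globex",    "email": None},             # missing email
])

# Profiling: measure completeness and uniqueness.
print(customers.isna().sum())                             # null count per column
print(customers["name"].str.lower().duplicated().sum())   # case-insensitive duplicate names

# Cleansing: normalize names, then drop the now-exact duplicates.
customers["name"] = customers["name"].str.strip().str.title()
customers = customers.drop_duplicates(subset=["name", "email"])
print(customers)
```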

As the name suggests, master data management is a huge initiative. Its goal is to create a single, unified view of an organization by integrating data from a number of sources – a centralized master file that contains a single version of the truth. According to Aberdeen’s 2012 report on The State of Master Data Management, most organizations aren’t doing anything crazy when it comes to MDM. In fact, 45% of companies (mostly SMBs under $500 million in revenue) are simply using spreadsheets or individual databases to manage and store their master data. Others (21%) are using internal talent to develop home-grown MDM systems.
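To make “a single version of the truth” concrete, here’s a minimal Python sketch of the matching-and-merging step at the heart of MDM. It’s an illustration only: the source systems, field names, and the “newest non-empty value wins” survivorship rule are assumptions, not any particular vendor’s implementation.

```python
# A minimal sketch of the core MDM idea: match records from multiple
# source systems and merge them into one "golden" master record.
# The systems, fields, and survivorship rule here are hypothetical.
from datetime import date

crm     = {"customer_id": "C-100", "name": "Acme Corporation",
           "phone": None,       "updated": date(2012, 3, 1)}
billing = {"customer_id": "C-100", "name": "ACME Corp.",
           "phone": "555-0100", "updated": date(2012, 6, 1)}

def merge_golden_record(records):
    """Survivorship rule: for each field, the most recently updated
    non-empty value across all source records wins."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if value is not None:
                golden[field] = value  # newer values overwrite older ones
    return golden

print(merge_golden_record([crm, billing]))
# One unified record: billing's newer name and phone survive.
```

Real MDM platforms layer fuzzy matching, data governance workflows, and audit trails on top of this basic loop, which is a big part of why these initiatives grow so large.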

However, MDM is not just a technology-focused effort…
