Business Intelligence Blog from arcplan
27 Jun 2012

Evaluating Data Quality Improvement Technologies – Part I (MDM)


The big data phenomenon has driven many organizations not only to expand their analytics initiatives, but also to focus on improving data quality in order to make reliable decisions. After all, what good is a lot of bad data? Aberdeen says that when data is “unhealthy,” it’s “one of the root causes of inefficient, time-consuming business processes [and] inaccurate decisions.”[i]

So what exactly have companies been doing to manage their data and improve its quality? Some have implemented enterprise-scale master data management (MDM) systems to centralize information, while others have adopted extract, transform, and load (ETL) tools to profile and cleanse their data. The size of your company (and your IT budget) may dictate which options are open to you, but there’s a way to ensure data quality at every scale. Let’s evaluate some of the options out there. Today we’ll tackle MDM.
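Before we do, it’s worth making “profile and cleanse” concrete. The sketch below illustrates, in Python with pandas, the kinds of checks an ETL profiling step automates: null counts, duplicate keys, and malformed values. The table and its columns are invented for the example and don’t come from any particular ETL product.

```python
# Hypothetical customer extract; a real profiling step would run checks
# like these against every table flowing through the pipeline.
import pandas as pd

records = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email": ["a@example.com", None, "b@example.com", "not-an-email"],
})

non_null_emails = records["email"].dropna()
profile = {
    "rows": len(records),
    "null_emails": int(records["email"].isna().sum()),               # missing values
    "duplicate_ids": int(records["customer_id"].duplicated().sum()),  # repeated keys
    "invalid_emails": int((~non_null_emails.str.contains("@")).sum()),  # malformed values
}
print(profile)
# {'rows': 4, 'null_emails': 1, 'duplicate_ids': 1, 'invalid_emails': 1}
```

A report like this tells you where cleansing effort is needed before the data ever reaches a warehouse or a master file.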

As the name suggests, master data management is a huge initiative. Its goal is to create a single, unified view of an organization by integrating data from a number of sources into a centralized master file that contains a single version of the truth. According to Aberdeen’s 2012 report on The State of Master Data Management, most organizations aren’t doing anything crazy when it comes to MDM. In fact, 45% of companies (mostly SMBs under $500 million in revenue) are simply using spreadsheets or individual databases to manage and store their master data. Others (21%) are using internal talent to develop home-grown MDM systems.
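To make the “single version of the truth” idea concrete, here’s a toy Python sketch of one common consolidation technique, often called survivorship: when source systems hold conflicting records for the same customer, each field of the master (“golden”) record is taken from the most recently updated source that actually supplies a value. The field names and the recency rule are assumptions for the example; real MDM platforms apply far richer matching and precedence logic.

```python
from datetime import date

# The same customer as seen by two hypothetical source systems.
crm_record = {"customer_id": 42, "name": "ACME Corp.", "phone": "555-0100",
              "updated": date(2012, 3, 1)}
erp_record = {"customer_id": 42, "name": "ACME Corporation", "phone": None,
              "updated": date(2012, 5, 15)}

def merge(records):
    """Build one golden record: newer sources overwrite older ones,
    but only where they actually supply a value."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if value is not None:
                golden[field] = value  # later (newer) records win
    return golden

master = merge([crm_record, erp_record])
print(master)
# The name comes from the newer ERP record, while the phone number
# survives from the CRM, since the ERP record had none.
```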

However, MDM is not just a technology-focused effort…


12 Jun 2012

Poor Data Quality – Part 3: Strategies to Combat It


82% of top performers implement a process for continuous data quality improvement.

– Gleanster Deep Dive: How Top Performers Improve Data Quality for Better Business Intelligence, January 2011

In recent posts, we’ve explored the consequences of poor data quality and also evaluated who should be responsible for maintaining good data within an organization. We’ve seen that there’s no quick fix for subpar data quality; rather, ensuring superior data requires a well-orchestrated team effort. A 2011 Gleanster benchmark report revealed that top-performing companies understand that maintaining data quality is an ongoing discipline requiring constant attention. Organizations successful in the pursuit of better data have implemented strategies like the following to continuously improve:

1. Have a policy in place and take ownership
Organizations may hire a data quality manager as a dedicated resource and the first line of defense against bad data. The data quality manager governs data processes, ensuring that reliable information is loaded into the data warehouse, and is responsible for tasks such as data migration, manipulation, and analysis. Additionally, some BI systems like arcplan allow authorized users to write data back directly to the data source. In that case, it’s the user’s responsibility to enter accurate information and not corrupt the system with erroneous data.
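Because a write-back path puts data entry in users’ hands, applications typically guard it with validation before anything reaches the source. Below is a generic sketch of such a guard in Python using the standard library’s SQLite driver; this is not arcplan’s write-back API, and the table and rules are invented for illustration.

```python
import sqlite3

# Stand-in for the data source a BI tool might write back to.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE forecast (region TEXT PRIMARY KEY, amount REAL)")
conn.execute("INSERT INTO forecast VALUES ('EMEA', 100000.0)")

def write_back(region: str, amount: float) -> None:
    """Validate the user's input before it touches the data source."""
    if not isinstance(amount, (int, float)) or amount < 0:
        raise ValueError(f"invalid amount: {amount!r}")   # reject bad input early
    updated = conn.execute(
        "UPDATE forecast SET amount = ? WHERE region = ?", (amount, region)
    ).rowcount
    if updated == 0:
        raise KeyError(f"unknown region: {region!r}")     # nothing to update
    conn.commit()

write_back("EMEA", 120000.0)   # accepted
# write_back("EMEA", -5) would raise ValueError before corrupting the table
```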

2. Enforce data quality at the source system
“Garbage in, garbage out” is the phrase to keep in mind. Making particular information mandatory in the source system is one way to maintain data quality…
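As a minimal sketch of what “mandatory at the source” can look like, the hypothetical schema below (again using Python’s built-in SQLite) refuses incomplete rows at the database level, so garbage never gets in.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        email       TEXT NOT NULL,                              -- mandatory field
        country     TEXT NOT NULL CHECK (length(country) = 2)   -- 2-letter code
    )
""")

conn.execute("INSERT INTO customer VALUES (1, 'a@example.com', 'US')")  # accepted
try:
    conn.execute("INSERT INTO customer VALUES (2, NULL, 'US')")  # missing email
except sqlite3.IntegrityError as err:
    print("rejected at the source:", err)  # NOT NULL constraint failed
```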
