Business Intelligence Blog from arcplan

Evaluating Data Quality Improvement Technologies – Part II (ETL)


If you’re ready to deploy a business intelligence reporting and analytics solution, then data quality is probably on your mind. Last week, we demystified master data management (MDM) and how it combines technology with consensus-building to improve and maintain data quality. Today let’s review another technology option for data quality initiatives: ETL.

Extract, transform, and load (ETL) tools are widely used to improve data quality. ETL tools first combine data from typically heterogeneous sources into a single repository (the extract phase), transform the data by cleansing and organizing it to optimize reporting and analysis, and then load the cleansed data into a data warehouse or other system, where end users can rely on vetted information. When just starting out, many organizations simply use ETL to cleanse their data – resetting dates or adding default values to empty fields, for example. More advanced data cleansing can mean “deduping” duplicate customer records or parsing a field like “Full Name” into separate “First Name” and “Last Name” fields.
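
To make the transform phase concrete, here’s a minimal sketch of the cleansing rules mentioned above – default values, deduplication and name parsing. The field names and sample records are illustrative assumptions, not the output of any particular ETL product:

    from datetime import date

    # Illustrative customer records pulled from two hypothetical source systems
    records = [
        {"full_name": "Jane Smith", "email": "JANE@EXAMPLE.COM", "signup_date": None},
        {"full_name": "Jane Smith", "email": "jane@example.com", "signup_date": "2012-01-15"},
        {"full_name": "Bob Jones",  "email": "bob@example.com",  "signup_date": "2011-11-02"},
    ]

    def transform(record):
        """Apply simple cleansing rules before the load phase."""
        # Parse "Full Name" into separate "First Name" and "Last Name" fields
        first, _, last = record["full_name"].partition(" ")
        # Normalize the email address so duplicates can be detected reliably
        email = record["email"].strip().lower()
        # Add a default value to an empty date field
        signup = record["signup_date"] or date.today().isoformat()
        return {"first_name": first, "last_name": last, "email": email, "signup_date": signup}

    # "Dedupe" on the normalized email, keeping the first occurrence of each customer
    cleansed = {}
    for rec in map(transform, records):
        cleansed.setdefault(rec["email"], rec)

    for row in cleansed.values():
        print(row)  # rows ready to load into the warehouse

A real ETL tool expresses these rules through its own mapping and transformation interface rather than hand-written code, but the logic is the same.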

Enterprise ETL tools are expensive – Oracle’s Data Integrator costs $23,000 per processor, for example. There are open source and low-cost ETL tools out there, but they’re generally not suitable for enterprise-scale data quality initiatives; lists of these tools and their limitations have been compiled elsewhere.

The advantages of ETL tools vary. Obviously, SAP’s ETL tools offer tighter integration with SAP products, and the same goes for Oracle’s. Some tools are faster to implement and learn than others, and some offer better support and documentation. Beyond advanced data cleansing support, it’s important to carefully consider the following items before purchasing any ETL tool:

Continue reading this post >>


ODTUG KScope 12 Conference Recap – an Oracle Partner’s Perspective


I just returned to arcplan’s headquarters in Germany after spending the last week in San Antonio, TX for KScope, the Oracle Developer Tools User Group Conference. This was our second year in a row at the show and another fantastic experience. It’s 5 days of knowledge sharing for Oracle customers – highly applicable to real-life challenges and professionally delivered by industry experts with hands-on experience. And it’s an event very different from typical vendor-driven user conferences, which tend to deliver more marketing-driven material. Again this year, KScope felt more like “by Oracle users, for Oracle users.”

The conference is an absolute must for Hyperion customers as it provides high-quality content on a wide variety of topics of interest to them, from MDX to Essbase scripting to ASO cubes. Straddling the varied interests of attendees – especially Apex developers versus Hyperion/EPM users – was sometimes awkward, but the various tracks helped people find their niche. While my colleagues represented arcplan at our booth, I attended many sessions and heard first-hand how attendees were benefitting significantly from the knowledge shared and planning to apply it to their own projects. The Sunday symposiums provided a good, compact overview of some key aspects of Oracle database technologies and the future of Oracle planning, data integration, financial reporting and more. Our SVP Dwight deVera was on the agenda this year, presenting his infamous hour on Calculating ROI for Business Intelligence Projects. His session was as well-received as it was at Collaborate earlier this year.

At KScope, the conference atmosphere is very open and communicative. For attendees, it’s a great opportunity to network and engage with organizations and individuals who share your interests. As a vendor, it’s a terrific event for arcplan: we help customers enhance their existing Oracle investments, which is exactly what attendees are there to figure out how to do.

Continue reading this post >>


Evaluating Data Quality Improvement Technologies – Part I (MDM)


The big data phenomenon has driven many organizations not only to increase analytics initiatives, but also to focus on improving data quality in order to make reliable decisions. After all, what good is a lot of bad data? Aberdeen says that when data is “unhealthy,” it’s “one of the root causes of inefficient, time-consuming business processes [and] inaccurate decisions.”[i]

So what exactly have companies been doing to manage their data and improve data quality? Some have implemented enterprise-scale master data management (MDM) systems to centralize information, and others have implemented extract, transform, and load (ETL) tools to profile and cleanse their data. The size of your company (and your IT budget) may dictate your options for managing your data, but there’s always a way to ensure data quality. Let’s evaluate some of the options out there. Today we’ll tackle MDM.

As the name suggests, master data management is a huge initiative. Its goal is to create a single, unified view of an organization by integrating data from a number of sources into a centralized master file that contains a single version of the truth. According to Aberdeen’s 2012 report on The State of Master Data Management, most organizations aren’t doing anything crazy when it comes to MDM. In fact, 45% of companies (mostly SMBs under $500 million in revenue) simply use spreadsheets or individual databases to manage and store their master data. Others (21%) use internal talent to develop home-grown MDM systems.
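
As a rough illustration of what a “single version of the truth” means in practice, here’s a minimal sketch that merges one customer’s records from two hypothetical source systems into a master record, letting the most recently updated, non-empty value win for each field. The field names and survivorship rule are assumptions for illustration only, not a prescription from any MDM product:

    # Hypothetical records for the same customer held in two source systems
    crm_record = {"customer_id": "C-1001", "name": "ACME Corp.",
                  "phone": "555-0100", "updated": "2012-03-01"}
    erp_record = {"customer_id": "C-1001", "name": "ACME Corporation",
                  "phone": None, "updated": "2012-06-15"}

    def merge(records):
        """Build a master ("golden") record: for each field, keep the
        non-empty value from the most recently updated source."""
        ordered = sorted(records, key=lambda r: r["updated"])  # oldest first
        master = {}
        for rec in ordered:
            for field, value in rec.items():
                if value is not None:
                    master[field] = value  # newer sources overwrite older ones
        return master

    print(merge([crm_record, erp_record]))
    # -> one unified customer record that downstream reports can rely on

Even a toy example like this shows why governance matters: someone has to decide which source wins for each field.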

However, MDM is not just a technology-focused effort…

Continue reading this post >>


Real World Cloud BI Webinar Recording


In case you missed arcplan’s webinar with GPX Software on June 19th, A Look at Real World Cloud BI Solutions, here’s the recording to view at your convenience:

In this webinar, we discuss:

  • The challenges of cloud-based BI
  • Cloud economics
  • Services-oriented architecture (SOA) as a way to securely implement enterprise-scale cloud BI
  • Cloud BI success stories, including Xambrosius, GPX Software’s small business planning solution built with arcplan technology
  • And much more

Thanks to everyone who attended, and for those who didn’t, leave us a comment if you’d like to discuss anything you see in the recording.


The 2 Most Common Budgeting, Planning and Forecasting Frustrations


Ask a sales manager what ABC means and the response, often with a smile, will be “Always Be Closing.” This concept makes sense for the sales manager but unfortunately, many finance managers are “always closing” too – and they’re not smiling about it. The budgeting, planning and forecasting (BP&F) cycle is too often the most dreaded time of the year. A recent report from Ventana Research revealed that the BP&F process “typically eats up 10 – 15% of the finance team’s time, and it takes 5 – 10 months to complete the full cycle in large organizations.” So how does the finance team get anything else done if we’re talking about a 10-month process?

Few organizations are insulated from the challenges of financial reporting and planning. The process continues to cause pain because there are many moving parts that finance managers and planners need to balance, and many individuals contributing to the process. We’ve found that many of the finance team’s concerns boil down to the following 2 issues, both of which can be addressed with a dedicated, comprehensive BP&F tool.

1) You’ve outgrown your current process. For some, planning consists of Excel spreadsheets and a notepad. We recently spoke to one $3 billion company that is still managing its planning this way! In a previous post, we looked at some of the indicators that you need to move on from your current BP&F process: multiple versions of the same spreadsheets used by different people, time wasted consolidating spreadsheets rather than analyzing data, and limited visibility. These are all signs that it’s time to graduate to a corporate performance management solution or dedicated BP&F software (like arcplan Edge) – a tool that is centralized, adaptive, and able to deliver the value your business needs. Inefficient data collection and multiple versions of spreadsheets lengthen the BP&F cycle unnecessarily, reducing the value of the entire process.

Continue reading this post >>