Business Intelligence Blog from arcplan
26 Oct 2012

Teradata PARTNERS User Group Conference 2012 Recap

I just got back from the Teradata PARTNERS Conference in Washington D.C. – once again a great event for learning from experts in the industry, hearing real-world examples of the challenges of managing and leveraging huge data volumes, and networking with fellow Teradata partners and customers alike.

It was my second consecutive year at the event, and what struck me most this year was that the topics have clearly shifted from managing big data to leveraging big data. Obviously, data volumes are exploding due to social media and clickstream data, sensor data, and other sources, and they will only continue to grow. This year’s conference, however, was all about Analytics – how to use that data to drive business benefits. And the conference offered plenty of great examples.

In one of his presentations, Stephen Brobst, CTO of Teradata, described the benefits of collecting weather data around retail stores to determine whether weather conditions have a significant impact on food consumption in the store (e.g. in the deli section). He said combining external weather forecasts with internal operational data and analytical information allows stores to adjust staffing and supplies for a huge impact on the bottom line.
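As a rough illustration of the pattern Brobst described, here is a minimal sketch in Python with pandas that joins a hypothetical external forecast feed against internal store sales baselines to flag days that may warrant extra deli staffing. Every column name, the 80°F threshold, and the 15% demand lift are invented for illustration, not figures from the talk.

```python
import pandas as pd

# Hypothetical external feed: store-level weather forecasts.
forecasts = pd.DataFrame({
    "store_id": [101, 101, 102],
    "date": pd.to_datetime(["2012-10-29", "2012-10-30", "2012-10-29"]),
    "high_temp_f": [84, 62, 79],
})

# Hypothetical internal operational data: average daily deli sales.
deli_baseline = pd.DataFrame({
    "store_id": [101, 102],
    "avg_daily_deli_sales": [4200.0, 3100.0],
})

plan = forecasts.merge(deli_baseline, on="store_id")

# Assumed rule for illustration: hot days (>= 80°F) lift deli demand ~15%.
hot_day = plan["high_temp_f"] >= 80
plan["expected_deli_sales"] = plan["avg_daily_deli_sales"] * (1 + 0.15 * hot_day)
plan["add_deli_staff"] = hot_day
print(plan)
```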

Shaun Connolly, Program Director of Global Industry Solutions at Teradata, described an example of how FedEx was able to save $60 million in staffing per year…

Continue reading this post >>

26 Sep 2012

Careers in Business Intelligence: What Makes BI an Attractive Field

Along with the demand for big data, better data, and greater insight into company operations comes the need for analytic professionals who can effectively leverage this data to maximize business benefits. But it’s not always easy to find and hire these types of people. Within the past few years, we’ve seen titles such as BI Analyst, Data Analyst, Data Scientist, and Big Data Engineer emerge in job listings as companies seek out much-needed expertise to wrangle their growing amounts of information. In fact, McKinsey and Company’s often-quoted 2011 report on big data predicted that by 2018, the US alone would face a shortage of 140,000 to 190,000 people with deep analytical skills, as well as 1.5 million managers and analysts who can analyze big data and make decisions based on their findings. The bright side of this situation is that job seekers with analytic talent and business acumen have a tremendous opportunity to make a positive impact on business teams. BI is a growing field that shows no signs of slowing down, so let’s take a closer look at what makes it attractive to job seekers as well as students deciding what course of study to pursue.

Continue reading this post >>

17 Jul 2012

Mobile BI Strategy Checklist: Part I

At every turn, we’re confronted with the reality that mobile BI is making its mark among modern organizations. Studies are confirming this, with TDWI’s December research report revealing that 70% of participants see mobile analytics as an important part of their company’s BI strategy. Howard Dresner’s Mobile Business Intelligence Market Study found a similar number – 68% see mobile BI as either “critical” or “very important” to their business. And from my own experience with customers and prospects at arcplan, it seems as though everybody is jumping on the mobile BI bandwagon. Before diving head-first into your own mobile BI deployment, lay out a smart strategy that will ensure the project’s success.

Let’s consider the most basic (and important) factors of any organization’s mobile BI strategy: where the money’s coming from, who the project is aimed at, and what kind of BI applications are appropriate for mobile devices.

1. Return on investment
As with any other business project, your mobile BI strategy must have a discernible return on investment in order to get off the ground. In another article, we explored the 5 types of return on investment and the importance of categorizing a business project into one of these buckets. Revenue enhancement is one of the easiest forms of ROI to prove for a mobile BI project. Here’s an example: one of our customers tracks the effectiveness of pharmaceutical sales reps on arcplan-powered dashboards. The data revealed that the average sales call for these reps is only about 3 minutes long, so every second counts. The company instituted a pilot program that switched reps from laptops to tablets, which start up significantly faster, to see if this would have a positive effect on sales. It worked: the switch increased the productivity of the reps in their meetings, allowing them to pull up research studies and email them to physicians more quickly. This responsiveness on the part of the devices (and therefore the reps) led to an average sales call duration increase of over 30%. Consequently, these reps have been able to increase the number of sales for the pharmaceutical company they represent. The pilot program proved revenue enhancement ROI, and stakeholders gladly signed off on the larger project (tablets for everyone!) as a valuable investment.

Continue reading this post >>

5 Jul 2012

Evaluating Data Quality Improvement Technologies – Part II (ETL)

If you’re ready to deploy a business intelligence reporting and analytics solution, then data quality is probably on your mind. Last week, we demystified master data management (MDM) and how it combines technology with consensus-building to improve and maintain data quality. Today let’s review another technology option for data quality initiatives: ETL.

Extract, transform, and load (ETL) tools are widely used to improve data quality. ETL tools first pull data from typically heterogeneous sources into a single repository (the extract phase), then transform the data by cleansing and organizing it to optimize reporting and analysis, and finally load the cleansed data into a data warehouse or other system, where end users can rely on vetted information. When just starting out, many organizations simply use ETL to cleanse their data – for example, resetting dates or adding default values to empty fields. More advanced data cleansing can mean “deduping” duplicate customer records or parsing fields like “Full Name” into separate “First Name” and “Last Name” fields.
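To make the transform phase concrete, here is a minimal sketch in Python with pandas covering the three cleansing steps just mentioned: filling default values, parsing a “Full Name” field, and deduping customer records. The field names and rules are illustrative assumptions, not the behavior of any particular ETL product.

```python
import pandas as pd

# Hypothetical raw extract with quality problems: a duplicate record,
# an empty field, and a combined "Full Name" field.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "full_name": ["Ada Lovelace", "Alan Turing", "Alan Turing", "Grace Hopper"],
    "country": ["UK", "UK", "UK", None],
})

# Transform: add a default value to empty fields...
clean = raw.fillna({"country": "Unknown"})

# ...parse "Full Name" into separate "First Name" / "Last Name" fields...
clean[["first_name", "last_name"]] = clean["full_name"].str.split(" ", n=1, expand=True)

# ..."dedupe" duplicate customer records.
clean = clean.drop_duplicates(subset="customer_id", keep="first")

# Load: a real pipeline would write this to a data warehouse table;
# printing stands in for that step here.
print(clean)
```

Enterprise ETL tools apply rules like these at scale, adding connectors, scheduling, and monitoring on top.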

Enterprise ETL tools are expensive, with Oracle’s Data Integrator costing $23,000 per processor, for example. There are open source and low-cost ETL tools out there, but they’re not generally suitable for enterprise-scale data quality initiatives. ETLTools.net has compiled a list of them as well as their limitations.

The advantages of particular ETL tools vary. Obviously, SAP’s ETL tools offer tighter integration with SAP products; the same goes for Oracle’s. Some tools are faster than others to implement and learn, and some offer better support and documentation than others. Beyond advanced data cleansing support, it’s important to carefully consider the following items before purchasing any ETL tool:

Continue reading this post >>

27 Jun 2012

Evaluating Data Quality Improvement Technologies – Part I (MDM)

The big data phenomenon has driven many organizations not only to increase analytics initiatives, but also to focus on improving data quality in order to make reliable decisions. After all, what good is a lot of bad data? Aberdeen says that when data is “unhealthy,” it’s “one of the root causes of inefficient, time-consuming business processes [and] inaccurate decisions.”[i]

So what exactly have companies been doing to manage their data and improve data quality? Some have implemented enterprise-scale master data management (MDM) systems to centralize information, and others have implemented extract, transform, and load (ETL) tools to profile and cleanse their data. The size of your company (and your IT budget) may dictate the options you have for managing your data, but there’s always a way to ensure data quality. Let’s evaluate some of the options out there. Today we’ll tackle MDM.

As the name suggests, master data management is a huge initiative. Its goal is to create a single, unified view of an organization by integrating data from a number of sources – a centralized master file that contains a single version of the truth. According to Aberdeen’s 2012 report on The State of Master Data Management, most organizations aren’t doing anything crazy when it comes to MDM. In fact, 45% of companies (mostly SMBs under $500 million in revenue) are simply using spreadsheets or individual databases to manage and store their master data. Others (21%) are using internal talent to develop home-grown MDM systems.
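To make the “single version of the truth” idea concrete, here is a minimal sketch in Python with pandas that consolidates the same customers from two hypothetical source systems (CRM and billing) into one master record. The source names, fields, and survivorship rule (prefer CRM values, fall back to billing) are illustrative assumptions, not a prescription from the Aberdeen report.

```python
import pandas as pd

# Hypothetical source systems holding overlapping customer data.
crm = pd.DataFrame({
    "customer_id": [1, 2],
    "name": ["Acme Corp", "Globex"],
    "phone": [None, "555-0102"],
}).set_index("customer_id")

billing = pd.DataFrame({
    "customer_id": [1, 2],
    "name": ["ACME Corporation", "Globex"],
    "phone": ["555-0101", None],
}).set_index("customer_id")

# Assumed survivorship rule: prefer CRM values, fall back to billing
# wherever CRM is missing a value.
master = crm.combine_first(billing)
print(master)  # one consolidated record per customer
```

Even a toy rule like this shows why consensus matters: someone has to decide which system wins when the sources disagree.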

However, MDM is not just a technology-focused effort…

Continue reading this post >>