Business Intelligence Blog from arcplan
15Nov/10

Cloudy & Loving It Part III: Automation is Good

In my last entry on this subject, I discussed some of the impediments to the smooth extraction of transactional data in the cloud for the purposes of analytical processing. Today, I’m discussing why automation of data extraction is the way to go.

Imagine a three-layer stack made up of Source, Target, and the Connection between the two. In this example, we have a Source (the transactional data) that is less accessible than a standard DBMS. The accessibility issue is caused by:

  1. the data not being in-house
  2. the data being an abstraction (objects rather than individual tables)

So instead of joining the SQL Server tables that make up your General Ledger system to find specific transactions, you now log into a cloud application through a browser and look for Assets. To extract the Assets data from the source, you have to find or write a query that produces the Assets extract; you no longer have the ability to interact directly with the underlying tables, because they have been hidden from view.
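To make that concrete, here is a minimal Python sketch of such an extract step. The endpoint, token, and response shape are hypothetical stand-ins for whatever query API your cloud vendor actually exposes; the point is that you request the Assets abstraction and stage the result locally, rather than joining tables yourself.

```python
import sqlite3

import requests  # third-party: pip install requests

# Hypothetical cloud endpoint and credentials -- substitute your vendor's
# actual query API. You ask for the "Assets" abstraction, not the tables.
API_URL = "https://example-ledger.cloud/api/query"
API_TOKEN = "..."  # placeholder

def extract_assets():
    """Pull the Assets extract from the cloud and stage it locally."""
    resp = requests.get(
        API_URL,
        params={"object": "Assets"},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=60,
    )
    resp.raise_for_status()
    rows = resp.json()["records"]  # assumed response shape

    # Stage the extract in a local database so BI tools can query it with SQL.
    con = sqlite3.connect("staging.db")
    con.execute("CREATE TABLE IF NOT EXISTS assets (id TEXT, name TEXT, value REAL)")
    con.executemany("INSERT INTO assets VALUES (:id, :name, :value)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    extract_assets()
```

Run a script like this on a schedule and the extract becomes automatic rather than a manual browser session.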

Continue reading this post >>

11Nov/10

Cloudy & Loving It Part II: Not So Structured Query Language

In my last post on the subject of Cloud Computing, I mentioned two ways to slice and dice data in the cloud: rely on query tools to extract data to a local database, or use a data warehouse to support the transactional system in the cloud. Today, I’ll delve deeper into these two options for culling meaningful trends and KPIs from data in the cloud.

Whether or not a transactional system is moved to the cloud, the data it collects is still needed for analytical processing. A transaction processing system is optimized to capture individual transactions as efficiently as possible; data for analytical processing, on the other hand, has to be organized so that trends in Key Performance Indicators can be detected. Business Intelligence (BI) systems are usually built on the latter. When the transaction system is in-house, an Extract, Transform, and Load (ETL) process can be written to automate the conversion of data from its highly normalized transactional form to a denormalized analytical form.
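As a rough illustration of that transformation, here is a small Python/SQLite sketch with an invented schema; the "T" step joins the normalized transactional tables once, up front, into a flat analytical table.

```python
import sqlite3

# Illustrative schema only: a normalized transactional model (customers,
# products, orders) flattened into one denormalized analytical table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE products  (product_id  INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE orders    (order_id    INTEGER PRIMARY KEY,
                            customer_id INTEGER, product_id INTEGER,
                            order_date TEXT, amount REAL);

    INSERT INTO customers VALUES (1, 'East'), (2, 'West');
    INSERT INTO products  VALUES (1, 'Software'), (2, 'Services');
    INSERT INTO orders    VALUES (1, 1, 1, '2010-11-01', 5000.0),
                                 (2, 2, 2, '2010-11-02', 1200.0);

    -- Transform: join once, up front, so reports need no joins at run time.
    CREATE TABLE sales_fact AS
    SELECT o.order_id, o.order_date, c.region, p.category, o.amount
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    JOIN products  p ON p.product_id  = o.product_id;
""")

# A KPI query against the denormalized table is now a simple scan:
for region, total in con.execute(
        "SELECT region, SUM(amount) FROM sales_fact GROUP BY region"):
    print(region, total)  # East 5000.0 / West 1200.0
```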

Continue reading this post >>

4Nov/10

Make Decision-Making a Core Competence

Knowledge workers have driven more than 70% of the economic growth in the US over the past three decades, and 85% of the new jobs created in the past decade required complex knowledge skills.

Companies that are making decision-making a core competence – even a competitive differentiator – are outperforming their peers. arcplan’s view is that even if complete, actionable data is available for decision-making, biases can cause decisions to become misdirected. On October 20th, arcplan co-sponsored a webinar with Financial Executives International featuring noted writer and speaker Dr. Courtney Hunt titled, “Decision-Making 1.0: The Importance of High Quality Data and Rational, Objective Decision-Making Processes.” Listen to this webinar replay to learn how your organization can make decision-making a core competence.

Click here to register for Part II of our webinar series with FEI on November 18th, “Decision-Making 2.0: How Organizations Can Leverage 2.0 Technologies to Improve Decision-Making and Collaboration.” You don’t need to be a member of FEI to participate.

2Nov/10

Cloudy And Loving It

Business Intelligence applications are often based on a denormalized version of transactional data. This is done mainly to:

  1. keep analytical processing from slowing down the transaction systems
  2. create “reporting friendly” databases that lend themselves to analysis

Traditionally, both transactional and analytical databases reside on hardware inside the company’s firewall, and when necessary, a BI report or chart can drill down from one system to the other transparently.
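Here is a minimal sketch of that traditional model, with invented table and file names: a separate reporting database is refreshed (say, nightly) from the transactional one, so analytical queries never compete with live transactions.

```python
import sqlite3

# Stand-in for the live OLTP system; schema is illustrative.
txn = sqlite3.connect("transactions.db")
txn.executescript("""
    CREATE TABLE IF NOT EXISTS customers (customer_id INTEGER PRIMARY KEY,
                                          customer_name TEXT);
    CREATE TABLE IF NOT EXISTS invoices  (invoice_id INTEGER PRIMARY KEY,
                                          customer_id INTEGER,
                                          invoice_date TEXT, total REAL);
""")

# Refresh a separate, reporting-friendly database (e.g. from a nightly job)
# with a flat, denormalized table that BI tools can hit freely.
txn.execute("ATTACH DATABASE 'reporting.db' AS rpt")
txn.executescript("""
    DROP TABLE IF EXISTS rpt.invoice_report;
    CREATE TABLE rpt.invoice_report AS
    SELECT i.invoice_id, i.invoice_date, c.customer_name, i.total
    FROM invoices i JOIN customers c ON c.customer_id = i.customer_id;
""")
txn.commit()
txn.close()
```

Both files live inside the firewall, which is exactly the assumption that cloud deployment breaks.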

With Cloud Computing, this model gets more complicated. The current trend of moving to the Software as a Service (SaaS) model is centered on transaction processing. For example, Salesforce.com is a transactional system that gives users access to a Customer Relationship Management system in the cloud. In the old days, smaller organizations could ill afford the total cost of ownership of such systems and instead resorted to maintaining their data in homegrown or Excel-based databases. The SaaS model allows an organization of any size to access and benefit from very sophisticated systems by subscribing to them on a named-user basis. Therefore, whether an organization has 10 or 1,000 sales reps, it can maintain a robust set of metrics at a very reasonable cost.
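For instance, pulling CRM data out of Salesforce for local analysis can look like the sketch below, which uses the third-party simple-salesforce package. The credentials are placeholders and the pipeline rollup is an invented example metric.

```python
from collections import defaultdict

from simple_salesforce import Salesforce  # pip install simple-salesforce

# Placeholder credentials; a real org also requires a security token.
sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

# Note it's SOQL, not SQL: you query Salesforce objects (Opportunity),
# not the underlying tables.
result = sf.query_all("SELECT StageName, Amount FROM Opportunity")

# Roll up a simple pipeline metric locally.
pipeline = defaultdict(float)
for rec in result["records"]:
    pipeline[rec["StageName"]] += rec["Amount"] or 0.0

for stage, total in sorted(pipeline.items()):
    print(f"{stage}: {total:,.0f}")
```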

Continue reading this post >>

28Oct/10

What’s Your Business Intelligence ROI?

Companies have always struggled to calculate the ROI of their BI investments, as demonstrated again by the results of this year’s The BI Survey 9. In most cases, BI adopters are realizing “soft” ROI (faster and more accurate reporting, improvements in the decision-making process, increased productivity, etc.), but concrete ROI (cost reductions, higher revenues) remains difficult to quantify. In many cases, return on investment is never calculated simply because the project owners/managers don’t have the knowledge required to perform the analysis.
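When the hard numbers are available, the arithmetic itself is the easy part. Here is a toy calculation with invented figures, just to show the standard formula:

```python
# Toy figures for illustration only -- not benchmarks.
cost = 250_000.0   # licenses, implementation, first-year support
gains = 400_000.0  # quantified annual benefits: cost reductions + revenue

roi = (gains - cost) / cost          # standard ROI formula
payback_months = 12 * cost / gains   # months to recover the investment
                                     # (assumes gains accrue evenly per year)

print(f"ROI: {roi:.0%}")                        # -> ROI: 60%
print(f"Payback: {payback_months:.1f} months")  # -> Payback: 7.5 months
```

The hard part, of course, is arriving at a defensible figure for the gains in the first place.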

If you’re being asked to justify your business intelligence spend as 2010 comes to a close, arcplan invites you to a must-attend webinar next week: Calculating ROI for Business Intelligence Projects.

Continue reading this post >>