How BI facilitates a decision-making process that saves millions
At the core of every business decision is the desire to drive value for the company – whether that’s increased sales, higher margins, elevated profits, or return on investment. Decision makers should use every resource at their disposal to drive that value, including their business intelligence software, which may offer guided analytics (i.e. dashboards), ad-hoc analysis, and collaboration capabilities that all contribute to informed decision-making. Today I’ll explore how BI software facilitates decisions in a retail scenario. But this article isn’t just for retailers – anyone can extrapolate from it to see how BI can deliver concrete ROI in their own business.
arcplan serves a number of customers in the retail industry, including two of the largest grocery chains in the United States. Retailers are well known for their thin net profit margins – on average about 3% worldwide across all types of retailers – which puts significant pressure on process controls and supply chain efficiency. One of the key areas of interest for all retailers, especially grocery chains, is the reduction of shrink – the loss of inventory due to product spoilage, waste, theft, and other causes. Shrink is estimated to account for 2-3% of overall sales, and for perishables it can reach 5% in a typical grocery chain. So for one of our customers, whose revenue reached $6.25 billion in 2012, a reduction in shrink of just 0.1% of sales would add roughly $6.25 million to their bottom line.
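The arithmetic behind that figure is simple enough to sketch. Here is a toy back-of-the-envelope calculation using the revenue and shrink rates cited above (the 2.5% shrink rate is just the midpoint of the 2-3% range):

```python
# Back-of-the-envelope shrink impact, using the figures cited above.
annual_revenue = 6.25e9     # the grocery chain's 2012 revenue, in USD
shrink_rate = 0.025         # typical shrink: midpoint of the 2-3% range
shrink_reduction = 0.001    # the 0.1% improvement discussed above

current_shrink_loss = annual_revenue * shrink_rate
savings = annual_revenue * shrink_reduction

print(f"Estimated annual shrink loss: ${current_shrink_loss:,.0f}")
print(f"Bottom-line impact of a 0.1% reduction: ${savings:,.0f}")
```

On $6.25 billion in sales, that 0.1% works out to $6.25 million a year – which is why even fractional improvements in shrink justify serious investment in analytics.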
So a simple question that would catalyze a decision-making process at this grocery chain might be: How can BI help reduce my shrink by 0.1% while balancing availability of goods and customer satisfaction? They would want to meet high customer expectations without over-ordering, which leads to shrink through spoilage.
I am sitting on a train to Düsseldorf on my way back from Paris, where I presented an update on what we are doing at arcplan to a mixed audience of customers and prospects. Part of my presentation covered the usual company and product development updates. The outlook included a preview of our next release, code-named Xenon, in the context of what is happening in businesses today. One of the key topics was the explosive growth of mobile devices and the challenges this poses to organizations: different form factors and operating systems, security issues, and the expectations of a user community educated by consumer applications on these devices (which brings consumer-grade usability expectations into the business environment). Of course, I introduced our approach – the first in the business intelligence world – to solving the dilemma of catering to this ever-increasing diversity of device types and form factors: DORA, or Develop Once, Run Anywhere, accomplished through responsive design for business intelligence and analytics applications. The audience was clearly impressed, as was our customer advisory board in a similar session last week.
However, this blog article is not about how to develop and deploy analytic content effectively in this new world; it’s about the business value BI solutions create.
This year we were positioned by Gartner in their annual Magic Quadrant for Business Intelligence Platforms. Although the Gartner analysts expressed strong appreciation for our capabilities (and commented accordingly in the strengths and cautions sections of the report), we were positioned at the lower end of the niche vendor section. We were told this is partially due to self-service analytics and data discovery playing a strong role in this year’s Quadrant, as these represent advanced BI. Really?
Everyone is throwing around the term “analytics” – about as much as they’re throwing around the term “big data.” While I might put big data on my list of the Most Overused Phrases, analytics gets a pass. As companies realize how much insight and value they can glean from their ever-growing volumes of data, there has been a surge in analytics initiatives. The goal of these projects is to use data to analyze trends, the effects of decisions, and the impact of scenarios in order to make improvements that positively affect the company’s bottom line, streamline processes, and help the business plan for the future.
In order for analytics to remain relevant and always provide value, organizations must continually up their game. One way to do this is with predictive analytics, which is becoming more mainstream every day. If you stick around to the end of the article, I’ll tell you a simple way to bypass its complexity and still get the predictions you need.
Gettin’ Predictive With It
Predictive analytics involves making predictions about the future, or setting potential courses of action, by analyzing past data. A 2012 benchmark study by Ventana Research revealed that predictive analytics is currently used to address a variety of business needs, including forecasting, marketing, customer service, product offers, and even fraud detection. While predictive analytics used to be in play at only a small number of companies, two-thirds of the companies participating in Ventana’s survey are now using it, and among those, two-thirds are satisfied or very satisfied. These results indicate how much predictive analytics has matured over the last few years, as technology has advanced to make it less expensive and more approachable, and therefore easier for more areas of the business to use. At this point, it’s safe to say that most Fortune 500 companies are churning out predictive insights on a regular basis, but that doesn’t mean smaller companies without “big data” can’t do the same thing. They can supplement their internal data with external data from social media, government agencies, and other sources of public data to get the insights they need.
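To make the basic idea – extrapolating from past data – concrete, here is a minimal least-squares trend forecast in plain Python. The monthly sales figures are invented for the example; real predictive analytics tools use far richer models, but the principle of fitting history and projecting forward is the same:

```python
# Fit a linear trend to 12 months of (hypothetical) sales and project month 13.
months = list(range(1, 13))
sales = [100, 104, 103, 109, 112, 115, 114, 120, 123, 125, 129, 131]  # in $k

n = len(months)
mean_x = sum(months) / n
mean_y = sum(sales) / n

# Ordinary least squares: slope = cov(x, y) / var(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales)) / \
        sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

forecast = intercept + slope * 13
print(f"Projected month-13 sales: ${forecast:.1f}k")
```

A forecast like this only holds if the underlying trend continues, which is exactly why the mature tools mentioned above layer in seasonality, external factors, and confidence intervals.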
Let’s take a look at financial institutions, which have predictive analytics down to a science….
I just got back from the Teradata PARTNERS Conference in Washington D.C. – once again a great event for learning from experts in the industry, hearing real-world examples of the challenges of managing and leveraging huge data volumes, and networking with our fellow Teradata partners and customers alike.
It was my second consecutive year at the event, and what struck me most this year was that the topics have clearly shifted from managing big data to leveraging big data. Obviously, data volumes are exploding due to social media and clickstream data, sensor data, and other sources, and will only continue to grow. This year’s conference, however, was all about analytics – how to use that data to drive business benefits. And there were great examples given at the conference.
In one of his presentations, Stephen Brobst, CTO of Teradata, described the benefits of collecting weather data around retail stores to determine whether weather conditions have a significant impact on food consumption in the store (e.g. in the deli section). He said combining external weather forecasts with internal operational data and analytical information allows stores to adjust staffing and supplies for a huge impact on the bottom line.
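That kind of combination can be sketched as a simple join of an external forecast feed with internal sales history. Everything below – the store IDs, temperatures, sales figures, the 85°F threshold, and the 20% demand lift – is invented for illustration; a real implementation would derive the threshold and lift from the store’s own historical data:

```python
# Hypothetical: join a weather forecast feed with per-store deli sales history
# to flag stores that should increase staffing and supplies.
forecast = {            # external feed: store_id -> forecast high temp (F)
    "store_01": 92,
    "store_02": 68,
    "store_03": 88,
}
avg_deli_sales = {      # internal history: store_id -> avg daily deli sales ($)
    "store_01": 4200,
    "store_02": 3900,
    "store_03": 5100,
}

HOT_DAY = 85            # assumed temperature above which deli demand spikes
HOT_DAY_LIFT = 1.20     # assumed 20% demand lift on hot days

for store, temp in forecast.items():
    if temp >= HOT_DAY:
        expected = avg_deli_sales[store] * HOT_DAY_LIFT
        print(f"{store}: hot day forecast ({temp}F), "
              f"plan staffing and supplies for ~${expected:,.0f} in deli sales")
```

The point of the sketch is the join itself: neither data set predicts demand on its own, but together they give store managers something actionable.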
Shaun Connolly, Program Director of Global Industry Solutions at Teradata, described an example of how FedEx was able to save $60 million in staffing per year…