Business Intelligence Blog from arcplan
22 Nov 2013

3 Big Data Considerations for Newbies

Sixty-four percent of organizations are already investing in or plan to invest in big data soon, according to a recent Gartner report.* That equates to a huge number of individuals who now have to research how to embark on a big data deployment. The prevalence and benefits of big data analytics are undeniable, but there are some considerations to keep in mind before jumping in:

1) Identify a specific business need

Big data projects reap the most benefits when they address specific business needs. Having a use case in mind will help determine what data you need to analyze – social, machine, or transactional data. Gartner recommends researching use cases and success stories in other industries; why not get inspired by what’s worked for others? Gartner analyst Doug Laney recently shared examples of big data at work in various industries: using big data analytics, the department store Macy’s was able to adjust prices in near real time for 73 million items based on demand and inventory; Wal-Mart was able to optimize search results and increase web checkouts by 10-15%; and American Express used sophisticated predictive models to analyze historical transactions and forecast potential churn. Once you’ve identified an analytic need not met by “small” data analysis, you have the first green light for considering big data technology.
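The Macy’s example above is, at its core, a repricing rule that reacts to demand and inventory signals. Here is a minimal sketch of that idea in Python – the thresholds, fields, and percentages are entirely hypothetical illustrations, not Macy’s actual model:

```python
# Toy near-real-time pricing rule: raise the price when demand outstrips
# inventory, discount when stock piles up, otherwise keep the base price.
# All numbers here are invented for illustration.

def adjust_price(base_price: float, inventory: int, demand: int) -> float:
    """Return an adjusted price based on current demand vs. inventory."""
    if demand > inventory:
        return round(base_price * 1.10, 2)  # scarce item: +10%
    if inventory > 2 * demand:
        return round(base_price * 0.85, 2)  # overstocked: -15%
    return base_price

print(adjust_price(20.00, inventory=5, demand=12))   # scarce -> 22.0
print(adjust_price(20.00, inventory=50, demand=10))  # overstocked -> 17.0
```

In practice this rule would be a predictive model scored continuously across millions of SKUs, but the input/output shape is the same.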

Continue reading this post >>

1 Nov 2013

3 Big Data Approaches Based On Your Available Resources & Infrastructure

Unearthing previously unimaginable insights from massive data sets is the premise of all the big data hype. Over the past few years, as more and more stories have come out about companies finding competitive advantages in their data, big data has moved beyond the buzz. Enterprises are deploying big data projects at a faster rate every year, and even more plan to do so within the next 2 years.

The extent to which a company can take advantage of big data analysis is determined by the amount of resources and infrastructure it has available. The good news is that now the barriers to entry have been lowered, making it possible for more organizations to meet their goals to transform operations with insights gained from big data. Here are three approaches that companies of any size can take based on their particular situation.

One thing to note is that these are underlying infrastructure approaches, and that you’ll still need an analytic engine like arcplan on top in order to interact with, visualize and distribute your insights.

Lots of resources and lots of infrastructure

Before big data was “big data,” Teradata was the only game in town. They’ve been at it for so long that their functionality is extremely robust; some of their capabilities are second to none. Now other vendors like SAP (with HANA) and Kognitio have their own massively parallel analytic databases. They offer robust processing and querying power across multiple machines simultaneously, enable near real-time MDX (Multidimensional Expressions, for OLAP querying) and SQL (Structured Query Language, the standard way to ask a database a question) queries, and in the case of SAP HANA and Kognitio, are fully in-memory. Not surprisingly, Teradata and SAP HANA come at a high price, but for it, the insights you achieve can arrive at very near the speed of thought.
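To make the querying concrete, here is the kind of SQL aggregation these platforms run – shown against SQLite from Python purely for illustration (SQLite is a single-node embedded database, not an MPP engine, and the table and columns are invented):

```python
import sqlite3

# Build a tiny throwaway dataset in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "A", 120.0), ("East", "B", 80.0), ("West", "A", 200.0)],
)

# Aggregate revenue per region -- a massively parallel database would fan
# this same statement out across many nodes and merge the partial sums.
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 200.0), ('West', 200.0)]
```

The SQL itself is identical whether one machine or a thousand execute it; what the MPP vendors sell is the parallel execution underneath.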

Continue reading this post >>

23 Mar 2012

Big Data FAQs – A Primer

The big data trend promises that harnessing the wealth and volume of information in your enterprise leads to better customer insight, operational efficiency, and competitive advantage. The marketing hype around big data and the pace of studies, analyst reports, and articles on the subject can be mind-numbing for companies that want to take advantage of big data analytics but do not know how to separate fact from fiction and determine real use cases for their business. So here’s a big data primer for those just getting in the game.

1) What exactly is big data?
“Big data” is an all-inclusive term used to describe vast amounts of information. In contrast to traditional structured data, which is typically stored in a relational database, big data varies in terms of volume, velocity, and variety. Big data is characteristically generated in large volumes – on the order of terabytes or even exabytes (an exabyte is 10^18 bytes, or one million terabytes) per individual data set. Big data is also generated with high velocity – it is collected at frequent intervals – which makes it difficult to analyze (though analyzing it rapidly makes it more valuable). Additionally, big data is usually not nicely packaged in a spreadsheet or even a multidimensional database and often includes unstructured, qualitative information as well.

2) Is it a new trend?
Not exactly. Though there is a lot of buzz around the topic, big data has been around a long time. Think back to when you first heard of scientific researchers using supercomputers to analyze massive amounts of data. The difference now is that big data is accessible to regular BI users and is applicable to the enterprise. It is gaining traction because there are more public use cases of companies getting real value from big data (like Walmart analyzing real-time social media data for trends, then using that information to guide online ad purchases). Though big data adoption is limited right now, IDC determined that the big data technology and services market was worth $3.2 billion in 2010 and forecast that it will skyrocket to $16.9 billion by 2015.

3) Where does big data come from?
Big data is often boiled down to a few varieties including social data, machine data, and transactional data…

Continue reading this post >>

9 Mar 2012

Big Data for Manufacturers: Customer Feedback Should Influence R&D

In 2011, the McKinsey Global Institute published a comprehensive report on big data, Big Data: The Next Frontier for Innovation, Competition, and Productivity, which explores the value that companies across various industries may yield as a result of the big data explosion. So far we’ve explored the impact of big data on retail and healthcare companies, but today I’ll explore how big data analytics impact the manufacturing industry.

The manufacturing sector stores more data than any other sector, according to the McKinsey report. Manufacturers will likely get the most benefit from big data analytics since they have so much “raw material” to work with (from machinery metrics to sales systems). Manufacturing is a relatively efficient industry, with many advances made over the last few decades to streamline processes and improve quality through management practices like lean & six sigma (and lean six sigma!). But big data can be the impetus for the next wave of improvements in manufacturing, especially in R&D.

Research and Development
Streamlining the R&D process results in greater efficiency and reduced costs for US manufacturers and is important for products to be competitive in the global economy. But in 2012 and beyond, manufacturers should be going further, leveraging big data to influence design decisions. This means incorporating customer feedback into the process, designing products and adding features that customers actually want. McKinsey calls this “design to value” or “value-driven design.”

Surveys: I’ve taken consumer surveys that ask questions like “How much more would you be willing to pay for x feature?” and I now understand why companies are asking this. They are culling data from consumers about which features are desired and, if those features are included in the product or service, what they are worth – i.e., how much people are willing to pay for them. Gathering concrete insights like these is one step toward big data analytics influencing R&D. Manufacturers should be listening to what consumers want and refining their designs accordingly. It’s just smart business.
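As a hypothetical illustration of that survey analysis, the sketch below ranks candidate features by the average amount respondents say they would pay – the feature names and dollar amounts are invented, and a real pipeline would run this over millions of responses:

```python
from collections import defaultdict
from statistics import mean

# Invented survey answers: (feature asked about, extra dollars respondent
# said they would pay for it).
responses = [
    ("longer battery", 15.0), ("longer battery", 25.0),
    ("faster charging", 5.0), ("faster charging", 10.0),
    ("longer battery", 20.0),
]

def rank_features(answers):
    """Return features sorted by average stated willingness to pay."""
    by_feature = defaultdict(list)
    for feature, amount in answers:
        by_feature[feature].append(amount)
    return sorted(
        ((feature, mean(amounts)) for feature, amounts in by_feature.items()),
        key=lambda item: item[1],
        reverse=True,
    )

print(rank_features(responses))
# [('longer battery', 20.0), ('faster charging', 7.5)]
```

A ranking like this gives R&D a data-backed ordering of which features to design in first.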

Here’s a concrete example: Domino’s Pizza. You might not think of Domino’s as a manufacturer, but it is – the company is a serious dough manufacturer, producing and distributing dough to more than 5,000 US stores.

Continue reading this post >>

17 Jan 2012

The Big Data Trend Explained: Big Data vs. Large Data

Acquiring thorough insight into your data and tapping into the needs and buying patterns of customers are growing needs for businesses striving to increase operational efficiency and gain competitive advantage. Throughout 2011, I noticed a heightened interest in ‘big data’ and ‘big data analytics’ and the implications they have for businesses. In August, Gartner placed big data and extreme information processing on the initial rising slope of their Hype Cycle for Emerging Technologies, so we’re just at the beginning of the big data trend. A recent TDWI survey reports that 34% of organizations are tapping into large data sets using advanced analytics tools with the goal of providing better business insight. The promise of big data analytics is that harnessing the wealth (and volume) of information within your business can significantly boost efficiency and increase your bottom line.

The term ‘big data’ is an all-inclusive term used to describe vast amounts of information. In contrast to traditional data, which is typically stored in a relational database, big data varies in terms of volume, frequency, variety and value. Big data is characteristically generated in large volumes – on the order of terabytes or even exabytes (an exabyte is 10^18 bytes, or one million terabytes) per individual data set. Big data is also generated at high frequency, meaning that information is collected at frequent intervals. Additionally, big data is usually not nicely packaged in a spreadsheet or even a multidimensional database and often takes unstructured, qualitative information into account as well.

So where does all this data come from?

Continue reading this post >>