Business Intelligence Blog from arcplan

The Big Data Trend Explained: Big Data vs. Large Data


Acquiring deep insight into your data and tapping into the needs and buying patterns of customers are growing priorities for businesses striving to increase operational efficiency and gain competitive advantage. Throughout 2011, I noticed heightened interest in 'big data' and 'big data analytics' and the implications they have for businesses. In August, Gartner placed big data and extreme information processing on the initial rising slope of its Hype Cycle for Emerging Technologies, so we're just at the beginning of the big data trend. A recent TDWI survey reports that 34% of organizations are tapping into large data sets with advanced analytics tools, with the goal of gaining better business insight. The promise of big data analytics is that harnessing the wealth (and volume) of information within your business can significantly boost efficiency and increase your bottom line.

The term 'big data' is an all-inclusive label for vast amounts of information. In contrast to traditional data, which is typically stored in a relational database, big data differs in terms of volume, frequency, variety, and value. Big data is characteristically generated in large volumes – on the order of terabytes or even exabytes (an exabyte is a 1 followed by 18 zeros, or a billion gigabytes) per individual data set. Big data is also generated at high frequency, meaning that information is collected at short intervals. Additionally, big data is usually not nicely packaged in a spreadsheet or even a multidimensional database, and it often includes unstructured, qualitative information as well.
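For a rough sense of the scale gap those units imply, the arithmetic works out as follows (a quick sketch using decimal SI units; the specific figures are illustrative, not from the post):

```python
# Decimal (SI) data-volume units, in bytes.
GB = 10**9   # gigabyte
TB = 10**12  # terabyte
EB = 10**18  # exabyte

# An exabyte is a billion gigabytes, or a million terabytes.
print(EB // GB)  # 1000000000
print(EB // TB)  # 1000000

# Illustrative comparison: a 100 GB data warehouse vs. a 1 EB big data set.
warehouse = 100 * GB
big_data_set = 1 * EB
print(big_data_set // warehouse)  # 10000000 (ten million times larger)
```

In other words, a single exabyte-scale data set dwarfs a conventional warehouse by many orders of magnitude, which is why traditional relational tooling struggles with it.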

So where does all this data come from?
