
Big data is well understood, at least in a general sense. But it’s worthwhile to note that it can include content from a wide variety of sources. Among these: content from internal company data warehouses, social media, click stream information from the company’s websites, the content of customer emails, customer online product reviews, survey responses, details of mobile phone call records, photos, videos, SMS texts, transaction data, and data produced by “Internet of Things” sensors.
SAS estimates that the information stored worldwide totals 2.8 zettabytes (2.8 trillion gigabytes) today and will grow fiftyfold by 2020.[1] Data centers across the planet now occupy some twenty-five square miles, roughly the entire land area of Syracuse, NY.[2]
Three key trends deserve your attention as the practice of gathering business intelligence from big data continues to unfold.[3]
First, big data is derived from previously untapped sources. These new sources of “in the moment” data can be melded with historical data to predict what is most likely to happen in the future. With the right predictive analytics tools, analytics becomes much more than a look at what has already happened; it becomes the ability to anticipate what will happen next. This convergence of historical and current data is what companies need to make business decisions aimed at growing profitability.
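As a minimal sketch of this idea, the toy example below blends a historical series with one fresh “in the moment” data point and extrapolates a simple linear trend. The sales figures and the least-squares model are illustrative assumptions, not a specific SAS or Wavestone method.

```python
# Blend historical data with a current observation, then forecast.
# All figures below are invented for illustration.

def linear_trend_forecast(history, steps_ahead=1):
    """Fit a least-squares line to a series and extrapolate it forward."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + steps_ahead)

historical = [100, 104, 109, 115, 118]  # e.g. past weekly sales
current = 123                           # this week's "now" data point
combined = historical + [current]

next_week = linear_trend_forecast(combined)
print(round(next_week, 1))  # → 127.8
```

Real predictive analytics platforms use far richer models, but the shape of the workflow is the same: history plus the latest data in, a forward-looking estimate out.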
Second, there is a need for automation technology. With data flowing into an enterprise from so many directions, automation is a baseline requirement. Machines are well equipped to process huge volumes of information; people are not. The sheer volume, velocity, and variety of big data you might choose to evaluate far outstrip any human’s ability to process it.
Third, getting value from big data calls for flexible, adaptable, less fragile systems. You may have built data stores that, after much planning, debate, and discussion, specify exact formats for the data saved in each database. Big data cannot be processed that way: systems that require information to fit a rigid schema need re-engineering every time a new source of data appears. Analyzing big data instead requires a processing infrastructure built to absorb new sources without redesign.
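The contrast with a rigid schema can be sketched in a few lines. In the hypothetical feed below, records from different sources carry different fields, and the schema is discovered from the data at read time rather than declared up front; the field names are assumptions for illustration only.

```python
import json

# "Schema-on-read" sketch: three sources, three different record shapes,
# ingested without a predeclared table definition. Field names are invented.
raw_feed = [
    '{"source": "web", "user": "a1", "clicks": 12}',
    '{"source": "sensor", "device": "t-7", "temp_c": 21.4}',
    '{"source": "survey", "user": "a1", "score": 4}',
]

records = [json.loads(line) for line in raw_feed]

# Discover the combined schema from the data itself.
all_fields = sorted({key for rec in records for key in rec})
print(all_fields)

# Downstream code tolerates missing fields instead of failing on them.
total_clicks = sum(rec.get("clicks", 0) for rec in records)
```

When a fourth source appears with yet another record shape, nothing here needs re-engineering; the new fields simply show up in `all_fields`. That tolerance for variety is what “flexible and adaptable” means in practice.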
- [1] http://www.sas.com/en_us/insights/analytics/big-data-analytics.html
- [2] http://www.demographia.com/db-uscity98.htm
- [3] http://www3.weforum.org/docs/GITR/2014/GITR_Chapter1.5_2014.pdf
To learn more about what Wavestone can do for your company, visit http://www.wavestone.com/capabilities.