Until the emergence of “Big Data”, data were processed mainly in local warehouses built from structured databases. Gradually, these data sources diversified.
Analysts are projecting the future of customer data, and several points of attention stand out. While most companies collect, store and analyze data, the majority are struggling with their big data projects and with the IT challenges associated with this framework.
New Scientist, with Microsoft, released an infographic on how big data techniques seek to gain insight by analyzing large data sets. The proliferation of sources associated with the three Vs (volume, variety, velocity) has contributed to big data’s recent growth. Data is arriving in a growing variety of new and often unstructured forms, such as text, video and sensor readings.
By 2020, the digital data created, replicated and consumed per year is projected to reach 40,000 exabytes. Connected devices, including pocket calculators, personal computers, mobile phones, servers, mainframes and videogame consoles, have contributed more than 10 million instructions per second.
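For a sense of scale, the 40,000-exabyte figure can be sanity-checked with a short unit conversion (a minimal sketch of my own, not from the source, using decimal SI units where 1 EB = 10^18 bytes):

```python
# Illustrative conversion: express 40,000 exabytes/year in zettabytes and bytes.
# Decimal (SI) units are assumed; binary (EiB/ZiB) figures would differ slightly.
EXABYTE = 10**18        # bytes in one exabyte (SI)
ZETTABYTE = 10**21      # bytes in one zettabyte (SI)

projected_eb = 40_000                   # projected exabytes per year by 2020
total_bytes = projected_eb * EXABYTE    # total in bytes
total_zb = total_bytes / ZETTABYTE      # same quantity in zettabytes

print(total_bytes)  # 40000000000000000000000
print(total_zb)     # 40.0
```

In other words, 40,000 exabytes is 40 zettabytes, or 4 × 10^22 bytes per year.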
Via Lauren Moss