What does Big Data mean (for us)?

Article by Jérôme Berthier in Revue internationale et stratégique 2018/2 (no. 110).

Extract: “Big Data can be defined as the transformation of the maximum amount of data of all kinds into information that can help take decisions. This may either be generated automatically by a robot, in which case we talk about artificial intelligence (AI), or depend on people and therefore be based on actual facts, and no longer on beliefs, which all too often still guide managers who are convinced of the relevance of their choices based exclusively on the way they feel.

Despite its growing popularity, which gives the term Big Data a new aspect, most of the techniques used today have been known for a very long time. Experts agree that the work of Alan Turing, in the 1950s, represented the first steps with the development of AI. Natural language processing also emerged during this same period. The term “machine learning” was invented in 1959 by Arthur Samuel. And while deep learning is a revolutionary technique that is widely used in AI, it was first seen in the 1980s and 1990s, notably developed by Yann LeCun.

However, until around 2005, complex data analysis was still very difficult, if not impossible, due to the major processing resources needed and the low volume of digitised data. For example, in 1956, the first computer with a hard drive, IBM’s RAMAC 305, had a capacity of just 5 megabytes, for a cost of 100,000 euros…”.

Buy the article on Cairn.info