The history of Big Data is little known. As often happens with trends, when they explode they seem like something entirely novel that has just appeared; but in many cases they are the result of a technology with a long maturation period.

The era of computing began between 1930 and 1949. In 1940, Turing and Good built a machine capable of analysing documents the Germans had encrypted with the Enigma machine during World War II, so that it could decipher the key with which they had been encrypted. That same year the Kerrison Predictor was built, a machine capable of automating anti-aircraft defence by targeting enemy aircraft. And in 1946, the Manhattan Project team managed to use computers to analyse and predict the behaviour of a nuclear chain reaction.


Like many of today’s computing technologies and concepts, the first steps towards the storage and concept of Big Data were taken in IBM’s research laboratories. In the early seventies, one of its scientists published an article that showed how to access information stored in large databases without knowing how the data was structured or where it resided.

Between 1950 and 1969, analytics began to be commercialised, starting with the first meteorological prediction model based on data analysis. In 1956 the shortest-path problem was solved computationally, which improved logistics and transport. In 1958 FICO applied predictive models to credit risk decisions, and in 1966 what would later become the SAS Institute began as an analytical research project funded by the United States Department of Agriculture.

Between 1970 and 1999, analytics became popular: the Black-Scholes model for estimating the future price of shares appeared in 1973, and the first commercial tool for building model-driven decision support systems arrived in 1980. But it was not until well into the 1990s, in 1995, that the Amazon and eBay websites were launched, marking the beginning of the race to personalise the online experience for each user. This meant that search engines had to pay ever more attention to the relevance of their results, culminating in the search algorithms Google applied in 1998.

By the mid-nineties, dozens of companies had understood that storing information digitally was more profitable than keeping it on paper. In 1997, Michael Cox and David Ellsworth, two researchers at the National Aeronautics and Space Administration (NASA), published an article in which the term Big Data was mentioned for the first time, setting the whole revolution around the concept in motion.

It took twelve years for another concept to emerge that would complement the in-depth analysis of information underpinning Big Data. Tim Berners-Lee, director of the World Wide Web Consortium (W3C) and considered one of the fathers of the internet, was the first to use the term linked data, in 2009, and that is how so-called Business Intelligence emerged, a concept that became a priority for millions of specialists in the field.

Since then, organizations around the world have begun to implement new technologies to analyze and optimize the large amounts of data they generate. They have also come to depend on their information as a commercial asset for gaining a competitive advantage, whether in marketing, in personalized customer experiences or in improving their day-to-day processes.


By 2020, annual data generation is expected to have grown by 4,300%. The growth will be driven by the shift from analogue to digital technologies and by the amount of information generated by both users and companies.

According to Gartner, by 2020 there will be more than 25 billion devices connected to the Internet. At the end of 2013, the amount of data generated by these devices was 4.4 billion GB, a figure expected to grow tenfold by 2020. Handling this much data will require new techniques, greater management capacity and attention to the 3 Vs of Big Data: Velocity, Variety and Volume.

What has already been achieved with Big Data allows us to look at the future with optimism, provided that the technology keeps growing at the same pace as the data does. Areas such as the environment, health, productivity and personal life can all benefit from the billions of bytes we generate every day.


About The Author

Ruchi Singla
An enthusiastic writer with years of experience who loves turning thoughts into beautiful words. She writes with zeal and, when not writing, enjoys reading Vogue with a good cup of coffee. Ruchi works as a content writer for many businesses, helping them gain an extra edge over their competitors. As a content writer, she is responsible for boosting engagement and traffic on the blog by brainstorming exciting content ideas.
