The term big data describes collections of datasets so large and complex that they require instruments different from the traditional ones in every phase of the process: acquisition, filtering, sharing, analysis and visualization.
Imagine a huge amount of data collected and aggregated from disparate sources: not only structured data, as in classic databases, but also unstructured data such as images, emails, GPS traces and information captured from social networks. Managing and manipulating these masses of data requires parallel computing power, with dedicated tools running on hundreds or even thousands of servers.
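A minimal sketch of this idea is the map-reduce pattern: each worker (standing in for one server) counts words in its own slice of unstructured text, and the partial results are then merged. The document snippets and counts below are purely illustrative, not from the article:

```python
from collections import Counter
from multiprocessing import Pool

# Hypothetical mini "documents" standing in for unstructured sources
# (emails, social posts, logs); contents are illustrative only.
DOCUMENTS = [
    "big data needs parallel tools",
    "unstructured data from email and social networks",
    "parallel analysis of big data at scale",
]

def map_count(doc: str) -> Counter:
    """Map step: count the words of a single document."""
    return Counter(doc.split())

def reduce_counts(partials) -> Counter:
    """Reduce step: merge the per-document counts into a global total."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    # One worker process per document, as if each ran on its own server.
    with Pool(processes=3) as pool:
        partial_counts = pool.map(map_count, DOCUMENTS)
    totals = reduce_counts(partial_counts)
    print(totals["data"])  # "data" occurs once in each of the three documents
```

Real frameworks distribute the same two steps across machines instead of local processes, but the division of labour is identical.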
How, then, were big data born? Until the attacks of 11 September 2001, the main tools available to the National Security Agency were wiretaps, spy planes and hidden microphones, and its only target was the Soviet Union. From then on, the NSA's enemy became a network of individual terrorists, so anyone could end up in the crosshairs of intensive espionage. As the number of mobile devices connected to the Internet grew exponentially, the instruments adopted until then were no longer sufficient. In response, the NSA began to collect everything: telephone records were captured en masse from citizens, American or not. The enormous amount of data was long kept secret on the agency's premises. A direct consequence, however, was that NSA analysts gained access to citizens' personal information, a means for conducting market analyses, studies of human behaviour and much more. Today, the way to defend against the misappropriation of personal data is to safeguard the storage and transmission of information through encryption. All institutions should be subject to controls on their metadata and to assessment procedures, just as credit cards are now monitored to prevent fraud.
And how can small and medium businesses use the masses of data and metadata available to manage their activities and implement their strategic ideas? According to Thomas Davenport, a professor at Babson College, director of research at the International Institute for Analytics and senior advisor to Deloitte, the effort needed to extract and structure big data is remarkable: special skills and abilities are needed to succeed. Davenport was perhaps the first to show companies how to combine big data with small data to create value in business processes, an approach he called Analytics 3.0. It opens the door to a totally new and far more efficient kind of data analysis, the result of a careful and effective combination of big data and small data. Big data are the essential tool for learning strategic information about customers and consumers anywhere on Earth and for discovering the resources offered by suppliers. The hardware required to manipulate big data does not seem to be very expensive, and in exchange it promises to capture a variety of ingredients necessary for more efficient business management.
header image credits: isi.it
Francesca Granatiero was born in San Giovanni Rotondo in 1988. She attended the scientific high school in Manfredonia and then, after earning her diploma, enrolled in Management Engineering at the Politecnico di Bari. During her Master's degree course in Management Engineering at the same Politecnico di Bari, she earned the title of expert in management systems (SGA) for SMEs. She has been a correspondent and writer for the magazine Close-up Engineering since September 2014. She obtained her Master's degree in Management Engineering in December 2015. Although scientifically minded and absorbed in the fascinating world of engineering, she is very passionate about classical literature. A "dreamer" by nature, in her free time she loves to read and travel.