
The Fourth Industrial Revolution and big data algorithms

By Eva Azinge

Three important revolutions have shaped the course of history: the cognitive, agricultural, and scientific revolutions. The cognitive revolution kick-started history about 70,000 years ago, although the shift to a cognitive paradigm within psychology came much later, in the 1950s, when the discipline zeroed in on the internal mental processes driving human behaviour. Human development was further accelerated by the second revolution, the agricultural revolution, which began about 12,000 years ago and ushered in cultural transformations that allowed humans to move from a hunting-and-gathering subsistence to one of agriculture and animal husbandry. The third is the scientific revolution, which started slightly over 500 years ago, when developments in mathematics, physics, astronomy, biology and chemistry transformed society's views about nature.

Essentially, this became the cornerstone of the so-called Fourth Industrial Revolution, which captures how ever-evolving technology continues to shape human existence in ways that are both profound and unprecedented. Yuval Noah Harari captures the phenomenon in his book, “Sapiens: A Brief History of Humankind”: “Humans began to live in empires around 200 BC. In the future, most humans will likely live in one, but this time it will be a new global empire, powered by technology.”


Technology is an evolving, useful body of knowledge. It is the basis for the emergence of Artificial Intelligence (AI) and big data algorithms, which reversed the 20th-century inefficiency of disseminating information through centralized systems. AI simply means teaching a computer to make its own decisions based on its observations. Scientists have observed that computers can learn on their own when given a few simple step-by-step instructions expressed in mathematical code, otherwise called algorithms. Algorithms are nothing new. Their use may be traced back to 1952, when Alan Turing, one of the world’s computational giants, published a set of equations that tried to explain the patterns we see in nature. Sadly, Turing took his own life two years after that paper was published, but his impact on the world did not end with his suicide. Almost a lifetime later, scientists are still using such algorithms to discover patterns in nature, and even the smartphones in our pockets carry computing power that would have astonished Turing (Stephen F. DeAngelis, “Artificial Intelligence: How Algorithms Make Systems Smart”). Accordingly, from PCs to cellphones, technology is revolutionizing the way we live, work and relate with one another.
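To make that idea concrete, here is a minimal sketch in Python, with invented example numbers that are not drawn from any of the sources cited above, of what “a few simple step-by-step instructions” can look like: the computer is handed a handful of observations and a short rule for turning them into its own decisions.

# A minimal sketch (hypothetical data) of learning a decision rule from observations.
# Observations: (hours of daylight, did the plant flower?)
observations = [(8, False), (10, False), (13, True), (15, True)]

def learn_threshold(data):
    # Step 1: average the daylight hours for each outcome.
    flowered = [hours for hours, outcome in data if outcome]
    not_flowered = [hours for hours, outcome in data if not outcome]
    # Step 2: place the decision boundary midway between the two averages.
    return (sum(flowered) / len(flowered) + sum(not_flowered) / len(not_flowered)) / 2

threshold = learn_threshold(observations)  # about 11.5 hours

def predict(hours):
    # Step 3: apply the learned rule to a case the computer has never seen.
    return hours > threshold

print(predict(12))  # True - the computer now decides on its own

The three numbered steps are the algorithm; the threshold they produce is what the computer has “learned”, and it can then judge new cases without being told the answer.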

Indeed, that humanity stands at the brink of a Fourth Industrial Revolution with important consequences for businesses, governments and labour markets sounds like a logical proposition. In the meantime, however, the way technology is organized and managed seems to be turning humans into a community of bubbles, isolated from one another in an increasingly borderless world confronted by viruses and carbon emissions. Bradford Lee Smith, the current President and Vice Chairman of Microsoft, acknowledges that “what is needed at this time is a technology that can foster a more singular conversation on global issues rather than a series of silos.”

But because technological development is a multidisciplinary enterprise, it remains open to further and better mastery in ways that channel it more broadly: towards engendering flourishing democracies across the globe, supporting business enterprises, and facilitating greater access to the labour market. This is good news. What appears to be causing global concern post-COVID, however, is the sheer amount of data generated to track the pandemic once biometric surveillance became legitimized through body temperature checks.

Although the data accumulated at that time were largely anonymous, computer scientists have learned that information used to monitor COVID may also be used to monitor other things when external contexts are provided. Facial recognition, a biometric identification software, is one example. There is also an ongoing debate over whether brains and minds can equally be monitored using biometric sensors. This is a potential game changer because biometric sensors convert biological data into digital data that computers can analyze, and this may not always protect privacy in ways consistent with fundamental freedoms.
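A hypothetical sketch in Python, in which the badge numbers, names and readings are invented purely for illustration, shows how easily that shift can happen: temperature records collected anonymously for COVID checks become movement records the moment an external context, such as a staff directory, is joined to them.

# Anonymous COVID-era log: (badge_id, gate, temperature in Celsius) - invented data.
temperature_log = [
    ("A17", "north_gate", 36.6),
    ("B42", "north_gate", 38.1),
    ("A17", "lab_entrance", 36.7),
]

# External context supplied later, from an unrelated system.
badge_directory = {"A17": "E. Okafor", "B42": "T. Bello"}

def movements_of(name):
    # The same health data now answers a different question: where has this person been?
    return [(gate, temp) for badge, gate, temp in temperature_log
            if badge_directory.get(badge) == name]

print(movements_of("E. Okafor"))  # [('north_gate', 36.6), ('lab_entrance', 36.7)]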

So, what are the silver linings in the dark clouds? From the 1970s to about the 2010s, tech companies were like huge castles filled with computer and data scientists. Over the years, however, building technology has become a multidisciplinary exercise. Those who create technology are now expected to take a much broader approach and to be trained in history, philosophy and other relevant disciplines, so that education becomes a lifelong process and merely graduating from university is no longer enough. Tech companies are providing intensive training and retraining for their employees so that computer engineers can acquire the professional ethics needed to balance the surveillance equation for all practical purposes. All points considered, computer engineers are the most important people shaping the world in the 21st century, and this is the best time in human history to be alive.

Eva Azinge Esq writes from NiMET, Abuja
