What are you going to be in the future? As I am directing this speech mostly at Generation Z, many of you are probably not sure yet. However, it is very likely that at least some of you are considering working as doctors, economists, or perhaps engineers. The key idea is that you see yourself as a future professional competing against an ever-growing number of people for the best jobs in a specific field. But what if your chances of even having a job were threatened, not by other people, but by smart computers? Although pessimistic, this idea is far from mere fiction. I am talking, of course, about Artificial Intelligence and the threat it may soon pose to the job market.
Over the last two million years, humans and our ancestors have survived by relying on a wide range of cognitive and physical skills. Imagine, for instance, a hunter in the savanna trying to get food for his family. To do that, he had to observe his surroundings, chase his prey, and attack it while on the move, all at the same time. If he managed to kill the animal, he also needed to carry it back home while navigating through the wilderness.
Roughly seven thousand years ago, our species tamed nature, organized civilizations, and eventually reached a state where access to food and other basic needs is, at least in theory, universal. The only thing we need to do is work, that is, use particular sets of skills to play a role in modern society. A doctor, for example, needs a deep understanding of the human body and mastery of very specific tools to care for the population. A farmer has to master the soil and the seasons, watering and harvesting the crops to feed others. The problem is that human labor alone could not keep pace with demographic growth. That, combined with the elites' economic interests, drove capitalists to seek faster and cheaper ways of producing goods to satisfy the market. The solution they came up with was technology.
Since the first industrial revolution, at the end of the eighteenth century, technology has been replacing humans in labor-intensive activities such as agriculture, construction, and textile manufacturing. Most of the unskilled workforce in the fields was replaced by farm machinery, which cut costs and saved time by making food production more efficient. The same pattern appeared in the cities, as many factory workers were replaced by conveyor belts and packaging machines, which raised production to unprecedented levels. However, while some jobs were lost, many more were created, especially in fields demanding cognitive skills, such as programming and data analysis.
Not long after, machines able to follow written instructions started to be developed. Computers, as they were named, could at first perform only simple tasks, and writing those instructions (also known as coding, or programming) was very complicated. Since the first modern calculating machines appeared in the early twentieth century, the field of computing has been expanding at an astonishing rate, offering us a vast range of devices that make our lives easier. Many everyday jobs involving repetitive, mechanical labor, such as supermarket cashiers and bank clerks, have seen humans replaced by computer-operated machines. However, this new era of computers has also created many work opportunities for those willing to study further, especially in technological R&D and information technology.
Yet I believe the real game changer came when scientists started to research Artificial Intelligence. But before I get into it, let me pose a question: what is Artificial Intelligence? AI, as it is commonly called, is not a computer, as some believe, but a computer program with cognitive skills, the most important being the ability to learn. That is, the program can learn new things from its own experience, a skill that had, until now, been considered exclusive to intelligent living creatures. Moreover, AIs can communicate with each other, exchanging information and maximizing their efficiency at a given task. The more data is available to be processed, the more the program learns. Although humans can also learn from others, doing so is neither easy nor fast, as many of us tend to stick to what we know and struggle to admit our mistakes. Therefore, computers may be the best solution for areas that demand constant updating and communication. Besides, we should not forget that computers, at least in principle, are not chained down by prejudice, which gives them an edge in decision-making activities.
If we believe that reason is a uniquely human gift, then we can be certain that jobs demanding complex reasoning, such as heart surgery or stockbroking, will always be performed by humans, as long as those humans remain good at their jobs. However, if we consider that, one day, a computer running an AI program might learn to do anything, and even do it better than we can, that conviction becomes fragile. What is the difference between a doctor and a factory worker, if a computer can perform both jobs, and even outperform humans at them?
Jacob Mincer, considered by many the father of modern labor economics, argued that, in the broader picture, technological innovation does not contribute to overall unemployment. In fact, over the long term, technology creates more jobs than it destroys. But that does not mean the same people get their jobs back, and certainly not the same jobs. Innovation makes certain skills and know-how obsolete, leaving workers to choose between retraining to stay proficient and becoming irrelevant to the job market. Therefore, in a world where technological development accelerates by the minute, our conception of 'work' must change. The belief that we may spend our whole lives working in a single area is already unsafe, and chances are that, every decade or so, we will be compelled to reinvent ourselves in the new areas created by emerging technology.
To conclude, I believe that a well-programmed computer can perform beyond human capability in any given job. But for that to happen, the activity must first be turned, by human hands, into a set of rules and patterns. For that reason, there is no such thing as true Artificial Intelligence yet. A program cannot be considered smart if it needs humans to understand the world and explain it in numbers. That being said, we still have a chance to stay ahead of the machines. Unlike humans, no computer, at least not in our generation, is capable of devising creative solutions to unfamiliar problems. That is our greatest strength, and the only thing we can genuinely rely on to give us an edge over technology. In this uncertain scenario, the one thing we know for sure is that we need to keep evolving in order to stay relevant to the job market, and to keep our jobs.