Understanding the mechanisms of these systems that make decisions for humans is critical, warns computer scientist Virgílio Almeida, who this month will assume the Oscar Sala Chair at USP.
Many of us have been in a similar situation: wanting to capture an important moment, we take our smartphone out of our pocket, open the camera app, and tap the screen. Then we go beyond the image saved as a digital file and decide to post it on a network connected to the rest of the world, which simultaneously distributes the photo and shows newly captured images from across the planet in a mysteriously organized timeline, curated especially for us. All this in a matter of seconds.
While performing this action has become practically a reflex these days, the technological evolution that brought us to this point hides countless layers of complexity, from the microscopic computer chips in our pockets to a network of satellites in orbit around the Earth.
In 2012, writer Venkatesh Rao stated that “we don’t always realize when the future is coming”. For him, we are surrounded by a “fabricated normality”, in which mechanisms have been developed to prevent us from realizing that technologies once thought impossible are already around us.
At the heart of many of these inventions are algorithms, which are, in short, “a sequence of instructions to solve a particular problem,” explains computer scientist Virgílio Almeida, who will take office on the 25th of this month, at 2:30 p.m., as the new holder of the Oscar Sala Chair at the Institute for Advanced Studies (IEA) at USP, succeeding professor and semiotician Lúcia Santaella. The event will be broadcast live on the IEA website.
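Almeida’s definition can be made concrete with a classic example: Euclid’s algorithm for the greatest common divisor is literally a short, fixed sequence of instructions that solves one well-defined problem. A minimal sketch:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a fixed sequence of instructions
    that solves one well-defined problem (the greatest common divisor)."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until b reaches 0
    return a

print(gcd(48, 18))  # 6
```

Every step here is explicit and inspectable, which is exactly what many of the modern systems discussed below are not.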
From 2011 to 2015, Almeida was National Secretary for Information Technology Policy at the Ministry of Science, Technology and Innovation. In 2022, at the head of the Oscar Sala Chair, the professor will lead the Human-Algorithm Interactions project.
Essentially, the project has two aims: to study the diversity of algorithms in use, whose impact on society and on citizens’ daily lives keeps growing, and to analyze their complexity within different types of systems.
“Imagine a system that makes decisions that affect people’s lives. These decisions can lead to different futures, including negative futures,” the professor illustrates, clarifying that the system in question is often “opaque”, that is, we don’t know how it really works.
“When we think about these systems, we don’t know who is responsible for these decisions, which puts us in a situation of bewilderment,” he theorizes, citing credit ratings whose criteria are not always clear to the applicant.
Not coincidentally, during the pandemic, the emergency aid requested by more than 39 million families in 2021 had an application as the basis of interaction between people and the system. “We see more and more that people interact with algorithms through applications, through interfaces, through browsers. All these interactions are mediated by algorithms that make the decisions, decisions that can exclude people,” he explains.
According to the expert, understanding the reasons and mechanisms behind this exclusion becomes crucial day after day. “We noticed a big movement of automation, of the use of artificial intelligence, for example for health services. During the pandemic, we had the covid-19 chatbots, which could suggest a certain treatment or medication via an app,” says Almeida, emphasizing that the process of making this decision was not transparent to users.
Josef K. reaches the 21st century
Published in 1925, Franz Kafka’s novel The Trial tells the story of Josef K., who one fine morning is “arrested without having done anything wrong”. The mixture of a surreal situation with an all-too-possible terror – incarceration without proof – served as a metaphor at the beginning of the 20th century, but in the 21st it takes on an air of normality.
“Currently, several courts in the United States use algorithm-based systems to determine whether or not a defendant deserves habeas corpus, or even to calculate a convict’s sentence,” reveals the professor. “In Brazil, some courts are interested in acquiring this type of system. In other words, Kafka’s fiction is not that far off.”
And it is precisely this interaction between humans and algorithms that forms the object of study of the Oscar Sala Chair.
“The digital transformation process has accelerated enormously with the pandemic. And we observe that in Brazil algorithm-based systems are being implemented without a better understanding of how they work and what their impact is,” he emphasizes, reinforcing that, in a country as unequal as ours, the effects can be significantly negative.
Behind the black screen
Although algorithms in mathematics are just sequences of instructions, these sequences have become increasingly complex in today’s technological context.
“Algorithms have evolved a lot, and today the sequence of instructions is not explicitly specified; it is discovered through training. Algorithms now work through machine learning, that is, they learn and evolve with the data fed into the system.”
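The contrast the professor describes can be sketched in a toy example: in the first function below, the programmer writes the instruction explicitly; in the second, the “instruction” is a parameter discovered from examples. The data and the threshold here are entirely hypothetical, invented for illustration.

```python
def handwritten_rule(income: float) -> bool:
    """Classic algorithm: the programmer states the rule explicitly."""
    return income >= 3000.0  # hypothetical, hand-chosen cutoff

def learn_threshold(examples: list[tuple[float, bool]]) -> float:
    """Machine-learning style: discover the cutoff that best
    separates the labeled examples, instead of writing it by hand."""
    candidates = sorted(income for income, _ in examples)
    best, best_correct = candidates[0], -1
    for t in candidates:
        correct = sum((income >= t) == label for income, label in examples)
        if correct > best_correct:
            best, best_correct = t, correct
    return best

# Invented training data: (income, was the request approved?)
history = [(1200.0, False), (2500.0, False), (3100.0, True), (4000.0, True)]
learned_t = learn_threshold(history)

def learned_rule(income: float) -> bool:
    return income >= learned_t  # the instruction was discovered, not written
```

The learned rule depends entirely on the data it was trained on, which is what makes such systems both adaptable and, as discussed below, capable of absorbing the flaws of that data.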
These mechanisms turn the algorithms that operate behind our screens into black boxes. Not only that: they are now intellectual property of the companies that profit from their myriad uses. “They are private property, but their impact can be public,” argues Almeida.
But how can society know or influence the impact of these algorithms? For the professor, the answer has two parts: “We need transparency and multidisciplinarity”.
“What we notice is that a large part of systems development is done by researchers in the field of computer science, who have no training in the social or political sciences. In other words, the systems are cleverly built, but without sensitivity to their impact on society,” says the professor. The Oscar Sala Chair project was therefore born with the explicit objective of bringing computer science and engineering closer to the social and human sciences.
“Decades ago, computing had much less of an impact on people’s lives, but with the invention of mobile phones, tablets and the internet itself, it got closer. But those who build these structures usually don’t have the sensitivity to imagine their social consequences,” he explains.
Education as an answer
The technical problems the algorithms address – all of them created by people from the technology field – concern efficiency at scale, accuracy and reduced processing time, among other product-development factors. “Social concerns are not there, not on purpose, but because of a lack of education,” says the professor.
It’s no wonder that the debate about the biases behind algorithms has sprung up in recent years, embarrassing major tech companies. Google itself apologized in 2015 for the racist bias of its image search engine.
“This bias does not exist in machines. People have prejudices, people have a discriminatory attitude. And this must be combated through education, opening the black boxes so that we can understand what is happening,” Almeida said.
The truth is that when you think of technology these days, you think of big global platforms that hold tremendous power. In response, governments have been working to create rules that prohibit discriminatory acts and acts that violate or disrespect users’ privacy.
And working on it means tampering with systems that are already part of our lives and cannot simply be put on hold for reassessment. “We are not going to shut down the networks for this; we are going to change the tires on a moving car, so to speak.”
The answer to the miscommunication between us humans and the inescapable algorithms, the professor reinforces, is appropriate training. “Education is not being treated as a priority. With all the problems in research, many states today are losing top researchers who would be important to this area. We need qualified personnel to carry out this process,” he argues.
As author Cathy O’Neil argued in her book Weapons of Math Destruction, published in 2016, algorithms were originally conceived to be neutral and fair, avoiding human bias and logical error. We know, however, that the algorithms in use today contain biases, whether because their designers are far removed from certain social issues or because of the data used to train them. And because algorithms are used everywhere, these biases affect the entire planet.
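O’Neil’s point about training data can be illustrated with a deliberately simple toy model, built on invented data: a “model” that merely learns historical approval rates per group will faithfully reproduce past discrimination, even though the code itself contains no prejudice.

```python
from collections import defaultdict

# Toy illustration with invented data: historical decisions were
# skewed against group_b, and the "model" simply learns that skew.
history = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True),
]

counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
for group, approved in history:
    counts[group][0] += int(approved)
    counts[group][1] += 1

def predict(group: str) -> bool:
    """Approve if the group was approved at least half the time historically."""
    approved, total = counts[group]
    return approved / total >= 0.5

print(predict("group_a"))  # True  – inherits past approvals
print(predict("group_b"))  # False – inherits past denials
```

The bias is not written anywhere in the code; it lives in the data, which is exactly why opening the black box, as Almeida proposes, means examining the training data as much as the instructions.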
“In the chair we want to study these interactions between people and algorithms, with research in the fields of communication, social sciences and art. The aim is to try to understand this issue more broadly and to formulate fundamental models that enable society to understand what is happening.”
The Oscar Sala Chair is a collaboration between the IEA and the Information and Coordination Center of Ponto BR (NIC.br) under an agreement between USP and the Internet Management Committee in Brazil (CGI.br).
The inauguration of computer scientist Virgílio Almeida as holder of the Oscar Sala Chair at USP will take place on April 25 at 2:30 p.m., with a live broadcast on the website of the Institute for Advanced Studies (IEA) at USP. Admission is free; registration can be made via this link. More information can be found on the event website.