“We live in the age of trust.” – An Interview with Professor Andrea Renda
Professor Andrea Renda is a Senior Research Fellow and the Head of Global Governance, Regulation, Innovation and the Digital Economy at the Centre for European Policy Studies (CEPS), Part-Time Professor at the EUI’s School of Transnational Governance, and one of the leading experts on digitalization and regulatory policies at the European level.
In this exclusive interview, Lucía Bosoer of the Latin America Focus Group had the chance to discuss Renda’s recent visit to Argentina, his vision on digital transformation, digitization models, and the future of governance in the digital era. Although many of his insights refer to Argentina, they may well apply to other countries in the region facing similar challenges in terms of digital transformation. According to him, in the age of trust, Europe offers a third way that escapes the US and Chinese digitalization models and could be adopted by many Latin American countries.
This interview has been edited for brevity and clarity.
Lucía Bosoer: You recently visited Argentina at the invitation of the European Union delegation. Can you tell us a little more about that trip?
Andrea Renda: I was invited by the EU delegation, which was interested in stimulating the debate on EU-Argentina cooperation on digital matters. They put together a very dense program of visits over three to four days for me to meet with relevant institutions and civil society, with the aim of raising awareness about the challenges and opportunities of the digital transformation, and about what the EU offers and proposes to do in a world increasingly dominated by the dualism between the US and the Chinese model.
LB: In this global scenario dominated by these two competing models, you propose a “third way” for the digital era. What would this third way look like? And what are the similarities between Argentina and Europe that could unite them in this path?
AR: We live in a world where what Shoshana Zuboff famously called “surveillance capitalism” on the US side is very tangible and visible in everyday life. From Cambridge Analytica to the recent revelations by Frances Haugen, there is increasing awareness that the absence of regulation in the digital ecosystem has generated a lot of inequality, a lack of protection for end users, and social and economic unsustainability. We also see a lot of polarization and value capture by a few tech giants to the detriment of a large portion of the economy. This model, which drove the internet revolution in its early days, is not a good or sustainable model for the future. A suitable policy for digital technologies requires much more regulation and protection of fundamental rights.
On the other hand, we have the Chinese model, based on state-controlled, very intrusive technologies and policies such as the social credit scoring system, a model where there is very little space for users’ privacy and civil rights. Neither of these two models is ideal, especially from the perspective of the EU, which typically sees the protection of fundamental rights as one of its key priorities.
In this respect, Argentina is potentially a very like-minded country, not only because its legal tradition echoes very much that of continental Europe, but also because it is a country where civil society has been very active at the local level in trying to preserve knowledge, traditions, identity, and overall digital sovereignty. Incidentally, one of the areas where this has been most visible over time is agriculture, which in Argentina is extremely important. Farmers in the US have to approach large companies, such as Monsanto, IBM, or John Deere, to purchase access to the data that their own land has created. This process of platformization and agricultural market concentration has already taken place in the US; in Europe it will happen soon. If it happens in Argentina, you can only imagine what kind of disaster that would be for an already compromised national economy. I think there is room for cooperation between the EU and Argentina, which currently has an incomplete legal framework for digital technology that could use some of the emerging European solutions. However, this is not what I saw during my visit; rather, what I perceived was the subtle but growing presence of Chinese influence through the Digital Silk Road.
LB: During your visit, you were invited to speak in the #encUEntros series of talks, organized by Team Europe. There, you argued that trust is the new oil in the digital age, gently challenging those who say that data is the new oil. Why trust?
AR: “Data is the new oil” is a statement that I’ve never understood, let alone agreed with. We already live in a world where data has been skyrocketing, but if you consider that today we have an estimated 8 billion connected devices at the global level, and that by 2035 the estimate is one trillion (!!!) connected devices, then data is about to become a fifth element of our world, alongside earth, air, fire, and water. The quantity of data is not going to be a problem in the future; rather, whether we can trust data flows will be the key. This means being able to trust that the data we share with others is stored safely and used for the purposes we have agreed on, and that the data we receive from the media is not manipulated by private interests, nor distorted simply because the algorithms that select it are poorly designed.
The first three decades of the internet economy have seen the rise of the economics of attention: of those players that have managed to conquer the eyeballs and the attention of end users, and of those tech platforms that have been able to capture this attention and monetize it in countless ways. Now we are entering the age of trust. Whether in the private or public sector, the institutions that will win the battle in cyberspace will be the ones that manage to win the trust of end users, so that users feel they are in a safe environment where they can share and receive data.
LB: It is often said that these large tech companies and the algorithms they have managed to develop have become so powerful that they have the power to manipulate our tastes and minds. Couldn’t they also, in the near future, create trust where there really shouldn’t be any?
AR: Yes, just as they have created needs where there were no needs, they can create trust where there is no trust. This is why governments have a key role to play in the age of AI and the Internet of Things, but at the same time they cannot act alone, because otherwise we may end up in the hands of governments that can also manipulate and abuse their position. The type of ecosystem that you need to build around digital technologies is inevitably a multi-stakeholder one. This is going to be an age of open algorithmic governance, where companies that use algorithms will be called on to justify how they use them, and perhaps to open them to selected parties, be these from government, civil society organizations, or other stakeholders. Look at the proposed EU AI Act: one of its most controversial proposals is that those deploying AI systems that use remote biometric identification cannot just assess the conformity of their systems through “internal checks,” but are obliged to open them to third parties for inspection. This embryonic provision will soon be replicated in many other pieces of legislation at the EU level. There are provisions on algorithmic inspections and mandatory data access in the Digital Services Act, in the Digital Markets Act, and potentially in the Data Governance Act. All of this despite the fact that, as far as I know, no one in the European Commission can actually perform an algorithmic inspection. This gives you an idea of how complicated this is going to be. This is the future of governance, and we all have to get ready for it.
LB: On a final note, would you say you are optimistic about our digital future?
AR: I think it is going to be a bumpy road. Eventually we are going to get there, but it takes a lot of investment in government, and we need to catch up with three decades in which these small initial startups have grown to gigantic size while having absolutely no culture of regulation, and no legal culture internally. For companies like Facebook (now Meta), what users decide by raising hands or by clicking like is often as important as what the Court of Justice says in a judicial decision, for example on what constitutes hate speech. These companies have been led to believe that they could self-regulate for their entire existence, and they continue to do so. Take, for example, Apple’s recent proposal to combat child pornography by conducting a sort of “scanning” of all pictures on iPhones, without sharing them with the government. I do not want to live in an environment where a private company becomes a policeman, especially such a powerful company, because the temptation is there, and it can very easily abuse a position like that. And then we have China, Russia, but also governments in Europe that have started using algorithms in ways that are totally unacceptable.
The future is one in which government, the private sector, and civil society act on an equal basis, keeping an eye on one another and trying to ensure that no one gains the edge over the others, and that is difficult. A bumpy road for sure, with ups and downs, but perhaps one day we will get there.