Making emerging technologies safe for democracy

March 26, 2024

The rapid emergence of powerful new technologies is adding to the widespread uncertainty in what should be a banner year for democracy. Political leaders must look beyond the ballot box and focus on mitigating the long-term risks posed by emerging tech, write Marietje Schaake and Steven Schuurman.

Dozens of countries around the world, from the United States to India, will hold or have already held elections in 2024. While this may seem like a banner year for democracy, these elections are taking place against a backdrop of global economic instability, geopolitical shifts, and intensifying climate change, leading to widespread uncertainty.

Underpinning all this uncertainty is the rapid emergence of powerful new technologies, some of which are already reshaping markets and recalibrating global power dynamics. While they have the potential to solve global problems, they could also disrupt economies, endanger civil liberties, and undermine democratic governance. As Thierry Breton, the European Union’s commissioner for the internal market, has observed, “We have entered a global race in which the mastery of technologies is central” to navigating the “new geopolitical order.”

To be sure, technological disruption is not a new phenomenon. What sets today’s emerging technologies apart is that they have reached a point where even their creators struggle to understand them.

Consider, for example, generative artificial intelligence. The precise mechanisms by which large language models like Google’s Gemini (formerly known as Bard) and OpenAI’s ChatGPT generate responses to user prompts are still not fully understood, even by their own developers.

What we do know is that AI and other rapidly advancing technologies, such as quantum computing, biotechnology, neurotechnology, and climate-intervention tech, are growing increasingly powerful and influential by the day. Despite the scandals and the political and regulatory backlash of the past few years, Big Tech firms are still among the world’s largest companies and continue to shape our lives in myriad ways, for better or worse.

Moreover, over the past 20 years, a handful of tech giants have invested heavily in development and acquisitions, amassing wealth and talent that empowers them to capture new markets before potential competitors emerge. Such concentration of innovation power enables these few players to maintain their market dominance—and to call the shots on how their technologies are developed and used worldwide. Regulators have scrambled to enact societal safeguards for increasingly powerful, complex technologies, and the public-private knowledge gap is growing.

For example, in addition to developing vaccines and early detection systems to trace the spread of viruses, bioengineers are creating new tools to engineer cells, organisms, and ecosystems, opening the way to new medicines, crops, and materials. Neuralink is conducting trials of chip implants in people with disabilities and working to increase the speed at which humans can communicate with computer systems through direct brain-computer interfaces. Meanwhile, quantum engineers are building computers that could eventually break the encryption systems on which cybersecurity and privacy depend. Then there are the climate technologists who are increasingly open to radical options for curbing global warming, despite a dearth of real-world research into the side effects of global interventions like solar radiation management.

While these developments hold great promise, applying them recklessly could lead to irreversible harm. The destabilizing effect of unregulated social media on political systems over the past decade is a prime example. Likewise, absent appropriate safeguards, the biotech breakthroughs we welcome today could unleash new pandemics tomorrow, whether from accidental lab leaks or deliberate weaponization.

Whether one is excited by the possibilities of technological innovation or concerned about its risks, the unique characteristics, corporate power, and global scale of these technologies demand guardrails and oversight. The immense power and global reach of the companies behind them, together with the potential for misuse and unintended consequences, underscore the importance of ensuring that these systems are used responsibly and in ways that benefit society.

Here, governments face a seemingly impossible task: they must oversee systems that are not fully understood by their creators while also trying to anticipate future breakthroughs. To navigate this dilemma, policymakers must deepen their understanding of how these technologies function, as well as the interplay between them.

To this end, regulators must have access to independent information. As capital, data, and knowledge become increasingly concentrated in the hands of a few corporations, it is crucial to ensure that decision-makers can draw on policy-oriented expertise that enables them to develop fact-based policies serving the public interest. Democratic leaders need independent, policy-oriented expertise on emerging technologies, not lobbyists’ framings.

Having adopted a series of important laws like the AI Act over the past few years, the EU is uniquely positioned to govern emerging technologies on the basis of solid rule of law, rather than in service of corporate profits. But first, European policymakers must keep up with the latest technological advances. It is time for EU decision-makers to get ahead of the next curve. They must educate themselves on what exactly is happening at the cutting edge. Waiting until new technologies are introduced to the market is waiting too long.

Governments must learn from past challenges and actively steer technological innovation, prioritizing democratic principles and positive social impact over industry profits. As the global order comes under increasing strain, political leaders must look beyond the ballot box and focus on mitigating the long-term risks posed by emerging technologies.

Copyright: Project Syndicate, 2024.
About Marietje Schaake:
Marietje Schaake, a former member of the European Parliament, is International Policy Director at Stanford University’s Cyber Policy Center and Practice Lead for Emerging Technology Governance at the International Center for Future Generations.
About Steven Schuurman:
Steven Schuurman, Co-Founder and former CEO of Elastic, is Co-Founder of the International Center for Future Generations.
The views presented in this article are the authors’ own and do not necessarily represent the views of any other organization.