In the wake of the 2016 presidential election, Russian influence in our politics has exposed more Americans to the dangers of digital disinformation. Foreign policy experts argue that Russian and Chinese efforts to encourage dissent via social networks pose one of the greatest threats to the United States. But social media and “junk news”—prejudiced, extreme, and inaccurate political news—have been deployed against populations worldwide, in both authoritarian regimes and democracies, for years.

Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives | By Philip N. Howard | May 2020.

Philip Howard’s latest release, Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives, charts the depth and breadth of the influence of the systems that spread this dangerous disinformation. “Lie machines,” as Howard calls them, produce and promulgate disinformation on a global scale with the intention of creating chaos. They subscribe to no single ideology (although Howard notes that the far right has been particularly susceptible to “junk news”) and function with the purpose of making citizens question objective truth. In his book, Howard asks, “If we analyze the entire mechanism, with its social and technical parts, can we redesign it to support democratic norms?”

Analysis of the mechanisms of lie machines represents the strongest aspect of this book and accordingly receives the most page time. In discomforting detail, Howard breaks lie machines down into three primary parts: production, distribution, and marketing. Algorithms, social media, and big data equip troll armies with digital weaponry, information on citizens’ personal preferences, and significant political events with the potential to cause domestic turmoil. Bot networks of automated accounts, along with accounts run manually by dedicated companies, then spread computational propaganda: false information generated and circulated by algorithms. Finally, structured campaigns encourage users—you and me—to further share junk news, because individuals are more likely to trust material shared by people they know. Thorough descriptions of “Imitacja Consulting” and “Imitação Services,” two pseudonymous data consulting companies in Poland and Brazil, provide startling specimens that underscore how pervasive and unscrupulous lie machines have become.

The author’s most striking example of the impact of lie machines on democratic society is his exploration of their influence on Brexit. He points out that both the Vote Leave and BeLeave organizations not only broke campaign finance law by overspending before the vote—according to a report from the UK Electoral Commission—but also employed similar promotion strategies supplemented by the outside data company Aggregate IQ. Despite existing as separate entities, Vote Leave and BeLeave acted as affinity organizations, putting forward the same pro-Brexit messages. Based on Vote Leave’s conversion rate of 10%, Howard argues that at least eight million users were exposed to these messages during the organizations’ illegal overspending. Advertisements for the Vote Leave campaign received over 891 million impressions on Facebook overall. This exposure undoubtedly had consequences: had “Remain” received 600,000 more votes, Howard notes, the UK would have remained part of the EU.

Howard’s research paints a disturbing picture of the social media landscape. Our daily interactions have become so saturated with bias that we struggle to distinguish fact from falsehood. However, the author’s suggestions for mitigating lie machines (the “how to save democracy” promised in the subtitle) are not as detailed as his explanation of existing systems and their impacts. Part of this weakness stems from the complex nature of the issue. Because so many powerful interests, including politicians and the companies that profit from disinformation, retain a stake in spreading lies, regulation is difficult.

The benefits of social media also complicate the issue. The author acknowledges that platforms like Twitter and Facebook have historically empowered activists to spread information. He describes, for instance, how social media enabled organizers to share information and press for change during the Arab Spring, before political leaders recognized its potential to mold public opinion through disinformation. For today’s readers, Howard’s account of positive change diminished by bots, trolls, and junk news may feel prescient of online activism in the Black Lives Matter movement. BLM advocates have used the #BLM hashtag to convey information and share videos of marches and police brutality before these stories hit major news outlets. But the failure of #BlackoutTuesday, in which unwitting users posted black squares tagged with the same #BLM hashtag and drowned out pertinent information, serves as a reminder of the danger misinformed users pose to online activism.

Still, Lie Machines is an effective and modern cautionary tale. Howard warns that these dangerous systems have threatened democracy before and will continue to exacerbate political polarization without active prevention. When parties cannot agree on the basic facts of a situation, a productive conversation cannot ensue, much less a policy debate or legislative change.

About Claire Wyszynski:
Claire Wyszynski is a student at the College of William and Mary and a research assistant for the Transparent Developing Footprints project at AidData.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.

a global affairs media network

www.diplomaticourier.com

A Modern Cautionary Tale: Lie Machines

Photo by Camilo Jimenez via Unsplash.

July 21, 2020
