
a global affairs media network

www.diplomaticourier.com

What Facebook Comments Say About Disinformation in Eastern Europe

Image via Adobe Stock.

August 17, 2022

Eastern European populations vary dramatically in their vulnerability to online disinformation according to a new report by DisinfoLab, a student-led think tank at the College of William & Mary’s Global Research Institute.

Eastern Europe is a hotbed for foreign and state-sponsored disinformation about ongoing global crises like Russia’s invasion of Ukraine and the COVID-19 pandemic. DisinfoLab’s study, conducted in spring 2022, compared disinformation resiliency among Hungarian, Polish, and Russian speakers on Facebook by analyzing how these users responded to posts containing disinformation—ranging from agreement to rejection.
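The report does not publish its coding pipeline, but the tallying it describes can be sketched in a few lines. The sample below is a minimal illustration with entirely hypothetical labels and data: each comment is assumed to carry a language code and a human-assigned stance toward the disinformation post, and per-language stance rates are computed from those pairs.

```python
from collections import Counter

# Hypothetical stance-coded comments: (language, stance) pairs.
# In the study, each comment's response to a disinformation post
# was classified on a spectrum from agreement to rejection.
coded_comments = [
    ("pl", "agree"), ("pl", "disagree"), ("pl", "neutral"),
    ("hu", "agree"), ("hu", "agree"), ("hu", "disagree"),
    ("ru", "agree"), ("ru", "agree"), ("ru", "disagree"),
]

def stance_rates(comments):
    """Return per-language percentages for each stance label."""
    by_lang = {}
    for lang, stance in comments:
        by_lang.setdefault(lang, []).append(stance)
    rates = {}
    for lang, stances in by_lang.items():
        counts = Counter(stances)
        total = len(stances)
        rates[lang] = {s: round(100 * n / total, 1) for s, n in counts.items()}
    return rates

print(stance_rates(coded_comments))
```

Comparing agreement rates across the language keys of the resulting dictionary is the kind of cross-group comparison the report's headline figures describe.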

The full report is available to read for free.

Overall, DisinfoLab found that Polish speakers rejected false narratives 1.7 times as often as Hungarian speakers: 30% of the Polish comments collected disagreed with disinformation, compared to just 18% of Hungarian comments.

Russian speakers exhibited the highest rate of disinformation vulnerability of the three groups in the study. Roughly 65% of Russian-language comments agreed with false posts, while only 19% disagreed.

Additionally, DisinfoLab observed that each linguistic group demonstrated unique vulnerability to disinformation on certain topics, likely shaped by each population’s socio-political profile, which informs how citizens receive, understand, and respond to false claims online.

For instance, although Hungary and Poland are both experiencing state-led media takeovers, Poland is staunchly anti-Kremlin while Hungary maintains friendly relations with Russian leaders. In line with these sentiments, Russia-Ukraine disinformation on Facebook was supported by 37% of Polish comments and by 59% of Hungarian comments. Notably, this pattern reversed for COVID-19 disinformation on Facebook, which was supported by 80% of Polish comments and by 31% of Hungarian comments.

For all groups within the study, comments agreed with disinformation posts far more often than they challenged them. Across DisinfoLab’s dataset, 54% of comments in target languages (Hungarian, Polish, Russian) agreed with disinformation posts, while only 22% of comments disagreed.

In light of DisinfoLab’s findings, the team offered recommendations for NATO states seeking to combat disinformation in Eastern Europe. DisinfoLab urges NATO members to craft their approaches around a country’s distinct media environment, considering country-specific factors like state censorship, local historical memory, and linguistic populations. To this end, DisinfoLab advises the following:

First, in countries experiencing democratic backsliding, NATO members should support and expand local media, media literacy, and fact-checking organizations to build resilience against state-led disinformation campaigns. In multilingual countries, such organizations should offer resources for each linguistic group. This will allow all citizens to understand counter-disinformation efforts, minimizing the impact of disinformation campaigns that seek to exploit social divisions.

Second, NATO states should encourage local media literacy and fact-checking organizations to focus their efforts on pre-bunking and debunking the most salient disinformation topics. The overwhelming majority of disinformation content found in the study pertained to Russia’s invasion of Ukraine or COVID-19, two large-scale, hot-button events. Local organizations are best positioned to understand how such events uniquely affect their respective communities.

Third, NATO states should engage in constructive partnerships with social media companies. Meta’s partnership with local fact-checking organizations in some countries is an effective mechanism for flagging misleading content for users. Meta should expand this program across more fact-checking organizations and countries, and other social media platforms should follow suit. Moreover, governments should pressure social media platforms to release more descriptive information about public posts, such as the poster’s country. More accurate information will allow disinformation researchers and government officials alike to better evaluate the strengths and shortcomings of their information environments and shape policy accordingly.

DisinfoLab would like to thank Sarah Devendorf, Shradha Dinesh, Megan Hogan, Brennen Micheal, Sayyed Razmjo, Chas Rinne, Skyler Seets, Madeline Smith, Samantha Strauss, Selene Swanson, Mary Waterman, Sarah Wozniak, Yile Xu, and Sean Zhou for their tremendous support in designing this research report and building the data collection methodology.

About Aaraj Vij:
Aaraj Vij, Co-Director of DisinfoLab, is a junior at the College of William & Mary studying computer science and international relations. Drawing on his education, he researches both policy and technical strategies to counteract online disinformation.
About Thomas Plant:
Thomas Plant is an analyst at Valens Global and supports the organization’s work on domestic extremism. He is also an incoming Fulbright research scholar to Estonia and the co-founder of William & Mary’s DisinfoLab, the nation’s first undergraduate disinformation research lab.
About Jeremy Swack:
Jeremy Swack, Technical Director of DisinfoLab, is a sophomore studying computer science and data science at the College of William & Mary. Using his background in machine learning and data visualization, he researches data-driven methods to analyze and predict the spread of disinformation.
About Alyssa Nekritz:
Alyssa Nekritz is Managing Editor and Disinformation Analyst for DisinfoLab, a student-led think tank at the College of William & Mary’s Global Research Institute. She is a junior pursuing a B.A. in International Relations with a minor in Data Science from the College of William & Mary.
The views presented in this article are the authors’ own and do not necessarily represent the views of any other organization.