More Than Just First Amendment Decisions to Come Before Facebook’s Supreme Court

Photo by Glen Carrie via Unsplash.

February 24, 2021

In 2003, when three Harvard seniors hired a sophomore from Dobbs Ferry, New York, to take over a coding project for them, they could never have predicted their role in creating one of the world’s most successful websites. Within the next several months, that sophomore would take his classmates’ ideas and transform them into social media giant Facebook.

Sixteen years later, Facebook is a “massive global enterprise,” embroiled in far too many international controversies to be regarded as a mere website in the new decade. Media coverage had already suggested that the site could interfere with elections before Russia used Facebook in its effort to sway the 2016 American presidential election. In 2018, the company admitted that it should have done more to prevent its platform from being used to stoke violence against Rohingya Muslims in Myanmar. And in Ethiopia that same year, a prominent political activist named Jawar Mohammed used Facebook to incite violence against minority groups in his country.

In the United States, much recent commentary on Facebook’s role in global society revolves around concerns about what tech giants mean for American free speech. On January 7, following the insurrection in Washington, DC, Facebook indefinitely suspended then-President Donald Trump’s account after he encouraged the Capitol riots. Twitter made a similar decision on January 8, opting to permanently suspend Trump’s account in the wake of the violence. Though neither company’s decision violated the First Amendment, which restrains the government rather than private companies from restricting speech in the U.S., Trump’s suspension from social media prompted debate on the topic both at home and abroad. One conservative commentator argued that the First Amendment should apply to tech companies because they are “more powerful than a de facto government.” German Chancellor Angela Merkel also argued that Twitter’s decision was a blow to free speech, saying that the U.S. should instead pass a law specifically prohibiting online incitement, as Germany has done.

By appointing a Supreme Court, Facebook is attempting to govern the third of the global population that uses its site through non-democratic means.

The question of what to do about Donald Trump is just one issue that Big Tech will have to grapple with in the coming decade. Facebook has resolved to handle such questions with its new Oversight Board, a sort of private Supreme Court the tech giant has assembled over the past few years to settle its thorniest content moderation disputes. To figure out how the board might work, Facebook invited experts to international workshops in cities ranging from Berlin to New Delhi, seeking perspectives from all over the world on how its Supreme Court might function.

However, as a recent episode of the podcast Radiolab indicated, even seemingly simple content issues drew myriad responses depending on who was assessing the photo or video. In “Facebook’s Supreme Court,” the podcast interviewed Kate Klonick, a law professor who had sat in on one of Facebook’s international workshops in New York. Klonick described a seemingly straightforward simulation the workshop had conducted, analyzing a photo of a cherubic little girl with a speech bubble above her head that said “Kill All Men.” In New York, workshop participants voted to keep the post on Facebook, explaining that it was merely humor and social commentary. However, Radiolab also interviewed Berhan Taye, whose response to the photo was shaped by circumstances in her native Ethiopia. Taye works for an NGO defending the digital rights of at-risk users around the world, and when analyzing the “Kill All Men” post, she drew on memories of Facebook content she had seen in Ethiopia that encouraged users to literally kill members of rival ethnic groups. That content had, unfortunately, helped fuel real violence against minority groups. At the workshop in Nairobi, Taye and many others perceived the “Kill All Men” post to be a call to violence that should be taken down.

The “Kill All Men” simulation is just one example of what Facebook is grappling with in its attempt to establish a Supreme Court. The issues that will come before this body are far bigger than protecting the First Amendment within the United States, the American right so many argued about after Trump’s account was suspended in early January. Today, Facebook has roughly three billion users, 80% of whom live outside the United States. By appointing a Supreme Court, Facebook is attempting to govern the third of the global population that uses its site through non-democratic means. In the U.S., a Facebook spokesman said that Trump’s account was suspended as “a response to a specific situation based on risk,” but for years, as journalist Emily Bazelon observes, the platform preached free speech at home while disinformation and incitement spread like wildfire abroad. That risk will persist if the new Supreme Court applies the same ethos to cases abroad, where far more people stand to be affected by Facebook’s haphazard approach to managing its content disputes.

About Allyson Berri:
Allyson Berri is a Diplomatic Courier Correspondent whose writing focuses on global affairs and economics.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.
