The early 21st century is a confusing time. At no point in history has humanity had as much access to information as it does today, yet it has never seemed harder to tell what is true and what is not. There is “fake news”, “alternative facts”, the creation of “our own realities” by our actions, “gaslighting”, and more. If we don’t like the information presented to us, we can simply go find information more to our liking and more in line with our preconceived notions. Increasingly, thanks to advanced algorithms and social media, computers coddle our biases for us, often without our even knowing it.

Deep Fakes: The Coming Infocalypse | Nina Schick | Twelve | August 2020.

This is a situation that seems to be out of control, and if author Nina Schick is right, it will only get worse and faster, thanks to the emergence of “deep fakes” and synthetic media. In her aptly titled new book “Deep Fakes: The Coming Infocalypse”, Ms. Schick explores two mutually reinforcing trends: the rise of fake media through advanced technology (deep fakes) and the crisis of misinformation.

In essence, according to Ms. Schick, the combination of pervasive misinformation, the proliferation of alternative media sources, the erosion of expertise, and the weakening of traditional information arbiters (news media, official sources, etc.) has left the world amid an “Infocalypse”. Russian active measures campaigns, the White House’s denial of facts, anti-vaccination campaigns, conspiracy theory movements, clickbait-driven media, and more have all created an environment in which it is harder than ever to discern fact from fiction.

Yet, with the advent and democratization of synthetic media tools and technologies, this Infocalypse will only get worse, according to Ms. Schick, as anyone can make anyone say anything or fabricate any image they want. It will become even more difficult to tell what is real and what is not.

Manipulating the Market

Russian disinformation, America’s information marketplace, and the role of the media have all been explored in greater depth elsewhere, and in attempting to do so here, Ms. Schick detracts from what is arguably of greater interest. Schick’s book is at its strongest when it is least focused on America’s political scene and the accompanying information environment. What synthetic media means for the broader economy—illicit and licit—and what it can do in non-Western environments is fascinating.

Her exploration of synthetic media, first in the context of adult entertainment and later for the broader economy, is very interesting but also merits further exploration. Of course, readers may titter at the adult entertainment aspect, but it is an interesting vehicle for understanding the ownership of images and identity as well as the broader marketplace. If celebrity faces can be (and indeed are) superimposed onto adult performers, creating synthetic media with little legal recourse, what does that mean for the rest of us? As Schick notes, there have already been cases of unsuspecting individuals having their likeness used in a similar fashion.

Taking it one step further, long-dead celebrities are now resurrected to hawk various products and services, or even perform shows. If anyone, or their likeness, can be made to say and do anything, how can we tell what is real and what is not?

We’ve already seen the impact of “fake news” on the marketplace. A tweet from the Associated Press’ hacked account alleging explosions at the White House prompted a brief market sell-off. Of course, no such attack had occurred, but the combination of automated trading and algorithms primed for keywords led to an automatic response. What if a video (or even just audio) was leaked that allegedly showed a CEO informing investors of a fatal flaw in their flagship product? Or that she was suffering from incurable cancer? What would happen to the company’s stock price, and how quickly could the damage be contained or rolled back?
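The mechanism behind that sell-off is simple enough to sketch. A minimal, purely illustrative version of keyword-primed trading logic might look like the following—the keyword list and the two-match threshold are invented here for illustration and are not drawn from any real trading system:

```python
# Toy sketch of a keyword-primed trading trigger. Real news-driven
# trading systems are far more sophisticated; this only illustrates
# why a single fake headline can cascade into automatic selling.

ALARM_KEYWORDS = {"explosion", "bombing", "attack", "white house"}

def should_sell(headline: str) -> bool:
    """Return True if the headline matches enough alarm keywords
    to trigger an automatic sell order."""
    text = headline.lower()
    hits = sum(1 for kw in ALARM_KEYWORDS if kw in text)
    # Require two matches to cut down on false positives.
    return hits >= 2

# The hacked AP tweet would have tripped this trivially:
print(should_sell("Breaking: Two Explosions in the White House"))  # True
print(should_sell("White House hosts annual state dinner"))        # False
```

The point of the sketch is the asymmetry: a filter like this reacts in microseconds, while verifying that the underlying story is false takes humans minutes or hours—exactly the window a synthetic-media hoax would exploit.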

In David Ignatius’ latest spy thriller, The Paladin (also reviewed for the Diplomatic Courier), this theory features as part of the plot. Without giving too much away, the CIA finds itself contending with synthetic media and its power to disrupt.

Be it the marketplace or beyond, our information economy is primed for emotional, “gut” responses; it relies upon the human tendency to react instantaneously rather than seek out the truth. Rumors, clickbait, instant analysis, viral videos—the modern competition for eyeballs and ad dollars incentivizes emotional reactions rather than rational analysis. Synthetic media or deep fakes could easily be leveraged to exploit this tendency.

Deep Fakes in the Wild

In February 2017, a false report alleging the rape of a 15-year-old girl by German-speaking men circulated within Lithuania. The Russian-manufactured story sought to generate animosity towards the German troop presence in Lithuania and play upon very real historical grievances. In this instance, the campaign failed, but imagine—as horrid as it would be—if a similar situation played out with faked video or images, or perhaps played out in a less information-saturated environment like sub-Saharan Africa?

Here, again, Schick shines. She explores cases in Myanmar, India, and Gabon. In Myanmar, social media—not faked—was used to mobilize anti-Rohingya sentiment, leading to very real violence. Before Facebook acted to shut down accounts advocating violence, an estimated 25,000 ethnic Muslims were killed in that country and another 700,000 fled Myanmar.

In South India, a video circulated on WhatsApp allegedly showing a child being kidnapped on a motorbike. Five individuals traveling through the area were targeted by a mob, forcing them to flee. One was beaten to death and two suffered severe injuries. The video was not real; it was part of a public-awareness campaign in Pakistan warning about the threat of child abductions, simply stripped of its context. Schick also describes the story of an investigative journalist in India, Rana Ayyub, who was targeted with synthetic media and deep fakes used to create lewd images of her in retaliation for her reporting.

In Gabon, President Ali Bongo disappeared from public sight for a period prompting speculation about his health and well-being. A press conference held to prove his safety resulted in even more rumors and allegations that it was a “deep fake” or that he had been replaced with a body double. This led to a coup attempt, which ultimately failed, but illustrates the dangers of an inability to trust the information marketplace.

Her treatment of the COVID-19 disinformation campaign feels a bit tacked on, as it is neither fully fleshed out nor entwined with her overall theme of deep fakes and synthetic media. This is not entirely the author’s fault: the situation is fluid and evolving, and it is likely to be some time before the full story of China’s manipulation of COVID-19-related media and Russia’s exploitation of the pandemic is fully explored.

Revolution or Evolution?

Do “deep fakes” and synthetic media represent a revolutionary change in information warfare, or are they merely an evolutionary step in disinformation and propaganda? Arguably, and after reading Schick’s book, they appear to be more the latter than the former.

Every technological innovation from the telegraph to the internet has been—and will be—used for information warfare purposes, whether it is an active measures campaign to manipulate an adversary, or an effort to introduce confusion into an enemy’s decision-making cycle. In October 2017, Russia attacked NATO troops’ cellphones in an attempt to gain intelligence on exercises in Poland. It doesn’t take much extrapolation to imagine Moscow sending fake or confusing orders as if it were those soldiers’ commander. Any delay in response could provide a tactical advantage in a crisis.

Still images and video do have a powerful effect—seeing is believing—but audio can be just as powerful, especially if you can make anyone say anything with a few recordings of their real voice and some readily available software.

What is different is the speed and ease with which these fakes can be created, and the democratization of the technology. Thanks to the internet, nearly everyone can download and use deep fake technology, whether for entertainment or criminal activity. An internet-famous cat, OwlKitty, is regularly inserted into popular films alongside famous actors and actresses. In a recent upload, OwlKitty’s owner edited the cat’s face onto Catherine Zeta-Jones’ body in the film Entrapment. In a scene where Ms. Zeta-Jones must dodge lasers to steal a priceless mask, OwlKitty (being a cat) jumps at the laser, triggering the alarm. In the accompanying behind-the-scenes story, the owner explains how she used software to add the cat’s face to the three-second clip so that Ms. Zeta-Jones appears to be reacting to the mischievous feline.

If an internet cat account can use deep fakes for a laugh, imagine what the GRU in Moscow could do.

About Joshua Huminski:
Joshua C. Huminski is Director of the Mike Rogers Center for Intelligence & Global Affairs at the Center for the Study of the Presidency & Congress.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.

a global affairs media network

www.diplomaticourier.com

Deep Fakes: The Coming Infocalypse

August 29, 2020
