
Facing the End of Privacy

Image by Gerd Altmann from Pixabay

November 18, 2023

Our concept of privacy is rapidly eroding in the modern age, but most of us still believe our physical person remains sacrosanct. In her thrilling and superbly reported new book, Kashmir Hill explores the extent to which facial recognition means that’s not true, writes Joshua Huminski.

The concept of privacy in the modern age is rapidly eroding. By both omission and commission, what was understood to be private and personal has been fundamentally upended by technology. Users generate unquantifiable amounts of data via apps and social media about their habits, behaviors, and preferences—all of which are used to build profiles about them. These profiles are commoditized and sold on to determine what products a customer may like, what advertisements they may respond to, and quite possibly how they will vote.

Yet, we feel as though our person, our physical body and face, remains off-limits. Yes, many of us use biometric-related services, but that is, largely, by choice. What if you didn’t have a choice? What if the system just tracked you by your face—where you went and with whom you associated? Perhaps it was for the ease of navigating security at airports or, ostensibly, to reduce crime. But could you really be sure? 

Your Face Belongs to Us | Kashmir Hill | Random House

Reporter Kashmir Hill unpacks the perils of facial recognition in her thrilling and superbly reported new book “Your Face Belongs to Us,” which explores the issue through her journalistic efforts to uncover Clearview AI—a secretive facial recognition start-up whose tools are increasingly used by law enforcement. Started by an unlikely group consisting of an Australian-Vietnamese software coder, a right-wing troll, and a Rudy Giuliani-linked fixer, Clearview AI has a strange and twisted origin story, one well told by Hill.

For a company keen on identifying every person on the planet at any given time, Hill recounts, Clearview was deeply interested in keeping itself hidden. In her attempts to find out more about the company, she relied on police officers with access to the software to run her own face through the system. This prompted some of her law enforcement interlocutors to cut off engagement with her entirely, with Clearview often suspending users’ accounts once it became aware of their cooperation. Evidently, Clearview knew who she was and was eager to make her reporting even more difficult.

Hill’s book moves between the emergence and growth of Clearview, the history of facial recognition, and the developing awareness of the legal implications of biometric identification. It is a clever, well-constructed narrative; she maintains a clear focus on each of these threads without losing any of them in the process.

“Your Face Belongs to Us” is more than the story of Clearview. It is a window into the challenges that result from the collision of emerging technology, market incentives, lax (or non-existent) regulation, and the societal effects of progress. While some companies, such as Google, have refrained from unleashing facial recognition into the wild, Facebook and Clearview have embraced it, and in the case of the former, got its users to do the dirty work of training its algorithms. Do you remember tagging your friends in a photo, or confirming that a suggested tag was indeed your best friend from high school? If you did, you helped train Facebook’s algorithms. As Hill recounts, the market incentives to get a product out first are significant: greater access to capital, a larger share of the market, and the chance to crowd out competitors.

Companies will, as Clearview did, seek out the jurisdiction of least regulation to maximize their profit potential and user base. Hill writes of a German man who requested the photos of him that Clearview held under the European Union’s General Data Protection Regulation (GDPR), which affords EU citizens markedly greater privacy rights than anything in the United States. While he could, in effect, ask Clearview to forget him, that would not stop its AI from scraping the internet and finding all of his photos again.

By comparison, it is unlikely that Congress will pass anything similar; there is an uneven appreciation of the challenges of facial recognition, and intense lobbying from the tech industry would water down anything approaching GDPR-level restrictions. In the absence of federal action, however, state-level legislation is possible. Hill writes of a little-recognized law in Illinois that sharply limits the use of biometric data, and of an aborted effort in Texas to do the same. In the case of the former, companies like Clearview sought to avoid the state as much as possible, fearing litigation. In Texas, progress remains unfinished.

In most cases, legislative activity won’t happen unless there is a clear demand signal from the public, and that does not appear forthcoming. While Hill was chasing down Clearview, consumer use of facial recognition software seemed simply to appear and become normalized. How many of us use Apple’s Face ID to unlock our iPhones, not just once but dozens of times per day? What about Clear, the solution to the wholly government-created problem of dreadful queues at airport security? It uses your biometrics to prove you are, well, you.

That rapid and almost unquestioned emergence has, however, exposed both the duality of technology’s use and the risks of rushing an untested product to market. Hill covers the story of an innocent African American man in Detroit who was arrested because facial recognition tied him to a series of crimes. These positive identifications are meant to serve only as “investigative leads,” but they often become much more than that—in this case, justification for his arrest and detention. The case raises questions not just about the system’s use by law enforcement but also about the fallibility of the algorithms themselves. It is easy to see how, as happened here, an “investigative lead” becomes the basis for an entire case, especially when the system suggests it has 99% confidence in a match.

Yet those very systems contain biases and flaws. Often trained overwhelmingly on Caucasian faces, they perform markedly worse on minorities. Even improving the algorithms to reduce bias and increase accuracy does nothing to address how law enforcement uses the tools, which is ultimately a legal, ethical, and policy question that must balance the desire for arrest and conviction with the need to uphold equal justice under the law.

This also assumes that the technology remains in the hands of those whom the public trusts, which is clearly not the case. Hill writes of how users of facial recognition software were able to identify adult film actresses and stalk women simply by running their photos through the system. Further afield, the Chinese government uses facial recognition to monitor and control its Uyghur population. Beijing uses systems like these to track anyone within its borders, and it is keen to export this and other elements of its system of social control abroad.

On finishing “Your Face Belongs to Us,” one can’t help but feel disheartened and slightly alarmed. Hill vividly illustrates the dual nature of technology, its promise and its peril, and the need for yet another conversation about balancing the two. Yet there seems little appetite in Congress to address the broader issue of privacy in the digital age. The technology marches inexorably forward, becoming more and more commonplace in our everyday lives as government and commercial adoption increases. We may find ourselves in a world where facial recognition is, well, just there, and we will look back and wonder how we got there. Should that day come to pass, Hill’s book will act, as it does now, as an instructive guide and Cassandra-like warning.

About Joshua Huminski:
Joshua C. Huminski is the Senior Vice President for National Security & Intelligence Programs and the Director of the Mike Rogers Center at the Center for the Study of the Presidency & Congress.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.