IT IS A WEEK of renewed social crisis in the United States, which means American companies are quickly lining up to pay lip service to the cause. Just like its tech giant competitors at Facebook, Apple, and Google, Amazon tweeted vaguely in favor of the principles of social justice and equitable policing, a predictable and predictably tinny expression of corporate solidarity with “the fight against systemic racism and injustice.” But Amazon is arguably singular among its mega-tech peers in its determination to provide American law enforcement with tools that experts say enable racist policing.
In their rush to appear sympathetic to the rough contours of social justice — while keeping their legal, public relations, and social media teams in agreement — some companies seem to be forgetting what it is they actually do. When Nextdoor, a social network with a well-documented pattern of stoking the worst kinds of racial panic, tweets an image reading “BLACK LIVES MATTER,” it’s difficult to take seriously. But while Nextdoor is merely content to rationalize and streamline urban and suburban residential paranoia into a tidy algorithmic feed, Amazon, as Wired’s Sidney Fussell noted yesterday, has devoted a growing portion of its business to expanding a public/private video surveillance dragnet across the country with an explicitly “anti-crime” mission.
In 2018, the ACLU published a report showing that Amazon’s “Rekognition” facial recognition software was fundamentally racially biased: in the ACLU’s test, it disproportionately misidentified black members of Congress as people who had been arrested and had their mugshots in a police database. “The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.),” read the report. “Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress.” A study by an MIT team similarly found that Rekognition misclassified darker-skinned women as men 31 percent of the time.
The ACLU report added, “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that.” As the ACLU and other artificial intelligence researchers have made clear, the threat of computerized misidentification isn’t just an academic error, but a potentially ruinous one that could improperly influence police officers prior to an encounter, or even cause them to seek a search warrant, by presenting them with a false criminal history.
It’s difficult to reconcile this reality with a recent tweet from Amazon executive Andy Jassy, chief of Amazon Web Services, the cloud computing division which operates Rekognition:
As Wired’s Fussell astutely pointed out, Jassy has previously defended Rekognition’s use by police on the grounds that the company’s “terms of service” and the U.S. Constitution would prevent any abuse.
Equally thin is Amazon’s companywide statement:
The inequitable and brutal treatment of Black people in our country must stop.
Together we stand in solidarity with the Black community — our employees, customers, and partners — in the fight against systemic racism and injustice.
What exactly does it mean to oppose the discriminatory policing of black Americans while simultaneously selling discriminatory tools to the police? Or while operating Ring, a surveillance network predicated on the detection and elimination of racially coded “suspicious activity,” which funnels video directly to local police — and which once handed out to employees blue and black Ring-branded novelty police badge stickers emblazoned with “FUCK CRIME” on the front?
In a statement to The Intercept, ACLU technology attorney Jacob Snow said, “If Amazon and Andy Jassy really stand in solidarity with the black community, then they should stop selling facial recognition surveillance technology that supercharges police abuse against the black community. They should also stop firing workers organizing for better conditions in Amazon warehouses. Real solidarity goes beyond empty posts on social media that do nothing to protect black people.”
Liz O’Sullivan, a privacy researcher and advocate who quit her job at AI surveillance firm Clarifai last year over ethical concerns, told The Intercept that the services Amazon provides to police nationwide “may enable these officers to track and suppress those protesting and speaking out against racial injustice” through the use of facial recognition. “Surveillance is a racial justice issue,” O’Sullivan added. “The two cannot be separated.”
I asked Amazon and Jassy repeatedly whether they would permit the use of Rekognition to identify individuals protesting against police violence, and whether the company would more generally reassess or modify the ways in which police use Rekognition; I received no response. Given that Amazon refuses even to disclose which law enforcement agencies use Rekognition, it’s probably no surprise that the company wouldn’t provide clarity on how the software is or isn’t used. It’s one thing to quietly profit from an unjust system, and quite another to decry its existence while selling the very software it runs on.
Courtesy The Intercept