Analysis

Surveillance Reformers Seize Moment Amid Protests, Virus

By Ben Kochman

Law360 (June 26, 2020, 9:58 PM EDT) On a Thursday afternoon in January, police handcuffed Robert Williams, a Black man living in a Detroit suburb, on his front lawn as his wife and two young daughters looked on, charging him with stealing five watches from the luxury retailer Shinola.

Facial-recognition software used by Michigan State Police had tagged Williams as a suspect in an October 2018 shoplifting incident, matching him with an image taken from security footage recovered at the scene.

Michigan resident Robert Williams says a false match in the state's facial-recognition software resulted in his wrongful arrest in January on suspicion of shoplifting watches from a luxury retailer in Detroit. (ACLU)

Based on the match, Detroit police put Williams' driver's license photo into a lineup with photos of other Black men. A store security consultant who did not witness the crime then identified Williams from the lineup as the man in the security video, according to county prosecutors.

There was a problem with the investigation, however: The facial-recognition software that marked Williams as a suspect had mistaken him for another Black man, according to a complaint filed by the American Civil Liberties Union this week.
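
The mechanics behind such a mismatch are straightforward to sketch. Face-recognition systems typically reduce each photo to a numeric feature vector, or "embedding," then flag the gallery entry most similar to the probe image if its score clears a threshold. The toy Python below, using invented data rather than any real vendor's algorithm, shows how a permissive threshold can produce a confident-looking hit for someone who was never in the scene:

    import numpy as np

    def cosine_similarity(a, b):
        # Similarity between two face "embeddings" (feature vectors).
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def best_match(probe, gallery, threshold):
        # Return the most similar gallery identity, or None if no score
        # clears the match threshold.
        name, score = max(
            ((n, cosine_similarity(probe, e)) for n, e in gallery.items()),
            key=lambda pair: pair[1])
        return (name, score) if score >= threshold else None

    # Invented data: 1,000 random "license photo" embeddings and one
    # "security camera" probe that belongs to none of them.
    rng = np.random.default_rng(0)
    gallery = {f"license_photo_{i}": rng.normal(size=128) for i in range(1000)}
    probe = rng.normal(size=128)

    print(best_match(probe, gallery, threshold=0.10))  # permissive: spurious hit
    print(best_match(probe, gallery, threshold=0.90))  # strict: no match

How a deployment sets that threshold, and whether investigators treat a hit as a lead or as evidence, shapes how often false matches like this one reach an arrest.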

Authorities later dismissed the case for lack of evidence — but not before Williams spent nearly 30 hours in police custody, according to the administrative complaint filed with Michigan authorities.

When the officers questioning him showed him the store's security photo, Williams held the image up beside his face and convinced them it showed a different man. "The computer must have gotten it wrong," an officer replied, according to the complaint.

Williams' arrest, widely publicized Wednesday, has added fuel to the push for limits, moratoriums or outright bans on facial-recognition software and other surveillance tools used by governments and private companies around the country. The ACLU says Williams is the first person known to have been arrested in the U.S. based on a flawed facial-recognition match, but suggests that other such cases have yet to come to light.

The complaint was also filed amid what privacy advocates and industry attorneys described in interviews as a unique moment in the history of surveillance reform.

The protests erupting after police killings of Black people, including George Floyd in Minneapolis and Breonna Taylor in Louisville, Kentucky, have highlighted how government surveillance can disproportionately hurt people of color, they told Law360. Some law enforcement agencies have also reportedly used those same surveillance tools to monitor the protesters themselves, advocates say.

At the same time, government attempts to persuade citizens to sacrifice some of their privacy in efforts to trace the spread of the novel coronavirus have added to the scrutiny of authorities' surveillance practices.

"It's amazing how fast the whole dialogue around the issue has changed in just a few months," said Laura Jehl, who heads the privacy and cybersecurity practice at McDermott Will & Emery LLP.

"Just when some people were thinking, 'I'm ready to give up some of my privacy to be safe from COVID-19,' that maybe technology is going to save us, that feeling has evaporated," Jehl said. " You go to a Black Lives Matter rally and say, 'Maybe I don't want the government to be able to track where I am or who I am with.'"

Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts, said that "people across political parties and ideologies are increasingly making the connection" between what she called "racist policing" and the "disproportionate amount of surveillance that has been directed at Black and brown people in this country."

"We better use that awakening to fight for systemic legal reforms to fight against these systems of oppression," Crockford told Law360. "We better do a lot more than shrug our shoulders."

Look Local for Regulation

A group of congressional Democrats on Thursday introduced the first U.S. federal legislation that would temporarily ban government use of facial-recognition software nationwide. The Facial Recognition and Biometric Technology Moratorium Act is backed by Sens. Ed Markey, D-Mass., and Jeff Merkley, D-Ore., and in the House by Reps. Ayanna Pressley, D-Mass., and Pramila Jayapal, D-Wash.

Several other proposals to curb the use of the technology are pending in Congress. But more immediate action is far more likely at the municipal or state level, where lawmakers and advocates had been pushing reforms even before the pandemic and the recent police protests.

Boston's City Council voted unanimously Wednesday to ban the city government from using facial-recognition software, becoming the second-largest U.S. city to do so. Similar laws have been passed in San Francisco, Oakland and Berkeley in California, as well as Cambridge and Somerville in Massachusetts.

Earlier this month, the New York City Council voted to require the NYPD, the nation's largest police department, to reveal details on the facial-recognition software, cellphone trackers and other surveillance tools it uses to monitor people. That bill had stalled since its introduction in 2017, with the police claiming that criminals and terrorists would be able to take advantage of the information. But the legislation gained momentum amid the nationwide protests for police reform, its backers say.

"The sea change in the political landscape is incredible and that is 100% driven by the historic protests and uprisings around the country," said attorney Albert Fox Cahn, executive director of the New York City-based Surveillance Technology Oversight Project.

"There was a mindset until just a few months ago that the police were the ultimate deciders of who gets to choose how to balance privacy and public safety, but that is all changing," Cahn said.

New York state lawmakers are also hoping to build support for new legislation that would ban law enforcement from seeking broad warrants for cellphone data within a geographic area — a tactic they fear could compromise the privacy of New Yorkers participating in the ongoing protests.

Other state legislation to watch includes a proposed moratorium on Massachusetts' state government using facial surveillance technology, backed by the ACLU and other privacy advocates.

Push for Moratoriums, Bans Gains Steam

Major tech companies like Microsoft Corp. have been calling for Congress to regulate facial-recognition technology since at least 2018. But in recent weeks, with federal lawmakers still failing to reach a consensus on the issue, Microsoft, Amazon and IBM have taken the further step of enacting one-year moratoriums on police using their facial surveillance products.

Amazon, which has drawn the ire of privacy advocates by sharing video from its internet-connected Ring home security systems with police under little oversight, said in a press release that it hoped the moratorium would give Congress time to enact regulations on the technology's usage. But industry attorneys are skeptical that such a federal proposal will gain momentum any time soon.

"Everywhere but Congress is moving on this, but I don't see the urgency," Jehl said. "It may be that Congress is too divided in a presidential election year to do more on the issue right now."

A Washington state facial-recognition law backed by Microsoft was seen by some as a potential national model when it passed in March. The law requires governments to submit facial-recognition technology for public review before using it and to allow third parties to test it for accuracy and bias, among other measures.

But the ACLU slammed the bill in a press release at the time as falling "far short of providing adequate protections for communities of color, immigrants, and religious minorities who have been historically harmed by government surveillance." A similar bill proposed in California failed to pass earlier this month after being criticized by privacy groups.

Those arguing for either moratoriums or outright bans on facial surveillance technology point to a report released in December by the National Institute of Standards and Technology, part of the U.S. Department of Commerce, which found significantly higher false positive rates for Black, Asian and Native American faces. The institute analyzed 189 software algorithms from 99 developers.

A Massachusetts Institute of Technology and Stanford University study released in 2018 similarly found that three commercial facial analysis programs had an error rate of 34.7% for dark-skinned women, compared to just 0.8% for light-skinned men.
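
The statistic at issue in both studies is a per-group false positive rate: the share of comparisons between two different people that the software nonetheless scores as a match. A minimal Python sketch of that computation, using invented group labels and scores rather than either study's data:

    from collections import defaultdict

    def false_positive_rates(impostor_trials, threshold):
        # impostor_trials: (group, score) pairs where the two faces compared
        # are known to belong to different people. A false positive is an
        # impostor pair scored at or above the match threshold.
        totals, hits = defaultdict(int), defaultdict(int)
        for group, score in impostor_trials:
            totals[group] += 1
            if score >= threshold:
                hits[group] += 1
        return {g: hits[g] / totals[g] for g in totals}

    # Invented scores for two hypothetical demographic groups.
    trials = [("group_a", 0.91), ("group_a", 0.42), ("group_a", 0.30),
              ("group_b", 0.95), ("group_b", 0.88), ("group_b", 0.97)]
    print(false_positive_rates(trials, threshold=0.85))
    # {'group_a': 0.33..., 'group_b': 1.0}

Unequal rates at a fixed threshold are precisely the demographic differentials the NIST report measured: a system tuned to an acceptable overall error rate can still be far less reliable for some populations.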

Advocates and industry attorneys agree that law enforcement use of facial surveillance raises greater civil rights concerns than its use by private companies. But private-sector applications of facial recognition are far more widespread than law enforcement use, spanning advertising and access control for restricted areas at businesses, airports and casinos, according to Mark McCreary, co-chair of the privacy and data security practice at Fox Rothschild LLP.

Most of the vendors selling facial-recognition products to private businesses are far less well known than Microsoft or Amazon, privacy advocates say. Only Illinois, Texas and Washington currently have state laws regulating the commercial use of facial recognition, with Illinois' Biometric Information Privacy Act by far the most influential because it allows individuals to bring lawsuits.

States may also fold rules on facial recognition into broader privacy laws like the California Consumer Privacy Act, which requires private companies to disclose how facial mapping is used and to offer consumers a way to opt out and delete their data, McCreary said.

"Many states are considering laws similar to the California law, and it is anticipated most other states will eventually follow suit," he told Law360.

--Additional reporting by Emma Whitford. Editing by Breda Lund and Jill Coffey.

Correction: A previous version of this story misstated the agency that houses the National Institute of Standards and Technology. This error has been corrected.

