As Biometric Screening Spreads, The Law Isn't Keeping Up
By Emma Cueto | October 6, 2019, 8:02 PM EDT

As biometric data gathering such as facial recognition becomes an increasingly common tool in the fight against terrorism, legal protections have lagged behind the technology, experts said at a recent panel in Manhattan, arguing that regulators need to strike a better balance between security and human rights.
At the moment, ground rules are lacking for virtually every aspect of biometric data: the process by which recognition programs are created, the means by which data is gathered, and the ways it is used and shared across agencies and borders, panelists said Tuesday. The event was hosted by the Brennan Center for Justice, New York University's nonpartisan public policy institute.
Both human rights advocates and policymakers said that while biometric data could help save lives and prevent terror attacks, it also could lead to serious human rights violations, especially as the technology evolves rapidly and in unpredictable ways.
"One of the preliminary things that we have flagged is there is very little focus on the full impact of biometric data on human rights," said Krisztina Huszti-Orban, a research fellow with the Human Rights Center who also advises the United Nations on the issue. "The focus needs to be expanded from privacy and data protection to the full spectrum of human rights."
In addition to Huszti-Orban, the panel included representatives from the advocacy organizations Privacy International and the U.S.-focused Surveillance Technology Oversight Project, as well as a legal officer with the U.N. Security Council's Counter-Terrorism Executive Directorate.
Biometrics includes a broad range of information about or derived from an individual's physical or biological characteristics.
Some biometric technologies have been around a long time, such as fingerprinting, while others are more recent, like DNA testing. Still others are only just now being developed, such as facial recognition, gait recognition and even sentiment detection, which promises the ability to detect emotion and mental state.
"It's interesting when you see these technologies work for the first time. It can seem like magic," said Liz O'Sullivan, technology director of the Surveillance Technology Oversight Project. "But these algorithms are not magic. They are created by humans."
That means, she said, that algorithms are fallible, especially when the data they are trained with is incomplete. For instance, she said, some self-driving car programs have struggled with bicycles because their training data did not include enough examples of bikes on the road for the programs to learn how to handle them. Data used to train algorithms that analyze humans can have similar holes, she said.
Populations such as people with mental illnesses, people of color, children and people with disabilities are often underrepresented in the data these algorithms are trained on, and thus can be harmed by the results.
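To make that point concrete, here is a minimal, hypothetical sketch, not something presented at the panel, assuming Python with NumPy and scikit-learn. A classifier trained on data dominated by one group tends to score noticeably worse on an underrepresented group whose patterns differ; every name and number below is purely illustrative.

```python
# Hypothetical illustration (not from the panel): a model trained on data
# dominated by one group performs worse on an underrepresented group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic two-class data whose true decision boundary varies by group."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
    return X, y

# Group A dominates the training set; group B is badly underrepresented.
Xa, ya = make_group(5000, shift=0.2)
Xb, yb = make_group(100, shift=-1.5)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

model = LogisticRegression().fit(X_train, y_train)

# Evaluate on fresh, equally sized samples from each group.
for name, shift in [("group A", 0.2), ("group B", -1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
```

Run as written, the model typically scores far higher on the majority group than on the underrepresented one, simply because the training data told it little about the latter.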
What's worse, O'Sullivan said, is that these programs are designed to handle calculations that humans would simply be unable to process or understand, meaning programs often cannot explain why they have arrived at a particular result.
"We are now going to be making arrests of people with potentially no defense," O'Sullivan said.
Panelists also highlighted other major concerns, such as that databases designed for one highly selective purpose — for instance, screening international arrivals for suspected foreign terrorists — can be merged with other databases, increasing the number of potential uses. Information can also be shared across agencies and across borders without much oversight, panelists said.
And private citizens often don't realize what data about them might have been obtained, how it is being used, or whether it has landed them on a list they might not be able to get off.
"It's understandable" that governments want to use biometric data, said Tomaso Falchetta, head of the advocacy and policy team at Privacy International. "Data is very useful."
But that doesn't mean governments should be able to use it however they like, he said.
All the panelists recognized that there is a lack of law and regulation in place to safeguard human rights. Not all nations have legislation protecting citizens' privacy or data, and not all such laws include biometric data. In addition, many privacy laws make exceptions for national security purposes.
At the international level, panelist Anne-Maria Seesmaa of the U.N. counterterrorism directorate said the group was working on developing guidelines for nations. Although directives from the U.N. have specified that countries are to safeguard human rights when using biometrics, she said, the group hopes to create rules for the best way to do that. In 2018, it released a compendium of recommended practices that filled in some gaps.
"It's not perfect," Seesmaa said. "But it is a first step."
Going forward, she said that the directorate hoped to address pressing concerns such as the sharing of biometric data across borders and to develop a robust framework for how the collection and use of biometrics should be overseen.
"It's actually not that easy to determine what is effective, appropriate, independent oversight of this information," she said.
Others on the panel, however, pushed back against some of the instructions U.N. bodies have put in place, such as the directive to create databases in the first place. Though these databases are developed for national security purposes, Falchetta noted, they can be put to use for other reasons.
Seesmaa, though, defended the use of biometric data, saying it can be employed in counterterrorism to save lives. And although it makes sense from a human rights perspective to restrict the use of data or limit who can access it, she said, those limits could also have consequences.
"When things happen," she said, "people ask, 'You knew — why didn't you do anything?'"
The key, she said, would be finding the right balance.
--Editing by Aaron Pelc.
Have a story idea for Access to Justice? Reach us at accesstojustice@law360.com.