Tuesday, April 4, 2017

ANALYSIS/OPINION:

Most Americans haven’t sampled the thrill of being the subject of a police line-up, where the victim of a crime studies the faces of suspects from behind a one-way mirror. The proliferation of facial recognition technology changes all that. While the police need every advantage they can manage to stay ahead of evildoers, strong safeguards are necessary to protect individual privacy and prevent false accusations and arrests.

The House Oversight and Government Reform Committee examined police policies on the use of facial recognition technology and discovered that facial photographs — including the “mug shots” of suspects — of 125 million American adults are stored in digital databases already at the disposal of state and federal authorities. The FBI has made arrangements with 18 states to search their digital photographs, including those collected by state departments of motor vehicles, and the bureau is seeking access to the rest.

Rep. Jason Chaffetz, the chairman of the committee, observed that the FBI has not published the required privacy impact assessments for its electronic identification programs. “So here’s the problem,” he told the FBI, “you’re required by law to put out a privacy statement and you didn’t. And now we’re supposed to trust you with hundreds of millions of people’s faces … Why should we trust you?” It was a pertinent question in the wake of allegations of Russian interference in the 2016 presidential campaign and suspicions of U.S. surveillance of the Trump transition team. The dangers inherent in federal cyber-access to the personal communications of Americans are real.

Kimberly J. Del Greco, the deputy assistant director of the FBI, told the committee that the tardy privacy assessments have now been forwarded to the U.S. Justice Department. Meanwhile, she said, the agency’s Interstate Photo System and other facial recognition programs are used “only to create leads, and not intended as positive identification.”

Privacy advocates object to the wholesale inclusion of data on the law-abiding in facial recognition networks used to identify criminals. “Never before — not with fingerprints or DNA — has law enforcement created a national biometric network made up mostly of innocent people,” says Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law School.

The technology is still evolving, and critics say it is prone to false matches. FBI programs fail to make an accurate identification 15 percent of the time, says Jennifer Lynch of the Electronic Frontier Foundation. Facial recognition technology is particularly unreliable at correctly identifying blacks and other ethnic minorities of darker hue, meaning, she warns, that “people of color will likely shoulder more of the burden of the Interstate Photo System’s inaccuracies than whites.”

As identification technology improves, it’s natural for law enforcement officials to rely on it to help keep the peace. But for now the urgency to identify suspects outpaces accuracy. The sort of deceitful unmasking that led to Gen. Michael Flynn’s termination as the president’s national security adviser may seem a present danger only for the high and mighty, but everyone with a driver’s license could eventually fall under the gaze of Big Brother.

If the U.S. intelligence agencies use facial recognition technology capable of placing Americans in a perpetual lineup, Congress is obligated to devise stringent safeguards and impose tough punishment on those who violate them. The late Supreme Court Justice Louis Brandeis rightly described privacy as “the right to be let alone.” That right has never been more precious, or more vulnerable to abuse.

