RICHMOND, Va. (AP) — Last month, Virginia lawmakers quietly passed one of the most restrictive bans in the country on the use of facial recognition technology.
The legislation, which won unusually broad bipartisan support, prohibits all local law enforcement agencies and campus police departments from purchasing or using facial recognition technology unless it is expressly authorized by the state legislature.
But now, some law enforcement officials are asking Gov. Ralph Northam to put the brakes on the legislation, arguing that it is overly broad and hasn’t been thoroughly vetted.
“I think a lot of people want to know what impact that is going to have on public safety and a lot of other industries if you do away with it,” said John Jones, executive director of the Virginia Sheriffs’ Association.
“It is a way to catch bad guys — you can catch really bad actors — and that’s always a good thing,” Jones said.
The wide scope of the legislation and the level of support it gained in the legislature surprised even the bill’s lead sponsor, Democratic Del. Lashrecse Aird. There was little to no pushback from police and not a single lawmaker voted against it — Democrat or Republican.
Aird said she drafted the bill in response to an investigation by The Virginian-Pilot newspaper that found some gang detectives in the Norfolk Police Department had been using a controversial facial recognition app by Clearview AI to identify suspects in criminal investigations, without the knowledge of city leaders.
Earlier this month, the Virginia Beach Police Department acknowledged that 10 of its detectives also had used the Clearview AI app. In both cities, the police chiefs ordered all officers to stop using the program.
Clearview AI’s app uses a database of more than 3 billion images that the company says it scraped from Facebook, Venmo, YouTube and millions of other websites, according to a New York Times investigation.
The Virginia legislation says that any law enforcement agency using facial recognition technology, including Clearview AI, must stop as of July 1, when the law would go into effect.
“Citizens should have control of and awareness of whether or not their law enforcement officers are using this type of technology,” Aird said.
“The immediate baseline-level concern is that these databases have misidentified people on a large scale, particularly anyone with significant pigmentation, so Black and brown people,” she said.
Aird’s original bill called for banning law enforcement agencies from using the technology unless their local municipal leaders passed an ordinance authorizing it.
To her surprise, Republican Sen. Ryan McDougle proposed amendments to make the legislation much more restrictive, requiring any locality that wants to use facial recognition to go to the state legislature and ask for authorization through a statute. McDougle said he was concerned that the original version of the bill could have created a “hodgepodge” of laws in communities around the state.
“This is biometric data that is unique to each person, so we need to put in protections, just like we would for other things that are being used in the criminal justice arena,” he said.
At least 20 cities and five other states, including California, New Hampshire, New York, Oregon and Vermont, have halted or limited government use of facial recognition technology, according to the American Civil Liberties Union, a leading opponent of its use by police.
“If the governor signs this bill into law, Virginia will take a seat alongside the strongest state and municipal bans in the country,” said Chad Marlow, the ACLU’s senior policy counsel on privacy, surveillance and technology.
The ACLU cites studies that have found higher error rates for facial recognition software used to identify people of color, women, children and the elderly.
Aird said that under her legislation, local police departments would no longer be allowed to use Clearview AI as of July 1. Another program they’d be prohibited from using is the National Capital Region Facial Recognition Investigative Leads System, she said.
Fourteen local and federal law enforcement agencies in the Washington, D.C., region have access to the system, which uses a database of about 1.4 million images of mug shots supplied by the agencies, The Washington Post reported.
In response to an open records request, Fredericksburg police said they have used the system several times since 2019, including in a January 2021 rape case. After further investigation, charges were brought against a suspect. The case is still active.
Jones said he had been under the impression that the legislature had approved the original bill, which required only local approval for law enforcement agencies to use facial recognition. The sheriffs’ group has asked Northam to put a reenactment clause on the legislation, which would require another vote in the General Assembly next year before it could become law.
“There’s no exclusion for anything in this bill, and I think it was done too hastily,” Jones said. “I just think it goes way too far.”
Northam has not said whether he will sign the bill. He has until Wednesday to send any amendments to the legislature to consider.
State police are not covered by the legislation. Last year, the agency discovered that six troopers had downloaded free trial Clearview AI accounts. Two of the accounts were never used. The other four were used for about five months before they were discovered and then shut down at the direction of the troopers’ supervisors, spokeswoman Corinne Geller said.