Lawmakers Propose a Ban on Government Use of Biometrics
A group of 16 Democratic members of Congress in early March reintroduced legislation that would ban federal government agencies from using biometric technologies for surveillance purposes or in judicial proceedings. Though primarily geared toward the use of facial recognition by law enforcement agencies, it also covers other biometric modalities, including voice “and other immutable physical characteristics.”
If approved, the proposed legislation, called the Facial Recognition and Biometric Technology Moratorium Act of 2023, could also cut off federal funding to state and local agencies, including police, unless they enact similar bans.
Similar legislation was introduced with the same name in 2021, but it never made it out of the Senate Judiciary Committee.
This year’s version would make it illegal for any federal agency or official to “acquire, possess, access, or use” biometric surveillance or information from video or audio systems operated by third parties. It was introduced by U.S. Sen. Ed Markey (D-Mass.), who also introduced the 2021 bill, and co-sponsored in the Senate by Jeff Merkley (D-Ore.), Bernie Sanders (I-Vt.), Elizabeth Warren (D-Mass.), and Ron Wyden (D-Ore.).
“The year is 2023, but we are living through 1984,” Markey said in a statement. “Biometric data collection poses serious risks of privacy invasion and discrimination, and Americans know they should not have to forgo personal privacy for safety. As we work to make our country more equitable, we cannot ignore the technologies that stand in the way of progress and perpetuate injustice.”
Rep. Pramila Jayapal (D-Wash.) introduced a companion bill in the House. Ayanna Pressley (D-Mass.), Rashida Tlaib (D-Mich.), Earl Blumenauer (D-Ore.), Cori Bush (D-Mo.), Greg Casar (D-Texas), Adriano Espaillat (D-N.Y.), Barbara Lee (D-Calif.), Eleanor Holmes Norton (D-D.C.), Jamaal Bowman (D-N.Y.), and Jan Schakowsky (D-Ill.) co-sponsored the House version.
The legislation has the support of dozens of civil rights and privacy advocates, including the American Civil Liberties Union (ACLU), Fight for the Future, and the Electronic Privacy Information Center.
Though not specifically covered under the proposed legislation, some organizations have also sounded the alarm on companies and private entities using biometrics technology on their employees and customers.
FTC Warns of Voice Cloning Scam
The U.S. Federal Trade Commission (FTC) in mid-March issued a strong warning about crooks using voice cloning technology to steal Americans’ money.
The warning, dated March 20 and appearing on the FTC’s website, urges consumers to be vigilant for calls using voice clones generated by artificial intelligence. In the calls, a computer-generated reproduction of the voice of a friend or family member asks for money to help get them out of trouble. The most common scam targets senior citizens and features the cloned voice of one of their grandchildren claiming to have been arrested for a minor infraction and needing money for bail. The callers typically want to be paid via wire transfer, cryptocurrency, or gift card, though some have also requested cash to be dropped off at a particular location.
“All [the scammer] needs is a short audio clip of your family member’s voice—which he could get from content posted online—and a voice-cloning program,” the FTC warning said. “When the scammer calls you, he’ll sound just like your loved one.”
As AI continues to advance in the area of speech technology, cybercriminals are finding new ways to exploit vulnerabilities in voice-based systems through voice manipulation and voice cloning techniques. Using sophisticated AI algorithms, hackers can record voices and use them to authorize fraudulent payments or gain access to sensitive information, the FTC said.
Consumers are not the only ones at risk: experts say it is critical for businesses to stay current and adapt, chiefly by embracing technologies that prioritize security.