Clearview’s facial recognition app is identifying child victims of abuse

SFGate

Law enforcement agencies across the United States and Canada are using Clearview AI — a secretive facial recognition startup with a database of 3 billion images — to identify children who are victims of sexual abuse. It is a powerful use case for the company’s technology, but it also raises new questions about the tool’s accuracy and about how the company handles data.

Investigators say Clearview’s tools allow them to learn the names or locations of minors shown in exploitative videos and photos, children who might otherwise never have been identified. In one case in Indiana, detectives ran images of 21 victims of the same offender through Clearview’s app and received 14 IDs, according to Charles Cohen, a retired chief of the state police. The youngest was 13.

“These were kids or young women, and we wanted to be able to find them to tell them we had arrested this guy and see if they wanted to make victim statements,” Cohen said.

Another official, a victim identification officer in Canada, who was not authorized to discuss investigations publicly, described Clearview’s technology as “the biggest breakthrough in the last decade” in the field of child sexual abuse crimes.

But privacy advocates say the company’s database is untested and unregulated, and could cause new kinds of harm. Clearview stores pictures uploaded by investigators — known as probe images — on its servers, meaning it could amass an extraordinarily sensitive data set of child victims of sexual abuse and exploitation.

“We understand the extreme sensitivity involved with identifying children,” Clearview’s founder, Hoan Ton-That, wrote in an email. “Our mission is to protect children.”

According to a company document distributed to clients, “searches are retained forever” by default, but administrators can change their...