By Tim Cushing
Belatedly realizing its reputation is a burning dumpster floating in a sewage retention pond, Clearview is finally trying to turn things around. After building a database of personal info and photos scraped from public websites, the company turned its product over to anyone who was interested. Private companies, billionaires, dozens of police departments — all were invited to play with Clearview’s toy, which set an untested facial recognition algorithm loose on the billions of images in its database.
Clearview likes to claim its tool is helping solve crimes. Police departments referred to in marketing materials and public statements have almost always disagreed with Clearview’s self-assessment. Now, Clearview is trying to patch up its relationship with the public by altering its relationship with law enforcement agencies. CEO Hoan Ton-That is promising some sort of reform effort here, but his promises of a better, more trustworthy Clearview are as empty as its database is full.
Speaking at The Wall Street Journal’s Tech Live virtual conference, Mr. Ton-That said Clearview would make available training and compliance features that help ensure police officers use the technology ethically to solve crimes, though it would be up to police department heads to monitor officers and enforce the rules. One new feature he described is a requirement that police enter a specific case number and crime for each search, to enable better auditing.
This promise is coming from the same company that encouraged officers to “run wild” by testing the nascent AI on pictures of families, friends, and whoever else’s photos they had access to. Now, this company is claiming it can train officers to use it “ethically” and provide “compliance features” (whatever those are) to ensure only good things are done with its database full of scraped personal info.
The backlash against facial recognition tech has pretty much cleared the playing field for Clearview. With Microsoft, Amazon, and IBM exiting the market at least temporarily, Clearview is poised for growth. But empty assurances that more “training” will be made available aren’t going to make Clearview seem like a more trustworthy option. Neither is leaving it up to police departments to police their own use of the tech. If there’s one thing law enforcement agencies have done consistently, it’s prove they’re incapable of policing themselves, or at least unwilling to.
And Clearview is similarly unwilling to police itself. It may have pulled the plug on private customers as the result of multiple lawsuits, but it’s still not very particular about who it sells to and what they’ll do with the tech. If a government agency buys the software to combat child sexual abuse but then decides it might be similarly useful for hunting down undocumented immigrants, that’s none of Clearview’s business.
“It’s not our job to set the policy as a tech company,” Ton-That said. “It’s up to us to help them execute what they want to do…We are here to help the government and fulfill their vision in keeping us safe.”
If so, then Clearview feels it’s not its job to rein in cops who might abuse the tech. That’s why it’s offering some undefined training and leaving the rest up to law enforcement agencies. Nothing about this will curb abuse. And it seems Clearview doesn’t care whether its product is abused. Only that it sells.