By Tim Cushing
It looks like some members of the Seattle Police Department have taken an interest in Clearview. Clearview scrapes photos and data from the open web and sells access to its untested facial recognition AI to government agencies, private companies, and the odd billionaire. According to Clearview, it has 4 billion scraped records in its database. What it doesn’t have is a proven law enforcement track record for solving crimes, despite making extremely forward overtures to hundreds of law enforcement agencies around the globe.
Records [PDF] obtained by Bridget Brulolo of the Bridge Burner Collective show at least one Seattle PD investigator obtained access to this software and tried it out. This off-the-books test run may have broken a local law.
According to records recently released by the Seattle Police Department (SPD), a detective working for SPD signed up for and used facial recognition app Clearview AI, which appears to be in violation of the City of Seattle’s Surveillance Ordinance.[…]
Released emails indicate that Nick Kartes, a detective on the South Precinct Burglary unit according to a staff roster obtained this summer, signed up for Clearview in September 2019 using his “@seattle.gov” work email address.
The emails show Kartes took the software for a spin at least 30 times, using Seattle government computers to do so. He also downloaded the app to his phone and claimed to have “done some experimenting with it.” It’s unclear whether these “experiments” were related to Kartes’ burglary investigations, but it must be said that Clearview encourages government employees to “experiment” with the software using friends, family members, and random individuals as unwitting test subjects.
In fact, we’ll just let Clearview say it, because this encouragement to “experiment” is included in one of the emails. [Emphasis in the original.]
Here are three important tips for using Clearview:
1. Search a lot. Your Clearview account has unlimited searches. Don’t stop at one search. See if you can reach 100 searches. It’s a numbers game. Our database is always expanding and you never know when a photo will turn up a lead. Take a selfie with Clearview or search a celebrity to see how powerful the technology can be.
This was followed up with another unproven assertion that encourages users to just feed photo after photo into the system.
Investigators who do 100+ Clearview searches have the best chances of successfully solving crimes with Clearview from our experience.[…]
The more searches, the more matches. It’s a numbers game. The investigators who search the most are the investigators that solve the most cases.
Yes, Clearview clearly advocates for responsible use of its tech to search its database of scraped images. Just feed faces into the portal or app and go wild. Fuck the consequences. Just imagine yourself solving crime after crime using this unproven AI. Bathe in the warm glow of being on the side of law and order as you prowl through the personal data of non-suspects.
Clearview’s pitches sound like a mass mailing firm’s. “It’s a numbers game.” Given enough attempts, investigators are pretty much guaranteed to stumble across a usable lead. This ignores all the dead ends caused by false positives and false negatives. It also ignores the damage law enforcement can do with a handful of bad leads generated by “it’s a numbers game” AI. Lives can be destroyed or severely disrupted. The innocent can be imprisoned. And all because a highly questionable facial recognition company that openly advocates for unethical use of its unethically scraped database says that’s the best way to solve crimes.