January 25, 2021

New York Schools Putting Students In The Crosshairs Of Tech That Targets Minorities, Thinks Broom Handles Are Guns

By Tim Cushing
By Tim Cushing
We’re turning over the discipline of school kids to cops and their tech, and it’s just making existing problems even worse. We’ve seen the problems inherent in facial recognition tech. And it’s not just us, this so-called leftist rag (according to our anonymous critics), saying so. It’s also the National Institute of Standards and Technology (NIST). Its study of 189 facial recognition algorithms uncovered why most legislators seem unworried about the surveillance creep:

Middle-aged white men generally benefited from the highest accuracy rates.

When systems pose no risk to you personally, it’s unlikely you’ll object to rollouts of unproven AI and questionable tech. If it only hurts people who aren’t you or your voter base, any incremental increase in law enforcement “effectiveness” is viewed as an acceptable tradeoff.

Destroying the lives of minorities has never been a major legislative concern. But if we all agree children are our future, it seems insanely destructive to turn a blind eye to the havoc this tech can create. Unless, of course, legislators believe only white children can secure the future (give or take 14 words). Then it’s OK, even when it definitely isn’t.

Documents obtained by Motherboard show few people care about minorities, no matter what government position they hold. Vetting contractors should be the first check against abuses. But it appears no one involved with regulating the lives of students (who are legally obligated to attend schools that view them as criminals) cares what happens to the minors they subject to the racist tendencies of law enforcement agencies and the tech they deploy.

Ever since they learned that Lockport City School District intended to install a network of facial recognition cameras in its buildings, parents in the upstate New York community—particularly families of color—have worried that the new system will lead to tragic and potentially fatal interactions between their children and police.

Now, documents newly obtained by Motherboard accentuate those fears. They show that SN Technologies, the Canadian company contracted to install Lockport’s facial recognition system, misled the district about the accuracy of the algorithm it uses and downplayed how often it misidentifies Black faces. The records, comprising hundreds of pages of emails between the district and the company, also detail numerous technical issues with SN Technologies’ AEGIS face and weapons detection system, including its propensity for misidentifying objects like broom handles as guns.

Wonderful. The collective shrug of legislators is feeding kids to racist tech with a proven track record of being unable to identify criminal suspects. This one goes a step further. It’s unable to detect weapons accurately, which is probably why cops think it works great. Cops can’t seem to tell a cellphone or a Wii controller from a gun, so whatever justifies the use of force is an acceptable tradeoff for… um… not deploying force, I guess. So, when lives are actually on the line, cops will be chasing down broom handles held by minorities, rather than weapons held by white people, who are far more likely to engage in school shootings.

The New York State Education Department (NYSED) stands by its approval of this questionable tech… sort of. Lockport officials have refused to comment. So has the police department making use of it. And so has their chosen facial recognition vendor, SN Technologies, which provides the AEGIS tech.

It’s not like they didn’t have any warning that the tech was faulty. Lockport officials received an email that discusses AEGIS’s accuracy and propensity for aggravating racial biases. The AI finished 49th out of 139 algorithms in the NIST’s test for racial bias. But even that weak finish was overstated. As the NIST pointed out, the algorithm submitted by SN Technologies (which licenses its algorithm from French firm id3 Technologies) wasn’t the same one that’s being deployed in New York schools.

[A]ccording to Patrick Grother, the NIST scientist who oversaw the testing, the agency never tested an id3 Technologies algorithm that matches the description Flynn gave Lockport officials. “Those numbers don’t tally with our numbers. They’re not even close. What id3 sent to NIST is not what these people are talking about,” Grother told Motherboard.

The documents obtained by Motherboard show something even more nefarious than the submission of an algorithm that didn’t actually represent what the company sold to clients. It appears SN Technologies lied to school officials about the NIST’s test results, claiming the algorithm was nearly twice as accurate as NIST testing actually showed.

But that hasn’t stopped the rollout of facial recognition tech that disproportionately misidentifies minorities and/or their non-weapons. Other schools — some in other states — seem to believe faulty tech is better than no tech at all, especially if there’s a chance the next false positive could prevent a school brooming.

At least 11 other districts in the state have since applied for Smart Schools money to purchase facial recognition systems, according to a NYCLU analysis of the applications. Schools in other states, such as South Carolina, have also deployed similar systems which claim the ability to detect weapons and stop school shootings.

We’ll see if the spread of terrible tech slows in the future. Facial recognition is currently the target of lawsuits and legislation in New York. But if past performance is any indicator of future results, the tech isn’t going to go away, no matter how poorly facial recognition tech, you know, recognizes faces.

Source: https://www.techdirt.com/articles/20201201/10065545799/new-york-schools-putting-students-crosshairs-tech-that-targets-minorities-thinks-broom-handles-are-guns.shtml