YouTube screenshot / WGRZ-TV

Facial recognition and racist algorithm in schools: costly fiasco in New York state


Do modern surveillance technologies have a place in schools, even in a country marked by killings in educational facilities? The controversy rages in Lockport, where the costly solution chosen by the authorities is as ineffective as it is discriminatory.

Located about ten kilometers from the Canadian border, the city of Lockport, New York, has been in the crosshairs of privacy activists for several years. The issue: the installation of a network of cameras using facial recognition technology to “secure” its schools. The system is notably supposed to:

  • detect any person deemed undesirable who has previously been entered into a database (known sexual predator, expelled former student, etc.);
  • recognize the presence of a firearm, unless it is hidden in a bag or under clothing (which greatly limits the system's effectiveness at detecting weapons…);
  • notify the authorities in the event of an alert.

As an author – but above all the parent of an affected student – wrote in June 2019 in the New York Times, this security solution was chosen in 2015 under rather opaque conditions; on Twitter, the columnist even described the project's instigators as "crooks".

This initiative, which was obviously not presented as a lucrative opportunity for certain middlemen, was sold as a way to avoid a potential massacre like the one at Sandy Hook elementary school (28 dead, including 20 children, in December 2012). Like many other worried texts published on the subject, this op-ed warned that:

“Facial recognition technology is notoriously unreliable, especially in recognizing women and people of color”.

These repeated warnings were not followed up, and the schools in the area concerned were equipped in early 2020, as this report from a local channel noted:

Submachine gun or broom?

The whistleblowers ignored by the public authorities were, however, right. Less than a year later, Motherboard – the section of the Vice site specializing in digital culture and tech – revealed this Tuesday, documents in hand, numerous shortcomings in the technology adopted by the Lockport City School District.

The Canadian company SN Technologies, responsible for the controversial installations, is accused of having lied about the reliability of its solution, which reportedly confuses broom handles with firearms; such a detection triggers an alert process with the authorities, who may thus mistakenly believe that a shooting is in progress or imminent.

A black woman is apparently 16 times more likely to be misidentified

Even more serious than this confusion between objects, the software's success rate at recognizing people of color turns out to be lower than the figures initially announced by SN Technologies. The company claims to use in its products a solution developed in France by the Grenoble-based SME id3 Technologies, which describes itself on its site as "one of the leading experts in biometrics [which] has already issued 30 million licenses for its smart-card authentication algorithm". According to SN Technologies, the algorithm used at Lockport ranked 49th out of 139 in the NIST (National Institute of Standards and Technology) tests for "racist bias" – a result that is not extraordinary, but above the field's average.

Problem: an expert from NIST categorically denied that the solutions offered by the French SME could match the figures put forward to win over the Lockport schools. SN Technologies had stated that its algorithm was twice as likely to misidentify a black man as a white man. According to NIST, that probability was understated: a black man would actually be 4 times more likely to be misidentified than a white man. The error rate climbs even higher for black women compared with white men: they would be 16 times more likely to be misidentified.
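To see what these multipliers mean in practice, the short sketch below applies them to a purely hypothetical baseline false-match rate; the absolute baseline (1 in 1,000) is an assumption for illustration only, and only the 2×, 4× and 16× ratios come from the figures cited above.

```python
# Illustrative arithmetic only: the baseline rate is a made-up assumption;
# the multipliers (vendor's claimed 2x, NIST's 4x and 16x) come from the article.
baseline = 0.001  # hypothetical false-match rate for white men (1 in 1,000)

rates = {
    "white man (baseline)": 1 * baseline,
    "black man (vendor claim, 2x)": 2 * baseline,
    "black man (per NIST, 4x)": 4 * baseline,
    "black woman (per NIST, 16x)": 16 * baseline,
}

for group, rate in rates.items():
    # expected misidentifications per 10,000 camera matches
    print(f"{group}: {rate * 10_000:.0f} per 10,000")
```

On a hypothetical 1-in-1,000 baseline, the vendor's claim implies 20 false matches per 10,000 for black men, while NIST's figures imply 40 — and 160 per 10,000 for black women.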

Education funds allocated for surveillance

In addition to the inequalities between students according to their skin color, and their potential consequences in the event of unjustified police interventions, the financing of this security system has caused a scandal in the U.S.: the $2.7 million spent on this intrusive and unreliable technology was taken from the Smart Schools Bond Act funds. This law, passed in 2014 in New York State, is supposed to allow schools to equip themselves with technologies that improve student learning. In practice, that generally means replacing ageing computers or modernizing the school's Internet network, not installing surveillance systems.

A New York Civil Liberties Union petition addressed to state governor Andrew Cuomo calls on him to ban these practices, which “do little to improve school safety and make students feel like suspects in their own classrooms”.

Contacted by Motherboard, Jim Shultz – the parent who wrote the New York Times column cited above – points out the sad irony of this affair: in 2020, the expensive system has been totally useless. Covid has not helped, since it requires everyone entering the schools to wear a mask, making them unrecognizable to the cameras.
