A newly published study by researchers at the University of Michigan finds that facial recognition technology poses several problems in schools and has limited effectiveness. Led by Shobita Parthasarathy, director of the university’s Science, Technology and Public Policy (STPP) program, the study argues that the technology is unsafe: it can actively promote racial discrimination, normalize surveillance, erode privacy, institutionalize inaccuracy, and marginalize students who don’t conform.
The study follows the New York legislature’s passage of a moratorium on the use of facial recognition and other forms of biometric identification in schools through 2022. That bill came in response to the Lockport City School District’s adoption of the technology.
The University of Michigan study – part of STPP’s Technology Assessment Project – uses an analogical case comparison method, examining previous deployments of security technologies such as CCTV cameras and metal detectors, as well as biometric technologies, to anticipate the effects of facial recognition. While its conclusions are not new, it takes a strong stance against commercial products, which it argues do far more harm to students and educators than good.
For example, the co-authors claim that facial recognition would disproportionately target and discriminate against people of color, especially Black and Latinx communities. At the same time, they say facial recognition would create new rules for dress and appearance, penalizing students who don’t fit narrow standards of acceptability, which would create problems if a school relied on it to automate activities like taking attendance or purchasing lunch.
Indeed, countless studies have shown that facial recognition is prone to bias. In a paper last fall, University of Colorado Boulder researchers showed that AI from Amazon, Clarifai, Microsoft, and others maintained accuracy rates above 95% for cisgender men and women, but misidentified trans men as women 38% of the time. Separate benchmarks of major vendors’ systems by the Gender Shades project and the National Institute of Standards and Technology (NIST) suggest that facial recognition technology exhibits racial and gender bias and that facial recognition programs can be wildly inaccurate, misclassifying people upwards of 96% of the time.
The University of Michigan study’s co-authors also claim that facial recognition in schools will create new types of student data that private companies will buy and sell. Data collected for one purpose will be used in other ways, making it impossible for students to give full and informed consent to data collection or to control their data. A remedy was proposed last week by Senators Jeff Merkley (D-OR) and Bernie Sanders (I-VT) in the National Biometric Information Privacy Act, which would make it illegal for companies to collect, buy, or trade customers’ biometric information without permission. For now, however, most US states have few such protections in place.
For these reasons, the researchers recommend a nationwide ban on facial recognition in schools. They do, however, offer policy recommendations for schools that consider the technology “absolutely necessary.” Among other things, they propose a five-year moratorium on the use of facial recognition technology in schools; convening a national advisory committee to study facial recognition and its effects; establishing technology offices to help schools address the technical, social, ethical, and racial challenges of facial recognition; and deleting facial recognition data at the end of each academic year or when students graduate or leave the district.
A number of efforts to deploy facial recognition systems in schools have met resistance from parents, students, alumni, community members, and lawmakers. At the college level, a media firestorm erupted after it was revealed that a University of Colorado professor had secretly photographed thousands of students, staff, and visitors on public sidewalks for a military counterterrorism project. Researchers at the University of California, San Diego admitted to studying recordings of students’ facial expressions to predict engagement. And last year, the University of California, Los Angeles proposed using facial recognition software for security surveillance as part of a larger campus security policy.