Digital rights groups are calling on lawmakers to ban facial recognition in schools after a new study concluded the technology would have serious negative implications for students.
The study, titled “Cameras in the Classroom,” was conducted by researchers from the Science, Technology, and Public Policy Program (STPP) at the University of Michigan’s Ford School of Public Policy. It determined that facial recognition (FR) “is likely to mimic the impacts of school resource officers (SROs), stop-and-frisk policies, and airport security,” all of which “purport to be objective and neutral systems, but in practice they reflect the structural and systemic biases of the societies around them.”
“The research shows that prematurely deploying the technology without understanding its implications would be unethical and dangerous,” said Shobita Parthasarathy, the study’s lead author, STPP director, and professor of public policy.
The study’s analysis identified five types of negative implications of FR:
- exacerbating racism
- normalizing surveillance and eroding privacy
- narrowing the definition of the “acceptable” student
- commodifying data
- institutionalizing inaccuracy
“Because FR is automated, it will extend these effects to more students than any manual system could,” the study said.
While the use of FR technology isn’t “yet widespread” in schools, the study said there is reason to stop its spread, noting that “schools have also begun to use [FR] to track students and visitors for a range of uses, from automating attendance to school security.”
“All of these practices have had racist outcomes due to the users of the systems disproportionately targeting people of color,” the study said.
Digital rights groups that oppose facial recognition said companies selling the technology are exploiting the coronavirus pandemic, which makes it all the more urgent for lawmakers to ban FR in schools.
“We’re already seeing surveillance vendors attempt to exploit the COVID-19 pandemic to push for the use of this ineffective, invasive, and blatantly racist technology,” Evan Greer, deputy director of Fight for the Future, said. “It’s time to draw a line in the sand right now.”
The researchers said the technology would “normalize the experience of being constantly surveilled starting at a young age” and carries the risk of “mission creep” “as administrators expand the usage of the technology outside of what was originally defined.”
“Using facial recognition in schools amounts to unethical experimentation on children,” Greer said.