
‘Normalizing Surveillance From a Young Age’: More Schools Using Facial Recognition, AI Technologies to Monitor Kids

Parents and students are concerned about the growing use of artificial intelligence technologies in the classroom, according to a survey by the Center for Democracy and Technology. Privacy experts warn the technologies are about “manipulating and controlling behavior by instilling fear” in children.

Image Credit: SPmemory / iStock / Getty Images Plus

Parents and students are increasingly concerned about the use of artificial intelligence (AI) technologies in the classroom, especially facial recognition technology, according to a survey by the Center for Democracy and Technology (CDT).

The CDT report, released Dec. 12, found that more than half of parents and students surveyed were concerned about the use of facial recognition and other AI technologies, including location tracking systems, in schools.

Teachers, who also were surveyed, showed a higher degree of acceptance of the technologies.

According to the report, a growing number of schools have implemented such tools.

Proponents of the technologies argue they can help protect school environments from violent threats, such as school shooters.

Privacy advocates counter that the technologies, which pose a risk to students’ privacy and personal data, have not been proven to increase school safety.

‘Deep disconnect between schools, parents, and students’ 

According to CDT, “experimental, potentially harmful safety tools are being used regardless of student, parent concerns” — including technologies “we previously thought ‘too outlandish.’”

These include predictive analytics, remote proctoring, facial recognition, law enforcement data sharing, weapon detection systems and student location tracking.

Driven by AI, these technologies “are expanding in schools to respond to mass shootings, the youth mental health crisis, and other ever-present safety threats to staff and students,” CDT said, calling it “alarming” that schools are continuing to roll out such technologies despite “high levels of concern” from parents and students.

These “high levels of concern” were evident in the survey’s results showing:

  • 58% of parents and 55% of students (and 33% of teachers) were concerned about the use of facial recognition cameras to check who should be allowed to enter a school building or who is authorized to be there.
  • 71% of parents and 74% of students (and 36% of teachers) expressed concern about the use of such technologies to track students’ physical location.
  • 60% of parents and 58% of students (and 31% of teachers) were concerned about the use of AI cameras “to notice unusual or irregular physical movements.”
  • 55% of parents and 45% of students (and 27% of teachers) expressed concern about the use of such technologies to detect gunshots on school grounds.
  • 69% of students and parents (and 36% of teachers) were concerned that student data are being analyzed to predict which individual students would be most likely to commit a crime, violent act or an act of self-harm.
  • 66% of parents and 65% of students (and 38% of teachers) expressed concern with the possibility that students’ academic information, such as their grades and attendance records, could be shared with law enforcement.
  • 68% of parents and 71% of students (and 37% of teachers) were concerned about such technologies being used to monitor students’ social media accounts.

These results show “a deep disconnect between schools, parents, and students in their priorities when it comes to edtech [educational data and technology] procurement decisions,” CDT wrote.

Schools using COVID recovery funds to buy surveillance technologies

The survey builds on a CDT report, published in September, on edtech tools that perform content filtering and blocking or student activity monitoring, or that use generative AI.

According to that report, the COVID-19 pandemic helped hasten the uptake of such technologies in school environments — a development viewed critically by CDT.

“The use of student activity monitoring software rapidly expanded during remote learning and has maintained a significant presence in students’ lives. Unfortunately, it continues to harm the students it is intended to help,” the report said.

According to the report, those harms range from disciplinary actions to outing students without their consent and initiating law enforcement contact.

The report also included data indicating that 88% of teachers reported their schools use student activity monitoring software, 40% of teachers reported that their schools monitor students’ personal devices and 38% of teachers reported that their school monitors students outside of school hours — though, notably, there was a 9 percentage point decrease in this metric from the 2021-22 school year.

Kenneth Trump, president of National School Safety and Security Services, told Education Week in October, “Schools have been using the COVID recovery funds to buy security equipment and hardware.”

Technology firms have “amped up” the marketing of these products to school districts in recent years, according to Trump, who said the purchases “have been used to solve political and community relation problems, not so much school safety problems.”

“When there is gun use or confiscation on campus, we see school boards and superintendents make knee-jerk decisions and play to the emotional security needs of parents and staff,” Trump added.

In an example from the United Kingdom, Sky News reported in October 2021 that 27 schools had begun using a facial recognition system to serve lunch to students and 15 more were ready to implement the technology — a measure purportedly aimed at reducing the risk of COVID-19 transmission.

Sky News reported that parents and activists “warned that it normalised exposing children to biometric surveillance, and complained that they weren’t confident students were being adequately informed about the privacy risk.”

And despite reportedly high levels of consent from parents, Sky News reported at the time that children’s privacy advocates Jen Persson and Pippa King told Scotland’s Children’s Commissioner, “High uptake should not be mistaken as consent,” noting that consent forms provided to parents made acceptance appear mandatory.

Such complaints led the Information Commissioner’s Office, the U.K.’s data watchdog, to investigate. Scotland’s North Ayrshire Council paused its rollout of the technology, while the British House of Lords debated the issue in November 2021.

Greg Glaser, a California-based litigator for Children’s Health Defense’s privacy initiatives, told The Defender that “During the Covidian era of masks and lockdowns, parents witnessed a forced normalization of Zoom classrooms, but Zoom was not the only technology being normalized upon young people.”

Glaser added:

“This is unsurprising for government schools. Bureaucracy is not organized to remedy root issues in society, but to treat symptoms. Nothing will really improve in government schools until society decides to learn the deeper lesson of why children fail in a system designed to fail. Why was the system designed to injure children?

“Meanwhile, the billion-dollar educational industry that profits on ‘fighting’ failure will continue to offer their supposed solutions. It’s all so tiresome — you can spot the grift a mile away.”

Concerns over ‘chilling effects’ of facial recognition technologies in schools

According to CDT, technologies used “in the name of student safety” have capabilities that should concern education leaders and policymakers.

These include a lack of efficacy and accuracy, such as technical limitations, “false positives” that could lead to “unsubstantiated disciplinary action” directed at students, and difficulties auditing such systems.

Another set of concerns, CDT said, are “chilling effects.” According to CDT, “Having various invasive safety technology tools as a regular part of a student’s learning environment can actually cause students to feel less safe in the classroom.

“Excessive monitoring and surveillance can chill speech, associations, movement, and access to vital resources,” CDT said.

These concerns were mirrored in a 2020 article published in Learning, Media and Technology by Mark Andrejevic, Ph.D., and Neil Selwyn, Ph.D., of Monash University in Australia, who wrote that such technology can alter “the nature of schools and schooling along divisive, authoritarian and oppressive lines.”

“The key challenge now facing educators is whether or not there is a realistic future prospect of somehow reshaping these technologies for more beneficial and/or benign purposes. Alternatively, is this a form of digital technology that should not be ‘educationally’ applied in any form whatsoever?” the authors wrote.

Tim Hinchliffe, editor of The Sociable, told The Defender, “Facial recognition in schools is about manipulating behavior, and it normalizes total surveillance from an early age.”

“In the classroom, facial recognition teaches children that they have no privacy, and that anything they say or do, can and will be used against them. This makes it easier for governments and corporations to control future generations because they are being brought up with the notion that privacy doesn’t exist, and they better do what they’re told or else!” he added.

Hinchliffe cited a 2020 “Good Morning America” report that showed video from an online class from Parkland Elementary School in Texas. When a second-grade teacher’s Zoom connection dropped, students acted up at first — before realizing the call was still being recorded.

“The kids started acting up at first, but when one student realized they were still being recorded, they all conformed for fear of getting into trouble with the principal. It’s that fear that makes it so powerful and gets children to conform,” he said.

Pin Lean Lau, Ph.D., of London’s Brunel Law School, recounted a conversation with her daughter, who, when asked if she would be concerned about her school’s cafeteria using facial recognition technology, said, “Not really. It would make things a lot faster at checkout though.”

According to Lau, “Her words validate the concern that children are much less aware of their data rights compared to adults.”

“On a macro scale, a population that knows it is being watched will change its behavior to conform to the norms, and its citizens will police themselves,” Hinchliffe said.

The CDT report also addressed the potentially disproportionate impact of these technologies against protected categories of students, the lack of resources many schools have to maintain and update such technologies, unclear governance mechanisms overseeing the use of these technologies, and privacy risks such as data breaches.

A 2021 hack impacting Verkada, a developer of cloud-based security technologies widely used in schools, publicly exposed live feeds from surveillance cameras.

Irene Knapp, director of technology at the nonprofit Internet Safety Labs, told Education Week that schools need to carefully consider whether they wish to take on the responsibility of handling and protecting students’ biometric data.

Knapp said it is difficult to know whether the data being collected by such technologies are being shared with third parties.

According to Education Week, “There’s also the real risk of mission creep,” as it’s “tempting” for schools to use surveillance technology like facial recognition in ways it wasn’t originally intended for, “such as tracking and fining parents who are late picking their children up from school.”

Molly Kleinman, Ph.D., managing director of the Science, Technology, and Public Policy program at the University of Michigan, told Route Fifty in September that without regulations in place, schools may use such technologies for “routine tasks” or may require facial recognition for students to log in to school-owned computers and tablets.

According to Hinchliffe, “Even if facial recognition starts off at the entry to schools to verify who’s coming in for so-called ‘safety reasons,’ it’s only a matter of time before it enters the classroom, and when it does, it will rob students of another part of their childhood, and kids will no longer be allowed to be kids.”

“From a privacy law standpoint, the government surveillance schools are exposing themselves to liability on many potential levels, as their opt-in procedures, if any, will routinely fail to cover the reality of what they and their corporate partners are doing,” Glaser said.

“It only takes one data breach to trigger notice requirements to parents,” Glaser said. “And where security procedures are not dutifully followed, that means lawsuits. At least, that is how the system designed to fail will fail.”

Such concerns are not new. As far back as 2019, Wired magazine addressed the “delicate ethics” of implementing AI technologies in the classroom.

New York implemented ban on facial recognition technologies in classrooms

New York state enacted a ban on such technologies in September, determining that the “claimed benefits” of facial recognition technology do not outweigh the “serious concerns surrounding” its use and prohibiting schools “from purchasing or utilizing facial recognition technology.”

The determination followed an analysis by the Office of Information Technology Services and drew on research from data collected by the nonprofit The Violence Prevention Project, which found that 70% of school shooters between 1980 and 2019 were current students.

A 2020 University of Michigan study on facial recognition technology and its impacts in the classroom may also have influenced New York’s decision to impose a ban.

According to the study, facial recognition technology “will likely have five types of implications: exacerbating racism, normalizing surveillance and eroding privacy, narrowing the definition of the ‘acceptable’ student, commodifying data, and institutionalizing inaccuracy.”

“Because FR [facial recognition] is automated, it will extend these effects to more students than any manual system could,” the study adds. “On the basis of this analysis, we strongly recommend that use of FR be banned in schools.”

The study also issued a wide range of national and local recommendations for schools that would continue using such technologies. Similar recommendations were made by CDT accompanying its Dec. 12 report.

And in 2019, Vox noted that schools were increasingly “using facial recognition to try to stop shootings,” but argued “they should think twice” about this practice.

School districts can still opt to use other types of biometric technologies, such as digital fingerprinting, according to New York’s new policy.

This comes despite “immense pressure” school administrators face to protect their schools from gun violence and the threat of a shooting, according to Education Week, which noted that facial and weapons recognition technologies powered by AI “can be an alluring solution for school boards and superintendents looking to reassure parents.”

New York implemented a moratorium on facial recognition after parents legally challenged the adoption of this technology by the Lockport Central School District in January 2020, according to Time magazine.

“The western New York district was among the first in the country to incorporate the technology in the aftermath of deadly mass school shootings that have led administrators nationwide to adopt security measures ranging from bulletproof glass to armed guards,” Time reported.

In 2020, two high schools in France that experimented with facial recognition technology were sued, leading to an administrative court decision banning the practice.

According to the decision, the implementation of the technology “was neither proportionate nor necessary,” consent from students had not been freely obtained and less privacy-intrusive measures could instead have been implemented.

Elsewhere though, the rollout of such technologies is continuing. In August, Philadelphia announced it would introduce district-owned drones “to patrol violence-prone areas without the need for police on the ground.”

Biometric Update reported in October 2022 that Montana schools are using facial recognition technology by Verkada “in an attempt to improve safety.” The Sun River Valley School District, for instance, feeds its facial recognition system with “a watchlist from the local Sheriff’s Department, as well as yearbook photos of students.”

According to Hinchliffe, China is expanding facial recognition capabilities in its schools “with AI and wearables, to include emotion recognition and other behavioral aspects, so they can tell when a child is upset or when they aren’t paying attention.”

“Again, facial recognition in schools is all about manipulating and controlling behavior by instilling fear in the child,” Hinchliffe said.

Attorney Richard Jaffe told The Defender there is room for facial recognition in schools.

“Privacy, like all rights, is not absolute and has to give way to … the right of students to be safe.”

“I speculate that most parents and almost all teachers and staff will accept some relatively minor infringements on privacy to increase the safety and security of schools. Short of locking every school down guarded by a platoon of SWAT police armed with M16s, the solution has to involve technology, and will increasingly involve AI and facial recognition,” he said.

“Most will accept that tradeoff, and I doubt very much that the courts will second-guess school districts which employ the currently available measures,” Jaffe added.

Forbes, in a February 2023 story, also saw a place for facial recognition technologies in schools, calling it “clever tech” but noting its “bad, bad, bad implementation” and arguing that there are “Hurdles to overcome before facial recognition can be used.”

These hurdles include “research into the wellbeing impacts and ethics of biometrics use in schools,” ensuring such systems “operationally default to the highest levels of rights protection,” and ensuring their use is “fully lawful” with “no unintended consequences.”


