With COVID-19 forcing millions of teachers and students to rethink in-person schooling, this moment is ripe for an innovation in learning. Unfortunately, many schools have simply substituted surveillance technology for real transformation. The use of proctoring apps, privacy-invasive software products that “watch” students as they take tests or complete schoolwork, has skyrocketed. These apps make a seductive promise: that schools can still rely on high-stakes tests, where they have complete control of a student’s environment, even during remote learning. But that promise comes with a huge catch: these apps violate student privacy, disproportionately burden some student populations, and will likely never fully stop creative students from outsmarting the system.
Through a series of privacy-invasive monitoring techniques, proctoring apps purport to determine whether a student is cheating. Recorded keystroke patterns and facial recognition supposedly confirm whether the student who signed up for a test is the one taking it; gaze-monitoring or eye-tracking is meant to ensure that students don’t look off-screen too long, where they might have answers written down; microphones and cameras record students’ surroundings, broadcasting them to a proctor, who must ensure that no one else is in the room. Even if these features were successful at rooting out all cheating, which is extremely unlikely, what these tools amount to is compelled mass biometric surveillance of potentially millions of students, whose success will be determined not by correct answers, but by algorithms that decide whether their “suspicion” score is too high.
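To make the stakes concrete, consider what such a scoring system might look like. No proctoring vendor publishes its actual algorithm, so everything below is invented for illustration: the signal names, the weights, and the cutoff are all assumptions. The point is structural: a handful of noisy behavioral signals get collapsed into one number, and an arbitrary threshold decides whether a student is "flagged."

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    """Hypothetical signals a proctoring session might record."""
    gaze_offscreen_seconds: float  # total time eyes were tracked away from screen
    face_mismatch_events: int      # frames where the face didn't match enrollment
    extra_voices_detected: int     # times the microphone picked up other speakers
    keystroke_anomaly: float       # 0.0-1.0 deviation from the enrolled typing pattern

def suspicion_score(s: SessionSignals) -> float:
    """Collapse the signals into one score. The weights are invented."""
    return (0.02 * s.gaze_offscreen_seconds
            + 1.5 * s.face_mismatch_events
            + 2.0 * s.extra_voices_detected
            + 3.0 * s.keystroke_anomaly)

FLAG_THRESHOLD = 5.0  # arbitrary cutoff: above this, the session is "suspicious"

# A student who glances away often and types unevenly (say, from nerves)
# can cross the threshold without any actual cheating.
session = SessionSignals(gaze_offscreen_seconds=180, face_mismatch_events=1,
                         extra_voices_detected=0, keystroke_anomaly=0.4)
flagged = suspicion_score(session) > FLAG_THRESHOLD  # True for these inputs
```

Note that nothing in this sketch distinguishes a cheater from a nervous test-taker or a student in a noisy household; the inputs are proxies, and the threshold is a policy choice hidden inside software.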
Much of this technology is effectively indistinguishable from spyware: malware commonly used to track unsuspecting users’ actions on their devices and across the Internet. It also has much in common with “bossware,” the invasive time-tracking and worker “productivity” software that has grown in popularity during the pandemic. EFF has campaigned against the pervasive use of both of these tools, demanding that anti-virus companies recognize spyware more explicitly, and pushing employers to minimize their use of bossware.
In addition to the invasive gathering of biometric data, proctoring services gather and retain personally identifiable information (PII) on students—sometimes through their schools, or by requiring students to input this data in order to register for an account. This can include full name, date of birth, address, phone number, scans of government-issued identity documents, educational institution affiliation, and student ID numbers. Proctoring companies also automatically gather data on student devices, whether or not those devices are school-issued. These collected logs can include records of operating systems, make and model of the device, device identification numbers, IP addresses, browser type and language settings, software installed on the device and its versions, ISP, records of URLs visited, and how long students remain on a particular site or webpage.
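Much of this device-level data requires no special privileges to collect; ordinary software can read it through the standard library alone. A minimal sketch follows; the `device_fingerprint` function and its field names are invented for illustration and are not drawn from any proctoring product, but each value shown is genuinely available to any program a student runs.

```python
import json
import os
import platform
import socket

def device_fingerprint() -> dict:
    """Illustrative: the kind of device metadata any user-level program can read."""
    return {
        "os": platform.system(),            # e.g. "Windows", "Linux", "Darwin"
        "os_version": platform.version(),
        "machine": platform.machine(),      # hardware hints, e.g. "x86_64"
        "hostname": socket.gethostname(),
        "runtime_version": platform.python_version(),  # stands in for "software versions"
        "language": os.environ.get("LANG", ""),        # language settings (Unix convention)
    }

print(json.dumps(device_fingerprint(), indent=2))
```

Browser type, URLs visited, and time-on-page come from the browser extension or lockdown-browser side of these products rather than the operating system, but the principle is the same: the data is abundant and cheap to gather, which is exactly why retention and sharing policies matter.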
The companies retain much of what they gather, too—whether that’s documentation or video of bedroom scans. Some companies, like ProctorU, have no time limits on retention. They share some of this information with third parties. And when student data is provided to the proctoring company by an educational institution, students are often left without a clear way to request that their data be deleted because they aren’t considered the data’s “owner.”
The leveraging of student data for commercial purposes isn’t the only risk to student privacy—as we’ve noted time and time again, gathering vast amounts of data on people is unwise given frequent breaches and subsequent data dumps. ProctorU found that out recently, when over 440,000 user records for their proctoring service were leaked on a hacker forum last month, including “email addresses, full names, addresses, phone numbers, hashed passwords, the affiliated organization, and other information.”
Aside from privacy concerns, these tools could easily penalize students who don’t have control over their surroundings, or those with less capable hardware or low-speed Internet. Students who don’t have home Internet access at all are locked out of testing altogether. These tools could also cause havoc for students who already have trouble focusing during tests, whether because of difficulty maintaining “eye contact” with their device or simply because tests make them nervous. Software that assumes all students take tests the same way—in rooms they can control, their eyes straight ahead, fingers typing at a routine pace—is undoubtedly leaving some students out.
No student should be forced to make the choice to either hand over their biometric data and be surveilled continuously or to fail their class. A solution that requires students to surrender the security of their personal biometric information and give over video of their private spaces is no solution at all.
Technology has opened up unprecedented opportunities for learning at a distance, and COVID-19 has forced us to use that technology on a scale never seen before. Yet schools must accept that they cannot have complete control of a student’s environment when they are at home, nor should they want to. Proctoring apps fall short on multiple fronts: they invade students’ privacy, exacerbate existing inequities in educational outcomes, and can never fully match the control schools are used to enforcing in the test hall.
Educational institutions will need to adapt fundamentally to distance learning. New technologies and new teaching methods will be a part of that. Perhaps schools will need to reevaluate the need for closed-book exams, or use fewer tests overall in favor of project-based assessments. Regardless, they should not rely on invasive proctoring apps to attempt to replace methods that only work in person. Surveillance tech has already crept into many areas of education, with some schools tracking students’ social media activity, others requiring students to use technology that collects and shares private data with third-party companies, and others implementing flawed facial recognition technology in the name of safety. While there are ways to fight back against some common school surveillance, it becomes increasingly difficult when that surveillance is directly tied to students’ evaluations and ultimate success. Teachers, parents, and students must not allow remote learning to become remote surveillance.