Due to the current Covid-19 lockdown, educational and many other institutions are turning to surveillance tools in unprecedented numbers under the banner of “proctoring”. Schools and employers are deploying extensive surveillance to supervise examinations and remote work. Typically, a student or worker is monitored while completing an assignment or exam through a combination of tools on their own device: the webcam, the microphone, and the screen. Access to these tools is usually granted through third-party platforms, although some institutions develop the software in-house.
I have written here before about the rise of surveillance capitalism and pondered the grim future to which that Orwellian path leads. But for our students, that future is not hypothetical: it is taking place as I write these lines, as they try to act dutifully in front of their dodgy webcams.
While there is clear evidence that academic surveillance is harmful, there is little evidence that it is effective at preventing cheating, or that online evaluations require extraordinary anti-cheating measures. Moreover, it’s not just privacy that’s at stake. The privacy concerns are undoubtedly real (from biometric data collection, to the unnecessary permissions these “technologies” demand over students’ devices, to the invasion of students’ personal environments), but proctoring technologies also raise concerns about security, equity and accessibility, cost, increased stress, and bias in the technology itself. For example, few seem willing to acknowledge the painful reality that it is impossible to conduct fair online tests when some students take them in sprawling home offices while others are crammed into a closet, a restroom, or the public wifi of a crowded Starbucks. Many low-income students, rural students, and students in challenging family situations such as homelessness simply cannot meet these demands. Additionally, being forced to show a stranger their living conditions may cause students living in poverty great discomfort and distress, feelings that may directly affect their performance.
So let's use clear terms: surveillance technology is a fancy and somewhat innocent name for spyware, period. What academic institutions and corporations are installing on their people's devices is basically malware. Some of this malware is quite advanced: it uses visual AI to analyze, among other things, head movements and eye gaze, judging whether the person is looking somewhere other than the screen (for example, ProctorU’s AI system flags a potential “violation” if a student looks off-screen for four seconds, or more than two times in a single minute). You must be completely obedient to this spyware, or you will be warned and possibly punished for “illicit behavior”.
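Vendors do not publish their exact rules, but a flagging heuristic like the one attributed to ProctorU above can be sketched in a few lines. This is an illustrative assumption, not ProctorU's actual code; the event model and thresholds are mine:

```python
# Hypothetical sketch of a gaze-based flagging rule. NOT ProctorU's
# real implementation; thresholds mirror the rule described above.
from dataclasses import dataclass

@dataclass
class GazeEvent:
    start: float     # seconds since exam start
    duration: float  # seconds spent looking off-screen

def should_flag(events: list[GazeEvent]) -> bool:
    """Flag if any single off-screen glance lasts >= 4 s, or if more
    than two glances begin within any 60-second window."""
    if any(e.duration >= 4.0 for e in events):
        return True
    starts = sorted(e.start for e in events)
    for i, t in enumerate(starts):
        # count glances starting in [t, t + 60)
        if sum(1 for s in starts[i:] if s < t + 60) > 2:
            return True
    return False
```

Note how blunt such a rule is: a student who glances at scratch paper three times in a minute is indistinguishable, to this logic, from one reading a phone.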
Some services rely more heavily on AI to flag a student's behavior, while others include a human proctor who observes “every second” of the exam. This proctor may, at any point during an exam, demand that a student aim their webcam at a certain area or follow other instructions to allow further surveillance. Of course, students can only submit to this abusive control; refusing these horrific demands means suffering academic penalties.
This is legitimately scary, and if you do not see it, I am afraid you are unaware of what this terrible civic precedent implies: we are indoctrinating our youth to think this is normal. Young people are being trained to accept digital surveillance, and by the time they reach adulthood they will be less likely to rebel against spyware deployed by their “superiors”, or even within their personal lives by abusive partners.
Continuing with this exposé of the spyware deployed against students: it is widely known that facial recognition technology struggles to accurately identify non-white, female, and older faces. As we discussed briefly before, machine learning models such as the ones used by these “surveillance technologies” are prone to misidentifying targets. As a result, dark-skinned and older students are more likely to encounter trouble verifying their identities in online exams. This problem is not hypothetical: students with black or brown skin have reported that the facial recognition technology used for ID verification by academic surveillance programs has failed to recognize their faces.
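The bias described above can be made concrete by measuring a verification system's false non-match rate (how often it wrongly rejects a genuine user) separately per demographic group; a sharp gap between groups is exactly the failure mode students report. A minimal sketch, with fabricated numbers purely for illustration:

```python
# Illustrative only: per-group false non-match rate (FNMR) from
# genuine verification attempts. The data below is fabricated.
from collections import defaultdict

def fnmr_by_group(attempts):
    """attempts: list of (group, matched) pairs, where `matched` is True
    when the system correctly recognized the genuine user.
    Returns {group: fraction of genuine attempts wrongly rejected}."""
    totals, rejects = defaultdict(int), defaultdict(int)
    for group, matched in attempts:
        totals[group] += 1
        if not matched:
            rejects[group] += 1
    return {g: rejects[g] / totals[g] for g in totals}

# Fabricated example: 100 genuine attempts per group.
attempts = (
    [("light-skinned", True)] * 98 + [("light-skinned", False)] * 2
    + [("dark-skinned", True)] * 80 + [("dark-skinned", False)] * 20
)
rates = fnmr_by_group(attempts)
# A tenfold gap in rejection rates like this one is what
# "unacceptably biased" means in practice.
```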
Any machine learning model or AI system (or system in general) that systematically discriminates against people on the basis of their race, sex, or age is unacceptably biased and should be banned outright.
Technology has opened up many opportunities for distance learning, and Covid-19 has forced us to use that technology on a scale never seen before. But schools (and corporations) must accept that they cannot control a student who is at home, no matter the reason or purpose. Proctoring apps are wrong in multiple ways, as we have seen: they invade privacy, exacerbate existing inequities, can never fully replicate the control schools are used to, and, above all, they are pure malware distributed under a legal disguise.