Proctoring software weaponizes the eugenic gaze, but it’s nothing new in our pedagogy of punishment
Sarah Rose // Features Editor
Valeriya Kim // Staff Illustrator
The first time the school bell rings, we discover a world bathed in wonder. Signs and numbers reveal themselves to young minds racing, like traffic on the Autobahn, toward new connections.
The last time the bell rings, those memories are often replaced with resentment. The hatred, trauma and apathy carved into teenagers’ psyches by their classroom experiences are considered natural: just an inevitable developmental progression. The classroom is a place to shed the skin of innocence and enthusiasm and join “the real world.”
“No system works unless it operates with incentives,” declared Albert Shanker, president of the American Federation of Teachers. This sort of doctrinal consistency is a rare thing to behold: behaviourism, the philosophy developed in the early 20th century by minds like B.F. Skinner and Ivan Pavlov, still informs virtually every aspect of North American education.
It begins with a gold star sticker and ends with an explosion of surveillance software in our pandemic age, one that forces students to choose between invasions of privacy and civil rights violations, or their grades. Online learning existed for decades before COVID-19 without relying on proctoring. The massive push for the software now is a clear emphasis on surveillance and punishment over real education, and discrimination over evolution.
Over the last decade, higher education has increasingly intertwined itself with the online sphere. When the other shoe dropped after COVID-19, classes moved en masse to a fully online model, followed by an unquestioned implementation of proctoring software. Algorithmic test proctoring promises an even playing field that weeds out dishonesty and academic misconduct, but in reality it targets nothing besides the encoded ideal human body. The software doesn’t actually measure honesty, a thing as nebulously shaped by our culture as the boundaries of cheating; it measures the body itself, and punishes any deviation from the established ideal. It decides which bodies are normal and safe, and which bodies are suspicious and need punishment.
Here’s how it works: the software records your camera, audio and screen. It meticulously tracks your head and eye movement and flags any deviation from its baseline as suspicious behaviour. If it all sounds a bit Orwellian, or a few degrees shy of being a Voight-Kampff test, that’s because it is.
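Stripped of its marketing, that flagging logic amounts to little more than threshold checks against an encoded bodily “norm.” Here is a minimal sketch of what such logic could look like; every name, threshold and label below is a hypothetical illustration, not any vendor’s actual code:

```python
# Hypothetical sketch of threshold-based proctoring flags.
# All values and categories are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Frame:
    gaze_offset: float    # degrees of gaze away from screen centre
    head_movement: float  # normalized movement score, 0.0-1.0
    face_detected: bool   # whether the camera found a face at all

def flag_suspicious(frames, gaze_limit=15.0, movement_limit=0.4):
    """Flag any frame that deviates from the encoded 'ideal' test-taker:
    eyes fixed on the screen, body still, face always visible."""
    flags = []
    for i, f in enumerate(frames):
        if not f.face_detected:
            flags.append((i, "face not visible"))
        elif f.gaze_offset > gaze_limit:
            flags.append((i, "looked away"))
        elif f.head_movement > movement_limit:
            flags.append((i, "excessive movement"))
    return flags

session = [
    Frame(2.0, 0.1, True),   # still, eyes on screen: passes
    Frame(30.0, 0.2, True),  # glanced aside while thinking: flagged
    Frame(5.0, 0.9, True),   # fidgeting, a common ADHD behaviour: flagged
]
print(flag_suspicious(session))
```

Notice that nothing in the sketch measures honesty; it only measures how closely a body conforms to the parameters someone chose in advance.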
Proctoring software is, at its core, an extension of the dominant pedagogy of punishment, the backbone of behaviourism in classrooms. Because behaviourism draws no distinction between the causes of human behaviour and the causes of physical events, it leaves no room for students’ internal experiences. Institutions therefore respond to distrust with surveillance, and when anything suspicious is seen, a system of punishment and investigation follows.
The pedagogy of punishment completely discards the fact that the boundaries and definitions of things like cheating, plagiarism and citation are largely culturally constructed. Introducing new students to academic conduct policies is predicated entirely on the threats and fear evangelized year after year in first-year classrooms. When I was a new student with undiagnosed ADHD, I meticulously pruned through my first-year essays like my life depended on it, and although I didn’t understand it at the time, it did. Any extension of the eugenic gaze hinges on unequal power dynamics that aim to control behaviour within acceptable parameters. Making citation mistakes because of a neurodivergence isn’t an acceptable behaviour, regardless of the internal reasons behind it.
What proctoring software flags as cheating includes a wide range of neurodiverse and disabled behaviours: verbally scripting questions, fidgeting, not making eye contact and more. I was led to believe that I teetered on the edge of privilege and opportunity with one careless mistake. Through fear it’s easy to leverage consent for invasive and discriminatory policy.
Naomi Klein illustrates this principle in detail in her book The Shock Doctrine. With proctoring software sales up 900 percent since the start of the pandemic, these companies descended like vultures, wielding the disaster capitalism complex on institutions desperate to uphold a standard of distrust and privilege. It taps into long-held beliefs that students who aren’t white, middle-class, cis or abled aren’t trustworthy; so they’ll cheat, and cheating hurts the “good” students, the “real” students: the white, abled students.
Even if the risks of this software to students were acknowledged by higher education institutions, which they currently aren’t, these companies are offering a product that resonates with several implicit core values and practices of education that have always outweighed student safety: exclusion, behaviourism, technological solutionism and the eugenic gaze.
Cheating is not a technological problem, it’s a pedagogical one, and the blind faith that technology will solve pedagogical problems is endemic to the algorithms of oppression that scholars like Safiya Umoja Noble dissect, most notably in Noble’s book Algorithms of Oppression. If institutions genuinely believe that the solution to the possibility of students cheating on exams is to extort them into paying for faulty, harmful software that invades their privacy instead of adjusting the content and administration of exams, then the problem lies squarely with how we view education as a whole. Class dismissed.