In the age of AI, the education system struggles to find teaching solutions
Andrea Chiang (they/them) // Contributor
With the rise of Artificial Intelligence (AI), the education system is being forced to reevaluate how to respond to the latest technological revolution. At Capilano University, the Centre for Teaching Excellence (CTE) and teaching faculty members hold opposing views. The CTE, made up of educational developers, has no intention of shying away from AI in education and encourages using "AI as a tool to enhance teaching and learning," according to the CTE website.
Teaching faculty, meanwhile, particularly in the Humanities department of the Faculty of Arts and Sciences, are staunchly against AI because of its impact on students' learning. "The whole point of ChatGPT is you not learning. It is doing it for you," states one CapU student who has used it for their English 100 course. Although there is uncertainty about how to approach AI issues, one thing is certain: AI isn't going away, a sentiment both teaching and non-teaching faculty can agree on.
AI is a broad field of "technology that enables computers and machines to simulate human learning [and] comprehension," according to IBM. Within educational institutions, the concern is with generative AI (gen AI), such as ChatGPT, which is "used to create new content, including audio, code, images, text, simulations, and videos," says McKinsey.com. This benefits those who wish to offload mundane tasks, such as creating PowerPoint presentations, writing standardized emails and building templates, and to streamline productivity.
Business and similar fields may view AI through a more optimistic lens than those in the humanities. For English teachers, gen AI's capabilities have introduced a new form of plagiarism. "The problem is not that students are using AI. It's that they're using it as and in place of their own work. I do let them use it if they cite it," says Holly Flauto, a CapU English instructor and Creative Writing Convener in the Faculty of Arts and Sciences. For Flauto and other instructors, generative AI has also impacted their workload. Grading papers with uncited AI ideas and language is "quite stressful and takes a lot of time," explains Flauto. "Something that would take two or three hours to provide grades and feedback, all of a sudden, is taking six hours or eight hours and that's not even counting having to meet with each of these students who use AI with no acknowledgement. If I find 20 students, that's 20 meetings."
At the moment, the CTE has been supporting teaching faculty by providing workshops, individual consultation meetings and resources on its website. "We've already developed resources […] and faculty can use it to learn what AI is, how they can use it in their course," says Barry Magrill, an educational developer and coordinator at the CTE. Lydia Watson, an educational developer at the CTE, adds in an interview that the centre has had guidelines on AI since 2022 and is testing tools that faculty are interested in using.
Although there are steps towards assisting teaching faculty with the new challenges brought by AI, the teaching faculty of the English department wants clearer policies from CapU. "Unlike many universities, we do not have a public- or student-facing statement on our website about the issues AI poses for academic integrity or intellectual property," states Cassidy Picken, a CapU English instructor. "I would love to see [CapU] adopt a clearer stance on AI use overall," says Torin McLachlan, a CapU English instructor in the Faculty of Arts and Sciences. "I think one of the things that would really help is if [CapU] had a strong and consistent policy about it," says Cara DiGirolamo, an English professor at CapU with a background in linguistics.
An extreme solution would be for institutions to ban AI entirely. However, this may infringe on teachers' freedom to design their own curricula and to use their own discernment about whether to integrate AI tools. DiGirolamo suggests an academic integrity policy that explicitly states, "submitting things generated by AI as your own is plagiarism." Still, the question of how to effectively teach students in the age of AI remains. "Is my ideal teaching situation that I'm failing everyone and that I like catching them? No. That's not who I want to be as an educator," says Flauto.
At the heart of the issue is an educational system struggling to keep up with a rapidly evolving technological landscape. "AI was never an educational tool. It's a corporate tool, and so we've had to adapt to it suddenly, and because it's evolving so quickly, we don't have as much time as we used to have with new technologies, to make that adaptation," says Magrill, a non-teaching faculty member of the CTE. Despite this, teachers have found their own solutions to the issues AI poses in their classrooms. For McLachlan, it's modifying her assignments and rubrics. For Magrill, it's focusing on live in-class writing and using AI to teach students how to think critically rather than letting AI think for them: "AI is an unreliable answer giver, but it is a very good tool to help people think." Although their approaches differ, instructors do what they can to encourage students to think critically and communicate their ideas.
On top of clearer policy from the university and continued support for teaching faculty, conversations about how to handle the problems AI poses to learning must continue between CapU, teaching faculty and non-teaching faculty. "I think there are so many conversations still to be had about the way that AI impacts the labour landscape for us as teachers," says McLachlan. Developing policies and guidelines that assist instructors requires an open dialogue across departments and faculty. "Possibly more important than all of the things about teaching using AI or dealing with AI plagiarism, is that actual solid knowledge about what [AI] is and how it works," explains DiGirolamo. "[Gen AI is] being sold as these really useful and helpful tools without ever explaining through how they work, that they are just probabilistic text generators." This sentiment is echoed by CTE faculty member Magrill: "I often recommend to faculty, when talking to their students, to encourage their students to ask [ChatGPT] questions rather than give answers," he says.