AI isn’t just coming soon to higher education: it’s already here.
ChatGPT is a trickle before the tsunami. Canada’s higher education landscape is about to be flooded with better, faster, smarter, more interwoven iterations of AI.
The industry is voicing its initial response and it’s… surprisingly level-headed.
Rather than decrying easily accessible AI tools as a threat to the education status quo, the sector is coalescing around a different theme: integration.
Tentatively, academics are starting to ask a crucial question: How can we best harness these tools to enrich the higher education experience?
The answer will likely emerge through iteration over the next few years. But some professionals are already probing how ChatGPT can fit into the classroom.
A practical exploration
In Professor Steven Mintz’s article “ChatGPT: Threat or Menace,” we learn how he has already integrated ChatGPT into his own classes.
“In my own small (40-person) classes, the students’ five shorter essays (at least 500 words in length) must include four parts:
A detailed prompt input into ChatGPT
The text that ChatGPT “wrote” in response to the prompt
An essay that builds on the ChatGPT foundation, supplemented with additional research that must be cited in a bibliography.
A list of the corrections, revisions and additions that the student made in producing the reworked essay.
We will devote time in class to discussing the text that ChatGPT produced, including its strengths and weaknesses.”
Mintz also recounts how not all of his colleagues are as willing to welcome AI into academia.
“I recently attended a collegewide discussion of ChatGPT’s teaching implications, and I was struck by the exasperated faculty participants’ intense and insistent negativism. One presenter after another declared that the application will encourage intellectual dishonesty.
Others complained that the platform has expropriated and exploited scholars’ intellectual property and that coping with ChatGPT will place uncompensated burdens on faculty.”
Put plainly: those concerns are legitimate. But Mintz maintains there’s no stopping the generative text train, and the sooner we explore how AI’s capabilities can complement higher education, the sooner we can produce meaningful outcomes for students.
A learned perspective
Dr. Wesley Wildman, a professor of Ethical Computing at Boston University, recently answered questions on Reddit about some of the hot topics concerning AI and higher education.
On the ethics of using AI for assessments:
“The ethics of cheating may seem relatively clear-cut, but GPTs complicate the very idea of cheating because they can be used in so many ways.
For example, we would normally encourage students to converse with friends to generate and refine ideas for a writing assignment, thinking that this helps them verbalize and learn in a different mode.
So can it be cheating to have the same kind of conversation with a chatbot? We would normally encourage comprehensive research to uncover hidden angles on a writing assignment. Can it be cheating if a student uses ChatGPT to sift through mountains of material and produce condensed summaries, learning about perspectives they may have missed?
Using text generated by GPTs without acknowledgement or explanation constitutes plagiarism, surely, but there are a ton of other uses of GPTs that don't go that far. The colleges subsuming the use of GPTs under existing plagiarism rules will quickly discover that this leaves open too many cases.”
On exams:
“ChatGPT has already passed standardized exams in medicine, law, and computer programming, and the descendants of ChatGPT, beginning with those using GPT-4, are going to do a lot better still. Standardized exams will only be possible under specific types of proctoring arrangements. Even those arrangements will probably fail eventually as wearable devices become undetectable to exam proctors. For now, I think those exams will have to continue but the old-fashioned way — NOT online.”
A worthy endeavor
Despite all of the uncertainty that AI brings to the learning space, the possibilities for enrichment are staggering. So even if we could ignore its power, we really shouldn’t.
For students, AI could deliver advantages like personalized learning, where AI-powered tools track each student’s progress and provide real-time feedback to both the student and the student support team. It could also unlock more effective tutoring and more robust pathway advising by learning each student’s strengths and weaknesses. Already, it enables a more direct route to information and easier learning on demand.
But students aren’t the only ones who stand to benefit from practical AI integration: institutions could harness AI-driven grading, automate more administrative tasks, and likely even support strategic development.
One thing we do know: you don’t need to wait for AI to start making data-driven decisions.
CRI takes the time to understand your goals before conducting targeted research and distilling it into simple, actionable recommendations.