
CBU working group develops guidelines for instructors on AI software

The Center for Teaching & Learning at Cape Breton University is organizing a study group to better understand the strengths and weaknesses of artificial intelligence software.

The move comes as professors, instructors, and other learning professionals in the post-secondary sector explore ways to protect academic integrity while taking advantage of opportunities offered by tools such as ChatGPT, the hugely popular chatbot developed by OpenAI that has made headlines worldwide for months.

Jonathan Doering, an assistant professor of English at the university, says the centre has invited stakeholders from the academic community to discuss concerns and develop strategies for course design. Learning more about the technology is important, says Doering, especially when it comes to online university courses.

“It’s easy for me to understand, because I teach in-person courses and I really know these students, right? It’s not a problem,” he says. “I’d be more concerned if I were teaching an online course.”

Doering adds that the software is intuitive enough to help students understand the basics of subjects. But in the end, the language and flow of an essay are often a dead giveaway that a student has used AI to cheat.

A professor at the university says AI software can teach students skills, such as how to strengthen a thesis. (Shutterstock/Roman Samborskyi)

“Because I know my students’ voices, I know how they speak, all of their idiosyncrasies,” says Doering. “Once they switch to ChatGPT, it’s like they randomly adopted a British accent.”

The idea of seeking help writing a paper isn’t new, says Doering. Some students already look for the easiest way to submit a paper, including paying other people to write it, he says. But part of the difficulty in detecting the use of AI is that a program can write a paper in seconds, which a student can then edit to improve the flow and content.

‘They have to use their own brains’

Heather Sparling, a professor of ethnomusicology at the university, says that despite the technology’s drawbacks, there is a way to leverage it for further education.

She uses the example of asking an AI chatbot to write a thesis statement for a paper, then asking students to improve the response generated by the program.

“But I think the point is to help them see the limitations, and they need to use their own critical thinking to realize that the first thing they get from ChatGPT isn’t perfect,” says Sparling. “They have to use their own brains to figure out what the weaknesses are here and how we’re going to strengthen them.”

The study group plans to collect information from the sessions and create guidelines for faculty on how to use ChatGPT and AI more generally.

