
Educators have plenty to worry about when it comes to keeping students from cheating, but gone are the days when the biggest classroom concerns were cheat sheets or students glancing at a peer’s test.
And it’s all thanks to a new form of technology that has revolutionized the classroom, for better or worse: artificial intelligence.
In the past, if a teacher assigned an essay or a take-home test, students would be warned against plagiarism and told not to look up answers. Cheating was not necessarily easy to do, and it was not difficult for a teacher to spot.
Nowadays, a student can simply prompt a chatbot to do the work for them, which can make it nearly impossible for teachers to tell whether an essay or test is a student’s own work or the work of AI.
“I don’t think any school, higher ed probably down to first grade, has figured out how to grapple with it from an academic honesty standpoint,” Steven Eskilson, an art and design professor at Eastern, said. “I don’t think any instructors have figured out what to do. Honestly, they’re either burying their head in the sand or they just do things in class in-person.”
In the early stages of generative AI, Eskilson tasked his students with prompting it to generate an image and then analyzing how good a job it did. He says students often report mistakes made by the AI but anticipates those mistakes will disappear as the software becomes more sophisticated.
Eskilson believes that AI will eventually evolve into an advanced textbook rather than an instructional tool.
“I don’t think in a few years you’ll be able to find mistakes [in AI],” Eskilson said. “I don’t know if I would do more than try to avoid it in the future.”
AI can be a tool for educators just as much as for students, according to Catherine Polydore, chair of the department of counseling and higher education, who has a personal interest in AI and in promoting AI literacy.
Polydore said educators can use AI to generate lesson plans and create tests, which frees up time for them to become more knowledgeable in the subject matter and build more trust with students.
“We do know that having more implicit content knowledge, where you can easily navigate questions posed to you in a classroom, increases [a teacher’s] credibility in the student’s mind, which then increases their openness and willingness to listen to you,” Polydore said.
Polydore believes schools should look into investing in AI and educating teachers on how to use it properly as a classroom tool and how to detect when a student is misusing it.
She also says the two main issues involving the use of AI are cheating and equity: if a school can find a way to promote proper use, then the technology should be made available to all students equally.
“Assignments can be created that specifically directs students to things where everybody is basically doing the same thing,” she said.
Education on AI should be age-dependent, Polydore says, so that students still develop critical thinking skills rather than letting the technology do the thinking for them.
She suggests teachers outline where AI use is acceptable in assignments and be on higher alert when giving assignments they feel students may be tempted to complete with AI. She also said students should be made aware that their teacher is familiar with AI, which will help deter some misuse.
There are methods to curb misuse of AI, but Polydore says there will ultimately always be people who abuse new technology.
“As human beings, we like the path of least resistance,” Polydore said. “Why would I want to go in and put effort into something when I can get the same result with less effort?”
Eastern provides guidelines for its students and faculty, urging students to use generative AI responsibly and think critically about its use, and suggesting ways not to use the technology. The faculty guidelines include tips for recognizing AI-generated work.
Gabe Newman can be reached at 581-2812 or at ghnewman@eiu.edu.