In many educational settings, professors discourage students from using generative AI tools, preferring that they create original content and ideas. However, policies in the College of Engineering’s School of Computing are often more permissive.
According to computing professor Marilyn Wolf, professors in the School of Computing often permit students to use AI as long as they disclose its use.
“I think it’s reasonable that students tell us when they’re using AI or when they’re using other tools, and AI tools are subject to whatever rules that the class and the School of Computing set up, just like calculators or anything else,” Wolf said. “There may be times when it’s fine for students to use certain types of tools, and other times when that’s not what the class is going after.”
AI can provide a jumping-off point for programmers writing code, but its output needs to be verified line by line, said Leen-Kiat Soh, a professor in the School of Computing.
“I think one thing that students need to be careful about is that they need to take whatever AI results they get with a grain of salt,” Wolf said. “The AI tool that they use may not be correct, and they need to be sure that they understand the topic enough to be able to figure out whether what the AI tool is giving them is right or not.”
According to Soh, generative AI tools often cannot process advanced requests.
“For more mundane tasks, it can do very well, but for more advanced tasks, with the prompts you have, it might not be able to cover all you want, all the tiny details,” Soh said.
According to Soh, addressing the use of AI in computer science is complicated because educational interests often conflict with those of industry.
“I think there are two perspectives here,” Soh said. “One is, regardless of whether we accept it or not accept it, this is the technology we’re using. That means now it becomes an issue of ‘how can we teach the students the needed and required knowledge and skills, knowing they have access to ChatGPT or CoPilot?’”
Soh said the computer science industry is adapting to the professional use of AI tools.
“From that perspective, it’s our responsibility as an institution to produce students with degrees in computer science, and we have this responsibility to teach them how to use AI as well because companies will expect them to be able to use it,” Soh said.
According to Soh, some professors are creating assignments designed to help students learn how to use AI tools while understanding concepts for themselves.
“I’ve had a colleague who I’ve interacted with who had a policy that you can use AI on your code, but you must come in and explain it. If the student can explain every single thing, they get full points,” Soh said. “I think that’s one way to cautiously embrace the use of AI tools in computer science class.”
According to Wolf, neural networks, the core technology behind AI, are effective in embedded computer vision, a technology that uses object recognition to allow computers to “see” and understand their environment. AI is not ready to replace professionals in software and hardware design, she said.
“A traditional computer vision task is detecting an object or classifying ‘is it a kitten or a puppy?’ Generating something such as a piece of software or hardware description language, that’s a very different type of task,” Wolf said.
Wolf said that current AI tools lack the exactness required to design hardware and software effectively.
“The issue is that either software or hardware design require precision and extreme correctness, and most of the AI methods that are out right now aren’t able to deliver that for even modestly sized projects.”