Students should not be deprived of using artificial intelligence (AI) to enhance their learning experiences, but faculty members must establish clear boundaries anchored in ethical considerations when integrating it into the delivery of quality education, according to Dr. Elmer Rico Mojica, associate professor at Pace University in New York.
Mojica discussed the risks and benefits of generative AI tools in teaching and research in a webinar hosted last month by the Technological Institute of the Philippines (TIP), titled “Artificial Intelligence in Teaching and Research: Boon or Bane?”
The webinar was part of an ongoing partnership between TIP and the Philippine-American Academy of Science and Engineering (PAASE), of which the speaker is an active member.
The longtime chemistry professor from the University of the Philippines Los Baños shared strategies for working with the technology, drawn from his experience using applications such as ChatGPT and Google Bard in his own teaching and research.
Recognizing the inevitable impact of AI on education, Mojica advised school executives and academics to lay down ground rules for incorporating these large language model chatbots into the work of teachers and learners.
“Since this is a discipline-based decision at the department level, teachers should tell their students what is acceptable or not, and the students should know it. That’s why we encourage them (educators) to put it (terms and conditions of AI use) in the syllabus,” he said.
Mojica also emphasized the importance of promoting academic integrity among students from the start, so they can become responsible users of technology and be well informed about the consequences of its misuse.
Citing his personal experience and recent research, he said students’ penchant for AI tools may actually stem from a desire to learn more and produce better outputs rather than to cheat, as most teachers fear.
Nevertheless, Mojica acknowledged the need to put up guardrails against the risks attached to ChatGPT and other generative AI applications, given their limited sources of information. This is where subject-matter expertise becomes indispensable for educators.
When teachers spot ambiguities or questionable information in student submissions, they may ask the students to conduct additional research using traditional sources of information such as books and journals, the US-based educator said.
Although his experience in using chatbots to generate ideas and other prompts for writing and research has been positive, Mojica said AI can sometimes “hallucinate and give you answers that don’t exist.” Critical thinking, therefore, remains crucial when accessing such tools.
The lecturer, meanwhile, backs the use of AI-powered resources to ease the workload of teachers particularly when it comes to grading students’ outputs or automating curriculum design. This saves them time while keeping them abreast of useful technologies.
Allaying the fears of many who see AI as a threat, Mojica echoed the argument that “AI will not replace humans, but the people who use it will.” Still, gray areas such as the sinister rise of deepfakes and other manipulative tools of mass disinformation must be properly addressed.
“If you don’t know which one is true, you will believe anything generated by these AI tools. To prevent this kind of misinformation, I think human expertise and judgment are needed,” he said. Consequently, it is essential to constantly improve our digital literacy skills.
Mojica also stressed that personal interaction between students and teachers must be preserved as much as possible despite the proliferation of AI and machine learning technologies, because “personal intervention is the best way to deal with the students.”
“AI like ChatGPT and Google Bard are just tools that can help us improve the teaching, learning and research writing process. We should ensure that the rise of AI does not get out of hand,” he concluded.