AI chatbots have transformed how students find information and write texts, but researchers are concerned that knowledge gained through using AI is fleeting compared to finding facts through more traditional methods. — Photo: Philipp von Ditfurth/dpa
WASHINGTON: Using AI instead of web search tools to find information leads to no more than skin-deep knowledge, with learning soon turning "passive," according to a team of US-based researchers.
"When individuals learn about a topic from LLM syntheses, they risk developing shallower knowledge than when they learn through standard web search, even when the core facts in the results are the same," the team said in a paper published by PNAS Nexus, a journal of the National Academy of Sciences in the US.
ChatGPT, Gemini and other AI chatbots have transformed how school and university-level learners write texts and find information, but researchers Shiri Melumad of the University of Pennsylvania and Jin Ho Yun of New Mexico State University say chatbot users are being prevented from "discovering information for themselves."
"Those who learn from LLM syntheses (vs. traditional web links) feel less invested in forming their advice and, more importantly, create advice that is sparser, less original and ultimately less likely to be adopted by recipients," they said in their paper, citing seven online and laboratory experiments.
"Participants reported developing shallower knowledge from LLM summaries even when the results were augmented by real-time web links," the team explained.
"While searching through LLMs can undoubtedly make it easier to acquire information," according to Melumand and Jin, it can at the same time undermine learning compared to reading items thrown up by a search engine.
"One of the risks of relying on LLM summaries in lieu of traditional web search links is that it can transform learning from a more active to passive activity – which has been shown to yield inferior learning outcomes in other settings," the team said. – dpa
