Hong Kong universities relent to rise of ChatGPT, AI tools for teaching and assignments, but keep eye out for plagiarism


Breakneck pace of AI development prompts most universities to adopt ChatGPT and other tools. Academics revamp teaching, while students who do not acknowledge AI help with assignments will face penalties. — SCMP

Ashley Lam Cheuk-yiu, a second-year marketing student at the Hong Kong University of Science and Technology (HKUST), had an assignment to design an advertising campaign for a brand using concepts taught in class.

The 20-year-old wrote a 600-word essay that proposed tech giant Apple should use its famous “Think Different” slogan to promote human rights and gender equality as well as its iPhone products.

Then she asked ChatGPT, a generative artificial intelligence (AI) tool that emerged last year, to do the same assignment.

Almost instantly, it produced a coherent proposal for a campaign and titled it “Efficiency Unleashed: Experience the Future with iPhone”.

“I think AI did a better job than me in naming the campaign,” she said. “But some of its paragraphs were too repetitive and less creative.”

HKUST is one of the adopters of AI as a learning tool. Photo: Winson Wong

The speed of AI development has led to most of Hong Kong’s universities adopting the generative tools in the new school year.

The software is designed to trawl available information, or have data fed into it, and answer questions. ChatGPT, developed by OpenAI, can respond in humanlike fashion and produce essays based on content available online.

Academics navigating the uncharted waters of AI said they saw problems in ensuring that students acknowledged their use of the tools, and that they also had to revamp their teaching styles.

It took only six months for universities to switch from a blanket ban on AI to giving students some leeway and developing ways to incorporate the technology.

The University of Hong Kong (HKU) reversed a ban imposed in February and will allow students to use AI platforms for classroom work and assignments from this month.

Microsoft, which has backed OpenAI, earlier announced it would join forces with eight universities in Hong Kong to design educational solutions and run a series of workshops to encourage the use of AI.

HKUST has allowed students to use GPT-4, the latest AI model developed by OpenAI that can produce text content, and DALL-E 2, which can turn ideas into AI-generated images.

Academics said a major problem was detecting plagiarism and assessing originality, as they had to assume that most students would seek help from AI-driven chatbots.

“I tried uploading two essays to five different plagiarism detection programs, and all the ‘similarity percentages’ were different,” said Chung Shan-shan, a senior lecturer of environmental and public health management at Baptist University.

“It is challenging to judge whether a student actually used generative AI.”

Software programmes used to spot plagiarism include Turnitin, an internet-based similarity detection service widely used in academia to check assignments and articles, which has been upgraded to recognise text written by AI.

Chinese University has also developed a plagiarism detection system called VeriGuide.

Baptist University students are now allowed to use the GPT-3.5 model with a token system developed by the university. But they must declare if they used generative AI for an assignment, specify the tool used and identify the portions of an essay that were AI-generated.

Chung highlighted ChatGPT’s ability to mimic human conversations and predicted it could help students better understand concepts and improve their fluency in essays.

She added that she had revamped her teaching methods as well, introducing informal tests on current affairs.

One limitation of AI tools was that they relied on online information that could be outdated, which became apparent when students attempted to answer questions on more recent topics, she said.

GPT-4 is based on information available online up to September 2021.

The universities have also had to consider how to penalise students who break the rules, for example by reproducing generated content wholesale or failing to indicate which portions of their work are not original.

Students at Baptist University who are suspected of wrongdoing or who refuse to admit to using AI will have to sit a face-to-face oral examination. Penalties include deduction of marks for the assignment.

Accounting senior teaching fellow Bruce Li Kar-lok of Polytechnic University (PolyU) was less worried. He said students’ abilities still trumped those of chatbots in his discipline.

He said ChatGPT got only two out of six questions correct in an assignment he designed on share calculations involving dates, prices, interest, tax and dividends.

Li explained the AI software could handle straightforward, basic formulas, but had trouble figuring out the logic behind complex questions or doing additional calculations.

“AI chatbots are like primary level students ... They need us to teach them logic,” he said.

Li added that he would encourage students to compare answers generated by chatbots with their own and spot the mistakes. He said AI tools still had the potential to improve students’ logical thinking and problem solving in the long term.

PolyU assistant nursing professor Arkers Wong Kwan-ching said he already saw some merit in the use of AI in teaching.

He added he had found that students were more willing to interact and ask questions in class after using AI tools that provided “conversational feedback” as preparation.

“Before entering the class, students can use chatbots to quickly clarify complicated concepts. This allows deeper discussions as we can save time explaining concepts,” he said.

To prevent plagiarism, Wong said he planned to incorporate more “contextual” questions that the chatbots’ databases would not have sufficient information to answer.

Senior lecturer Jean Lai Hok-yin, a researcher of e-learning systems and intelligent agent technologies at Baptist University’s department of computer science, said the decision by the institutions to allow AI tools was less about technological advancement than equipping students to cope in a rapidly changing world.

“These AI-driven tools are unstoppable. When they become an essential part of daily life and work, constructing effective prompts will be very important,” she said.

“The more accurate questions you can ask, the better answers you will get in everything.” – South China Morning Post
