
In dialogue with a chatbot

ChatGPT comes with challenges – and possibilities. How will the university handle the new technology? Photo: Rolf van Root/Unsplash

There is a lot of talk about ChatGPT just now – one of the chat-based AI services that delivers answers to all sorts of questions in a matter of seconds. But the new technology does not merely generate answers; it also raises many questions. To gain some insight into what everyone is talking about, LUM has given it a try.

It is easy to get started – all that is required is to create an account on their website. Once in, the chatbot is ready for conversation:

“How does ChatGPT work? Explain it for the readers of LUM.” A few seconds later, the answer appears:

“ChatGPT is a deep learning model that is trained on large volumes of text data. It uses transformer architecture to understand the context of text and generate answers or solutions. The user interacts with the model and feedback is used to improve the answers.”

Rephrasing the question

The language is a bit stilted and the description is difficult for a layperson to understand. A rephrasing of the question is in order:

“How does ChatGPT work? Explain it to a child.” The answer comes back:

“ChatGPT is like a smart person in the computer who has read many books and texts. It has learned to understand language and can answer questions or help perform tasks that require an understanding of language. When you ask ChatGPT a question, it thinks back on everything it has read and gives you an answer based on what it has learned. It is like having a super smart friend in the computer who can help you with everything that involves words and languages!”

This was easier. But how smart is this friend in the computer? It calls itself super smart but writes questionable Swedish.

Not a friend anymore

Friend or not – the chatbot is given another chance, this time with the question, “Explain ChatGPT to a beginner.” But when the answer begins with “ChatGPT is a smart computer program...” and uses the wrong gender for the noun in Swedish, the friendship is over.

The feeling that remains after all the questions – and quick replies – is something akin to a bad conscience. Should we stop being friends just because the chatbot does not know Swedish grammar? When the questions are posed in English, the answers are much better. Which raises the bigger question – are there any limits to how it can be used?

Elin Bommenel. Photo: Åsa Hansdotter

One use that worries lecturers is the possibility of cheating. In autumn 2022, Lund University launched a pilot project to investigate how the University can prevent cheating in examinations. The project is led by Elin Bommenel, senior lecturer in service studies.

“One way to reduce anxiety is to get to know the chat tool,” she says, and goes on to list a few things teaching staff should think about: What kinds of questions elicit usable answers? What level are students answering at on the test questions? Is there a pattern in the answers that makes it possible to spot when they come from a bot? Does the student realise when the language model has been creative and invented fictional answers?

"The teacher's salvation"

Elin Bommenel emphasises that if the lecturer asks for a human understanding of something, this is beyond the AI’s capabilities. She calls it the “teacher’s salvation” as it becomes harder to cheat using the chatbot when the questions require analysis or a critical perspective:

“The chatbot usually gives very general answers to complex questions and produces strange mistakes when it is forced to invent things in the absence of available information. When the bot is used in English, the structure and grammar are often superhumanly good – people rarely write that well.”

But lecturers may still need to think about how to run their examinations, she argues.

Håkan Eftring. Photo: Jessika Sellergren

Håkan Eftring, senior lecturer in rehabilitation engineering, agrees. He has thought a lot about the user perspective – for teaching staff and students – and has come to the conclusion that he needs to work in a new way with examination questions because he does not want to forbid students from using AI.

"Who am I examining?"

“If I prohibit the use of chatbots, will I be able to detect whether they have used AI when answering questions?”

He recognises that he needs to ask students more often for their own examples, and that one of the consequences will be more oral exams and more examinations on campus.

“I want to know who it is I am examining. That it is a student, not a chatbot,” he says.

 

Elin Bommenel on ChatGPT

ChatGPT is a chatbot built on GPT-3 – the third-generation “generative pre-trained transformer”. The chatbot pretends to be a search engine but should really be described as a language model that, having been trained on a large amount of textual data, can weave texts together in a way that people perceive as an answer to a question. Advances are moving fast – the fourth generation is already here: GPT-4.

AI in teaching

Lund University has set up a working group to collect questions from teaching staff about ChatGPT: how to detect chatbot use, how to design examinations, and also how the tool can be used to develop learning and teaching. Read more at https://www.education.lu.se/en/teaching-tips/ai-education

Magazine cover. Photo.

About LUM

The first edition of Lund University Magazine – LUM – was published in 1968. Today, the magazine reaches all employees as well as people outside the university. The magazine is published six times per year. Editor: Jan Olsson.
