SINGAPORE: In an age of ChatGPT, are students actually doing their homework any more? Even if they are, chances are they are using artificial intelligence tools, a local survey has found.
This is happening not only at the university level, where the use of such tools is allowed with rules in place. All of the secondary school students surveyed said they used AI tools.
They are frequent users too: 84 per cent of secondary school respondents use AI for their homework at least weekly, with 29 per cent doing so several times a week. The rest use it at least monthly.
The survey, commissioned by CNA’s Talking Point, reached out to 500 Singaporean students aged 15 to 25 from secondary schools to higher education institutions. One of its aims was to find out how they were using AI for their assignments.
This comes after Pew Research Center found that 26 per cent of United States teenagers aged 13 to 17 used the AI chatbot ChatGPT for their schoolwork last year, double the figure in 2023.
In Singapore, where all the survey respondents have used AI for this purpose, about 86 per cent said they were doing so for “generating ideas for their homework/assignment”.
Those at secondary school, however, were more likely than the rest to be using these tools for solving mathematics problems (63 per cent) as well as for proofreading and checking grammar (47 per cent).
The latest survey findings suggest a greater prevalence of AI use among students than previously thought.
In a local survey of parents conducted in December by the non-profit Centre for Evidence and Implementation, 68 per cent of them said their teens aged 13 to 17 were using generative AI tools for their schoolwork at least weekly.
What are the consequences if students are letting AI take over, asks Talking Point in a one-hour special this week.
HOW STUDENTS ARE USING IT
There is a wealth of AI tools that students can tap, and not only ChatGPT.
While ChatGPT can help with various tasks, “other AI tools often specialise in specific tasks that I might not do perfectly or efficiently in certain situations”, the chatbot itself replied when asked why it could not do everything for students.
“Some AI tools like Photomath or Wolfram Alpha are specifically designed to handle complex mathematical calculations.”
There are also AI tools such as Grammarly and Wordtune for English homework, ChemBuddy and Labster for science homework, and Perplexity AI and Google Scholar for history and social studies, recommended ChatGPT.
Secondary 3 student Rebekah Low usually uses AI to “help generate points” for her English compositions. “If I feel I’m stuck, I’d just get ChatGPT to list some ideas, just to get the brain juices flowing,” she said.
Take, for example, this topic: “People can only be happy if they feel they’re treated fairly. Do you agree?”
If she agrees with the argument, she will ask ChatGPT for a list of reasons why people feel happy only when they are treated fairly.
“I’ll pick out the ones that I feel are the easiest to elaborate on,” she said, “and then … write the composition.”
Secondary 4 student Ryan Ukail is another who uses AI to do his English homework, for example a writing assignment about something he regretted.
One of the sentences he wrote was “I stepped onto the long, muddy grass”. It was “not the way (he) wanted to sound”, so he asked AI to “make this more in-depth”.
He got this descriptive answer: “I recall cautiously stepping on the wet blades of long grass.”
Secondary 2 student Dorelle Ong, meanwhile, uses AI to generate answers when she does not want to “waste” her time and effort.
If she has a lot of Chinese homework, including filling in the blanks, she will want to do that section “quickly” by uploading a screenshot of that worksheet to ChatGPT with the prompt, “Give me the answers, please”.
She also sends photos of maths worksheets to ChatGPT. “I find maths quite hard, so I don’t even really bother,” she said.
TEACHERS KNOW, BUT NOT ALWAYS
The thing is, their teachers know that they are using AI, the students said.
Rebekah said her English teacher often encourages students to use AI for generating ideas “in point form” and to “really think” before elaborating on them for their composition. “It’s still essentially our own work,” she said.
Ryan, who also uses AI for geography and Malay, his mother tongue, said his teachers tell students how to go about “using it to help you build on your answers”, while advising against “using it entirely for your answers”.
But he added: “Our school isn’t too uptight about it.”
In Talking Point’s survey, 51 per cent of secondary students said they modify AI content significantly to reflect their writing style when using it to write essays and reports, while 18 per cent “make minor edits for grammar or clarity”.
Another 31 per cent use AI content as a reference but rewrite it in their own words.
Because most people “have a specific writing style”, said Rebekah, she thinks teachers “definitely notice the discrepancies” when students copy from AI wholesale.
There are also AI detection tools, some of which claim to be 99 per cent accurate.
But “there’s no good measure” out there, said Jonathan Sim, assistant director of pedagogy at the AI Centre for Educational Technologies, National University of Singapore (NUS). “They’re all very, very fallible.”
He demonstrated this with two AI detectors, QuillBot and GPTZero, and this typical Secondary 4 composition topic: Write about a time when you had to make a difficult decision between choosing a friend and doing the right thing.
While the composition Talking Point generated using AI was assessed by GPTZero to be 74 per cent AI-generated, the same sample got an AI score of zero on QuillBot.
Another composition, written by programme host Munah Bagharib, turned out to have a 100 per cent AI score on GPTZero and then 0 per cent on QuillBot.
“That’s how unreliable (AI detection) is,” said Sim, adding that many teachers do not realise this.
Whether they use these tools or try to detect AI themselves, his “biggest concern” is the issue of trust between teacher and student.
If teachers sense that “maybe something isn’t right” with a student’s writing quality, they could ask, for example, what led the student to write in that way. But he cautioned against turning these efforts into a cat-and-mouse game.
“If the student writes like us, and it gets constantly flagged (by) … let’s say, five teachers who say, ‘Hey, I think you use AI’, that’s going to affect the student and the student’s motivation to learn,” he said.
NO “CLEAR GUIDANCE”?
Sim has been leading the charge to embrace proper AI use in schools. But to his knowledge, “there isn’t any very … clear guidance across the board”. This has partly to do with the “broad spectrum” of views among teachers.
“On one end, you have some teachers who are very excited,” he said. “They want to encourage their students to use AI.
“You also have some who … don’t know how to deal with it — ‘let’s just pretend it doesn’t exist.’ And then there are some, a small minority, who are like, ‘Oh, AI is bad.’”
Among secondary students surveyed by Talking Point, 51 per cent were in schools with rules governing the use of AI. But another 33 per cent were unsure of these rules, while 16 per cent said there were no rules.
When asked if they were concerned about being penalised for AI use, a third of respondents said they were “not at all” concerned.
In reply to queries about whether there are standard guidelines on use of AI by teachers and students, the Ministry of Education (MOE) pointed to the AI-in-Education (AIEd) Ethics Primer, an online learning module covering the ministry’s AIEd Ethics Framework.
Teachers can access this to guide their use of AI in the classroom.
As for guiding students to use AI to support their learning, teachers emphasise ethics related to the use of data and AI, such as the importance of integrity and proper data handling, said the ministry.
“MOE will also facilitate professional conversations in schools on topics such as the benefits and potential risks of AI, and how these risks can be mitigated,” added the ministry.
“As with any form of technology, AI should be used in a fit-for-purpose, pedagogically sound and age-appropriate manner that enhances learning outcomes.”
That is also the objective Sim hopes to achieve. He urged his fellow educators to “rethink how we engage students” and how to use AI to enhance the learning process.
“Minimally, we should make clear how much AI should be used,” he said.
“We should also be clear with our students (about) what they can achieve by doing the assignment. Because one thing AI has done is that it’s taken away a lot of the motivation for learning.”
THE RISK OF PERPETUATING AN ILLUSION
With students turning to AI for help with their homework, how does this affect their grades?
For one thing, AI chatbots are known to present fabricated information as fact. According to an OpenAI technical report released last month, newer chatbot models are more prone to hallucinations than older ones.
This could trip up students, as Munah discovered when she fed O-Level questions from the ubiquitous 10-Year Series to the top AI tool recommended by ChatGPT for each subject.
Erica Low, a tutor from The Polished Word who marked the maths and science papers picked out by Munah, revealed that the maths paper scored 62 out of 90, a grade of B3.
“There are many careless mistakes,” said Low, who was told only later that the work was done by AI. “There are some formulas that you’ve used wrongly.”
The grade on the chemistry paper was B4, with 45.5 out of 75 marks awarded. “You’re missing a lot of the keywords,” she pointed out as she ran through the sections.
The physics paper received a failing grade, with 18.5 out of 75 marks. Besides wrong answers, there was “a lack of workings” as well as keywords, she cited.
The English homework done by ChatGPT did not fare much better: D7, mainly because it “didn’t really capture what the questions wanted”, said English and humanities specialist Lia Tan.
She also assessed the history and social studies homework that was done. The grades for both were C6. “You were lacking in concrete examples,” she cited for social studies.
Lee Li Neng, a deputy director at NUS’ Centre for Teaching, Learning and Technology, stressed the importance of “evaluative judgement” — judging whether work done in a particular domain is “good-quality work or not”.
Students must “learn how to think with AI and not just let AI think for them”, he advised. While this is easier said than done because “students are time-starved”, learning is meant to be challenging, he said.
If students are using AI as the path of least resistance, then “this is harmful for everybody”.
“The student feels … ‘I need to pretend all the time that I know (the lesson material),’” Lee continued. “This teacher is also operating under an illusion: ‘I’m doing such a great job. … My students are all learning so well.’
“At some point, the illusion is going to break. And that creates, unfortunately, a huge amount of stress … for the individual student.”
For two of the students whom Talking Point met, AI has helped them to learn, they said. “It gives me a different perspective,” said Rebekah, whose English composition marks have improved.
Ryan is also “doing a lot better” in English and has learnt to be more descriptive. And he is thinking in ways he “never would’ve thought of before”, he said.
Dorelle, however, does not think she learns all that much when using AI to do her homework. “But it helps me get it done,” she said.