Commentary: Lecturers need to give students clearer instructions about AI use

LaksaNews

SINGAPORE: Imagine you are a lecturer grading students’ essays about their research methods for the term project. You notice that three students mentioned using artificial intelligence in different ways.

Jane used an AI tool to help format citations in APA style. Don discussed topic ideas with ChatGPT to help narrow down his research focus. Beatrice ran her draft through an AI writing assistant to catch grammatical errors before final submission.

You realise that you did not explicitly address AI use in your course syllabus, and your university's policy broadly states that students must not use such tools without permission from the instructor.

The three students made good-faith attempts at disclosure, but you are uncertain whether their uses violate the spirit of academic integrity. How do you proceed?

This is a hypothetical scenario, but it is happening across universities. Students routinely use programmes like Grammarly without considering them AI, while lecturers may permit some uses, such as citation assistance. The recent incident at Nanyang Technological University illustrates how students and lecturers can have different interpretations of what’s acceptable.

FAIR AND UNFAIR AI USE


In recent years, artificial intelligence has advanced faster than policies can keep up, resulting in a grey area between AI use and abuse.

Most universities have broad definitions of the acceptable use of AI. The University of Pennsylvania gives a simple analogy: “In the absence of other guidance, treat the use of AI as you would treat assistance from another person. For example, this means if it is unacceptable to have another person substantially complete a task like writing an essay, it is also unacceptable to have AI complete the task.”

Generally, the use of AI for brainstorming, drafting and idea generation is permitted; where it is, students are also required to explicitly declare or acknowledge their use of AI in assignments.

Unfair AI use, then, entails passing off AI-generated work as one’s own without proper attribution, or employing AI where it is explicitly prohibited in order to gain an unfair advantage.

TASK-SPECIFIC GUIDELINES FOR AI


Given the wide scope of universities’ academic policies, it is up to lecturers to give instructions regarding AI use, specific to each assignment.

For essays and written tasks, instructors should ensure students understand the distinction between research and writing assistance. Students should be required to disclose AI usage and show documentation to verify authentic thinking.

Problem sets and technical work such as coding require a different approach. Instructors must distinguish between when AI assistance is educational and when it becomes academic dependency.

In mathematics courses, for instance, AI might be permitted for checking calculations but prohibited for generating solution methods. Students can also be told to show all work steps manually and to be prepared to explain their solution process to the class.

For creative and analytical assessments, instructors can tell students that AI may be used for initial inspiration and research, but that all content must be produced by students.

Students in fine arts, for instance, may be allowed to utilise AI for brainstorming sessions, but must develop original pieces. Meanwhile, business students may utilise market analysis tools powered by AI, but must produce unique strategy recommendations.

Lecturers can also require students to document any AI-generated ideas that influenced their work.

These guidelines seek to develop each student's capabilities not only in critical thinking but also in human-AI collaboration.

PREVENTION OVER PUNISHMENT


However, even with clear AI guidelines, some students will be tempted to use tools and software to circumvent the rules.

For example, students may use “humanising” software to disguise an AI-generated assignment and bypass detection software. Students may also use AI tools during oral exams, as current technologies allow such apps to run on a mobile phone and relay answers wirelessly through a discreet earpiece.

Rather than play detective, institutions should focus on prevention through clear communication. This means writing unambiguous AI policies with concrete examples.

Other prevention strategies include AI literacy training for faculty and students, redesigning assessments that are more focused on processes rather than answers, and verifying students’ understanding through conversational assessments and in-class discussions.

Universities can also consider "AI-transparent" approaches where students document their use of AI tools throughout the assignment, similar to how they cite traditional sources. This creates accountability on the students' part while avoiding the adversarial effects of detection-based enforcement.

Clear AI guidelines protect the value of university degrees and prepare students for an AI-driven future. They help students develop ethical instinct, emotional intelligence and creative thinking – human skills that AI cannot replace.

University graduates will likely work alongside AI tools and apps throughout their careers. The challenge for universities is not curbing over-reliance on AI or banning it outright, but teaching students how to collaborate with AI responsibly.

With clear and transparent guidelines, universities can uphold educational integrity while preparing students for an AI-enhanced world.

Dr Caroline Wong is Associate Dean for Teaching & Learning and Associate Professor of Business, and Mr Harry Klass is Senior Learning Technologies Specialist, at James Cook University (Singapore Campus).
