Guidelines on the Use of AI Tools For Academic Work
1. DON’T USE AI TO PLAGIARIZE
The following are always improper uses of AI tools:
- Generating an output and presenting it as your own work or idea without attribution.
- Generating an output, paraphrasing it, and then presenting the output as your own work or idea without attribution.
- Processing an original source not created by yourself to plagiarize it (e.g., using an AI paraphrasing tool to disguise someone else’s original work, or even the output of another AI tool, and then presenting the final output as your own work or idea) without attribution.
All of the above violate NUS policies on academic honesty and anyone found to have done any of them will be dealt with accordingly. Keep in mind that even though AI tools are not authors and thus cannot be harmed by someone stealing an idea from them, it’s still wrong of you to represent yourself as having produced something when you did not produce it.
Note that students do not have to paraphrase AI output where its use is permitted. In such cases, so long as they acknowledge the use of AI, they have not committed plagiarism, and they will not be penalized for declaring its use. The assignment will be marked based solely on the quality of the submission. However, it is the students’ responsibility to check that the AI output properly engages with the assignment prompt and has the appropriate tone and style.
2. CHECK WITH YOUR INSTRUCTORS ON PROPER USES OF AI TOOLS
Whether or not using an AI tool in a particular way is allowable depends on the learning purposes of the course and the targeted outcomes of the assignment. Some possible legitimate uses include:
- Gathering information and looking up explanations for basic concepts.
- Generating output for critique and analysis, for self-learning, or to compare against one’s own work for self-evaluation and improvement.
- Getting help with proofreading and editing written work.
The above is not meant to be comprehensive. An assignment designed to integrate the use of an AI tool, for instance, may require you to use that tool more extensively. Conversely, if there is a need to test whether you possess a certain knowledge or capability without access to AI tools or other resources, your instructors will continue to arrange for appropriate assessment settings (e.g., an on-site proctored exam or oral interview). In general, course instructors will need to impose varying restriction levels for the use of AI tools depending on the learning outcomes targeted. Whenever you have any doubts about whether an AI tool could be used for a specific assignment, or how it could be used, clarify them directly with your course instructors.
3. ACKNOWLEDGING YOUR USE OF AI
If you completed any work with the aid of an AI tool, assuming a setting in which the instructor gave permission for such tools to be used, you should always acknowledge the use. In fact, if you are ever in doubt, it is always a good idea to declare your use of a tool. Using the outputs of an AI tool without proper acknowledgement is equivalent to lifting or paraphrasing a paragraph from a source without citation and attracts the same sanctions.
You can give this acknowledgement through a note or “methods section” at the end of the assignment explaining, e.g., which AI tools were used, in which parts of the process they were used, what prompts were used to generate the results, and what you did with the outputs to add value.
One way this can be done is in a tabular form as shown below:
AI Tool used | Prompt and output | How the output is used in the assignment
------------ | ----------------- | ----------------------------------------
             |                   |
             |                   |
Alternatively, if an AI tool was used to generate a more extensive set of intermediate outputs that were then developed into a final product, you can also preserve a full transcript of the relevant interactions with the AI as an appendix for submission with your assignment. Your instructor may also require that if AI tools were not used in a specific part of your assignment, you should declare that explicitly. In all cases, seek advice from your course instructor.
4. YOU ARE RESPONSIBLE FOR YOUR WORK
Remember the limitations of current generative AI tools:
- Output quality depends on the quality of the user’s prompts.
- Outputs may be out of date, as they depend on the available training data.
- Outputs may not be accurate: they do not always present information that is true, and the ‘citations’ they generate may be made up, pointing to non-existent sources.
- Outputs may present dominant values and opinions as truth, not because other views are incorrect, but simply because dominant claims are more common in the training data.
- Outputs may be offensive or discriminatory, as AI tools may produce opinions or judgment calls that are not aligned with legal and social norms.
You should thus always assume that the AI’s output is incorrect until you have separately checked it against reliable sources (citing those sources properly) or have gone through the workings yourself. You also cannot assume that the AI’s output is relevant and sufficiently contextualized for your purposes. In some cases, the rhetorical structure of the AI’s output is usable, but the details of the content are not. Always remember that you, rather than the AI tool, are responsible for the quality and integrity of the work you submit. AI tools are tools, and as such, cannot take responsibility for any information or text that they produce.
5. START A CONVERSATION WITH YOUR INSTRUCTORS ABOUT THE USE OF AI
These guidelines are framed with typical scenarios in mind, and there are bound to be uncertainties as the field of Artificial Intelligence continues to evolve over the years to come. Whenever you are in doubt, clarify directly with your course instructors. If you are going for an overseas exchange, find out what the host university’s policy on the use of AI is and clarify any doubts with instructors there. Don’t assume that our university’s policy is universally applicable.
Remember that just because there are legitimate uses for AI tools in your academic work, it does not mean you should resort to them at every turn, especially if you are still learning the subject matter. By jumping straight to using the tools, you may end up missing an opportunity to learn the subject matter for yourself. Furthermore, if you don’t already have the subject matter knowledge yourself, you might not even be able to tell if the output is accurate or relevant. There are often also better resources you can access. For instance, if you need help with proofreading and editing, you can turn to the NUS Libraries Writers’ Centre; you will learn more that way too!
More generally, do not be shy to approach your instructors to start conversations about how the learning outcomes they are targeting go beyond what AI tools can deliver, and how you can use AI tools in ways that will enhance your own learning in your courses. The instructor-facing side of these documents tells them to do the same—start a conversation with you!
Acknowledgement: The above guidelines are created by the University Policy Workgroup for AI in Teaching and Learning. The Workgroup acknowledges the contributions and suggestions from various members of the NUS community, including both staff and students.
Feedback: Please contact askalib@nus.edu.sg if you have any queries related to the use of AI tools in teaching and learning.