AI and Teaching: Policies and Guidelines
Dept of Peace and Conflict Research, Uppsala University
Version December 2024
The following policies and guidelines apply to all credit-bearing courses at the Department of Peace and Conflict Research, at the BSSc, MSSc, and PhD levels.
Information to students
Generative Artificial Intelligence (henceforth GAI) tools can generate new content such as text, images, or audio based on patterns in training data, rather than just processing existing data. In recent years, GAI tools have proliferated, offering improved capabilities and user-friendliness across a wider range of applications, with many options available for free. Given the rapid pace of AI development, it is currently difficult to define exactly what these tools can and cannot do.
This has crucial implications for academic teaching and learning. GAI tools can provide students with important resources that have the potential to support, improve, and accelerate learning. At the same time, they raise the risk that such tools replace tasks we consider core to academic learning, as well as questions about fair and secure assessment (i.e. risks of cheating and plagiarism).
In November 2024, Uppsala University adopted central guidelines for AI in teaching and examination (Dnr UFV 2023/2129). As these guidelines are broad and delegate responsibility to individual teachers, the Department of Peace and Conflict Research will continue to use its own guidelines – this document. These guidelines align with the central guidelines but are more specific and adapted to the courses we offer. With the department-wide guidelines, we wish to ensure that GAI tools are used responsibly, ethically, and in a way that promotes and accelerates learning, while also safeguarding fair and secure assessment. To do so, our current guidelines rest on the following pillars:
First, we allow the use of GAI tools for learning, but not instead of learning. We encourage students to explore and leverage GAI tools as powerful aids in their learning journey. When used responsibly, GAI tools can be helpful in understanding complex concepts, exploring different perspectives, and organizing ideas. By engaging with AI tools, you can enhance your critical thinking skills, improve your ability to ask insightful questions, and develop a more systematic approach to problem-solving. Examples of using GAI for learning include: using a tool such as ChatGPT to explain concepts, brainstorm ideas, suggest counter-arguments, improve writing, detect errors, work through problems, or provide feedback. The goal is to use GAI to deepen your understanding, spark creativity, and ultimately produce work that reflects your own intellectual growth and academic integrity.
However, GAI tools should neither replace activities currently considered core to learning about our subject, nor replace activities we currently consider core to academic work. An example of using GAI tools instead of learning is giving ChatGPT the assignment instructions and submitting the generated text directly or with little further engagement. As learning outcomes and forms of assessment vary, your course convener will provide course-specific guidelines on the extent to which GAI tools are suitable for learning in their course.
Second, the learning outcomes always take precedence. Central to all courses are the learning outcomes, as specified in each course plan. These learning outcomes have not become less relevant with the emergence of GAI tools. When grading assignments, teachers can withhold a passing grade if they deem that a submission does not sufficiently reflect independent work (as a learning outcome). Additionally, students risk disciplinary action if the examining teachers conclude, when grading assignments, that a student has used GAI instead of learning rather than for learning.
Third, as always, students are responsible for all work submitted in their name. When you as a student submit work in your name, you are responsible for the words, images, and data included in the submission. You should be able to explain and justify the material used, describe how the final assignment was completed, and develop your thinking around key ideas or choices made in your submission. If you use a GAI tool to complete an assignment or examination in a way other than the teacher intended, you may mislead the examiner about, for example, your own knowledge and skills or how you carried out the assignment. Such actions can be considered a disciplinary offense according to the Higher Education Ordinance. Further, copy-pasting text, images, or other output directly from a GAI tool could constitute plagiarism. Students should be prepared to orally explain the content of their submitted assignments and the process through which they produced the text or any other output. Students should also be prepared, if requested, to submit additional documentation that specifies how GAI was used and that can verify their workflow.
Fourth, students may not treat output obtained from GAI prompts as an authored source. In other words, presenting information and backing it up with a reference to, e.g., ChatGPT and a date is not allowed. A GAI tool cannot be cited as an author: there is no individual human mind behind the output, the text has not undergone any form of scrutiny (such as peer review), and the quoted passage will never be traceable.
Fifth, students should be aware of the limitations and risks involved in relying on GAI tools, relating to both the input and output phases. In the input phase, it is important to know that students may not upload texts or images to GAI tools that are copyright-protected or fall under the data protection policy. This includes, but is not limited to, another student's work and course material. In the output phase, be aware of risks associated with text generated by GAI tools, such as:
- They often produce incorrect or false information;
- They may provide nonsensical information, as they cannot recognize if a prompted question is absurd or has no answer;
- They may provide you with copyrighted information or personal data that you do not have the right to use;
- They may not give a balanced view of the subject; this includes reproducing myths, biases, and stereotypes as truths;
- They may plagiarize someone else’s text or ideas without giving proper credit;
- They may fabricate citations, i.e., they cite sources that do not exist.
Students of peace and conflict studies should familiarize themselves with, and remain cognizant of, these risks, as overlooking the risks of GAI use may limit their learning, misrepresent their thinking, and (in the worst case) lead a student to fail an assignment or face disciplinary action. Carefully weighing the risks and benefits of GAI use is also good practice for their future careers, where GAI use may raise similar concerns.
In sum, we urge students to follow these guiding principles:
- Use GAI tools for learning, not instead of learning.
- The learning outcomes of the course take precedence.
- Remember that you are always responsible for work submitted in your name.
- Do not cite a GAI tool as an author of generated text.
- Be aware of the current limitations and risks of using GAI tools (both in terms of input and output), and remember to check the terms and conditions as with any software.
For questions on how the policy applies to a specific course, contact the course convener.
For questions about the policy, contact the Director of Studies (erika.forsberg@pcr.uu.se).