IBG's policy on students' use of AI tools in examining tasks, including theses and other written assignments within courses

Note: See course-specific guidelines for each course

There are many types of AI tools, but this policy primarily concerns Large Language Models (LLMs), such as ChatGPT, Microsoft Copilot, and Gemini, which have gained widespread use in the past year.

LLMs are language models, not fact models. They are trained on large volumes of text from the internet and other sources, which may include incorrect information. LLMs typically present their output with linguistic fluency and confidence, but their responses can combine unrelated or incorrect pieces of information. To someone unfamiliar with the topic, an LLM's output may appear highly convincing, yet the reliability of AI-generated content cannot be guaranteed because of the potential for such erroneous combinations.

LLMs cannot replace your knowledge or your work. When used wisely, LLMs can be helpful. For instance, if you have difficulty expressing yourself, LLMs can provide suggestions for improving the text you have written. However, remember that you are responsible for the content and factual accuracy of the text you submit for examination. You must also be able to explain and account for the content of your text in discussions within the course, for example with fellow students, teachers, or examiners.

Theses and other written, examined assignments must be created by you as the student, not by an AI tool or another person. Written coursework is essential for developing your ability to produce scientific or popular science texts, as well as for writing exam answers and lab reports.

You need to understand your subject to assess the accuracy of what an LLM generates. Therefore, simply reading an AI-generated summary is not sufficient preparation for your writing. As a student, it is important to search for scientific original references, read them critically and, where relevant, refer to them in your work. Text generated by AI tools cannot be cited as a source in written assignments because it is not an original source. Similarly, AI-generated text cannot be used in theses or other written course components, as the text is not authored by you.

For specific courses and assignments, it is up to the examiner to decide whether AI tools may be used, and if so, in what way, or whether their use is prohibited. Information about the use of AI in courses, including how its use should be reported, should typically be available on the course’s Studium page, website, or SOP. However, it is always your responsibility to use AI in accordance with central and local guidelines.

Even in courses where AI-generated material is permitted, you are responsible for ensuring that the material meets standards of academic integrity and relevant ethical requirements, such as critically evaluating its reliability and providing proper citations. If Ouriginal flags your submission, or if there are other indications that the text was not written by you, standard disciplinary measures will apply (see Plagiarism and Academic Misconduct in Biology Education at UU). Additionally, because you cannot always determine which sources an LLM has used, there is a risk of inadvertently plagiarising someone else's work if you rely on text from an LLM without verifying the facts against reliable sources such as original articles and textbooks.

The Biology Education Centre's AI policy is an application of Uppsala University's general AI policy.
