Vers. 1.4, November 20, 2023
This guidance is intended to provide faculty with some tools for pedagogy and instruction in the new world of generative AI, and to outline steps for what to do if you suspect AI was used unethically in your course and assignments. AI is evolving quickly, and this guidance may change over time.
Clearly communicate your expectations.
Due to the variety of AI tools and applications, the ethical use of AI can vary by discipline, class, and even assignment. Instructors should remember that throughout their academic careers, students will encounter some assignments in which they are encouraged to use AI tools and others in which they are prohibited from using them, which can make it confusing to know when and how the tools may be used. Thus, it is helpful for instructors to clearly communicate their expectations regarding the appropriate use of AI for the course and for specific assignments by doing one or more of the following:
- Include a general statement in your syllabus that establishes clear expectations regarding the use of generative AI tools in your class. Faculty can find sample syllabus statements on the webpage maintained by the office of the provost.
- Provide examples of what would and would not constitute ethical use of AI in the context of your course.
- For each assignment, identify the situations in which AI tools are prohibited, permitted, or required.
- When AI is permitted, make clear what documentation and attribution are required and how students are expected to verify the information coming from the AI tool.
- Discuss your expectations during a class session and provide an opportunity for students to ask questions about the appropriate use of AI tools.
Advice for creating AI-resistant assignments.
There are many ways to develop assignments that will make it difficult for AI tools to serve as an adequate substitute for the student’s own work. Here are some strategies:
- Scaffold your writing assignments. Consider breaking a large writing assignment into multiple pieces so students can document the progression of their work. This technique works well for combating plagiarism of any kind in academic writing. It will be much harder for a student to get away with submitting a final draft generated by AI if you have observed that student’s thinking and writing process throughout the course.
- Run your prompts through AI. Consider testing your assignments by running them through an AI tool a student would be likely to use. This will give you a sense of whether a student could use AI to generate an answer that would receive a good score based on the content and will allow you to reword the prompt if necessary.
- Use assignments that incorporate personal stories, class discussion, authentic situations, and/or sources and citations. Consider assignments that rely on the personal experiences of students. Create authentic assignments out of collaborative classroom discussions where a student must be present in class to complete the assignment. Require verifiable sources and citations.
- Promote discussion and student sharing about their assignments. Consider asking students to discuss their work orally as a presentation or video submission. Or ask students to write a personal reflection about their writing process. This will be difficult for students whose work is largely AI-generated. If students know that this process will be part of the assignment, they may be incentivized to generate their own work.
- Promote library resources. AI tools frequently fabricate sources and currently cannot reliably create an accurate bibliography. The library has discipline-specific resources for students regarding citation and reference practices.
Detecting AI-generated content.
Many of the red flags that occur when students plagiarize work from the internet or other sources occur with AI-generated work as well. For example, if a student’s answers to an online test question use specialized terms that did not come from class discussion or the resources you provided, that may indicate the student is using unauthorized sources, including AI tools.
Providing definitive proof that a student has used an AI tool for written work is difficult as predictive text generators will generate slightly different text every time they are asked the same question. However, running your assignment prompts through AI (sometimes multiple times) can still be helpful in detecting AI-generated content. Because they have been trained on specific text, predictive text generators tend to generate similar phrasing, cadence, and word order, making it likely that there will be some overlap between a student’s answer and an AI-produced answer you generate at a later time. Identifying these overlaps can help you frame your discussion with a student if you are concerned that they used an unauthorized AI tool for your assignment.
While it is tempting for faculty to use technology-based solutions to detect AI in student work, these detection technologies do not work very well. Detection tools claim to identify work as AI generated, but often cannot provide evidence to support that claim, and tests have revealed significant margins of error. This means that students may be wrongly accused of improper use of generative AI. Use these detection tools carefully, if at all.
What should I do if I suspect a student has used AI unethically?
Instructors who believe a student used a prohibited tool to generate coursework should discuss their concerns with the student. Instructors should be prepared with specific reasons for why they suspect the use of AI (the use of specialized terms, overlap with AI-generated content, fabricated sources, etc.) and allow the student an opportunity to address those concerns. If the student does not adequately address them and the instructor still believes there is strong evidence that the student used an AI tool in an unauthorized manner, contact UMW’s Honor Council and follow the same process as for any other suspected Honor Code violation. If you have any questions about a possible violation and/or submitting a violation report, please feel free to reach out to Wes Hillyard (rhill5ch@umw.edu) for individual consultation.
Model productive use of AI tools to help students become AI literate.
AI presents a number of challenges to faculty in the classroom, but it also offers a number of promising possibilities. It is very likely that students will be asked to use AI tools in the future workforce. Demonstrate to students how to use AI as a brainstorming or research tool, while expressing clear expectations and guardrails. Consider incorporating discussions about how AI is being used in your discipline and in the jobs students in your field are likely to enter. AI tools can help students dive deeper into the subject matter, learn about scholarship on the topic, and develop stronger original work. Experimenting with these tools in class can help students see these possibilities.
AI Guidance for Students
Academic integrity guidance for students can be found on the Artificial Intelligence (AI) Tools page on UMW’s Academic Integrity site.