Notes from the “AI Plagiarism: Detection, Mitigation, and Course Policies” Tech Tuesdays session that took place on February 25, 2025. Zoom AI Companion was used to generate a summary of the Zoom session, Claude Sonnet 3.7 was used to rewrite the content, and the result was then edited and expanded upon by a human (me).

Session Overview

This Tech Tuesday session explored the complex landscape of AI in education, focusing on plagiarism detection, mitigation strategies, and developing effective course policies. The discussion brought together faculty members to provide practical insights for educators navigating AI challenges in academic settings.

AI in Education and Plagiarism Concerns

Concerns were raised about AI-generated content in student papers, along with the responsibilities of journal editors in identifying academic plagiarism. The discussion touched on AI plagiarism detectors, noting that tools like Turnitin's AI detector can identify AI-generated content but also have an uncomfortably high false positive rate. Caution was also advised against relying solely on such tools because of the inequities they can create among students: content from paid chatbots is detected at lower rates than content from free chatbots.

Key Point: The majority of current AI detection tools work by analyzing language patterns such as perplexity and burstiness, but these methods aren't foolproof and can lead to false positives.
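
A minimal sketch of the perplexity signal these detectors rely on, using an openly available model (GPT-2 via the Hugging Face transformers library). The model choice and the interpretation in the comments are illustrative assumptions; no specific commercial detector works exactly this way.

```python
# Sketch: score a passage's perplexity under GPT-2. Low, uniform perplexity is
# the kind of "too predictable" pattern detectors flag; human writing tends to
# score higher and to vary more from sentence to sentence (burstiness).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Average per-token perplexity of `text` under GPT-2."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels=input_ids makes the model return mean cross-entropy loss
        loss = model(enc["input_ids"], labels=enc["input_ids"]).loss
    return torch.exp(loss).item()

print(perplexity("The quick brown fox jumps over the lazy dog."))
```

A score like this is only one weak signal, and the false-positive concerns noted above apply no matter where the threshold is set.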

Challenges in Detecting AI-Generated Content

Several significant challenges in detecting AI-generated academic content were highlighted:

  • AI detectors analyze language patterns but are not infallible
  • Students are increasingly using "AI humanizers" to evade detection
  • Relying on multiple AI detectors creates an additional burden for faculty
  • Even watermarking technologies like Google's SynthID can potentially be circumvented

The session shared some common indicators of AI-generated text, including the presence of markdown formatting and the frequent use of certain words like "delves" and "showcasing."
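
As a toy illustration of those surface indicators, the sketch below scans a passage for the tell-tale words and markdown formatting mentioned above. The word list and patterns are assumptions chosen for demonstration; as the session stressed, signals like these are far too weak to support an accusation on their own.

```python
# Toy check for the surface-level indicators discussed above: markdown
# formatting and overused words such as "delves" and "showcasing".
import re

SUSPECT_WORDS = {"delve", "delves", "showcase", "showcasing"}  # illustrative list only
MARKDOWN_PATTERN = re.compile(r"\*\*.+?\*\*|^#{1,6}\s|^[-*]\s", re.MULTILINE)

def surface_signals(text: str) -> dict:
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {
        "suspect_word_hits": sorted(words & SUSPECT_WORDS),
        "has_markdown": bool(MARKDOWN_PATTERN.search(text)),
    }

print(surface_signals("This essay delves into the topic, **showcasing** key ideas."))
```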

Strategies to Prevent Academic Dishonesty


The session explored practical approaches to prevent academic dishonesty in the age of AI:

  • Implementing oral presentations or flipping the classroom so that student work is produced in class.
  • Reducing incentives to cheat (e.g., not grading based on quiz performance).
  • Avoiding concurrent deadlines to reduce student stress.

A successful approach shared during the session involved setting deadlines at times when faculty are available to grade, allowing them to respond to students' legitimate issues and create a more supportive environment.

Managing AI in Education

The discussion covered institutional policies and introduced several resources to help faculty manage AI use in their courses; see the Resources section at the end of these notes.


The session emphasized the importance of thinking critically about AI use and adapting policies to the evolving educational landscape.

AI Tools for Course Design

The session included demonstrations of Microsoft Copilot with Enterprise Data Protection and introduced two frameworks for using AI to create effective AI syllabus statements (for pre-reasoning models):

Dan Fitzpatrick’s PREPARE Prompt

  • Prompt: Start with a clear question. Set the stage for what follows.
  • Role: Give the AI a role and outline the context.
  • Explicit: Be specific in your question to avoid misunderstandings.
  • Parameters: Set clear frameworks such as tone of voice and the format of the output.
  • Ask: Ask the AI to ask you clarification questions before it continues.
  • Rate: Ask the AI to rate its own output.
  • Emotion: Add an emotional stimulus. This appears to increase output quality.

This can be utilized in Microsoft Copilot with Enterprise Data Protection to help write an effective AI syllabus statement:

Write a syllabus statement for AI usage in a DePaul course. You’re an education expert and skilled teacher. In the syllabus statement, clearly articulate how Generative AI should or should not be used in this course. Use an informative tone and keep the syllabus statement under 300 words. Ask me some clarification questions first, and then answer. Give the statement a rating based on 0-10 points, and indicate what could be improved. Breathe in, and breathe out. Try to really do your best. It’s important to me.

Dave Birss’ CREATE Prompt

  • Character: Describe the role the AI is to assume. For example, “You are an experienced writer who crafts concise text without filler words or jargon.”
  • Request: Clearly and specifically define the request. “I want you to…”
  • Examples: Give examples if you have them.
  • Additions: Refine the task. Describe a point of view to consider or a style to use.
  • Type of Output: For example, a 100-word summary or a chronologically organized bio.
  • Extras: Any further information you wish to provide, including reference text.

This can be utilized in Microsoft Copilot with Enterprise Data Protection to help write an effective AI syllabus statement:

You are an expert educator with 20 years of experience and numerous teaching awards. You can create and teach amazing classes that incentivize students to think critically.
I want you to generate a simple, straightforward syllabus statement that sets parameters for if/when generative AI usage is permitted in my class. Start by asking me what type of assignments I will use in class and my expectations of student AI usage.
Draw your language from examples like the AI Assessment Scale, DePaul University’s Academic Integrity Policy, and DePaul University’s AI Teaching Recommendations.
Use an informative tone and keep the syllabus statement under 300 words. Do not suggest a syllabus statement until I give you my expectations of student AI usage and information on the type of assignments I will use. Ask for my expectations and types of assignments then wait for my response before answering.
Write it in plain English without jargon.
Explain your thinking.

Creating Supportive Environments for AI Use

The session advocated for creating supportive environments where students feel comfortable sharing their use of AI tools. Rather than taking an accusatory approach, it was suggested that educators should focus on guiding students to use AI effectively and safely. The discussion emphasized parallels with parenting approaches, highlighting the importance of honesty and open communication about AI use.

Key Insight: AI may have the most significant impact on lower-performing students, suggesting that faculty should adapt by focusing on developing critical thinking and skepticism skills.

Looking Forward


The session concluded by acknowledging that AI detection remains challenging but that student use of AI is inevitable. The focus should be on adaptation rather than resistance, with an emphasis on:

  • Developing critical thinking skills
  • Creating assignment structures that leverage AI effectively
  • Maintaining open communication about appropriate AI use
  • Continuing to explore and share best practices

Resources