Generative AI tools are rapidly evolving. While we will periodically update the information in this guide, please be aware that the content may become outdated quickly.

Last updated: August 16, 2024

Introduction to Generative AI

What is Generative AI?

As disruptive technologies, generative artificial intelligence (GAI) tools present both new opportunities and challenges for teaching and learning. Since the release of OpenAI’s ChatGPT in November 2022, these technologies have continued to evolve, expanding the opportunities for their use.

Generative artificial intelligence is a subset of artificial intelligence that uses machine learning models to generate new, original content (such as text, images, video, or audio) based on patterns and statistically likely relationships learned from training data. This capacity is achieved through advanced algorithms and neural networks trained on vast amounts of data. In response to human-provided prompts, these tools can produce contextually relevant, coherent output.

Examples of GAI tools include Microsoft Copilot, OpenAI’s ChatGPT, Anthropic’s Claude, Perplexity AI, OpenAI’s DALL·E, Midjourney, Adobe Firefly, Runway, ElevenLabs, and Suno. These tools are capable of performing a wide range of tasks, from generating realistic images and videos to composing music and writing essays. Specialized GAI tools can create slide decks, perform literature reviews, or handle other discipline-specific tasks.

As GAI tools become ubiquitous through their integration into word processors, search engines, grammar assistants, and more, it is increasingly important to communicate clearly with students about what a generative AI tool is (and how to recognize when they are engaging with one), as well as about the appropriate use of these tools in their coursework.

Limitations of Generative AI

It is important to note that while GAI tools can deliver quick and often highly accurate output, they are not human and do not possess knowledge or comprehension of the material they generate. In particular, GAI tools:

  • Are not fully reliable: they hallucinate information and make reasoning errors
  • Exhibit a wide range of biases that reflect the human biases found in the training material and/or training process
  • Make it difficult to trace the source and provenance of information incorporated into their output

In addition, generative AI raises a wide range of concerns, from ethical questions about privacy and intellectual property to its environmental impacts.

Generative AI and Teaching

Here are a few steps you might take as you decide how or whether to incorporate GAI tools or GAI output into your courses:

Learn: Gain a baseline understanding of how these tools function and how they are commonly applied. How might you use these tools, personally or professionally?

Explore: Experiment with GAI tools relevant to your discipline. Practice using GAI to complete an assignment from your course. Identify opportunities for student learning, as well as areas of concern.

Reflect: Might the use of GAI support or undermine students in achieving any of your course learning outcomes? How important is it for your students to have experience with GAI tools or to understand GAI-related issues?

  • Consider your course learning outcomes, as well as the tasks students complete to demonstrate they have achieved those outcomes. Does GAI leave some of these tasks unchanged, render others moot, or allow you to scale up or enhance them?
  • Consider the expectations in future courses or workplaces regarding the understanding and knowledge your course helps students develop, as well as regarding their responsible and ethical use of GAI tools. Are there areas where your course learning objectives or assignments might evolve, or even lean into the use of GAI tools or output?

Set and communicate your stance: Decide whether and when students may engage with GAI tools in your courses. Craft a course policy that clearly communicates this stance (see Example Syllabus Policies related to Generative AI, below). Talk with your students about this stance early and often. If students will be allowed or encouraged to engage with GAI tools, we recommend that you:

  • Share how students should properly document use of GAI tools or cite use of GAI output.
  • Determine if all students will have fair and equal access to GAI tools.
  • Determine what support or education students might need to evaluate the accuracy and validity of GAI output.

Academic Honesty and Generative AI

At UGA, the default rule is that students may not use AI on their coursework unless the course instructor explicitly authorizes it.

In each individual course, the instructor of record can determine what constitutes acceptable use of GAI by students. Therefore, it is important to proactively engage students in an open discussion about your expectations regarding GAI. For example, is the use of GAI off-limits, sometimes OK, or always encouraged?

Example Syllabus Policies related to Generative AI

The following are general, course-level syllabus policy examples representing varying stances on the use of GAI in a given course (highly permissive, moderately permissive, or maximally restrictive). In addition to these examples, you can explore a growing repository of classroom policies related to generative AI available here.

Turnitin AI Writing Detector

Turnitin’s AI Writing Detector analyzes student writing and flags segments of text it believes were either copied directly from generative AI output or created using an AI paraphrasing or word spinner tool. Turnitin’s AI Writing Detector is the only AI writing detector approved for use at UGA. UGA instructors should not use AI detectors that are not supported by UGA, as those tools have not been vetted by UGA’s information security team, checked for FERPA compliance, or evaluated for the protection of students’ intellectual property.

Turnitin’s AI Writing Detector can be found within the Similarity Report of a Turnitin assignment on eLC. For the Turnitin AI Writing Detector to function, work submitted to the Turnitin assignment must meet the following requirements (as of July 2024):

  • File size must be less than 100 MB
  • File must have at least 300 words of prose text in a long-form writing format
  • File must not exceed 30,000 words
  • File must be written in English
  • Accepted file types: .docx, .pdf, .txt, .rtf

A note of caution about Turnitin’s AI Writing Detector

Although Turnitin’s AI Writing Detector is the only AI writing detector approved for use at UGA, instructors should use it with caution.

False positives: Turnitin reports a sentence-level false positive rate of 4%, but studies show that this rate is higher for students who speak English as a second language, as well as for text that was modified by a grammar tool. Presumably, Turnitin’s AI Writing Detector would also flag students who happen to word their writing in patterns that Turnitin’s probability models associate with AI-generated text.

Lack of source text: Turnitin’s similarity checker provides contextual information for any flagged content, allowing instructors to access and review the suspected match and confirm whether the flagged content truly matches it. This is not the case for Turnitin’s AI Writing Detector; no AI detection tool can provide such context or a suspected source.

AI advances outpace detectors: Turnitin’s AI Writing Detector was trained on output from GPT-3.5. Those with the resources to access more sophisticated models than GPT-3.5 are better able to avoid detection.

Looking for more information on the AI Writing Detector? You can visit this Turnitin Guide.

Questions about Generative AI & Teaching?

Contact our teaching & learning experts for a one-on-one consultation today!