Teaching & Generative AI
What is generative artificial intelligence (GenAI)?
GenAI models are technologies trained on specific datasets that use algorithms to “generate” outputs. These outputs vary and include text, images, video, and audio. GenAI models are evolving rapidly and influencing teaching and learning environments. Numerous models exist (e.g., ChatGPT, DALL-E), and their increasing sophistication and accessibility make them easier to use. As a result, reimagining course designs and teaching approaches to support student learning is a critical emphasis in higher education today.
What teaching and learning opportunities does it afford?
GenAI technologies can support teaching and learning in various ways, including, but not limited to:
- Brainstorming for ideation or topic selection
- Improving writing and copyediting
- Developing examples and case studies
- Creating assessment questions
- Obtaining ideas for course design and lesson planning
- Organizing text
- Summarizing text
- Creating tailored graphics, media, and audio files
- Role-playing
- Analysis and research (e.g., using models trained on reputable sources)
What are its limitations?
In a study of student perspectives, respondents identified several challenges related to the use of GenAI; the most frequently noted are listed below (study cited in Addy et al., 2023):
- Hindering learning: shortcutting critical learning processes
- Academic integrity concerns: impermissible usage or cheating (see Rutgers Academic Integrity Policy)
- Inaccurate information: outputs containing incorrect information, highlighting the importance of fact-checking
What are equity concerns surrounding GenAI?
- Access: Students with greater financial means may be able to purchase more advanced and robust tools, which could create educational disparities for low-income students
- Bias: GenAI technologies are trained on specific datasets that may contain biased information, which can harm minoritized groups
- Algorithmic epistemic injustice: Kay, Kasirzadeh, & Mohamed (2024) use this term to describe representational harm and the creation of epistemic inequities in multilingual scenarios involving GenAI
What are other ethical concerns?
Various ethical concerns have been raised around GenAI tools, with several noted below.
- Intellectual property/copyright: Issues have been raised regarding copyright as it pertains to both the datasets on which the technologies are trained and the outputs they produce
- Energy consumption: Training and using these models requires substantial energy, raising climate and sustainability concerns
- Privacy: Protections for personal information entered into prompts and tools may vary
- GenAI company regulation: Regulatory processes may lack transparency and clarity
- Labor exploitation: In some instances, the work involved in training and maintaining models has been outsourced to low-paid workers
How do I address GenAI usage in my courses?
- What are the learning goals of your course(s)? What types of assignments do you include? Is it possible for students to use GenAI to complete these assignments? If so, would such usage impact (positively or negatively) their ability to achieve the course learning goals? Use this information to guide any redesigns.
- While it is not typically feasible, or necessarily advisable, to keep track of every GenAI update, periodically engaging in professional learning can support effective instruction and student learning no matter one’s stance on generative AI. Participating in the Teaching & Generative AI Pathways is one way to support this journey. Instructors can also follow major voices on generative artificial intelligence on social media or through their newsletters.
- Have clear and transparent policies around generative artificial intelligence, and discuss them with each class, along with the rationales behind them. An alternative approach is to co-create policies with students, using the opportunity as a two-way partnership. Such policies may also need to be adapted as technologies evolve.
There are a variety of frameworks that can be useful to follow regarding generative artificial intelligence. One example is the AI Assessment Scale (AIAS), developed and updated by Furze, Perkins, Roe, and MacVaugh (2024) (see below). The scale encourages instructors to indicate whether and how students are permitted to use GenAI on an assessment. When adopting a policy, it is equally important to have a plan for addressing any violations of that policy.
- “No AI – The assessment is completed entirely without AI assistance in a controlled environment, ensuring that students rely solely on their existing knowledge, understanding, and skills. You must not use AI at any point during the assessment. You must demonstrate your core skills and knowledge.
- AI Planning – AI may be used for pre-task activities such as brainstorming, outlining and initial research. This level focuses on the effective use of AI for planning, synthesis, and ideation, but assessments should emphasise the ability to develop and refine these ideas independently. You may use AI for planning, idea development, and research. Your final submission should show how you have developed and refined these ideas.
- AI Collaboration – AI may be used to help complete the task, including idea generation, drafting, feedback, and refinement. Students should critically evaluate and modify the AI suggested outputs, demonstrating their understanding. You may use AI to assist with specific tasks such as drafting text, refining and evaluating your work. You must critically evaluate and modify any AI-generated content you use.
- Full AI – AI may be used to complete any elements of the task, with students directing AI to achieve the assessment goals. Assessments at this level may also require engagement with AI to achieve goals and solve problems. You may use AI extensively throughout your work either as you wish, or as specifically directed in your assessment. Focus on directing AI to achieve your goals while demonstrating your critical thinking.
- AI Exploration – AI is used creatively to enhance problem-solving, generate novel insights, or develop innovative solutions to solve problems. Students and educators co-design assessments to explore unique AI applications within the field of study. You should use AI creatively to solve the task, potentially co-designing new approaches with your instructor.”
- Education has often focused on the product created by students, e.g., the essay, the project paper, the test score. While having a quality final product is important, the process that students go through to attain that product is even more important for learning. The rapid evolution of GenAI has revealed, in part, that products can be obtained quickly without knowing all the intricacies of the process. Learning is a process that involves productive struggle. To follow students’ learning processes, scaffold assignments. For example, break a class project into three or more components or steps, reviewing each step and providing feedback along the way.
- Fostering students’ literacy on GenAI can support their ongoing growth and development. This student infographic provides a starting point for conversations about generative artificial intelligence, and taking time to have these conversations is recommended.
What about GenAI detectors?
Academic integrity violations are a common concern regarding generative AI, and a first instinct might be to focus on detection. However, various studies have uncovered limitations of GenAI detectors. Perkins et al. (2024) studied six GenAI detectors and found that when a few simple techniques were used to modify the text, detector accuracy decreased to 17%. The researchers indicated that, “The varying performances of GenAI tools and detectors indicate they cannot currently be recommended for determining academic integrity violations due to accuracy limitations and the potential for false accusation which undermines inclusive and fair assessment practices.” Weber-Wulff et al. (2023) studied 21 detectors and concluded that they were neither valid nor reliable at detecting GenAI-generated text. Liang et al. (2023) also found that GenAI detectors can falsely flag students whose first language is not English. Given these considerations, and concerns about a focus on punitive measures, GenAI detectors should be used with caution.
Rethinking assessment
If detection has such challenges, instructors might wonder what other choices they can make. Some might use approaches such as those described previously to take advantage of the opportunities GenAI can afford their learning environments. Still, depending on teaching contexts and other concerns about GenAI, some instructors might want to design robust assignments that support learning without GenAI. Below are sample approaches in this second category.
- Non-digital projects: creative or other hands-on assignments that naturally limit the use of GenAI
  - Arts – painting, drawing, acting, designing a set, etc.
  - Humanities – building a physical model of a historical site
  - Social Sciences – creating a concept map of course topics on poster paper
  - STEM – building a physical model
  - Business – creating a physical prototype of a product
- Experiential learning: learning beyond the classroom
  - Arts – attending a live performance and writing a reflection about it that applies course concepts
  - Humanities – gathering an oral history and creating video vignettes
  - Social Sciences – interviewing people as part of a class project
  - STEM – performing science demonstrations at a local elementary school
- Class-specific content
  - Any discipline – writing a response, reflection, or synthesis of specific topics from class discussion
- Personal application
  - Any discipline – applying course material to a specific experience
- Social annotation: using applications such as Perusall to support more robust online engagement. Examples of course materials that could be annotated based on specific prompts include:
  - Arts – a play or musical score
  - Humanities – a novel or poem
  - Social Sciences – a research study
  - STEM – a technical report
  - Business – a business plan
- Scaffolding: breaking larger assignments into parts, with milestones and check-ins to observe students’ processes and give feedback. Consider integrating a journal where students reflect on their learning along the way. Sample projects include:
  - Humanities – an essay
  - STEM – a laboratory report
  - Business – a business plan
Sample assignments
For those interested in assignments that address generative AI, consider the following repository:
TRAIL Teaching Repository for AI-Infused Learning
Resources
Bowen, J. A., & Watson, C. E. (2024). Teaching with AI: A practical guide to a new era of human learning. JHU Press.
Long, L. Cyborgs and Centaurs: Academic Writing in the Age of Generative AI. Open Press Books. https://cwi.pressbooks.pub/longenglish102/
Yee, K., Whittington, K., Doggette, E., & Uttich, L. (2023). ChatGPT Assignments to Use in Your Classroom Today. UCF Created OER Works, 8. https://stars.library.ucf.edu/oer/8
References
Addy, T., Kang, T., Laquintano, T., & Dietrich, V. (2023). Who Benefits and Who is Excluded?: Transformative Learning, Equity, and Generative Artificial Intelligence. Journal of Transformative Learning, 10(2), 92-103.
Furze, L., Perkins, M., Roe, J., & MacVaugh, J. (2024). The AI Assessment Scale (AIAS) in action: A pilot implementation of GenAI supported assessment. arXiv preprint arXiv:2403.14692.
Kay, J., Kasirzadeh, A., & Mohamed, S. (2024, October). Epistemic injustice in generative AI. In Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society (Vol. 7, pp. 684-697).
Liang, W., Yuksekgonul, M., Mao, Y., Wu, E., & Zou, J. (2023). GPT detectors are biased against non-native English writers. Patterns, 4(7).
Perkins, M., Roe, J., Vu, B. H., Postma, D., Hickerson, D., McGaughran, J., & Khuat, H. Q. (2024). Simple techniques to bypass GenAI text detectors: implications for inclusive education. International Journal of Educational Technology in Higher Education, 21(1), 53.
Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., Foltýnek, T., Guerrero-Dib, J., Popoola, O., … & Waddington, L. (2023). Testing of detection tools for AI-generated text. International Journal for Educational Integrity, 19(1), 26.