AI Hallucinations: Can You Stump an AI?

In this project, students will explore inaccuracies, or hallucinations, in AI programs. They will try to find a prompt that causes an inaccurate response and reflect on why they think that prompt led to the inaccuracy. Note: Students will need access to a large language model (LLM) AI, such as OpenAI's ChatGPT or Google's Gemini, for this project.

Easy

1 Hour

Middle School

Project Description

Task

When you communicate with a large language model AI, you may see a disclaimer at the bottom of the screen that says something like this:

ChatGPT can make mistakes. Consider checking important information.

or

Gemini may display inaccurate info, including about people, so double-check its responses.

These disclaimers are included because of AI hallucinations. A hallucination occurs when an AI responds confidently with information that is completely false or inaccurate.

In this activity, you'll learn more about AI hallucinations and try to find a prompt that causes an AI program to hallucinate.

To begin, you'll watch a video on AI hallucinations to understand what they are and why they can happen. Then, you'll have a conversation with a large language model AI like OpenAI's ChatGPT or Google's Gemini to see if you can find a prompt that causes a hallucination, and you'll reflect on your experience.
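For classrooms that want to go one step further, prompts can also be sent to an LLM programmatically instead of through the chat window. The sketch below is a minimal, optional example, assuming the openai Python package is installed and an OPENAI_API_KEY environment variable is set; the model name and the trick question are illustrative placeholders, not requirements of the project.

```python
# Minimal sketch: send a hallucination-bait prompt to an LLM.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Asking about an event that never happened is a common way to bait a
# hallucination: there was no 1897 FIFA World Cup (the first was in 1930).
prompt = "Who scored the winning goal in the 1897 FIFA World Cup final?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute any available model
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

A well-behaved model should point out that no such tournament took place; a hallucinating one may confidently invent a scorer, which is exactly the behavior this project asks students to look for.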


Project Lesson Plan