
A Decision Tree to Guide Student AI Use

This model guides students to ask vital questions about their AI use and to reflect on how it benefits their learning.

July 10, 2024


All of the excitement and trepidation about generative artificial intelligence (AI) has led to thorny questions for teachers, perhaps none more pressing than whether, when, and how to allow student use of AI tools. While these tools can undermine the learning process, they can also be powerful sources of enrichment, clarity, and perspective—when used responsibly. 

To aid teachers and students in responsible AI use, we have collaborated on a decision tree that guides student decision-making. This framework can help students make better choices about using AI tools and give teachers more confidence in supporting that use. Critical engagement with AI tools, including questioning their outputs, understanding their limitations, and considering ethical implications, is a vital component of digital literacy and citizenship.

Walking Through the Decision Tree

Permission: Am I permitted to use AI tools for this task? AI-generated content is not reliably detectable, which makes outright bans impractical and shortsighted. Instead, teachers should collaborate with students to create trust-based norms and transparent processes for AI use. A straightforward “red light, yellow light, green light” system is a good starting place. Teachers need to feel empowered and knowledgeable enough about AI tools to decide whether and when students can use them, and students need to understand the rationale for those decisions as part of their own emerging AI literacy. This first question, then, reminds students of the permission structure and expectations within a given class.

Enhancement: Am I using AI tools to enhance my learning? Like most academic technology, AI tools can be used to enhance learning or to supplant it, and students need to be prompted to consider not just whether they’re allowed to use these tools but how to use them responsibly—not as a replacement but as an augmentation. This question invites students to reflect metacognitively on the goals of the task and the suitability of the tool. Students are more likely to be successful in this process when their teachers give guidance tailored to an activity and clearly communicate its learning objectives. 

Indeed, this is one way in which AI might inspire teachers toward best practices, as clearly articulated learning objectives are an essential part of effective teaching. When students understand assignment goals, they are also more likely to understand the teacher’s rationale for allowing (or not allowing) AI tools. Similarly, they will be better positioned to evaluate the effectiveness of an AI tool in helping them build their understanding and develop their learning.

Iterative use: Am I using PROMPT and EDIT to generate and analyze my AI outputs? AI tools work best when used iteratively; that is, they are best leveraged through repeated engagement and tuning. Our model offers PROMPT (Purpose, Role, Organize, Model, Parameters, Tweak) and EDIT (Evaluate, Determine, Identify, Transform) as helpful acronyms for designing and refining prompts, again with the goal of introducing some critical metacognitive friction into the students’ use of the tools.

PROMPT encourages students to design scaffolded prompts with the goal of eliciting the most helpful responses, while EDIT reminds them that using an AI tool is an active process of determining the appropriateness and effectiveness of AI outputs by iteratively analyzing, tweaking, and re-prompting. This approach is equally beneficial to teachers, whether they are creating routine study materials or designing a custom chatbot for classroom use.

PROMPT also pairs nicely with the “AI roles” framework developed by Ethan and Lilach Mollick, which encourages users to give the AI a clearly defined role (e.g., tutor, coach, mentor) in order to get desired results and leverage it appropriately. That role is yet another opening for students to intentionally use the tool within a teacher’s parameters. 

Transparency: Am I prepared to show how and explain why I used AI tools to support my work? This question deepens the metacognitive work and is a crucial step in promoting thoughtful collaboration rather than outsourcing. It requires students to reflect on how they integrated the AI tool’s contributions, placing greater emphasis on process than on product.

Teachers can make informed choices about how best to have students share their process. We have had students share conversation links and annotate the sections of their writing where an AI tool served as a coach. MLA, APA, and Chicago style all offer guidelines for citing AI use as well.

Reflection and metacognition: Am I actively reflecting on my use of AI tools? In this essential final step, students are once again invited to consider the assignment objectives, their own growth and learning, and their own responsible use of these tools. The goal, as always, is to prepare students to be not just tech-savvy but tech-wise, ready to approach the complexities of a digital future with confidence and ethical clarity. 

Educators play a critical role in this process, and through scaffolded reflection, students can develop nuanced understandings of appropriate and inappropriate uses of AI. Facilitating discussions on AI use, experimenting in class, and soliciting feedback can bring students into the fun, messy project of figuring out how best to teach and learn in this new era—empowering them in the process. 

Banning Won’t Work—and Won’t Help Students Learn

For a host of reasons, because AI-generated content is not always detectable, because AI tools are here to stay, and because they can be such robust aids for creating and learning, teachers and schools should resist the temptation to ban these tools from the classroom. Instead, AI tools, like all technology, need to be woven mindfully and carefully into learning through a deliberate process of iterative ideation and metacognitive reflection, thereby helping to preserve the human connections that form the core of teaching and learning.

Through structured frameworks like our AI decision tree; a red, yellow, or green light system; and ongoing dialogue about AI’s role and impact, teachers can help students not only leverage AI to enhance their learning but also fully appreciate the opportunities, responsibilities, and downsides that this technology presents. By fostering responsible AI literacy in our classrooms, we can empower students to shape a future that is more human-centric, reflective, and thoughtful.
