Center for Practical AI
Educator Guide · AI and Mental Health

AI & Mental Health: Educator & Facilitator Guide

For school counselors, health teachers, social workers, youth program staff, and community educators. Trauma-informed facilitation guide for the WhoIsInTheRoom comparison tool, disclosure handling, and safe messaging.

Safe messaging required — read Section 1 before facilitating.

Read Before Facilitating

Safe messaging guidelines

This content discusses suicide and the death of a minor. The American Foundation for Suicide Prevention (AFSP) and the Suicide Prevention Resource Center (SPRC) publish evidence-based safe messaging guidelines specifically for educational and media contexts. Facilitating this content without following those guidelines can increase risk. These are the core principles:

Frame the topic as a public health and policy issue

Not as a personal tragedy that could have been prevented by better individual choices. The Sewell Setzer III case is a product safety failure, a regulatory failure, and an access failure — not a story about a troubled individual.

Describe what help looks like and where to find it

In every session, name 988, Crisis Text Line, and at least one local resource. Post these before and after the session.

Acknowledge that the topic may bring up personal feelings

Open with: 'Some of you may have personal experiences with mental health, yours or someone you know. This is a space where that's okay to have. You don't have to share. If you need to step out, do.' Follow with your school's counselor protocol.

Describe crisis resources specifically — not just 'get help'

Abstract advice is hard to act on. 'If you are having thoughts of suicide, text or call 988 — it's free, it's confidential, and they will not automatically call the police or your parents' is specific enough to act on.

Do not describe the method or details of Sewell Setzer III's death

This page deliberately omits method; so should any facilitation. Method detail is associated with increased risk of contagion among vulnerable individuals.

Do not use language that implies suicide is a rational response

Avoid framing like 'it's understandable he felt he had no other choice.' The goal is to communicate that other choices exist and are accessible.

Do not create a 'public spectacle' around the case

Present it as a documented event with public interest and policy implications — not as a morality story or a shocking anecdote.

Disclosure Handling

If a student discloses during this session.

This content can activate personal experiences. Have a plan before you begin.

1. Take it seriously and take it privately

If a student discloses suicidal ideation during or after the session, do not respond in front of the group. Thank them for sharing, tell them you want to talk with them one-on-one, and pause or hand off the group session.

2. Do not leave them alone

Stay with the student or ensure another trusted adult is present while you follow your school's protocol. Do not send them back to class unaccompanied.

3. Do not promise confidentiality

If your mandatory reporting obligation may be triggered, say: 'I'm glad you told me. I may need to involve [school counselor/parent] to make sure you're safe — but I'm going to do that with you, not around you.' Follow school protocol for mandated reporting.

4. Involve the school counselor — not just HR or admin

School counselors are trained in crisis assessment. Administrators are trained in policy. These are different skill sets. If your school has a counselor, they should lead the response to a suicidal ideation disclosure.

5. Follow up the next day

Check in privately. 'I've been thinking about our conversation yesterday. How are you doing today?' This signals the disclosure was taken seriously and they are not a problem to be processed.

6. Document and debrief

Complete any required incident documentation per your school's policy. Debrief with your counselor or supervisor, especially if this is your first time responding to this kind of disclosure. You're allowed to find it hard.

Tool Facilitation

How to run WhoIsInTheRoom in a group.

The tool works best as a shared inquiry — not a lecture. The goal is for students to arrive at the limitations themselves.

Opening framing (2 minutes)

Before opening the tool: "Almost everyone in this room has a phone with an app that can have a conversation with you — Siri, ChatGPT, maybe Replika or Character.AI. Today we're going to look at what those tools actually are — not to say you shouldn't use them, but so you know what you're actually talking to."

This framing positions the exercise as empowerment rather than restriction. The goal is informed use, not prohibition.

Discussion questions by dimension

Training

"If you were sick and needed a doctor, would you be okay with the doctor having no medical school, no internship, and no license? What if they were very good at sounding like a doctor?" — Let the analogy land before connecting it to the AI column.

Crisis response

"If your friend told you they wanted to die, what would you do? What can an AI do in that moment?" — The gap is visceral when stated this directly.

Confidentiality

"Would you tell your therapist something you wouldn't tell your parents? Would you tell an AI something you wouldn't tell your therapist? Why is the order of trust like that?" — Surfaces the counterintuitive fact that AI may feel safer while being less protected.

Access

"The tool costs nothing and is available at 3am. If a therapist costs $150 and has a 6-week waitlist, is it fair to say 'just see a therapist'?" — Acknowledges the access gap before moving to what's needed.

Safety

"Who is responsible for what happens when a 14-year-old talks to an AI about wanting to die: the company, the parents, the school, the government, or the teenager?" — Opens the policy and ethics conversation.

Closing (5 minutes)

  • Ask: 'After going through this, would you use an AI for mental health support differently than you did before? What would you do the same? What would you change?'
  • Name the resources explicitly: 988 (call/text), Crisis Text Line (text HOME to 741741), and one local option if you know it.
  • End with agency: 'You now know more about what these tools can and can't do than most adults. That knowledge matters.'
  • If Facilitator Mode was on throughout the session, review the discussion prompts for any dimensions you didn't cover and save them as optional extension questions.

Audience Considerations

Adjusting for age and context.

Middle school (11–13)

  • Do not include the Sewell Setzer III case. It is a peer story and carries contagion risk for this age group.
  • Focus on: 'what is this app actually doing with what you say.' Privacy and data use are the right entry points.
  • The crisis dimension is appropriate — but use peer scenarios, not the documented case.
  • Ensure a school counselor is aware you're running this content.

High school (14–18)

  • The Sewell Setzer III case can be introduced with full safe messaging protocols. Most participants will have heard of it.
  • The access dimension is particularly resonant — many students have tried to find a therapist and failed.
  • Enable Facilitator Mode throughout — the discussion prompts are calibrated for this age.
  • Run a disclosure protocol check with your counselor before the session.

Adult / community

  • Adults may be seeking guidance for a young person in their life — frame accordingly.
  • The parenting dimension ('ask, don't surveil') often needs more time than other dimensions.
  • Policy dimensions (regulation, class action, KOSA) land with this audience and can extend the session productively.
  • Adults may share their own AI use for mental health processing — the tool holds space for this without judgment.

Want CPAI to facilitate this workshop for your school or community?

We deliver trauma-informed AI and mental health education for schools, parent groups, and healthcare organizations.