AI in Schools: Educator Guide

For teachers, administrators, district technology coordinators, and school board members. How to vet EdTech tools, what FERPA actually requires, model AI policy language, and IDEA compliance guidance for AI-assisted IEPs.

EdTech Procurement

How to vet an EdTech tool before it reaches students.

The Future of Privacy Forum and Common Sense Privacy provide the two most widely used rubrics. Here is what to look for.

Data practices

  • Does the vendor sell or share student data with third parties for advertising purposes?
  • What data is collected beyond what is strictly necessary for the educational function?
  • What is the data retention policy after a school contract ends?
  • Does the vendor collect behavioral analytics, advertising IDs, or device fingerprints?

Security

  • Has the vendor experienced a data breach in the past three years? If so, how quickly did they notify affected schools?
  • Does the vendor undergo independent security audits? Are results available to school districts?
  • How are data minimization and least-privilege access controls implemented?

Equity and bias

  • For AI detection or proctoring tools: does the vendor publish demographically disaggregated false-positive rates?
  • Has the tool been independently audited for bias by race, language background, or disability status?
  • What accommodation process exists for students whose disabilities are incompatible with the tool's behavioral expectations?

Legal compliance

  • Does the vendor contract meet all four conditions of FERPA's school official exception?
  • For tools used with children under 13: does the tool comply with COPPA and the 2025 FTC rule update?
  • Does the vendor agree not to use student data to train AI models?

Use the SDPC National Data Privacy Agreement: the Student Data Privacy Consortium maintains pre-negotiated vendor agreements that satisfy FERPA requirements. Using these templates shifts the legal negotiation burden from individual districts (which often lack legal capacity) to a vetted standard. Most major EdTech vendors have signed the agreement or agreed to participate.
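
For district technology coordinators who want to track vetting in a structured form, here is a minimal sketch of the rubric above as data. The question strings paraphrase the lists in this section; the class and function names are illustrative assumptions, not the official FPF or Common Sense Privacy rubrics.

from dataclasses import dataclass

@dataclass
class RubricItem:
    category: str                 # e.g., "Data practices", "Security", "Equity", "Legal"
    question: str                 # vetting question, paraphrased from the lists above
    answer: bool | None = None    # None means not yet reviewed
    notes: str = ""

def vetting_checklist() -> list[RubricItem]:
    """Return the rubric as items a coordinator can fill in during review."""
    return [
        RubricItem("Data practices", "No sale or sharing of student data for advertising"),
        RubricItem("Data practices", "Collection limited to the educational function"),
        RubricItem("Security", "Independent security audits, results available to districts"),
        RubricItem("Equity", "Demographically disaggregated false-positive rates published"),
        RubricItem("Legal", "Contract meets all four FERPA school-official conditions"),
        RubricItem("Legal", "Vendor agrees not to train AI models on student data"),
    ]

def open_items(items: list[RubricItem]) -> list[RubricItem]:
    """Items still unanswered or answered 'no'; any open item blocks adoption."""
    return [item for item in items if item.answer is not True]

checklist = vetting_checklist()
checklist[0].answer = True    # example: the advertising question has been cleared
for item in open_items(checklist):
    print(f"[{item.category}] OPEN: {item.question}")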

FERPA

What the school official exception actually requires.

Most EdTech contracts do not meet all four FERPA conditions — but enforcement is rare. Here is the standard.

FERPA allows schools to share student educational records with third-party vendors without parental consent under the "school official exception" — but only if four conditions are met:

1. The vendor performs an institutional service or function. The service must be one the school would otherwise perform itself; marketing analytics and behavioral profiling do not qualify.

2. The school determines the purposes and means of the data use. The vendor cannot independently decide how student data is used; schools must maintain meaningful control.

3. The use is subject to FERPA requirements. The vendor must not re-disclose student records without authorization and must allow school access for inspection.

4. The school has direct control over the data. The school can require the vendor to return or destroy the data, and the vendor may not use it for its own purposes.

Source: 34 C.F.R. § 99.31(a)(1); Family Policy Compliance Office (FPCO) guidance. For vendor contract review against these four conditions, use the SDPC agreement template or consult your district's legal counsel.

Model Policy

Five principles for an evidence-based AI policy.

Drawn from CDT's 2025 legislative analysis of 134 bills in 31 states. These are the principles that appear in the strongest legislation.

1. Evidence before expansion. No AI tool may be deployed district-wide without a documented pilot phase producing outcome data disaggregated by student demographic group. Colorado's pilot-evidence model is the legislative standard.

2. Transparency to families. Schools must publish an annual list of all EdTech vendors with whom student data is shared, including the stated purpose of each disclosure, and must update the list within 30 days of any new vendor contract. (A machine-readable sketch of such a record appears at the end of this section.)

3. Presumption against surveillance. AI student monitoring tools must demonstrate evidence of efficacy and equity before district adoption; the burden of proof is on the vendor, not the district. RAND's 'scant evidence' finding is the current standard.

4. AI detection standards. No student may be disciplined for academic dishonesty based solely on an AI writing detection flag. Any flag must be reviewed by a human who examines the student's full academic history and other writing samples.

5. IEP individualization requirement. AI tools may assist in IEP drafting but may not substitute for the individualized assessment process required by IDEA. Any AI-assisted IEP must be reviewed by all team members, with documented evidence that each goal reflects this specific student's assessed needs.

For the complete CDT model policy language and legislative tracker (134 bills, 31 states, updated 2025), see the CDT "Off Task" report linked in the resources section below.
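
To make principle 2 concrete: below is a minimal sketch of a machine-readable disclosure record that could back the published annual vendor list. It is an illustration only; every field name is an assumption to adapt to your district's reporting format, not part of the CDT model language.

import json
from dataclasses import dataclass, asdict

@dataclass
class VendorDisclosure:
    # All field names are illustrative assumptions, not a mandated schema.
    vendor: str
    product: str
    purpose: str               # the stated purpose of the disclosure
    data_shared: list[str]     # categories of student data shared
    contract_signed: str       # ISO date; principle 2 requires updates within 30 days
    dpa_on_file: bool          # e.g., the SDPC National Data Privacy Agreement

def publish(disclosures: list[VendorDisclosure]) -> str:
    """Serialize the annual vendor list for posting on the district website."""
    return json.dumps([asdict(d) for d in disclosures], indent=2)

# Hypothetical example entry.
example = VendorDisclosure(
    vendor="Example EdTech Co.",
    product="Reading Tutor",
    purpose="Adaptive reading instruction, grades 3-5",
    data_shared=["name", "grade level", "reading assessment scores"],
    contract_signed="2025-08-15",
    dpa_on_file=True,
)
print(publish([example]))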

Student Wellbeing

How to talk to students about AI surveillance.

Social-emotional learning framing reduces reactance and builds genuine digital agency.

Students who learn they are being monitored without explanation tend to respond with reactance — finding workarounds, using personal devices instead of school accounts, and losing trust in school systems. The research on adolescent development suggests that surveillance paired with explanation and genuine agency produces better outcomes than surveillance alone.

What to explain, not just enforce

  • Which tools are used and what they monitor
  • What happens when a flag is generated — who reviews it, what the process is
  • What rights students have under FERPA
  • How to raise a concern if they believe a flag was a false positive

SEL framing that works

  • Frame data privacy as a skill, not a restriction: 'Here's how to protect yourself online.'
  • Use the EdTech Privacy Audit as a classroom activity; students reading their own school's privacy policies is inherently engaging
  • Discuss the false positive problem with older students: why AI systems fail, and what that means for them
  • Separate 'the school is watching' from 'you are not trusted' — explain the institutional pressures schools face

Special Education

IDEA compliance checklist for AI-assisted IEPs.

57% of special education teachers used AI for IEP writing in 2024-25. Here is how to use AI tools without violating IDEA's individualization requirement.

  • AI may assist with formatting, language clarity, and goal boilerplate, but the substantive content of every goal must be derived from this specific student's assessments, not from a generic template.
  • Every IEP goal must be traceable to documented assessment data for this student. If a team member cannot explain why a goal appears without reference to the AI tool, the goal does not meet IDEA's individualization requirement.
  • Parents must be meaningfully involved in the IEP development process, not simply presented with an AI-generated document for signature.
  • Do not use AI to generate the IEP document and then present it as a draft for 'light editing.' This approach inverts the legal standard: the team's judgment, not the AI output, must be the starting point.
  • Document in the IEP meeting notes any AI tools used in preparation, and confirm in the record that the team reviewed and individualized all content based on direct knowledge of the student.

For full analysis: CDT, "From Personalized to Programmed: The Use of Generative AI to Develop Individualized Education Programs for Students with Disabilities" (2025). Contact your state's Parent Training and Information Center if you have questions about a specific IEP dispute.

Want CPAI resources in your school or district?

We partner with districts, libraries, and nonprofits to deliver research-based AI education where it is needed most.