Rigorous Evaluation of AI Tools

Implementing AI in Schools


Before introducing an AI tool, schools should carefully review its terms and conditions. Schools should fully understand who has legal authority and control over the data entered into the system, what uses are permitted, what obligations the vendor assumes, and how data is shared with third parties. Administrators must understand what liability they may or may not be taking on, and what happens in the event of misuse, inaccuracies, or a data breach.

All student data is sensitive, and AI systems must comply with all applicable federal and state laws protecting it. Schools must comply with applicable student privacy laws, such as the Family Educational Rights and Privacy Act (FERPA) and, where applicable, the Children’s Online Privacy Protection Act (COPPA). As a rule of thumb, AI tools should be allowed to collect, process, and retain only the minimum amount of data necessary to deliver the service they provide.

Schools using AI tools must ensure they meet accessibility standards. Schools should evaluate whether all learners, including students with disabilities, can access the tool, and what safeguards are necessary to ensure every student has a safe, productive experience.

Schools should always require human review and keep a “human in the loop.” When using AI, a human should review everything an AI tool generates, especially in high-impact uses such as grading, placement, and discipline-related decisions. Human judgment should always be the final step in the decision-making process.

When evaluating effectiveness, schools should:

  • Examine whether use of the AI tool is associated with improvements in student learning, including positive changes on relevant assessments and the development of targeted skills, such as writing and math.
  • Assess whether use of the AI tool influences student engagement and participation in the classroom.
  • Evaluate whether the AI tool meaningfully supports instruction, increases personalized learning, and improves classroom practices.

Other Considerations

Schools should establish or update policies to define appropriate use of AI. Transparent documentation and communication practices are essential to ensure that students, families, and educators understand when AI is being used, for what purposes, and with what limitations.

Schools should ensure their use of AI tools is grounded in a commitment to access and opportunity. AI adoption in the classroom must not deepen digital divides, marginalize under-resourced communities, or create barriers for students with disabilities or multilingual learners.

Schools should ensure educators have opportunities to participate in professional learning that establishes a baseline understanding of how AI systems work, including their limitations and risks. These learning opportunities should support educators in evaluating AI tools, integrating them appropriately, and modeling responsible use.

Questions schools should ask when considering using AI tools:

  • What problem are we solving with this tool?
  • What does success look like once it’s adopted?
  • How long do we want to use this tool before we assess success or failure?
  • Does this tool help all students, including multilingual students, students with intellectual disabilities, or otherwise vulnerable students?
  • What data does the tool require to be used successfully? Is that data available and safe to use?
  • How can we ensure that the private data of all stakeholders, including students, parents, educators, and support staff, isn’t used to train the AI model or for other purposes?