Technology
October 24, 2023

AI’s Potential to Personalize K-12 Learning Starts with Trust

Teddy Hartman

A lot of columns about AI innovation lead with breathless speculation about the future. But to understand how AI is changing schools in the United States, you need to start more than a century in the past. 

It’s been 107 years since John Dewey’s Democracy and Education elevated the idea of “personalized learning” to a rallying cry for education reformers. For today’s K-12 teachers, the ideal of personalizing instruction to the unique needs, challenges, and abilities of each child is nothing new.

The challenge, though, has always been scalability. If you’re teaching a couple dozen kids at once – or the 120 high school students I used to see each day when I taught English Literature in Los Angeles – then there are practical limits to how closely you can tailor lesson plans to each student.

That’s where AI can truly be a game-changer – and where the conventional wisdom about how it’s going to impact schools is missing the big picture. The big AI narrative in K-12 education won’t be robots replacing teachers or chatbots making essay-writing obsolete, but rather new tools that equip teachers to adjust lesson plans on the fly, personalize practice to meet the needs of each learner, and customize learning pathways to maximize student growth. Far from being threatened by AI, teachers will be more impactful and important than ever as they leverage new technologies to accelerate every child’s learning journey.

This vision can’t get out of the starting gate, however, without a foundation of trust. Teachers need to be able to trust that these new tools will empower, rather than undermine, their place in the classroom. Parents need to be able to trust that these tools will preserve a safe learning environment for their kids. Policymakers need to be able to trust that the companies building these new tools are taking seriously their responsibility to understand and mitigate risks.

In late October 2023, more than a dozen EdTech industry leaders laid a cornerstone for this foundation of trust: an industry-wide commitment to key principles for the responsible use of AI in education. Developed by a task force of the Software & Information Industry Association (SIIA) and unveiled Tuesday to an audience of Congressional leaders and U.S. Department of Education officials, SIIA’s principles represent the industry’s ambitious effort to define clear guardrails and hold itself accountable.

It starts with a commitment to put students, educators, and caregivers at the center of the equation. EdTech AI tools need to be purpose-driven to benefit the community they serve.

That includes a commitment to ensuring AI-powered tools in the classroom advance educational equity and fully align with all existing civil rights and privacy laws. Large language models can reflect the biases present in the huge data sets of content on which they’re trained – so careful, intentional product design and testing are essential to avoid perpetuating these biases. Similarly, a privacy-by-design approach to product development will help keep students’ private data secure.

AI tools must also be transparent and explainable. Providers need to help students, parents, and teachers understand clearly what their tools do, how they work, and what they’re intended to accomplish. There needs to be an open line of communication to address concerns and key questions.

And critically, AI developers need to hold themselves accountable to these commitments. The companies building these tools need to adopt internal processes and frameworks – such as the National Institute of Standards and Technology (NIST) AI Risk Management Framework – to guide compliance and ensure ethical development and deployment.

With AI technology still so new in the public’s consciousness – ChatGPT hadn’t even launched at this time last year – it’s understandable that parents and teachers have serious questions about how these technologies could impact young learners. The EdTech industry’s commitment to tackle these questions openly, ethically, and accountably is an important step toward a bright future of AI-powered, teacher-led, and student-focused personalized learning.
