Technology
March 30, 2023

Putting Humans First With Responsible AI Policy Making

GoGuardian Team

As algorithms, AI, machine learning, and other advanced technologies move to the forefront of policy, how can we avoid the unintended consequences of well-intentioned policy? What can technologists, advocates, and policymakers learn from the past decade of student data privacy legislation to inform what comes next?

During a 2023 SXSW session titled “Lessons from Past Policy for Future Technology,” GoGuardian’s Senior Director of Privacy and Data Policy Teddy Hartman led a talk on the potential and pitfalls of AI.

Hartman was joined by panelists Kristina Ishmael, Deputy Director of the Office of Educational Technology at the U.S. Department of Education; Paige Kowalski, Executive Vice President of the Data Quality Campaign; and Jeremy Roschelle, Executive Director of Learning Science Research at Digital Promise. Together, the group discussed how creating industry standards could greatly improve trust in AI technology.

Here are some key takeaways on how policy will shape edtech today and into the future.

  • Edtech at an inflection point — Just as the rise of the app ecosystem exposed gaps between edtech policy and adoption and prompted a wave of student privacy policymaking, the introduction of AI into edtech is doing the same. Conversations between policymakers and AI developers will be vital to building a safe, responsible ecosystem that honors student privacy.
  • Encouraging responsible use of edtech — AI in edtech has the potential to drive positive change in our education system and is already embedded in many tools students use, from autocomplete in text messages to voice recognition in Siri and Alexa. At the same time, it raises important questions about responsible deployment. The sheer number of edtech tools schools use can be overwhelming; conducting a needs assessment can help school leaders identify which tools truly support student learning and success. Better training on current tools could also help educators use them to their full potential while protecting student privacy and safety.
  • Robust standards lead to a safer edtech environment — Creating a set of industry standards would build a better understanding of, and trust in, the new technology emerging in the edtech space. Without enough trust, we risk a tech backlash instead of a renaissance. With edtech and student data privacy regulation spanning multiple agencies and congressional committees, there is no single source of information and guidance. Human input throughout the development process is critical to building trust in responsible AI deployment.

Fostering reasonable dialogue about the benefits and risks of AI is crucial, and all stakeholders, including educators, edtech companies, and policymakers, have a role to play in building trust in AI and maximizing its potential. 

As a company, GoGuardian recognizes the importance of trust and takes a thoughtful approach to student privacy and data protection. We envision a future where all learners are ready and inspired to solve the world's greatest challenges. That vision drives our decisions, from product design to our proactive approach to privacy. You can find more details in the Privacy & Trust Center.

Interested in hearing the full conversation? Listen to the audio recording of the session.
