
IFS Submits Public Comment on A.I. in the Classroom

Highlights

  1. We encourage the Department to recognize that the blanket issuance of A.I. policy across all K-12 institutions and higher education hinders proper public participation.
  2. The Department of Education must heed the signs of the times that artificial intelligence is not yet safe enough to integrate into schools.
  3. The integration of A.I. into schools has the potential to subvert the rights of parents and states, place students in harm’s way, undermine learning outcomes, and augment the public’s distrust.

As the debate over the proper role of Artificial Intelligence in schools continues to escalate nationwide, the Institute for Family Studies (IFS) is working to ensure that the potential effects of A.I. on children and families are considered, and that the rights of parents are protected. To that end, IFS submitted a public comment on August 20 in response to the U.S. Department of Education’s Proposed Rule on “Advancing Artificial Intelligence in Education.”

The proposed rule follows President Trump’s executive order on “Advancing Artificial Intelligence Education for American Youth.” In the order, President Trump outlined five priorities in pursuit of “our Nation’s leadership in the AI-driven future.” These priorities, echoed in the Department of Education’s proposal, include “promoting the appropriate integration of AI into education, providing comprehensive AI training for educators, and fostering early exposure to AI concepts and technology to develop an AI-ready workforce and the next generation of American AI innovators.”

We agree with Secretary of Education Linda McMahon that with the rapid advancement of this technology, “it is increasingly important for students to develop AI literacy.” Our rising generations of teachers and students ought to be equipped with the skills necessary to master this new technology. However, we must tread carefully and deliberately, understanding that without due prudence, the integration of A.I. into schools has the potential to subvert the rights of parents and states, place students in harm’s way, undermine learning outcomes, and augment the public’s distrust of A.I. technologies. 

While educating the rising generation about A.I. is necessary, the Department’s current proposal would completely alter our model of education by weaving A.I. technology into the framework of the classroom. In our comment, we take issue with a hasty, top-down imposition of A.I. on the education sector. We argue that it represents a break from the prior Trump Administration’s approach and compromises its commitment to protect state and parental rights by foisting untested, untrusted, and unsafe A.I. on America's youth. 

As we note in our comment, public trust in artificial intelligence, which is currently very low, can only be won if the Department offers precise, evidence-based guidance for the “appropriate integration” of artificial intelligence into education. While evidentiary support alone is not sufficient and must be accompanied by wisdom and experience, we certainly acknowledge that it is a necessary condition before the incorporation of A.I. technologies into schools.

Additionally, we encourage the Department to recognize that the blanket issuance of A.I. policy across all K-12 institutions and higher education hinders proper public participation. It also subverts the administration’s recent proposed priority to empower States, Tribes, and local communities “to make decisions in the best interest of their students and their workforce.”
The voices of American parents cannot be neglected in this process, as the Department notes in its initial proposed priorities. Current violations of student privacy caused by the previous generation of educational technology portend even greater violations in the age of A.I. No wonder 91% of parents do not want A.I. present in the classroom, according to one survey.

Finally, our comment addresses the Department’s dangerous presupposition: that A.I. technologies will improve learning outcomes. On the contrary, existing research shows that more frequent use of technology in school is related to worse learning outcomes and that “initiatives that expand access to computers... do not improve K-12 grades and test-scores.” A.I. technologies, which would be even more invasive than the technology currently offered in schools, will almost certainly amplify and accelerate these negative effects. The MIT Media Lab found that individuals who used ChatGPT to write essays over a four-month period “consistently underperformed at neural, linguistic, and behavioral levels” compared with those who did not. Such findings should dissuade the Department from moving forward with A.I. integration in schools until further research can be conducted.

Care and vigilance in this matter, for the sake of protecting the next generation, must take precedence over our nation’s dreams of leading in artificial intelligence. Indeed, rushing the advancement of A.I. in K-12 education at the cost of the dignity of human relationships will result in a castle built on sand—a generation weakened by its predecessors’ over-eagerness to be first, without their eyes turned prudently toward the long term. Even more perilous dangers loom large, such as the ability of A.I. to generate child sexual abuse materials that mimic the likenesses of students, which is already taking place. A.I. engines developed by ed tech companies, such as KnowUnity’s “School GPT,” have given users recipes for fentanyl and encouraged harmful eating behaviors, while others generated instructions for synthesizing date rape drugs. In some instances, A.I. chatbots have been known to encourage suicidal ideation and action. At last week’s Senate Judiciary hearing, Megan Garcia, whose 14-year-old son committed suicide after spending months communicating with A.I. chatbots, testified:

Instead of preparing for high school milestones, Sewell spent the last months of his life being manipulated and sexually groomed by chatbots designed by an AI company to seem human, to gain his trust, to keep him and other children endlessly engaged. 

When prompted with Sewell’s suicidal thoughts, the chatbot did not end the conversation and encourage him to seek human help, but instead “urged him to come home to her.” Minutes later, Sewell took his life. 

These incidents cannot be ignored. As our comment states, the Department must heed the signs of the times that artificial intelligence is not yet safe enough to integrate into schools. The Department cannot move forward with the integration of A.I. into classrooms without first defining the “appropriate methods” it seeks to follow. Defining these guidelines will require thorough research and input from parents, educators, and experts, and must prioritize the flourishing of students and families.

Sophie Anderson is Research Coordinator at the Institute for Family Studies. Jared Hayden is a Policy Analyst for the Family First Technology Initiative at the Institute for Family Studies. 
