Highlights

  • A trio of recently introduced, bipartisan approaches to social media regulation are all aimed at giving parents more ability to keep their kids safe online.
  • The halls of Congress are echoing with parents’ pleas for better tools to help their children stay safe online.
  • Many parents are taken aback by the rabbit holes their children have fallen into online—from pornography to political extremism to eating disorders.

If “Power to the People” captured the raucous spirit of the 1970s, perhaps a slogan updated for the energy being felt in state legislatures might be “Power to the Parents.” 

Five states this year have passed legislation to allow nearly all parents to find the school that’s best for their children. Six states, so far, have enacted laws that would put up guardrails around young people online, with more on the way.

And most importantly, the halls of Congress are echoing with parents’ pleas for better tools to help their children stay safe online. If one is a bold experiment and two a crowd, three is an undeniable trend: a trio of recently introduced, bipartisan approaches to social media regulation are all aimed at giving parents more ability to keep their kids safe online. 

The newest bill on the scene—titled the Protecting Kids on Social Media Act—is notable for its willingness to abandon the familiar territory each side has tended to favor. And it builds on the momentum of a recently reintroduced bill that sports a bevy of sponsors from various political factions.  

Democrats like President Biden have tended to stress their concerns over social media companies’ algorithms promoting content they find problematic and ad practices that violate users’ privacy. Earlier this year, the President called on Congress to “limit targeted advertising and ban it altogether for children.” 

Meanwhile, Republican lawmakers have tended to train most of their rhetorical fire on examples of Big Tech’s partisan bias, from “misinformation” to the repeated debates over Section 230. Important subjects, to be sure. But for most parents, they pale in comparison to their concern over what sort of messages, content, and peer pressure their children are coming across on sites operated by Meta, Snap, and TikTok, to say nothing of raunchier content elsewhere. 

This new bill, co-sponsored by Senator Tom Cotton (R-Arkansas), Senator Brian Schatz (D-Hawaii), Senator Chris Murphy (D-Connecticut), and Senator Katie Britt (R-Alabama), focuses attention on what parents tell pollsters they’re looking for. Most notably, it would require parental consent before a minor under age 18 could open a new social media account. In polling conducted for a report I wrote for the Institute for Family Studies and the Ethics and Public Policy Center, 81% of parents agreed with the idea of requiring parental consent to open an account, and it’s easy to see why. 

Knowing what sites your teen has access to, and being able to revoke that consent if a given app proves harmful, is something for which parents struggling to keep abreast of changes in tech use would be grateful. The bill would also officially bar users under age 13 from social media sites altogether, which most platforms already do as a matter of policy but struggle to enforce in practice. 

The bill would also establish a pilot program that could lay the groundwork for effective and anonymized age verification, a requirement for successfully creating virtual red-light districts. No one would allow a strip club to be built next door to an elementary school; without effective age-verification tech, the internet offers a virtual version of that at the tap of a screen. 

Chris Griswold, policy director at American Compass, a non-doctrinaire conservative think tank, has written that these types of policies can help ensure an effective age gate on certain parts of the internet without running afoul of civil libertarians’ concerns about privacy. This type of verification technology would act like a digital bouncer, making sure a user is of age before letting them access content as another anonymous face in the crowd. The Schatz-Cotton bill’s pilot program would work toward making such an approach technologically feasible. 

The Protecting Kids on Social Media Act builds on the momentum generated by another piece of federal legislation, the Kids Online Safety Act (KOSA), which has been spearheaded by the bipartisan duo of Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.). It focuses primarily on content relating to harmful behaviors, like eating disorders or suicidal ideation, that social media platforms enable, and it would expand access to data for researchers interested in tech use and well-being. 

KOSA was recently re-introduced with 33 co-sponsors, including solidly progressive Democrats like Sen. Sheldon Whitehouse (D-R.I.) and Sen. Tammy Baldwin (D-Wis.), and Senators closer to the center like Bill Cassidy (R-La.) and Bob Casey (D-Pa.), as well as solid conservatives like Sen. Steve Daines (R-Mont.), Sen. Marco Rubio (R-Fla.), and Sen. Joni Ernst (R-Iowa). As I wrote with my EPPC colleague Clare Morell last year, KOSA recognizes that the digital rules of the road written for an era before MySpace, let alone Facebook or Snapchat, need to be updated.

Another offering, an update to the original Children’s Online Privacy Protection Act (COPPA), was recently introduced by Sens. Ed Markey (D-Mass.) and Bill Cassidy (R-La.). The Children and Teens’ Online Privacy Protection Act, known as COPPA 2.0, would raise the age at which internet companies can collect minors’ data from 13 to 16, focus on consumer privacy, and allow parents to erase their teens’ data from the web. 

The Protecting Kids on Social Media Act goes furthest, responding to what parents are asking for and genuinely introducing new tools to protect kids online. However, KOSA and COPPA 2.0 may be closer to inclusion in Congressional negotiations, given the breadth of KOSA’s co-sponsor list and the seniority of COPPA 2.0’s champions. But Congress should not treat this as an either/or situation. Each bill focuses on a different element of tech use that troubles parents, and passing a bill like KOSA should in no way take the steam out of efforts to meet parents’ desire for the consent over social media account creation that the Schatz-Cotton bill would introduce. 

Some parents will decide they have no need to avail themselves of these new tools, just as many parents are happy with their neighborhood public school. No one will force parents to check on their kids’ social media activity if they don’t want to. 

But too many parents are taken aback by the rabbit holes their children have fallen into online—from pornography to political extremism to eating disorders and worse. Many would appreciate a little more ability to set guardrails around safe online behavior, not just turn their children loose onto the wild world of the web. And 68% of U.S. teens say technology doesn’t have a positive impact on their lives; more ability to carve out a little space from digital life might even be welcome to most young people. 

Parents might not expect much from this Congress, and certainly the voices that drive the cable news circus do not tend to inspire much confidence. But in state legislatures, more and more lawmakers are giving parents the tools to raise their children in the environment they see fit. It’s a sign of hope that that energy is being noticed on both sides of the aisle in D.C. as well.

Patrick T. Brown (@PTBwrites) is a fellow at the Ethics and Public Policy Center. He writes from Columbia, South Carolina.   

Editor's Note: The opinions expressed in this article are those of the author and do not necessarily reflect the official policy or views of the Institute for Family Studies.