Protecting Kids Online: Guest Column

This summer, the U.S. Senate passed a pair of bills: the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). Each garnered major bipartisan support, passing by an overwhelming margin of 91 to 3. If the bills are passed by the House, they will be the first major legislation aimed at protecting kids online in over two decades. 

The laws originally intended to govern the internet were passed over 20 years ago. These laws were mostly aimed at email exchanges and could never have anticipated the scope and scale of technology today. Not only is the internet used for everything from delivering groceries to running a business, but it is also the epicenter of our worst addictions, from social media to pornography, with algorithms that are incredibly effective at keeping people online. 

Heavy screen time has proven especially harmful for young people, with effects as varied as shortened attention spans, sleep problems, body image issues, depression, bullying, gambling, and addiction. Parents are left to protect their children online largely on their own, receiving somewhere between little help and outright animosity from tech companies. As Dawn Hawkins, CEO of the National Center on Sexual Exploitation, has rightly noted, “The parental controls do not work. … They’ve designed these platforms without parents in mind.”

As current law stands, social media platforms, websites, and the companies that own them are not legally accountable for what happens to kids while on their sites. Despite additional pressure placed on these tech companies in recent years, there is still no real incentive to keep children off their sites. Fewer kids mean less money, both now and in the future.

Despite the now obvious harms, young people have little incentive to pull themselves from what are their primary social and communication hubs. As Jonathan Haidt has argued, today’s situation represents a collective action problem. Many people stand to benefit by coming offline together. However, if only one person or a small group of people chooses that course of action, it is not beneficial but costly.

To be restricted from or to opt out of social media today comes at great social cost for individual tweens and teens. The vast majority of their peers own smartphones by age 12. The only way forward is some kind of collective action, so that the health benefits of turning off screens outweigh the social costs. 

This is where KOSA and COPPA 2.0 can help. As currently written, KOSA makes tech companies liable for the harms their platforms’ designs cause to minors. It also makes them responsible for creating tools that safeguard minors using their platforms—tools that protect privacy, limit autoplay videos and personalized recommendations, and block the distribution of unlawful materials.

COPPA 2.0 is also a strong step toward incentivizing collective action. The original law, passed in 1998, prohibited collecting the personal information of children under 13. COPPA 2.0 extends that protection to all minors under 17. This is important because companies use this personal information for targeted advertising, which keeps kids online.

Of course, these bills will never replace good parenting and collective community action. Parents must be present with their teens and, often, stand between them and their screens. They also must push their schools, home-school groups, or other educational alternatives to unplug together.

It’s likely that more laws will be needed. In the battle between families and tech leviathans, families are outmatched. However, these two bills are a strong start. Parents, grandparents, teachers, mentors, and others should contact their representatives to help make sure KOSA and COPPA 2.0 get passed in the House and signed into law.  

This Breakpoint was co-authored by Jared Hayden. If you’re a fan of Breakpoint, leave a review on your favorite podcast app. For more resources to live like a Christian in this cultural moment, go to breakpoint.org.  

Copyright 2024 by the Colson Center for Christian Worldview. Reprinted from BreakPoint.org with permission.

School Districts Should Never Keep Parents, the Ultimate Decisionmakers, In the Dark

The following is from our friends at Alliance Defending Freedom:

A Colorado school district assigned a 5th-grade girl to share a room – and even a bed – with a boy who identified as a girl during an overnight trip.

Read more: https://adflegal.org/article/colorado-school-district-kept-parents-dark-over-gender-identity-rooming-scheme

Joe and Serena Wailes said their daughter had been greatly looking forward to the school-sponsored trip to Philadelphia and Washington, D.C. But when she got to her hotel room, the student with whom she was supposed to share a bed informed her that he was a boy who identified as a girl.

The Waileses’ daughter was then put in multiple uncomfortable situations before school chaperones finally changed her room assignment. Even then, chaperones on the trip told her to lie about the reason for the switch.

Prior to the trip, the Waileses were not informed about the district policy that rooms children by gender identity rather than sex, so they had no way to request an accommodation so their daughter would not share a bed with a boy. School districts should allow parents to make the best decisions for their children and provide the information required to make those decisions. Instead, the district jeopardized the privacy of the Waileses’ daughter and deprived the parents of their right to make important decisions for their child.


Federal Government Sues TikTok Over Alleged Child Privacy Violations

On Friday the United States Department of Justice filed a lawsuit against social media giant TikTok and its parent company for allegedly violating federal laws intended to protect children online.

TikTok boasts approximately one billion users worldwide — including 135 million or more in the U.S. — making it one of the most popular social media platforms on earth.

However, TikTok and its Chinese-based parent company, ByteDance, have come under fire for serving kids a steady “diet of darkness” online and struggling to protect private user data from entities in China, such as the Chinese Communist Party.

In a lawsuit filed Friday in the U.S. District Court for the Central District of California, the U.S. Department of Justice alleged that TikTok and ByteDance violated the Children’s Online Privacy Protection Act of 1998 — a federal law that helps protect children from being tracked online.

The lawsuit accuses TikTok of “unlawful massive-scale invasions of children’s privacy,” saying,

TikTok collects, stores, and processes vast amounts of data from its users, who include millions of American children younger than 13. . . . For years, Defendants [TikTok and ByteDance] have knowingly allowed children under 13 to create and use TikTok accounts without their parents’ knowledge or consent, have collected extensive data from those children, and have failed to comply with parents’ requests to delete their children’s accounts and personal information.

This is not the first lawsuit TikTok has faced for failing to protect children on its platform.

Last year Arkansas Attorney General Tim Griffin filed two lawsuits against the tech giant — one in Cleburne County and another in Union County — for violating the Arkansas Deceptive Trade Practices Act and failing to protect children.

The lawsuits allege that TikTok and ByteDance failed to fully disclose that the company is subject to Chinese laws that mandate cooperation with intelligence activities of the People’s Republic of China, and that TikTok’s algorithm deliberately promotes “intensely sexualized” content — including content that sexualizes children. The A.G.’s legal team has pointed out that objectionable content is even available to users who enable TikTok’s content filtering in the app, and that TikTok aggressively collects sensitive user data.

Social media platforms are more than just websites or phone apps. They are multimillion-dollar businesses owned and operated by investors and other interests. The adults who own these companies have a responsibility to follow state and federal laws and to protect children on their platforms.

As we have said before, there’s more and more evidence that social media platforms like TikTok put users’ personal information at risk and are actually designed to push objectionable content to users.

With that in mind, it’s good to see the Department of Justice taking legal action to fight back against these tech companies and protect our children online.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.