Protecting Kids Online: Guest Column

This summer, the U.S. Senate passed a pair of bills: the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). Each garnered major bipartisan support, passing by an overwhelming margin of 91 to 3. If the bills are passed by the House, they will be the first major legislation aimed at protecting kids online in over two decades. 

The laws originally intended to govern the internet were passed over 20 years ago. These laws were mostly aimed at email exchanges and could never have anticipated the scope and scale of technology today. Not only is the internet used for everything from delivering groceries to running a business, but it is also the epicenter of our worst addictions, from social media to pornography, with algorithms that are incredibly effective at keeping people online. 

Heavy screen time has proven especially harmful for young people, with effects as varied as shortened attention spans, sleep problems, body image issues, depression, bullying, gambling, and addiction. Parents are left to protect their children online largely on their own, with somewhere between little help and outright animosity from tech companies. As Dawn Hawkins, CEO of the National Center on Sexual Exploitation, has rightly noted, “The parental controls do not work. … They’ve designed these platforms without parents in mind.”

As current law stands, social media platforms, websites, and the companies that own them are not legally accountable for what happens to kids while on their sites. Despite additional pressure placed on these tech companies in recent years, there is still no real incentive for them to keep children off their sites. Fewer kids means less money, both now and in the future.

Despite the now obvious harms, young people have little incentive to pull themselves from what are their primary social and communication hubs. As Jonathan Haidt has argued, today’s situation represents a collective action problem. Many people stand to benefit by collectively coming offline. However, if only one person or small group of people chooses that course of action, it is not beneficial but costly.  

To be restricted from or to opt out of social media today comes at great social cost for individual tweens and teens. The vast majority of their peers own smartphones by age 12. The only way forward is some kind of collective action, so that the health benefits of turning off screens outweigh the social costs. 

This is where KOSA and COPPA 2.0 can help. As currently written, KOSA makes tech companies liable for harms caused to minors by their platforms’ design. It also makes them responsible for creating tools that safeguard minors on their platforms—tools that protect privacy, limit autoplay videos and personalized recommendations, and block the distribution of unlawful materials.

COPPA 2.0 is also a strong step toward incentivizing collective action. The original bill, passed in 1998, prohibited the collection of personal information from kids under 13. COPPA 2.0 extends this protection to all minors under 17. This is important because companies use this personal information for targeted advertising, which keeps kids online.

Of course, these bills will never replace good parenting and collective community action. Parents must be present with their teens and often stand between them and their screens. They must also push their schools, home-school groups, or other educational alternatives to unplug together.

It’s likely that more laws will be needed. In the battle between families and tech leviathans, families are outmatched. However, these two bills are a strong start. Parents, grandparents, teachers, mentors, and others should contact their representatives to help make sure KOSA and COPPA 2.0 get passed in the House and signed into law.  

This Breakpoint was co-authored by Jared Hayden.

Copyright 2024 by the Colson Center for Christian Worldview. Reprinted from BreakPoint.org with permission.

FTC Order Would Ban Social Media App From Offering Anonymous Messaging to Kids Under 18

Last week the Federal Trade Commission (FTC) and the Los Angeles District Attorney’s Office announced legal action against social media platform NGL and its founders, citing “a host of law violations related to their anonymous messaging app, including unfairly marketing the service to children and teens.”

Launched in 2021, NGL is a social media platform that encourages users to send messages and ask and answer questions — all anonymously. The FTC says NGL’s anonymity promotes cyberbullying and exposes children to inappropriate content.

In a statement, the FTC and the LA D.A.’s Office said:

“NGL marketed its app to kids and teens despite knowing that it was exposing them to cyberbullying and harassment,” said FTC Chair Lina M. Khan. “In light of NGL’s reckless disregard for kids’ safety, the FTC’s order would ban NGL from marketing or offering its app to those under 18. We will keep cracking down on businesses that unlawfully exploit kids for profit.”

“The consequences of these actions can be severe. The anonymity provided by the app can facilitate rampant cyberbullying among teens, causing untold harm to our young people,” Los Angeles District Attorney George Gascón said. “We cannot tolerate such behavior, nor can we allow companies to profit at the expense of our children’s safety and well-being. Today’s charges send a clear message that deceptive practices and targeting vulnerable populations will not be tolerated.”

The FTC’s case is reminiscent of the Arkansas Attorney General’s lawsuits against TikTok, Facebook, and Instagram.

The lawsuit against TikTok alleges the social media giant violated the Arkansas Deceptive Trade Practices Act by promoting “intensely sexualized” content — including content that sexualizes children — on its platform, and by failing to fully disclose that the platform is subject to Chinese law — including “laws that mandate secret cooperation with intelligence activities of the People’s Republic of China.”

The lawsuit against Meta — owner of Facebook and Instagram — alleges the social media platforms violated the Arkansas Deceptive Trade Practices Act by relying on algorithms intentionally designed “to exploit human psychology and foster addiction to maximize users’ screen time,” noting that this exploitation is especially true of young users with developing brains.

In each case, the Arkansas Attorney General argues the social media platforms deceptively marketed their apps as being appropriate for children under 18. The FTC is now making a similar argument in its case against NGL.

As we have said time and time again, social media platforms aren’t just websites. They are multimillion-dollar businesses owned and operated by investors and other interests.

There is mounting evidence that these platforms put users’ personal information at risk and are actually designed to push objectionable content to users. With that in mind, it’s good to see state and federal regulators taking action to protect children on social media.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.