Protecting Kids Online: Guest Column

This summer, the U.S. Senate passed a pair of bills: the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). Each garnered major bipartisan support, passing by an overwhelming margin of 91 to 3. If the bills are passed by the House, they will be the first major legislation aimed at protecting kids online in over two decades. 

The laws originally intended to govern the internet were passed over 20 years ago. These laws were mostly aimed at email exchanges and could never have anticipated the scope and scale of technology today. Not only is the internet used for everything from delivering groceries to running a business, but it is also the epicenter of our worst addictions, from social media to pornography, with algorithms that are incredibly effective at keeping people online. 

Heavy screen time has proven especially harmful for young people, with effects as varied as shortened attention spans, sleep problems, body image issues, depression, bullying, gambling, and addiction. Parents are left to protect their children online largely on their own, receiving somewhere between little help and outright animosity from tech companies. As Dawn Hawkins, CEO of the National Center on Sexual Exploitation, has rightly noted, “The parental controls do not work. … They’ve designed these platforms without parents in mind.”

As current law stands, social media platforms, websites, and the companies that own them are not legally accountable for what happens to kids on their sites. Despite the additional pressure placed on these tech companies in recent years, there is still no real incentive for them to keep children off their sites. Fewer kids mean less money, both now and in the future.

Despite the now obvious harms, young people have little incentive to pull themselves from what are their primary social and communication hubs. As Jonathan Haidt has argued, today’s situation represents a collective action problem. Many people stand to benefit by collectively coming offline. However, if only one person or small group of people chooses that course of action, it is not beneficial but costly.  

To be restricted from or to opt out of social media today comes at great social cost for individual tweens and teens. The vast majority of their peers own smartphones by age 12. The only way forward is some kind of collective action, so that the health benefits of turning off screens outweigh the social costs. 

This is where KOSA and COPPA 2.0 can help. As currently written, KOSA makes tech companies liable for harms caused to minors by the design of their platforms. It also makes them responsible for creating tools that safeguard minors using their platforms—tools that protect privacy, limit autoplay videos and personalized recommendations, and block the distribution of unlawful materials.

COPPA 2.0 is also a strong step toward incentivizing collective action. The original bill, passed in 1998, prohibited the collection of personal information from kids under 13. COPPA 2.0 extends this protection to all minors under 17. This is important because companies use this personal information for targeted advertising, which keeps kids online.

Of course, these bills will never replace good parenting and collective community action. Parents must be present with their teens and, often, stand between them and their screens. They also must push their schools, home-school groups, or other educational alternatives to unplug together.

It’s likely that more laws will be needed. In the battle between families and tech leviathans, families are outmatched. However, these two bills are a strong start. Parents, grandparents, teachers, mentors, and others should contact their representatives to help make sure KOSA and COPPA 2.0 get passed in the House and signed into law.  

This Breakpoint was co-authored by Jared Hayden. If you’re a fan of Breakpoint, leave a review on your favorite podcast app. For more resources to live like a Christian in this cultural moment, go to breakpoint.org.  

Copyright 2024 by the Colson Center for Christian Worldview. Reprinted from BreakPoint.org with permission.

Instagram to Take Steps to Protect Teens: WSJ

Social media giant Instagram says it is taking steps to protect teenagers on its platform.

The Wall Street Journal reports,

Starting this week, it [Instagram] will begin automatically making youth accounts private, with the most restrictive settings. And younger teens won’t be able to get around it by changing settings or creating adult accounts with fake birth dates.

Account restrictions for teens include direct messaging only with people they follow or are already connected to, a reduction in adult-oriented content, automatic muting during nighttime hours, and more.

Instagram is owned by Meta — the same company that owns Facebook. The social media giant has faced criticism for exposing children to objectionable content, and policymakers around the country are pushing for stronger protections for kids as a result.

Last year, lawmakers in Arkansas enacted the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent. A social media company that violated the law could be held liable.

Tech giants — including Facebook, Instagram, Twitter, and TikTok — as well as the ACLU have worked to block that good law in court.

This year lawmakers in New York and in Florida passed similar laws protecting children on social media.

Last year Arkansas Attorney General Tim Griffin’s office filed a lawsuit against Meta in Polk County Circuit Court alleging the company has misled the public about the safety and addictiveness of its social media platforms.

The lawsuit alleges Meta structured Facebook and Instagram “to exploit multiple neuropsychological traits in youth.”

It notes that Facebook and Instagram are built around algorithms intentionally designed “to exploit human psychology and foster addiction to maximize users’ screen time.”

The A.G.’s legal complaint says this exploitation is especially true of young users with developing brains.

The lawsuit also says that “youth mental health problems have advanced in lockstep with the growth of social media platforms that have been deliberately designed to attract and addict youth by amplifying harmful material, dosing users with dopamine hits, and thereby driving youth engagement and advertising revenue.”

Social media platforms aren’t just websites. They are multibillion-dollar businesses owned and operated by adults.

The adults who run these social media platforms should not be able to register children as users and promote content to them without — at the very least — parental consent. 

As we have said before, there’s mounting evidence that social media puts users’ personal information at risk and is actually designed to push objectionable content to users. With that in mind, it is essential to protect children on social media.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.