Keeping Kids Safe Online: Guest Column

If we’ve learned anything, it’s that parents are the only ones who can protect kids on their devices.

The clear takeaway from the U.S. Senate’s recent hearing, “Big Tech and the Child Sexual Exploitation Crisis,” is that social media is not safe for children. Senators from both sides of the aisle questioned social media CEOs about the harms their platforms cause to kids. Democratic Senator Dick Durbin noted, “As early as 2017, law enforcement identified Snapchat as the pedophile’s go-to sexual exploitation tool.” Republican Senator Ted Cruz chided Meta CEO Mark Zuckerberg over Instagram allowing users to view child sexual content.

For years, social media companies have claimed that better parental controls would protect children. But as Dawn Hawkins, CEO of the National Center on Sexual Exploitation, argued in a Heritage Foundation panel after the hearing, “[T]he parental controls … do not work. … They’ve designed these platforms without parents in mind.”

The conclusion is obvious. Tech companies cannot protect kids (and would not even if they could). Parents have to.

Copyright 2024 by the Colson Center for Christian Worldview. Reprinted from BreakPoint.org with permission.

WSJ Investigates “How TikTok’s Algorithm Figures Out Your Deepest Desires”

The Wall Street Journal recently published an investigative report titled “How TikTok’s Algorithm Figures Out Your Deepest Desires.”

With an estimated one billion users worldwide and 135 million in the U.S., TikTok is considered by some to be the most popular social media platform in the world.

The app relies on a specialized algorithm to suggest videos to users. As users watch certain types of videos, the algorithm increasingly recommends similar videos in the future — a feedback loop sometimes called “rabbit holing.”
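To make the mechanics concrete, here is a minimal Python sketch of that kind of feedback loop. It is a toy illustration, not TikTok’s actual system: the content categories, weights, and watch-time numbers are all invented for the example.

```python
# Toy sketch of a watch-history feedback loop ("rabbit holing").
# Not TikTok's actual algorithm; categories and weights are invented.
import random

CATEGORIES = ["cooking", "sports", "news", "dance", "fitness"]

def recommend(watch_counts, n=5):
    """Pick n videos, weighting each category by past watch time
    plus a small baseline so new topics still appear occasionally."""
    weights = [watch_counts[c] + 1 for c in CATEGORIES]
    return random.choices(CATEGORIES, weights=weights, k=n)

watch_counts = {c: 0 for c in CATEGORIES}
for session in range(1, 11):
    feed = recommend(watch_counts)
    for video in feed:
        # Assume the user lingers on one topic; the longer watch time
        # feeds back into a heavier weighting next session.
        watch_counts[video] += 3 if video == "dance" else 1
    print(f"Session {session}: {feed}")

# Within a few sessions, the favored topic crowds out the rest of
# the feed -- the "rabbit hole" effect in miniature.
```

The point of the sketch is the loop itself: the more a topic is watched, the more of it is served, which is why critics argue the narrowing is a product of the design rather than the user.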

The Wall Street Journal’s video delves into how “rabbit holing” works on TikTok — and why users should be concerned.

All of this comes as elected officials in New York are pushing for restrictions on social media algorithms. Lawmakers reportedly have offered legislation that would prohibit social media platforms from using algorithms to serve content to children without parental consent.

Last year the Arkansas Legislature passed the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access their platforms without parental consent.

In response, tech giants — such as Facebook, Instagram, Twitter, and TikTok — sued to strike down the Social Media Safety Act as unconstitutional.

Last summer U.S. District Judge Timothy Brooks in Fayetteville blocked the State of Arkansas from enforcing the Social Media Safety Act. Among other things, Judge Brooks’ ruling claimed that Arkansas’ Social Media Safety Act is unconstitutionally broad and vague, and that most social media content is not “damaging, harmful, or obscene as to minors.”

Unfortunately, researchers have found the algorithms on social media platforms like TikTok actually serve teens what some call a steady “diet of darkness” online.

That’s part of the reason the Arkansas Attorney General’s office has sued TikTok and Meta — the company that owns Facebook and Instagram — arguing, among other things, that the platforms’ algorithms force-feed many children a non-stop diet of objectionable content. Much of the evidence cited in the A.G.’s lawsuits corroborates what reporters at the Wall Street Journal and elsewhere have found.

As we have said before, there’s mounting evidence that — by design — social media algorithms like TikTok’s push objectionable content to kids and put users’ personal information at risk. With that in mind, it’s good to see policymakers taking action to rein in these tech giants.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.