The Wall Street Journal recently published an investigative report titled “How TikTok’s Algorithm Figures Out Your Deepest Desires.”
With an estimated one billion users worldwide and 135 million in the U.S., TikTok is considered by some to be the most popular social media platform in the world.
The app relies on a specialized algorithm to suggest videos to users. As users watch certain types of videos, the algorithm makes a point to recommend similar videos in the future — a process sometimes called “rabbit holing.”
The Wall Street Journal’s video delves into how “rabbit holing” works on TikTok — and why users should be concerned.
All of this comes as elected officials in New York are pushing for restrictions on social media algorithms. Lawmakers there have reportedly introduced legislation that would prohibit social media platforms from using algorithms to serve content to children without parental consent.
Last year the Arkansas Legislature passed the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent.
In response, tech giants — such as Facebook, Instagram, Twitter, and TikTok — sued to strike down the Social Media Safety Act as unconstitutional.
Last summer U.S. District Judge Timothy Brooks in Fayetteville blocked the State of Arkansas from enforcing the Social Media Safety Act. Among other things, Judge Brooks’ ruling claimed that Arkansas’ Social Media Safety Act is unconstitutionally broad and vague, and that most social media content is not “damaging, harmful, or obscene as to minors.”
Unfortunately, researchers have found the algorithms on social media platforms like TikTok actually serve teens what some call a steady “diet of darkness” online.
That’s part of the reason the Arkansas Attorney General’s office has sued TikTok and Meta — the company that owns Facebook and Instagram — arguing, among other things, that the platforms’ algorithms force-feed many children a non-stop diet of objectionable content. Much of the evidence cited in the A.G.’s lawsuits corroborates what reporters at the Wall Street Journal and elsewhere have found.
As we have said before, there’s mounting evidence that — by design — social media algorithms like TikTok’s deliberately push objectionable content to kids and put users’ personal information at risk. With that in mind, it’s good to see policymakers taking action to rein in these tech giants.
Articles appearing on this website are written with the aid of Family Council’s researchers and writers.