A recent Wall Street Journal column highlights how social media giant TikTok is serving teens “a diet of darkness” online.
Julie Jargon writes,
A recent study found that when researchers created accounts belonging to fictitious 13-year-olds, they were quickly inundated with videos about eating disorders, body image, self-harm and suicide.
If that sounds familiar, a Wall Street Journal investigation in 2021 found that TikTok steers viewers to dangerous content. TikTok has since strengthened parental controls and promised a more even-keeled algorithm, but the new study suggests the app experience for young teens has changed little.
The article goes on to offer examples of harmful content directed at teens — including content that encourages suicidal ideation, eating disorders, and other dangerous behaviors.
All of this underscores that Arkansas’ lawmakers did the right thing this year by passing legislation to regulate social media use among minors.
In April the state’s General Assembly passed S.B. 396, the Social Media Safety Act, sponsored by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris). The law requires social media companies to use age verification to ensure minors do not access social media platforms without parental consent.
The measure also contains protections for user privacy, and a social media company that violates the law can be held liable.
S.B. 396 narrowly cleared the Arkansas Senate, but received strong support in the Arkansas House of Representatives. Governor Sanders signed it into law following its passage.
More and more, we hear stories illustrating how social media platforms host content that isn’t suitable for children. The adults who operate these platforms should not be able to register children as users without parental consent. Laws like S.B. 396 help address this serious problem.