Social media giant Instagram says it is taking steps to protect teenagers on its platform.
The Wall Street Journal reports:
Starting this week, it [Instagram] will begin automatically making youth accounts private, with the most restrictive settings. And younger teens won’t be able to get around it by changing settings or creating adult accounts with fake birth dates.
Account restrictions for teens include direct messaging only with people they follow or are already connected to, a reduction in adult-oriented content, automatic muting during nighttime hours and more.
Instagram is owned by Meta — the same company that owns Facebook. The social media giant has faced criticism for exposing children to objectionable content, and policymakers around the country are pushing for stronger protections for kids as a result.
Last year, lawmakers in Arkansas enacted the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent. A social media company that violated the law could be held liable.
Tech giants — including Facebook, Instagram, Twitter, and TikTok — as well as the ACLU have worked to block that good law in court.
This year, lawmakers in New York and Florida passed similar laws protecting children on social media.
Last year, Arkansas Attorney General Tim Griffin’s office filed a lawsuit against Meta in Polk County Circuit Court alleging the company has misled the public about the safety and addictiveness of its social media platforms.
The lawsuit alleges Meta structured Facebook and Instagram “to exploit multiple neuropsychological traits in youth.”
It notes that Facebook and Instagram are built around algorithms intentionally designed “to exploit human psychology and foster addiction to maximize users’ screen time.”
The A.G.’s legal complaint says this exploitation is especially true of young users with developing brains.
The lawsuit also says that “youth mental health problems have advanced in lockstep with the growth of social media platforms that have been deliberately designed to attract and addict youth by amplifying harmful material, dosing users with dopamine hits, and thereby driving youth engagement and advertising revenue.”
Social media platforms aren’t just websites. They are multibillion-dollar businesses owned and operated by adults.
The adults who run these social media platforms should not be able to register children as users and promote content to them without — at the very least — parental consent.
As we have said before, there’s mounting evidence that social media puts users’ personal information at risk and is actually designed to push objectionable content to users. With that in mind, it is essential to protect children on social media.