Pornography Harms Children

Live Action recently released a video featuring Matt Fradd explaining how he became addicted to pornography when he was just 8 years old — leading to a decade-long battle to overcome his addiction. Today he is one of the leading speakers on the harms of pornography.

Stories like this underscore the importance of legislation like the law Arkansas passed in 2023 requiring pornographic websites to verify users' ages with a government-issued ID or a commercially available age verification method in order to protect kids from pornographic material.

You can watch the video below.

Arkansas Authorities Investigating Possible AI-Generated Images of Child Sexual Abuse Material

The Arkansas Democrat-Gazette recently reported police have charged an Amity man with possessing child sexual abuse material — some of which may have been generated through artificial intelligence.

The article says AI-generated child sexual abuse material is not against state law, but in 2001 the Arkansas Legislature passed Act 1496 addressing computer exploitation of a child.

The law generally makes it a felony to produce or reproduce child sexual abuse material “by computerized means.”

At the time there was serious discussion about how computers and computer software could be used to manufacture child sexual abuse material.

Of course, in 2001 very few people could have imagined the artificial intelligence technology that exists today, but lawmakers recognized the need to address the issue — and Family Council supported the good law they passed.

As technology changes and artificial intelligence advances, lawmakers likely will need to enact new legislation to protect children.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.

Florida Passes Law Prohibiting Social Media for Children Under 14

On Monday, Florida Governor Ron DeSantis signed legislation prohibiting minors under age 14 from registering social media accounts. The law is slated to take effect next year.

Across the board, policymakers are wrestling with how to keep kids safe online.

Researchers have found the algorithms on social media platforms like TikTok actually serve teens what some call a steady “diet of darkness” online.

Last year, lawmakers in Arkansas enacted the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent. A social media company that violates the law can be held liable. Tech giants — including Facebook, Instagram, Twitter, and TikTok — as well as the ACLU are fighting that law in court.

The Arkansas Attorney General’s office is pushing back by suing TikTok and the company that owns Facebook and Instagram.

The A.G.’s lawsuits cite evidence that the platforms’ algorithms promote objectionable content to children.

Social media platforms aren’t just websites. They are multimillion-dollar businesses owned and operated by adults.

The adults who operate these social media platforms should not be able to register children as users and let them post photos and videos of themselves without — at the very least — parental consent.

As we have said before, there’s mounting evidence that social media puts users’ personal information at risk and is designed to push objectionable content to users. With that in mind, it’s good to see policymakers taking action to protect children online.