AI Tech Giants Enabled Digital Exploitation of Women and Children: Report

Recent news stories allege that tech companies like X, Apple, and Google have profited from apps that let users digitally undress women and children without their consent.

A report from the Tech Transparency Project reveals that tech giants have hosted dozens of AI-powered “nudify” applications on their app stores, despite having policies that supposedly prohibit such content.

These apps use artificial intelligence to create deepfake nude images of real people, including minors. The apps have reportedly been downloaded more than 705 million times worldwide.

This technology represents a new form of sexual exploitation that lawmakers and parents are struggling to address. The apps essentially let anyone with a smartphone sexually victimize others. And despite repeated warnings from pro-family groups and others, tech companies have been slow to remove these applications.

Arkansas families need to understand this threat. Last year, Arkansas’ lawmakers passed Act 827 by Rep. Stephen Meeks (R — Greenbrier) and Sen. Clint Penzo (R — Springdale) to prohibit people from using artificial intelligence to create and distribute deepfake pornographic images depicting another individual without that individual’s consent. The law also lets the Arkansas Attorney General take legal action against the developers responsible for this kind of technology. It’s a good law that helps address this problem — but it’s possible Arkansas will need to do more as artificial intelligence continues to expand.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.

A.I. Accountability: Lawmakers File Measures Addressing Deepfake Pornography, Child Sexual Abuse Material

New legislation at the capitol would help address AI-generated pornography in Arkansas.

H.B. 1518 by Rep. Stephen Meeks (R — Greenbrier) and Sen. Missy Irvin (R — Mountain View) makes it a crime to create, distribute, possess, or view AI-generated sexual material depicting children.

H.B. 1529 by Rep. Stephen Meeks (R — Greenbrier) and Sen. Clint Penzo (R — Springdale) prohibits people from using artificial intelligence to create and distribute “deepfake” pornographic images depicting another individual without that individual’s consent.

In 2001 the Arkansas Legislature passed Act 1496 addressing computer exploitation of a child. The law generally makes it a felony to produce or reproduce child sexual abuse material “by computerized means.”

At the time there was serious discussion about how computers and computer software could be used to manufacture child sexual abuse material. Of course, in 2001 very few people could have imagined today’s artificial intelligence technology, but lawmakers recognized the need to address the issue — and Family Council supported the good law they passed.

Arkansas also has passed laws prohibiting people from distributing pornographic images of another person without the person’s consent, but the law does not address AI-generated images. News outlets have reported how deepfake pornography can be used to harass or intimidate victims. States are working to enact laws protecting innocent people from AI-generated pornography.

Artificial intelligence has advanced by leaps and bounds in recent years. AI-generated pornography is now a serious concern. State laws must stay ahead of the technology. Measures like H.B. 1518 and H.B. 1529 help do that.

You Can Read H.B. 1518 Here. You Can Read H.B. 1529 Here.

Arkansas Authorities Investigating Possible AI-Generated Images of Child Sexual Abuse Material

The Arkansas Democrat-Gazette recently reported police have charged an Amity man with possessing child sexual abuse material — some of which may have been generated through artificial intelligence.

The article says AI-generated child sexual abuse material is not against state law, but in 2001 the Arkansas Legislature passed Act 1496 addressing computer exploitation of a child.

The law generally makes it a felony to produce or reproduce child sexual abuse material “by computerized means.”

At the time there was serious discussion about how computers and computer software could be used to manufacture child sexual abuse material.

Of course, in 2001 very few people could have imagined the artificial intelligence technology that exists today, but lawmakers recognized the need to address the issue — and Family Council supported the good law they passed.

As technology changes and artificial intelligence advances, lawmakers likely will need to enact new legislation to protect children.
