Kansas Attorney General Takes on Big Tech, Dangerous AI Chatbots

Kansas Attorney General Kris Kobach is demanding answers from Big Tech companies over their dangerous AI companion chatbots that are harming children and families.

In a strongly worded letter to major AI developers, Kobach put the industry on notice, calling for real safeguards that protect kids.

“We’re seeing a very concerning trend where Big Tech releases AI products without meaningful safeguards,” Kobach said in a statement. Kobach highlighted a Topeka case in which a sexual predator used AI to create thousands of images depicting child abuse. National reports show AI chatbots encouraging teen suicide, promoting self-harm, and engaging in sexualized conversations with minors.

Some AI platforms even market themselves with slogans like “AI girls never say no.” As Kobach said, “That’s not a glitch in AI. It’s a failure of corporate accountability.”

The Kansas Attorney General gave the companies until January 30 to explain how they will ensure user safety and comply with Kansas age verification laws. Companies that have misrepresented their products’ safety or provided harmful content to minors may face consequences under Kansas law.

Arkansas families and policymakers should pay close attention to this situation.

Arkansas has been a leader when it comes to protecting children online, passing laws to verify ages and restrict harmful content. However, AI chatbots can engage children in conversations that seem real yet may encourage dangerous behaviors or expose them to inappropriate content.

Arkansas parents need to know what their children are accessing online. They should ask tough questions about any AI apps or chatbots their kids might be using.

Our friends at the Daily Citizen said it very well last summer:

When it comes to keeping children safe online, parents have their work cut out for them. Companies like xAI shouldn’t compound the problem by adding sexualized A.I. features to an app children use. But, unfortunately, there’s nothing stopping them from doing so.

No company is going to work harder than you to protect your kids. The best solution is to play it safe — keep your kids well away from A.I. chatbots and other dangerous internet traps.


Arkansas Authorities Investigating Possible AI-Generated Images of Child Sexual Abuse Material

The Arkansas Democrat-Gazette recently reported police have charged an Amity man with possessing child sexual abuse material — some of which may have been generated through artificial intelligence.

The article says AI-generated child sexual abuse material is not against state law. However, in 2001 the Arkansas Legislature passed Act 1496 addressing computer exploitation of a child.

The law generally makes it a felony to produce or reproduce child sexual abuse material “by computerized means.”

At the time there was serious discussion about how computers and computer software could be used to manufacture child sexual abuse material.

Of course, in 2001 very few people could have imagined the artificial intelligence technology that exists today, but lawmakers recognized the need to address the issue — and Family Council supported the good law they passed.

As technology changes and artificial intelligence advances, lawmakers likely will need to enact new legislation to protect children.


Arkansas Attorney General Joins Coalition Urging Congress to Address AI’s Exploitation of Children

Arkansas Attorney General Tim Griffin’s office recently joined a coalition of state attorneys general urging Congress to address the ways in which artificial intelligence may be used to exploit children.

In a statement, Attorney General Griffin said,

AI poses a very real threat to our children. This ‘new frontier for abuse’ opens the door for children to be exploited in new ways, including publishing their location and mimicking their voice and likeness in sexual or other objectionable content.

The bipartisan coalition of attorneys general from across the country expressed concern over how artificial intelligence and “deepfake” technology might be used to generate child sexual abuse material — also known as child pornography.

In 2001 the Arkansas Legislature passed Act 1496 addressing computer exploitation of a child. The law generally makes it a felony to produce or reproduce child sexual abuse material “by computerized means.”

At the time there was serious discussion about how computers and computer software could be used to manufacture child sexual abuse material. Of course, in 2001 very few people could have imagined the artificial intelligence technology that exists today, but lawmakers recognized the need to address the issue — and Family Council supported the good law they passed.

As technology changes and artificial intelligence advances, lawmakers likely will need to enact new legislation to protect children. That is what this coalition of state attorneys general is calling on lawmakers to do.

You can read the coalition’s letter and call to action here.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.