Kansas Attorney General Takes on Big Tech, Dangerous AI Chatbots

Kansas Attorney General Kris Kobach is demanding answers from Big Tech companies over their dangerous AI companion chatbots that are harming children and families.

In a strongly worded letter to major AI developers, Kobach put the industry on notice, calling for real safeguards that protect kids.

“We’re seeing a very concerning trend where Big Tech releases AI products without meaningful safeguards,” Kobach said in a statement. Kobach highlighted a Topeka case in which a sexual predator used AI to create thousands of images depicting child abuse. National reports show AI chatbots encouraging teen suicide, promoting self-harm, and engaging in sexualized conversations with minors.

Some AI platforms even market themselves with slogans like “AI girls never say no.” As Kobach said, “That’s not a glitch in AI. It’s a failure of corporate accountability.”

The Kansas Attorney General gave the companies until January 30 to explain how they will ensure user safety and comply with Kansas age verification laws. Companies that have misrepresented their products’ safety or provided harmful content to minors may face consequences under Kansas law.

Arkansas families and policymakers should pay close attention to this situation.

Arkansas has been a leader in protecting children online, passing laws to verify ages and restrict harmful content. But AI chatbots can engage children in conversations that feel real while encouraging dangerous behavior or exposing them to inappropriate content.

Arkansas parents need to know what their children are accessing online. They should ask tough questions about any AI apps or chatbots their kids might be using.

Our friends at the Daily Citizen said it very well last summer:

When it comes to keeping children safe online, parents have their work cut out for them. Companies like xAI shouldn’t compound the problem by adding sexualized A.I. features to an app children use. But, unfortunately, there’s nothing stopping them from doing so.

No company is going to work harder than you to protect your kids. The best solution is to play it safe — keep your kids well away from A.I. chatbots and other dangerous internet traps.


Illinois, New York Policymakers Support Flawed Assisted-Suicide Measures

On December 12, Illinois Governor J.B. Pritzker signed legislation letting healthcare providers help terminally ill patients end their lives. The measure makes Illinois the latest state to legalize physician-assisted suicide.

The measure passed the Illinois Senate by just one vote during the fall legislative session after narrowly clearing the House earlier this year. Illinois now joins states like Oregon, California, and Washington in allowing what supporters call “medical aid in dying.”

In a statement, Pritzker said the law “enables patients faced with debilitating terminal illnesses to make a decision, in consultation with a doctor, that helps them avoid unnecessary pain and suffering at the end of their lives.”

A few days later, New York Governor Kathy Hochul announced she had reached a deal with state legislators to legalize assisted suicide.

In a statement, Hochul said, “I was taught that God is merciful and compassionate, and so must we be. This includes permitting a merciful option to those facing the unimaginable and searching for comfort in their final months in this life.”

However, experience in other states shows assisted-suicide laws don’t help people who are sick and dying.

Oregon first allowed physician-assisted suicide in 1998, and official state reports have shown for years that the reasons people give most often for wanting to end their lives are loss of autonomy, decreasing ability to participate in activities that make life enjoyable, and loss of dignity.

Most patients do not express concerns about pain.

In Oregon, terminally ill patients routinely receive lethal prescriptions without being referred for psychological or psychiatric evaluation.

Last year, less than 1% of patients who received a prescription for physician-assisted suicide in Oregon were referred for a psychiatric evaluation. That’s a serious problem.

Many of these patients are lonely and feel like they are losing control over their lives because of their illness. They need counseling and support — not a prescription for deadly drugs.

In parts of the U.S. where physician-assisted suicide is legal, insurance companies have refused to pay for patients’ medical care, but have offered to cover assisted-suicide drugs.

And we have heard stories of patients in Europe and Canada being denied care or actively euthanized because of bad government policies.

That’s why Family Council has worked hard to block assisted-suicide legislation in Arkansas.

In 2019 and 2021, Arkansas lawmakers wisely rejected deeply flawed end-of-life measures that fundamentally disrespected the right to life. Family Council worked closely with our friends in the legislature to stop these proposals.

Just like abortion, euthanasia and assisted suicide are murder, and they violate the sanctity of human life.

Being pro-life means believing innocent human life is sacred from conception until natural death.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.