Fayetteville Judge Blocks Part of State Law Protecting Children From Explicit Library Material

On Friday U.S. District Judge Timothy Brooks blocked two sections of Act 372 of 2023, a good law that generally prohibits giving or sending a child harmful material that contains nudity or sexual activity.

The law also eliminates exemptions for libraries and schools in the state’s obscenity statute, and it clarifies how library patrons can work to remove objectionable material from a library’s catalog.

The law was slated to take effect August 1, but in June a coalition of libraries in Arkansas led by the ACLU filed a lawsuit challenging portions of Act 372.

Friday’s preliminary injunction prevents the State of Arkansas from enforcing Section 1 and Section 5 of Act 372.

Section 1 of Act 372 makes it a Class A misdemeanor to give or send a child harmful material that contains nudity or sexual activity.

Section 5 of Act 372 clarifies how library patrons can work to remove objectionable material from a library’s catalog.

The ruling did not affect Sections 2, 3, 4, and 6 of Act 372, which eliminate exemptions for schools and libraries in the state’s obscenity statute, address inappropriate material in public school libraries, and permit the disclosure of certain library records.

Family Council has heard repeatedly from people who are deeply troubled by obscene and inappropriate children’s books that some librarians have placed on the shelves of their local libraries.

For example, the Jonesboro public library has been at the center of multiple controversies over its decision to place books with sexually explicit images in its children’s section and its failure to adopt a policy that separates sexual material from children’s content.

The library in Jonesboro went so far as to post on Facebook that it isn’t the library’s responsibility to protect kids from obscenity.

Other public libraries in Arkansas have failed to separate sexual material from children’s material as well.

Some of the people who testified publicly against Act 372 last spring signaled that they want to be free to share obscene material with children at a library. That simply isn’t right.

Act 372 is a good law that will help protect children in Arkansas. We believe higher courts will recognize that fact and ultimately uphold this law as constitutional.

Tech Giants, ACLU Try to Block Arkansas’ Social Media Safety Act in Court

Tech giants and the ACLU are fighting in court to block Arkansas’ Act 689, the Social Media Safety Act of 2023.

The Social Media Safety Act is a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris).

It requires major social media companies to use age verification to ensure minors do not access social media platforms without parental consent.

The law contains protections for user privacy, and a social media company that violates the law can be held liable.

Act 689 narrowly cleared the Arkansas Senate last spring, but received strong support in the Arkansas House of Representatives. Governor Sanders signed it into law following its passage.

On June 29 the trade association NetChoice filed a lawsuit in federal court in Arkansas on behalf of its members, which include tech giants such as Meta (owner of Facebook and Instagram), Twitter, Snapchat, Pinterest, and TikTok.

The lawsuit alleges that Arkansas’ Social Media Safety Act is unconstitutional and should be struck down.

On July 14 the ACLU of Arkansas filed a proposed amicus brief supporting NetChoice’s lawsuit and opposing Act 689.

The ACLU’s amicus brief claims,

Requiring individuals to verify their ages before using social media will impose significant burdens on the exercise of First Amendment rights online. [The Social Media Safety Act] will rob people of anonymity, deter privacy- and security-minded users, and block some individuals from accessing the largest social media platforms at all. Additionally, imposing a parental consent requirement on access for young people will impermissibly burden their rights to access information and express themselves online, stigmatize the use of social media, and run counter to the parental authority of parents who do not object to their kids using social media.

The truth is the Social Media Safety Act respects parental authority by prohibiting social media companies from registering children as users without parental consent. Age verification and parental consent requirements for social media companies simply do not violate the First Amendment.

News reports have highlighted time and again how social media giants serve teens a steady “diet of darkness” online.

Despite employing tens of thousands of content moderators, TikTok’s algorithm repeatedly has been shown to inundate teenagers with videos about eating disorders, body image, self-harm, and suicide.

In February the American Psychological Association’s Chief Science Officer told the U.S. Senate Judiciary Committee that social media use heightens the risk of negative influences among adolescents, and that young people are accessing social media sites that promote eating disorders and other harmful behavior.

Social psychologist Jonathan Haidt has published an analysis determining that social media is a major cause of mental illness in girls.

And a recent CDC report found 16% of high school students were electronically bullied in 2021 through texting, Instagram, Facebook, or other social media platforms.

Social media companies are owned and operated by adults. Given how harmful social media content can be, the adults running these tech companies should not be able to let children use their platforms without parental consent. Arkansas’ Social Media Safety Act helps address this serious problem.


Social Media Serves “Diet of Darkness” to Teens: WSJ

A recent Wall Street Journal column highlights how social media giant TikTok is serving teens “a diet of darkness” online.

Julie Jargon writes,

A recent study found that when researchers created accounts belonging to fictitious 13-year-olds, they were quickly inundated with videos about eating disorders, body image, self-harm and suicide.

If that sounds familiar, a Wall Street Journal investigation in 2021 found that TikTok steers viewers to dangerous content. TikTok has since strengthened parental controls and promised a more even-keeled algorithm, but the new study suggests the app experience for young teens has changed little.

The article goes on to offer examples of harmful content directed at teens, including content that encourages suicidal ideation, eating disorders, and other dangerous behavior.

All of this underscores that Arkansas’ lawmakers did the right thing this year by passing legislation to regulate social media use among minors.

In April the state’s General Assembly passed S.B. 396, the Social Media Safety Act, by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris). The law requires social media companies to use age verification to ensure minors do not access social media platforms without parental consent.

The measure contains protections for user privacy, and a social media company that violates the law can be held liable.

S.B. 396 narrowly cleared the Arkansas Senate, but received strong support in the Arkansas House of Representatives. Governor Sanders signed it into law following its passage.

More and more, we hear stories illustrating how social media platforms host content that isn’t suitable for children. The adults who operate these platforms should not be able to register children as users without parental consent. Laws like S.B. 396 help address this serious problem.