ADF Files Amicus Brief in Lawsuit Over Forrest City Firefighter Terminated for Voicing Pro-Life Views

The Alliance Defending Freedom recently filed a friend-of-the-court brief with the U.S. Court of Appeals for the Eighth Circuit as part of a lawsuit over Forrest City’s decision to terminate a firefighter in 2020 for expressing pro-life views online.

ADF explains:

Steven Melton, a firefighter for the city of Forrest City, Arkansas, was let go from the fire department after he posted on social media expressing his pro-life views. Melton is challenging the city for censoring his views in the public square. . . .

The following quote may be attributed to Alliance Defending Freedom Senior Counsel Tyson Langhofer, director of the ADF Center for Academic Freedom, regarding a friend-of-the-court brief ADF attorneys filed Thursday on behalf of The Douglass Leadership Institute, The Radiance Foundation, and Speak for Life at the U.S. Court of Appeals for the 8th Circuit in the case Melton v. City of Forrest City, in which a firefighter in good standing with the city had his employment terminated after he posted a pro-life image on social media:

“All Americans should be free to express viewpoints and ideas without fear of government intervention. When the government decides which topics are appropriate for debate, we all lose. As we explain in our brief, the First Amendment’s absolute bar on viewpoint discrimination protects the full-bodied discussions necessary for representative democracy to function. If the government monopolizes the marketplace of ideas, organizations like The Douglass Leadership Institute, The Radiance Foundation, and Speak for Life, which stand for life, especially Black communities that are disproportionately affected by abortion, cannot speak without fear of government reprisal. We urge the 8th Circuit to reverse the lower court decision and allow free speech to flourish for all.”

Alliance Defending Freedom is the world’s largest legal organization committed to protecting religious freedom, free speech, the sanctity of life, marriage and family, and parental rights. ADF’s amicus brief provides additional background on the Forrest City case:

In June 2020, like countless Americans, Steve Melton took to Facebook to share his views on important topics. He posted an illustration of the black silhouette of a baby in the womb with a noose around his or her neck. App.147; R. Doc. 36 at 1. The caption read, “I can’t breathe!” Id. Melton—an evangelical Christian—made the post to express his opposition to abortion. App.49; R. Doc. 21-1 at 2. Melton’s friend later told him he found the post offensive because he perceived the image as a black baby with a noose around his or her neck. App.80; R. Doc. 21-4 at 5. So Melton deleted the post. Id. That should have ended the matter.

But instead of allowing citizens to dialogue on their differences, the government stepped in. It decided that it didn’t like Melton’s views. App.50–51; R. Doc. 21-1 at 3–4. Mr. Melton served as a firefighter for the Defendant City of Forrest City. App.48–49; R. Doc. 21-1 at 1–2. And the City found that his opinion—expressed on his personal social media while off duty—meant he could no longer work as a firefighter, despite an “exemplary” record. See App.49; R. Doc. 21-1 at 2.

The district court allowed the government to do something it could otherwise almost never do—punish a citizen for his private speech.

The amicus brief goes on to argue that disagreeing with the content of someone’s speech does not mean the government has the power to “restrict ‘disfavored or unpopular speech,’” and that “the government cannot allow hecklers to veto public employee views, including on abortion.”

You can read more about this case here. You can download a copy of ADF’s amicus brief here.

WSJ Investigates “How TikTok’s Algorithm Figures Out Your Deepest Desires”

The Wall Street Journal recently published an investigative report titled “How TikTok’s Algorithm Figures Out Your Deepest Desires.”

With an estimated one billion users worldwide and 135 million in the U.S., TikTok is considered by some to be the most popular social media platform in the world.

The app relies on a specialized algorithm to suggest videos to users. As a user watches certain types of videos, the algorithm recommends more and more similar videos — a process sometimes called “rabbit holing.” The sketch below illustrates the basic idea.
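TikTok’s actual recommendation system is proprietary, so the following Python sketch is purely a hypothetical illustration of the feedback loop behind “rabbit holing”: every video a user watches nudges future recommendations toward that same kind of content. The video list, topic labels, recommend function, and explore_rate parameter are all invented for the example and do not reflect TikTok’s real code.

```python
from collections import Counter
import random

# Hypothetical illustration of a "rabbit holing" feedback loop.
# Nothing here reflects TikTok's actual (proprietary) system.

VIDEOS = [
    {"id": 1, "topic": "cooking"}, {"id": 2, "topic": "sports"},
    {"id": 3, "topic": "sadness"}, {"id": 4, "topic": "cooking"},
    {"id": 5, "topic": "sadness"}, {"id": 6, "topic": "sports"},
]

def recommend(watch_history, videos, explore_rate=0.1):
    """Pick the next video: usually more of the user's most-watched
    topic, occasionally a random video (exploration)."""
    if not watch_history or random.random() < explore_rate:
        return random.choice(videos)
    # Find the topic the user has watched most often so far.
    top_topic = Counter(v["topic"] for v in watch_history).most_common(1)[0][0]
    pool = [v for v in videos if v["topic"] == top_topic]
    return random.choice(pool)

# Simulate a user session: every watch counts as positive engagement,
# so the feed narrows toward whichever topic gets watched early on.
history = []
for _ in range(20):
    history.append(recommend(history, VIDEOS))
print(Counter(v["topic"] for v in history))
```

Because every watch feeds the next recommendation, a few early views of one topic quickly crowd out everything else in the simulated feed. That self-reinforcing loop, run at the scale of a real platform, is the dynamic the Journal’s investigation documents.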

The Wall Street Journal’s video delves into how “rabbit holing” works on TikTok — and why users should be concerned.

All of this comes as elected officials in New York are pushing for restrictions on social media algorithms. Lawmakers reportedly have offered legislation that would prohibit social media platforms from using algorithms to serve content to children without parental consent.

Last year the Arkansas Legislature passed the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent.

In response, tech giants — such as Facebook, Instagram, Twitter, and TikTok — sued to have the Social Media Safety Act struck down as unconstitutional.

Last summer U.S. District Judge Timothy Brooks in Fayetteville blocked the State of Arkansas from enforcing the Social Media Safety Act. Among other things, Judge Brooks’ ruling claimed that Arkansas’ Social Media Safety Act is unconstitutionally broad and vague, and that most social media content is not “damaging, harmful, or obscene as to minors.”

Unfortunately, researchers have found that the algorithms on social media platforms like TikTok serve teens what some call a steady “diet of darkness” online.

That’s part of the reason the Arkansas Attorney General’s office has sued TikTok and Meta — the company that owns Facebook and Instagram — arguing, among other things, that the platforms’ algorithms force-feed many children a non-stop diet of objectionable content. Much of the evidence cited in the A.G.’s lawsuits corroborates what reporters at the Wall Street Journal and elsewhere have found.

As we have said before, there’s mounting evidence that social media algorithms like TikTok’s are designed to push objectionable content to kids and put users’ personal information at risk. With that in mind, it’s good to see policymakers taking action to rein in these tech giants.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.