Protecting Americans From Malign Foreign Influence

As social media becomes more integral to our daily lives, the need to protect Americans from foreign adversaries online has become increasingly urgent. Bad actors, like the Chinese Communist Party (CCP), have taken advantage of the growing U.S. footprint online by utilizing Chinese-owned apps, like TikTok, to target, surveil, and manipulate Americans.

Although TikTok executives claim the company does not share any data collected by the app, several Chinese laws in place provide CCP officials access to all user data collected by Chinese-owned tech companies, like TikTok. This means the CCP has access to sensitive data, like the location of every TikTok user worldwide, including the over 210 million Americans who have downloaded the app.

I’ve received a handful of classified security briefings regarding the data collected by apps like TikTok, and it’s been made abundantly clear that the U.S. must act quickly to protect the American people from our adversaries online, whether on TikTok or any other app controlled by the CCP, Russia, Iran, Venezuela, or North Korea. I voted yes to pass the Protecting Americans From Foreign Adversary Controlled Applications Act, bipartisan legislation to prohibit U.S. app stores and web services from hosting apps controlled by U.S. foreign adversaries. 

Despite deliberate fear-mongering campaigns from TikTok, this bill does not outright ban TikTok or other social media platforms. Instead, it gives TikTok a clear choice: sever ties with its CCP-managed parent company or face prohibition in U.S. app stores and web-based services. The bill allows TikTok and other similar apps six months to divest from their CCP-based entities and sell to a buyer without ties to another authoritarian regime. The bill in no way restricts free speech or violates the First Amendment; it regulates business conduct related to national security concerns and does not regulate or limit the type of content being posted.

Americans’ phones are being used as weapons against us, and I will not sit by and let that happen. The bipartisan support behind this bill should be a testament to the very real national security threat of apps controlled by our adversaries like the CCP. This bill is a targeted approach to address this threat and get the CCP, Iran, North Korea, and other bad actors out of the pockets of millions of Americans. I urge my colleagues in the Senate to advance it expeditiously.

Congressman Westerman represents Arkansas’ Fourth Congressional District. This article originally appeared here.

Federal Legislation Would Ban TikTok or Force Its Sale as Concerns Grow Over the Social Media App

The Wall Street Journal recently reported legislation in Congress would ban the popular social media app TikTok in the U.S. or force its Chinese-based parent company to sell the platform. President Biden reportedly backs the measure. Congress is expected to vote on the bill today.

With an estimated one billion users worldwide and 150 million in the U.S., TikTok is considered by some to be the most popular social media platform in the world — especially among teens and young adults.

However, the company has struggled to protect private user data from entities in China, and the platform has faced criticism for letting its algorithm serve users what some call a steady “diet of darkness” online.

That’s part of the reason the Arkansas Attorney General’s office has sued TikTok and Meta — the company that owns Facebook and Instagram — arguing, among other things, that the platforms’ algorithms force-feed many children a non-stop diet of objectionable content. Much of the evidence cited in the A.G.’s lawsuits corroborates what reporters at the Wall Street Journal and elsewhere have found.

One of the A.G.’s lawsuits alleges that TikTok failed to fully disclose that TikTok is subject to Chinese law — including “laws that mandate secret cooperation with intelligence activities of the People’s Republic of China.”

Another lawsuit calls the TikTok app “a Chinese ‘Trojan Horse’ unleashed on unsuspecting American consumers,” and notes that “tens of millions of minors use TikTok in the United States.”

The Arkansas Attorney General’s office writes that, once children are on the TikTok app, TikTok’s algorithm “force-feeds” many of them a non-stop diet of objectionable content — including sexual content, nudity, and violence.

The lawsuit also alleges that much of this content is available to teenagers even when they use the app’s Restricted Mode, which is intended to filter inappropriate material.

Last year the Arkansas Legislature passed the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent. A social media company that violated the law could be held liable.

In response, tech giants — such as Facebook, Instagram, Twitter, and TikTok — sued to strike down the Social Media Safety Act as unconstitutional.

Last summer U.S. District Judge Timothy Brooks in Fayetteville issued an order blocking the State of Arkansas from enforcing the Social Media Safety Act.

Among other things, Judge Brooks’ ruling claims that Arkansas’ Social Media Safety Act is unconstitutionally broad and vague, and that most social media content is not “damaging, harmful, or obscene as to minors.”

The truth is there’s mounting evidence that — by design — social media platforms like TikTok may deliberately push objectionable content to kids and put users’ personal information at risk. With that in mind, it’s good to see policymakers taking action to rein in these tech giants.

Articles appearing on this website are written with the aid of Family Council’s researchers and writers.

WSJ Investigates “How TikTok’s Algorithm Figures Out Your Deepest Desires”

The Wall Street Journal recently published an investigative report titled “How TikTok’s Algorithm Figures Out Your Deepest Desires.”

With an estimated one billion users worldwide and 135 million in the U.S., TikTok is considered by some to be the most popular social media platform in the world.

The app relies on a specialized algorithm to suggest videos to users. As users watch certain types of videos, the algorithm makes a point to recommend similar videos in the future — a process sometimes called “rabbit holing.”

The Wall Street Journal’s video delves into how “rabbit holing” works on TikTok — and why users should be concerned.

All of this comes as elected officials in New York are pushing for restrictions on social media algorithms. Lawmakers reportedly have offered legislation that would prohibit social media platforms from using algorithms to serve content to children without parental consent.

Last year the Arkansas Legislature passed the Social Media Safety Act — a good law by Sen. Tyler Dees (R – Siloam Springs) and Rep. Jon Eubanks (R – Paris) requiring major social media companies to ensure minors don’t access social media platforms without parental consent.

In response, tech giants — such as Facebook, Instagram, Twitter, and TikTok — sued to strike down the Social Media Safety Act as unconstitutional.

Last summer U.S. District Judge Timothy Brooks in Fayetteville blocked the State of Arkansas from enforcing the Social Media Safety Act. Among other things, Judge Brooks’ ruling claimed that Arkansas’ Social Media Safety Act is unconstitutionally broad and vague, and that most social media content is not “damaging, harmful, or obscene as to minors.”

Unfortunately, researchers have found the algorithms on social media platforms like TikTok actually serve teens what some call a steady “diet of darkness” online.

That’s part of the reason the Arkansas Attorney General’s office has sued TikTok and Meta — the company that owns Facebook and Instagram — arguing, among other things, that the platforms’ algorithms force-feed many children a non-stop diet of objectionable content. Much of the evidence cited in the A.G.’s lawsuits corroborates what reporters at the Wall Street Journal and elsewhere have found.

As we have said before, there’s mounting evidence that — by design — social media algorithms like TikTok’s deliberately push objectionable content to kids and put users’ personal information at risk. With that in mind, it’s good to see policymakers taking action to rein in these tech giants.