Facebook’s newly launched warnings about “extremist content” have triggered a backlash from those who worry that the alerts represent a social-media crackdown on politically disfavored speech.
Facebook’s pop-up campaign, rolled out Thursday, includes messages telling some users that “you have been exposed to harmful extremist content recently” and asking if they are “concerned that someone you know is becoming an extremist,” in both cases linking to a “Get Support” page.
“Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others,” says the notification.
Those concerned about Facebook’s latest effort to counter “misinformation” include Republican lawmakers such as Colorado Rep. Lauren Boebert, who said she received an alert on her page.
“Facebook just warned me that I may have been subjected to extremist content and asked me to report anyone I may know that is becoming an extremist,” tweeted Ms. Boebert. “I have more than 200 coworkers I need to report.”
Others, including Rep. Thomas Massie, Kentucky Republican, took a dig at the tech giant by adding sarcastic frames to their Facebook profile photos with the message, “Exposing Friends to Extremist Content.”
Mr. Massie said Facebook “ran out of posts to show me after they filtered out the content from my friends on vaccine reactions, alternate Covid treatments, election investigations, and extremist patriotic programming.
“One of my moderate democrat friends told me long ago, back when there were moderate democrats, ‘The definition of an extremist is someone who is completely consistent,’” tweeted Mr. Massie. “I plead guilty as charged by Facebook this week.”
Other critics responded by accusing Facebook of acting as the “Ministry of Truth” and “thought police,” or by creating tongue-in-cheek Facebook pages with names like “Now That’s Some Harmful Extreme Content.”
“I have a real concern that some leftist technocrats are creating an Orwellian environment where people are being arbitrarily silenced or banned for saying something the ‘thought police’ doesn’t like,” tweeted Virginia state Del. Nick Freitas, a Republican.
Rep. Ken Buck, Colorado Republican, tweeted: “Big Tech is acting like Big Brother.”
Facebook described the pop-ups as a test under its Redirect Initiative, which “helps combat violent extremism and dangerous organizations by redirecting hate and violence-related search terms towards resources, education, and outreach groups that can help.”
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” a Facebook spokesperson told Reuters. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”
Clinton accuser Juanita Broaddrick blasted Facebook CEO Mark Zuckerberg for a warning she said she received about being “exposed to extremist content” for sharing a post from the satirical Babylon Bee site on Fox News host Tucker Carlson.
“Hey, Zuckerberg……This is ignorant. This popped up after I posted the Babylon Bee tweet about Tucker. What the heck??” Ms. Broaddrick wrote on Facebook.
The Babylon Bee post shared by Ms. Broaddrick was headlined: “‘We At The NSA Are Not Spying On You,’ Insists Muffled Voice Coming From Tucker Carlson’s Toaster.”
Not surprisingly, the Babylon Bee weighed in with a post headlined: “Facebook Warns Anyone Attending 4th Of July Fireworks That They May Have Been Exposed To Extremist Activity.”
Alex Berenson, author of the “Unreported Truths About COVID-19” series, shared a message from a Facebook follower saying that his post about teachers’ unions — “Time to pull out and get our tax dollars out of the system and pay a weekly fee for teachers” — triggered an alert.
“Yeah, I’m becoming an extremist. An anti-@Facebook extremist,” tweeted Mr. Berenson. “Who do they think they are? Either they’re a publisher and a political platform legally liable for every bit of content they host, or they need to STAY OUT OF THE WAY. Zuck’s choice.”
In May, Facebook launched a campaign “taking stronger action against people who repeatedly share misinformation on Facebook.”
“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” said Facebook in a May 26 post.
The Washington Times has reached out to Facebook for comment.