Mark Zuckerberg’s decision to end content moderation on Facebook and stop suppressing political content marks a watershed moment in the platform’s chaotic evolution. It is disgraceful in its way, but it may also amount to an admission that whatever moderation the company was doing was a half-hearted sham.
The controversial move is widely seen as a capitulation to the right, as it mirrors Elon Musk’s actions following his 2022 acquisition of Twitter (now X), when he reinstated banned right-wing agitators, many of whom had been deplatformed for hate speech and disinformation. Musk’s stated rationale was championing “free speech,” but in reality it was transparent political pandering, indifferent to the cost.
Now Zuckerberg is following suit. By ending efforts to suppress disinformation and political extremism, Facebook is signaling its alignment with the political winds of the moment, and it’ll be seen as an effort to curry favor with President-elect Donald Trump. Meta raised eyebrows by joining other tech giants in donating $1 million to Trump’s inaugural fund.
The rationale for moderating content was clear. Studies have consistently found that false and inflammatory material spreads faster, and draws more engagement, than sober reporting.
Platforms like Facebook and Twitter profit from the engagement generated by divisive content, even as they foment radicalism while claiming to foster discourse. By dropping moderation, Facebook is signaling to radicals that they are free to propagate their views unchecked, further amplifying the platform’s role as a megaphone for extremism.
As for the political bias, it is sad but true that the perception that disinformation originates from the right is not unfounded: Right-wing populism thrives on visceral appeals to fear and anger, which are tailor-made for social media algorithms. But as a centrist writer with what I believe is a more nuanced perspective, I can report that my own experience tells a fuzzier story.
I write a newsletter and various columns that, while critical of Donald Trump and the Republican stance on guns, healthcare, climate change, and other issues, are equally skeptical of leftist excesses. I’ve defended immigration controls, criticized wokeness, and condemned left-wing flirtations with jihadism. My commentary calls for active intervention against extremism in all its forms.
On Facebook, I maintained a modest writer’s page where I shared articles and TV appearances. Two months ago, I discovered it had been “temporarily” blocked for unspecified “community guidelines violations.” Days later, the block was extended to 3,600 days, nearly a full decade. To add insult to injury, my interface language inexplicably switched to Russian, which I do not speak, with seemingly no permanent way to revert it to English (and the settings interface for finding one is, of course, now in Russian).
With the help of a former Facebook executive, I navigated the labyrinthine system to find the “feedback page,” which informed me that blocks occur when AI flags something or when a complaint is lodged. The bot asked if I was satisfied with this explanation. When I replied that I was not, I received an automated “thank you,” and that was it. There was no one else to contact; Facebook’s customer service, if it can even be called that, is non-existent. The idea of appealing to the company’s global Oversight Board is a joke.
As the cherry on top of this rancid cake, when I tried to promote my newsletter on Instagram, where I have not yet been blocked, the ad was rejected—likely due to the newsletter’s political content—but I was still charged. The receipt came from
This Kafkaesque experience highlights three key truths about Meta’s system: it’s not just the right wing that gets abused; the enforcement mechanisms are arbitrary and idiotic; and the company cares only about users’ data. The platform manifestly does not care about your business or the justice of your case. Such concerns do not drive their business model.
There is a clear universal lesson here: For small businesses and creators, becoming dependent on Meta under these conditions, for revenue or ads or impact, is dangerous folly. The platform’s randomness, lack of accountability, and abysmally indifferent customer service make it a terrible partner to an almost comical degree. I struggle to imagine a way to make it worse. My writer’s page was a minor casualty, but for anyone whose livelihood depends on Meta, the consequences could be devastating.
So what now? Clearly Facebook’s efforts to be responsible were a sham. But on the other hand, by dropping content moderation, the site will further enable the spread of dangerous rhetoric.
What’s left to be done with social media in general?
Well, we could still demand transparency in algorithms, and punish companies that knowingly push radicalism and falsehoods. Platforms should be required to implement robust mechanisms for identifying and blocking real trolls and bots; those exist and are a major problem, while newsletters that dabble in centrist geopolitics are not. The best solution for individuals, though, is to disengage. Use these platforms, if you must, for cute pet photos, family vacations, and reconnecting with old friends. And never depend on any of them for your business or your state of mind.
Dan Perry was the top AP editor in Europe and Africa, in the Middle East, and in the Caribbean, and was chairman of the Foreign Press Association in Jerusalem. He writes frequently on world affairs, technology, and media, and is the author of two books on the Middle East. Follow him at danperry.substack.com.