Comment and Technology

Content moderation offers little actual safety on Big Social Media

Whether social media sites police their platforms using humans or algorithms, content moderation isn't keeping users safe, says Jess Brough

By Jess Brough

12 March 2025


Illustration: Adrià Voltà

Since Meta announced an end to third-party fact-checking, claiming it was freeing itself from “societal and political pressure to moderate content”, social media users have questioned the value of content moderation. Is it an important tool for protecting the safety and integrity of a platform, or a systematic method of censorship? In my view, content moderation merely represents a broken system: the insatiable requirement of human sacrifice for the sake of technological advancement, and its use as a veil to obscure the profit-driven motives of social media companies.

Content moderation has been embedded in the legal framework of online platforms…

