Aemula is now live at aemula.com! Explore the platform and join us on our mission to support community-governed, independent journalism. Claim your 1-month free trial by creating an account today!
“There can be no liberty for a community which lacks the means by which to detect lies.”
— Walter Lippmann
Traditionally, media platforms have been forced to accept a tradeoff between content quality and creator liberty. Community-driven endeavors must contend with a wide distribution of output, ranging from exceptional contributions by expert writers to disturbingly harmful content from bad actors. The goal of curation, whether by a traditional publication or a social media platform, is to amplify quality while mitigating harm. But where do you draw the line?
Corporate networks and legacy publications maintain complete control over the moderation decisions on their platforms, but typically outsource fact-checking to independent third parties, an inherently flawed approach. As with any small group of people who have self-selected into a profession, fact-checkers inevitably hold their own biases, and they are often tasked with judging material that resists a simple right-or-wrong verdict. While these firms provide recommendations based on their determinations, they ultimately lack the authority to enforce the removal of content. Ultimate control, and accountability, rests with the owners of the platforms.
This system has allowed platforms to hide behind the “fact-checked” label while wielding unchecked censorship power. The result has been the weaponization of fact-checking, where the practice is frequently used to drive narratives rather than uncover truths.
Like any centralized governance structure, fact-checking operations struggle to process large amounts of diverse information with high accuracy. Systems controlled by small groups are prone to bottlenecks that hinder their effectiveness at scale. Currently, the only credible alternative is to pass platform governance on to the community itself.
The success of community moderation has been proven through real-world use cases such as the increasing reliability of Wikipedia and the effectiveness of Community Notes. Most recently, Meta announced its decision to migrate to a Community Notes-style process for fact-checking content. The community approach leverages collective knowledge, balances out errors, and promotes full transparency into moderation decisions. Most importantly, it reduces the centralization of power that fuels the debate on censorship.
With the advent of decentralized protocols, it is now possible for communities to efficiently govern themselves at scale, which dissolves the false dichotomy between liberty and quality and fosters a healthier ecosystem for the distribution of information.
This week, Aemula spotlights writers who have experienced censorship firsthand and critically examined the tradeoffs of a centralized fact-checking process. Their deep insights shed light on how we can begin to implement effective moderation in our increasingly interconnected digital world without tipping the scales towards censorship.
KLIPNEWS
Written by Ken Klippenstein, a former reporter for The Intercept and DC correspondent for The Nation who went independent to report more openly on national security, alongside Editor-in-Chief William M. Arkin, a George Polk Award-winning and Pulitzer Prize-nominated journalist whose previous work includes The New York Times, The Washington Post, the LA Times, and NBC News, and investigative reporter Dan Boguslaw, whose work has appeared in publications such as Rolling Stone, Vice, and The Intercept.
“The threats to free speech posed by these and other Meta policies are real and cut against Zuckerberg’s purported desire to stand up to government censorship. Guess how Meta decides what constitutes “dangerous organizations”? By relying on the U.S. State Department’s list of terrorist groups, per a Human Rights Watch report detailing the platform’s systemic censorship of discourse on the Gaza war. For the high crime of merely interviewing Hamas officials to get them on the record, my former Intercept colleagues Ryan Grim and Jeremy Scahill over at Drop Site News have had their reporting removed by Meta.
This is supposed to be “restoring free expression”?”
Silver Bulletin
Written by Nate Silver, a statistician and renowned election forecaster, professional poker player, and author of The Signal and the Noise.
“The notion of “fact-checking” as a separate subfield within journalism has always been strange. Fact-checking has long been an essential part of every journalist's job to the point where it doesn’t really need a name. A writer or reporter might be permitted some clunky prose or an unclear thesis in the first draft of their story. But they’re expected to have made every effort to get the facts right from the outset.
Meanwhile, there always have been such things as professional fact-checkers, but the term had a different connotation than how it’s used today. For instance, even though I had both a great editor and a great research assistant, I also hired a “fact checker,” paid out of my own pocket, for On the Edge, who focused especially on the parts of the book that could be legally or politically sensitive. He was incredibly meticulous, having a sixth sense for cases where, say, a fact was correctly cited from the source material, but the source was wrong. Again, though, this was part of the editorial process before the copy went to press — rather than having the fact-checker as some sort of journalistic high priest or arbitrator.”
Reasonable People
Written by Tom Stafford, a cognitive scientist at the University of Sheffield and author of Mind Hacks.
“Despite limitations, in speed and reach, and vulnerabilities to biases or abuse, Community Notes is a fundamentally legitimate approach to moderation. The community of users provide and vote on Notes. Making it clear why and how fact-checks appear engenders trust in platforms which are too often opaque or perceived as biased. The problems of polarisation, speech moderation and misinformation are bigger than any crowd-sourced approach alone can handle, but that’s not a flaw of Community Notes as a system in itself.
Large platforms may currently be using Community Notes as a smokescreen for neglect of their responsibilities to our information environment, but that shouldn’t stop us recognising the value and potential for future development here. Collective intelligence is enhanced when the wisdom of the crowd is structured and channeled. Making transparent that algorithms do this, and building in a commentary layer on top of the raw feed of social media posts, are both hugely positive steps.”
Interested in cross-posting your content to Aemula? Reach out to writers@aemula.com to get started! With cross-posting, you can:
instantly expand your audience
increase your earnings
maintain full ownership of your work
Plus, you will have the opportunity to access community resources and grants to support the content you want to create! Cross-posting comes with no costs, no obligations, and you can stop or remove your content at any time.
We have launched the first open version of the Aemula platform, now live at aemula.com! Users can claim a 1-month free trial by simply creating an account today!
If you want to support any of the writers we spotlight, we highly encourage you to subscribe to their individual publications. If you want to support independent journalism more broadly, we offer both paid and free subscriptions for you to stay informed!
All subscription revenue is reinvested directly into the independent journalism community.
Follow us on X to stay up-to-speed on platform updates and the writers we are adding to our community!
We would love to hear your thoughts on our mission in the comments!
Have you taken our news sentiment survey yet? If not, let us know what you think of the current state of media in the U.S. It is 10 questions and should only take a minute (literally 60 seconds)!