
The Feed Is the Lie

The most dishonest story in modern media is the claim that misinformation became dangerous because ordinary people were suddenly too free to speak. That story flatters the gatekeepers, excuses the platforms, and treats the public like children who wandered into traffic without adult supervision. The truth is much uglier and much simpler. 

Misinformation became industrialized when social media stopped being a trust network and became a behavior-modification system. Meta says the Facebook Feed in the Home tab is a mix of connected content and recommended content ranked by machine learning, while X says its default “For you” timeline blends followed accounts, topics, and recommended posts. This means the largest platforms are openly acknowledging that the default social media experience is no longer primarily governed by who you choose to trust.

Social media was supposed to be a trust graph. The whole point of following someone was to tell the platform whose judgment you wanted in front of you. When somebody proved unreliable, you could remove them from your world, and their influence over you ended with a click, which was a wonderfully efficient form of accountability before Silicon Valley decided your attention was too valuable to leave in your own custody. Now the feed routes around your choices. 

Unfollowing one account does not solve anything when the platform keeps serving you the same kind of content from other accounts because the machine has decided that your nervous system twitches at the right keywords. A recent IPPR analysis found just 18% of users’ top posts came from someone they actually knew, while 35% came from influencers, public figures, or recommended content, and 29% came from ads and brands. In plain English, social media has become a place where strangers perform, advertisers stalk, and the people you actually trust get buried under the rubble.

Once that architecture took over, misinformation was always going to flourish because the ranking objective is attention, not accuracy. Meta says its systems predict what users will find most valuable and relevant, then assign relevance scores before interspersing recommended content and ads, while X says it recommends posts based on signals such as popularity and network interaction, and aims to show content it thinks users will find meaningful. Neither company describes truth as the organizing principle because truth is slow, stubborn, and not especially lucrative, whereas outrage pays on time. Research has found that users are 1.91 times more likely to share links to negative news articles on Facebook and X, and The Wall Street Journal reported that Facebook’s own internal presentation warned that its algorithms exploit the human brain’s attraction to divisiveness. So when the loudest, sloppiest, most inflammatory operators dominate the feed, that is the system doing exactly what it was built to do.
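
The mechanics are easy to caricature. The sketch below is a hypothetical illustration in Python, with invented signals and weights rather than anything drawn from Meta's or X's actual systems, but it captures the structural point both disclosures concede: every candidate post gets a score built from predicted reactions, and nothing in the objective asks whether the post is true.

```python
# Hypothetical sketch of engagement-first ranking. The signals and
# weights are invented for illustration -- not Meta's or X's code.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    followed: bool       # does the viewer follow this author?
    p_click: float       # model-predicted probability of a click
    p_reshare: float     # model-predicted probability of a reshare
    p_comment: float     # model-predicted probability of a comment

def relevance_score(post: Post) -> float:
    # Weighted sum of predicted reactions. Note what is absent:
    # no term for truthfulness, no penalty for being wrong.
    return 1.0 * post.p_click + 4.0 * post.p_reshare + 2.0 * post.p_comment

def build_feed(candidates: list[Post], limit: int = 50) -> list[Post]:
    # Followed and recommended posts compete in one ranked pool, so an
    # inflammatory stranger can outrank someone the viewer chose.
    return sorted(candidates, key=relevance_score, reverse=True)[:limit]
```

Notice that the `followed` flag never gates the candidate pool, which is why unfollowing an account changes your inputs without changing the objective.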

That is why the war on misinformation was doomed the moment these platforms handed the feed over to recommendation systems and then pretended a moderation team could clean up the fallout. Once the platforms seized control of distribution, truth stopped being the governing principle and emotional reaction took its place, which is exactly how you build a system where the most inflammatory content keeps finding oxygen. Silicon Valley built a feed that rewards provocation, then put on a hall monitor badge and assured the public it was here to restore order. 

The public was then asked to believe that labels, demotions, removals, and selective bans would solve the problem. The evidence says otherwise. A Harvard Kennedy School Misinformation Review study found that tweets Twitter flagged with warning labels spread farther than comparable unlabeled tweets, while tweets blocked from engagement on Twitter remained popular on Facebook, Instagram, and Reddit, where they were posted more often and gained more visibility than messages that had only been labeled or received no intervention at all.

Deplatforming has fared no better. A peer-reviewed PNAS Nexus study on ban-driven migration from Twitter to Gettr found that banned users had higher retention on Gettr and were five times more active there than a matched cohort, suggesting that deplatforming can intensify loyalty and concentration rather than dissolve influence. In other words, the crackdown does not restore trust, does not repair the feed, and does not end misinformation. It relocates it, dramatizes it, and hands bad actors a fresh layer of martyrdom to monetize, because the internet has always had a weakness for forbidden fruit and a near-spiritual inability to mind its own business.

This is why the answer will never come from a cleaner version of the same broken model. A machine-ranked feed that overrides human choice cannot be trusted to repair the social damage created by machine-ranked feeds that override human choice. The answer is to return control to the user and rebuild social media around trust, accountability, and chosen relationships. That is the principle behind Pickax: we make the user the algorithm. The people you choose to trust shape your experience, which means repeat offenders can lose access to your attention in a way that actually matters. Free people, ordered by trust and responsibility, are better stewards of their own attention than corporations whose business model depends on hijacking it.
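
To make the contrast concrete, here is an equally hypothetical sketch of the same feed-building step under a trust-first model. It illustrates the principle, not Pickax's actual code: the only ranking inputs are the user's own trust assignments, so withdrawing trust removes an author's reach outright instead of rerouting it.

```python
# Hypothetical sketch of a trust-governed feed -- an illustration of
# the principle, not Pickax's implementation. Distribution is gated
# by explicit user trust, so revoking trust actually removes reach.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    posted_at: float     # unix timestamp

def build_trust_feed(posts: list[Post], trust: dict[str, float],
                     limit: int = 50) -> list[Post]:
    # Only authors the user has explicitly trusted are eligible at all.
    eligible = [p for p in posts if trust.get(p.author, 0.0) > 0.0]
    # Rank by user-assigned trust, newest first within equal trust.
    eligible.sort(key=lambda p: (trust[p.author], p.posted_at), reverse=True)
    return eligible[:limit]

def revoke_trust(trust: dict[str, float], author: str) -> None:
    # Accountability is a dictionary write: set trust to zero and the
    # author's influence over this user's feed ends immediately.
    trust[author] = 0.0
```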

So yes, the present system is working by design, just not by the design the public was sold. If the stated goal had truly been to reduce misinformation, the results would be an embarrassment. If the real goal was to maximize engagement, preserve institutional control over distribution, and give platforms broad discretion to intervene after their own systems amplified the very content they claim to oppose, then the outcome makes perfect sense. The feed is the engine. The censorship regime is the public relations department. Society gets the chaos, the platforms keep the power, and then everyone is told this is the price of safety, which is a remarkably polished way of describing a business model built on surrendering your judgment to the same people who already broke the thing.

Editor’s Note: Do you enjoy PJ Media’s conservative reporting that takes on the radical left and woke media? Support our work so that we can continue to bring you the truth.

Join PJ Media VIP and use promo code FIGHT to receive 60% off your membership.
