Unmasking propaganda, decoding the spin, reclaiming reality

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society.”

— Edward Bernays, Propaganda (1928)

How They Program What You See, Think, and Feel

Rebel Lawson

November 27, 2025 at 3:08:19 PM

Half of Americans now get their news from social media feeds built by a handful of companies, tuned by algorithms no one votes for, and quietly ‘advised’ by governments and intelligence‑linked experts. Tonight on The Perception War, we’re pulling receipts on how old‑school propaganda, concentrated media ownership, and your favorite apps fused into one system designed not just to inform you, but to manage what you believe is real.



Most people think propaganda is something governments do “over there” in enemy countries. But for decades, U.S. intelligence agencies have waged information campaigns aimed at shaping what Americans themselves see as reality, first through newspapers and television, and now through the social media feeds on their phones.


In the early Cold War, the CIA quietly built a stable of trusted reporters, editors, and media owners who helped push approved narratives and bury inconvenient facts. The loose label “Operation Mockingbird” grew up around this network, but behind the mythology sits a hard core of documented influence: the Senate’s Church Committee in the 1970s confirmed that the agency maintained dozens of covert relationships with American journalists and many more with foreign media outlets used to relay propaganda that often flowed back into U.S. news.
The point wasn’t just spying; it was perception management. By steering coverage of the Cold War, coups, and covert actions, the intelligence community could make controversial operations look inevitable, justified, or simply invisible.  The public believed it was seeing independent reporting, when in reality some of the sources, story angles, and headlines had been nudged into place by people with security clearances.


After the Church Committee exposed these practices, the CIA pledged to stop using accredited U.S. journalists as paid assets except in “extraordinary” cases approved at the highest levels.  That sounded like a clean break, but the fine print left a lot of room: the restrictions were policy, not a blanket law, and they did not apply to contractors, cut‑outs, friendly think tanks, or foreign outlets that could still feed narratives back into the American media ecosystem.


Meanwhile, the media landscape itself was changing in ways that made perception management easier. From the 1980s onward, research shows a steady concentration of U.S. media ownership, as local papers were swallowed by chains and independent broadcasters merged into a few conglomerates. By the 2000s, a small cluster of corporations and overlapping institutional investors controlled large swaths of newspaper circulation, cable news, and radio, creating fewer editorial centers where decisions about what counts as “big news” are made.


This consolidation hollowed out local reporting and made it simpler for national narratives to roll over communities that no longer had strong independent outlets to push back.  When a handful of companies and financiers sit on top of most of what people watch or read, it takes far fewer well‑placed pressure points—advertising, access, regulation, or national security briefings—to keep certain topics framed in a narrow way.


Then came the real revolution: the feed. Today, just over half of U.S. adults say they at least sometimes get news from social media, with platforms like Facebook and YouTube each serving as regular news sources for around a third of the population. Younger adults lean even harder into digital channels, with TikTok, Instagram, and YouTube emerging as primary news gateways for Gen Z and younger millennials.


The effect is that for a large share of the country, “the news” is no longer a broadcast you choose at 6 p.m., but a personalized, endless stream curated by algorithms designed to maximize engagement. Instead of an editor deciding which three stories lead the evening, opaque ranking systems decide what each person sees first, what repeats, and what never surfaces at all. These algorithms can be tweaked in countless subtle ways—downranking, “friction,” context labels—without any public hearing or front‑page correction.
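To make the mechanics concrete, here is a deliberately toy sketch of how engagement ranking and quiet downranking interact. Everything in it is invented for illustration—the scoring weights, the `rank_feed` function, and the `penalty` knob are hypothetical, not any platform's actual system, which would use machine-learned models rather than fixed weights.

```python
# Toy model of an engagement-ranked feed with a hidden "downrank" knob.
# All names, weights, and data are illustrative assumptions, not a real API.

def rank_feed(posts, downranked_topics, penalty=0.1):
    """Order posts by a crude engagement score, quietly multiplying
    scores for downranked topics by a small penalty factor."""
    def score(post):
        s = post["likes"] + 2 * post["shares"] + 3 * post["comments"]
        if post["topic"] in downranked_topics:
            s *= penalty  # invisible suppression: no label, no notice to the user
        return s
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topic": "sports",   "likes": 50, "shares": 5,  "comments": 10},
    {"id": 2, "topic": "politics", "likes": 90, "shares": 30, "comments": 40},
]

# With "politics" downranked, the less engaging sports post leads the feed.
print([p["id"] for p in rank_feed(posts, {"politics"})])  # → [1, 2]
```

The point of the sketch is that a single multiplier, changed server-side, reorders what millions of people see first—no post is removed, nothing is labeled, and the user has no way to detect the adjustment.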


Governments noticed. In the 2010s and 2020s, security agencies around the world began talking openly about “information warfare,” “cognitive warfare,” and the need to fight disinformation on social platforms.  Those concerns are not fake; foreign states, extremist groups, and scammers do weaponize social media to mislead and manipulate.  But in the United States, efforts to counter these threats slid into something uncomfortably close to the old Mockingbird logic: if you can’t directly censor, you lean on the platforms that control the feeds.


Court filings and reporting around the Murthy v. Missouri case laid out a pattern in which federal officials regularly contacted major platforms about specific posts and narratives they viewed as misinformation, especially on elections and Covid‑19. Lower courts initially treated some of this as potential “jawboning”—unconstitutional pressure that uses the threat of regulation or political retaliation to push private companies into suppressing speech the state cannot lawfully ban itself. The Supreme Court ultimately dismissed the case on standing grounds, sidestepping the core question of where coordination ends and coercion begins, but the record still exposed a dense web of back‑channel communications between government officials and content moderators.


Internal document releases deepened the picture. The so‑called Twitter Files showed regular meetings between Twitter’s trust‑and‑safety staff and U.S. security agencies, during which officials flagged accounts and posts as disinformation or threats, and Twitter adjusted moderation in response.  Separate email caches revealed close coordination between Facebook and the Centers for Disease Control and Prevention over how to label or limit Covid‑related content, with the platform asking health officials for guidance on what should count as misinformation.  Meta CEO Mark Zuckerberg later told Congress that White House officials had pressured the company to remove certain Covid posts and expressed frustration when the company refused to go as far as requested.


Civil‑liberties groups and First Amendment scholars argue that this ecosystem of “requests,” briefings, and special portals for flagging content risks recreating the old pattern of state‑shaped narratives, only now routed through tech firms instead of newsroom payrolls.  Policy analyses describe how even without explicit threats, platforms have strong incentives to comply with official preferences to avoid regulation, public scolding, or liability changes, blurring the line between independent moderation and outsourced censorship.


At the same time, foreign and domestic political actors have learned to exploit the same systems for their own perception campaigns, using bots, micro‑targeted ads, influencer networks, and now AI‑generated content to flood the zone with emotionally charged stories and images. Studies of recent election cycles warn that this constant firehose of tailored information, half‑truths, and outrage can shift beliefs and harden polarization even when people encounter fact‑checks later.


Put together, this is the perception war. The ingredients are familiar from the Church Committee era—classified programs, intelligence talking points, government priorities, corporate incentives—but the delivery system has changed. Instead of a few editors taking phone calls, there are now automated feeds tuned to hold attention, a concentrated ownership structure above them, and a network of official and quasi‑official actors competing to steer what each of us sees first, most, and as “authoritative.”


The answer is not to retreat into conspiracy or to trust nothing, but to recognize the battlefield. News literacy experts emphasize basic defenses that anyone can practice: pause before sharing; separate straight reporting from opinion or PR; look for primary documents when a story matters; and compare coverage across outlets with different owners and incentives.  In a landscape where the old Mockingbird tactics have been upgraded for the algorithmic age, the simple act of asking “who benefits from me believing this, and who chose for me to see it?” is no longer paranoia—it is survival.
