Disinformation

Preview

And it’s insidious. And it has lasting effects.

It starts with watching people you know begin to sound like they’re living in a different reality.

First, it was the “smoking guns”—the links and screenshots and grainy videos proving, supposedly, that they knew something the rest of us didn’t. The posts about how we were just sheep, swallowing “fake news,” too gullible or too brainwashed to see what was “really” going on. Then, slowly, that same lens started showing up everywhere: in how they talked about elections, protests, vaccines, neighbors, fraud, whole communities. We’ll see about the Epstein files. That one could be a unifier.

And now, it’s in the comment sections under stories about what ICE is doing in Minnesota. The way people casually assume that if the state or the feds are using that level of force, there must be something more to the story—something bad enough to justify it. The way they leap to, “Well, we don’t know what they did,” or “They wouldn’t be treated like that if they were innocent.” Underneath it is the same disinformation‑shaped reflex: that power is benevolent, that cruelty is probably just justice in a sharper outfit, that if an institution is acting brutally, the brutality is evidence that the targets deserved it.

One version I encountered recently was, “We aren’t the experts here, we don’t know.” But some of us are and some of us do. This is why I wrote about knowing earlier this week. I wanted to understand knowing before I looked into how we’re being deliberately manipulated to not know.

This is not me posting another conspiracy theory. This is me—the person who double‑majored in Communication Studies and Women’s, Gender, and Sexuality Studies at Macalester College—deconstructing how we got here: how language like “fake news,” curated media ecosystems, and engineered narratives about “who deserves what” have remapped people’s sense of what is real, what is credible, and what counts as cruelty at all. This isn’t new to most of us, but I wanted to put it all together in one place and tie it to our nervous systems and the damage it’s doing to us.

Disinformation as Engineered Reality

Disinformation is not just “a different opinion” or a few bad facts floating around. It is engineered reality: information that is false, distorted, or selectively framed, created and spread on purpose to shape what people believe, feel, and do. It is not an accident that certain narratives are everywhere at once; disinformation campaigns are designed to circulate within specific communities, using their values, fears, and identities as the fuel.

What makes disinformation so effective is that it doesn’t come in waving a red flag that says “I am a lie.” It comes wrapped in emotional resonance and identity, as if we can feel it in our bones: “People like us already know this.” It helps us feel like we belong. It comes disguised as skepticism, as independence, as “doing your own research,” while quietly telling you whom to trust, whom to hate, and whom to ignore.

Scratching the Psychological Itch

When someone posts their “smoking gun,” they are often in the sweet spot where several psychological forces converge:

  • Confirmation bias: New “evidence” is exciting precisely because it fits what they already fear or want to be true.

  • Identity protection: Beliefs fuse with identity—being the kind of person who sees through the lies—so changing a belief now feels like tearing away part of the self.

  • Social reward: Shares, likes, and “Exactly!” comments become a feedback loop; they are not just right, they are applauded for being right.

There’s also a deep emotional payoff in the narrative that “everyone else is a sheep except us.” It relieves uncertainty and helplessness. Instead of sitting with “This is complex and I don’t fully understand it,” there is the rush of clarity: a villain, a plot, a frame that makes everything make sense. Once that story takes root, it becomes the filter through which all incoming information is interpreted.

Put the “Critical” in “Critical Thinking”

There is a ruse of critical thinking woven through all of this.

On the surface, “doing your own research” and chasing “smoking guns” look like critical thinking. They wear its clothes. They use its words. But they are doing almost the opposite of what actual critical thinking—and especially more formal education—trains us to do. Additional education does not magically make people smarter or more moral; what it tends to give us is more context, more methods, and more practice in holding complexity without immediately reaching for the most emotionally satisfying answer. It gives us habits like checking sources, comparing accounts, asking who benefits, looking for patterns over time instead of chasing the loudest single datapoint.

A smoking gun, on the other hand, offers a bypass. It tells you, “You don’t need all that context. You don’t need to wrestle with ambiguity or keep track of contradictory evidence. This one image, one quote, one leaked document explains everything.” It relieves you of the discomfort of not knowing yet. It relieves you of the slow, sometimes boring work of tracing systems, histories, and material conditions. It lets you feel like you are doing advanced discernment—when what is really happening is that your brain has been handed an emotionally charged shortcut that locks the story into place and slams the door on further inquiry.

The tragedy is that once you’ve been sold that version of “critical thinking,” real critical thinking can start to feel suspect or even elitist. Nuance feels like evasion. Context feels like spin. Asking for more information feels like you’re watering down “the truth.” But underneath that reaction is the nervous system’s reliance on the bypass: it has learned that the rush of a smoking gun is safer and more gratifying than the uncertainty of genuine analysis. So the person who follows the bypass feels like the only one brave enough to see clearly, when in reality, they’ve just been offered the intellectual equivalent of a conspiracy‑flavored fast‑food meal—instantly filling, incredibly compelling, and designed to keep them coming back instead of cooking something slower, healthier, and harder in their own kitchen.

Disinformation v. Our Bodies and Minds

From a neuro‑biopsychosocial perspective, disinformation is not just about individual bad beliefs; it is about how entire systems of perception and meaning are re‑wired.

Biologically, repeated exposure to emotionally intense, outrage‑based content sensitizes the nervous system. The body learns to live in a state of chronic alert: anger, disgust, fear, contempt. That heightened arousal narrows attention, making people more drawn to simple, black‑and‑white explanations and less able to tolerate nuance or ambiguity.

Psychologically, the internal storyline shifts. It becomes, “I am constantly lied to by institutions,” “I can’t trust anyone outside my group,” or “The world is controlled by hidden forces.” Those narratives shape mood, self‑talk, and daily choices. They can increase anxiety, cynicism, and aggression, while making genuine curiosity and openness feel unsafe or foolish.

Socially, this remapping rearranges relationships and information ecosystems. People unfollow or cut off those who challenge the new narratives. They move into media spaces and social groups that reinforce and reward the disinformation, which makes it feel more and more like common sense. At that point, you’re not just holding a false belief—you are living inside a reality where that belief feels like the only sane option.

Example: Layoffs in a Time of Turmoil

One thing disinformation is very good at is offering us an emotionally satisfying story about why we are hurting.

If you, like me, have been laid off after the economy tanked under the new administration, prices went up, tariffs hit supply chains, and your organization started cutting, there are at least two big narrative paths your brain can take. One path is highly personal and local: “My bosses are cruel, incompetent, or out to get me; this is about them and me.” That taps into very normal attribution patterns—humans tend to over‑explain events in terms of individuals’ character and decisions rather than broader situational forces. In that frame, it feels cleaner to direct anger at the people you see and name (“management,” “HR,” “the CEO”), and disinformation can amplify this by feeding you stories about greedy elites, woke HR, or whatever villain matches your existing frustrations.

The other path zooms out and locates more of the causal weight in the economic environment: interest rates, tariffs, inflationary pressures, policy choices, sector‑specific shocks, and how those combine to squeeze organizations until they start cutting jobs. That story is less emotionally tidy because it forces you to sit with structural forces and uncertainty—your bosses might still be imperfect, but they are also operating inside conditions they did not fully choose. It asks you to tolerate the idea that multiple things can be true at once: leadership may have made preventable mistakes and the macro‑economy is a huge driver of the layoff.

I choose to take that path.

Disinformation tends to nudge us toward whichever explanation best serves the people spreading it: sometimes that means blaming “corrupt bosses,” “the Man,” or “the admin”; sometimes it means blaming “lazy workers” or “immigrants.” The underlying pattern is the same—give people a clear, blameable target so they feel less helpless. Making a conscious distinction between “this is about my organization’s choices” and “this is about the larger economic system I’m caught in” is part of reclaiming your own meaning‑making instead of letting the loudest narrative (or the most convenient villain) do that work for you.

“Fake News” as a Weaponized Concept

Enter “fake news.” Originally, that phrase described the deliberate production of false or misleading content dressed up as journalism. But very quickly, “fake news” was flipped and weaponized into a rhetorical bludgeon: not “this piece of content is fabricated,” but “anything that contradicts my preferred narrative is automatically fake.”

That flip was not an accident and it was not subtle. It turned a useful critique of low‑quality or deceptive content into a catch‑all dismissal of entire institutions—news organizations, experts, researchers, anyone who might provide a shared factual baseline. It created a one‑step move: you don’t have to consider evidence, methodology, or track records; you just have to say the magic words “fake news” and the whole thing disappears in a puff of distrust.

When people I knew started calling everything outside their chosen ecosystem “fake news,” what I was watching was not independent thinking. It was the installation of a cognitive shortcut: if it’s from them, it’s fake; if it’s from us, it’s truth. No further evaluation required. That is incredibly effective, because once you accept that shortcut, the only sources left to shape your reality are the ones inside the walled garden.

How Fox and “Truth” Platforms Became the New Town Square

The result of all that rhetorical and psychological groundwork was that certain media outlets could reposition themselves as the only trustworthy town square. “Everyone else is lying to you; we’re the only ones telling you the truth.” Once that seed is planted, every correction, every investigative piece, every inconvenient fact from outside can be dismissed as part of the corrupt “fake news” machine.

From there, the creation of explicitly branded “truth” platforms—naming a platform after Truth itself is not subtle—becomes the next logical step. If the mainstream public square is cast as irredeemably fake, you build an alternative square where “truth” is whatever the in‑group and its leaders declare it to be. You fortify that square with algorithms, social reinforcement, and rhetoric that not only questions mainstream sources, but mocks and demonizes them.

This is why it can feel almost impossible to “send an article” to someone and have it land. If the entire category of journalism or expert testimony has been redefined as “fake,” then the content never even enters the evaluation stage. It dies at the gate, refused entry by a concept that has been deliberately repurposed to protect the disinformation and the identities built around it.

How This Remaps “Knowing” Itself

As someone trained to think about communication, discourse, and power, what stands out to me is that this isn’t just a fight about what is true; it’s a fight about how we decide what counts as truth in the first place.

Disinformation plus “fake news” rhetoric rewires epistemology—the way we know—by shifting the criteria from:

  • Evidence, methods, and fallibility (we could be wrong; let’s test this)

    to

  • Identity, emotion, and allegiance (does this feel right to my group? does it make me feel like I’m on the right side?).

That shift changes the entire experience of encountering new information. If “truth” is what my side says, then disagreement becomes treason, correction becomes persecution, and curiosity becomes weakness. The nervous system learns that staying aligned with the in‑group narrative equals safety and belonging, while questioning it equals isolation or attack.

That is why, when you try to talk to someone who has been living in this ecosystem, it can feel like you’re not just disagreeing about facts—you’re threatening their sense of self, their social world, and their nervous system’s map of what is safe.

Exercise: Practicing Our Way Back

If disinformation plus “fake news” rhetoric helped remap us, the way back is repetition, not one perfect take.

1. Internal

  • Notice one post or “smoking gun” that feels deliciously confirming.

  • Ask: “If this came from the other side, how hard would I interrogate it? What would I check?”

  • Let “I don’t know yet” be an acceptable place to land, even briefly.

2. Relational

  • Choose one person deep in disinformation.

  • Use a simple script: “I care about you; I’m not going to argue [topic] like this, but I’m still here for you.”

  • Redirect to something human and boring on purpose.

3. Structural

  • Spend two minutes asking about one favorite outlet: “How do they make money? Do they publish corrections?”

  • Add one source that shows its homework. Mute or unfollow one that mainly flatters what you already think.

The point here is not that one side is pure and the other is uniquely corrupt. The point is that there has been a remarkably effective, multi‑layered effort to convince people that all shared reality is fake except the version curated by their chosen ecosystem—and that effort has used our very human needs for certainty, belonging, and meaning as the raw materials.

This is not another conspiracy theory; it is the opposite. It is naming the mechanisms so they become visible, so we can see how they have been working on us and around us. Once we can see the machinery, we have a chance—just a chance—to step out of autopilot and start participating in the remapping on purpose.
