Believe Anything; Believe Nothing
What compels us to believe something is true? In an age where photographs can be fabricated, film can be manipulated, and speeches are crafted to deceive, our traditional markers of truth have lost their footing. So, the question becomes: what do you look to as your measure of what is real?
I recently came across a post claiming that the newly released Epstein files prove Donald Trump is a pedophile. It presented what appeared to be a detailed set of documents, emails, perhaps, describing an encounter between Trump and a thirteen-year-old girl allegedly brought to his hotel by Jeffrey Epstein himself. I’ll admit I only skimmed it. Reading anything on Facebook demands serious vetting, at least for me. But as I scrolled through, I couldn’t help thinking of the Trump haters who would devour it without a second thought, because when it comes to belief, bias often does the heavy lifting.
And I’m not exempt from that. I catch myself gravitating toward reports that frame certain Trump decisions as sound or even shrewd—not necessarily out of admiration for the man, but out of something more like desperate optimism. Call it hopium, if you must. I simply want something, anything, to be going right out there. So, when a report suggests that a particular move was intelligent or calculated, I feel a wave of relief, and I tend to believe it. Though not always. I do make a habit of looking for corroborating sources before I fully buy in.
What can we actually believe? Am I dismissing the pedophile story simply because believing it would force me to despise Trump? And am I accepting the more flattering accounts of his decision-making for reasons that are purely emotional, dressed up as logic? Or is it genuine common sense telling me one story is implausible and the other more grounded? The trouble is, I’m fairly certain the people who believe the pedophile story, the Trump haters, are equally convinced they’re applying common sense. So, whose common sense is more reliable, and by what measure? Nobody can answer that clearly.
And this is where we find ourselves in an AI-infested world: our old tools for sorting truth from fiction have been quietly retired. There was a time when the information we encountered came from sources that had earned at least a baseline of trust. Journalism once operated by a strict code, no story ran without multiple corroborating sources, on-record confirmation, editorial oversight, and a clear chain of verification. Photographs were considered credible. Film was considered ironclad, nearly impossible to fabricate in any convincing way. Those days are gone.
Mainstream news has largely squandered whatever credibility it once held. And AI has finished off the rest. Any image of any person, doing or saying anything imaginable, can now be generated on demand. Which brings me to something else I recently came across: a photograph of Bill Clinton in a dress and tiara, bent over a table, surrounded by people doing things I won’t describe in detail lest I be accused of pornography. Real? A few years ago I would have said almost certainly not. Now, genuinely, I’m not sure. And that uncertainty is the point, because there are people who looked at that same image and said, “obviously true, it fits everything I know about him,” while an equal number said, “absurd, I refuse to believe that about Bill Clinton.” Both camps decided instantly, and neither questioned themselves for a moment. That is a deeply troubling place for a society to be.
So, then, what actually makes something believable? Is it purely bias, a narrative we’ve already committed to, which determines in advance what “the impossible” looks like? Or is there still a role for common sense, the instinct that rejects certain conclusions simply because the world, however dark, cannot be quite that broken?
I also recently watched a clip of Bill Maher doing something rather remarkable — walking back his long-held mockery of QAnon’s claims about elite-level child trafficking and pedophilia. Maher, who spent years dismissing those claims as paranoid fantasy, acknowledged that the evidence of systemic child sexual abuse among powerful people is real enough that he can no longer laugh it off. He has come, reluctantly, to believe it. And yet, in nearly the same breath, he drew a firm line at cannibalism. Eating children, drinking their blood? No. That’s where Bill gets off the train.
But here’s what I find genuinely fascinating about that distinction: what, exactly, is the difference? If you’ve already accepted that powerful, celebrated, presumably sane individuals are systematically sexually abusing children—why does adding cannibalism to the picture suddenly strain credulity? At what point on that spectrum of depravity does the brain say, “too far”? And what does it tell us about the nature of belief itself that the line exists at all?
And that, perhaps, is the most honest answer any of us can offer. Belief has a ceiling, a point beyond which even the most committed mind refuses to go. That ceiling is different for every person, shaped by upbringing, experience, temperament, and yes, politics. What one person considers an obvious truth, another considers unhinged fantasy. And in an AI-manipulated world, nobody—not you, not me, not Bill Maher—has a reliable map anymore.
There is a concept worth mentioning here: confirmation bias. It isn’t new, and it isn’t unique to any particular tribe. The human brain is wired to seek out information that reinforces what it already suspects to be true, and to unconsciously discount what challenges it. We have always done this. But there was a time when the sheer scarcity of fabricated information acted as a natural brake on the process. You couldn’t just manufacture a convincing photograph, a credible document, or a realistic film clip. The effort required was enormous, and the sources producing information were finite enough to be monitored and challenged.
That brake no longer exists. The floodgates are open, and what pours through is an undifferentiated torrent of the real, the doctored, the partially true, the completely invented, and the strategically misleading, all formatted to look identical. Your Facebook feed does not distinguish between a Reuters dispatch and a basement fabrication. Your eyes cannot tell a genuine photograph from a generated one. And your gut, that old faithful compass, has been so thoroughly manipulated by years of targeted content that it may now be pointing in whatever direction an algorithm has decided is best for your engagement metrics.
So where does that leave us? It leaves us, I think, with only a handful of tools worth trusting, and none of them are passive. The first is ruthless sourcing, not just checking where a story comes from, but asking who benefits from you believing it, and why it is appearing in front of you right now. The second is a tolerance for uncertainty, which is the willingness to say “I don’t know yet” rather than filling the gap with whatever feels satisfying. The third, and perhaps the hardest, is self-suspicion. The moment a story feels deeply gratifying, the moment it perfectly confirms your worst fears about your political enemies or your most hopeful fantasies about your allies, that is precisely the moment to slow down.
I started this with a simple question: what compels us to believe something is true? I don’t have a clean answer. Nobody does anymore. But I do know this: the people most confidently certain of what is real right now are almost certainly the most lost. The rest of us, stumbling through the fog with our skepticism intact and our certainty appropriately rattled, may be the closest thing left to clear-eyed.
Believe carefully. That’s the best any of us can do.



I've had formal and professional training in narrative setting, persuasion in mass media, and political campaign messaging. One of my media communications professors, Kathleen Hall Jamieson, went on to co-found factcheck.org. She's now at UPenn's Annenberg School and has moved on to specialize in "medical science"-focused communications. You know, the "follow the science" crowd of scolds, censors, and propagandists who pushed The Science (TM) of the plandemic: the pseudoscience of behaviorism, narratives that distort perception and perspective to provoke a 'desired' behavior change.
She taught me a lot, though that was much earlier in her career. The saying 'Those who can, do. Those who can't, teach.' comes to mind. She now 'does' as much as she teaches. At least for a time she was happy to just teach, and she shared many of the tools of the dark arts used by psychological operators.
I took a circuitous path after graduation and didn't work anywhere near the field until I started working with politicians a decade later; occasional campaign work exposed me to many masters of the craft.
I can't say I ever mastered it myself, but I do have more familiarity than most, and I'm able to recognize tools from the toolkit I learned existed.
For the longest time I thought that made me pretty immune to the effects of narrative manipulation. And I largely was. A whole lot of bullshit never got through my defenses.
It wasn't until about fifteen years ago that I began to sense my defenses were flawed, that I had a glaring weak spot in them. Call it 'my-teamism.' I could spot the deceptions and manipulations emanating from 'their team.' *They* weren't going to get anything past me. *They* rarely did.
It was a slow, overdue realization that 'my team' had been pulling the wool over my eyes for most of my life. I didn't want to admit it, fought the slowly unfolding epiphany as long as I could. Until the truth of it was undeniable: I had been mentally sucker-punched by 'my team' for most of my life. I trusted 'my team.' Foolishly.
It's easy to spot the lies and deceptions from 'their team.' We are weak at spotting them from those we identify with as like-minded. Our biases are the weakest link in our mental armor; they cloud our discernment.
I've learned that 'their team' is often a source of more truthful information than 'my team.' 'They' will be the ones who tell me I have food stuck in my teeth, or that my clothes make me look fat. Unpleasant things 'my team' might not say. But are true.
We must always be skeptical of information that has an agenda behind it. From 'their' information sources, and from 'our' information sources. *Especially* from 'our' information sources. Our mental armor is already well-trained and on alert against 'their' sources; the weak spot in our armor is 'our' sources. Meaning we must question 'our team' much more rigorously.
If we seek truth.
For some (most?) this level of vigilance is too hard, too exhausting. It's easier to trust a side, or to declare it all lies, than to discern truth through rigorous examination of all sides. Most choose the path of least resistance and pick a team to trust. Some drop out entirely, stop caring, and declare it all lies. The number of us who seek truth in an untruthful world is frighteningly small.
The narrative engineers know this about the human mind. And exploit it masterfully. Like my old professor.
“The masses have never thirsted after truth. They turn aside from evidence that is not to their taste, preferring to deify error, if error seduce them. Whoever can supply them with illusions is easily their master; whoever attempts to destroy their illusions is always their victim.”
~ Gustave Le Bon