Skeptic.com feed
Popular Science. Nonpartisan. Reality-Based.

Iranians Are Rejecting Theocracy: The Islamic Republic’s Unintended Legacy

Sat, 01/17/2026 - 4:04pm
The Vote 

On March 30–31, 1979, Iranians went to the polls. The ballot contained a single question: Should Iran become an Islamic Republic? The choices were “Yes” (Green) or “No” (Red). The official result: 98.2% voted Yes.1

Fifty-Eight Days Earlier 

On February 1, 1979, Ayatollah Khomeini returned to Iran after fourteen years in exile. Millions filled the streets of Tehran—estimates range from two to five million.2 But the man they cheered was a carefully constructed image. During the flight, Khomeini remained secluded in the upper deck of the chartered Boeing 747, praying.3 When the plane landed, he chose to be helped down the stairs by the French pilot rather than his Iranian aides, a calculated move to prevent any subordinate from sharing the spotlight.4

He chose his first destination deliberately: Tehran’s main cemetery, where those who died during the revolution were buried. The crowd was so dense his motorcade could not pass; he took a helicopter instead.5 By speaking among the graves, Khomeini positioned himself as the guardian of those who died in the revolution and as someone who would fulfill what they had sacrificed for. 

In the weeks that followed, Khomeini offered both material goods and spiritual salvation. He promised free electricity, free water, and housing for every family. Then he added the caveat that would define the coming era: “Do not be appeased by just that. We will magnify your spirituality and your spirits.”6

A Coalition of Contradictions 

The crowd that greeted him was not a monolith, but a coalition of contradictions. Marxists marched hoping for a socialist future free of American influence. Nationalists and liberals sought constitutional democracy. The devout sought governance by Sharia—and for them, the revolution was holy war: the Shah represented taghut, the Quranic term for tyrannical powers that lead people from God, and those who died fighting him became shahid, martyrs. 

Khomeini managed these competing visions by keeping his actual plans vague. He spoke of freedom, justice, and independence, terms each faction could interpret as it wished.7 His blueprint for clerical rule, Velayat-e Faqih, remained in the background. Abolhassan Bani-Sadr, who would become the Islamic Republic’s first president, later recalled: “When we were in France, everything we said to him he embraced and then announced it like Quranic verses without any hesitation. We were sure that a religious leader was committing himself.”8 Khomeini himself would later state: “The fact that I have said something does not mean that I should be bound by my word.”9

Ayatollah Mahmoud Taleghani casts his vote in the March 1979 Islamic Republic referendum.

The Empty Phrase

Now, let’s return to the ballot. 

A republic places sovereignty in the people. Citizens choose their laws. An Islamic state places sovereignty in God, but not “God” in some abstract, philosophical sense. The God of the Islamic Republic is specifically Allah as understood in Shia Islam: a God who communicates through the Quran, whose will was interpreted by the Prophet Muhammad, then by the twelve Imams, and now (in the absence of the hidden Twelfth Imam) by qualified Islamic jurists. This is not a deist clockmaker or a personal spiritual presence. This is a God with specific laws, specific requirements, and specific men authorized to speak on His behalf. 

So, what did God want? The ballot never said. 

The 1979 Iranian Islamic Republic referendum ballot showing the “نه” (No) option in red. Voters chose between a simple yes or no on whether Iran should become an “Islamic Republic”—a phrase containing no constitution, no enumerated rights, and no definition of which Islamic laws would apply or who would interpret them.

“Islamic Republic” contained no details. No constitution, no enumerated rights, no definition of which Islamic laws would apply or who would interpret them. Voters were not choosing a specific system of government. They were choosing a phrase, and trusting that its meaning would be filled in later by men they believed spoke for God. 

For those paying attention, there were clues. Khomeini had written extensively about Velayat-e Faqih (the Guardianship of the Islamic Jurist), a system in which a senior cleric would hold supreme authority as God’s representative on Earth. He had lectured on it in Najaf. He had published a book.10 But in the noise of revolution, in the flood of promises about free electricity and spiritual elevation, these details were background static. The crowds were not voting on constitutional theory. They were voting on hope.

Ninety-eight percent voted Yes. Forty-seven years later, we can measure what exists in Iranian society.

Religious Faith 

For this case study to be valid, we must establish a baseline. Was Iranian society already irreligious before 1979, or has religiosity declined under the theocracy? 

Available evidence suggests the latter. 

In 1975, a survey of Iranian attitudes found over 80% of respondents observing daily prayers and fasting during Ramadan. The methodology is not fully documented in accessible sources.11 However, the broader historical record supports the baseline: the 1979 revolution mobilized millions under explicitly Islamic banners, clerical figures commanded genuine social authority, and the Iranian government’s own 2023 leaked survey found 85% of respondents saying society has become less religious than it was.12 Forty-seven years later, mosques are empty.

Official Iranian census data reports 99.5% of the population as Muslim.13 This figure measures legal status, not belief. Under Iranian law, a child born to a Muslim father is automatically registered as Muslim, and leaving Islam carries severe legal consequences. While formal executions for “apostasy” are relatively rare—the regime prefers to charge dissidents with crimes like “Enmity against God” or “Insulting the Prophet”—the threat is sufficient to enforce public silence.

Saadatabad district, Tehran, January 8, 2026: A mosque burns amid protests. (Source: Press Office of Reza Pahlavi)

In June 2020, the Group for Analyzing and Measuring Attitudes in Iran (GAMAAN) surveyed over 50,000 respondents using methods designed to protect anonymity.14

Results: 

  • 32.2% identified as Shia Muslim 
  • 22.2% selected “None” 
  • 8.8% identified as Atheist 
  • 7.7% identified as Zoroastrian 
  • 5.8% identified as Agnostic 
  • 1.5% identified as Christian 

While this online sample skews urban (93.6% vs. Iran’s 79%) and university-educated (85.4% vs. 27.7% nationally), the magnitude of divergence from official statistics—32% Shia vs. 99.5% in census data—is too large to explain through sampling bias alone. Meanwhile, face-to-face surveys suffer the opposite problem: when GAMAAN asked respondents if they’d answer sensitive questions honestly over the phone, 40% said no.15
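To see why the gap is too large for sampling bias to explain, consider a deliberately extreme back-of-envelope bound (an illustration for this article, not GAMAAN’s methodology): grant the official census every possible advantage by assuming the survey’s 32.2% Shia figure applies only to the university-educated stratum the sample overrepresents, and that everyone outside that stratum is a practicing Shia Muslim.

```python
# A minimal sketch of the bound described above. Assumptions (not from
# GAMAAN): the 32.2% Shia share holds only for the university-educated
# stratum, and 100% of everyone else is Shia -- the scenario most
# favorable to the official 99.5% census figure.

educated_share = 0.277       # national university-educated share (cited above)
shia_if_educated = 0.322     # GAMAAN sample share identifying as Shia Muslim
shia_if_not_educated = 1.00  # extreme, census-friendly assumption

implied_national_share = (
    educated_share * shia_if_educated
    + (1 - educated_share) * shia_if_not_educated
)
print(f"Implied national Shia share: {implied_national_share:.1%}")
# -> Implied national Shia share: 81.2%
```

Even under this maximally census-friendly assumption, the implied national share tops out near 81%, still nowhere near the official 99.5%.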

A striking outcome of this study is that Iran has only about 25,000 practicing Zoroastrians in a total population of around 92.5 million (roughly 0.03 percent), yet 7.7% of respondents selected this identity. Researchers interpret this as “performing alternative identity aspirations”—claiming pre-Islamic Persian heritage to reject imposed Islamic identity.16

The key findings are, however, clear: 44.5% selected a non-Islamic category when asked their current religion and 47% reported transitioning from religious to non-religious during their lifetime. 

The second figure suggests active deconversion rather than inherited secularism. 

In 2024, a classified survey by Iran’s Ministry of Culture and Islamic Guidance (conducted in 2023) was leaked to foreign media.17 This data provides a comparison point from within the regime itself. 

Indicator                                  2015     2023
Support separating religion from state     30.7%    72.9%
Pray “always” or “most of the time”        78.5%    54.8%
Never pray                                  3.1%    22.2%
Never fast during Ramadan                   5.1%    27.4%

The same survey found 85% of respondents said Iranian society had become less religious in the previous five years. Only 25% reported trusting clerics. 

Based on my years of closely following Iranian society, the pace of religious abandonment has accelerated significantly since the 2022 “Woman, Life, Freedom” uprising. The leaked government data confirms this trajectory: the sharpest shifts in prayer and fasting occurred within the 2015–2023 window, with 85% saying society had grown less religious in just the previous five years. 

In February 2023, senior cleric Mohammad Abolghassem Doulabi stated that 50,000 of Iran’s approximately 75,000 mosques had closed due to low attendance, a claim partially corroborated by the leaked government survey finding only 11% always attend congregational prayers.18

Election participation has also declined. Official turnout in the June 2024 presidential election was 39.93%, the lowest in the Islamic Republic’s history.19

The Evidence on the Streets 

The data on paper is corroborated by the specific vocabulary of the street. The protest chants have evolved from requesting reform to rejecting the entire theological framework. 

Art by Hamed Javadzadeh — Woman, Life, Freedom Movement (2022)

Consider the chant: “Neither Gaza nor Lebanon, I sacrifice my life for Iran.” 

This is a direct rejection of the regime’s core ideology. The Islamic Republic prioritizes the Ummah—the transnational community of believers—over the nation-state. By rejecting funding for Hamas and Hezbollah in favor of national interests, protesters are secularizing their priorities: the Nation has replaced the Faith as the object of ultimate concern. 

Even more specific is the chant: “Death to the principle of Velayat-e Faqih.” 

The protestors are not merely calling for the death of the dictator (Khamenei); they are targeting the specific theological doctrine that grants him legitimacy. They are rejecting the very concept of divine guardianship. 

But the most striking evidence of the revolution’s failure is the return of the name it sought to erase. In a historical irony that defies all prediction, crowds now chant “Reza Shah, bless your soul,” and call upon Reza Pahlavi, the son of the deposed Shah, to return. The same population that staged a revolution to overthrow a monarchy in 1979 is now invoking that monarchy as the antidote to theocracy. 

The Mechanism 

A note on terminology: When this article refers to “Allah,” it means the legislative deity of the Islamic Republic—a God with enforceable commands interpreted by authorized clerics. This is distinct from the personal God that 78% of Iranians still believe in. 

As mentioned earlier, Iran’s constitution establishes Velayat-e Faqih—the Guardianship of the Islamic Jurist. Article 5 declares that in the absence of the Twelfth Imam (a messianic figure believed to have been in supernatural hiding since the 9th century), authority belongs to a qualified jurist. The Tony Blair Institute’s analysis states it directly: “the supreme leader’s mandate to rule over the population derives from God.”20 Khamenei’s own representative, Mojtaba Zolnour, declared in 2009: “In the Islamic system, the office and legitimacy of the Supreme Leader comes from God, the Prophet and the Shia Imams, and it is not the people who give legitimacy to the Supreme Leader.”21

This is not metaphor. The system’s legitimacy rests on the claim that its laws are Allah’s laws, its punishments are Allah’s punishments, its wars are Allah’s wars. 

When morality police detained Mahsa Amini, leading to her death, they were enforcing the mandatory religious duty of “Forbidding the Wrong.” When courts execute apostates, they enforce Allah’s law. When the regime sends billions to Hezbollah while Iranians face poverty, it pursues Allah’s mission. When it pursues a nuclear program that invites crushing sanctions, it frames the resulting economic ruin not as policy failure, but as a holy “Resistance” against the enemies of Islam. Every act of misrule carries Allah’s signature.


Khorramabad, Iran, January 8, 2026: Protesters raise the pre-1979 lion-and-sun flag, described as a symbol of secular restoration, atop a statue of the Ayatollah. (Source: Press Office of Reza Pahlavi)

In a secular dictatorship, citizens can hate the dictator while preserving their faith. The North Korean who despises Kim Jong-un can still pray. But in a theocracy, the oppressor and God speak with one voice. To oppose the oppressor is to oppose God. To want freedom is to reject divine authority. 

The regime created conditions where, for many, opposing political authority became entangled with questioning religious authority. 

The Psychology of Religious Rebellion 

Jack Brehm’s reactance theory (1966) demonstrates that when people perceive threats to their freedom, they become motivated to restore it, often by embracing the forbidden alternative.22 Subsequent research has applied this specifically to religion. Roubroeks, Van Berkum, and Jonas (2020) found that restrictive religious regulations can trigger reactance that leads to both heresy (holding beliefs contrary to orthodoxy) and apostasy (renouncing religious affiliation entirely).23

The critical insight: In cases of psychological reactance, the emotional pushback against coercion often precedes the intellectual dismantling of the belief system. 

The sequence is rarely a straight line, but the components are clear: 

  1. Coercion: The lived experience of religious enforcement 
  2. Dissonance: The widening gap between the regime’s claims of divine justice and the reality of corruption and violence 
  3. Access: The internet provides a “vocabulary of dissent” 

This third point is crucial. Iran’s internet users grew from 615,000 in 2000 to over 70 million today.24 Despite billions spent on censorship, officials admit 80–90% of Iranians use VPNs, which let users circumvent restrictions by routing their traffic through servers in other countries.25

For the intellectually curious, the internet offered arguments against Islamic theology that were previously banned. But for the average citizen, it offered something perhaps more powerful: validation. It showed them that their anger was shared. It broke the “pluralistic ignorance,” the state where everyone privately rejects the norm but publicly conforms because they think they are the only ones. 

Whether through deep study or simple emotional exhaustion, the result was the same: the breaking of the psychological bond between the citizen and the faith. 

The Unintended Outcome 

Iran’s religious decline is among the fastest documented in modern history. Stolz et al. (2025) in Nature Communications established that Europe’s secular transition took approximately 250 years. Iran’s comparable shift from over 80% observing daily prayers in 1975 to 47% reporting lifetime deconversion by 2020 occurred in roughly 45 years. Pew’s global data shows Muslim retention rates averaging 99% across surveyed countries.26

However, Europe secularized without internet or satellite television. Iran’s shift occurred alongside a 90-fold increase in internet access. Theocracy may provide the motive for questioning imposed faith; technology provides the accelerant that compresses generational change into decades. Ex-Muslim testimonies, apostasy narratives, ordinary lives lived without faith—these demonstrated that abandoning religion was survivable. The forbidden became imaginable. Others found arguments that validated what they already felt. The reasoning matched the shape of their anger, and that was enough. 

For forty-seven years, the Islamic Republic worked to manufacture belief. Mandatory religious education from childhood. State control of media. Morality police enforcing dress and behavior. Apostasy punishable by death. A constitution grounding all authority in God. They did not leave this to chance. 

The data suggests it did not work.


Two Movies on One Screen: Conflicting Narratives of the Renee Good Shooting in Minnesota

Tue, 01/13/2026 - 8:38am

Anyone following recent events in Minneapolis has likely noticed something strange. People watching the same videos, reading the same headlines, and reacting to the same street-level events often seem to be describing entirely different realities. Conversations quickly break down, not because people disagree about what should be done, but because they cannot even agree on what is happening. It’s as if people are watching two completely different movies on one screen.

The “two-movies-one-screen” concept was coined by Scott Adams, the creator of Dilbert turned political commentator, to describe radically different interpretations of the same political events. People with access to the same set of facts come away with completely different understandings of what is happening. In some cases, each side seems genuinely unaware that the other interpretation even exists.

This is not merely disagreement, and it goes beyond ordinary bias. It is also not quite what psychologists usually mean by cognitive dissonance. Cognitive dissonance, first described by Leon Festinger in the 1950s, occurs when people experience psychological discomfort from holding conflicting beliefs or encountering information that contradicts their existing views, and then attempt to reduce that discomfort through rationalization or reinterpretation of the facts. In cases like the Renee Good shooting in Minnesota, however, something else seems to be happening. So, what is going on?

From a psychological standpoint, this resembles dissociation more than cognitive dissonance. Dissociation refers to a class of mental processes in which certain thoughts, perceptions, or experiences are kept out of conscious awareness. As clinical psychologists have long noted, dissociation functions as a defensive mechanism, shielding the individual from information that is experienced as overwhelming or intolerable. The mind does not reject the data after evaluating it. It fails to perceive it in the first place.

The following is an attempt to provide a neutral description of the events, followed by two very different interpretations.

On January 7, 2026, in Minneapolis, Minnesota, 37-year-old Renee Nicole Good was fatally shot by an Immigration and Customs Enforcement (ICE) agent during an operation targeting undocumented immigrants for deportation. Good was a U.S. citizen and mother of three from previous relationships, and was present at the scene with her wife, Rebecca (Becca) Good.

Multiple videos from bystanders, body cameras, and agent phones capture the event, showing a chaotic scene lasting about three minutes.


ICE Agent’s Cellphone Video (Credit: Alpha News)

Renee Good was in her SUV, which was blocking or near the path of ICE vehicles during an arrest operation. Agents approached, giving conflicting commands: some ordered her to leave, while others demanded she exit the vehicle. One agent attempted to open her door and banged on the window.

Rebecca Good, Renee’s wife, was outside the vehicle filming and confronting agents.

At one point during the interaction, Renee’s wife urged her to “drive, baby, drive” as the situation escalated. Good maneuvered the vehicle forward and started to accelerate. The vehicle made contact with an ICE agent who was positioned in front; the agent fired through the windshield, striking her in the face and killing her.


Bystander Video (Credit: Nick Sortor)

According to official statements from ICE and the Department of Homeland Security (DHS), the shooting occurred after Good allegedly used her vehicle as a weapon, attempting to run over an agent who then fired in self-defense. Renee and Rebecca Good were part of “ICE Watch” groups monitoring, protesting, and interfering with ICE operations. The ICE agent who fatally shot Good was injured and hospitalized following a prior incident in June 2025, during which an undocumented immigrant with an open warrant for child sexual assault dragged him with his vehicle while attempting to flee arrest.


Bystander Video 2 (Credit: @Dana916 via X.com)

Progressive voices view Good’s killing as an example of ICE overreach, law enforcement brutality, and systemic abuse of power, especially against citizens exercising First Amendment rights. They emphasize Renee was a “legal observer” and had a constitutional right to protest. They further note that Good was an unarmed American citizen on a public road who was fatally shot in the face and head by a masked federal agent. They also interpret the footage as showing Good attempting to navigate away from the scene rather than intentionally trying to harm the agent. They further warn against normalizing state killings, as in statements made by Rep. Alexandria Ocasio-Cortez (D), who responded to Vice President JD Vance’s defense of the ICE agent by calling it a “regime willing to kill its own citizens.” This sentiment is tied to broader concerns about police/ICE militarization against undocumented immigrants, and to the observation that even if Good erred (e.g., by not complying with the instructions of federal law enforcement officers), her error was not worth her life, and that society needs a higher bar for lethal force.

Conservative commentators frame the shooting as justified self-defense against anti-ICE radicals who disrupted lawful operations. They emphasize Renee’s alleged aggression and Rebecca’s role in escalating the situation by shouting “You wanna come at us? Go get yourself lunch, big boy,” portraying the couple as part of a coordinated harassment campaign rather than passive observers or demonstrators. They also argue Good was an active participant and perpetrator obstructing enforcement of long-standing immigration law, and someone attempting to flee from the scene rather than simply a citizen attending a protest. They maintain that although the shooting was tragic, law enforcement officers (and citizens) can use lethal force if they reasonably believe they face imminent serious harm. Further, they draw a distinction: debating whether the officer should or should not have fired is rational, but refusing to acknowledge that being struck or pushed by a vehicle is a basis for a self-defense claim is not.

These conflicting media narratives matter because most people do not build their understanding of the world through direct experience. Our personal encounters are limited. The rest of our mental model is assembled from stories. Indeed, research in cognitive psychology and media studies consistently shows that humans rely heavily on narrative to organize information and assign meaning. In other words, we are not natural statisticians. As psychologists such as Jerome Bruner and Daniel Kahneman have shown, people reason intuitively through stories, examples, and emotionally salient cases, often treating mediated experience as a stand-in for reality itself. This is why propaganda is most effective when it does not look like propaganda.

Many people assume propaganda is something obvious that you notice and argue with. In reality, the most powerful propaganda works through repetition rather than persuasion. Social psychologists have documented what is known as the “illusory truth effect,” in which repeated statements are more likely to be judged as true, regardless of their accuracy. When a moral narrative is replayed often enough, it stops feeling like a claim and starts feeling like memory.

Consider the recurring portrayal of tech executives in films and television. A wealthy founder speaks in vague abstractions, dismisses ethical concerns, and pursues profit at the expense of ordinary people. The specifics vary, but the moral structure remains the same. Whether any individual depiction reflects the reality of modern technology firms is almost beside the point. After repeated exposure, viewers absorb not just a critique of corporate excess, but an intuitive framework for interpreting innovation, wealth, and motive. Repetition trains audiences to assign intent instantly and to stop questioning it.

This works because fiction bypasses our analytical defenses. Experimental research on narrative persuasion shows that people are less likely to counterargue when they are emotionally absorbed in a story. Psychologists refer to this as “transportation,” a state in which attention and emotion are captured by a narrative, making viewers more receptive to its implicit assumptions. We do not fact-check television dramas. We empathize with them. Their moral premises are absorbed quietly as background knowledge.

For most of us, the names Jeff Bezos, Elon Musk, Mark Zuckerberg, or Peter Thiel evoke an immediate moral impression. But how did that impression form? Have you, for example, ever heard them speak at length, and do you know how they run their companies? Do you understand what motivates them? Do they have a good sense of humor?

There is also a structural problem with storytelling itself. Everyday reality, especially everyday crime, is usually chaotic, senseless, and narratively unsatisfying. Criminologists have long observed that much violent crime lacks coherent motives or moral meaning. Writers, understandably, select stories that feel legible, purposeful, and emotionally engaging. But those selections shape our expectations of reality and thus our perception, and make us see otherwise messy events as morally clearer than they actually are.

The result is a moral universe in which certain kinds of harm are treated as profound moral ruptures, while other kinds are treated as routine or unfortunate facts of life. Violence committed by some characters is framed as a social crisis demanding urgent moral response. Similar violence committed by others is portrayed as tragic but unremarkable, something to be managed rather than interrogated.

A clear example appears in the pilot of The Pitt. A dramatic subway assault is immediately interpreted through a moral lens before basic facts are known. The graphic depiction gives viewers the feeling that they are seeing something raw and unfiltered. At the same time, the narrative structure carefully guides inference and sympathy. In the same episode, a different shooting is treated as mundane and procedural. It carries little moral weight and prompts no larger reflection.

The show is not depicting reality. It is presenting a moral map.

This does not require a conspiracy, and it does not require malicious intent. Many writers openly acknowledge that fiction shapes social norms and expectations. Cultural theorists from Walter Lippmann to contemporary media scholars have noted that narratives function as “pictures in our heads,” guiding perception long before conscious judgment enters the picture. What is new is the growing cultural distance between those producing these narratives and the audiences consuming them, combined with a strong confidence that the moral direction of society is already settled.

When this kind of storytelling dominates, it does more than persuade. It trains perception itself. Viewers learn what to notice, what to ignore, and which conclusions should feel obvious. Over time, alternative interpretations stop feeling like interpretations at all. They begin to look irrational or delusional.

This is how “the other movie” disappears.

♦ ♦ ♦

A functioning society does not require agreement on every issue. It does require a shared reality. When large groups of people cannot even see what others are responding to, debate becomes impossible. You cannot resolve disagreements if one side experiences the other as hallucinating.

The answer is not counter-propaganda, and it is not simply more facts. Research on motivated reasoning shows that facts alone rarely change minds when perceptions themselves are structured by narrative. What is required instead is closer attention to how stories shape perception. What they highlight. What they omit. And how repetition turns fiction into intuition.

Was Renee Good heroically intervening in an unlawful abduction and a victim of reckless police violence? Or was she someone who interfered with a lawful enforcement action and nearly ran over an officer? Each interpretation feels obvious to those who hold it, and nearly invisible to those who do not. If you analyze both long enough, you might start to see the narratives and the chain of events that lead one to interpret this particular incident in a particular way after watching the exact same three minutes of video.

Skepticism, properly understood, is not just about questioning explicit claims. It is about examining why certain narratives feel natural, why others feel unthinkable, and why some movies seem to be playing on the screen while others are never seen at all.


How “Us vs. Them” Takes Hold: Tribalism in Byzantium, Sri Lanka, and Modern America

Fri, 01/09/2026 - 10:36am

Sixth-century Byzantium was a city divided by race hatred so intense that people viciously attacked each other, not only in the streets but also in churches. The inscription on an ancient tablet conveys the raw animus that spawned from color differences: “Bind them! … Destroy them! … Kill them!” The historian Procopius, who witnessed this race antagonism firsthand, called it a “disease of the soul,” and marveled at its irrational intensity:

They fight against their opponents knowing not for what end they imperil themselves … So there grows up in them against their fellow men a hostility which has no cause, and at no time does it cease or disappear, for it gives place, neither to the ties of marriage nor of relationship nor of friendship.1

This hostility sparked multiple violent clashes and riots, culminating in the Nika Riot of 532 CE, the biggest race riot of all time: 30,000 people perished, and the greatest city of antiquity was reduced to smoldering ruins.

But the Nika Riot wasn’t the sort of race riot you might imagine. The race in question was the chariot race. The color division wasn’t between black and white but between blue and green—the colors of the two main chariot-racing teams. The teams’ supporters, who were referred to as the Blue and Green “factions,” proudly wore their team colors, not just in the hippodrome but also around town. To distinguish themselves further, many Blues sported mullet hairstyles, like those of 1970s rock stars. Both Blues and Greens were fiercely loyal to their factions and their colors. The chariots and drivers were a secondary concern; Pliny the Younger asserted that if the drivers were to swap colors in the middle of a race, the factions would immediately switch their allegiances accordingly.


The race faction rivalry had existed for a long time before the Nika Riot, yet Procopius writes that it had only become bitter and violent in “comparatively recent times.” So, what caused this trivial division over horse-racing teams to turn so deadly? In short, it was the Byzantine version of “identity politics.”

Detail of “A Roman Chariot Race,” depicted by Alexander von Wagner, circa 1882. During the Nika Riots that took place against Byzantine Emperor Justinian I in Constantinople over the course of a week in 532 C.E., tens of thousands of people lost their lives and half the city was burned to the ground. It all started over a chariot race. (Image courtesy of Manchester Art Gallery)

Modern sociological research helps explain the phenomenon. Decades of studies have demonstrated the dangerous power of the human tribal instinct. Surprisingly, it doesn’t require “primordial” ethnic or tribal distinctions to engage that impulse. Minor differences are often sufficient to elicit acute ingroup-outgroup discrimination. The psychologist Henri Tajfel demonstrated this in a landmark series of studies to determine how minor those differences can be. In each successive study, Tajfel divided test subjects into groups according to increasingly trivial criteria, such as whether they preferred Klee or Kandinsky paintings or underestimated or overestimated the number of dots on a page. The results were as intriguing as they were disturbing: even the most trivial groupings induced discrimination.23

However, the most significant and unexpected discovery was that simply telling subjects that they belonged to a group induced discrimination, even when the grouping was completely random. Upon learning they officially belonged to a group, the subjects reflexively adopted an us-versus-them, zero-sum game attitude toward members of other groups. Many other researchers have conducted related experiments with similar results: a government or an authority (like a researcher) designating group distinctions is, by itself, sufficient to spur contentious group rivalry. When group rewards are at stake, that rivalry is magnified and readily turns malign.

The Robbers Cave Experiment, conducted in 1954 by social psychologists Muzafer and Carolyn Sherif, investigated intergroup conflict and cooperation. The study involved 22 eleven-year-old boys at a summer camp in Robbers Cave State Park, Oklahoma. (Photo: The University of Akron)

The extent to which authority-defined groups and competition for group benefits can foment nasty factionalism was demonstrated in the famous 1954 Robbers Cave experiment, in which researchers brought boys with identical socioeconomic and ethnic backgrounds to a summer camp, dividing them randomly into two official groups. They initially kept the two groups separate and encouraged them to bond through various group activities. The boys, who had not known each other before, developed strong group cohesion and a sense of shared identity. The researchers then pitted the groups against each other in contests for group rewards to see if inter-group hostility would arise. The group antagonism escalated far beyond their expectations. The two groups eventually burned each other’s flags and clothing, trashed each other’s cabins, and collected rocks to hurl at each other. Camp staff had to intervene repeatedly to break up brutal fights. The mounting hostility and risk of violence induced the researchers to abort that phase of the study.4 Other researchers have replicated this experiment: one follow-up study resulted in knife fights, and a researcher was so traumatized he had to be hospitalized for a week.56

How does this apply to the Blues and Greens? As in the Tajfel experiments, the Byzantine race factions had formed a group division based on a trivial distinction—the preference for a color and a horse racing team. However, for many years, the rivalry remained relatively benign. This was likely because the emperors had long played down the factional distinction and maintained a tradition of race neutrality: if they favored a faction, they avoided openly showing it. That tradition ended a few years before the Nika Riot, when emperors began openly supporting one faction or the other. More importantly, they extended their support outside the hippodrome with official policies that benefited members of their preferred faction. The emperors Marcian, Anastasius, and Justinian adopted official employment preferences, allocating positions to members of their favored faction and blocking the other faction from coveted jobs. To cast it in modern terms, they began a program of “race-based” affirmative action and identity politics.78


Official recognition of the group distinction enhanced the us-versus-them sense of difference between the factions, and the affirmative action scheme turned this sense of difference into bitter antagonism, which eventually exploded in violence. Procopius, our primary contemporary source, placed the blame for the mounting antagonism and the riots squarely on Justinian’s program of identity politics. It had not only promoted an us-versus-them mindset in the factions, it also incited vicious enmity between them, turning a trivial color preference and sporting rivalry into a deadly “race war.”

Considering how identity politics could elicit violence from randomly assembled groups like the Blues and Greens, it is easy to imagine how disastrous identity politics can be when applied to groups that already have some long-standing, historic sense of difference. Indeed, there have been numerous instances of this in history, most ending tragically. For example, Tutsis and Hutus enjoyed centuries of relatively peaceful coexistence in Rwanda up until Belgian colonialists arrived; when the Belgians issued identity cards distinguishing the two groups and instituted affirmative action, it ossified a formerly porous group distinction and infused it with bitter rivalry, preparing the path to genocide. Likewise, when Yugoslavia instituted its “nationality key” system, with educational and employment quotas for the country’s constituent ethnic groups, it hardened group distinctions, pitting the groups against each other and setting the stage for genocide in the Balkans. And, when the Sri Lankan government opted for identity politics and affirmative action, it spawned violent conflict and genocide that destroyed a once peaceful and prosperous country. This last example—Sri Lanka—is so illustrative of the dangers of identity politics that we’ll examine it in more detail.

Sri Lanka: How Identity Politics Destroyed Paradise

She is a fabulous isle just south of India’s teeming shore, land of paradise … with a proud and democratic people … Her flag is the flag of freedom, her citizens are dedicated to the preservation of that freedom … Her school system is as progressive as it is democratic. —1954 TWA TOURIST VIDEO

Sri Lanka is an island off India’s southeast coast blessed with copious amounts of arable land and natural resources. It has an ethnically diverse population, with the two main groups being Sinhalese (75 percent) and Tamils (15 percent). Before Sri Lanka’s independence in 1948, there was a long history of harmony between these groups. That history goes back at least to the fourteenth century when the Arab traveler Ibn Battuta observed how the different groups “show respect” for each other and “harbor no suspicions.” On the eve of Sri Lanka’s independence, a British governor lauded the “large measure of fellowship and understanding” that prevailed, and a British soldiers’ guide noted that “there are no historic antagonisms to overcome.” With quiescent communal relations, abundant natural resources, and one of the highest literacy rates in the developing world, newly independent Sri Lanka was poised to flourish and prosper. Nobody doubted it would outperform countries like South Korea and Singapore, with the British governor dubbing it “the best bet in Asia.”

It turned out to be a very poor bet. A few years after Sri Lanka’s independence, violent communal conflict erupted, culminating in a protracted civil war and genocide. By the time it ended, over a million people had been displaced or killed. Sri Lanka’s per capita GDP, which was on par with South Korea’s in 1960, was only one-tenth of it by 2009. As in sixth-century Byzantium, identity politics precipitated the calamity.

Turning a Disparity into a Disaster

At the end of British colonial rule in Sri Lanka, there was significant educational and income disparity between Sinhalese and Tamils. This arose by happenstance rather than because of discriminatory policy. The island’s north, where Tamils predominate, is arid and poor in resources. Because of this, the Tamils devoted their productive energy toward developing human capital, focusing on education and cultivating professional skills. This focus was abetted by American missionaries, who set up schools in the north, providing top-notch English-language education, particularly in math and the physical sciences. As a result, Tamils accounted for an outsized proportion of the better-educated people on the island, particularly in higher-paying fields like engineering and medicine.

Because of the Tamils’ superior education, the British colonial administration hired them disproportionately compared to the Sinhalese. In 1948, for example, Tamils accounted for 40 percent of the clerical workers employed by the colonial government, greatly outstripping their 15 percent share of the overall population. This unequal outcome had nothing to do with overt discrimination against the Sinhalese; it merely reflected the different levels and types of education achieved by the different ethnic groups.

When Sri Lanka gained independence, it passed a constitution that prohibited discrimination based on ethnicity. But a few years after that, an opportunist politician, S.W.R.D. Bandaranaike, figured he could advance his career by cynically appealing to identity politics, stoking Sinhalese envy over the Tamils’ over-representation in higher education and government. He launched a divisive campaign to eliminate the disparity, which spurred the majority Sinhalese to elect him. After his election in 1956, Bandaranaike passed a law that changed the official language from English to Sinhala and consigned students to separate Tamil and Sinhalese education “streams” rather than having them all learn English. As one Sinhalese journalist wrote, this divided Sri Lanka, depriving it of its “link language”:

That began a great divide that has widened over the years. Children now go to segregated schools or study in separate streams in the same school. They don’t get to know other people of their own age group unless they meet them outside.

Beyond eliminating Sri Lanka’s common “link language,” this law also functioned as a de facto affirmative action program for Sinhalese. Tamils, who spoke Tamil at home and received their higher education in English, could not gain Sinhala proficiency quickly enough to meet the government’s requirement. So, many of them lost their jobs to Sinhalese. For example, the percentage of Tamils employed in government administrative services dropped dramatically: from 30 percent in 1956 to five percent in 1970; the percentage in the armed forces dropped from 40 percent to one percent.

As has happened in many other countries, Sri Lanka’s identity politics went hand-in-hand with expanded government. Sinhalese politicians made it clear: government would be the tool to redress perceived ethnic disparities. It would allocate more jobs and resources, and that allocation would be based on ethnicity. As one historian writes: “a growing perception of the state as bestowing public goods selectively began to emerge, challenging previous views and breeding mistrust between ethnic communities.” Tamils responded to this by launching a non-violent resistance campaign. With ethnic dividing lines now clearly drawn, mobs of Sinhalese staged anti-Tamil counter-demonstrations and then riots in which hundreds—mostly Tamils—were killed. The us-versus-them mentality was setting in.

Bandaranaike was eventually assassinated by radicals within his own movement. But his widow, Sirimavo, who was subsequently elected prime minister, resolved to maintain his top priorities—expansive government and identity politics. She nationalized numerous industries and launched development projects that were directed by ethnic and political considerations rather than actual need. She also removed the constitutional ban on ethnic discrimination so that she could aggressively expand affirmative action. The existing policies had already cost so many Tamils their jobs that they were now under-represented in government. However, they remained over-represented in higher education, particularly in the sciences, a disparity that Sirimavo and her political allies resolved to eliminate. In a scheme that American universities like Harvard would later emulate, the Sri Lankan universities began to reject high-scoring Tamil applicants in favor of manifestly less-qualified Sinhalese with vastly lower test scores.

Just like Justinian’s “race” preferences, the Sri Lankan affirmative action program exacerbated us-versus-them attitudes, deepening the group divide and spurring enmity between groups. As one Sri Lankan observed:

Identity was never a question for thousands of years. But now, here, for some reason, it is different … Friends that I grew up with, [messed around] with, got drunk with, now see an essential difference between us just for the fact of their ethnic identity. And there are no obvious differences at all, no matter what they say. I point to pictures in the newspapers and ask them to tell me who is Sinhalese and who is Tamil, and they simply can’t tell the difference. This identity is a fiction, I tell you, but a deadly one.9

The lessons of the various affirmative action programs in Sri Lanka were clear to everyone: individuals’ access to education and government employment would be determined by ethnic group membership rather than individual merit, and political power would determine how much each group got. If you wanted your share, you needed to mobilize as a group and acquire and maintain political power at any cost. The divisive effects of these lessons would be catastrophic.

The realization that they would forever be at the mercy of an ethnic spoils system, along with the violent attacks perpetrated against them, induced the Tamils to form resistance organizations—most notably, the Liberation Tigers of Tamil Eelam (LTTE). The LTTE attacked both Sri Lankan government forces and individual Sinhalese, initiating a deadly spiral of attacks and reprisals, with both sides committing the sort of atrocities that are tragically common in ethnic conflicts: burning people alive, torture, mass killings, and so on. Over the following decades, the conflict continued to fester, periodically escalating into outright civil war. Ultimately, over a million people would be killed or displaced.

The timeline of the Sri Lankan conflict establishes how communal violence originated from identity politics rather than the underlying income and occupational disparity between the groups. That disparity reached its apex at the beginning of the twentieth century. Yet, there was no communal violence at that point or during the next half-century. It was only after the introduction of affirmative action programs that ethnic violence erupted. The deadliest attacks on Tamils occurred an entire decade after those programs had enabled Sinhalese to surpass Tamils in both income and education. As Thomas Sowell observed: “It was not the disparities which led to intergroup violence but the politicizing of those disparities and the promotion of group identity politics.”10

Consequences of Identity Politics in Sri Lanka and Beyond

Sri Lanka’s experience highlights some underappreciated consequences of identity politics. Most notably, one would expect that affirmative action programs would have warmed the feelings of the Sinhalese toward the Tamils. After all, they were receiving preferences for jobs and education at the Tamils’ expense. Yet, precisely the opposite happened: as the affirmative action programs were implemented, Sinhalese animus toward the Tamils progressively worsened. This pattern has been repeated in nearly all the countries where affirmative action has been implemented: affirmative action programs have an invidious effect on the group that benefits, imbuing them with a sense of insecurity and defensiveness over the benefits they receive. That group tends to justify the indefinite continuation of these benefits by claiming that the other group continues to enjoy “privilege”—or by demonizing them and claiming that they are “systemically” advantaged. Thus, the beneficiaries of affirmative action are often the ones to initiate hostilities. In Rwanda, for example, it was Hutu affirmative action beneficiaries who perpetrated the violence, not Tutsis. The situation in Sri Lanka was analogous, with Sinhalese instigating all of the initial riots and pogroms against the Tamils.

One knock-on effect of identity politics in Sri Lanka was that it ultimately benefited some of the wealthiest and most privileged people in the country. The government enacted several affirmative action schemes, each increasingly contrived to benefit well-heeled Sinhalese. The last of these implemented a regional quota system that was devised so that aristocratic Sinhalese living in the Kandy region would compete for spots against poor, undereducated Tamil farm workers. As one Tamil who lost his spot in engineering wrote: “They effectively claimed that the son of a Sinhalese minister in an elite Colombo school was disadvantaged vis-à-vis a Tamil tea plucker’s son.” This follows the pattern of many other affirmative action programs around the world: the greatest beneficiaries are typically the most politically connected (and privileged) individuals within the group receiving affirmative action. They are often wealthier and more privileged than many of the individuals against whom affirmative action is directed. This has been well documented in India, which has extensive data on the subgroups that benefit from its affirmative action programs.


One unexpected consequence of identity politics in Sri Lanka was rampant corruption. When Sri Lanka became independent, its government was widely deemed one of the least corrupt in the developing world. However, as affirmative action programs were implemented and expanded, corruption increased in lockstep. The adoption of affirmative action set a paradigm that pervaded the government: whoever held power could steer government resources to whomever they deemed “underserved.” A baleful side effect of ethnicity-based distortion of government policy is that it undermines and erodes more general standards of government integrity and transparency, legitimating a paradigm of corruption: if it is acceptable to direct policy for the benefit of an ethnic group, is it not also acceptable to do so for the benefit of a clan or an individual? It is a small step to go from one to the other, a step that many Sri Lankan leaders and bureaucrats took. Today, Sri Lanka’s government, which once rivaled European governments in transparency, remains highly corrupt. This pattern has been repeated in other countries. For example, after the Federation of Malaysia expelled Singapore, it adopted an extensive affirmative action program, whereas Singapore prohibited ethnic preferences. Malaysia subsequently experienced proliferating corruption, whereas Singapore is one of the least corrupt countries in the world today.

Divergence between Singapore’s and Sri Lanka’s GDP per capita, 1960–2023 (Source: Our World in Data)

Perhaps the most profound consequence of identity politics in Sri Lanka was that it ultimately made everybody in the country worse off. After World War II, per capita income in Sri Lanka and Singapore was nearly identical. But after it abandoned its shared “link language” and adopted ethnically divisive policies, Sri Lanka was plagued by violent conflict and economic underperformance; today, one Singaporean earns more than seven Sri Lankans put together. All the group preferences devised to elevate Sinhalese brought down everyone in the country—Tamil, Sinhalese, and all the other groups alike. Lee Kuan Yew, Singapore’s “founding father,” attributed that failure to Sri Lanka’s divisive policies, saying that if Singapore had implemented similar policies, “we would have perished politically and economically.” There are echoes of this in other countries that have implemented identity politics. When I visited Rwanda, I asked Rwandans of various backgrounds whether they thought distinguishing people by race or ethnicity ever helped anyone in their country. There was complete unanimity on this point: after they got over pondering why anyone would ask such a naïve question, they made it very clear that distinguishing people by group made everyone, whether Hutu or Tutsi, distinctly worse off. In the Balkans, I got similar answers from Bosnians, Croatians, Serbians, and Kosovars.

The Perilous Path of Identity Politics

Decades of sociological research and millennia of history have demonstrated that the tribal instinct is both powerful and hardwired into human behavior. As political scientist Harold Isaacs writes:

If anything emerges plainly from our long look at the nature and functioning of basic group identity, it is the fact that the we-they syndrome is built in. It does not merely distinguish, it divides … the normal responses run from … indifference to depreciation, to contempt, to victimization, and, not at all seldom, to slaughter.11

The history of Byzantium and Sri Lanka demonstrates that this tribal instinct is extremely easy to provoke. All it takes is official recognition of group distinctions and some group preferences to balkanize people into bitterly antagonistic groups, and the consequences are potentially dire. Even if a society that is balkanized in this way avoids violent conflict, it is still likely to be plagued by all the concomitants of social fractionalization: higher corruption, lower social trust, and abysmal economic performance.


It is therefore troubling to see the U.S. government, institutions, and society adopt Sri Lankan-style policies that emphasize group distinctions. As the U.S. continues down the perilous path of identity politics, it is unlikely to devolve into another Bosnia or Sri Lanka overnight. But the example of Sri Lanka is a dire warning: a country that was once renowned for its communal harmony quickly descended into violence and economic failure—all because it sought to redress group disparities with identity politics.

Surveys and statistics are now flashing warning signs in the United States. A Gallup poll found that while 70 percent of Black Americans believed that race relations in the United States were either good or very good in 2001, only 33 percent did in 2021.12 Other statistics have shown that hate crimes have been on the rise over that time.13 In the last year, we have also seen the spectacle of angry anti-Israel protesters hammering on the doors of a college hall, terrorizing the Jewish students locked inside, and a Stanford professor telling Jewish students to stand in the corner of a classroom. While identity politics have increasingly directed public policy and institutions, relations between social groups have deteriorated rapidly. This—and a lot of history—suggest it’s time for a different approach.


The Future Leaks Out: William S. Burroughs’s Cut-Ups and Cucumbers

Tue, 01/06/2026 - 1:42pm

William S. Burroughs, an American postmodern author and visual artist, was one of the most controversial literary figures of the early 1960s and a key figure of the Beat Generation that influenced pop culture (he was friends with Allen Ginsberg and Jack Kerouac). He also became preoccupied with an unusual experiment: the cut-up, a technique in which a written text is cut apart and rearranged to create a new text. But this was no mere artistic preoccupation. Burroughs, author of the notorious Naked Lunch (the subject of a major literary censorship case when its publisher was sued for violating a Massachusetts obscenity law), claimed to have found a sort of window into the future, a time warp on paper and on tape.

Burroughs got the cut-up idea in 1959 from his close friend Brion Gysin. Burroughs remembered, “It was simply of course applying the montage method, which was really rather old hat in painting at that time, to writing. As Brion said, writing is fifty years behind painting.”1 Burroughs traced the cut-up back to an incident from the Dada movement of the 1920s, when Tristan Tzara announced his intention to create a poem on the spot by pulling words out of a hat.2

For Burroughs, however, the cut-ups were something more than a creative writing technique. He traced this supposed revelation back to a Time magazine article he attributed to the oil industrialist John Paul Getty. (Burroughs may have been thinking of a February 1958 Time cover story on Getty; Getty did not write the article.) Upon cutting up the article, Burroughs created the following phrase: “It’s a bad thing to sue your own father.” When Getty was in fact sued by one of his sons, Burroughs came to believe that his cut-up had foretold the future:

Perhaps events are pre-written and prerecorded and when you cut word lines the future leaks out. I have seen enough examples to convince me that the cut-ups are a basic key to the nature and function of words.3

Years later, in Howard Brookner’s documentary Burroughs, the fedora-clad, now-aged author explains to his poet friend Allen Ginsberg: 

Every particle of this universe contains the whole of the universe. You yourself have the whole of the universe. If I cut you up in a certain way I cut up the universe … So in my cut-ups I was attempting to tamper with the basic pre-recordings. But I think I have succeeded to some modest extent. 

At this, Ginsberg could only nod and utter a number of noncommittal “um hmms,” adding later: “Burroughs was, in cutting up, creating gaps in space and time, as Cezanne, or as meditation does.” Burroughs also cited a dubious summary of Wittgenstein’s Paradox: “This is Wittgenstein: If you have a prerecorded universe, in which everything is prerecorded, the only thing that is not prerecorded are the prerecordings themselves.”4 The actual Wittgenstein’s Paradox holds that “no course of action could be determined by a rule, because any course of action can be made out to accord with the rule.” 

Ludwig Wittgenstein was a philosopher and language theorist, but there is no reason to believe that he thought of the universe as a giant tape recording. Rather, Burroughs’s notion of human consciousness was clearly influenced by L. Ron Hubbard’s engram theory, itself reliant on Freudian psychoanalytic theory with its emphasis on trauma and repressed memory. In a notion seemingly derived from the medical theory of the memory trace, Hubbard described engrams as imprints of unpleasant experiences on the protoplasm of living beings. 

Burroughs went so far as to describe the cut-up method as “streamlining Dianetics therapy system.” Proposing that his tape method could be used for therapy, he went on to suggest wiping “traumatic material” off a magnetic tape.5 He even hinted that Hubbard had borrowed the tape recording idea from him! His friend Ian Sommerville sold Hubbard two recorders, and Burroughs seemed to find it significant that Sommerville had become sick soon after, as if Hubbard were using an insidious black magic.6 Burroughs began to see the Scientology system as a form of brainwashing, even as he was increasingly convinced of Hubbard’s theories. 

Moving on to the world of cinema, Burroughs made two cut-up films, Towers Open Fire in 1963 and The Cut-Ups in 1966, with the help of producer Antony Balch. And, in 1965, Burroughs proposed to Balch “a new type of science fiction film,”7 one that would expose “the story of Scientology and their attempt to take over this planet.”8 The film would explain that “vulgar stupid second rate people” had taken over the planet by means of a “virus parasite.”9

Burroughs brazenly went ahead with his cut-up experiment, even though it might have serious ramifications for the universe: “Could you, by cutting up … cut and nullify the pre-recordings of your own future? Could the whole prerecorded future of the universe be prerecorded or altered? I don’t know. Let’s see.” Perhaps he was thinking of the scientists at Los Alamos, who exploded the first atomic bomb without being completely sure of the ramifications.10

Nor was Burroughs’s “sample operation” in influencing the universe an especially ethical exercise. In fall 1972 the author took issue with the Moka, “London’s first espresso bar,” leading to a vengeful campaign with overtones of Maya Deren, the experimental filmmaker who was also a voodoo priestess and flinger of malicious hexes. 

Burroughs’s grudge against the Moka arose over what he described as “unprovoked discourtesy and poisonous cheesecake.” He took a movie camera and began filming. Within two months, the bar was closed. Burroughs recommended using this exercise to “discommode or destroy” any business you did not particularly like. He did not consider that the bar might have shut down for some unrelated reason. Maybe word got out about the bad cheesecake.11 Some of the author’s magical thinking in this period may be a result of his reliance on drugs, but Burroughs had been a believer in curses since childhood.12

It is perhaps not a surprise that some thought the author’s new method was a prank. At a 1962 Edinburgh festival, Burroughs spoke about his new technique, which he was then calling the fold-in method. Members of the crowd thought they were being pranked, prompting an Indian author to ask, “Are you being serious?” Burroughs insisted that he was.13

Burroughs presented a summary of his method to a gathering of students at Colorado’s Naropa Institute in 1976, and part of this lecture can be heard on the record Break Through in Grey Room. When Burroughs describes the revelatory Getty cut-up, laughter can be heard from the audience. Perhaps sensing some skepticism, Burroughs insists on his innocence in constructing the Getty rewording: “I mean, it’s purely extraneous information to me. [A woman can be heard laughing.] I had nothing to gain on either side. We had no explanation for this at the time, it’s just suggesting, perhaps, that when you cut into the present the future leaks out.”14

Burroughs may have been a bit disingenuous in telling the Naropa students he had no relationship to the wealthy Getty family. In the mid-1960s, in fact, through the art dealer Robert Fraser, Burroughs mingled with John Paul Getty Jr.15 Later, from March to July 1967, Burroughs stayed at a flat owned by art dealer Bill Willis, where he often saw the likes of Getty Jr.16

Admittedly this would have been later than Burroughs’s initial Getty cut-up (apparently in 1959, when Burroughs first became immersed in the whole cut-up process). But Burroughs may have been acquainted with members of the Getty circle before he actually met the Getty family. Moreover, we are relying on a version of events that Burroughs publicly recounted in Daniel Odier’s The Job and again in 1976, and Burroughs’s perception is a dubious foundation. In the 1976 Naropa lecture, Burroughs claims the lawsuit occurred a year after his cut-up,17 while in The Job he claims it was a three-year gap. Also, in The Job he seems to garble matters by conflating the magazine title—Time—with the name of Getty’s company—Tidewater.18 I have not found any record of Getty being sued by one of his sons during the time period described. 

Burroughs’s literary acquaintances were not impressed to see the author seemingly risking his (still quite tenuous) literary reputation on an obsession like this. Samuel Beckett was appalled at the notion of using the words of other writers and said so to Burroughs directly: “That’s not writing. It’s plumbing.”19 The poet Gregory Corso told Burroughs the cut-up method would quickly become “redundant.”20 Novelist Paul Bowles felt the method would “alienate the reader.”21 Norman Mailer was the most prominent literary figure to champion Burroughs’s work to the American mainstream, and he must have been let down to see Burroughs abandoning a major writing career to get hung up on something Mailer probably considered a trivial sidetrack. To Mailer, the cut-up experiments were a mere “recording,” a distraction from the art of fiction.22 Jennie Skerl and Robin Lydenberg note that “positive assessments of Burroughs’s cut-ups were rare … most saw cut-ups as boring or repellent.”23

Nevertheless, Burroughs produced his “cut-up trilogy”: The Soft Machine (1961), The Ticket That Exploded (1962), and Nova Express (1964), although none sold as well as Naked Lunch. Biographer Ted Morgan calls them “inaccessible to the general reader.”24 The impenetrability of Burroughs’s cut-ups added to his reputation as a “difficult” author. Even Burroughs’s off-and-on friend Timothy Leary asked, rhetorically, “Do you actually know anyone who has finished an entire book by Bill Burroughs?”25

Burroughs was greatly impressed by the 1971 English-language publication of Konstantin Raudive’s Breakthrough: An Amazing Experiment in Electronic Communication with the Dead, which popularized what is known today as EVP (Electronic Voice Phenomenon), a widely discredited practice that purports to find hidden messages in recordings of background noise, in recordings played backwards, in random static between radio stations, and in other low-information sources. 

Raudive believed these were the voices of the dead. Burroughs offered his own theory in keeping with his cut-up cosmology, namely that the entire universe was a vast playback device, something akin to a tape recording. Inspired by Raudive (and no doubt, Hubbard), Burroughs boldly rejected the precepts of modern psychology. People suffering from schizophrenia were not experiencing hallucinations; they were “tuning in to an intergalactic network of voices.”26

If we look at Burroughs’s supposed predictive phrases, we see a lot of what can only be called “reaching” or grasping at straws. In 1964 Burroughs came up with the phrase, “And here is a horrid air conditioner.” Ten years later, he “moved into a loft with a broken air conditioner.”27 There is nothing mysterious about having an air conditioner break down. If anything, Burroughs was lucky if he went ten years without a broken air conditioner. 
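
The base-rate arithmetic here is worth spelling out. Below is a minimal sketch, assuming (purely for illustration) a modest 10 percent chance that a given air conditioner fails in any given year:

```python
# Chance of at least one air conditioner breakdown over a decade,
# assuming (for illustration only) a 10% failure rate in any given year.
annual_failure_rate = 0.10
years = 10

# P(at least one failure) = 1 - P(no failure in every one of the years)
p_at_least_one = 1 - (1 - annual_failure_rate) ** years
print(f"{p_at_least_one:.2f}")  # -> 0.65
```

On numbers anything like these, “a horrid air conditioner” sometime in the next decade is less a prophecy than a safe bet.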

Then there was this cryptic recorded query of Raudive’s: “Are you without jewels?” To Burroughs, this had to refer to lasers, “which are made with jewels.” And another especially absurd quote from Raudive’s recordings: “You belong to the cucumbers?” Burroughs had read, in either Time or Newsweek, that “the pickle factory” was a slang term for the CIA, so the recording seemed to be an obvious CIA reference. For an icon of bohemian literature, one could argue that Burroughs relied an awful lot on the mainstream media for his prognostications.28 But how were researchers like Raudive and Burroughs tapping into the playback of the universe? Burroughs himself asked this question: 

Now how random is random? We know so much that we don’t consciously know that perhaps the cut-in was not random. The operator at some level knew just where he was cutting in. As you know exactly on some level exactly where you were and what you were doing ten years ago at this particular time.29

Burroughs was admitting that the cutter was influencing the cut-up, but he believed this was because the cutter was unconsciously tuned in to the future. A simpler explanation would be that Burroughs convinced himself that he was doing random work while he was in fact cutting together semiconscious rephrasings. For instance, he may have heard a rumor from one of his monied acquaintances that one of Getty’s sons was considering a legal action well before actually suing. 

If the experimenter (i.e., Burroughs, or Gysin, or Raudive) is unconsciously influencing the experiment, then what we have is a new version of the Ouija board with its self-guided planchette—a device whose movements and messages are created by users who come to believe they are receiving messages from a spirit or other mysterious entity when, in fact, they are moving the planchette. This is known as the ideomotor response. 

It is worth noting that in this lecture Burroughs refers to a number of concepts that are considered dubious today, such as repressed memories and the reliability of eyewitness accounts. For instance, he discusses “freaks,” seemingly referring to individuals with alleged eidetic or “photographic” memory. Perhaps he was thinking of his late friend Jack Kerouac, who was known by some in Lowell, Massachusetts, as “Memory Babe” for his purportedly freakish powers of recall. 

Burroughs’s countercultural reputation grew through the 1970s until his death in 1997. But his cut-ups don’t seem to have received much attention from the parapsychological community, perhaps because he was so preoccupied with now-dated media and technology: newspapers, reel-to-reel recordings, and 8mm film. His metaphysical notion of the universe as a “playback” machine seems dated next to the trendier notion of the universe as a computer matrix. 

William Burroughs was one of the most fascinating (and darkly funny) literary figures of the twentieth century, but that doesn’t make him a scientist. There is no evidence to support the notion that anyone can foretell the future by cutting up newspapers, books, or film footage.

Categories: Critical Thinking, Skeptic

Gaslighting

Sun, 01/04/2026 - 1:41pm

Merriam-Webster’s Dictionary announced that 2022 saw a 1,740 percent increase in searches for gaslighting, “with high interest throughout the year.” Merriam-Webster distinguishes the term from its neighbors:

The idea of a deliberate conspiracy to mislead has made gaslighting useful in describing lies that are part of a larger plan. Unlike lying, which tends to be between individuals, and fraud, which tends to involve organizations, gaslighting applies in both personal and political contexts.1

The term “gaslighting” entered the popular consciousness through a 1944 film, the American psychological thriller Gaslight, in which a husband sets out to make his newlywed wife lose her mind so that he can have her locked up in an asylum. His agenda is to steal jewels that he knows are hidden in her late aunt’s house, where they are living. The movie’s title points to the emblematic manipulation among the many the husband undertakes to convince his wife that she is insane. 

The film is set in London in the late nineteenth century, when lamps were fueled by gas. The wife notices that their lamps randomly go dim. One way the husband destabilizes her is by denying that the gaslights are indeed dimming. It is such a small manipulation that you might not make much of it. The husband has been showering his new wife with adoration—referred to in abusive relationships as “love bombing”—making it unlikely that she would think he is being deceptive. When the wife is told that the gaslights are not dimming, she chooses to believe her devoted husband and doubt her own perceptions. This is the beginning of what could be the end. 

The wife not only notices that the gaslights are dimming, but also that sounds are coming from the attic. Her husband denies hearing the sounds. She can’t find her brooch even though she knows it was in her purse; he has removed it without her knowing. She finds a letter from one “Sergis Bauer,” and her once-adoring husband becomes furious with her. Later, he explains that he became upset because she was upset (which she wasn’t). 

The husband tells his wife that the gaslights are not dimming; there are no sounds from the attic; she lost the brooch as it was not in her purse; she didn’t see a letter from Sergis Bauer. On top of all that, he tells her that she stole a painting, and he has found out that her mother was put in an asylum. He convinces his wife that not only is she fabricating things that don’t exist, but also that she’s a kleptomaniac, too high strung and unwell to be in public. She must be crazy like her mother. Stealing the aunt’s jewels is symbolic of a much more deadly crime: stealing his target’s sanity. The husband is building a case for how his wife is obviously unstable and untrustworthy. Slowly but surely, the wife begins to lose her grip on what’s real and what’s false. She loses faith in her own perceptions. 

Luckily for the wife, Gaslight being a Hollywood movie and all, a policeman takes an interest in the unfolding manipulation. The wife, it turns out, is merely useful to the husband, who exploits her for his own ends. In the end, it is the husband who is untrustworthy and who steals, not his destabilized wife. 

Publicity still from the film Gaslight © 1944 Metro-Goldwyn-Mayer

Gaslighting in a marriage is disturbing. Gaslighting in an institution such as a corporation, church, school, sports club, courthouse, retirement home, government agency, news station, or political party is deeply disturbing. The target in the marriage may lose her mind and come to believe that she is, in fact, corrupt and insane. Her relationship to reality becomes unhinged. As has been demonstrated throughout history, institutional gaslighting can lead whole segments of society to lose their minds and come to believe whatever alternative facts and fabricated events they are fed by those with power, credibility, and social status. This collective madness can occur in cults, even in nations. History has made us well aware of how incredibly dangerous and destructive this manipulation can be. 

In 2022, the term “gaslighting” appeared in a United Kingdom High Court judgment for the first time, in what has been called a “milestone” hearing in a domestic abuse case. Describing the case, Maya Oppenheim defines the act as follows: 

Gaslighting refers to manipulating someone by making them question their very grasp on reality by forcing them to doubt their memories and pushing a false narrative of events. 

Although the ruling legally identifies this manipulation within a marriage, it applies equally well to the workplace. Those who tell the lies of bullying and gaslighting at work make targets question their grasp on reality, force them to doubt their memories, and push a false narrative of events. This false narrative is often believed by higher-ups who have been carefully groomed over time to believe in the power, credibility, and social standing of the bully. In this legal ruling, gaslighting is viewed as part of a campaign of psychological abuse that uses coercion and control to destabilize someone. 

Controlling the narrative, silencing questions and concerns, and forcing the community to adhere to the institution’s fabricated facts all sustain institutional complicity. Lawyer and workplace bullying expert Paul Pelletier finds that the lies of workplace bullying flourish when leadership operates from a coercion-and-control model like the one identified in the manipulative and dysfunctional marriage under scrutiny in the UK High Court. Coercion and control as a leadership model sets the stage for the drama of bullying, gaslighting, and institutional complicity to unfold. Psychiatrist Dr. Helen Riess discusses leaders who use fear and intimidation to exert their authority: “This type of failed leadership tends to spread across organizations like the plague.”2

A year later, in 2023, a lawsuit was filed in New Jersey in which gaslighting was again among the alleged behaviors, this time said to have driven Joseph Nyre, former president of prestigious Seton Hall University, from his institution. As reported by Ted Sherman, Nyre alleges that the former chairman of Seton Hall’s board violated the law, including by sexually harassing Nyre’s wife. As a whistleblower, Nyre alleges he was targeted with “gaslighting, retaliation, and intimidation,” which led him to resign. Institutional complicity in silencing those who speak up follows textbook methods, and gaslighting is long overdue to be understood as one of the weapons in that arsenal. Dr. Dorothy Suskind, an expert in workplace bullying, refers to the specific abuse meted out to those with “high ethical standards” as a “degradation ceremony.”3

Although gaslighting is being recognized in the law, it is not fully understood from a psychological and brain science perspective, and it is rarely applied to workplace culture. Only recently, in 2023, psychologists Priyam Kukreja and Jatin Pandey developed a “Gaslighting at Work Questionnaire” (GWQ) that revealed two key components in workplace gaslighting: trivialization and affliction. According to psychologist Mark Travers, trivialization may take the form of “making promises that don’t match their actions, twisting or misrepresenting things you’ve said, and making degrading comments about you and pretending you have nothing to be offended about.” Victims start down the path of wondering if they’re being “too sensitive.” Affliction may take the form of excessive control, making you self-critical, creating dependence, or being “very sweet to you and then flipp[ing] a switch, becoming hostile shortly after.”4 Again, this kind of maltreatment causes self-doubt. Kukreja and Pandey conclude: 

The GWQ scale offers new opportunities to understand and measure gaslighting behaviors of a supervisor toward their subordinates in the work context. It adds to the existing literature on harmful leader behaviors, workplace abuse, and mistreatment by highlighting the importance of identifying and measuring gaslighting at work.5
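
To make the two-factor structure concrete, here is a minimal sketch of how an instrument of this general shape might be scored. The item wordings below are paraphrased from the behaviors Travers describes above, and the 1 to 5 rating scale and mean-score rule are my own illustrative assumptions, not the published GWQ:

```python
# A minimal sketch of scoring a two-factor workplace questionnaire of the
# GWQ's general shape. Item wordings are paraphrased from the behaviors
# described in the article; the 1-5 Likert scale and the mean-score rule
# are illustrative assumptions, not the published instrument.

TRIVIALIZATION_ITEMS = [
    "Their promises don't match their actions.",
    "They twist or misrepresent things I've said.",
    "They make degrading comments, then say I have nothing to be offended about.",
]
AFFLICTION_ITEMS = [
    "They exert excessive control over my work.",
    "They make me self-critical and dependent.",
    "They are very sweet, then flip a switch and become hostile.",
]

def subscale_mean(responses: dict[str, int], items: list[str]) -> float:
    """Average the 1-5 ratings given to one subscale's items."""
    return sum(responses[item] for item in items) / len(items)

def score(responses: dict[str, int]) -> dict[str, float]:
    return {
        "trivialization": subscale_mean(responses, TRIVIALIZATION_ITEMS),
        "affliction": subscale_mean(responses, AFFLICTION_ITEMS),
    }

# Example: someone who answers "often" (4) to every item.
example = {item: 4 for item in TRIVIALIZATION_ITEMS + AFFLICTION_ITEMS}
print(score(example))  # -> {'trivialization': 4.0, 'affliction': 4.0}
```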

Introducing a questionnaire on gaslighting is an effective way to draw attention to how this form of manipulation occurs. Equally important, it provides vocabulary for workplaces to understand and discuss this specific form of abuse. In recent years, Forbes began publishing articles on gaslighting in the workplace, indicating that the problem is on the leadership radar. Jonathan Westover advises on “How to Avoid and Counteract Gaslighting as a Leader,” and his approach is insightful: 

  • Practice regular self-reflection and foster intellectual humility. 
  • Actively listen to the perceptions of your team members. 
  • Practice vulnerability and own up to your mistakes. 
  • Develop and sustain authentic relationships of mutual accountability and trust.6

The problem is, those who tell the lies of bullying and gaslighting do not experience self-reflection. They do not feel humility as an emotion, just as they do not feel guilt or remorse. They are uninterested in others’ perceptions, and their brains tend to objectify their targets. They often experience a roller coaster of shame and grandiosity, and they deny vulnerability or the possibility that they have made a mistake. In short, they cannot have authentic relationships. They follow an abusive script that turns them—if not stopped—into a caricature who repeats bullying lies and gaslighting manipulations over and over. They avoid accountability and see trust as a game that they want to win. Using psychological research to understand how manipulators’ brains work will, one hopes, give us a better chance of preventing their negative impact in the workplace. 

Manzar Bashir describes several textbook gaslighting behaviors: trivializing your feelings, shifting blame, projecting their own behavior, insulting and belittling, and creating confusion and contradictions. But he articulates one in particular—withholding information—that is very tricky to identify and yet can have devastating impacts. “Gaslighters often use a tactic of withholding information and keeping you in the dark about crucial matters. By selectively sharing or concealing facts, they manipulate your perception of reality and limit your ability to make informed decisions.”7 The insight is that gaslighting, like a great deal of psychological manipulation, is harmful in its omissions and passivity. In other words, it is the opposite of how we measure the harms of physical abuse. When someone’s body is hurt, we assess severity by how much active damage was done. But when the brain is being manipulated, we need ways to gauge how much damage a lack of action causes. Physical assaults are designed to weaken and harm the body; assaults via gaslighting are designed to weaken and destabilize the brain and the mind. Injuries to the body are far more likely to get immediate treatment, whereas neurological damage to brain architecture and disruption of the mind’s ability to function healthily are too often ignored. 

Psychologists and brain scientists have amassed extensive evidence that the brains of gaslighters operate notably differently from the brains of those who do not manipulate. Knowledge of psychopathic brains and the way they work can better protect us from gaslighters’ domineering manipulation and their cruel capacity to exploit us for their own purposes. 

Most of us who are targeted for bullying at work are caught off guard. Because we are not trained to anticipate manipulation, we’re easily victimized. The more aware we are of how abusive brains operate, and how thoroughly they can throw our own brains off our game, the better able we are to prevent workplace bullying and gaslighting. The more leaders, managers, and HR are informed, the less likely they are to be drawn into institutional complicity. 

Those who tell the self-serving lies of bullying and gaslighting—with ease—are part of a formidable trio referred to in psychology as the Dark Triad: narcissists, Machiavellians, and psychopaths.8 How can we identify these manipulative people more quickly and refuse to believe them? What if there were a way to protect ourselves, and more specifically our sanity, from lies? These are the questions that drove the researching and writing of The Gaslit Brain. I needed to answer them because I was being gaslit at work. 

Excerpted and adapted by the author from The Gaslit Brain, published by Prometheus, an imprint of The Globe Pequot Publishing Group. © 2025 by Jennifer Fraser.

Categories: Critical Thinking, Skeptic

Interviewing the Children of Nazi Leaders: Guilt, Trauma, and the Legacy of Atrocity

Fri, 12/19/2025 - 9:50am

History can be a mirror or a wall. For many people, it’s a mirror only when they see their own family reflected in it—an ancestor who fought in a war, survived a famine, or emigrated under duress. For others, history is a wall they can never climb. The view on the other side is fixed: the past is not what was done to them, but what their parents or grandparents did to others.

That is the reality I discovered when interviewing the sons and daughters of leaders of the Third Reich. 

When I began work on Hitler’s Children, I was not looking for new evidence about what happened in the Nazi Holocaust. The bureaucratic record of the Third Reich was already vast—memos, orders, trial transcripts, camp rosters—the Germans were masters of documenting their crimes. 

What I wanted was something the archives could never provide: a human portrait of the children of top Nazis, the men and women who grew up in the shadow of fathers whose names had become synonyms for evil. 

I wanted to know: What is it like to love a parent whom the world knows as a war criminal? How do you form a sense of self when the world has already decided who you are—and it is an identity you neither chose nor can easily shed? What happens to ordinary human relationships—marriage, friendship, parenthood—when your family name carries an explosive moral charge? 

Those questions took me across Germany and Austria and into conversations that were often guarded, sometimes raw, and occasionally redemptive. Some doors never opened. Some opened a crack and then slammed shut the minute I explained that I could not promise a sympathetic portrait. A few opened wide, and what came out was not a clean confession or a tidy arc toward reconciliation but something more human: ambivalence, anger, loyalty, shame, defiance, grief. What emerged was not a single “Nazi progeny” experience but a spectrum of responses to inherited guilt. 

Polish Jews captured by Germans during the suppression of the Warsaw Ghetto Uprising (Poland) and forced to leave their shelter and march to the Umschlagplatz for deportation, May 1943. Photo by Jürgen Stroop. (Credit: United States Holocaust Memorial Museum, courtesy of National Archives and Records Administration, College Park)

Knocking on Closed Doors 

Tracking down the children of the regime’s inner circle required patience and a tolerance for being told no. Some had changed their surnames and slipped into anonymity. Others had moved abroad, where the name on their passport did not immediately freeze a room. Many were instantly hostile when I contacted them. They assumed—not unreasonably—that I was there to condemn their parents or to dredge up what they had spent decades trying to bury. 

I learned quickly that the children of perpetrators could be as guarded as the children of victims. I knew many of the latter intimately because I had earlier co-authored a biography of Nazi Dr. Josef Mengele and had spent countless hours with concentration camp survivors, talking about their experiences and the trauma they carried. When I approached the children of the perpetrators, I discovered some had been burned by journalists who came for sensational quotes and left nuance on the cutting-room floor. Others feared the moral judgment of strangers or the social cost in their own communities if they were seen as disloyal to family. 

A few, though, agreed to speak. Some said they wanted the truth to be known while they were still alive. Others hoped that narrating their story aloud might lighten the weight they had carried in silence. What I heard, over time, was less a series of disconnected biographies than a set of recurring moral dilemmas. 

The Spectrum of Inherited Guilt 

To make sense of what I was hearing, I came to think of my interviewees along four rough lines. These are not scientific categories—lives overflow categories—but they capture distinct ways the various individuals navigated the same shadow. 

1. The Rejectors. These were the sons and daughters who saw their fathers’ crimes with scorching clarity and devoted their lives to exposing them. Niklas Frank, son of Hans Frank—the Nazi Governor-General of occupied Poland—was the most uncompromising. He called his father a “spineless jerk,” wrote a book that dismantled the family mythology, and made no room for sentimentality in the face of historical fact. “You don’t put love for your father above the truth,” he told me. The choice for him was not between love and hate but between complicity and moral independence. 

2. The Defenders. At the other end of the spectrum were those who insisted their fathers were maligned by history or punished beyond proportion. Wolf Hess defended his father, Rudolf Hess, Hitler’s deputy, as a “man of peace” betrayed by political enemies and victors’ justice. For Wolf, to defend his father was to defend himself from the conclusion that he was the son of a villain. The defense became a scaffold for identity, a way to live in the world without constantly negotiating contempt. 

3. The Divided. In the middle were those who could neither fully condemn nor fully exonerate. Rolf Mengele—son of Dr. Josef Mengele—met his father only twice after the war. Rolf was sixteen the first time, when his father traveled from his South American hideaway for a skiing vacation in the Swiss Alps. Rolf’s mother had told him his real father had died in the war, and that the visitor was “Uncle Fritz.” Three years later he learned that Uncle Fritz was in reality his father, and he learned about his crimes. They met only once more, when Rolf was thirty-three and traveled to South America to confront his father about Auschwitz. The elder Mengele closed that door, telling his son never to question him about what happened at the camp and what led the prisoners to dub him the “Angel of Death.” 

Rolf did not deny his father’s atrocities; he had studied the documents as had everyone else. However, his loyalty to his family had fractured the moral clarity that comes easily to people who never face the person behind the infamy. Rolf carried two incompatible truths: the father he barely knew and whom his family loved, and the historical perpetrator he could not defend. 

4. The Transcenders. Finally, there were those who took the moral debt they inherited and turned it outward—into a public ethic. Dagmar Drexel’s father was not a senior Nazi official but a member of one of the murderous Einsatzgruppen, the mobile death squads that killed more than a million civilians. She chose the path of engagement and reconciliation, visiting Israel, supporting dialogue, and insisting that her children and grandchildren be raised in the light of historical truth. Dagmar hoped, as many did, that if her generation did the hard work, the third generation might be free of the burden. 

These categories blur at the edges. People moved along the spectrum over time—hardening or softening as new documents and eyewitness accounts surfaced, as they aged, as their own children asked harder questions than journalists ever could. 

Taken together, however, the spectrum reveals the variety of human strategies for living with the inheritance of atrocity. 

The Private Life of a Perpetrator. The Höss family enjoys a seemingly idyllic domestic life—a swimming pool, a carefully tended garden, children at play—literally abutting the walls of Auschwitz. This publicity still from Jonathan Glazer’s film The Zone of Interest visually captures the double life of memory and the central torment for perpetrators’ children: reconciling the private tenderness of a parent with the public monstrosity of their crimes. (Credit: The Zone of Interest © 2023. Directed by Jonathan Glazer. Photo courtesy of A24.)

The Double Life of Memory 

For outsiders, the hardest truth to grasp may be the most banal: perpetrators are still parents. A man who signed deportation orders may also have read bedtime stories, taught a child to swim, or taped the wobbling seat on a first bicycle. Public history sees uniforms and titles. Private memory remembers the warmth of a hand, the tone of a voice in the kitchen at night. 

Reconciling those two realities—public monstrosity and private tenderness—was the central torment for many I met. Some resolved it by letting historical fact erase the personal. They repudiated the father and severed the line. Others clung to the personal, even when it meant being accused of denial. 

Edda Göring, devoted to her father’s memory, described Hermann Göring as generous and loving. She did not deny the crimes of the regime in which he was a top leader, but she resisted the idea that her father had been a fanatic. To critics, that sounded like apologetics. To her, it was loyalty to the man she knew as a kindly father. 

The tension here is not reducible to “truth versus lies.” Rather, it is a collision of kinds of truth—the truth of documented atrocity and the truth of attachment, which does not yield easily to hard facts. I came to believe that part of the work of reckoning is sometimes learning to hold both truths at once without letting either evaporate the other. 

Shame, Guilt, and the Psychology of the Second Generation 

Psychology offers a vocabulary for what I heard. The “intergenerational transmission of trauma” is well documented among the children of victims—especially Holocaust survivors—where symptoms include anxiety, hypervigilance, and a deep mistrust of institutions. Among the children of perpetrators, I discovered that a related but distinct process plays out. Their inheritance is not injury but stigma—the corrosive effects of shame, moral ambiguity, and the fear that others see an invisible mark. 

Guilt is about actions; shame is about identity. One can confess guilt and make amends. Shame, by contrast, whispers that one is something tainted. Several interviewees spoke of carrying a “name that enters the room first.” It affected romance (when to disclose the name), employment (whether a boss would know the family and decide against them), and decisions about parenthood (whether to have children at all). 

Coping strategies reflected familiar psychological defenses. Some changed their names or emigrated—geographic cures for a moral biography. Others chose radical transparency—publicly condemning their fathers in books and interviews to reclaim their own moral agency. A third group practiced radical silence, hoping that if the topic never arose, the past might recede on its own. It never did. Silence, I learned, is a temporary dam. The water rises behind it. 

How Family Systems Carry History 

Beyond the individual psyche lies the family system—the ways stories are told or not told, the rituals of commemoration or erasure. Some families preserved elaborate mythologies in which the father had resisted orders, saved a Jewish neighbor, or known nothing about the machinery of murder. 

The myths were often anchored in a single ambiguous episode—an order not carried out, a mild reprimand from a superior—that became the seed for an alternative history. 

Other families split. Siblings took opposing stances. One condemned; another defended. At holiday meals, the past was both present and forbidden. 

The emotional economy of those households looked familiar to anyone who has studied families marked by addiction or scandal: unspoken rules, competing narratives, and a tacit agreement that love depended on staying within one’s assigned role. 

Children who broke the family line—who published a denunciation or appeared in a documentary—sometimes became moral exiles among their own kin. That rupture was the price of telling the truth as they saw it. In those moments, “intergenerational trauma” named not only what moved from parent to child but what moved from child back to parent: a judgment the older generation could not bear. 

Social Mirrors: Schools, Workplaces, and the Public Gaze 

The burden was not only private. Society itself became a mirror in which these children saw themselves reflected, often in distorted ways. Several spoke of the quiet pause when a teacher or colleague recognized the surname—and then the question that followed, carefully phrased to sound neutral but freighted with suspicion: “Any relation to … ?” In adulthood, some learned to bring it up first, defanging the question with a practiced sentence—“Yes, I’m his daughter; no, I do not share his politics”—and moving on before the conversation stalled. 

In public life, the reception depended on the role they chose. The rejectors found a kind of moral home among activists and historians. The defenders found communities that resented “victors’ justice.” The divided and the transcenders navigated lonelier paths, neither embraced by partisans nor comfortable with silence. 

Hungarian Jews arriving at Auschwitz in May 1944. Moments after disembarking from the train, many faced Nazi selection—some to forced labor, many to death. Photo by Ernst Hofmann or Bernhard Walter. (Credit: German Federal Archives [CC-BY-SA 3.0])

What Changes With Time—and What Doesn’t 

We sometimes imagine that moral burdens fade in predictable half-lives. In my experience, time changed the tone but not always the weight. As my interviewees aged, many reported that reckoning deepened, not because new facts appeared but because their own children asked better questions. 

The third generation—further from the emotional bond and closer to the educational curriculum—refused family mythologies in a way the second often could not. “Grandpa couldn’t have known,” a parent would say. “But he was there,” a teenager would answer. 

Anniversaries, documentaries, and new archival releases periodically reset the conversation. A case reopened, a grave discovered, a diary authenticated—and the private work of reconciliation was hauled into public light. At those moments, people who had made peace with their own narrative found themselves having to make peace again, this time with an audience. 

Adolf Hitler with Reich Minister of Propaganda Joseph Goebbels and his wife, with their children: Helga, Hilde, and Helmut. (Credit: Bundesarchiv, Bild 183-1987-0724-502 / Heinrich Hoffmann / CC-BY-SA 3.0)

Comparative Frames: Not Only Germany 

The Nazi case is singular in scale and intent, but the dynamics I heard are not unique. Descendants of slave owners in the American South wrestle with family papers that list human beings as property and calculate children as “increase.” In post-apartheid South Africa, the Truth and Reconciliation Commission exposed a generation of children to testimony that shattered family legends. In Rwanda, the gacaca courts forced communities to confront the fact that génocidaires were not abstract monsters but neighbors—and often fathers. Across the former Yugoslavia, the International Criminal Tribunal’s judgments collided with nationalist narratives passed down at kitchen tables. 

In all these contexts, the same questions surface: Am I responsible for the sins of my father? Can I love my parent without condoning their crimes? What do I owe to victims and their descendants? How do I build a life that is truly my own? 

The answers vary by culture and circumstance, but the structure of the dilemma is recognizably human. 

Mechanisms of Transmission: How the Shadow Travels 

If “intergenerational trauma” names an outcome, what are the mechanisms? Scholars point to at least four: 

Silence. When families refuse to speak, children fill the vacuum with fantasy or shame. The mind abhors a narrative void. In several households I encountered, silence was the loudest sound in the room. It produced neither absolution nor forgetfulness—only rumination. 

Mythmaking. The stories families tell—of resistance, ignorance, or necessity—shape the moral horizon. Even a small act of decency can be inflated into an alibi. Conversely, some families cultivate a punitive myth of inherited stain, a fatalism that imprisons the young in a script they cannot revise. 

Ritual and Place. What families visit—or avoid—matters. One daughter told me she had been taken to battlefields but never to camps. Another said the first time she saw the Nuremberg courtroom, she felt she had stumbled into a photograph that had been waiting for her. Rituals of remembrance can either widen or narrow moral imagination. 

Institutional Echoes. Schools, museums, and media frame the past in ways that either invite reckoning or permit evasion. A curriculum that skips over the depth and breadth of atrocities—as has happened in many academic settings when it comes to the Hamas terror attack of October 7—makes it easier for descendants to imagine their relatives are free of any responsibility. 

Institutions can either dignify the moral labor families attempt or tempt them with a ready-made script of innocence. 

Child survivors of Auschwitz, wearing adult-sized prisoner jackets, stand behind a barbed wire fence. Still photograph from the Soviet film The Liberation of Auschwitz, taken by the film unit of the First Ukrainian Front, Auschwitz, 1945. (Credit: United States Holocaust Memorial Museum, courtesy of Belarusian State Archive of Documentary Film and Photography)

Moral Injury and the Cost of Knowledge 

“Moral injury”—a term developed to describe soldiers who feel they have violated their own ethical codes—offers another lens. The second generation experiences a kind of indirect moral injury: an injury not from what they themselves did but from what knowing does to them. Knowledge damages one’s relationship to a beloved parent; truth injures attachment. 

Some choose not to know much. Others choose to know everything and live with the ache. One daughter, who had read deeply in trial transcripts, said that learning the exact logistics of a deportation under her father’s authority broke something in her. “I used to think there must have been chaos,” she said. “It was worse—there was order.” 

For her, the injury was precision—the bureaucratic elegance of evil. 

Choosing Children: Reproduction Under a Shadow 

A notable fraction of those I interviewed had chosen not to become parents. The reasons varied: fear of passing on a name, a desire to end a line, uncertainty about what one could say to a child who asked, “Who was my grandfather?” One son told me that he chose not to become a father because he could not bear to pass on a story line he had never been able to fully explain. 

None believed in genetic guilt. The concern was narrative. Parenthood would require mastering a story they themselves had not yet mastered. Others chose to have children precisely as a defiance of history—an insistence that a life could be built that was neither repetition nor repudiation but revision. 

These decisions often intersected with partners’ views. Some marriages could not bear the weight of history. One woman described the look on a fiancé’s face when he first grasped the details of her father’s role. 

“It wasn’t revulsion,” she said. “It was calculation. He was calculating whether he could carry it with me.” The engagement ended. 

The Skeptic’s Task: Between Verification and Empathy 

A skeptic acknowledges the limits of memory and the demands of evidence. Interviews with perpetrators’ children are not court records; they are human documents, shaped by self-protection, loyalty, and fatigue. Defensiveness, denial, and selective recall were constants. My job, then and now, is to triangulate: place personal accounts against trial transcripts, diaries, and the scholarship of historians and psychologists. 

Skepticism here is not cynicism. The aim is to understand without excusing, to listen without indulging. If we want to interrupt the transmission of harm—whether its currency is trauma or shame—we must map the routes it travels. That map requires both archival rigor and an ear for the ways people live with the past. 

Freedom for the Third Generation? 

Again and again, interviewees asked whether their children—grandchildren of the perpetrators—could be free. There is some evidence that the burden lightens with distance, especially when the second generation does the work of truth-telling. But it is not inevitable. Silence begets fantasy, and fantasy rarely lands on justice. 

The most hopeful conversations I had were with families who had made memory a practice rather than a panic. They visited sites of the crimes together. They read. They argued. They did not ask love to overrule truth or truth to annihilate love. They let both inhabit the same home. In those households, the third generation seemed less haunted and more oriented—not weighed down by a surname but awake to what it should mean to carry one. 

A line of Dagmar Drexel’s stays with me: “Our generation has the obligation to confront the truth. Only then can the next one be free.” 

The obligation is not to perpetual penance but to honest narration. Freedom comes not from forgetting, but from telling the story in a way the young can live with. 

A German teacher singles out a child with “Aryan” features for special praise in class. The use of such examples taught schoolchildren to judge each other from a racial perspective. Germany, 1934. (Credit: United States Holocaust Memorial Museum, courtesy of Süddeutsche Zeitung Photo)

Living in the Shadow Without Becoming It 

The story of the children of Nazi leaders is not only about Germany, nor only about the Holocaust. It is about the universal human challenge of living with a family legacy that collides with one’s moral values. We do not inherit guilt in the legal sense. Yet we can inherit its shadow—in our names, our family stories, our silences, and our choices. 

The work of a lifetime, for some, is not to step out of the shadow but to learn how to live within it without becoming it. That means choosing accuracy over myth, candor over silence, accountability over performative shame. It means loving a parent, if one can, without lying about him—and refusing to let that love dictate the terms of one’s moral life. 

If there is a single lesson my interviews taught me, it is that history is never safely past; it lives inside our most intimate relationships. To reckon with that is not to remain trapped. On the contrary, it is the only way through—an insistence that the very human bonds that transmitted the shadow can also be the ones that transform it.

Categories: Critical Thinking, Skeptic

Decoding Espionage: Newly Declassified Documents Reveal the Secret Intelligence War

Thu, 12/11/2025 - 7:37am

“Western powers can be in a cold war … before they realize it.” — CALDER WALTON, Spies: The Epic Intelligence War Between East and West

How is the history of espionage relevant to the present? How does recent document declassification change our understanding of the Cold War? Spies, Lies, and Algorithms broadly and concisely surveys the hows and whys of the U.S. intelligence community from multiple perspectives. Spies: The Epic Intelligence War Between East and West deeply surveys a century of espionage by Russia against the U.S. and Britain. Both books offer new information and conclude with sharp warnings for the present.

When I was in graduate school, the professor of a class on Cold War history commented that a book he had initially assigned was already out of date just three years after its publication, due to information declassified in the interim. I recalled this often as I read Calder Walton’s Spies, where, time and again, the sources were documents that had become accessible or been declassified as recently as 2022. As such, Walton’s book rewrites history, from Lenin to Putin. His thesis is that Russian espionage against the U.S. and Britain was as aggressive before and after the Cold War as it was during it.

Some of the book’s new or strengthened conclusions will please partisans on either side of U.S. political debates. Conservatives might find grim validation in the relentlessness and depth of Soviet—and then Russian—espionage. For example, Russian archives have not only proven the guilt of President Franklin Roosevelt’s advisers Lauchlin Currie and Treasury Secretary Harry Dexter White (among many others of the Cold War era); they also reveal compelling new evidence of Russian assistance to liberal politician Henry Wallace and, later, to left-wing intellectuals and the multi-country antinuclear movement, and they show that the Soviet Union used détente (and later its own collapse) to increase its espionage. Liberals, on the other hand, may be pleased by new evidence that U.S. Cold War policy did not take into consideration the Soviet perception of NATO, and that the founding U.S. Cold War document’s “domino theory” was based on a false premise (Kremlin documents now show that the Soviets did not initiate wars in the Third World).

Newly released material also suggests that, from Lenin to Putin, Russian leaders’ refusal to tolerate criticism and alternative points of view severely damaged the Soviet Union (and later Russia), both internally and externally. In contrast, the openness of the U.S. and Britain made it smart for Russia to focus its efforts on human spies. Walton points out, in an example perhaps of particular interest to Skeptic readers, that Russian spies in the U.S. were remarkably successful in their technological espionage, not just in accelerating the Russian development of the atomic bomb but also, more recently, in stealing military technology, so that “U.S. science and technology effectively drove both sides of the Cold War.”

Among the revelations that startled me: new evidence that Truman was never briefed on Korea prior to the outbreak of war; that, in both Russia and the U.S., throughout the Cold War and since, most spies who were caught were unmasked thanks to the opposing side’s defectors; and that both sides blundered by promoting people who committed treason. By contrast, Walton argues that, in most other areas, even intelligence historians continue to overemphasize the role of human spies and underestimate the role of communications interception (“signals intelligence”).

One reason for the overemphasis on human spies, highlighted by Professor Amy Zegart in Spies, Lies, and Algorithms, is the explosion in popularity of spy entertainment in recent decades. The ticking-time-bomb scenario in which the hero saves the world is a staple of fiction but, according to Zegart, has vanishingly few analogs in real life, where intelligence work involves weighing multiple sources against one another. (Walton’s most dramatic example of how human and technological methods complement each other is new evidence of why President Kennedy was able to defuse the Cuban Missile Crisis.) However, this is not how things are portrayed in fiction. Zegart acknowledges how terrific it would be if fantasy were reality, and cites alarming evidence that the general public confuses the two. Her even more damning indictment is that entertainment has been mistaken for fact by senior policy makers in the 21st century, including by a U.S. Supreme Court Justice and in a confirmation hearing for a CIA director.

Zegart diagnoses the root of the problem as the necessity for secrecy: it is illegal for political scholars to examine most current intelligence, and older declassified documents sought by historians can arrive years after they’d been requested, and then only heavily redacted. Thus, Zegart finds it unsurprising that there are remarkably few articles about intelligence in academic journals, and incredibly few college courses on the history or politics of espionage. Zegart sees a similar dynamic at work when it comes to Congressional oversight, where elected representatives can’t talk about secret material.

To provide some much-needed background, Zegart discusses the history of U.S. intelligence, including an additional chapter on highly placed traitors. The heart of the book is a chapter-by-chapter discussion of issues in the world of espionage. Real-life intelligence work is mostly tedious and mundane. Her coverage of it, and of what its results can and cannot do, is nuanced and sobering. Readers of Skeptic will not be surprised by the challenge of overcoming confirmation bias and human frailty at estimating size and probability. The evidence, she suggests, is that these are best overcome by outsider “devil’s advocate” counterscenario planning (a procedure bypassed in the case of Saddam Hussein’s alleged Weapons of Mass Destruction). Her chapter on the paradox of presidential use of covert action analyzes why presidents of opposing views, in different times and facing different challenges, all criticize secret, morally questionable “active measures” but end up using them anyway.

Intelligence failures result from “the natural variations in the predictability of human events and the limitations of human cognition.” — AMY ZEGART, Spies, Lies, and Algorithms

Walton and Zegart agree that governments are losing their monopoly on intelligence gathering, essentially due to new technology. Zegart sees Google Earth, smartphones, and other public technology as having broken the monopoly that governments once had on the discovery of nuclear weapons sites and other military matters, and she discusses the potential dangers of premature revelation of that information, even if it is true. (Here and elsewhere, she emphasizes that the analysis of images and other data is a highly specialized and sophisticated skill learned by intelligence professionals, with many traps into which even well-meaning amateurs all too easily fall.) Where Zegart focuses on the activities of private citizens, Walton sees the future of intelligence lying with multinational private companies, selling satellite access or high-end encryption programs to whatever government or business is willing to pay their price, and thus with no chance of government oversight.

American Cryptology during the Cold War, 1945-1989, Book II: Centralization Wins, 1960–1972. (Source: National Security Agency/Central Security Service)

Both books cite FBI statistics documenting that China is by far the greatest threat to the U.S., with both government- and business-allied intelligence agencies sending over a seemingly endless number of highly trained agents to steal American military and technological secrets. Both authors also discuss the difficulty and urgency of reorienting an intelligence bureaucracy to new realities, Zegart’s treatment being the more in-depth and itself among the sources cited by Walton. Interestingly, both authors agree that history and current practice indicate that intelligence is most effective when multiple techniques—human spying, satellite imagery, and much more—are used in combination, and both agree that cutting intelligence budgets ends up costing more than it saves.

Both authors agree that cutting intelligence budgets ends up costing more than it saves.

Both authors also discuss the relation between intelligence and conspiracy theories. First, both identify a few that were real, including the recent revelation of a Cold War deal with a leading manufacturer of government encoding machines. However, far more often people see conspiracies where none exist. Part of Walton’s data comes from England, and he suggests that “Those who tend to see … conspiracy overestimate the competency of those in Whitehall” (home of the British Secret Intelligence Service, MI6, much as “Langley” is shorthand for the American CIA). Zegart quips that, judging from her analysis of the impact of spytainment and her survey of Ivy League courses, students are more likely to hear a professor discuss U2 the rock band than the U-2 spy plane; the point being that ignorance of espionage history and practice is a great breeding ground for conspiracy theories. While Stalin’s paranoia is well known, Walton provides evidence of Lenin’s as well, and concludes that Putin is “a naturally inclined conspiracist.” (Note that Putin began his career in counterintelligence—ferreting out spies.)

As outstanding as both books are, no text of such depth can be perfect. The most serious problem with Walton’s Spies is that the bibliography is solely online (and often inaccessible), and the entries on it do not always include dates and publisher information. In the book itself, if part of Walton’s thesis is that the Cold War started in 1917, shouldn’t he have offered more than one example of early espionage? Recent scholarship in most areas is thorough (based on the endnotes), but some important books are missing, including G-Man by Beverly Gage (for its new data about the FBI’s work abroad [which it is not supposed to do], and which I reviewed in Skeptic).

Most of Walton’s 548 pages of main text are well used, but some ancillary material (such as recently declassified World War II British intelligence work unrelated to Russia) might have been edited out for length, however fascinating and new it is. I also wonder if, in a history book, it is best practice for an author to explicitly discuss implications for the present, which he does, with several opening pages on Russia’s 2022 invasion of Ukraine and a closing chapter on the relevance of the book’s conclusions to 21st-century Chinese espionage. That said, this is telling, or at least amusing: in researching this book, it was discovered that one World War II Soviet operative in Ukraine was named Nikita Khrushchev, and he might never have gone on to become Soviet leader had he tried to warn Stalin about German troops massing on Russia’s border. Another likely case of the futility of speaking truth to power in the old Soviet bloc is the anecdote about a lone non-Communist Czech minister, Jan Masaryk, who tried to warn the public about Soviet tyranny and soon died falling out a window, an alleged suicide.

Walton offers two examples of recent or new evidence that the world came closer to nuclear war than previously known.

More to the point, Walton offers two examples of recent or new evidence that the world came closer to nuclear war than previously known: not only during the Cuban Missile Crisis, but possibly also during a 1980s military exercise that may have been mistaken for the real thing. Both episodes resulted in increased dialogue between the U.S. and Russia.

The two books are masterclasses on their respective subjects. Walton doesn’t just incorporate recent research on Soviet and Russian espionage; he has investigated original documents (some declassified very recently) in Russia, Ukraine (for its intelligence about the Soviet Union), Britain, the U.S., and elsewhere. In addition to her original research and government work, Zegart has mastered a vast secondary literature and demonstrates her experience in explaining it. Both books can be read by skeptics for evidence of the very real dangers posed by confirmation bias and lack of critical thinking among the highest government officials, as well as among the general public who, at least in some countries, empower them.

Categories: Critical Thinking, Skeptic

I’m Trans. This Is My Story

Tue, 12/02/2025 - 8:03pm

I was born in the late 1970s, back when “transgender” wasn’t a word you’d see on television, let alone in a school curriculum. Back then, there was only “he” or “she,” and if you didn’t fit neatly into one of those boxes, you were expected to hide it. I learned early that whatever I was didn’t fit, and that saying so could make me a target.

I remember being six years old, draping a towel over my head and pretending it was long hair. I wasn’t rebelling against anything. I was aligning myself, in the only way I knew how, with what felt true. It took years before I discovered there were others like me and decades before society began to admit that such people even existed. The shame came later, when I learned that such feelings were unspeakable.

My first experiences with desire were tangled up with fear. As a teenager, I was drawn to boys but couldn’t imagine anyone seeing me that way. Every crush came with an undercurrent of panic: If he knew who I really am, he’d hate me. And all of them did. The first time I came out to someone I liked, he laughed and told his friends. The next day at school, the whispering started. Within a week, I had no friends left.

Transition isn’t a lifestyle. It’s a form of care that restores equilibrium.

For trans women of my generation, that kind of rejection was typical. You learn to move through the world invisibly, because being seen too clearly can be dangerous. It still can be. Even now, in my 40s, I find myself editing how I walk, how I speak, how I dress in public. Not out of vanity, but out of self-preservation.

But when I say that being trans is the last thing I want people to notice about me, it’s not from shame; it’s because being trans should be irrelevant to my humanity. It’s a rare medical condition, not an identity that defines the whole of a person. I’d rather be recognized for my work, my sense of humor, my curiosity, and my contributions to the world than for the fact that I had to undergo medical treatment to live comfortably in my body. Transition isn’t a lifestyle. It’s a form of care that restores equilibrium. A way to make the physical self match the internal one so that life can finally move beyond gender altogether.

That’s why I bristle at the way trans discourse has evolved in the past few years. I’m grateful that young people today have the language, visibility, and community. But I also worry that online activists (many of them very young!) speak about gender transition as if it were a simple matter of identity affirmation rather than the profound, irreversible medical journey it is. Hormone therapy and surgery are not accessories to self-expression. They are life-altering interventions that carry serious physical and emotional consequences.

To conflate transient identity exploration with the rare and lifelong condition experienced by people like me is to risk harm.

We are also witnessing an unprecedented rise in gender dysphoria among adolescents and young adults, particularly girls transitioning to be boys (I intentionally use “boys” here instead of “men”). While I don’t doubt that some number of them are trans—trans people have existed in every culture throughout history and we are not going anywhere anytime soon—the sudden increase suggests many are likely grappling with broader questions of identity, anxiety, and belonging rather than a deep-seated, persistent dysphoria, and latch onto gender identity because it is so visible and so celebrated today. But to conflate transient identity exploration with the rare and lifelong condition experienced by people like me is to risk harm. The medical establishment must be able to tell the difference, without fear of being called bigoted for doing so!

I say all this not to gatekeep, but to underscore the gravity of what transition entails. I’ve had surgeries that permanently changed my body. I inject hormones, knowing they’ll likely affect my liver, my bones, and my fertility. I made these decisions as an adult, after years of therapy and reflection. I don’t regret them for a second, but I also wouldn’t wish their necessity on anyone.

Caution is not cruelty. It’s compassion informed by reality.

Hormones and surgeries are serious medical interventions that alter the body permanently, often with side effects that require lifelong management. For adults, with informed consent and psychological support, they can be lifesaving. For children and adolescents, whose identities are still in flux, such decisions must be approached with restraint and rigorous oversight. Caution is not cruelty. It’s compassion informed by reality.

Gender-affirming care saves lives, no matter what anyone says. I know this because it saved mine. But that doesn’t mean it should be prescribed without deep, individualized assessment, especially for children and adolescents who are still developing their sense of self. Puberty blockers and cross-sex hormones are not toys, and it’s not transphobic to say so. The medical community must balance compassion with caution. Both can coexist.

At the same time, we cannot let this conversation become an excuse for cruelty. The backlash against gender medicine has brought out voices who see our existence itself as pathology. They call for bans, restrictions, and “re-education,” pretending that trans lives can be legislated out of reality. And many more make disparaging jokes about genitals, or about trans people supposedly not knowing that plastic surgery cannot change biology. These people are not protecting children. They are using them as pawns.

The surgeries and hormones are not what make us who we are.

What gets lost in the shouting is the truth most trans adults live quietly every day: We don’t want special treatment! We just want to be left in peace! To work, to love, to grow old without fear. The surgeries and hormones are not what make us who we are. They are tools that allow us to stop fighting our reflection and start living!

I wish more activists today understood that dignity doesn’t come from angry rhetoric or slogans. It comes from honesty. And honesty means acknowledging not just the courage, but also the risk and the pain it takes to become yourself. It also means being truthful about those things with our youth.

Trans rights are human rights, not because trans people are flawless or because half-naked activists shout it at a protest, but because no one should have to justify their existence. Defending trans rights means defending the right to live truthfully and safely. But truth also demands clarity: transition is not something to be entered into lightly, nor denied to those who need it. The middle ground—careful, evidence-based, compassionate medicine—is where reason lives. And it’s where our humanity should, too.

Categories: Critical Thinking, Skeptic

The Psychology Behind Black Friday and Cyber Monday

Thu, 11/27/2025 - 7:46am

It’s the most wonderful time of the year. Sure, there are all the twinkly lights on display and family dinners, but really, it’s all about the shopping. Isn’t it? After all, each year we spend thousands of dollars around this time (retail sales in the U.S. between Black Friday and Christmas will likely surpass $1 trillion for the first time this year, up from $994 billion in 2024)—and not all of it on gifts either. It’s the Super Bowl of consumerism.

It’s hard to resist. Every few minutes I’ll get some sort of notification of a once-a-year sale that I must take advantage of immediately. I’m being primed to want things that I never really even thought of because they happen to be 20 percent off. I get nagging follow-up messages to get the items I happened to have glanced at but didn’t succumb to—before it’s too late, before they are gone forever (or at least the discount is)! That’s the annual ritual.

Some, however, are more disciplined than I. They wait until this time of year to buy the things they actually want and need at a discount. They are the real unsung heroes of the season. Just the other day a woman in my writers’ group showed off the new Apple Watch she had waited all this time to get. “I got it for cheap,” she exclaimed proudly. “I don’t know if it’s any good.”

As for me, I’ve been eyeing a Mason Pearson brush for at least 15 years. Girl math dictates that had I bought it 10 years ago, I would have gotten it for half off. I’m told the quality is so good that it will last long enough to pass on to my children—because that’s what every child craves: a used hairbrush. Maybe in a few years, when it’s twice the price?

The Mason Pearson website showcases “luxury and efficacy” alongside 130 years of heritage—a masterclass in how legacy brands use elegant design and storytelling to justify premium pricing.

Mason Pearson, though, represents a “legacy” brand in an increasingly disposable world. Such a brand is characterized by its longevity (think: Levi’s and Tiffany & Co.), rich history, perception of quality, and cultural relevance. Brands come and go, but Mason Pearson endures; it takes its name from its founder, an engineer who created the brush in 1885. Multiple generations have enjoyed smooth hair from these high-quality, durable brushes, which continue to be handcrafted in England and are referred to as “the Ferrari of brushes.” It has cult status. I hear that spoilt pets like it too.

But this isn’t an ad for Mason Pearson. It’s a column about psychology.

Legacy brands tend to evoke nostalgia, one of the most powerful feelings a brand that wants to sell lots of products can harness. Nostalgia reminds us of simpler or happier times, or connects us to family members who might have used the same products. Or to people like Marilyn Monroe, who to this day is partially responsible for the sales of Erno Laszlo skincare products and Chanel No. 5 perfume. If you wear the latter to bed, you too can continue the legacy. This emotional resonance and sentimental bond help foster brand loyalty by transforming the product into something more meaningful.

The Ed Feingersh photograph that launched decades of marketing: Monroe with Chanel No. 5 in 1955. One interview answer in 1952 became eternal brand mythology—and continues selling perfume in 2025.

As Mad Men’s Don Draper describes it, nostalgia is a “twinge in your heart, far more powerful than memory alone.”

The legacy brand also comes with a story.

Successful legacy brands leverage their rich history and craftsmanship through compelling storytelling. This narrative allows consumers to feel like they are part of an ongoing legacy, connecting them with the tradition and artistry that define the brand. A good example of this type of marketing is deployed by Grado Labs, a company that produces headphones “handmade in Brooklyn, producing the finest audio products since 1953 in the building that our father/grandfather/great grandfather Pasquale bought back in 1918.”

The Maker Stories website featured Grado Labs’ exemplary legacy-brand marketing: four generations of family craftsmanship in the same Brooklyn building since 1918, triggering nostalgia and trust in consumers.

Indeed, research shows that our brains respond to stories with the release of oxytocin, a hormone that promotes trust. This helps explain the results of the 2009 Significant Objects experiment conducted by Robert Walker and Joshua Glenn, who found that pairing a story with a product could increase its perceived value by up to 2,706 percent.

According to clinical psychologist Clary Tepper, when consumers buy into a brand they are engaging in what psychologists call “symbolic consumption,” whereby the brand comes to represent a set of ideals or values.

“From a psychological point of view, the principles of memory, identity, and emotional security are all at work here,” she tells me. “For consumers who feel like the world is constantly changing, legacy brands offer a sense of stability and continuity. Those brands also tap into shared cultural history, collective memory, and identity, all of which can foster a sense of belonging and trust. If consumers have had positive experiences with these brands in the past, engaging with them again can activate reward pathways in the brain.”

Deidre Popovich, Associate Professor of Marketing at Texas Tech University, agrees: “People are quite drawn to legacy brands because they feel very familiar and comforting. From a consumer psychology perspective, these brands may be tied to early personal memories, such as thinking back to certain family routines when you were a kid. When someone sees a legacy brand, they often feel this sense of recognition that links back to earlier points in their life. That is what creates a feeling of nostalgia. It is usually less about the product itself or its functional purpose, and more about reconnecting with family moments and/or positive feelings.”

Ownership of a legacy brand’s products can also be a way for consumers to signal their aspirations or social status. It’s part of an identity that they can choose to put on—or change. 

Sarah Seung-McFarland is a psychologist and founder of Trulery, where her focus is specifically on fashion and design psychology. She tells me: “In psychology, we know that consumers don’t just buy a product, they buy into the identity, lifestyle, and social meaning connected to it. Brands like Louis Vuitton have spent decades being linked to wealth, status, and an aspirational lifestyle through film, celebrity culture, and consistent visual storytelling.”

Brands, in a sense, become a stand-in for a world that might be within our reach. Though, says Seung-McFarland, “For many, the desire came from the inaccessibility itself. Owning a legacy piece represented the version of themselves they hoped to become.”

It’s also a type of reassurance about the quality and reliability of the brand; its longevity is a testimony to that. That’s why we’re seeing a bit of a revival in appreciation for long-standing brands.

The Reddit message board “BuyItForLife” receives 1.7 million visitors weekly. There the focus is on brands that, as the title suggests, last. Some items are expensive status symbols like Rolex watches and Birkin bags, but others are more practical: eiderdown bedding, Montblanc fountain pens, Le Creuset cookware, knives with a lifetime of sharpening, Canada Goose coats that can be passed on as family heirlooms, microwaves that don’t break within a year or two, Zojirushi rice cookers, Dyson vacuums, Viberg boots, Barbour waxed jackets, Herman Miller office chairs, and, on the more affordable side, Stanley water bottles. And yes, my coveted Mason Pearson brush is also a common recommendation. Most surprisingly, there’s even a laptop that users believe can last a lifetime. The purpose—at whichever price point—is the pursuit of quality and longevity.

According to Popovich, going with a long-standing brand helps consumers reduce the cognitive effort involved in making a choice. “Shoppers don’t have to work hard to evaluate it; the feeling of familiarity makes it seem like an obvious decision,” she says, adding, “This is due to cognitive fluency, which is the feeling of ease we get when our brain can process information quickly and without effort. This feeling influences our judgments, making us more likely to perceive information as truthful and likable, simply because it’s familiar.”

I’ll keep that in mind as I inch toward buying that expensive hairbrush that somehow keeps feeling more and more like the “obvious” choice.

Categories: Critical Thinking, Skeptic

Harvard Astronomer Takes Up Skeptic Publisher’s $1,000 Bet on Alien Disclosure by 2030

Tue, 11/25/2025 - 3:41pm

In the long tradition of scientific wagers, Skeptic magazine publisher and historian of science Dr. Michael Shermer has issued a $1000 bet that…

Discovery or disclosure of alien visitation to Earth in the form of UFOs, UAPs, or any other technological artifact or alien biological form, as confirmed by major scientific institutions and government agencies, will not happen by December 31, 02030.

Taking him up on that challenge is Harvard astronomer and Director of the Galileo Project Dr. Avi Loeb. The wager is placed through the Long Now Foundation’s Long Bets program (“an arena for competitive, accountable predictions”), which adds a 0 at the front of all dates on its 10,000-year calendar (“to foster better long-term thinking”), in keeping with its Clock of the Long Now, being built in Texas and designed to tick for 10,000 years. Details of the Shermer-Loeb wager may be found here.

Whoever wins, the $1000 stakes will be donated to the Galileo Project Foundation. Here are the terms for deciding who wins:

By Dec 31st 02030, at least two of these three scientific organizations—NASA, the National Science Foundation, and the American Astronomical Society—will affirm that discovery of extraterrestrial intelligence in the form of UAPs, UFOs, or any other interstellar objects that are determined to be ETI technological in nature, or any alien biological life form found here on Earth, has been made.

Here is Dr. Loeb’s argument:

The search for technological artifacts has just started in earnest in 2025 with the discovery of the anomalous interstellar object 3I/ATLAS, the launch of the Rubin Observatory and the construction of three Galileo Project Observatories.

Given that there are billions of Earth-Sun analogs in the Milky-Way galaxy—most of which are billions of years older than the solar system—and that it will take less than a billion years for our Voyager spacecraft to cross the Milky-Way disk, we must engage in the scientific search for extraterrestrial technological artifacts.

It is better to be an optimist because life is sometimes a self-fulfilling prophecy. This is why I am engaged in the search with the hope that we will find a partner on our blind date with interstellar objects.

Here is Dr. Shermer’s argument:

Since the founding of the Skeptics Society and Skeptic magazine in 1992, I have been documenting predictions by UFOlogists that discovery or disclosure of alien visitation to Earth is coming any day now.

Believers appear in the media boldly predicting that by the end of the year we will have proof of alien contact—33 years later I’m still waiting for said proof. More recently, proponents of UAPs as alien spacecraft have appeared before the U.S. Congress, confidently claiming that they know people who have seen and even touched aliens and/or their spaceships, back-engineered their technologies, and even communicated with the aliens.

Yet when pushed for evidence, they always demur, saying that it’s “classified,” “top-secret,” that Men-in-Black threatened them into silence, that their careers and even their lives are at stake if they disclose said evidence, that they could only reveal the evidence in a “SCIF” (Sensitive Compartmented Information Facility) but not in Congress, and that many people in the U.S. government, CIA, FBI, NSA, etc. (never named) have this information and evidence of alien visitation.

The purpose of this bet, in keeping with the rules of Long Bets and the philosophy of the Long Now Foundation, is to reveal the actual confidence of UFO/UAP alien believers by getting them to put their money where their beliefs appear to be. You say we will have alien disclosure by the end of the year? O-kay, let's place a wager on that prediction. I say it won’t happen.

The Long Bets program was started in 2003 by Stewart Brand and Kevin Kelly, and is part of a long tradition of scientific wagers dating back at least to 1870 when Alfred Russel Wallace, co-discoverer with Charles Darwin of natural selection, accepted a £500 wager (a workingman’s wages for one year) placed by flat-Earther John Hampden that scientists could not prove that the Earth is round.

Wallace proved it by demonstrating that same-height poles placed at even intervals along a six-mile stretch of the Old Bedford Canal (north of London) appeared through a telescope lower by the exact “amount calculated from the known dimensions of the earth.”

Unfortunately, Wallace had to take Hampden to court to collect his winnings. Thus, it is important that such wagers be professionally adjudicated by neutral referees. Other wagers include:

  • In 1975, cosmologist Kip Thorne bet cosmologist Stephen Hawking that Cygnus X-1 was a black hole. Thorne won.
  • In 1980, biologist Paul Ehrlich bet economist Julian Simon that the price of a portfolio of five mineral commodities (copper, chromium, nickel, tin, and tungsten) would rise in price over the next decade. Simon won.
  • In 1998, neuroscientist Christof Koch bet philosopher David Chalmers (who coined the term “Hard Problem of Consciousness”) that the neural correlates of consciousness would be discovered within 25 years. Chalmers won.
  • In 2017, astronomer Martin Rees bet psychologist Steven Pinker that “a bioterror or bioerror will lead to one million casualties in a single event within a six-month period starting no later than December 31, 02020.” Since the lab-leak hypothesis for Covid-19 (bioerror) was never proven, Pinker was declared the winner on the Long Bets platform.

If Loeb wins the bet, it will represent what would arguably be the greatest discovery in human history, namely that we are not alone in the universe.

If Shermer wins the bet, it does not mean that we are the only intelligence in the cosmos, only that claims of contact are likely greatly exaggerated and that we need to keep searching for the truth about extraterrestrial intelligence.

Categories: Critical Thinking, Skeptic

The Aliens are Here (Again)! A Review of The Age of Disclosure

Sun, 11/23/2025 - 12:55pm

Here we go…again. Another documentary film about how disclosure of alien contact is imminent. It’s a claim I’ve been hearing for over three decades, albeit this one is of higher quality than the dozens of similar docs available on Amazon Prime (and hundreds more on YouTube).

With The Age of Disclosure, filmmaker Dan Farah (Call Jane, Ready Player One, The Phenomenon) has lifted the genre to a higher level than the others (James Fox being the exception, with The Phenomenon—credit shared with Farah—well worth watching). I was tempted to offer a snarky “I watched it so you don’t have to,” but if you are relatively new to the UFO/UAP topic, I recommend investing the twenty bucks Amazon Prime charges to rent the film for 30 days ($25 to buy it). The artfully edited trailer hints at what is to come in the full film.


The Age of Disclosure is packaged and produced so well that naïve viewers may come away thinking that something strikingly original, shockingly new, and world-shaking is about to be loosed upon the world, everywhere the ceremony of innocence drowned (Yeats, of course).

Alas, it is not to be. Every fact, opinion, or anecdote in the film has been rehearsed elsewhere in recent years, and a good deal of the footage is from Congressional hearings, media reports, and stock interviews that have been circulating for years on CNN, Fox News, NewsNation, and even in the Wall Street Journal and the New York Times, along with other mainstream media sources and large-audience podcasts. But the fusillade of statements, interspersed with the familiar grainy UAP videos and blurry UFO photographs, leaves no doubt about the film’s conclusion:

“We are not alone in the universe.”

“Humans are not the only intelligence in the universe.”

“They’re real, they’re here, and they’re not human.”

“Non-human intelligences are here and have been interacting with humanity for a long time.”

“We are not the only intelligent life form on the planet. There’s something else here.”

“This is the biggest discovery in human history.” 

Wow, can we see these aliens and their spaceships?

Nope.

Why not?

The opening credentialing sequence gives us a clue: when you don’t actually have concrete evidence that we can all see, your case depends on eyewitness accounts, so you must establish that their words are trustworthy and reliable. For example, the intrepid UAP proponent Lue Elizondo says:

“If you could be in my shoes and see what I’ve seen, there would be zero shadow of a doubt that these things are real and they are not made by humans.”

The problem is that we can’t be in anyone else’s shoes, so we must rely on evidence that does not depend on a single eyewitness. “If only you could have been in my shoes that night when I saw Bigfoot—there would be zero shadow of a doubt….” In science, such anecdotes do not count as evidence; you need to be able to show actual physical evidence—in this case, the body of a Bigfoot.

Continuing my biological analogy, in order to name a new species you have to present a type specimen—a holotype—that everyone can see, examine, photograph, analyze, etc. If you gave a talk at a biology conference about how you discovered a new species of bipedal primate, no one would take you seriously if you did not also present unmistakable evidence. If all you had were stories about what you saw, and maybe a couple of out-of-focus videos and grainy photographs, no one would believe you…and for good reason!

What scientists and skeptics are asking of the UFO and UAP community is to, at long last, show us the evidence.

What scientists and skeptics are asking of the UFO and UAP community is to, at long last, show us the evidence. We have been hearing of pending disclosure for half a century and are always left wanting. We don’t need to know your credentials, how many years you worked for the U.S. government or military, or how strongly you believe that what you saw was aliens or alien craft; just show us what you claim is here and we will all believe. QED!

But no. Here is parapsychologist, remote-viewing researcher, and UFOlogist Hal Puthoff:

“The classified data that we had access to when we joined the program was indisputable.”

Here is astrophysicist and UFOlogist Eric Davis:

“There is 80 years of data that the public isn’t even aware of.”

Here is Jay Stratton, prominently featured in the film as one of the defense officials who first investigated UAP:

“The things that I’ve seen, the clearest videos, the best evidence we have that these are non-human intelligence, remains classified. I have seen with my own eyes non-human craft and non-human beings.”

He saw it himself! No FOAF (Friend of a Friend) urban legend. O-kay, but can I see it with my own eyes? No? Then I remain skeptical, as it should be in science.

The film then reviews most of the standard UAP pilot accounts, such as this from Navy pilot Ryan Graves: “They [UAPs] were ubiquitous. We were seeing them almost daily.” If true, given that nearly every commercial airline passenger has a smart phone with a high-definition camera at the ready, there should be thousands of clear and unmistakable photographs and videos of these UAPs. To date there is not one. Nada. Zilch. Here the absence of evidence is evidence of absence.

Where were the aliens in 1945 to stop the bombing of Hiroshima and Nagasaki?

A key message of the film is that there are political and even military ramifications of UAPs. Here is Stratton again: “They [UAPs] have both activated and deactivated nuclear weapons in both the U.S. and Russia.” In the category of “If this were true, what else would be true?”… where were the aliens in 1945 to stop the bombing of Hiroshima and Nagasaki? Why did they allow us to detonate the first atomic bomb in New Mexico? Why didn’t they curtail the hundreds of nuclear explosions in the Nevada desert and the South Pacific? The answer is classic hand-waving rationalization, as in Stanford University professor and UFOlogist Garry Nolan’s response: “They [the aliens] were willing to let us see the consequences of our actions.”

To add urgency to the film, Elizondo tells us that “It [UAP sightings] is happening all over the world and it is happening with greater frequency.” The Bayesian reasoner in me asks: can we see some data on the base rate of sightings over the decades to assess whether, in fact, there has been an increase in frequency? No such data is provided.

Another standard theme throughout the film is explaining why—despite the unmitigated confidence that alien contact has been discovered (but not yet disclosed)—the evidence is not readily available. Several reasons are on offer, such as this one from Elizondo: “religious fundamentalists in the Pentagon who had a severe adversity to this topic…put their religion above national security.” Among the fundies, apparently, were those who told Stratton “these were demons and we were messing with Satan’s world.”

The documentary has attracted wide attention, including coverage from Bill Maher on HBO and Joe Rogan.

As for the larger issue of the consequences of disclosure on religious faith, numerous surveys over the years have consistently found that the vast majority of religious people would not find the discovery of extraterrestrial intelligences (“non-human biologics” in the newfangled UAP jargon meant to legitimize an otherwise fringe movement) in any way a threat to their religious beliefs. Theologian Ted Peters, for example, queried 1,300 people on the matter, finding that most do not think the discovery of extraterrestrial intelligence would shake their faith. The reason is as obvious as it is logical: if an omnipotent deity can create life on Earth, he could do it elsewhere in the universe. In a cosmos with a sextillion planets (1 followed by 21 zeros, or 1,000,000,000,000,000,000,000), what a terrible waste of space it would be (echoing Carl Sagan) to create a universe so vast as to house so many planets, only one of which would contain sentient, conscious beings worthy of saving.

What are these UAPs, exactly? Here the film segues into a chalkboard lecture by Elizondo, who explains that there are four hypotheses on offer:

  1. Foreign adversary technology that we simply don’t understand.
  2. A robust counter-intelligence program to cover up a U.S. program.
  3. Interdimensional or extraterrestrial.
  4. A combination of the above.

Unfortunately, left off the list were…

  5. Ordinary terrestrial phenomena.
  6. None of the above.

For #5, I am fond of quoting from Leslie Kean’s 2010 book UFOs: Generals, Pilots and Government Officials Go on the Record, in which the UFOlogist admitted that “roughly 90 to 95 percent of UFO sightings can be explained” as:

weather balloons, flares, sky lanterns, planes flying in formation, secret military aircraft, birds reflecting the sun, planes reflecting the sun, blimps, helicopters, the planets Venus or Mars, meteors or meteorites, space junk, satellites, swamp gas, spinning eddies, sundogs, ball lightning, ice crystals, reflected light off clouds, lights on the ground or lights reflected on a cockpit window, temperature inversions, hole-punch clouds, and the list goes on!

Elizondo then ticked off six characteristics of UAP (“observables” because, well, it sounds more scientific):

  1. Hypersonic velocity (40,000 mph or faster).
  2. Instantaneous acceleration (would kill pilots; even a drone would disintegrate).
  3. Low visibility (no contrails, no sonic boom, no thrust, no exhaust).
  4. Transmedium travel: (space, air, underwater).
  5. Anti-gravity (without any obvious means—no wings, rockets, etc.).
  6. Biological effects (burns, inflammation, and other injuries attributed to energy weapons or to getting too close to UAPs and orbs).

What’s more likely? That all of physics and aerodynamics needs revising, or that someone has misinterpreted a low-resolution video?

All of these assumptions are based on highly questionable interpretations of grainy videos and blurry photographs of UAPs/UFOs. For example, an incredibly grainy video, apparently filmed from the USS Omaha off the coast of San Diego in July 2019, shows a dark blob appearing to segue from above the waves to below. This, we are told, is clear and unmistakable evidence that UAPs can transition from the air into the ocean where, the speculation continues, they can move through the water at hundreds of miles per hour. What’s more likely? That all of physics and aerodynamics needs revising, or that someone has misinterpreted a low-resolution video?

An unidentified anomalous phenomenon (UAP) was filmed from the USS Omaha off the coast of San Diego in July 2019. CREDIT: Jeremy Corbell/WeaponizedPodcast

I was surprised—even shocked—to see that the film included accusations that Lue Elizondo was not completely honest about his role with the U.S. government in the UAP program. To wit, we are told that Pentagon spokesman Christopher Sherwood said:

“Mr. Elizondo had no responsibilities with regard to the AATIP program”

And Pentagon Spokesperson Susan Gough revealed:

“Luis Elizondo did not have any assigned responsibilities for AATIP.”

Given their inclusion, I fully expected Elizondo to offer an explanation, or the filmmakers to include proof that Elizondo worked at AATIP. Surely they could have provided a contract or pay stubs or some employment paperwork for Elizondo and AATIP, but no. Did Elizondo work for AATIP? It’s hard to believe that he didn’t, given how much information he reveals about what was going on in that department. And why would anyone lie about something so easy to check? Who knows, but UFOlogist Bob Lazar (who said he worked at Area 51 and back-engineered alien spaceships) lied when he said he graduated with degrees in physics from MIT and Caltech when, in fact, he attended neither institution. Lazar’s lie was exposed by UFOlogist Stan Friedman, and the explanation on offer is that “they” erased all traces of Lazar’s academic record.

The film includes several high-profile interviews, among them Secretary of State Marco Rubio.

Another theme in the film, one that almost everyone I’ve ever engaged with on this topic is confused about, is articulated by former CIA Director John Brennan: “I think it’s a bit presumptuous, if not arrogant, for us to believe that there’s no other form of life anywhere in the entire universe.”

Of course, but that is not what any of this is about, or else the filmmakers would have interviewed SETI scientists, who have been listening for ETI signals for decades. The question “are they out there somewhere?” is a different matter entirely than “have they come here?” My provisional answers are “yes” and “no,” although as a good Bayesian I am willing to update my priors and flip my credence from skepticism to belief…with sufficient evidence.

What do the featured experts in this film think the aliens are? Elizondo suggests that they might be “cryptoterrestrial” (whatever that is—never explained) or some “proto-human” that branched off the family tree long ago and is “as natural to this planet as we are.”

“They’ve been operating here for a very long time.” How long? We are not told.

That’s the sanest of the explanations. Hal Puthoff suggests that the UAP aliens might be time travelers, or some ancient civilization hiding here on Earth or on the seabed. Well, they must be hiding exceptionally well, because explorers (and satellites) have covered nearly every square meter of the planet and there is no sign of such an ancient civilization. (Maybe they have a cloaking device, like the one the Starship Enterprise used to monitor primitive civilizations on other planets.) “Whoever it is and wherever they are,” Puthoff concludes, “they’ve been operating here for a very long time.” How long? We are not told.

One segment of the film stands out, and that is the so-called “Legacy Program,” a “crash retrieval program” to “back-engineer” alien spaceships. Now, to be sure, the U.S. government (along with other governments) has such programs to study downed or crashed airplanes, jets, drones, and spacecraft of other nations, because obviously we’d like to know what the other guy is up to technologically, and that, apparently, has been going on since the First World War (“what kind of altimeter are those German biplanes using, anyway?”). But if you Google “Legacy Program,” this is what you find:

Department of Defense Legacy Resource Management Program: This is a real, long-standing government program that funds projects to protect natural and cultural resources on military installations. Its mission is to balance military readiness with environmental stewardship.

According to this site:

The mission of the Legacy Resource Management Program is to provide coordinated, Department-wide, and partnership-based integration of military mission readiness with the conservation of irreplaceable natural and cultural resources.

When pressed to explain this Legacy crash-retrieval program, the Pentagon's All-domain Anomaly Resolution Office (AARO) concluded in a 2024 report that “there is no evidence of such programs, attributing the claims to misidentified real events or circular reporting.”

Why the lacuna? Here is Lue Elizondo’s explanation: “The ‘Legacy Program’ was so secret that it was withheld from the Secretary of Defense, Congress, and even the President of the United States.” And: “We had a choice: keep silent while keeping Americans in the dark, or resign my position in protest and fulfill my obligations to the American people by telling the truth about what I know about UAP.”

Elizondo quit. How noble. It must fill one’s ego with massive pride to know that you have made the greatest discovery in the history of humanity and no one around you has any idea of this monumental event.

Throughout the film, nods are made to UAPs as a “national security threat,” for example: “It could be China. It could be Russia.” Former Director of National Intelligence James Clapper: “any unexplained phenomena could pose a national security threat.” Stratton: “Violation of all nations’ sovereign airspace presents a safety of flight concern for all military and commercial aviation.”

Well, sure it could, but does it in fact? And why include all these admonitions about national security threats from other nations, when none of these people think that is the origin of UAPs? As stated at the beginning, they all think they’re space aliens.

An amusing (and to UFOlogists, irritating) question that skeptics such as me like to ask is, “Why do they keep crashing?” If the aliens are so advanced, so sophisticated, and have engineered anti-gravity propulsion systems that can use relativistic quantum space-time bubbles to jet about the galaxy in the blink of an eye, why can’t they seem to land in New Mexico (and elsewhere) without slamming into the ground?

The film’s experts have a ready-made answer: They’re not crashing at all! These are “gifts” intentionally left for humanity. Or they’re a giant IQ test. Or, as in the film 2001: A Space Odyssey, it’s the aliens’ way of implanting superior intelligence in one species of hominin, namely us.

Why can’t we all see the evidence that the film’s experts have seen with their own eyes? Because it would freak everyone out: the stock market would tank, economies would collapse, governments would fold, and the religious would abandon their faith. That’s what we’re told, anyway, and the filmmakers insist that the coverup is so extensive and powerful that “99.99 percent of all scientists are skeptical.” Perhaps, but could it be that 99.99 percent of scientists simply think like scientists, who demand extraordinary evidence for extraordinary claims?

Then there is the assertion that “they” are silencing people in the know with threats to their jobs, careers, and lives. Elizondo: “Historically, every time a military member had a UAP encounter, it was very quickly swept under the rug and they were discouraged from talking about it.”

Right, so then why are all these military eyewitnesses going on CNN, Fox News, and Joe Rogan to tell millions of people about their UAP encounters? If “they” are so effective at covering up the existence of aliens, how is it that there are thousands of articles and news stories, hundreds of books and documentaries, and endless podcast discussions ongoing, without a single person (that I know of) fired or killed for telling us all what they know about these programs?

Another tell in the film about the lack of actual photographic or video evidence of said alien spaceships (aside from the half dozen UAP videos that have been recycled endlessly for years—TicTac, Go-Fast, Gimbal, etc.) is the inclusion of artistic representations of hovering spaceships over U.S. military bases. If there are any photographs, videos, or security camera footage of any kind available—as surely there must be if these events happened as reported—they were not included.

Example: Vandenberg Air Force base, where Elon Musk’s SpaceX launches its rockets, appears to be a hotbed of alien surveillance. A former employee there says that there are over 60 cameras that record everything that ever happens during a rocket launch. And yet, mysteriously, on October 14, 2003, there was an “incursion” in which there was “a red square object hovering in the air above the launch pad at low altitude, making no noise, it had no obvious signs of propulsion, and it was just hovering silently. It was a security breach of the area. (…) It was massive. The size of a football field, almost rectangular in shape, it was just floating there, no propulsion system, no windows. It was flat black. Then it shot off thousands of miles an hour up the coast.”

Surely the filmmakers managed to wrangle from SpaceX or the base commanders at Vandenberg actual footage of this sighting? Nope. As usual we are left with our (and an artist’s) imagination.

The film wraps up with speculations about how, exactly, these UAPs manage to pull off such feats of propulsion and maneuverability, going into full science-fiction mode with the pantheon of experts speculating about space-warping bubbles in which spaceships can zoom off in an instant because space itself is being warped, so the craft doesn’t need to move through normal space (or ocean). Puthoff: “So time moves differently for people inside the bubble versus people outside the bubble. (…) This could be the key to interstellar travel.” Hopefully Elon and his SpaceX engineers are taking notes.

On this matter I am reminded of the comedian Mitch Hedberg’s riff on why photos of Bigfoot are blurry: “It’s not the photographers, it’s the subject. I think Bigfoot is just fuzzy. You know, I think there’s a large, out-of-focus monster roaming the countryside. Run, he’s fuzzy!”

In UAP circles, life imitates art. Radar signals, we are told in the film by Eric Davis, cannot detect UAPs “because the signal just moves around the bubble and doesn’t reflect back to the radar operator.” Here is Hal Puthoff in full Hedberg mode: “This explains why people who take a photo of a UAP get a fuzzy and distorted picture because they’re actually taking a photo through a spacetime barrier.”

Once you convince yourself that this is all real, it is natural to ask, “What is their energy source?” Continuing in full science-fiction fantasy, Eric Davis calculates that “UAP performance implies the use of 1,100 billion watts of power. This is 100 times the daily electrical utility power generated in the U.S.” Where do the aliens find such energy? “Vacuum energy. Zero-point energy. Quantum entanglement.” The film ends with speculation that when disclosure of this technology comes online it will solve all our energy demands and replace oil, natural gas, and coal.

This is all very entertaining. Who doesn’t love science fiction? But The Age of Disclosure claims to be science fact. The evidence for it remains as elusive as it ever was, as I explained in my $1000 bet on the Long Now Foundation’s Long Bets site that “Discovery or disclosure of alien visitation to Earth in the form of UFOs, UAPs, or any other technological artifact or alien biological form, as confirmed by major scientific institutions, will not happen by December 31, 2030.”

Since posting this, Harvard astronomer Avi Loeb has accepted the bet and we each donated $500 to the Long Now Foundation, the proceeds of the winnings to go to the Galileo Project. I am reasonably confident I will win, but I am hoping to lose because I agree with the experts in The Age of Disclosure that this would indeed be the greatest discovery in the history of humanity.

Categories: Critical Thinking, Skeptic

The Crooked Story Around Thomas Crooks

Thu, 11/20/2025 - 9:15am

The attempt on Donald Trump’s life in Butler, Pennsylvania, remains one of the most consequential security failures in recent political history. It deserves, and still lacks, a full public accounting. For months, legitimate questions have lingered about the background of the gunman, Thomas Matthew Crooks, and the lapses that allowed a 20-year-old with a rifle to reach an unsecured rooftop less than 200 yards from a former President who was then the front-runner for the nation’s top job.

But the leap from unanswered questions to sweeping conspiratorial conclusions is a chasm worth avoiding. In recent days Tucker Carlson has encouraged precisely that leap. Rather than pressing for serious transparency, he has mixed factual gaps with political suspicion to construct a theory of concealed motives and hidden hands. The public deserves better than that, and so does the pursuit of truth.

Let’s start with what remains troubling. Federal investigators initially described Crooks as a quiet, socially isolated young man with a limited online presence. Yet Carlson, in a video posted on X on Friday, November 14, showcased material he claimed came from Crooks’ Google Drive and from social media accounts on YouTube, Snapchat, Quora, and Venmo. The content, he contended, suggested a trajectory of threats and firearms practice inconsistent with the FBI’s portrait.

Tucker Carlson, in a video posted on X on Friday, November 14, showcased material he claimed came from Crooks’ Google Drive and from his social media accounts

The FBI has not publicly explained why these accounts were not part of its early description of Crooks’ digital activity. The haste to cremate the shooter and scrub his apartment, the rapid disappearance of his online postings, and the absence of a detailed biographical narrative have only fueled suspicion about the thoroughness of the FBI’s investigation. Americans can reasonably ask how a major assassination attempt has, to date, generated so little public information about the perpetrator.

As I document in my 1993 book Case Closed, after Lee Harvey Oswald shot President John F. Kennedy, the FBI and the CIA quickly compiled a detailed account of his life, in some cases tracking what he was up to by the day, hour, and even minute in the months and even years leading up to the assassination. All of that was available to the public a year after the assassination, when the Warren Report was published. And yet, over a year after Crooks’ attempted assassination of Donald Trump—and his murder of Corey Comperatore, a volunteer firefighter and former fire chief who was in the audience—we know next to nothing about this shooter. How did he get onto the roof of the adjacent building without anyone noticing? Why did no one in the Secret Service respond to the numerous verbal warnings from spectators at the rally (which can be heard in cell phone footage) that they saw a man with a rifle on the roof? And despite apparently not seeing Crooks on the roof, how did the Secret Service shoot and kill him within seconds of his opening fire on Trump?

A democratic society should not have to rely on private individuals to surface essential details about an attack on a national political figure.

Those questions merit full answers. A democratic society should not have to rely on private individuals to surface essential details about an attack on a national political figure. If intelligence agencies do their job, the country should not need to rely on podcasters for accurate and relevant information about important national events.

But Carlson’s speculation overshoots the available facts. His error is not in raising questions but in constructing a sprawling narrative of deliberate concealment. He suggests the FBI suppressed Crooks’ online footprint and implies a broader conspiracy behind the attack. His confidence in the authenticity of the accounts he identified is not investigative rigor; it is assumption presented as certainty. Even if Carlson’s files are authentic, nothing yet proves the FBI saw them and chose to hide them. It is plausible that Carlson’s source identified material investigators had not verified or did not view as conclusive.

The FBI’s Rapid Response account stated last week that the agency never claimed Crooks had “no online footprint.” FBI Director Kash Patel has emphasized the scope of the inquiry: more than 1,000 interviews; thousands of public tips; data from 13 digital devices; nearly half a million files reviewed; and financial activity across 10 accounts analyzed. Patel maintains investigators found no evidence Crooks worked with anyone or shared his intent.

FBI Director Kash Patel maintains investigators found no evidence Crooks worked with anyone or shared his intent

This does not close the matter. Federal agencies have a long history of releasing information too slowly and too narrowly. But it also does not substantiate Carlson’s suggestion of a suppressed plot or a rogue bureau determined to hide the truth.

The deeper issue is this: By framing unanswered questions as proof of a coordinated deep-state conspiracy, Carlson undermines the very process required to get real answers. He transforms factual uncertainty into political advantage. This style of commentary turns national tragedies into narrative battlegrounds, where ambiguity becomes opportunity.

Prepackaged conspiracy narratives corrode the public’s ability to assess facts when they ultimately emerge. The Kennedy assassination offers a reminder: early opacity, mixed with political distrust, created a vacuum that conspiracy theories quickly filled. The result is an event still debated six decades later, long after credible evidence should have settled the matter. 

Something similar is now taking shape. Gaps in public information about Crooks have fostered speculation. By framing those gaps as evidence of intent in some nebulous deep state plot, Carlson makes it harder for legitimate investigators—in Congress, in the press, and within federal agencies—to do their work without being accused of participating in a cover-up the moment an answer proves incomplete.

Transparency by the FBI is the only way to reassure the public that its conclusions rest on verified evidence.

Americans deserve a clearer record of Crooks’ ideology, his online activity, and his movements before the shooting. Congress should press for more information about the security breakdowns that allowed the attack. The FBI should release as much documentation as possible. Transparency by the FBI is the only way to reassure the public that its conclusions rest on verified evidence, not institutional defensiveness or a coverup for an inadequate investigation.

Transparency does not require accepting Carlson’s conclusions. It requires accepting that the public has a right to know more than it does today and insisting that institutions meet that obligation.

Carlson is right about one thing: the story of Thomas Matthew Crooks is incomplete. But incompleteness is not proof of conspiracy. It is proof that work remains to be done. The path to clarity is careful inquiry, not sensational extrapolation. If the goal is truth rather than clicks, the method matters as much as the questions.

The Butler attack demands answers. It does not demand a conspiracy theory.

Categories: Critical Thinking, Skeptic

Nuremberg: The Film, The Trial, The Verdict

Sat, 11/15/2025 - 10:59am

Hermann Göring is a larger-than-life character who commands a big-screen presence both in real life and in cinematic representations. His latest portrayal, by the estimable Russell Crowe, captures this presence with notable fidelity.

Crowe plays Göring in all his official guises: Reichsmarschall, Luftwaffe chief, President of the Reichstag, Plenipotentiary of the Four-Year Plan, and Hitler’s designated successor (until the Führer revoked that status in the last days of the war and ordered Göring’s arrest for treason). Crowe’s performance is as convincing as his portrayal of Maximus Decimus Meridius in Ridley Scott’s 2000 film Gladiator. (See also Robert Pugh as Göring in the BBC series Nuremberg: Nazis on Trial, which highlights the more jovial side of the Reichsmarschall’s personality.)

Russell Crowe as Hermann Göring. Courtesy of Sony Pictures Classics

The 148-minute PG-13 film, Nuremberg, directed and written by James Vanderbilt, also stars Rami Malek (who played Queen lead singer Freddie Mercury in the biopic Bohemian Rhapsody) as the official prison psychiatrist Dr. Douglas Kelley, and Colin Hanks as Dr. Gustave Gilbert, the official prison psychologist.

Un-historically, Gilbert appears in a tertiary role, with the film focused entirely on the relationship between Göring and Kelley (credited to Jack El-Hai’s book The Nazi and the Psychiatrist). This is an odd pairing because, in fact, Kelley left “after the first month of the trial and was succeeded by Major Leon N. Goldensohn for most of the rest of the trial,” as Dr. Gilbert himself explains in his 1947 book Nuremberg Diary.

In the film, the two shrinks quarrel over who will profit from participating in the trial of the century, then part in anger. But in his Nuremberg Diary acknowledgments, Dr. Gilbert writes:

I am indebted to Colonel B. C. Andrus, prison commandant, and Dr. Douglas M. Kelley, prison psychiatrist for the first two months, for facilitating my assignment to the Nuremberg jail with free access to all the prisoners from the very beginning of my stay there.

Gilbert’s book is a compendium of deep insights into politics, Nazism, war, conflict, personality, free will and moral culpability, and the nature of good and evil. There are also amusing insights like this one from the prison commandant Colonel Andrus: “When Göring came to me at Mondorf, he was a simpering slob with two suitcases full of paracodeine. I thought he was a drug salesman. But we took him off his dope and made a man of him.” I have a hardback first edition that I picked up at a used-book sale at Glendale College in the 1980s, where I was teaching at the time and researching the history of conflict, violence, and war.

My personal copy of the first edition of the “Nuremberg Diary.” Gilbert’s book is a compendium of deep insights into politics, Nazism, war, conflict, personality, free will and moral culpability, and the nature of good and evil

Some notable excerpts from Dr. Gilbert’s book:

In our conversations in his cell, Göring tried to give the impression of a jovial realist who had played for big stakes and lost and was taking it all like a good sport. Any question of guilt was adequately covered by his cynical attitude toward the “justice of the victors.” He had abundant rationalizations for the conduct of the war, his alleged ignorance of the atrocities, the “guilt” of the Allies, and a ready humor which was always calculated to give the impression that such an amiable character could have meant no harm. Nevertheless, he could not conceal a pathological egotism and inability to stand anything but flattery and admiration for his leadership, while freely expressing scorn for other Nazi leaders.

Göring’s ego was on full display over the weekend of March 16-17, 1946, as evidenced in this reflection by Gilbert after visiting the prisoner’s cell:

Göring was very tired from the strain of the past three days’ testimony. His defense being almost completed, he was already moodily brooding over his destiny and speculating on his role in history. Humanitarianism had become a thorn in his side, and he cynically rejected it as a threat to his future greatness. The empire of Genghis Khan, the Roman Empire, and even the British Empire were not built up with due regard for principles of humanity, he expostulated with weary bitterness—but they achieved greatness in their time and have won a respected place in history. I reiterated that the world was becoming a little too sophisticated in the 20th century to regard war and murder as the signs of greatness. He squirmed and scoffed and rejected the idea as the sentimental idealism of an American who could afford such a self-delusion after America had hacked its way to a rich Lebensraum [living space] by revolution, massacre, and war [a reference to America’s own history of extermination of Native Americans, with which Göring was familiar].

When his testimony was over, Göring “made an outright bid for applause for his performance,” asking Gilbert “Well, I didn’t cut a petty figure, did I?” Gilbert penned in his notes: “He couldn’t help admiring himself, and paused for a moment to do so. Yes, he was quite satisfied with his figure in history. ‘Why, I bet even the prosecution had to admit that I did well, didn’t they? Did you hear anything?’” Gilbert concluded of Göring, “The test of his medieval heroism was admiration by the enemy. I shrugged my shoulders.” 

Another un-historical segment in the Nuremberg film has Dr. Kelley visiting Göring’s wife and daughter, Emmy and Edda, when in fact it was Dr. Gilbert who “visited Frau Emmy Göring at the house in the woods of Sackdilling near Neuhaus, to which she had retired with her daughter and niece after her release from custody” (he devotes a chapter in Nuremberg Diary to the visit). (Why filmmakers take such unnecessary license with historical truths is beyond me.) Emmy Göring had plenty to say about how it was “disgraceful to us to see how many Germans are saying they never really supported Hitler, that they were forced into the Party, there is so much hypocrisy, it is sickening!” She raged about Hitler’s order to execute the entire Göring family (“can you imagine that madman ordering that child shot?” she asked, pointing to Edda), then, when her daughter was not present, addressed the topic of atrocities:

She told how she had asked Himmler to let her go and see Auschwitz concentration camp, because she had received so many letters saying that things were not quite as they should be. As the first lady of the land she wanted to be convinced that everything was in order. Himmler wrote a polite letter, but told her not to meddle in things that were no concern of hers. 

As for the unprecedented concept of the Nuremberg trial itself, Göring told Gilbert “I still don’t recognize the authority of the court,” adding:

Anything that happened in our country does not concern you in the least. If 5 million Germans were killed, that is a matter for Germans to settle; and our state policies are our own business.

Clearly the court begged to differ, and to drive the point home, Göring and the others were subjected to George Stevens’ film footage, The Nazi Concentration Camps, including scenes from Nordhausen.

Gilbert took notes as he observed the Nazi leaders reacting to the horrors on the screen:

Fritzsche already looks pale and sits aghast as it starts with scenes of prisoners burned alive in a barn… Keitel wipes brow, takes off headphones… Göring supported his head on his hand and looked tired, then stirred uneasily in the dock and changed position. Funk covers his eyes, looks as if he is in agony, shakes his head… Ribbentrop closes his eyes, looks away… Sauckel mops brow… Frank swallows hard, blinks eyes, trying to stifle tears… Funk now in tears, blows nose, wipes eyes, looks down… Frick shakes head at illustration of “Violent death”—Frank mutters “Horrible!”… Speer looks very sad, swallows hard… Defense attorneys are now muttering, “For God’s sake—terrible.”

In addition to employing the now discredited Rorschach ink-blot test in hopes of evoking deep personality characteristics and hidden motives of his charges (in the film the Nazis see Jews, vaginas, and Jewish vaginas), Gilbert administered the Wechsler Adult Intelligence Scale and found the following results (100 = average; 115 = one standard deviation above average, or better than 84% of test takers; 130 = two standard deviations above average, or better than 98% of test takers; 145 = three standard deviations above average, or better than 99.9% of test takers):

Name                     IQ Score
Arthur Seyss-Inquart        143
Hermann Göring              138
Karl Doenitz                138
Franz von Papen             134
Hans Frank                  130
Joachim von Ribbentrop      129
Wilhelm Keitel              129
Albert Speer                128
Alfred Jodl                 127
Walther Funk                124
Fritz Sauckel               118
Ernst Kaltenbrunner         113
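
The percentile figures cited above follow directly from the normal curve. For readers who want to check the arithmetic, here is a minimal sketch in Python (an illustration of the conversion only, not anything from Gilbert’s book; it assumes the conventional IQ scaling of mean 100 and standard deviation 15):

from math import erf, sqrt

def iq_percentile(iq: float, mean: float = 100.0, sd: float = 15.0) -> float:
    """Fraction of test takers expected to score below `iq`,
    assuming scores are normally distributed (mean 100, SD 15)."""
    z = (iq - mean) / sd                      # standard deviations above the mean
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # normal CDF via the error function

for score in (100, 115, 130, 145, 143):
    print(f"IQ {score}: better than {iq_percentile(score):.1%} of test takers")

Running this reproduces the figures given above (84%, 98%, 99.9%) and shows, for example, that under those assumptions Seyss-Inquart’s 143 would place him above roughly 99.8 percent of test takers.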

As Gilbert concluded:

The IQs show that the Nazi leaders were above average intelligence, merely confirming the fact that the most successful men in any sphere of human activity—whether it is politics, industry, militarism, or crime—are apt to be above average intelligence. It must be borne in mind that the IQ indicates nothing but the mechanical efficiency of the mind, and has nothing to do with character or morals, nor the various other considerations that go into an evaluation of personality. Above all, the individual’s sense of values and the expressions of his basic motivation are the things that truly reveal his character.

Even more insightfully, Gilbert adds:

However, a social movement as far-reaching and catastrophic as that of Nazism cannot be adequately analyzed and understood merely in terms of the individual character traits of its leaders. A realistic psychological approach requires an insight into the total personalities in interaction in their social and historical setting. The Nuremberg Trial afforded an ideal opportunity for such a study.

Here, for example, is one of the most poignant exchanges Gilbert had with Göring about the politics of war:

Göring: Why, of course, the people don't want war. Why would some poor slob on a farm want to risk his life in a war when the best that he can get out of it is to come back to his farm in one piece? Naturally, the common people don't want war; neither in Russia, nor in England, nor in America, nor for that matter in Germany. That is understood. But, after all, it is the leaders of the country who determine the policy and it is always a simple matter to drag the people along, whether it is a democracy, or a fascist dictatorship, or a parliament, or a communist dictatorship.

Gilbert: There is one difference. In a democracy the people have some say in the matter through their elected representatives, and in the United States only Congress can declare wars.

Göring: Oh, that is all well and good, but, voice or no voice, the people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked, and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same way in any country.

That is an accurate description of the world before World War II. Afterward, when the world came to grasp the full extent of the Nazi genocide, an International Law Commission was born out of the Nuremberg trials of German war criminals for their “crimes against humanity,” which it defined as follows:

Murder, extermination, enslavement, deportation and other inhumane acts done against any civilian population, or persecutions on political, racial, or religious grounds, when such acts are done or such persecutions are carried on in execution of or in connection with any crime against peace or any war crime.

This was based on a new set of legal and moral principles (from Nuremberg Trial Proceedings Vol. 1, Charter of the International Military Tribunal), such as Principle I: “Any person who commits an act which constitutes a crime under international law is responsible therefor and liable to punishment.” And Principle II: “The fact that internal law does not impose a penalty for an act which constitutes a crime under international law does not relieve the person who committed the act from responsibility under international law.”

From the start, Supreme Court Justice Robert Jackson insisted on a fair trial for all, noting in his opening statement, “To pass these defendants a poisoned chalice is to put it to our own lips as well.” The Nuremberg trials were one of the greatest contributions to expanding the moral sphere of justice on a global scale, signaling to dictators and demagogues everywhere that the world was watching and would hold them accountable for their actions.

Like most of the Nazis at the Nuremberg trial, Göring claimed he was innocent because he was only following orders. Befehl ist Befehl—orders are orders—is now known as the Nuremberg defense, and it is an excuse that seems particularly feeble for Göring and the other Nazi leaders in the dock. “My boss told me to kill millions of people so—hey—what could I do?” is not a credible defense, but here are their pleas as documented by Gilbert:

Joachim von Ribbentrop, Foreign Minister: “We were all under Hitler’s shadow.”

Ernst Kaltenbrunner, Chief of Heinrich Himmler’s Security Headquarters: “I do not feel guilty of any war crimes. I have only done my duty as an intelligence organ, and I refuse to serve as an ersatz for Himmler.”

Hans Frank, Governor-General of occupied Poland: “I regard this trial as a God-willed world court, destined to examine and put to an end the terrible era of suffering under Adolf Hitler.”

Field Marshal Keitel, Chief of Staff of the High Command of the Wehrmacht: “For a soldier, orders are orders.”

Admiral Doenitz, Grand Admiral of the German Navy and Hitler’s successor: “None of these indictment counts concerns me in the least.”

In the epilogue to Nuremberg Diary, Gilbert records the 21 Nazi defendants’ final statements, including this gem of an exchange between Göring and von Papen:

Göring deserted Hitler and made a grandiose protestation of his own innocence, calling upon God and the German people as witnesses to the fact that he had acted out of pure patriotism. This hypocrisy infuriated von Papen so much that he came over and attacked Göring at lunch, demanding furiously, “Who in the world is responsible for all this destruction if not you?! You were the second man in the State! Is nobody responsible for any of this?” He waved his arms at the ruins of Nuremberg visible through the lunchroom windows. 

Göring folded his arms cockily and smirked into von Papen’s face: “Well, why don’t you take the responsibility then? You were Vice-Chancellor!”

Most of the other defendants acknowledged that there had been horrible crimes committed, but claimed that they had individually acted in good faith according to the standards of their respective positions and professions. The generals had only followed orders; the admirals had done no more than other admirals; the politicians had only worked for the Fatherland; the financiers had only attended to business.

The four counts of the indictment against Göring and the others were (1) Conspiracy to commit crimes alleged in other counts; (2) Crimes against peace; (3) War crimes; (4) Crimes against humanity. 

Göring’s verdict was “GUILTY on all 4 counts. Sentence: Death by hanging.” Here is Gilbert’s summary of Göring’s conviction:

Gilbert’s summary of Göring’s conviction

Unbelievably—given the draconian prisoner-monitoring measures imposed after the head of the German Labour Front, Robert Ley, died by suicide in his cell before the trial even began—Göring cheated the hangman’s noose, biting down on a cyanide capsule that had somehow been smuggled into his cell as the executioners began preparations to carry out the court’s sentence.

To this day no one knows who aided the Reichsmarschall’s escape from justice. The film’s artistic interpretation is that he made it appear by “magic,” thereby tying the ending of the film to the beginning, in which Dr. Kelley makes a coin disappear and reappear as he performs magic for a comely woman on the train ride to Nuremberg in the film’s awkward opening scene. The likeliest explanation is that Göring charmed one of the guards into slipping him a capsule at the last moment; given the personal influence he exerted over millions of ordinary Germans, persuading them to commit extraordinarily evil crimes against humanity, it doesn’t seem fanciful that he cajoled a neophyte guard into helping him evade the noose.

In the end, none of the convicted Nazis were surprised by their sentences, least of all Göring, who admitted to Gilbert that he expected the death penalty from the start, “and was glad that he had not gotten a life sentence, because those who are sentenced to life imprisonment never become martyrs. But there wasn’t any of the old confident bravado in his voice. Göring seems to realize, at last, that there is nothing funny about death, when you’re the one who is going to die.”

Russell Crowe as Hermann Göring. Courtesy of Sony Pictures Classics

Göring’s observation, along with the lessons of the film itself (well worth seeing in a theater on a big screen), tells an important story our society seems all too ready to forget these days: in the end, if justice does not prevail, evil will.

Categories: Critical Thinking, Skeptic
