From best-selling biographer Max Boot comes this revelatory portrait, a decade in the making, of the actor-turned-politician whose telegenic leadership ushered in a transformative conservative era in American politics. Despite his fame as a Hollywood star and television host, Reagan remained a man of profound contradictions, even to those closest to him. Never resorting to either hagiography or hit job, Reagan: His Life and Legend charts his epic journey from Depression-era America to “Morning in America.” Providing fresh insight into “trickle-down economics,” the Cold War’s end, the Iran-Contra affair, and so much more, this definitive account is as compelling a presidential biography as any in recent decades.
Max Boot is a Russia-born naturalized American historian and foreign-policy analyst and a senior fellow for national security studies at the Council on Foreign Relations. He has worked as a writer and editor at the Wall Street Journal, The New York Times, the Los Angeles Times, The Weekly Standard, and the Christian Science Monitor, and is now a regular columnist for the Washington Post. His New York Times bestseller, The Road Not Taken: Edward Lansdale and the American Tragedy in Vietnam, was a finalist for the Pulitzer Prize in Biography. He is also the author of The Savage Wars of Peace: Small Wars and the Rise of American Power, War Made New: Technology, Warfare, and the Course of History: 1500 to Today, Invisible Armies: An Epic History of Guerrilla Warfare from Ancient Times to the Present, and, controversially, of The Corrosion of Conservatism: Why I Left the Right. His new book is Reagan: His Life and Legend.
Shermer and Boot discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
Highlights from 18 years of the Skeptoid podcast.
Since Hamas’s attack on Israel last October 7, the term “settler colonialism” has become central to public debate in the United States. A concept new to most Americans, but already established and influential in academic circles, settler colonialism is shaping the way many people think about the history of the United States, Israel and Palestine, and a host of political issues.
This short book is the first to examine settler colonialism critically for a general readership. By critiquing the most important writers, texts, and ideas in the field, Adam Kirsch shows how the concept emerged in the context of North American and Australian history and how it is being applied to Israel. He examines the sources of its appeal, which, he argues, are spiritual as much as political; how it works to delegitimize nations; and why it has the potential to turn indignation at past injustices into a source of new injustices today. A compact and accessible introduction, rich with historical detail, the book will speak to readers interested in the Middle East, American history, and today’s most urgent cultural-political debates.
Adam Kirsch is the author of several books of poetry and criticism. A 2016 Guggenheim Fellow, Kirsch is an editor at the Wall Street Journal’s Weekend Review section and has written for publications including The New Yorker, Slate, The Times Literary Supplement, The New York Times Book Review, Poetry, and Tablet. He lives in New York. His new book is On Settler Colonialism: Ideology, Violence, and Justice.
Shermer and Kirsch discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
Sixth-century Byzantium was a city divided by race hatred so intense that people viciously attacked each other, not only in the streets but also in churches. The inscription on an ancient tablet conveys the raw animus that sprang from color differences: “Bind them! … Destroy them! … Kill them!” The historian Procopius, who witnessed this race antagonism firsthand, called it a “disease of the soul” and marveled at its irrational intensity:
They fight against their opponents knowing not for what end they imperil themselves … So there grows up in them against their fellow men a hostility which has no cause, and at no time does it cease or disappear, for it gives place, neither to the ties of marriage nor of relationship nor of friendship.1
This hostility sparked multiple violent clashes and riots, culminating in the Nika Riot of 532 CE, the biggest race riot of all time: 30,000 people perished, and the greatest city of antiquity was reduced to smoldering ruins.
But the Nika Riot wasn’t the sort of race riot you might imagine. The race in question was the chariot race. The color division wasn’t between black and white but between blue and green—the colors of the two main chariot-racing teams. The teams’ supporters, who were referred to as the Blue and Green “factions,” proudly wore their team colors, not just in the hippodrome but also around town. To help distinguish themselves, many Blues also sported distinctive mullet hairstyles, like those of 1970s rock stars. Both Blues and Greens were fiercely loyal to their factions and their colors. The chariots and drivers were a secondary concern; the historian Pliny asserted that if the drivers were to swap colors in the middle of a race, the factions would immediately switch their allegiances accordingly.
The race faction rivalry had existed for a long time before the Nika Riot, yet Procopius writes that it had only become bitter and violent in “comparatively recent times.” So, what caused this trivial division over horse-racing teams to turn so deadly? In short, it was the Byzantine version of “identity politics.”
Modern sociological research helps explain the phenomenon. Decades of studies have demonstrated the dangerous power of the human tribal instinct. Surprisingly, it doesn’t require “primordial” ethnic or tribal distinctions to engage that impulse. Minor differences are often sufficient to elicit acute ingroup-outgroup discrimination. The psychologist Henri Tajfel demonstrated this in a landmark series of studies to determine how minor those differences can be. In each successive study, Tajfel divided test subjects into groups according to increasingly trivial criteria, such as whether they preferred Klee or Kandinsky paintings or underestimated or overestimated the number of dots on a page. The results were as intriguing as they were disturbing: even the most trivial groupings induced discrimination.2, 3
However, the most significant and unexpected discovery was that simply telling subjects that they belonged to a group induced discrimination, even when the grouping was completely random. Upon learning they officially belonged to a group, the subjects reflexively adopted an us-versus-them, zero-sum game attitude toward members of other groups. Many other researchers have conducted related experiments with similar results: a government or an authority (like a researcher) designating group distinctions is, by itself, sufficient to spur contentious group rivalry. When group rewards are at stake, that rivalry is magnified and readily turns malign.
The extent to which authority-defined groups and competition for group benefits can foment nasty factionalism was demonstrated in the famous 1954 Robbers Cave experiment, in which researchers brought boys with identical socioeconomic and ethnic backgrounds to a summer camp, dividing them randomly into two official groups. They initially kept the two groups separate and encouraged them to bond through various group activities. The boys, who had not known each other before, developed strong group cohesion and a sense of shared identity. The researchers then pitted the groups against each other in contests for group rewards to see if inter-group hostility would arise. The group antagonism escalated far beyond their expectations. The two groups eventually burned each other’s flags and clothing, trashed each other’s cabins, and collected rocks to hurl at each other. Camp staff had to intervene repeatedly to break up brutal fights. The mounting hostility and risk of violence induced the researchers to abort that phase of the study.4 Other researchers have replicated this experiment: one follow-up study resulted in knife fights, and a researcher was so traumatized he had to be hospitalized for a week.5, 6
How does this apply to the Blues and Greens? As in the Tajfel experiments, the Byzantine race factions had formed a group division based on a trivial distinction—the preference for a color and a horse-racing team. However, for many years, the rivalry remained relatively benign. This was likely because the emperors had long played down the factional distinction and maintained a tradition of race neutrality: if they favored a faction, they avoided openly showing it. That tradition ended a few years before the Nika Riot, when emperors began openly supporting one faction or the other. More importantly, they extended their support outside the hippodrome with official policies that benefited members of their preferred faction. The emperors Marcian, Anastasius, and Justinian adopted official employment preferences, allocating positions to members of their favored faction and blocking the other faction from coveted jobs. To cast it in modern terms, they began a program of “race-based” affirmative action and identity politics.7, 8
Official recognition of the group distinction enhanced the us-versus-them sense of difference between the factions, and the affirmative action scheme turned this sense of difference into bitter antagonism, which eventually exploded in violence. Procopius, our primary contemporary source, placed the blame for the mounting antagonism and the riots squarely on Justinian’s program of identity politics. It had not only promoted an us-versus-them mindset in the factions, it also incited vicious enmity between them, turning a trivial color preference and sporting rivalry into a deadly “race war.”
Considering how identity politics could elicit violence from randomly assembled groups like the Blues and Greens, it is easy to imagine how disastrous identity politics can be when applied to groups that already have some long-standing, historic sense of difference. Indeed, there have been numerous instances of this in history, most ending tragically. For example, Tutsis and Hutus enjoyed centuries of relatively peaceful coexistence in Rwanda up until Belgian colonialists arrived; when the Belgians issued identity cards distinguishing the two groups and instituted affirmative action, it ossified a formerly porous group distinction and infused it with bitter rivalry, paving the way to genocide. Likewise, when Yugoslavia instituted its “nationality key” system, with educational and employment quotas for the country’s constituent ethnic groups, it hardened group distinctions, pitting the groups against each other and setting the stage for genocide in the Balkans. And, when the Sri Lankan government opted for identity politics and affirmative action, it spawned violent conflict and genocide that destroyed a once peaceful and prosperous country. This last example—Sri Lanka—is so illustrative of the dangers of identity politics that we’ll examine it in more detail.
Sri Lanka: How Identity Politics Destroyed Paradise
She is a fabulous isle just south of India’s teeming shore, land of paradise … with a proud and democratic people … Her flag is the flag of freedom, her citizens are dedicated to the preservation of that freedom … Her school system is as progressive as it is democratic. —1954 TWA TOURIST VIDEO
Sri Lanka is an island off India’s southeast coast blessed with copious amounts of arable land and natural resources. It has an ethnically diverse population, with the two main groups being Sinhalese (75 percent) and Tamils (15 percent). Before Sri Lanka’s independence in 1948, there was a long history of harmony between these groups. That history goes back at least to the fourteenth century when the Arab traveler Ibn Battuta observed how the different groups “show respect” for each other and “harbor no suspicions.” On the eve of Sri Lanka’s independence, a British governor lauded the “large measure of fellowship and understanding” that prevailed, and a British soldiers’ guide noted that “there are no historic antagonisms to overcome.” With quiescent communal relations, abundant natural resources, and one of the highest literacy rates in the developing world, newly independent Sri Lanka was poised to flourish and prosper. Nobody doubted it would outperform countries like South Korea and Singapore, with the British governor dubbing it “the best bet in Asia.”
It turned out to be a very poor bet. A few years after Sri Lanka’s independence, violent communal conflict erupted, culminating in a protracted civil war and genocide. By the time it ended, over a million people had been displaced or killed. Sri Lanka’s per capita GDP, which was on par with South Korea’s in 1960, was only one-tenth of it by 2009. As in sixth-century Byzantium, identity politics precipitated the calamity.
Turning a Disparity into a Disaster
At the end of British colonial rule in Sri Lanka, there was significant educational and income disparity between Sinhalese and Tamils. This arose by happenstance rather than because of discriminatory policy. The island’s north, where Tamils predominate, is arid and poor in resources. Because of this, the Tamils devoted their productive energy toward developing human capital, focusing on education and cultivating professional skills. This focus was abetted by American missionaries, who set up schools in the north, providing top-notch English-language education, particularly in math and the physical sciences. As a result, Tamils accounted for an outsized proportion of the better-educated people on the island, particularly in higher-paying fields like engineering and medicine.
Because of the Tamils’ superior education, the British colonial administration hired them disproportionately compared to the Sinhalese. In 1948, for example, Tamils accounted for 40 percent of the clerical workers employed by the colonial government, greatly outstripping their 15 percent share of the overall population. This unequal outcome had nothing to do with overt discrimination against the Sinhalese; it merely reflected the different levels and types of education achieved by the different ethnic groups.
When Sri Lanka gained independence, it passed a constitution that prohibited discrimination based on ethnicity. But a few years after that, an opportunist politician, S.W.R.D. Bandaranaike, figured he could advance his career by cynically appealing to identity politics, stoking Sinhalese envy over the Tamils’ over-representation in higher education and government. He launched a divisive campaign to eliminate the disparity, which spurred the majority Sinhalese to elect him. After his election in 1956, Bandaranaike passed a law that changed the official language from English to Sinhala and consigned students to separate Tamil and Sinhalese education “streams” rather than having them all learn English. As one Sinhalese journalist wrote, this divided Sri Lanka, depriving it of its “link language”:
That began a great divide that has widened over the years. Children now go to segregated schools or study in separate streams in the same school. They don’t get to know other people of their own age group unless they meet them outside.
Beyond eliminating Sri Lanka’s common “link language,” this law also functioned as a de facto affirmative action program for Sinhalese. Tamils, who spoke Tamil at home and received their higher education in English, could not gain Sinhala proficiency quickly enough to meet the government’s requirement. So, many of them lost their jobs to Sinhalese. For example, the percentage of Tamils employed in government administrative services dropped dramatically: from 30 percent in 1956 to five percent in 1970; the percentage in the armed forces dropped from 40 percent to one percent.
As has happened in many other countries, Sri Lanka’s identity politics went hand-in-hand with expanded government. Sinhalese politicians made it clear: government would be the tool to redress perceived ethnic disparities. It would allocate more jobs and resources, and that allocation would be based on ethnicity. As one historian writes: “a growing perception of the state as bestowing public goods selectively began to emerge, challenging previous views and breeding mistrust between ethnic communities.” Tamils responded to this by launching a non-violent resistance campaign. With ethnic dividing lines now clearly drawn, mobs of Sinhalese staged anti-Tamil counter-demonstrations and then riots in which hundreds—mostly Tamils—were killed. The us-versus-them mentality was setting in.
Bandaranaike was eventually assassinated by radicals within his own movement. But his widow, Sirimavo, who was subsequently elected prime minister, resolved to maintain his top priorities—expansive government and identity politics. She nationalized numerous industries and launched development projects that were directed by ethnic and political considerations rather than actual need. She also removed the constitutional ban on ethnic discrimination so that she could aggressively expand affirmative action. The existing policies had already cost so many Tamils their jobs that they were now under-represented in government. However, they remained over-represented in higher education, particularly in the sciences, a disparity that Sirimavo and her political allies resolved to eliminate. In a scheme that American universities like Harvard would later emulate, the Sri Lankan universities began to reject high-scoring Tamil applicants in favor of manifestly less-qualified Sinhalese with vastly lower test scores.
Just like Justinian’s “race” preferences, the Sri Lankan affirmative action program exacerbated us-versus-them attitudes, deepening the group divide and spurring enmity between groups. As one Sri Lankan observed:
Identity was never a question for thousands of years. But now, here, for some reason, it is different … Friends that I grew up with, [messed around] with, got drunk with, now see an essential difference between us just for the fact of their ethnic identity. And there are no obvious differences at all, no matter what they say. I point to pictures in the newspapers and ask them to tell me who is Sinhalese and who is Tamil, and they simply can’t tell the difference. This identity is a fiction, I tell you, but a deadly one.9
The lessons of the various affirmative action programs in Sri Lanka were clear to everyone: individuals’ access to education and government employment would be determined by ethnic group membership rather than individual merit, and political power would determine how much each group got. If you wanted your share, you needed to mobilize as a group and acquire and maintain political power at any cost. The divisive effects of these lessons would be catastrophic.
The realization that they would forever be at the mercy of an ethnic spoils system, along with the violent attacks perpetrated against them, induced the Tamils to form resistance organizations—most notably, the Liberation Tigers of Tamil Eelam (LTTE). The LTTE attacked both Sri Lankan government forces and individual Sinhalese, initiating a deadly spiral of attacks and reprisals, with both sides committing the sort of atrocities that are tragically common in ethnic conflicts: burning people alive, torture, mass killings, and so on. Over the following decades, the conflict continued to fester, periodically escalating into outright civil war. Ultimately, over a million people would be killed or displaced.
The timeline of the Sri Lankan conflict establishes how communal violence originated from identity politics rather than the underlying income and occupational disparity between the groups. That disparity reached its apex at the beginning of the twentieth century. Yet, there was no communal violence at that point or during the next half-century. It was only after the introduction of affirmative action programs that ethnic violence erupted. The deadliest attacks on Tamils occurred an entire decade after those programs had enabled Sinhalese to surpass Tamils in both income and education. As Thomas Sowell observed: “It was not the disparities which led to intergroup violence but the politicizing of those disparities and the promotion of group identity politics.”10
Consequences of Identity Politics in Sri Lanka and Beyond
Sri Lanka’s experience highlights some underappreciated consequences of identity politics. Most notably, one would expect that affirmative action programs would have warmed the feelings of the Sinhalese toward the Tamils. After all, they were receiving preferences for jobs and education at the Tamils’ expense. Yet, precisely the opposite happened: as the affirmative action programs were implemented, Sinhalese animus toward the Tamils progressively worsened. This pattern has been repeated in nearly all the countries where affirmative action has been implemented: affirmative action programs have an invidious effect on the group that benefits, imbuing them with a sense of insecurity and defensiveness over the benefits they receive. That group tends to justify the indefinite continuation of these benefits by claiming that the other group continues to enjoy “privilege”—or by demonizing them and claiming that they are “systemically” advantaged. Thus, the beneficiaries of affirmative action are often the ones to initiate hostilities. In Rwanda, for example, it was Hutu affirmative action beneficiaries who perpetrated the violence, not Tutsis. The situation in Sri Lanka was analogous, with Sinhalese instigating all of the initial riots and pogroms against the Tamils.
One knock-on effect of identity politics in Sri Lanka was that it ultimately benefited some of the wealthiest and most privileged people in the country. The government enacted several affirmative action schemes, each increasingly contrived to benefit well-heeled Sinhalese. The last of these implemented a regional quota system that was devised so that aristocratic Sinhalese living in the Kandy region would compete for spots against poor, undereducated Tamil farm workers. As one Tamil who lost his spot in engineering wrote: “They effectively claimed that the son of a Sinhalese minister in an elite Colombo school was disadvantaged vis-à-vis a Tamil tea plucker’s son.” This follows the pattern of many other affirmative action programs around the world: the greatest beneficiaries are typically the most politically connected (and privileged) individuals within the group receiving affirmative action. They are often wealthier and more privileged than many of the individuals against whom affirmative action is directed. This has been well documented in India, which has extensive data on the subgroups that benefit from its affirmative action programs.
One unexpected consequence of identity politics in Sri Lanka was rampant corruption. When Sri Lanka became independent, its government was widely deemed one of the least corrupt in the developing world. However, as affirmative action programs were implemented and expanded, corruption increased in lockstep. The adoption of affirmative action set a paradigm that pervaded the government: whoever held power could steer government resources to whomever they deemed “underserved.” A baleful side effect of ethnicity-based distortion of government policy is that it undermines and erodes more general standards of government integrity and transparency, legitimating a paradigm of corruption: if it is acceptable to direct policy for the benefit of an ethnic group, is it not also acceptable to do so for the benefit of a clan or an individual? It is a small step to go from one to the other, a step that many Sri Lankan leaders and bureaucrats took. Today, Sri Lanka’s government, which once rivaled European governments in transparency, remains highly corrupt. This pattern has been repeated in other countries. For example, after the Federation of Malaysia expelled Singapore, it adopted an extensive affirmative action program, whereas Singapore prohibited ethnic preferences. Malaysia subsequently experienced proliferating corruption, whereas Singapore is one of the least corrupt countries in the world today.
Perhaps the most profound consequence of identity politics in Sri Lanka was that it ultimately made everybody in the country worse off. After World War II, per capita income in Sri Lanka and Singapore was nearly identical. But after it abandoned its shared “link language” and adopted ethnically divisive policies, Sri Lanka was plagued by violent conflict and economic underperformance; today, one Singaporean earns more than seven Sri Lankans put together. All the group preferences devised to elevate Sinhalese brought down everyone in the country—Tamil, Sinhalese, and all the other groups alike. Lee Kuan Yew, Singapore’s “founding father,” attributed that failure to Sri Lanka’s divisive policies, saying that if Singapore had implemented similar policies, “we would have perished politically and economically.” There are echoes of this in other countries that have implemented identity politics. When I visited Rwanda, I asked Rwandans of various backgrounds whether they thought distinguishing people by race or ethnicity ever helped anyone in their country. There was complete unanimity on this point: after they got over pondering why anyone would ask such a naïve question, they made it very clear that distinguishing people by group made everyone, whether Hutu or Tutsi, distinctly worse off. In the Balkans, I got similar answers from Bosnians, Croatians, Serbians, and Kosovars.
The Perilous Path of Identity Politics
Decades of sociological research and millennia of history have demonstrated that the tribal instinct is both powerful and hardwired into human behavior. As political scientist Harold Isaacs writes:
If anything emerges plainly from our long look at the nature and functioning of basic group identity, it is the fact that the we-they syndrome is built in. It does not merely distinguish, it divides … the normal responses run from … indifference to depreciation, to contempt, to victimization, and, not at all seldom, to slaughter.11
The history of Byzantium and Sri Lanka demonstrates that this tribal instinct is extremely easy to provoke. All it takes is official recognition of group distinctions and some group preferences to balkanize people into bitterly antagonistic groups, and the consequences are potentially dire. Even if a society that is balkanized in this way avoids violent conflict, it is still likely to be plagued by all the concomitants of social fractionalization: higher corruption, lower social trust, and abysmal economic performance.
It is therefore troubling to see the U.S. government and institutions adopt Sri Lankan-style policies that emphasize group distinctions. Echoing Sri Lanka’s separate language “streams,” many American universities now have ethnically segregated orientation and graduation ceremonies. Some offer “theme houses”—dormitories segregated by ethnicity. An Illinois public high school offers separate mathematics classes for Black and Latino students. As the U.S. continues down the perilous path of identity politics, it is unlikely to devolve into another Bosnia or Sri Lanka overnight. But the example of Sri Lanka is a dire warning: a country that was once renowned for its communal harmony quickly descended into violence and economic failure—all because it sought to redress group disparities with identity politics.
Surveys and statistics are now flashing warning signs in the United States. A Gallup poll found that while 70 percent of Black Americans believed that race relations in the United States were either good or very good in 2001, only 33 percent did in 2021.12 Other statistics have shown that hate crimes have been on the rise over that time.13 In the last year, we have also seen the spectacle of angry anti-Israel protesters hammering on the doors of a college hall, terrorizing the Jewish students locked inside, and a Stanford professor telling Jewish students to stand in the corner of a classroom. While identity politics have increasingly directed public policy and institutions, ethnic relations have deteriorated rapidly. This, and a lot of history, suggests it’s time for a different approach.
About the Author
Jens Kurt Heycke was educated in Economics and Near Eastern Studies at the University of Chicago, the London School of Economics, and Princeton University. He worked as an early employee or executive in several successful technology startups. Since retiring from tech, he has worked as a writer and researcher, conducting field research in more than forty countries, from Bosnia to Botswana. He is the author of Out of the Melting Pot, Into the Fire: Multiculturalism in the World’s Past and America’s Future.
References
We’ve all heard the phrase “it’s not brain surgery.” But what exactly is brain surgery? It’s a profession that is barely a hundred years old and profoundly connects two human beings, but few know how it works, or its history. How did early neurosurgeons come to understand the human brain—an extraordinarily complex organ that controls everything we do, and yet at only three pounds is so fragile? And how did this incredibly challenging and lifesaving specialty emerge?
In this warm, rigorous, and deeply insightful book, Dr. Theodore H. Schwartz explores what it’s like to hold the scalpel, wield the drill, extract a tumor, fix a bullet hole, and remove a blood clot—when every second can mean life or death. Drawing from the author’s own cases, plus media, sports, and government archives, this seminal work delves into all the brain-related topics that have long consumed public curiosity, like what really happened to JFK, President Biden’s brain surgery, and the NFL’s management of CTE. Dr. Schwartz also surveys the field’s latest incredible advances and discusses the philosophical questions of the unity of the self and the existence of free will.
A neurosurgeon as well as a professor of neurosurgery at Weill Cornell Medicine, one of the busiest and most highly ranked neurosurgery centers in the world, Dr. Schwartz tells this story like no one else could. Told through anecdote and clear explanation, this is the ultimate cultural and scientific history of a literally mind-blowing human endeavor, one that cuts to the core of who we are.
Theodore Schwartz, MD, is the David and Ursel Barnes Endowed Professor of Minimally Invasive Neurosurgery at Weill Cornell Medicine, one of the busiest and highest-ranked neurosurgery centers in the world. He has published over five hundred scientific articles and chapters on neurosurgery, and has lectured around the world—from Bogotá to Vienna to Mumbai—on new, minimally invasive surgical techniques that he helped develop. He also runs a basic science laboratory devoted to epilepsy research. He studied philosophy and literature at Harvard. His new book is Gray Matters: A Biography of Brain Surgery.
Shermer and Schwartz discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
Fifteen trivia questions from previous aviation-themed episodes of Skeptoid.
Biologist Colin Wright joins the podcast to explore one of today’s most contentious topics: the intersection of biological sex and gender.
Drawing on his expertise in animal behavior and evolutionary biology, Colin breaks down key concepts such as biological sex, gender identity, and gender dysphoria. He also examines the shift in societal definitions of what it means to be a man or woman, and how these evolving perspectives fit with long-standing biological principles.
This session was presented at FreedomFest 2024. To see more speeches and sessions from FreedomFest, visit freedomfest.com/civl.
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
I recently wrote a piece for Skeptic titled “Ranking Presidents: Does It Make Any Sense?”, in which I outlined three reasons why ranking Presidents against one another is a fool’s errand: presentism, the evolving role of the presidency, and the sui generis nature of each presidency.1 The current trend toward the first of these, presentism, becomes especially problematic when applied to entertainment made for previous generations. Viewing and evaluating the culture of the past through a contemporary lens has led to erasing history in at least three relatively recent incidents. This is, I believe, a slippery slope toward censorship and a missed opportunity for valuable lessons about our collective past.
In 1991, Disney released a video version of their 1940 masterpiece Fantasia, describing it as “a meticulously restored version of the original, full-length film.” It wasn’t, though. The version Disney released omitted an original scene in which a Black centaurette named Sunflower is shown shining the shoes of a White centaur.2 Seen today, Sunflower is a patently offensive stereotype.3 Ten years later, Disney released the censored version for the film’s 60th anniversary DVD.4 Disney’s use of racist stereotypes is not limited to Fantasia. In varying degrees, such tropes are seen in Dumbo (1941),5 Peter Pan (1953),6 The Aristocats (1970),7 and Aladdin (1992).8
In 2020, the company (admirably, in my view) took steps toward addressing this controversy by adding disclaimers to its films on its streaming services, noting the “harmful impact” of racist stereotypes. Unlike the quietly censored re-releases of Fantasia, these films remain viewable in their original forms.
This raises the question: If the racism was so apparent, why weren’t these films decried upon their initial release? The answer is that the public at the time simply did not consider them offensive; applying today’s attitudes toward race to them crystallizes the fallacy of presentism.
In 2014, Ruth Wisse, professor emerita of Yiddish and Comparative Literature at Harvard, criticized Fiddler on the Roof (1971) for sacrificing Jewish identity to make the musical more universally appealing.9 The problem with Wisse’s argument is (again) presentism. In the early 1970s, M*A*S*H writers employed rape jokes,10 and America’s most popular sitcom (All in the Family) featured a working-class bigot who used racial slurs for laughs.11 John Lennon released a song titled “Woman Is the (N-word) of the World,”12 and Richard Pryor would use the same racial epithet in an album title three years later.13 Our attitudes toward cultural authenticity and appropriation have evolved since the early 1970s.
In 2020, a 1988 Golden Girls episode called “Mixed Blessings” was pulled from the streaming platform Hulu due to “a scene in which Betty White and Rue McClanahan are mistaken for wearing blackface.”14 In the episode, Dorothy’s (White) son introduces his fiancée, a much older Black woman. Blanche and Rose, who happen to be wearing cosmetic mud masks when they unexpectedly meet the couple, are mortified with embarrassment.
Were Rose and Blanche revisiting the minstrel-show tradition of characterizing Black Americans as lazy, hypersexual thieves, à la Amos ’n’ Andy?15 Of course not. The joke lay in their mutual embarrassment at appearing as if they were in blackface.16 Each Golden Girls actress (Betty White, Bea Arthur, Rue McClanahan, and Estelle Getty) came of age decades before the women’s movement, yet their show was considerably progressive for its time. In its seven-year run, The Golden Girls featured episodes centered on then-controversial topics of racism, sexual harassment, same-sex marriage, age discrimination, homelessness, the death of children, and addiction.17 Perhaps most significantly, a 1990 episode titled “72 Hours” has Rose worried that she may have come into contact with HIV.18 It was only five years prior that President Reagan had first addressed the AIDS crisis, by which time 42,600 people had died from the disease. By 1990, that number had spiked to 310,000, a third of which were deaths occurring that same year.19 When one considers the climate of the times, airing the episode was courageous.
The same year “Mixed Blessings” was removed from Hulu, the actor François Clemmons published Officer Clemmons: A Memoir. Clemmons played “Officer Clemmons” on Mister Rogers’ Neighborhood beginning in the late 1960s, becoming the first African American actor to have a recurring role on a children’s television program.20 In his mostly heartwarming book, Clemmons relates an incident in which Fred Rogers called him into his office. His boss said to him, “Someone has informed us that you were seen at the local gay bar downtown. Now, I want you to know, Franc, that if you’re gay, it doesn’t matter to me at all. Whatever you say and do is fine with me, but if you’re going to be on the show as an important member of the Neighborhood, you can’t be out as gay.”
Was Mr. Rogers homophobic? When Rogers had the conversation with Clemmons, homosexuality was still listed as a disorder in the DSM. It wasn’t until 1974 that it was replaced with “sexual orientation disturbance.”21 In reality, Fred Rogers, a Presbyterian minister, was an LGBTQ ally. He had intentionally hired gay men and women since the 1960s and rebuffed viewers’ efforts to get him to renounce homosexuality.22
In John Hughes’ The Breakfast Club (1985)23 and Jeff Kanew’s Revenge of the Nerds (1984),24 there are scenes of sexual assault on women played for laughs. The renowned film critics Gene Siskel and Roger Ebert both praised each film, and neither noted any discomfort with the now-troubling scenes in his review.25, 26, 27 Why did they fail to do so? Were Siskel and Ebert misogynists willing to overlook scenes of women being sexually assaulted? Of course not. The social mores of the early 1980s were simply not those we share today. Are these scenes excusable? No, but both actresses (Molly Ringwald and Julie Montgomery) have publicly reckoned with the blatant sexism in their roles, and neither has insisted the scenes be omitted.28, 29
In 2022, the UK’s Channel 5 aired the 1961 classic Breakfast at Tiffany’s but bowdlerized the scenes of Mickey Rooney as “Mr. Yunioshi,” an over-the-top yellowface Asian caricature.30 Should Rooney’s role be excised? No. Just as with the racist characters in Disney movies of the 1940s–1990s and the sexual assaults depicted for laughs in 1980s raunchy comedies, the cultural climate of 1961 was different.
Pop culture of the past is just that: of the past. Applying today’s standards to it is at best a fool’s errand and, at worst (as seen in the cases above), a slippery slope toward censorship. Entertainment from yesteryear should be viewed in its entirety and taken in the context of its time.
About the Author
John D. Van Dyke is an academic and science educator. His personal website is vandykerevue.org.
References
On balance, will AI help humanity or harm it? AI could revolutionize science, medicine, and technology, and deliver us a world of abundance and better health. Or it could be a disaster, leading to the downfall of democracy, or even our extinction. In Taming Silicon Valley, Gary Marcus, one of the most trusted voices in AI, explains that we still have a choice. And that the decisions we make now about AI will shape our next century. In this short but powerful manifesto, Marcus explains how Big Tech is taking advantage of us, how AI could make things much worse, and, most importantly, what we can do to safeguard our democracy, our society, and our future.
Marcus explains the potential—and potential risks—of AI in the clearest possible terms and how Big Tech has effectively captured policymakers. He begins by laying out what is lacking in current AI, what the greatest risks of AI are, and how Big Tech has been playing both the public and the government, before digging into why the U.S. government has thus far been ineffective at reining in Big Tech. He then offers real tools for readers, including eight suggestions for what a coherent AI policy should look like—from data rights to layered AI oversight to meaningful tax reform—and closes with how ordinary citizens can push for what is so desperately needed.
Taming Silicon Valley is both a primer on how AI has gotten to its problematic present state and a book of activism in the tradition of Abbie Hoffman’s Steal This Book and Thomas Paine’s Common Sense. It is a deeply important book for our perilous historical moment that every concerned citizen must read.
Gary Marcus is a leading voice in artificial intelligence, well known for his challenges to contemporary AI. He is a scientist and best-selling author and was founder and CEO of Geometric Intelligence, a machine learning company acquired by Uber. A Professor Emeritus at NYU, he is the author of five previous books, including the bestseller Guitar Zero, Kluge (one of The Economist’s eight best books on the brain and consciousness), and Rebooting AI: Building Artificial Intelligence We Can Trust (with Ernest Davis), one of Forbes’s seven must-read books on AI.
“Move fast and break things.” —Mark Zuckerberg, 2012
“We didn’t take a broad enough view of our responsibility.” —Mark Zuckerberg, speaking to the U.S. Senate, 2018
“Generative AI systems have proven themselves again and again to be indifferent to the difference between truth and bullshit. Generative models are, borrowing a phrase from the military, ‘frequently wrong, and never in doubt.’ The Star Trek computer could be counted on to give sound answers to sensible questions; Generative AI is a crapshoot. Worse, it is right often enough to lull us into complacency, even as mistakes invariably slip through; hardly anyone treats it with the skepticism it deserves. Something with the reliability of the Star Trek computer could be world-changing. What we have now is a mess, seductive but unreliable. And too few people are willing to admit that dirty truth.” —Gary Marcus
Shermer and Marcus discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
15 popular myths about sleeping, debunked.
From the impact of the COVID-19 pandemic to the rise of DEI (Diversity, Equity, and Inclusion) initiatives and Artificial Intelligence, in this episode Steven Pinker, Matt Ridley, and Michael Shermer challenge conventional narratives and explore how we can continue to move forward.
They discuss the state of democracy, autocracy, and the lessons learned from historical crises, while offering insights into how innovation, rationality, and education can lead us through challenging times.
This session was presented at FreedomFest 2024. To see more speeches and sessions from FreedomFest, visit freedomfest.com/civl.
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
On January 1, 2024, a skeptic from Malawi named Wonderful Mkhutche shared a video1 of a witch-hunting incident that took place days before on December 28, 2023. In the video, a local mob is shown burying an elderly woman. According to local sources, the woman was accused of causing the death of a family member who had passed away the previous day. These accusations often arise after family members consult local diviners, who claim to be able to identify suspects. In this instance, a local vigilante group abducted the woman. They were in the midst of burying her alive as punishment for allegedly using witchcraft to “kill” a relative when the police intervened and rescued her.
While witch-hunting is largely a thing of the past in the Western world, the persecution of alleged witches continues with tragic consequences in many parts of Africa. Malawi, located in Southeastern Africa, is one such place. Mr. Mkhutche reports that between 300 and 500 individuals accused of witchcraft are attacked and killed every year.
The Malawi Network of Older Persons’ Organizations reported that 15 older women were killed between January and February 2023.2 Local sources suggest that these estimates are likely conservative, as killings related to witchcraft allegations often occur in rural communities and go unreported. Witch-hunting is not limited to Malawi; it also occurs in other African countries. In neighboring Tanzania, for example, an estimated 3,000 people were killed for allegedly practicing witchcraft between 2005 and 2011, and about 60,000 accused witches were murdered between 1960 and 2000.3 Similar abuses occur in Nigeria, Ghana, Kenya, Zambia, Zimbabwe, and South Africa, where those accused of witchcraft face severe mistreatment. They are attacked, banished, or even killed. Some alleged witches are buried alive, lynched, or strangled to death. In Ghana, some makeshift shelters—known as “witch camps”—exist in the northern region. Women accused of witchcraft flee to these places after being banished by their families and communities. Currently, around 1,000 women who fled their communities due to witchcraft accusations live in various witch camps in the region.4
The belief in the power of “evil magic” to harm others, causing illness, accidents, or even death, is deeply ingrained in many regions of Africa. Despite Malawi retaining a colonial-era legal provision that criminalizes accusing someone of practicing witchcraft, this law has not had a significant impact because it is rarely enforced. Instead, many people in Malawi favor criminalizing witchcraft and institutionalizing witch-hunting as a state-sanctioned practice. The majority of Malawians believe in witchcraft and support its criminalization,5 and many argue that the failure of Malawian law to recognize witchcraft as a crime is part of the problem, because it denies the legal system the mechanism to identify or certify witches. Humanists and skeptics in Malawi have actively opposed proposed legislation that recognizes the existence of witchcraft.6 They advocate for retaining the existing legislation and urge the government to enforce, rather than repeal, the provision against accusing someone of practicing witchcraft.
Islam7 and Christianity8 were introduced to Malawi in the 16th and 19th centuries by Arab scholars/jihadists and Western Christian missionaries, respectively. Both coerced the local population into accepting foreign mythologies as superior to traditional beliefs. Today, Malawi is predominantly Christian,9 but there are also Muslims and some remaining practitioners of traditional religions. And while the belief in witchcraft predates Christianity and Islam, religious lines are often blurred, as all the most popular religions contain narratives that sanctify and reinforce some form of belief in witchcraft. As a result, Malawians from various religious backgrounds share a belief in witchcraft.
Witch-hunting also has a significant health aspect, as accusations of witchcraft are often used to explain real health issues. In rural areas where hospitals and health centers are scarce, many individuals lack access to modern medical facilities and cannot afford modern healthcare solutions. Consequently, they turn to local diviners and traditional narratives to understand and cope with ailments, diseases, death, and other misfortunes.10
While witch-hunting occurs in both rural and urban settings, it is more prevalent in rural areas. In urban settings, witch-hunting is mainly observed in slums and overcrowded areas. One contributing factor to witch persecution in rural or impoverished urban zones is the limited presence of state police. Police stations are few and far apart, and the law against witchcraft accusations is rarely enforced11 due to a lack of police officers and inadequate equipment for intervention. Recent incidents in Malawi demonstrate that mob violence, jungle justice, and vigilante killings of alleged witches are common in these communities.
Another significant aspect of witch-hunting is its highly selective nature. Elderly individuals, particularly women, are usually the targets. Why is this the case? Malawi is a patriarchal society where women hold marginalized sociocultural positions. They are vulnerable and easily scapegoated, accused, and persecuted. In many cases, children are the ones driving these accusations. Adult relatives coerce children to “confess” and accuse the elderly of attempting to initiate them into the world of witchcraft. Malawians believe that witches fly around at night in “witchcraft planes” to attend occult meetings in South Africa and other neighboring countries.12
The persistence of witch-hunting in Africa can be attributed to the absence of effective campaigns and measures to eliminate this unfounded and destructive practice. The situation is dire and getting worse. In Ghana, for example, the government plans on shutting down safe spaces for victims, and the president has declined to sign a bill into law that would criminalize witchcraft accusations and the act of witch-hunting.
For this reason, in 2020 I founded Advocacy for Alleged Witches (AfAW) with the aim of combating witch persecution in Africa. Our mission is to put an end to witch-hunting on the continent by 2030.13 AfAW was created to address significant gaps in the fight against witch persecution in Africa. One of our primary goals is to challenge the misrepresentation of African witchcraft perpetuated by Western anthropologists, who have often portrayed witch-hunting as an inherent part of African culture, suggesting that witch persecution serves useful socioeconomic functions. (This perspective arises from a broader issue within modern anthropology, where extreme cultural relativism sometimes leads to an overemphasis on the practices of indigenous peoples, an overcorrection of past trends that belittled those practices.) Some Western scholars tend to present witchcraft in the West as a “wild” phenomenon, and witchcraft in Africa as having domestic value and benefit. The academic literature tends to explain witchcraft accusations and witch persecutions from the viewpoint of the accusers rather than the accused. This approach is problematic and dangerous, as it silences the voices of those accused of witchcraft and diminishes their predicament.
Due to this misrepresentation, Western NGOs that fund initiatives to address abuses linked to witchcraft beliefs have waged a lackluster campaign. They have largely avoided describing witchcraft in Africa as a form of superstition, instead choosing to adopt a patronizing approach to tackling witch-hunting—they often claim to “respect” witchcraft as an aspect of African cultures.14 As a result, NGOs do not treat the issue of witch persecution in Africa with the urgency it deserves.
Likewise, African NGOs and activists have been complicit. Many lack the political will and funding to effectively challenge this harmful practice. In fact, many African NGO actors believe in witchcraft themselves! Witch-hunting persists in the region due to a lack of accurate information, widespread misinformation, and insufficient action. To end witch-hunting, a paradigm shift is needed. The way witchcraft belief and witch-hunting are perceived and addressed must change.
AfAW aims to catalyze this crucial shift and transformation. It operates as a practical and applied form of skepticism, employing the principles of reason and compassion to combat witch-hunting. Through public education and enlightenment efforts, we question and debate witchcraft and ritual beliefs, aiming to dispel the misconceptions far too often used to justify abuses. Our goal is to try to engage African witchcraft believers in thoughtful dialogue, guiding them away from illusions, delusions, and superstitions.
The persistence of abuses linked to witchcraft and ritual beliefs in the region is due to a lack of robust initiatives applying skeptical thinking to the problem. To effectively combat witch persecution, information must be translated into action, and interpretations into tangible policies and interventions. To achieve this, AfAW employs the “informaction” theory of change, combining information dissemination with actionable steps.
At the local level, we focus on bridging the information and action gaps. Accusers are misinformed about the true causes of illnesses, deaths, and misfortunes, often attributing these events to witchcraft due to a lack of accurate information. Many people impute misfortunes to witchcraft because they are unaware of where to seek help or who or what is genuinely responsible for their troubles. This lack of understanding extends to what constitutes valid reasons and causal explanations for their problems.
As part of the efforts to end witch-hunting, we highlight misinformation and disinformation about the true causes of misfortune, illness, death, accidents, poverty, and infertility. This includes debunking the falsehoods that charlatans, con artists, traditional priests, pastors, and holy figures such as mallams and marabouts exploit to manipulate the vulnerable and the ignorant. At AfAW, we provide evidence-based knowledge, explanations, and interpretations of misfortunes.
Our efforts include educating the public on existing laws and mechanisms to address allegations of witchcraft. We conduct sensitization campaigns targeting public institutions such as schools, colleges, and universities. Additionally, we sponsor media programs, issue press releases, engage in social media advocacy, and publish articles aimed at dispelling myths and misinformation related to witch-hunting in the region.
We also facilitate actions and interventions by both state and non-state agencies. In many post-colonial African states, governmental institutions are weak with limited powers and presence. One of our key objectives is to encourage institutional collaboration to enhance efficiency and effectiveness. We petition the police, the courts, and state human rights institutions. Our work prompts these agencies to act, collaborate, and implement appropriate measures to penalize witch-hunting activities in the region.
Additionally, AfAW intervenes to support individual victims of witch persecution based on their specific needs and the resources available. For example, in cases where victims have survived, we relocate them to safe places, assist with their medical treatment, and facilitate their access to justice. In situations where the accused have been killed, we provide support to the victims’ relatives and ensure that the perpetrators are brought to justice.
We get more cases than we can handle. With limited resources, we are unable to intervene in every situation we become aware of. However, in less than four years, our organization has made a significant impact through our interventions in Nigeria and beyond. We are deploying the canon of skeptical rationality to save lives, awaken Africans from their dogmatic and superstitious slumber, and bring about an African Enlightenment.
This is a real culture war, with real consequences, and skepticism is making a real difference.
About the Author
Leo Igwe is a skeptic and the director of Advocacy for Alleged Witches, which aims to end witch-hunting in Africa by 2030. His human rights fieldwork has led to his arrest on several occasions in Nigeria.
References
Join the Skeptoid Flash Mob at CSICon 2024 in Las Vegas. Visit skeptoid.com/store to get your shirts.
Matthew Stewart is an independent philosopher and historian who has written extensively about the philosophical origins of the American republic, the history of philosophy, management theory, and the culture of inequality. His work has appeared in The Atlantic, the Washington Post, the Wall Street Journal, and Harvard Business Review, among other publications. In recent years he has lived in Boston, New York, and Los Angeles, and is currently based in London. He is the author of Nature’s God: The Heretical Origins of the American Republic and An Emancipation of the Mind: Radical Philosophy, the War over Slavery, and the Refounding of America.
Shermer and Stewart discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
Lots of companies sell pheromone products claiming to calm down your dog or cat, but there's a very big problem with that basic claim.
From precognitive dreams and telepathic visions to near-death experiences, UFO encounters, and beyond, so-called impossible phenomena are not supposed to happen. But they do happen—all the time. Jeffrey J. Kripal asserts that the impossible is a function not of reality but of our ever-changing assumptions about what is real. How to Think Impossibly invites us to think about these fantastic (yet commonplace) experiences as an essential part of being human, expressive of a deeply shared reality that is neither mental nor material but gives rise to both. Thinking with specific individuals and their extraordinary experiences in vulnerable, open, and often humorous ways, Kripal interweaves humanistic and scientific inquiry to foster an awareness that the fantastic is real, the supernatural is super natural, and the impossible is possible.
Jeffrey J. Kripal holds the J. Newton Rayzor Chair in Philosophy and Religious Thought at Rice University. He is the author of numerous books, including The Superhumanities: Historical Precedents, Moral Objections, New Realities, The Flip: Epiphanies of Mind and the Future of Knowledge, Authors of the Impossible: The Paranormal and the Sacred, Esalen: America and the Religion of No Religion, Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal, and just published, also by the University of Chicago Press, How to Think Impossibly: About Souls, UFOs, Time, Belief, and Everything Else.
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
During a two-hour interview with Tucker Carlson, Darryl Cooper made sensational claims about the Holocaust and World War II, with Carlson calling him “the best and most honest popular historian in the United States.” In this solo episode, Michael Shermer takes a critical look at the pseudohistory and historical revisionism presented by Cooper on Carlson’s show.
We are three college professors who wish to call attention to a growing problem, namely the erosion of the foundational values of a college education: free inquiry and free speech, rationality and empiricism, civil discussion and debate, and openness to new ideas.
The Rise of Critical Theories
Critical theory is a school of thought that has its roots in Marxist theories of human nature and society. It originally developed in Germany in the 1920s among a group of scholars at the Institute for Social Research. They were attempting to salvage some of the failed ideas of Marxism by extending the theory to embrace non-economic forms of inequality and oppression.
Critical theorists believe that mainstream knowledge is used to promote the interests of the powerful. Unlike traditional social science, which aims to objectively describe human nature and society by carrying out scientific research, critical theory promotes ideological narratives as self-evidently true. Based on their theories about human nature and social justice, critical theorists promote political activism (or “praxis”), and at times even violent revolution, to achieve their goals.
The predecessor to critical theory, Marxism, simplistically divided people into groups labeled as oppressors or oppressed. Marxism’s original group division was economic: the oppressive Bourgeoisie (those who controlled the means of production) and the oppressed Proletariat (the workers). It tried to explain the systemic cause of these group divisions (capitalism), and it developed a set of proposed solutions, including violent revolution, that it presumed would lead to a utopian communist society. These steps, which we will call the “Marxist methodology,” subsequently became part of critical theories that then focused on additional ways of dividing people into categories of oppressors and oppressed. The Marxist methodology follows the steps shown in Table 1.
Many social movements based on critical theories have used this Marxist methodology, as noted in Table 1.
All these ideological movements have restricted free speech, encouraged an “us” versus “them” political tribalism, employed personal ad hominem attacks against opponents, and promoted cancellation campaigns. While it is important to respect diversity and acknowledge historical injustices, we should keep in mind that truly liberal worldviews emphasize our common humanity, which is far less divisive.
What Does “Social Justice” Mean?
The new higher education mantra “social justice” sounds good, but it can in fact refer to either of two often mutually exclusive philosophies: liberal social justice or critical social justice. Though few acknowledge it, social justice is increasingly sold as the former but practiced as the latter. Consider how the two compare in Table 2.
As is evident, liberal social justice and critical social justice employ two very different methods in determining what constitutes social justice.
Language Revisionism
Critical social justice activists often use the “Motte and Bailey strategy” (see Table 3) to make extreme proposals appear moderate. In this gambit a highly defensible “Motte” position is promoted, while successively working toward a more radical “Bailey” position. This gambit is used often in postmodernist discourses. For example, by asserting that morality is socially constructed, the Motte is that our beliefs are socially influenced, and the Bailey is that there is no such thing as morality or truth. Another example:
Here are more examples of the Motte and Bailey strategies with respect to the re-definition of some commonly used words.
What is social justice when re-interpreted from a critical social justice lens?1
Again, the term “social justice” in common language refers to the liberal social justice conceptions of individual rights and responsibilities, equal opportunity, blind justice, equality before the law, etc., as noted above. These ideas evolved from historic common law, the Enlightenment (particularly the Scottish Enlightenment), and U.S. constitutionalism.
However, over recent decades the term social justice has come to be redefined in terms of critical, not liberal, social justice. This re-definition was accomplished surreptitiously through the Motte and Bailey gambit, and it allowed the more radical philosophy of critical theory itself to be introduced onto college campuses while flying under the academic radar. In effect, the term “social justice” has been used as a terminological Trojan horse, smuggling critical social justice and critical theory into the academy under the guise of liberal social justice.
Restrictions on Freedom of Speech and Open Inquiry
This sort of critical social justice activism and indoctrination (as opposed to exposing students to these perspectives while discussing and debating the respective strengths and weaknesses of a range of views) is the opposite of free expression and open inquiry, and thus the antithesis of the foundational values of traditional higher education.
Often, mere attempts to question how, why, or whether X-injustice is happening lead to accusations that the questioner must be a bigoted “X-ist” or “X-phobe.” Critical theorists often dismiss such questioning as defensive rhetoric employed to protect one’s privilege and power. The questioner must therefore be silenced, ostracized, and/or canceled. As documented by the Foundation for Individual Rights and Expression (FIRE), this has in fact happened thousands of times.
Some Examples of Restriction of Speech and Open Inquiry
A series of large-scale empirical studies beginning in the year 2000 found that both students and professors report being afraid to express or explore political and ideological viewpoints that are critical of critical theory.2 Further, campuses have little ideological diversity among faculty and administrators, with a typical 12:1 ratio of liberal/progressive to conservative/libertarian faculty, and many departments and some whole fields lack any conservative or libertarian faculty members. Studies document that many professors freely admit to discriminating against colleagues and students who support liberal, rather than critical, conceptions of social justice. Here are some recent representative examples:
Critical pedagogy is an ideological approach to teaching that attempts to impose political views and activism in the classroom that are consistent with critical theory. It was founded by the Brazilian philosopher and educator Paulo Freire, who promoted it through his 1968 book Pedagogy of the Oppressed. It pressures students to adopt a specific political ideology and rejects dissenting views. Doing so takes time away from developing core academic skills, including critical thinking skills.
At its worst, critical pedagogy can produce an environment where some professors and administrators try to tell students not how to think, but what to think. Professors should not be using the lectern as an activist bully pulpit to push their personal ideological or political beliefs. Since professors are in positions of power relative to their students, such activism in the classroom is unethical and constitutes professional misconduct. Students should not be expected to conform to ideologies or dogmas in the classroom.
It is unfortunate that students will very likely encounter activism on the part of some of their professors and even some fellow students. Those who disagree may at times feel that they should keep their thoughts to themselves. They should not. Speak up!
Spotting Education v. Indoctrination
To be clear, although we do not subscribe to critical theory because of the difficulties with it that we (and many others) have identified, we do not object to a professor teaching or discussing critical theory and critical social justice and presenting his or her opinions about matters based on those perspectives. College is all about exposing students to a range of ideas and opinions. However, professors should not attempt to indoctrinate their students with critical theory or anything else, and they should expose students to a range of perspectives on various issues. Below are a few pointers to help students identify whether a course or a professor is promoting critical theory through indoctrination rather than education.
Courses that educate tend to have:
Whereas courses that indoctrinate tend to:
Campus activism can be covert rather than overt, with professors signaling to students what is acceptable and what is not through their reactions to student comments, the topics they choose to discuss or omit, their grading practices and feedback, how they interact with and treat students who hold different opinions, or even their body language when talking about various topics.
A 2007 American Association of University Professors (AAUP) subcommittee report stated such activist professors present their favored worldview “dogmatically, without allowing students to challenge their validity or advance alternative understandings” and such instructors “insist that students accept as truth propositions that are in fact professionally contestable.” Given that professors are in a position of power over their students, this type of behavior is especially inappropriate. And, as far back as 1915, the AAUP advised that professors “should, in dealing with [controversial] subjects, set forth justly, without suppression or innuendo, the divergent opinions” on the issue. This 1915 advisory is still in effect. Indeed, any failure to do so may constitute an ethical breach. Professors should teach students about different sides of an issue and do so fairly, rather than pretending there is just one permitted viewpoint, as in a Marxist or authoritarian organization or system.
What Should Be Done?
First, if students encounter a professor who they believe is using the classroom to engage in ideological or political activism, they should speak up. That may be less risky than they think. Remember, education should empower students to engage in critical thinking and constructive dialogue. If students encounter concerning situations, they should approach the professor for a respectful discussion. If needed, they can seek guidance from department chairs or administrators who value open inquiry. There usually are some.
Often, however, students cannot rely on their institution’s hierarchy alone; indeed, it may be part of the problem. Moreover, there is safety in numbers. Enlist parents and outside organizations to lobby the college or university to ensure that it is promoting intellectual diversity, open inquiry, and free thought. (A very simple change is to ask that course evaluations include questions on whether students felt free to voice their opinions in class, whether the professor dealt fairly with students holding divergent views, and whether different sides of controversial issues were presented and discussed.)
Today, there are numerous organizations to help students. The most prominent nonpartisan protectors and promoters of free thought are the Foundation for Individual Rights and Expression (FIRE) and Speech First. The important thing to remember is that students are not alone. Aside from those organizations, many others at their institution will be rooting for them, even if those supporters feel they can only do so privately.
Second, know that when confronted with transparency (sometimes supplemented with attorneys), bullies tend to back down.
Third, know that the trials students are facing now can make them stronger, and that they are nothing like those faced by Alexander Solzhenitsyn, Vaclav Havel, Martin Luther King, Jr., Jackie Robinson, James Meredith, and thousands of others who faced suppression for their beliefs or their identity. The worst fate awaiting students would be having to transfer from a school that does not value free thought to one that does. Students have choices. They should make them wisely.
About the Authors
Michael Mills is an evolutionary psychologist at Loyola Marymount University (LMU). He earned his B.A. from UC Santa Cruz and his Ph.D. from UC Santa Barbara. He has served as Chair of the LMU Psychology Department and as Director of its graduate program. He serves on the editorial boards of several academic journals and on the executive board of the Society for Open Inquiry in the Behavioral Sciences (SOIBS).
Robert Maranto is the 21st Century Chair in Leadership in the Department of Education Reform at the University of Arkansas, where he studies bureaucratic reform and edits the Journal of School Choice. He has served on the Fayetteville School Board (2015-20) and currently serves on the executive board of the Society for Open Inquiry in the Behavioral Sciences (SOIBS). With others, he has produced about 100 refereed publications and 17 scholarly books so boring his own mother refused to read them, including President Obama and Education Reform (Palgrave/Macmillan, 2012), Educating Believers: Religion and School Choice (Routledge, 2021), and The Free Inquiry Papers (AEI, 2024). He can be reached at rmaranto@uark.edu.
Richard E. Redding is the Ronald D. Rotunda Distinguished Professor of Jurisprudence and Associate Dean, and Professor of Psychology and Education, at Chapman University. He has written extensively on the importance of viewpoint and sociopolitical diversity in teaching, research, and professional practice. Notable publications include Ideological and Political Bias in Psychology: Nature, Scope, and Solutions (Springer, 2023); Sociopolitical Values as the Deep Culture in Culturally-Competent Psychotherapy (Clinical Psychological Science, 2023); and Sociopolitical Diversity in Psychology: The Case for Pluralism (American Psychologist, 2001). He is the founding President of the Society for Open Inquiry in the Behavioral Sciences (soibs.com).
In his new book Disbelief: The Origins of Atheism in a Religious Species, Will Gervais, PhD, a global leader in the psychological study of atheism, shows that the ubiquity of religious belief and the peculiarities of atheism are connected pieces in the puzzle of human nature. Does God exist? This straightforward question has spawned endless debate, ranging from apologists’ supposed proofs of God’s existence to New Atheist manifestos declaring belief in God a harmful delusion.
It is undeniable that religion is a core feature of human nature. It is also true that our overwhelmingly religious species is as atheistic as it has ever been. Yet no scientific understanding of religion is complete without accounting for those who actively do not believe. In this refreshing and revelatory book, Gervais argues that religion is not one evolutionary puzzle so much as two evolutionary puzzles that can only be solved together. First is the Puzzle of Faith: how Homo sapiens – and Homo sapiens alone – came to be a religious species. Second is the Puzzle of Atheism: how disbelief in gods can exist within our uniquely religious species. The result is a radically cohesive theory of both faith and atheism, showing how we became a uniquely religious species, and why many are now abandoning their belief.
Through a firsthand account of breakthroughs in the scientific study of atheism, including key findings from cognitive science, cultural evolution, and evolutionary psychology, Disbelief forces a rethinking of the prevailing theories of religion and reminds both believers and atheists of the shared psychologies that set them on their distinct religious trajectories. In casual prose and with compelling examples, Gervais explains how we became religious, why we’re leaving faith behind, and how we can get along with others across the religious divides we’ve culturally evolved.
Will Gervais, PhD, is a cultural evolutionary psychologist and has been a global leader in the scientific study of atheism for over a decade. Dr. Gervais’s research has been featured in media such as The New York Times, the Washington Post, National Public Radio, Der Spiegel, Psychology Today, Vox, and Scientific American. His interdisciplinary work, lying at the intersection of cultural evolution, evolutionary psychology, and cognitive science, has garnered international scientific recognition. He was named a Rising Star by the Association for Psychological Science and is the recipient of the Margaret Gorman Early Career Award from the American Psychological Association and the SAGE Young Scholar Award from the Foundation for Personality and Social Psychology.
Gervais and Shermer discuss:
If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.
Is this just another in a long line of legendary lost mines that never produced a speck of gold, or is there more to it this time?