
Why Evolution is True Feed

Why Evolution is True is a blog written by Jerry Coyne, centered on evolution and biology but also dealing with diverse topics like politics, culture, and cats.

More wokeness from the journal Nature, and a response from a reader

Tue, 03/31/2026 - 9:00am

It’s hopeless: Nature, like nearly all prominent science journals, has been colonized by woke craziness.  Perhaps the word “craziness” for the present topic is a bit too strong, but the headline below suggests a degree of unhinged-ness that often comes with virtue-flaunting. And of course this isn’t the first such article in Nature.

Click the screenshot below to read the article, part of a series billed as “profiles [of] scientists with unusual career histories or outside interests”. This scientist, Dr. Anne Poelina, has the unusual habit of naming a river as the first author of her science papers:

An excerpt:

Conservationist Anne Poelina has a deep connection to the fresh water that runs through the dry red-rock landscape of the Kimberley region in Western Australia. Poelina identifies as a Nyikina Warrwa woman, and her people are the Traditional Custodians of the Martuwarra Fitzroy River. The river meanders through the region’s arid land, cutting a path of about 735 kilometres long through steep gorges, savannahs and flood plains before terminating at King Sound, a delta fringed by tidal mangroves by the Indian Ocean.

The Martuwarra Fitzroy River is one of Australia’s last-remaining relatively intact, undammed tropical river systems. For now.

The river faces many threats, for instance, from water use in agricultural irrigation. It’s also at risk from proposed plans to extract natural gas through fracking, or hydraulic fracturing, and to look for rare-earth elements and metals such as vanadium and titanium. Moreover, climate change is predicted to cause extreme floods and droughts.

. . .Poelina is connected to the river through her matrilineal heritage — her mother’s people are the Nyikina First Nation. The Nyikina’s traditional territory, or Country, lies in the river’s watershed, as do those of nine other Indigenous communities. (Country is the term that Aboriginal Australian people use to refer to their ancestral lands, its meaning is similar to the Western concept of nature.)

Poelina explains that “in terms of property rights, the river owns me. So, I have a duty of care and the fiduciary duty to protect this river’s right to life.” Because Poelina works with the river to produce fresh knowledge and assimilate ancient wisdom, she decided to recognize its contributions formally. In 2020, she started including the Martuwarra River of Life as the first author on her publications.

Poelina says, “Country is a first author for Indigenous people in the Northern Territory of Australia. So, I just did it.” Whether the journal to which she submitted her first paper assumed “that the name was human or not, I don’t know”, she adds.

Here’s a list of her papers on Google Scholar, and, sure enough, a few of them—but far from all—have “MRiverofLife” as first author, with “M” standing for “Martuwarra”. Here’s one (click to go to site):

Here’s a description of the river in northwest Australia (it’s called either “Martuwarra” or “Fitzroy”), and here’s a description of its place in local culture, where the river is called a “living ancestral being.”  It’s neither living nor an ancestral being: that is just lore. Still, the indigenous council of “river keepers” consults with the Australian government to keep the river in good shape, and that’s an admirable thing. But making a river a coauthor? Perhaps I should have made my Drosophila flies the first author of my papers, maybe disguised as “Dr. O. Sophila.”

At any rate, reader and professor Jente Ottenburghs (an evolutionary biologist who works on birds) couldn’t take the Nature piece, and wrote me this: “This seems to be another case where a high-profile journal romanticizes indigenous knowledge (similar to the situations in New Zealand and Canada that you covered recently). I also decided to write a blog post about it, partly inspired by the book The Beginning of Infinity by David Deutsch which I am currently reading.”

Sure enough, his blog post is below, and you can access it for free by clicking the screenshot:

Two excerpts. First, on the ubiquity of the “two-eyed seeing” trope and the sacralization of the oppressed (i.e., indigenous people). Note that yes, Australian indigenous people were badly treated by European colonists, but that is not what’s under consideration here.

In recent years, there has been a growing interest in incorporating indigenous knowledge into scientific research. There are indeed nice examples where such knowledge has proven valuable. For instance, a recent study in Oryx combined ethnospecies lists from local communities with scientific datasets to reveal a consistent decline in bird body mass across three continents. Approaches like this study demonstrate that local knowledge can complement scientific inquiry, particularly in data collection and long-term ecological observation.

However, indigenous knowledge is often romanticized, sometimes being portrayed as inherently superior to scientific knowledge. This tendency is partly driven by a legitimate desire to correct historical injustices (such as colonialism and the marginalization of local communities) and to show greater respect for indigenous perspectives. While this shift is clearly necessary and overdue, it should not come at the expense of critical evaluation of indigenous knowledge.

Many elements of indigenous knowledge consist of local myths or context-bound explanations. As such, they are often parochial rather than universal, and therefore do not qualify as good scientific explanations. This does not diminish their cultural, historical, or philosophical value, but it does mean they should not automatically be treated as reliable sources of scientific insight.

Of authorship and the river:

There appears to be growing pressure within academia to signal the recognition of indigenous knowledge, sometimes in ways that blur the distinction between cultural respect and scientific rigor. A striking example appeared in Nature, where conservationist Anne Poelina listed the Martuwarra River of Life as a co-author on her publications.

Poelina explains that “in terms of property rights, the river owns me. So, I have a duty of care and the fiduciary duty to protect this river’s right to life.” Because Poelina works with the river to produce fresh knowledge and assimilate ancient wisdom, she decided to recognize its contributions formally. In 2020, she started including the Martuwarra River of Life as the first author on her publications.

When asked why the river should be listed as first author, she responded: “Because it’s the authority. It’s where I get my authority.” This reasoning stands in direct contrast with the scientific method, which explicitly rejects appeals to authority as a basis for truth. Science operates as a culture of criticism, where ideas must withstand scrutiny regardless of their source. As physicist Richard Feynman famously put it: “If it disagrees with experiment, it’s wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, it doesn’t matter how smart you are, who made the guess, or what his name is … If it disagrees with experiment, it’s wrong. That’s all there is to it.”

Hence, attributing authorship to a river on the grounds of authority is not just unconventional; it reflects a fundamental misunderstanding of how knowledge is evaluated in science.

. . .A similar issue arises in arguments that emphasize the age of indigenous knowledge (or any other knowledge system). Poelina suggests that “if we have the oldest systems of thinking around science and law, shouldn’t the world be listening to what our people have to say?”. But age is not a marker of reliability. As discussed earlier, Greek myths are thousands of years old, but they obviously fail as scientific explanations because they are easily varied and lack universality.

The same principle applies more broadly: all knowledge claims (whether scientific or indigenous) must be evaluated using the same standards. Some elements of indigenous knowledge may indeed prove robust and valuable under scrutiny, while other elements may not. We still need to separate the trustworthy wheat from the superstitious chaff. And the scientific method is the best approach to do just that.

There’s a preliminary section of Ottenburghs’ piece, inspired by his reading of Deutsch, about how science works and how scientific explanations are evaluated, which fed into the post (or riposte) above.  This whole thing may seem trivial, but if we don’t keep calling out the creeping sacralization of indigenous knowledge, and the intrusion into science of myth, storytelling, and superstition, it will become stuck in science like a tick on your leg, with the potential to cause the scientific equivalent of Lyme disease.

Categories: Science

The ten best-selling books in history, and what I’m reading

Tue, 03/31/2026 - 6:15am

I think the site below was suggested to me by Facebook, but at any rate one can subscribe for free. It’s called 1000 Libraries Magazine, and it specializes in news about books, which of course interests me.  Here, for example, is one of their latest articles whose title was catnip for me (click to read; you may have to give them your email and subscribe):

Now of course everybody knows at least one of these: the Bible. But can you guess the others? Some are obvious when you think about it, but others are not. I’ll list the top ten, giving the number of copies each is estimated to have sold. Text from the site is indented. I’ll also tell you if I’ve read them (total read: 8/10).

1.)  The Bible. 5 billion copies sold. 

Sitting firmly at the top, and likely forever unchallenged, is The Bible. With an estimated 5 billion copies sold, it’s the most distributed and translated book in human history.

What makes this even more remarkable is how it spread. Long before modern publishing, social media, or mass literacy. The Bible has been translated into over 3,000 languages, carried across continents by missionaries, scholars, and believers, and printed continuously for centuries.

I read this when I was writing Faith Versus Fact. It was a tedious exercise, and assertions that it’s a great work of literature are bogus. Parts of it are good, yes, but I always say that if there were only one copy of the book, sitting in a dusty “reduced price” bin somewhere, critics would call it boring—which it is.  Try reading how the Ark was constructed near the beginning!  It is considered a great work of literature only because it was influential, not because it was good. However, the King James translators did do a good job on the translation.

2.) The Little Red Book. 1.1 billion copies sold. 

This one surprises many people. Officially titled Quotations from Chairman Mao Zedong, The Little Red Book reached 1.1 billion copies sold, largely during China’s Cultural Revolution.

It wasn’t sold in the traditional sense. It was distributed, required reading, and a political tool. At one point, owning a copy wasn’t optional; it was a social expectation.

I haven’t read it.

3.) The Qur’an.  800 million copies sold. 

As the central religious text of Islam, the Quran has sold an estimated 800 million copies worldwide.

Muslims believe it to be the literal word of God, revealed in Arabic, which is why translations are often considered interpretations rather than replacements. Like the Bible, it’s recited, memorized, studied, and revered, not just read once and shelved.

Yes, I read it, also when writing Faith Versus Fact. It’s not only boring like the Bible, but filled with more animosity, bellicosity, and hatred than you can imagine. I was surprised that so few copies were sold: there are nearly as many Muslims as there are Christians on the planet, but their sacred book has sold less than 20% as much as the Bible.

4.) The Bhagavad Gita. 503 million copies sold. 

Part philosophy, part spiritual guide, part epic dialogue, The Bhagavad Gita has sold over 503 million copies.

Embedded within the Indian epic Mahabharata, this relatively short text explores duty, morality, devotion, and the nature of life itself. It has inspired thinkers from Mahatma Gandhi to modern self-help writers.

Yes, I read this, but simply because it was touted as a work of philosophy and because it had a big influence on India, a country I love. I thought it was definitely worth reading. I have not read the entire Mahabharata.

Robert Oppenheimer certainly read at least the Bhagavad Gita (and in the original Sanskrit!), for he gave a famous quote from it when the atomic bomb was successfully tested in New Mexico. Here’s what he said to NBC in 1965:

“I remembered the line from the Hindu scripture, the Bhagavad-Gita; Vishnu [a principal Hindu deity] is trying to persuade the prince that he should do his duty, and to impress him, takes on his multi-armed form and says, ‘Now I have become death, the destroyer of the worlds.’ I suppose we all thought that, one way or another.”

5.) Don Quixote. 500 million copies sold.

Often called the first modern novel, Don Quixote has galloped its way to 500 million copies sold since its publication in 1605.

Written by Miguel de Cervantes, this satirical tale of a delusional knight tilting at windmills is hilarious, tragic, and surprisingly modern. It pokes fun at idealism while also celebrating imagination, a tricky balance Cervantes somehow nailed, even way back then.

Yep, I’ve read it, and found it good but not great. My bad.

6.) A Tale of Two Cities. 200 million copies sold. 

“It was the best of times, it was the worst of times…” and apparently, it was also one of the most read. Set during the French Revolution, A Tale of Two Cities has sold 200 million copies, making it Charles Dickens’ bestselling novel.

Yes, I read it, but think there are better works by Dickens, like Bleak House or David Copperfield.

7.) The Little Prince.  200 million copies sold. 

The Little Prince has sold 200 million copies and remains one of the most translated works ever written. On the surface, it’s a children’s story. Underneath, it’s a poetic meditation on love, loneliness, and what really matters.

It’s the kind of book people reread at different stages of life, and somehow find something new each time.

Yes, I read it—twice, once when younger and once when I was over 40.  I didn’t find much new the second time, and thought it was sappy. Sue me.

8.) The Book of Mormon. 190 million copies sold.

With 190 million copies sold, The Book of Mormon stands as another major religious text with global reach.

Published in 1830, it forms the foundation of the Church of Jesus Christ of Latter-day Saints. Its distribution has been driven largely by missionary efforts, making it one of the most actively shared books in modern history.

Yep, I read it, again while writing Faith Versus Fact. It’s a straight ripoff of the Bible, confected not by God but by Joseph Smith, who apparently loved the phrase “And it came to pass.” The only parts worth reading are the two “testimonies” at the beginning, with 11 people swearing that they actually saw the golden plates. They were all lying. Here’s the second testimony (you can see the whole book here).  Given the fraudulent way the book came to be, I always question the credibility of Mormons who think it’s true.

9.) The Lord of the Rings. 155 million copies sold.

One epic fantasy, three volumes, and 155 million copies sold.

J.R.R. Tolkien’s Middle-earth saga didn’t just entertain readers; it redefined fantasy as a genre. Elves, hobbits, detailed world-building, invented languages… all roads lead back to The Lord of the Rings.

Of course I’ve read it—who hasn’t?  I watched part of one of the movies, and was not engaged, since I had the scenery and the characters in my mind from reading the book, and the movie didn’t match, though Gollum was good.  The Hobbit is also an essential part of the Tolkien experience. You have to admire Tolkien for creating an entire fantasy world, complete with its own language—all while he was a professor.

10.) The Alchemist. 150 million copies sold.

Rounding out the list is The Alchemist, with 150 million copies sold. It stands as proof that modern books can still join legendary company.

Paulo Coelho’s spiritual fable about following your dreams resonates across cultures and ages. It’s short, simple, and endlessly quotable, a book people gift, recommend, and return to when they’re feeling lost.

This, along with The Little Red Book, is one of the two out of ten that I haven’t read. In fact, I hadn’t even heard of it until now, though it was published in 1988, so I looked it up on Wikipedia. Here’s part of what I read:

The Alchemist (Portuguese: O Alquimista) is a novel by Brazilian author Paulo Coelho which was first published in 1988. Originally written in Portuguese, it became a widely translated international bestseller. The story follows Santiago, a shepherd boy, in his journey across North Africa to the Egyptian pyramids after he dreams of finding treasure there. It has since been translated into more than 65 languages and has sold more than 150 million copies worldwide.  In 2009, Paulo Coelho was recognized by the Guinness World Records as the world’s most translated living author.

. . . The book’s main theme is about finding one’s destiny, although according to The New York Times, The Alchemist is “more self-help than literature”. The advice given to Santiago that “when you really want something to happen, the whole universe will conspire so that your wish comes true” is the core of the novel’s thinking. Coelho originally wrote The Alchemist in only two weeks, explaining later that he was able to work at this pace because the story was “already written in [his] soul.”

The NYT take, archived, is here, where Will Smith, who likes the book, calls it “real metaphysical, esoteric nonsense.”  I don’t think I’ll be reading it: life is too short. But if you have read it, weigh in below. The author must be bloody rich!

I’ve recently finished three books, all recommended by my erstwhile editor at Viking Penguin, who knows her books. I enjoyed them all, and I’m reading another book now in preparation for travel (the last below):

We Don’t Know Ourselves:  A Personal History of Ireland Since 1958, by Fintan O’Toole, published in 2021. I wouldn’t have thought I’d be engrossed by a history of modern Ireland, but this book did the job. O’Toole, a respected Irish journalist and drama critic, decided to recount the modern history of Ireland from the year he was born up to the time of publication, with each chapter encompassing a period of time.  As I said, I really liked the book and learned a ton, especially about the entangled and convoluted history of the Catholic Church and Irish politics during this period.  Even in O’Toole’s youth and young manhood, the Church was enslaving children and unwed pregnant mothers, engaging in financial misdealings with the government, and oppressing the Irish (condoms were legalized only for married people in 1979, and for the unmarried in 1985, while abortion was illegal until just seven years ago).  That the Irish came through all this shows their resilience.

Empire of the Sun, a 1984 novel by the English writer J. G. Ballard. This is a “fictionalized biography” based on Ballard’s experiences as a youth in China, when he was separated from his parents and interned in a Japanese prison camp near Shanghai for some years.  The resourcefulness of Ballard, insofar as his depiction is true, is amazing, and the book engrossing. I gather that it was turned into a very successful 1987 film with a screenplay by Tom Stoppard, directed by Steven Spielberg. You can’t do better than that pair. I must see the movie. However, I found I have a bit of a problem with biography turned into fiction, as I get distracted trying to separate truth from imagination.  I should just let that endeavor go, but it somehow interrupts my reading.

Running in the Family by Michael Ondaatje, published in 1982. Ondaatje wrote the Booker-Prize-winning novel The English Patient, while Running in the Family is a somewhat fictionalized memoir of his youth in Sri Lanka and of two subsequent visits he made there as an adult. It seems to be more truthful than the two books above in terms of recounting what happened, and the characters are surely somewhat accurate, though bizarre. It suffers a bit in talking about only the rich, English-associated people of the country, so one doesn’t learn anything about the Sri Lankans (then “Sinhalese”) themselves. But as a portrait of upper-class “colonialist” life in the country it is colorful and absorbing.

Midnight in the Garden of Good and Evil by John Berendt (1994).  I am visiting Savannah in mid-April with some old friends, and was told to read this book as preparation. It’s another “nonfiction novel,” about which Wikipedia says this:

The book’s plot is based on real-life events that occurred in the 1980s and is classified as non-fiction. Because it reads like a novel (and rearranges the sequence of true events in time), it is sometimes referred to as a “non-fiction novel.”

The characters are unbelievably colorful and eccentric, but they were apparently like that in real life. So far I’ve read about 120 pages and haven’t gotten into the main plot, but already the setting has made me eager to go to a renowned and beautiful city that I’ve never visited.

This of course is also a prompt for readers to let us know what they’ve read lately, and whether they liked it (I get a lot of suggestions from such comments). Your turn.

Categories: Science

Alan Lightman in The Atlantic: Dualism or not?

Mon, 03/30/2026 - 9:00am

Alan Lightman is a physicist best known for his writing about science, most famously his 1992 novel Einstein’s Dreams. At present he’s a “professor of the practice of the humanities at MIT.”

Lightman’s recent article in The Atlantic (click the headline below or find the article archived for free here), while seeming to buy into the magazine’s recent penchant for osculating religion, really is not religious.  It seems mostly to express a kind of spiritual wonder.  But it’s confusing for two reasons.

First, he denies materialism, but later accepts it (see below).

Second, he deals with two forms of dualism: the mind/body dualism addressed by Descartes, and also a dualism created by recent advances in medical technology, in which part of your body is not made of tissue (examples are artificial hearts and mind/electrode interfaces), making people part human, part machine.

After reading the piece, I wasn’t sure what the point was except to mirror Lightman’s wonder at the world and his unanswered questions.

It began when Lightman had a colonoscopy, which got him wondering what was going on inside himself; as he said, “I felt like a trespasser in my own body.” And that got him into the first form of dualism.  All bolding henceforth is mine:

Modern neuroscience has largely overthrown the classical view that the mind and the body are fundamentally different substances, and it has shown that all of our thoughts and mental experiences are rooted in the material brain. But even granting that scientific view, there remains a profound disconnect between our conscious self-awareness—rooted in the three pounds of gooey stuff in our skulls—and the rest of our body.

And here’s the confusing bit, where he denies materialism: he simply has to be more than just the substance of his body. Bolding is mine:

After that unsettling medical adventure, I began mulling over why I was so disturbed to see the insides of my body. A number of issues come to mind. For starters, the experience struck me as a vivid demonstration of my materiality. Even though I am a scientist and have a materialist view of the world, I still harbor the belief that I am more than just a jumble of tissues and nerves. The experience of consciousness and life is so sublime that it is hard to imagine it all arising from mere atoms and molecules. 

This seems like a case of cognitive dissonance, but it’s not clear whether he really believes what’s in bold as opposed to “harboring” that belief. Yes, we don’t know how consciousness works, but what else is there to create it except the stuff of our bodies and brains? For other people, like Ross Douthat, a failure to understand is by default evidence for god, but nobody who knows the history of science would think that.

Lightman then muses for a while about our failure to fully understand our own bodies, but what is a source of puzzlement to him is a challenge to scientists. We have never made progress in understanding nature by assuming that naturalism is wrong, and so the effort to understand consciousness must begin with a naturalistic program—until we find an exception to naturalism!

But later on, Lightman says that he’s really a materialist:

I must again confess that I am a materialist. I respect the belief in an immortal soul. I respect the belief in a nonphysical mind. But, despite my predilection for some transcendent element, I do not share those beliefs. Still, I am baffled by the disconnect I feel between body and mind. I look down at my bare feet and command my toes to wiggle. And they wiggle. But “I” am looking down at them from above. My toes are things that I gaze at from some distance. But what distance? The distance from the camera of my eyes? The distance from my conscious mind, which has these thoughts? And my toes are visible. The inside of my body is even more distant.

Once again his source of wonder is his victimization by an illusion, one described so clearly by Dan Dennett: that there is an “Alan Lightman” sitting somewhere in his brain, a little homunculus that looks down on his toes. Again, he’s baffled, while a biologist would see a challenge. My own view, and I’m no expert, is that the “hard problem of consciousness” will simply devolve to a problem of what brain connections are necessary for the sensation of consciousness, and then we’ll have to say, “And that is all we know.”

Finally, having confessed his bafflement, Lightman goes on to describe some medical advances that truly are amazing, but, like the one below, must surely have a naturalistic explanation:

In 2013, scientists at the California Institute of Technology and the University of Southern California implanted two computer chips in the brain of Erik Sorto, then 32, who was paralyzed from the neck down from a gunshot wound. The output from the chips is connected to a computer, which interprets the patterns of their electrical activity; the computer, in turn, is connected to a robot arm. When Sorto is thirsty and merely thinks about reaching for a cup of water, the computer chips in his brain sense his desire and relay that thought to the computer, and the robot arm grabs a cup of water and brings it to his lips. When I interviewed Sorto in November 2021 and asked him what it felt like to have this machine in his body, he said that he felt mostly human but also part cyborg.

Now that is amazing, especially because, as far as I know, the way it works was not designed from first principles, although some knowledge of neuroscience was surely required (where do you put the chips?). But this surely has a naturalistic explanation, unless you think that god did it or that some fundamental principles of how neurons and muscles work have eluded us.

And that’s pretty much it.  I may have failed to be impressed simply because I’m jaded, and as a scientist I’m used to unsolved problems that to others conjure up spiritual or even non-naturalistic explanations. But still, I wonder why The Atlantic published this.

Categories: Science

Bill Maher’s latest “new rule”: UFOs

Sun, 03/29/2026 - 10:00am

I didn’t realize how popular Bill Maher is on YouTube: this clip, a fragment of his “Real Time” show on HBO, has garnered over 520,000 views since it was posted on Friday night.  At any rate, this week he takes on the recent conspiracy theories that alien spacecraft are appearing regularly, and that some of their ships and even their occupants’ bodies are in the possession of companies or the government.

Maher notes that the UFO believers are no longer nutjobs but reputable people: politicians and “people with buzzcuts.”  Steven Spielberg even has a movie coming out this summer about UFOs; it’s called “Disclosure Day.”

He then addresses the aliens directly, asking them not to kill us even though they could, and apologizes for America’s once firing a missile at a UFO. He even offers minerals to the aliens and then pleads for them to destroy our nuclear weapons, and to “get us off oil–and our phones.” In the end, he tells the aliens “you should think of the human race as Britney Spears. It would be nice if we needed a conservatorship, but in reality we really do.”

It’s clear that Maher, like me, doesn’t believe that these aliens and their craft really exist. He’s just using the notion to criticize what’s wrong with America.  If craft and aliens did exist, and we possess crashed ships and alien bodies, then there has been a massive conspiracy by both Democratic and Republican governments—as well as by the press—to cover up the greatest news story in the history of humanity.

The panel guests for this show were Laura Coates, CNN chief legal analyst, and Stephen A. Smith, host of Straight Shooter.

Here’s a teaser for “Disclosure Day”. There are aliens in animal bodies, and it even buys into the discredited idea that aliens make crop circles.

Categories: Science

How an American college woman turned into a propagandist for terrorism

Sun, 03/29/2026 - 8:45am

The Free Press has a long article on Calla Walsh, a 21-year-old American woman who became radicalized at about 16 and now lives in the Middle East, making propaganda for Hezbollah and Iran.

Here’s a mugshot of Walsh from an earlier arrest in the U.S. for vandalism and trespassing on the grounds of an Israeli-owned company said to be involved with “genocide.”  She served two months in prison.

Merrimack, Massachusetts Police Department, Public domain, via Wikimedia Commons

Here she is all growed up (see link below). I’ve put a recent Instagram post at the bottom.

Walsh was the definition of “privileged” when young:  the scion of two academics from Cambridge, Massachusetts, and one who attended two fancy prep schools before dropping out of Canada’s McGill University after one semester.  She started off advocating for action on climate change and writing about the Democratic Socialists of America for Teen Vogue (thank goodness that rag went belly-up!), and campaigned for the successful re-election of Senator Ed Markey, as well as other Democrats (see an admiring 2021 profile of Calla in Boston Magazine). She then went on to oppose the “genocide” committed by Israel in the Gaza war, visited Cuba, and after that she went full Monty—or should I say full Mamdani? She traveled to Iran and has apparently settled in Beirut.

The Free Press details what happened to her and where she is now. Click below to read; the article cannot be archived.  If you want a shorter account, read her Wikipedia bio, which includes the following:

Walsh’s political ideology has shifted over time. In the 2010s, she was a member and activist in the Democratic Party, as well as an environmental activist. In 2021, NPR described her as a “progressive organizer and activist based in Massachusetts”. Later in 2021, she identified as a Democratic Socialist and a member of the Democratic Socialists of America. In 2023, she described herself as a communist and anti-imperialist.

Following the October 7 attacks, Walsh tweeted that “defending their homeland from illegal occupation and genocide isn’t ‘ugly Palestinian retaliation'”. Later that day, Walsh posted a map of the attack sites with the message, “This is what decolonization looks like”. In April, Walsh added, “We will never forget who called Palestinian freedom fighters ‘terrorists’ after October 7 and then turned around and claimed to support Palestine”. On October 12, Walsh tweeted that those living in the US have “an obligation to take direct action against murderous companies like Elbit”.

For the 2024 presidential election in the United States, Walsh filled in the name of Yahya Sinwar as a write-in candidate. Walsh has expressed support for Tyler Robinson, Elias Rodriguez, and Mohamed Sabry Soliman.

In December 2025, Walsh was nominated by the pro-Israel group StopAntisemitism for “Antisemite of the Year”.

In February and March 2026, following the 2026 Iran conflict, Walsh wrote a series of posts on the X social network calling for the use of global violence against Israeli and American officials. Following the start of the 2026 Iran massacres, Walsh was one of several media personalities who promoted the Iranian state's claim that the protests had been stoked by the CIA and Mossad.

She’s clearly a hater of both Jews and America, and given what she’s done, she’ll never be allowed back in the U.S., or, if she is, she’ll have to spend a long time behind bars.

A couple of long excerpts of a very long article:

In the weeks since the U.S. and Israel launched their joint assault on Iran, perhaps no American has more aggressively and publicly rallied behind the Islamic Republic than Calla Walsh. From her new base in Lebanon, the 21-year-old Cambridge-raised activist has taken to social media and left-wing podcasts to incite her fellow countrymen and women to sabotage U.S. and Israeli defense contractors wherever they can find them. On March 3, she mocked four American soldiers killed in an Iranian drone strike, posting: “They all died fighting for fascism, genocide, pedophilia, and cannibalism.” She attached pictures of the dead Americans. In recent days she reposted a list of missile-production sites inside the U.S.

“We have a duty to escalate,” Walsh told her host on the Psychic Militancy podcast last Saturday from Beirut, noting that “lockdowns” of weapons factories and vandalism alone are “not sufficient at this point.”

She added: “And as the genocide and these wars of aggression continue to escalate, much more is demanded of people in the West.”

Walsh looks every part the art-school hipster, with her thick-rimmed glasses and a mop of curly hair. But she’s a chameleon of terror. Five years earlier, as a 16-year-old, Walsh was fawned over by The New York Times for being a young, social media-savvy activist who was helping to shake up the Democratic Party in Massachusetts. But as a monthslong investigation by The Free Press shows, she’s thrown her allegiance squarely behind the Islamic Republic of Iran and its Axis of Resistance, which includes the Palestinian terrorist group Hamas and Lebanon’s Hezbollah. The U.S. government has placed her on a suspicious persons watch list for her expansive dealings with the governments of Cuba and Iran, U.S. officials told me, as well as a spiderweb of U.S.-designated terrorist groups.

Over the past few years, Walsh’s radicalization has played out in real time on X and Instagram. She quickly moved from political organizing for the Democrats to the Democratic Socialists of America (DSA) to conducting guerrilla-style raids on Israeli defense companies in New England, for which she received jail time in 2024. She has regularly called for the assassinations of Israeli officials and their allies in the U.S. and elsewhere.

In October, she formally relocated to Beirut, The Free Press learned, where she has established herself as a regular contributor to Iranian state media, particularly the English-language site of Tehran’s Press TV on which Washington has imposed sanctions. She is actively engaging in propaganda and information-warfare operations on behalf of the Iranian regime and Hezbollah, which is fighting Israeli forces in south Lebanon.

She's too far gone, I think, to ever return to the U.S., and perhaps she doesn't want to. But those who ally themselves with American enemies in this way are rarely allowed back to where they were born, and when they do return, they are jailed. I suspect that Walsh will be in Lebanon for good. A bit more:

According to U.S. counterterrorism officials I spoke with, any financial or operational ties Walsh has established with blacklisted organizations—whether in Iran, Cuba, or Lebanon—means she could be indicted for providing material support to proscribed groups. Walsh’s latest trip to Tehran places her in even greater legal jeopardy if she ever returns home.

“I’ve never seen someone who’s done jail time so publicly integrate herself into terrorist infrastructure,” a senior national security official told me. “She’s totally exposed now.”

More:

. . . . Walsh’s new role as Tehran’s Gen Z propagandist has her regularly appearing on Iranian state media with other outcasts from Europe and North America. In December, Walsh was a guest on a Press TV show called Palestine Declassified, with two Britons who have been banished from UK politics and academia in recent years for antisemitism. She set up a camera from her Beirut apartment to tape the show and extol the military prowess of the late Iranian general, Qasem Soleimani, whom the U.S. assassinated in a 2020 drone strike.

And this, near the end, shows how ideologically captured she is, so much so that she blames Mossad for massacring Iranian civilians:

In February, when Walsh returned to Iran, she provided an American face to spread the message that it was the CIA and the Israeli spy service, Mossad—not the Iranian government—that massacred thousands of Iranians during the prior month’s uprising. (The United Nation’s special rapporteur on Iran estimated the number could be over 20,000.) “The fact of the matter is police in Iran are defending the revolution and are defending their government from U.S. and Zionist-backed regime change,” she said on Press TV.

. . Experts in extremism tell me they’re deeply worried that the American is now too far gone. In recent weeks, she’s crisscrossed the Middle East in support of the Axis of Resistance, placing her in the range of American and Israeli bombs and fighter jets, whether in Tehran or south Lebanon.

“She’s a true believer. It’s over after that, because you’ll do anything for your cause,” said Mubin Shaikh. He was a jihadist in Pakistan and Syria before going through a deradicalization program in Canada and emerging as one of his country’s top counterterrorism experts. “Martyrdom? Don’t think that’s off the table.”

You can imagine how discombobulated her parents are. They are quoted as saying, "We love Calla deeply and absolutely. . . . And we have serious, fundamental political disagreements with her."

Here’s a recent Instagram post. Click screenshot to see a video interview from presstvchannel, an Iranian-controlled state media site. Note that she uses “anti-Zionist” instead of “antisemitic”.  This euphemistic ploy disgusts me.

Categories: Science

We have wood ducks!

Sun, 03/29/2026 - 7:30am

Botany Pond now harbors (temporarily, I think) a pair of the most beautiful American ducks:  wood ducks (Aix sponsa), in the same genus as mallards. At first I mistook them for mallards on the duckcam, but when I went down to investigate, it was clear that they were a bonded pair of woodies.

I love these ducks, but one or two show up at Botany Pond only every couple of years, and they do not breed here. We had a pair, Frisky and Ruth, a few years ago; although they hung around a while, they had already bred and were probably headed south. My photos from that era have disappeared from this site, but here is Frisky nuzzling Ruth. It's one of my favorite duck photos (I like to imagine that wood ducks are very romantic!):

Frisky was so named because although the mallards chased him, he was very quick and adept at sneaking among them at feeding time to get pellets. After he filled his belly, he’d get quite rotund and then perch on a knob of the bald cypress that used to be in the pond. He used that knob so often we called it The Sacred Knob.  Here he is having a postprandial rest. Look at those colors!

Males have satanic red eyes. Here’s a closeup of Frisky’s head. Their bills are short compared to those of mallards.

They're called "wood ducks" because they nest in treeholes and perch on trees—nearly the only species of duck to do so. Sure enough, when I first saw them a few days ago, they were both up in trees next to the pond. I thought they had left, but they were back three days ago and haven't departed since.

Some photos of our new pair.  Feel free to suggest names, but they should be fitting for these glorious birds.

The new male:

. . . and the new female. The shots aren't great, as I took them in the early morning when the light was dim and the shutter speed was slow:

Wikipedia describes them like this:

The adult male has stunning multicolored iridescent plumage and red eyes, with a distinctive white flare down the neck. The female, less colorful, has a white eye-ring and a whitish throat. Both adults have crested heads. The speculum is iridescent blue-green with a white border on the trailing edge.

Besotted with each other, the ducks are always together. Here are two videos of our new pair swimming together:

In this next video, the male gives her a little kiss 6 seconds in. He then chirps at her (they don’t quack).

The loving couple. Look at that sexual dimorphism!  These ducks are in full breeding plumage:

Vashti is nesting nearby, and Armon is always in the pond waiting for her to drop in for a quick snack, a drink, and a preen before she hurries back to her nest.  There are seven lovely green eggs in Vashti’s nest, and I anticipate ducklings will hatch around April 20 (a bit early in the season) if all goes well.

Armon chases the woodies, but only in a desultory manner, and they manage to sneak some of the food I give him.  At other times he allows them to rest next to him on the rocks.

Here's Armon halfheartedly chasing the female. He never gets near either of them, as they swim faster than he does, and they can simply jump out of the pond when they're tired of being chased.

Ducks on the rocks (a good name for a drink). You can see that Armon doesn’t mind them being nearby so long as it’s not feeding time. The size difference between mallards and woodies is clear:

The male is like a feathered jewel!  Here is the range of Aix sponsa from Wikipedia. As you see, Chicago is in their year-round range.

Cephas, CC BY-SA 4.0, via Wikimedia Commons

You might be able to see them on the DuckCam if you look now, but keep looking from time to time.  In the meantime, feel free to suggest woody names, and get ready for mallard ducklings in a few weeks.

I think the woodies will soon depart to breed in a place where there are trees with holes.

I’ll add a Smithsonian video of a large clutch of hatched woodies jumping about 50 feet down to the water as mother calls them:

Categories: Science

Readers’ wildlife photos

Sun, 03/29/2026 - 6:15am

Abby Thompson of UC Davis has sent in some pictures of California tidepool organisms, as well as a video. Abby’s captions are indented and you can enlarge her photos by clicking on them.

Late January-early March tidepools, plus an octopus.

Bryozoans:

Lepas anatifera (pelagic gooseneck barnacle).   Usually found clinging to something drifting around in the open ocean (the “pelagic” part of their name), these were on a large log washed up on shore:

Intertidal zones, illustrated. A well-placed vertical rock face, like this one, exhibits the idea of the different intertidal "zones," each of which has its own specific collection of inhabitants. You can see mussels and barnacles clustered at the top (in the "high intertidal"), exposed to the air as soon as the tide goes out even a little. There are smaller colonial anemones next, beneath them the orange and purple ochre stars, and below those, arriving at the low intertidal level, some giant green anemones. If you peer into the water under the open giant green anemone, you'll see a crab, probably a rock crab. There's some back and forth (there are a few giant green anemones pretty high up in this photo), but the general idea holds.

This reflects each animal's differing tolerance for specific conditions: time out of the water as the tide goes out, harshness of wave action, etc. The nudibranchs (next few pictures) are usually in the very low intertidal:

Orienthella piunca (nudibranch):

Hermissenda opalescens (nudibranch):

Doto amyra (nudibranch). Visible through the translucent skin on its back are lobules of the "ovotestis" (thanks, iNaturalist expert!). From Google AI: "Ovotestes in nudibranchs are specialized, hermaphroditic reproductive glands that produce both male (sperm) and female (oocytes/eggs) gametes simultaneously":

More eggs, this time from a snail in the genus Amphissa. I like the pointy egg casings, like wizards’ hats:

And here's an adult of the genus, almost certainly Amphissa versicolor, though it's an unusual color (they're usually shades of orange or brown/tan):

In honor of Ghost the octopus, and also because I've finally figured out how to include videos, below is a clip from 2021 of an East Pacific red octopus (Octopus rubescens) cruising around the rocks (out of the water!) at low tide. I've seen one only twice, probably because they're too cleverly camouflaged (possibly just too clever) for me to spot. This guy was about the size of a human hand, a miniature compared to the 50-pound Ghost.

Point Reyes peninsula at sunset:

Camera: Olympus TG-7.

Categories: Science

Caturday felid trifecat: Library accepts cat photos in lieu of late fees; the history of Downing Street cats; lost cat reunites with family after five years; and lagniappe

Sat, 03/28/2026 - 9:00am

Yes, I know I’ve misspelled “trifecta” in the title, but that’s deliberate. Today we have the usual three items for Caturday, plus two lagniappes.

First, a heartening story from 100 Libraries Magazine. Click to read.

The Worcester Public Library in Massachusetts is, well, let them announce it:

Photo courtesy of the Worcester Public Library

From the text:

Embracing the spirit of the program, library patrons are encouraged to swap traditional monetary payments for a simple yet delightful alternative—a photo or drawing of a cat. Executive Director Jason Homer extends a warm invitation to all, regardless of feline companionship status.

“Even if you don’t have a cat in your life, you can still draw one,” Mr. Homer expressed to NBC Boston, highlighting the inclusive nature of the initiative. From domestic cats to majestic big cats like tigers and lions, all submissions are welcome with open arms.

Recognizing that accidents happen and fees can sometimes hinder access to library resources, the library hopes to extend a gesture of goodwill and understanding through this unique initiative.

Why March Meowness?

The inspiration behind “March Meowness” stems from a noticeable increase in overdue fines among young patrons, a trend exacerbated by the challenges posed by the pandemic. However, what started as a simple solution to address financial barriers quickly blossomed into something much more profound. Within just the first five days of its launch, the program had already cleared over 400 accounts, underscoring its immediate impact and resonance within the community.

In essence, “March Meowness” transcends the realm of library fines, serving as a testament to the power of creativity, compassion, and community spirit. Through a simple yet meaningful gesture, the Worcester Public Library not only alleviates financial burdens but also reaffirms its role as a cornerstone of the community—a place where everyone is welcome, regardless of their ability to pay. As patrons flock to share their beloved feline companions or imaginative renditions thereof, the library continues to serve as a beacon of hope, connection, and joy in uncertain times.

And here, from TikTok, is a wall o’cats showing how many people donated photos in lieu of fines (sound up):

@worcesterpublib

It was a valiant effort by our staff to try to get a many cat photos as possible up on our cat walls at several of our locations. We ran out of time and staff, but did our best! We hope you enjoyed this a much as we did! #catsoftiktok #catstagram #cats #librarytok #librarytiktok #library #worcesterma #mywpl #MarchMeowness #viral #fyp #fypシ #fypage #foryou #foryoupage #foryourpage

♬ Come Check This (Quickie Edit) – FETISH

And from the NBC Evening News, where we learn that photos have been sent to the library from all over the world. The response would not be nearly as awesome if they wanted dog pictures, because cats rule the internet.

********************

In honor of Larry the Cat's 15th birthday and 5 years of service as Chief Mouser to the Cabinet Office, Mark Felton Productions has put out a lovely 11-minute video detailing the history of Downing Street cats, beginning on June 3, 1929, when the government authorized the position. Rufus was the first Downing Street cat, replaced by Peter, and so on (one was also called "The Munich Mouser," who was staffed by Neville "Peace in Our Time" Chamberlain).

It’s a great video, so watch.

Here's a timeline of all the 10 Downing Street cats (via Wikipedia), and the party of the PMs who staffed them. Click to enlarge.

********************

From the UPI’s “odd news” we have the story of a lost cat who came home after five years:

 

From the text:

A Maryland family whose pet cat went missing five years ago were reunited with the pet when she turned up in a stranger’s basement.

Melissa and Brooke Garci said Artemis, their indoor/outdoor cat, wandered away from home in Hickory in September 2021 and never came back.

The family canvassed the neighborhood and put up flyers, but there was no trace of Artemis.

The Harford County Humane Society said a woman came to the facility in February with a cat she found in her unfinished basement.

The shelter scanned the feline for a microchip, which identified her as Artemis.

“I was like, I couldn’t cry yet because I was in disbelief, but when I saw her, I was like, ‘OMG, she looks exactly the same,'” Brooke Garci told CBS Baltimore. “It was a beautiful thing.”

The humane society said the reunion was a reminder of the importance of having pets microchipped.

Always get your cat chipped, even if it’s an indoor cat. They can escape, you know, as Artemis did.  40% of lost cats with chips are returned to their owners. (Be sure your chip is updated if you move or change your phone number.)

Here’s a video of the grand reunion (warning, there are d*gs as well):

********************

Lagniappe: Two cat videos from The Good News girl:

Kiki the School Cat:

Categories: Science

Readers' wildlife photos

Sat, 03/28/2026 - 6:50am

Today Athayde Tonhasca Júnior is back with one of his patented text-and-photo posts, which have always been very informative. Today he talks about palms and their pollinators in one area of Brazil.  Athayde’s captions are indented, and you can enlarge his photos by clicking on them.

Beneficial saboteurs

It's approaching midday somewhere in the caatinga, northeastern Brazil's hinterland, and the temperature will soon hit the 40 °C mark. All is quiet, as most animals are sensibly sheltering from the sizzling sun. The vegetation looks dead and stunted, but it is in fact quiescent, in a state of dormancy that helps plants endure the heat and drought until the rainy season arrives.

Fig.1.  The caatinga vegetation in northeastern Brazil looks dead during dry season, but palm trees are green year round:

One palm tree, however, known locally as licuri (Syagrus coronata), doesn’t seem bothered by the harsh climate; it is verdant and in full bloom. The plant is monoecious, that is, it produces separate male and female flowers in the same individual. Male flowers grow at the end of large (~90 cm long) inflorescences, while the female flowers are at the base. Anthesis (the stage at which a flower is open and functional) is asynchronous: male flowers open first, releasing pollen and scent for 7 to 10 days. These flowers then shrivel and fall off. In about two weeks, it’s the female flowers’ turn; they are open for 10 to 15 days. Plants also bloom asynchronously, so at any given time of the year there are licuri flowers.

Fig.2. Licuri inflorescences © Drumond, 2007:

These flowery details may seem like too much information, but they are important for understanding the plant’s relationship with one of its most important flower visitors, the weevil Anchylorhynchus trapezicollis.

Like the overwhelming majority of the ~83,000 known species of weevil (family Curculionidae), A. trapezicollis feeds on plant tissues. Attracted by the scent of male flowers, a beetle uses its big schnozzle (in fact its rostrum, the snout-like projection from the head) to pry flowers open and take their pollen. While feeding, the beetle ends up with pollen grains attached to its body. As male flowers open at different times, there isn't much food to be consumed in one sitting. The beetle is then encouraged to move to another plant, taking with it pollen that will result in cross-pollination if the insect lands on a receptive female flower.

Fig.3. An A. trapezicollis in action on a licuri flower © Bruno de Medeiros, iNaturalist.Lu:

After feeding, a female beetle looks for female flowers in which to lay her eggs between the petals and sepals. The resulting larvae are cannibals: one larva will eat any competitor in the same flower. As they grow older, the little darlings shift their attention to developing fruits, which are aborted and fall off. Because it destroys forming fruits to complete its life cycle, A. trapezicollis is a seed predator. But for the cost of a portion of its fruits, the licuri palm is pollinated. This form of mutualism is known as brood-site pollination or nursery pollination, a trade-off association that has evolved between yuccas and yucca moths, figs and fig wasps, and several other plant-insect partnerships.

Fig. 4. The licuri‘s trunk ends in a distinct crown of slightly arched leaves, a feature that inspired its specific epithet coronata (crowned) © Kelen P. Soares, Flora e Funga do Brasil:

Other weevils and bees also pollinate licuri, but A. trapezicollis seems to be the most important agent (Medeiros et al., 2019). This tight relationship has profound ecological consequences.

It is said that everything from a pig can be used except the oink, but licuri is not far behind in relation to its usefulness to humans. Its apical meristem (palm heart) is edible; the leaves are the source of a high quality wax, building materials, hats, baskets, sleeping mats and other handicrafts; ground-up leaves are fed to livestock in times of food scarcity; the tasty seeds (endosperm or nuts) are eaten raw or roasted, or added to confectionery and local dishes; oil extracted from seeds is used for lighting and the manufacture of soap, perfumes and other products.

Fig.5. The greenish pulp (mesocarp), brown hard shell (endocarp) and the nutritious white nut (kernel) of a licuri fruit © B. Phalan, Wikimedia Commons:

Humans are not the only creatures to benefit from licuri: many animals take the wholesome fruits. Among them is the Lear's macaw (Anodorhynchus leari), an endemic and endangered species for which licuri nuts represent the bulk of its nutrition.

Fig.6. Lear’s macaws, big fans of licuri nuts © João Quental, Wikimedia Commons:

There you have it: a palm tree of unordinary value, from people’s welfare and economy to endangered macaws and wildlife in general, is greatly dependent on pollination provided by unassuming weevils. And this is not an isolated case. More than 200 palm species (family Arecaceae) are pollinated by weevils, and so are many other plants from different lineages (Haran et al., 2023). The ‘million dollar weevil’ (Elaeidobius kamerunicus) illustrates well the relevance of these insects as pollinators. This beetle was introduced from Africa to Asia to help improve pollination of cultivated African oil palm (Elaeis guineensis), resulting in considerable increase in yields.

When we think of pollinators, bees, flies and moths are most likely to come to mind, as they contribute to the reproduction of crops and wildflowers familiar to us. Adding weevils to this select club may sound peculiar: after all, many weevils are pests capable of inflicting enormous damage on cultivated plants, trees and stored products (you may have had your pantry invaded by weevils). But that would be a parochial view. For millions of people in tropical and subtropical regions, palm trees are more than props in holiday brochures: they are crucial for wildlife food chains, human nutrition, building materials and commodities such as medicines, industrial products and fibre. A great deal of these benefits depends on a range of poorly known, frequently dismissed and often vilified weevils.

Fig.7. Six species of weevil known to be involved in brood-site pollination © Haran et al., 2023:

 

References

Drumond, M.A. 2007. Documentos, 199. Embrapa Semi-Árido.

Haran, J. et al. 2023. Most diverse, most neglected: weevils (Coleoptera: Curculionoidea) are ubiquitous specialized brood-site pollinators of tropical flora. Peer Community Journal 3: e49.

Medeiros, B.A.S. et al. 2019. Flower visitors of the licuri palm (Syagrus coronata): brood pollinators coexist with a diverse community of antagonists and mutualists. Biological Journal of the Linnean Society 126: 666-687.

 

Categories: Science

Indigenous “ways of knowing” invade Canadian science classes

Fri, 03/27/2026 - 9:00am

I've spent a lot of time (and pushed many electrons) going after the fallacy, prevalent in New Zealand, that indigenous "ways of knowing"—in this case those of the Māori—are just as valid as so-called "Western ways of knowing," which is what Kiwi progressives call "science." You can see my pieces here; there are many.

This sacralization of the oppressed, whereby the beliefs of minorities are given extra credibility, has now spread to Canada, a pretty woke place.  Lawrence Krauss, who now lives in British Columbia, was astonished and depressed to find indigenous (Native American) superstitions treated as science in the secondary-school curriculum.

You can read his lament by clicking the screenshot below, or find the article archived here.

Quotes from Krauss’s piece are indented, and my comments are flush left. This battle apparently needs to be fought in every country where science, which is not “Western” but worldwide, has been diluted via the efforts of “progressives” who think they’re doing a good thing. They’re not: they are impeding the education of kids by conflating superstitions and established science.

Check out the links in the first paragraph:

I now live in British Columbia (B.C.). A colleague recently forwarded me the current B.C. high school science curriculum for grades nine and twelve. It includes an embarrassing amalgam of religious gobbledygook and anti-science rhetoric. It is an insult to school children in B.C. and does a disservice to the students of the province at a time when understanding the nature and process of science is becoming increasingly important to their competitive prospects in a world dominated by technology.

You may wonder how religious fundamentalism could so effectively creep into the curriculum in a progressive place like British Columbia. The answer is simple. The religious nonsense being inserted into the curriculum has nothing to do with Christian fundamentalism; rather, it is Indigenous religious nonsense. And in the current climate, Indigenous “knowledge” is held to a different standard from scientific knowledge—or, rather, to no standard at all.

. . . In the B.C. science curriculum for grade nine, this agenda is explicit. Students are expected to: “Apply First Peoples’ perspectives and knowledge, other ways of knowing, and local knowledge as sources of information.” “Ways of knowing” are defined as “the various beliefs about the nature of knowledge that people have; they can include, but are not limited to, Aboriginal, gender-related, subject/discipline specific, cultural, embodied and intuitive beliefs about knowledge.”

Here's one example of how indigenous "knowledge" dilutes science with superstition. Like me and many others, Krauss has no problem with teaching this stuff as "social science or history", but bridles at equating it with science:

For example, lesson three of the “BC Grade 9 Student Notes and Problems Workbook,” contains a section entitled “The Universe: Aboriginal Perspectives.” Over the course of two pages, the creation myths of various aboriginal peoples are described in detail, as “beautifully descriptive legends depicting the relationship between Earth and various celestial bodies.” Such subjects as the creation of the universe by a raven; the presence of water everywhere on Earth except on Vancouver Island; the eternal efforts of the Moon to get some of that water to drink; how and why a divine son and daughter team set out to make the Sun traverse the sky, while ensuring that it seems to stop in the middle of the day; how one of the jealous siblings turned into the Moon; how lunar eclipses occur when the spirit of Ling Cod tries to swallow the Moon; how one constellation of stars is the remnants of a giant bird that flew up from Earth; and how the celestial raven eventually released the Moon, stars, and Sun from boxes, in that order. These are quaint myths, and one can imagine how a reasonable science book might describe how we overcame these prehistoric notions to arrive at our modern understanding via the process of science. Instead, the conclusion at the end of this chapter reads, “These stories parallel the Big Bang Theory.”

The only answer to that is, “No they don’t.”  Krauss continues:

As if the insults to the process of science reflected in these curricular statements weren’t bad enough, when the workbook actually discusses science, it gets it all wrong. For example, the book states that, “Indications are that all galaxies are moving away from a central core area. Thus, the universe is said to be expanding.” In fact, the central premise of the Big Bang picture of our expanding universe is that there is simply no centre to the universe. The Universe is uniformly expanding but not from a single central point, but from everywhere. Elsewhere, the process that describes the power generation in stars is listed three times as nuclear fission. This is the opposite of the actual process, nuclear fusion, which explains how light nuclei combine to form heavier nuclei.

This is not surprising, for the people who tout indigenous knowledge as coequal with modern science are often not conversant with modern science. This is also true in New Zealand: advocates for native people simply look for parallels that can be used to say, "Look—indigenous people had a parallel but equally correct way of understanding the universe." And the answer to that, too, is "No they didn't."

The damage done to children's education, and to science itself, is obvious, and is summed up by Krauss at the end:

The understanding of the modern world is based on science and that understanding was built up, often at great cost, by overcoming myth and superstition. It is a giant leap backwards to cater to such superstitions in a misguided attempt to somehow pay back Indigenous peoples for historical wrongs. Students today had nothing to do with the sins of the past, and we owe it to them to teach them the best possible science we can. That means separating religious myths from science, and in the process actually trying to get the science straight. The B.C. science curriculum is a disgrace on both counts.

Amen.  I suspect the only reason this tactic hasn’t spread to Europe is that they have—with the exception of the Sámi of Scandinavia—almost no indigenous people to sacralize. But India has plenty, and already science is being diluted there by Hindu “ways of knowing”, including the government’s establishment of institutes tasked with revealing the scientific wonders of cows and their urine, dung, and milk. When I visited India on a lecture tour, I spent a long time listening to credible scientists beef about (sorry for the pun) the stupidity of the government’s dilution of science. Their complaint? “Where’s the beef?”, for despite a big government expenditure, there was little to show. That’s what happens when “scientists” are more or less ordered to come up with results wanted by others.

Categories: Science

Now The Atlantic touts religion—or rather, beliefs that don’t need evidence

Fri, 03/27/2026 - 7:30am

I’ve been posting from time to time about how the mainstream media is suddenly touting religion and its benefits—a phenomenon I don’t fully understand. Now The Atlantic has joined the queue with an article by Elizabeth Bruenig, who’s written for the magazine for 6 years, and before that for the NYT, the WaPo, and the New Republic. She also has a master’s degree in Christian theology from Cambridge University.  All this means that she’s fully qualified to tout religion to liberals.

And in the article below she does just that, but in an unusual way.  She dismisses the need for any evidence for gods or specific religions, and takes the position that belief itself, however arrived at, is sufficient to warrant the truths of that belief. It’s bizarre, and another example of a supposedly reputable publication jumping the rails.

You can read the article below, or find it archived here.  (Thanks to the many readers who sent me this piece.)


Bruenig begins by dissing the New Atheists (unfairly, of course), and then segues into her Frozen Waterfall Moment: the epiphany that solidified her waning faith.

I grew up in a faithful Methodist household in deep-red Texas during the George W. Bush years, when the political sway of Evangelicals was at its zenith. At the same time, evangelists of a robust atheism—figures such as the biologist Richard Dawkins, the critic Christopher Hitchens, and the neuroscientist Sam Harris—toured the country offending salt-of-the-earth Americans with their contempt for religious belief. It was hard for me to ignore that a number of their assertions were clearly correct: Young-Earth creationism, for instance, instantly struck me as absurd when I first learned about it from a history teacher in my public junior-high school, who confidently told me that the world is only a few thousand years old.

That wasn’t what my family or church taught, but Christians who subscribed to those beliefs were suddenly ascendant, and their thinking colored the country’s religious landscape. Meanwhile, the New Atheists were making hay of the fact that such faithful misapprehensions about nature were easily disproved by scientific discovery. Though I continued to attend church as usual, I privately wondered whether the entire enterprise might be rooted in nothing more than a misunderstanding.

This steady diminishing of faith probably would have continued indefinitely, were it not for one brisk autumn afternoon in 2011 when, standing alone at a bus stop, I happened to witness the presence of God.

The unevenly paved lane where I waited was a quiet one-way street tucked away in a clutch of trees. I gazed down the road, preoccupied with other things—midterm exams, campus-club minutiae—and expecting the bus to trundle around the bend. A sudden icy wind tore around the corner instead, sweeping into gray branches and climbing ivy to send a spray of golden birch leaves spiraling into the sky, taking my breath along with them. And I knew that my soul was bared to something indescribably majestic and bracing—something that overwhelmed me with the unmistakable sensation of eye contact. What I saw, I felt, also saw me. Before I could rationally account for what had happened, a verse of poetry from John Ashbery came to mind:

look of glass stops you

And you walk on shaken: was I the perceived?

That seemed to explain things perfectly, jarringly so. I was dazed in class as afternoon darkened to evening.

Note that at the same time she sneers at the New Atheists for their “contempt” for religion, she notes that they also dispelled misguided beliefs in creationism, so chalk that up to New Atheism. In her case, the epiphany was more mundane than the three frozen waterfalls that brought Francis Collins to Christ: hers involved a wind blowing leaves into the sky.  And for some reason that made her think of a poem that is not at all about God, but about (as far as I can see) the creative process of a writer: how that process is perceived by the poet and how it interprets reality. It’s an okay poem, but it doesn’t rhyme, so it’s really a bunch of fragmentary thoughts, as in Ulysses, put into verse form. At any rate, for Bruenig the wind that blew the leaves around somehow blew faith into her soul.

Surprisingly, given Bruenig’s own contempt for the need for evidence to buttress one’s faith, she spends a long time describing a new big book that appears to make the same old arguments about the facts of science that point to God (fine-tuning, the Big Bang, etc.):

The latest evidence suggests that God most likely exists, argues a big recent book by Michel-Yves Bolloré, a computer engineer, and Olivier Bonnassies, a Catholic author. Tracts that aim to prove the reality of God are hardly novel. What makes this endeavor unique, say the French writers behind God, The Science, the Evidence: The Dawn of a Revolution, is the scientific nature of their work. Medieval monks toiling away at poetic meditations on the divine have their place, the authors allow, but their own arguments are meant to surpass mere abstract justifications for belief. Instead they assert that cutting-edge empirical proof observable in the natural world makes a firm case for God. With this, they strive for the ultimate alchemy, transforming faith into fact.

Bolloré and Bonnassies’s book is part of a burgeoning genre of apologetics that relies on relatively new scientific developments and theories, like quantum mechanics and cosmology, to make an ancient case. Their book, which has already sold more than 400,000 copies around the world, arrives at a time of both bloody religious conflict and rapidly collapsing religious belief, especially among the young and the highly educated. It joins other recent projects—including two new documentaries, The Story of Everything: The Science That Reveals a Mind Behind the Universe and Universe Designed—that propose the same tantalizing theory: that there is incontrovertible proof that a divine power created the cosmos, and that this evidence is mounting.

. . . [the authors] identify a series of scientific breakthroughs that helped undermine religious faith over the centuries, including Galileo’s heliocentrism, Newton’s clockwork universe,

The publisher says pretty much the same thing: scientific discoveries in quantum mechanics, cosmology, the “fine-tuning of the Universe,” and the “incredible complexity of living organisms” (i.e., Intelligent Design) have dispelled materialism and naturalism:

Yet, with unexpected and astonishing force, the pendulum of science has swung back in the opposite direction. Driven by a rapid succession of groundbreaking discoveries—thermodynamics, the theory of relativity, quantum mechanics, the Big Bang, theories about the expansion and fine-tuning of the Universe, and the incredible complexity of living organisms—old certainties have been completely overturned. Materialism increasingly has the appearance of an irrational belief.

I’ll admit I haven’t read this 500-page behemoth, whose summaries recycle the same old arguments for God from science, and I’m not sure I want to (you can see a critical review of its content archived from Medium). The review’s author (“Matthew”) confirms the impression I got from the summaries above, but adds that the book also throws in some theology. From Medium:

Yet what is strange is how much [the book] feels like a nostalgic throwback, it is reminiscent of the publishing fads of the 00s when New Atheism was in its peak and church book stands were full of books with titles like “The Dawkins Delusion” or “How Science Proves God” or whatever it might have been. The book even approvingly quotes Dawkins’ claim that God is basically a scientific hypothesis that we can prove or disprove, and the authors claim we should be able to look at science and find evidence of God, or at least we shouldn’t find evidence that contradicts the idea that there is a divine creator. Yet it is also far weirder than intelligent design rebuttals of atheists, the book goes beyond science, including lengthy chapters on the bible, the person of Jesus, the continued existence of the Jewish people, the persecution of scientists in the Soviet Union and (sorry Substack) for some reason, the Fatima miracle.

I will be honest up front, I found the book to be absolutely mad, hamfisted and confused. It is error strewn, misrepresents various ideas completely, and in spite of being written by two Catholics claiming to be retrieving a more ancient worldview, it largely constitutes a clumsy argument for a God of enlightenment deism, making some absolutely eye wateringly odd claims along the way. As the reviews all seem to say it is extremely “readable” but mostly because it is presented as a skim over of topics in soundbites and quotes so that it reads like a print out of a load of powerpoint slides.

. . . More to the point, I find it hard to believe we are in an “intellectual paradigm shift” when the authors have offered what is essentially undigested quotes from wikipedia and a bunch of arguments that were in vogue nearly two decades ago. This book is the definition of singing to the choir, except by the choir it must mean a very particular set of Christians inclined to share the author’s theology but not inclined to know anything about the arguments.

You can read the rest of the review for yourself.  The fact is, though, that the quality and arguments of the book are irrelevant, for Atlantic author Bruenig says that people don’t need no stinking evidence to accept gods and their natures. The argument from science, she says, is misguided (bolding henceforth is mine):

To imagine that one might find traces of the divine strewn throughout the universe, or that earthly methods of inquiry might uncover some of those signs, isn’t ridiculous. But this latest round of arguments in favor of intelligent design seems aimed mostly at establishing that God could or should exist within the rational frameworks we already employ. This is both weak grounds for belief and a fundamental misunderstanding of faith. The route to durable faith in God often runs not through logical proofs or the sciences, but through awe, wonder, and an attunement to the beauty and poetry of the world, natural and otherwise.

In other words, it’s the “beauty and poetry of the world” that convinced Bruenig of the divine. Apparently she has overlooked the ugliness of the world: the cancers in children, the incessant wars and killings, the death of thousands of innocent people in natural disasters, and even humans’ destruction of the very beauty that inspires her. Is this evidence for Satan?

It’s quite bizarre to read about Bruenig’s transformation into a believer, one who rejects science but still touts “objective evidence” for divinity.

She turned her Golden Leaf Epiphany over in her mind, and it is that epiphany—a purely emotional experience—that led her to see reality (OBJECTIVE reality) through a god-shaped lens. And she disses New Atheism again for its supposed claim that believing in gods makes one unsophisticated or dumb.  No, she’s wrong: the argument is that accepting theism means you’re credulous. Bruenig:

 I began to ask myself what it would cost me intellectually if I were to choose to metabolize the experience as it had occurred to me. That decision came with several implications. If God is real, then perhaps other things—goodness, righteousness, beauty—that are usually dismissed as matters of subjective experience might also be objectively real. That prospect was much more agreeable to me than another consequential implication of electing to believe: that, as the New Atheists had so vigorously argued, theism meant putting aside any pretensions I had of sophistication or intellect.

As I explored this problem, I spent hours in my college library reading Saint Augustine, a foundational philosopher and theologian. Here I encountered another strange sensation: Every word I read felt like remembering something I had once known but somehow forgotten.

Oh dear God, St. Augustine, a man who was a Biblical literalist (something that Bruenig rejects). Like many early theologians, Augustine argued that the Bible could be read both literally and metaphorically, but insisted on the absolute truth of what’s in print. Augustine accepted instantaneous creation from Genesis, Adam and Eve, Noah’s ark, and the whole Biblical mishigas. Bruenig ignores those parts, for she’s looking to buttress her incipient belief. (And remember that she concluded, apparently objectively, that God exists because of the feeling that swept over her when she saw the wind blow the leaves around.)  And so, after reading Augustine, she decided to accept an “objective” reality that didn’t need empirical support, and re-embraced religion:

And maybe the Christian Neoplatonists, Augustine among them, had some points as well. I contemplated this for a while before I realized that there wasn’t any sense in debating it with myself anymore. I knew what I felt, so I gave up and chose to believe.

Note that she has no evidence for Christianity, but chose to believe, even though she uses the word “objectively,” implying that other people would agree with her “choice”. (They don’t: Christians are in a minority of the world’s people.) At the end of her piece, Bruenig simply asserts that you don’t need anything but emotion to buttress your Christianity. In so doing she simply shrugs off all the arguments that have been raised against belief and says “faith is enough”, effectively immunizing her beliefs against refutation. (Bolding is mine.)

In my years of working out exactly what I believe, I have been relieved to learn that faith does not in fact demand the surrender of logic and vigorous intellectual inquiry—a case Bolloré and Bonnassies convincingly bolster with numerous testimonials from award-winning scientists. Still, to trust in the existence of God is to accept both the appearance and the possibility of being naive or delusional. No accumulation of promising developments in our analytical understanding of the world can delay confrontation with that essential fact. Having faith is a vulnerable thing.

Bolloré and Bonnassies’s arguments are more likely to shore up the faith of wavering believers than to win new converts. This itself is no small thing. The authors may even be right about the growing evidence for the existence of God secreted away in the latest science. But their approach has a history of upsets. The only way to inoculate belief against that cycle of disruption is to treat faith as a decision that transcends scientific proof.

It’s clear here that she wants to inoculate her belief against disruption (i.e., against disproof), and by arguing, “It’s true because I believe it,” she’s succeeded.  Well, good for her, but she’s not going to convince people who think that giving your life to Christianity and its beliefs—a divine Jesus who was also God, the miracles he performed, and the crucifixion and resurrection—means donning the mantle of a superstitious belief system without a rational reason to do so. Remember, emotions and feelings are not part of rationality.

This whole essay could be summed up in one sentence:  “I believe because I want to believe, and I don’t need reasons (or rationality) to do so.”

Shame on The Atlantic for pushing this pabulum!

h/t: Jim
Categories: Science

Once again, the superstitionists proclaim the death of New Atheism—and atheism in general

Thu, 03/26/2026 - 9:30am

The oxymoronically-named Union of Catholic Christian Rationalists (UCCR) has joined the yammering pack of believers that keeps telling us that New Atheism has died, when, in fact, it did its job and then moved on. It’s like saying that suffragism failed and has died out!  The New-Atheist-dissers are trying desperately to explain the failure of a phenomenon that not only succeeded in changing minds, but whose proponents, no longer consumed by a need to point out the lack of evidence for gods, have moved on to other things.

You can read this tripe by clicking the UCCR article below. Excerpts are indented, and my own comments are flush left. The piece is also archived here in case they want to correct stuff like their mis-naming of Rebecca Watson.

Here’s their intro (bolding is theirs):

Why did New Atheism fail?

Numerous observers have tried to explain the astonishing failure of new atheism, despite a society that was intellectually lazy, affluent, and consumerist, and that agreed with them on everything: the supposed anachronism of religious thought, the bigotry of moral judgments, the violence generated by religions, and the unhealthy mixing of politics and religion.

And yet, as the rationalist Scott Alexander observed, “in the bubble where no one believes in God anymore and everyone is fully concerned with sexual minorities and Trump, it is less painful to be a Catholic than a fan of Dawkins.”

Indeed, Alexander continues, only in the case of “New Atheism”: “modern progressive culture turned toward the ‘new atheists’ and, seeing itself, said: ‘This is truly stupid and annoying.’”

UCCR was born precisely during the years of fame of the “new atheists,” out of the need to provide a tool for believers “surrounded” by opinion-makers, intellectuals, and journalists. We followed the evolution of the phenomenon and its deflation, despite predictions that it would dominate the scene.

Having familiarity with the topic, we suggest five decisive factors to explain the disastrous end of “New Atheism.”

They are of course more biased against atheism than they are familiar with the topic.  I’ll condense the five factors; there is more text at the site:

1.) The election of Obama

It may seem incredible, but former U.S. President Barack Obama delivered the first major blow to the “new atheists.”

First of all, his election removed the “common enemy” that had ensured unity within the movement.

Before 2008, the glue binding activists was the much-hated conservative George W. Bush. Biologist PZ Myers and Richard Dawkins (today bitter enemies) appeared together publicly to oppose Bush and became idols celebrated by the progressive establishment.

Secondly, the Obama administration—supported by major media and cultural circles—pulled the rug out from under them: it reshaped American (and thus Western) culture by making criticism of Islam politically incorrect.

In fact, “New Atheism” emerged in the aftermath of September 11, and for years Islam was the preferred tool for generalizing about religious violence.

Under Obama, however, it became a minefield, and the first to step on it were two leading figures, Sam Harris and Michel Onfray, who began to be viewed negatively and portrayed as racists even by progressive media.

Obama was elected in 2008 and, as you see below, America’s rejection of established religion was well underway by then.

2.) Rejection by the academic world.

After the publication of his bestseller “The God Delusion” (2006), Dawkins, together with the other “horsemen,” began to denigrate agnostics and “moderate atheists,” accusing them of tolerating religious opinions and refusing to take sides.

Over time, the entire academic world was accused of cowardice for not joining the attack on religion. One example was Coyne’s media campaign against the agnostic historian Bart D. Ehrman, author of works defending the historicity of Christ.

Another emblematic case was the media pressure by Sam Harris in the New York Times and against the scientific community to prevent the Christian geneticist Francis Collins from remaining head of the National Institutes of Health (NIH).

The attempt of “New Atheism” to enter and influence the academic world was explicitly stated at the 2006 Beyond Belief conference.

But the resounding failure was confirmed by the deep embarrassment expressed by non-believing academics themselves. For example, Nobel laureate Peter Higgs stated: “The problem with Dawkins is that he focuses his attacks on fundamentalists, but clearly not all believers are like that. In this sense, I think Dawkins’ attitude is fundamentalist, from the opposite side.”

Having lost the academic world, all their visibility depended entirely on media support, which gradually began to crumble, as seen above.

In fact, in the last relevant survey I could find, published in 2010, 23% of American college professors were agnostics and atheists, compared to just 4% of the American public. If there were no reporting bias, the rate of nonbelief among university academics is about six times higher than that of the American public in general. Once again, the authors of this dire piece are not using data as evidence, but simply ad hominem arguments—mostly detailing people’s criticisms of Dawkins and Sam Harris. But given the continuing rise of “nones” (which may have hit a temporary plateau but has not decreased), these are post hoc rationalizations. As faith slips away from Americans, it’s not enough for religionists to hold on to their personal beliefs—they need the support of like-minded people to make them think they’re on the right track.

I should add that as I quote and document in Faith Versus Fact, American scientists are 41% atheists, with only 33% believing in God (the rest didn’t answer or were “spiritual”). If you look at more accomplished scientists, the rate of atheism rises to nearly 100%. It’s simply dumb to think that academics as a lot have rejected New Atheism.

3.) The response of believing intellectuals

Another reason for the decline of “New Atheism” lies in the entry into the debate of various Christian scientists, philosophers, and thinkers.

A new generation of believing intellectuals succeeded in presenting reasonable arguments in support of faith, showing that the “New Atheists” spent much of their time constructing straw-man arguments about religion, only to knock them down.

In his books, for example, Richard Dawkins strongly opposed a god that no one has ever believed in: the famous “god of the gaps”.

Some of these Christian intellectuals engaged directly with “New Atheism” by publishing books explicitly opposed to it, catching irreligious activists off guard. Among them:

  • John Lennox, emeritus professor of Mathematics at Oxford University, author of “God’s Undertaker”;
  • Amir D. Aczel, professor of Mathematics at the University of Massachusetts, author of “Why Science Does Not Disprove God”;
  • Francis Collins, renowned geneticist, author of “The Language of God”;
  • Kenneth R. Miller, emeritus professor of biology at Brown University, author of “Finding Darwin’s God”;
  • Owen Gingerich, emeritus professor of Astronomy and History of Science at Harvard University, author of “God’s Universe”;
  • Arthur Peacocke, theologian and biochemist at Oxford, author of “Paths From Science Towards God”.

More briefly, we also mention philosophers Alvin Plantinga, William Lane Craig, Robert Spaemann, Roger Trigg, Richard Swinburne, and Richard Schroder; physicists Gerald Schroeder, John Polkinghorne, and Russell Stannard; and sociologist Rodney Stark.

I have to laugh when I look at that list of names.  While Ken Miller, who’s circumspect about exactly what he believes, is a good scientist and textbook writer, I’ve looked at the beliefs of most of these people either on this website or in Faith Versus Fact. I usually don’t count theologians as intellectuals because most of them adhere to an unevidenced superstition—that there’s a God.  They are academics with a delusion.  If you want to take frozen waterfalls as evidence for God, for example, read Francis Collins. Or, for a good laugh when you want reasons why people think that Jesus was Lord, read the “evidence” used by C. S. Lewis. For every name they give above, I could give the name of five real intellectuals who are atheists.

This next one’s a corker, and even mentions me:

4.) The “Elevatorgate Scandal”

In 2011, a minor dispute about the behavior of participants at an atheist convention became known as “elevatorgate” and sparked the first major internal feud among irreligious activists online.

Feminist Emma Watson was sexually harassed in an elevator and publicly reported it, but was rebuked by leaders of “New Atheism” for risking negative publicity for their movement.

This episode marked the beginning of a break between the movement and feminism.

The situation worsened when Richard Dawkins made sexist remarks about the victim, hosted on the blog of PZ Myers. The community split between feminists and Dawkins supporters.

At that point, PZ Myers turned against Dawkins, labeling him racist and Islamophobic, alongside Sam Harris.

The media amplified everything and even named Dawkins among the worst misogynists of the year—a devastating blow to the movement.

Gradually, more commentators began to turn against the “priests of atheism”. Biologist Jerry Coyne tried for a time to defend Dawkins and Harris but eventually burned out. Today, much of his blog focuses on cats. . . .

First of all, “Elevatorgate” involved Rebecca Watson, not the actress Emma Watson. Do your homework, Christians! But beyond that, no, Elevatorgate did not make people start believing in God again, or erode the increase in nonbelief, as you can see by looking at the years around 2011 in the two plots below. It was a tempest in a teapot, and there’s not a scintilla of evidence that it buttressed faith, stemmed the rise of atheism, and so on. It just led some people who already hated Dawkins to criticize him even more.

As for me being “burned out” and focusing on cats, that’s ludicrous. I’m as atheistic as ever, and still promulgating it, as I am in this piece. But after I spent three years researching and writing Faith Versus Fact, I grew weary of banging the same old drum, and decided to bang it only when necessary, for example when this moronic article came out. As for “focusing on cats”, you be the judge. Sure, I write about them, but they’re by no means in every post I put up.

And, god help me, we have the last one:

5.) Richard Dawkins

The creator himself turned out to be the worst cause of his creation’s demise.

Richard Dawkins was the most prominent figure, a YouTube celebrity and tireless preacher. After “elevatorgate,” however, he became a target of internal criticism.

His downfall, however, came with social media—especially Twitter. Without editorial filtering, the zoologist revealed aspects of himself that had previously remained hidden.

With nearly a million followers, his sexist and racist remarks, his defense of “mild pedophilia”, encouragement of infidelity, and criticism of mothers who give birth to children with Down syndrome did not go unnoticed.

For years he has become a mockery online, especially after opposing the transgender movement.

According to Vice, “he has dishonored atheism”. His books have flopped, and even his most important scientific theory, the “selfish gene”, has been challenged by physiologist Denis Noble.

Yes, people have found plenty of “reasons” to go after Richard Dawkins, and he’s become the lightning rod for believers who hate atheism.  But nowhere in those criticisms, or in this very piece, do we see any refutation of Richard’s main reason to be an atheist: lack of evidence.  One would think that a genuine reason for rejecting atheism is that new evidence for a personal god has appeared. It hasn’t, and even a new line of anti-atheistic arguments, Intelligent Design, has come to nothing.

As for Dawkins’s books flopping, I’d suggest the authors look up the sales of The God Delusion, Climbing Mount Improbable, The Blind Watchmaker, and others. All of them were bestsellers, and all gave arguments against religious belief.

Here’s the summary of the piece:

Primatologist Frans De Waal accused the “new atheists” of “being obsessed with the non-existence of God, going on media campaigns, wearing T-shirts proclaiming their lack of faith, and calling for militant atheism.”

But he also asked: “What does atheism have to offer that is worth fighting for in this way?”

This is the question that remains. Defining oneself as “anti-” allows only limited survival; without offering meaningful answers to life’s meaning, failure is inevitable.

Philosopher Philippe Nemo wrote a remarkable epitaph for “New Atheism,” which we reproduce in full:

“Despite attempts to eradicate Christianity, atheism has died a natural death; it was not killed, since the modern world has given—and continues to give—it every opportunity to defend its cause and offer humanity new reasons for living. Opportunities wasted, because it failed to keep its promises, did not fulfill the intellectual programs that constituted its only attraction, and did not succeed in showing that man is less miserable without God than with God.”

This is ridiculous, of course. First, nobody, including the Great Satan Richard Dawkins, thinks of atheism as something that gives their life meaning.  It is simply a lack of belief in gods: an abandonment of religious superstition.

And what were the “promises” that New Atheism made? None, as far as I can tell. They maintained only that if you accept things based on evidence, you’re not going to embrace religion. And as the power of science grows (it’s one reason people give for leaving religions), so the grip of belief loosens.

The rise of nonbelief in America is documented in the two plots below, one from Pew and the other from Gallup. The plots (summarized in The Baptist News!) show the rise of the “nones”—people who don’t embrace an established church—as well as the fall of the “not-nones,” that is, people who do adhere to an established church.

Yeah, nonbelief has really fallen in America since the first New Atheist book (by Sam Harris) in 2004. NOT!

One question for readers:

Why are so many people eager to proclaim the death of New Atheism?

This is a Gallup plot:

And a Pew plot:

Categories: Science

Readers’ wildlife photos

Thu, 03/26/2026 - 6:15am

In the absence of much of a backlog, I’ve stolen some gorgeous photos from reader Scott Ritchie of Cairns, Australia (his FB page is here).  Scott’s captions and IDs are indented, and you can enlarge his photos by clicking on them.

Recently I visited my friends, Karen and David Young in the Crater Lake cabins near Lake Eacham, Atherton Tablelands, west of Cairns. This area is a mecca for birds, and they did not disappoint. In particular, we got great up-close-and-personal views of our local bird of paradise, the Victoria’s riflebird.

The male of the species has jet black feathers. However, in just the right light you get a lovely iridescent reflection. The other thing these birds do is dance. It’s an amazing shuffling of the wings while atop a stump, throwing the head back and flashing that lovely iridescent blue throat. The immature riflebird is a beautiful brown/rufous color, and they can’t help but practice their dance moves. And of course dad’s gotta come along and join in the festivities.

Also, here are a few photos of some other creatures that I saw on my little five-hour trip to the Tablelands. I hope you enjoy them.

Male Victoria’s Riflebird (Ptiloris victoriae), in full dance pose. Note the jet black feathers:

Swishes wings sideways, like a flying saucer. Peering above the wings:

But in the right light, iridescent rainbows appear:

I love the copper and purple sheen on his back:

Meanwhile, a youngster, an immature male, practices his dance moves. He leans back, showing off his wild yellow throat:

“Peek-a-boo”
Stands up, and swishes his wings back and forth, hiding his head behind them:

Then stands proud:

And then the adult male shows up. I’ll show you who’s boss:

Has he lost his mind?

I’m definitely King of the Stump:

Off youngster goes, only to be replaced by another male!:

And a few other local birds made an appearance. Pacific Emerald Dove (Chalcophaps longirostris):

Macleay’s Honeyeater (Xanthotis macleayanus):

Grey-headed Robin (Heteromyias cinereifrons):

And the musky rat-kangaroo (Hypsiprymnodon moschatus), our smallest proper roo!:

And the Boyd’s Forest Dragon (Lophosaurus boydii) appeared for the lizard and snake lovers:

Categories: Science

This just in from Colossal Biosciences: Wolves eat meat!

Wed, 03/25/2026 - 10:30am

I’ve often criticized Colossal Biosciences for their overblown science, which includes pretending that they’ve resurrected the dire wolf (Aenocyon dirus), when they’ve only created grey wolves (Canis lupus) with a few gene edits that make them white (real dire wolves probably were not white!) and a bit larger. Three of these edited wolves have been released at a secret location, and Colossal has pronounced them to be dire wolves even though they’re not dire wolves. In fact, Colossal has admitted they haven’t “de-extincted” dire wolves—and yet they pretend otherwise. It’s a squirrelly business, but they need to attract and keep donors.

On tap: their promise to “de-extinct” the woolly mammoth, which will in fact not be a woolly mammoth but at best a hairy Asian elephant. And they say that they’ll get a population of these creatures going on the tundra within eight or so years. I wouldn’t bet on it!

This morning I got an announcement that the three dire wolves in captivity have eaten an animal—a dead deer! Surprise!

Here’s the announcement:

Did you know most wolf hunts end in failure?  Wolves look like perfect hunters, but in the wild they actually fail nearly 9 out of 10 hunts. So how do they ever get good at it? In our latest video, narrated by Colossal’s Manager of Animal Husbandry Paige McNickle, we explore how wolves actually learn to hunt, and see the dire wolves Romulus and Remus face a messy milestone as they try to figure out how to skin, tear, crunch, and share a whole deer. Will the young dire wolves be successful? The answer might surprise you.

When I said the whole thing was ridiculous, Matthew (another critic) responded, “Exactly. ‘Look, Homer, they are just like our pet dog!’”
Note the videos showing a bunch of white “dire wolves” chasing buffalo. I don’t think that can be real, as they made only three dire wolves, and they aren’t penned in where the buffalo roam. Note as well that they are showing gray wolves, not dire wolves. And of course they’ll eat a deer carcass, for “dire wolves” are just tweaked gray wolves, and any wolf will eat a deer carcass if it gets one. This deer was (I hope) killed before being presented to these mutants.

The YouTube notes:

Narrated by Paige McNickle, Colossal’s Manager of Animal Husbandry, this episode also gives you a close look at the continuing development of our young dire wolves, Romulus and Remus. They’ve already taken down small prey, and they’re continuing to learn the essential behaviors of being wolves. To help them develop more skills, the dire wolves are given a whole deer carcass. This is an important milestone in their development, as they learn how to skin, tear, crunch, and share a full prey animal.

This is undoubtedly meant to keep the public excited and, more important, to keep the donors satisfied and the $$ coming in. But what is the purpose of showing these gray wolves learning to be gray wolves? They’ll never be released into the wild! I suppose you could say that this shows how gray wolves not born into a pack can learn various behaviors. But that has nothing to do with dire wolves. To me it’s a big yawn in the service of Mammon.

And where is the third “dire wolf”—Khaleesi? Is she getting dog food somewhere?

Categories: Science

DuckCam is up!

Wed, 03/25/2026 - 9:00am

They have finally turned on the DuckCam (or PondCam, if you will) at Botany Pond. There’s a good view of nearly the entire Pond, and you are likely to see Armon there; in fact, he should be there now. A few minutes ago there was another pair that I drove off, as we don’t want two couples nesting at about the same time. Oh, I forgot to add the important note that Vashti has begun incubating her eggs at a secret location (I know where it is), and we should have ducklings in a bit less than four weeks!

Even the channel is visible now, to the right behind the lamppost.

Categories: Science

Two obituaries of Robert Trivers

Wed, 03/25/2026 - 7:30am

Although I did call attention to the death of Robert Trivers, age 83, on March 12, I did not have the chops to summarize his many contributions, nor did I know him that well (we overlapped at Harvard, but I knew him only slightly). Fortunately, Steve Pinker has produced an absolutely terrific bio of Trivers at Quillette: a piece that summarizes the many contributions to evolutionary biology he made as a young man, and then the eccentricities, quirks, and obnoxious or even illegal behaviors that made Trivers somewhat of an apostate. He was a complex and fascinating person, and I hope someone will write his biography (he did write an autobiography, Wild Life: Adventures of an Evolutionary Biologist, but he deserves a thorough, disinterested, Cobb-like treatment).

Steve’s obituary, which you can access by clicking on the screenshot below or seeing it archived here, is roughly in three sections: Trivers’s contributions to the field, an analysis of why they came so young and so fast (he did almost nothing during the last five decades of his life), and a description of his complex personality and behavior. It’s long for an obituary, but Trivers deserves long, and of course Pinker summarizes his life eloquently.

Trivers’s major contributions as Steve outlines them (Steve’s words are indented, bold headings are mine):

. . . two weeks after the death of Robert Trivers, one of the greatest evolutionary biologists since Charles Darwin, not a single major news source has noticed his passing. This despite Trivers’s singular accomplishment of showing how the endlessly fascinating complexities of human relations are grounded in the wellsprings of complex life. And despite the fact that the man’s life was itself an object of fascination. Trivers was no ordinary academic. He was privileged in upbringing but louche in lifestyle, personally endearing but at times obstreperous and irresponsible, otherworldly brilliant but forehead-slappingly foolish.

I still can’t see an obituary for Trivers in either the NYT or the Washington Post. That lacuna is shameful. On to his contributions.

Contributions:

Parent-offspring conflict:

Trivers’s innovation was to show how the partial overlap of genetic interests between individuals should put them in a partial conflict of psychological interest. The key resource is parental investment: the time, energy, and risk devoted to the fitness of a child. Parents have to apportion their investment across all their children, each equally valuable (all else the same). But although parents share half their genes with each child, the child shares all its genes with itself, so its interest in its own welfare will exceed that of its parents. What the parent tacitly wants—half for Jack, half for Jill—is not what Jack and Jill each want: two thirds for the self, one third for the sib. Trivers called the predicament parent-offspring conflict.

Sex differences in parental investment:

Trivers explained the contrast by noting that in most species the minimal parental investments of males and females differ. Males can get away with a few seconds of copulation; females are on the hook for metabolically expensive egg-laying or pregnancy, and in mammals for years of nursing. The difference translates into differences in their ultimate evolutionary interests: males, but not females, can multiply their reproductive output with multiple partners. Darwin’s contrast can then be explained by simple market forces. And in species where the males invest more than the minimum (by feeding, protecting, or teaching their offspring), males are more vulnerable than females to infidelity (since they may be investing in another male’s child) and females are more vulnerable to desertion (since they may bear the costs of rearing their mutual offspring alone).

Reciprocal altruism:

In another landmark, Trivers turned to relations among people who are not bound by blood. No one doubts that humans, more than any other species, make sacrifices for nonrelatives. But Trivers recoiled from the romantic notion that people are by nature indiscriminately communal and generous. It’s not true to life, nor is it expected: in evolution as in baseball, nice guys finish last. Instead, he noted, nature provides opportunities for a more discerning form of altruism in the positive-sum exchange of benefits. One animal can help another by grooming, feeding, protecting, or backing him, and is helped in turn when the needs reverse. Everybody wins.

Trivers called it reciprocal altruism, and noted that it can evolve only in a narrow envelope of circumstances.

This to me is Trivers’s most important contribution, explaining not only why we sacrifice for unrelated people, but also making testable (and largely verified) predictions about human behavior, including morality.  Now that humans no longer live in small groups of acquainted people—conditions under which reciprocal altruism presumably evolved—we can expect some of those behaviors to disappear, but civilization is a mere eyeblink compared to the long, long period in which the conditions were right for the evolution of altruism (and deceit; see below).

Asymmetries in human relationships:

. . . in a passage that even fewer readers noticed, Trivers anticipated a major phenomenon later studied in the guise of “partner choice.” Though it pays both sides in a reciprocal partnership to trade favours as long as each one gains more than he loses, people differ in how much advantage they’ll try to squeeze out of an exchange while leaving it just profitable enough for the partner that he won’t walk away. That’s why not everyone evolves into a rapacious scalper: potential partners can shun them, preferring to deal with someone who offers more generous terms. Just as a store with a reputation for fair prices and good service can attract a loyal clientele and earn a bigger profit in the long run than a store that tries to wring every cent out of its customers only to drive them away, a person who is inherently generous can be a more attractive friend, ally, or teammate than one who dribbles out favours only to the extent he expects them to be repaid with a bonus. The advantage in attracting good partners makes up for the disadvantage in forgoing the biggest profit in each transaction.

And since humans are language users—indeed, reciprocity may be a big reason language evolved—any tendency of an individual to reciprocate or cheat, lavish or stint, does not have to be witnessed firsthand but can be passed through the grapevine. This leads to an interest in the reputation of others, and a concern with one’s own reputation.

The evolutionary significance of deceit and self-deception:

Trivers’s fifth blockbuster was laid out not in an academic paper but in a pair of sentences in his foreword to The Selfish Gene:

If (as Dawkins argues) deceit is fundamental to animal communication, then there must be strong selection to spot deception and this ought, in turn, to select for a degree of self-deception, rendering some facts and motives unconscious so as not to betray—by the subtle signs of self-knowledge—the deception being practiced. Thus, the conventional view that natural selection favors nervous systems which produce ever more accurate images of the world must be a very naïve view of mental evolution.

We lie to ourselves the better to lie to others, protecting compromising private knowledge from emotional tells or factual contradictions (as in the Yiddish saying, “A liar must have a good memory.”) In his book Social Evolution (1985), Trivers muses on how this can play out:

Consider an argument between two closely bound people, say, husband and wife. Both parties believe that one is an altruist of long standing, relatively pure in motive, and much abused, while the other is characterized by a pattern of selfishness spread over hundreds of incidents. They only disagree over who is altruistic and who selfish.

The theory of self-deception is deeper (and more enigmatic) than the commonplace that people’s views of themselves are mistuned in their favour. The self, Trivers implied, is divided: one part, seamless with the rest of consciousness, mounts a self-serving PR campaign; another, unconscious but objective, prevents the person from getting dangerously out of touch with reality.

Trivers wrote an entire book about this, a book that he intended to co-author with the (in)famous Huey Newton, a founder of the Black Panthers (Newton was murdered before it could be written): The Folly of Fools: the Logic of Deceit and Self-Deception in Human Life. It’s an uneven book, larded with bizarre personal anecdotes, but it also contains a lot of intriguing food for thought. In other words, it’s pure Trivers.

Why did Trivers make these contributions?  A few of Steve’s thoughts:

. . . Trivers revelled in explaining the contradictions of the human condition, and he himself was a mess of them. Foremost is how he revolutionised the human sciences in a fusillade of ideas he had between the ages of 28 and 33 (I didn’t even mention a sixth one, on how parents should invest in sons versus daughters). But then he did nothing comparable for fifty years. He wrote some good books, but they were reviews of his and others’ contributions, breaking little new ground. How do we explain this shooting star?

Part of the answer is that, as with all intellectual revolutions, the right mind found itself in the right era. In 1971 the gene’s-eye view of evolution was new and counterintuitive, as it remains to this day. People, including scientists, project their moral and political convictions onto the things they study, and the ideal that we should love our neighbours, act for the good of the group, and strive for social betterment is easy to read into nature, even if it flouts the logic of natural selection. And whenever the word “gene” comes up, readers get distracted by hallucinations such as that humans are robots controlled by their genes, that each of their traits is determined by a single gene, that they may be morally excused for selfishness, that they try to have as many babies as possible, that they are impervious to culture, and other non sequiturs.

The young Trivers, mentored at Harvard by the biologists William Drury and Ernst Mayr, immediately grasped the new way of looking at evolution, and never got hung up by these misconceptions. A jaundiced view of animals, not excluding Homo sapiens, came naturally to his rebellious temperament, and many puzzles he observed in his field work (including on ants, lizards, gulls, songbirds, caribou, baboons, and chimps) fell into place when he considered their reproductive interests from their viewpoints.

. . . In the early 1970s, then, Trivers was standing on the shoulders of giants, looking with a gimlet eye over a rich array of poorly explained animal behaviour (not excluding humans, since he had recently binged on novels). In this virgin landscape, the implications of the overlapping conflicts of genetic interests were waiting to be discovered, foreshadowed in scattered passages from Hamilton and Williams. Someone had to see them first, and Trivers was there.

. . . But Trivers rapidly spotted what everyone else missed, and still misses, together with the less biologically obvious concept of self-deception, so there must be another piece to the puzzle. During his junior year at Harvard, Trivers suffered two weeks of mania and then a breakdown that hospitalised him for two months. Bipolar disorder afflicted him throughout his life. I can’t help but wonder whether Trivers’s fecund period was driven by episodes of hypomania, when ideas surge and insights suddenly emerge through clouds of bafflement.

I had never thought of that, though Trivers made no secret of his diagnosis.  Finally, a bit about his behavior:

Though his upbringing was patrician and cosmopolitan (son of a poet and a diplomat, schooled in Europe and then Andover and Harvard), he was afflicted with a strong nostalgie de la boue. This contributed to his adoption of Jamaica, originally the site of his research on lizards, as a second home. Trivers’s life in Jamaica was filled with boozing, brawling, whoring, and of course toking, together with a stint in jail and a narrow escape from death during an armed robbery. His memoir Wild Life is peppered with homicidal fantasies and expressions of admiration for thuggish vigilantes, including Huey Newton, co-founder of the radical Black Panther Party. Trivers befriended Newton, made him godfather of his daughter, coauthored a paper with him on the role of self-deception in a fatal plane crash, and became a white Black Panther himself before Newton ushered him out of the organisation for his own safety.

. . . But Trivers’s neuroatypicality shaded into eccentricity and downright boorishness. He might try to drop off a passenger without stopping the car, or miscount the number of dinner guests and force two of them to share a chair. He repaid the colleagues who offered him professional lifelines at their universities with truancy, belligerence, and gross inappropriateness (greeting female students in his underwear when they had been sent to his apartment to fetch him to a late lecture; requesting that straitlaced academic hosts supply him with cannabis). His violent musings could make acquaintances genuinely fear for their safety. His last graduate student, Robert Lynch, spoke for many when he ended his affectionate obituary, “I’ll miss you, Robert. You asshole.”

. . . As for himself, Trivers liked to poke fun at some of his eccentricities and indignities. But he never squarely faced his record of betrayals, hurts, and squandered talent. All this is exactly what Trivers’s greatest theoretical brainchild would predict.

That “greatest theoretical brainchild” must be self-deception, of course, but I think that was perhaps the least important of his contributions.

Trivers had an erratic life, but also a rewarding and tumultuous one. It makes me want to paraphrase Nagel: “What was it like to be Robert Trivers?”

There is also a shorter obituary in The Times of London, which you can see by clicking below or by reading it archived here. Although author Finkelstein is not a biologist, he does a pretty good job summing up Trivers’s contributions, though he concentrates too much on the deceit-and-self-deception part, seeing it mirrored in modern politicians like Donald Trump and Liz Truss. If you want a short read, it is okay, but given the choice you should read the longer Pinker obituary. It will also teach you a lot about modern evolutionary psychology—known as “sociobiology” when Trivers and I overlapped at Harvard.

Categories: Science

“Little Darlin'”: cover or parody? Or is it two, two, two songs in one?

Mon, 03/23/2026 - 9:30am

Reader Thomas from Marginal REVOLUTION sent me the linked piece, which shows a video of the 1957 doo-wop hit “Little Darlin’” by the Diamonds. If you’re my age, you’ll remember it. I always thought it was a serious song, and apparently so did most Americans, but the first comment says this:

From what I recall, the Diamonds were a conventional vocal group who recorded “Little Darlin’” at the end of a session as a kind of joke song – a parody of the Ink Spots and Mills Brothers. Fortunately for them, the studio musicians joined in with a perfectly irresistible beat and the song became a rock and roll hit. Early rock and roll was born as parody as much as anything else, like Elvis’s jokey rendition of Bill Monroe’s “Blue Moon of Kentucky” turning out to be the founding document of rockabilly.

Really? And then they present the video, which, unlike any music video I know from that era, shows the group clowning around. Have a look at the video for a hit song (more cowbell!). But yes, this is the original song, which went to #2 on the Billboard charts.

To get to the bottom of this, I turned to AI, in particular Grok. Sure enough, the bot pronounced it a parody after scanning what’s published about it.

“Was “Little Darlin'” by the Diamonds a parody song?”

Yes, “Little Darlin’” by The Diamonds (1957) is widely regarded as a stylistic parody/satire of the doo-wop genre—not a lyrical parody like a Weird Al track (the words and melody are the same), but a deliberately exaggerated, campy cover that poked fun at the over-the-top clichés of the style. Here’s the story, straight from music historians:

  • The song was written by 16-year-old Maurice Williams and first recorded in January 1957 by his group The Gladiolas (on the Excello label). Their version was a straight, heartfelt R&B/doo-wop track with a Latin-tinged rhythm, falsetto, spoken bridge, and percussion. It charted modestly (#11 R&B, minor pop crossover).
  • The Diamonds (a Canadian quartet) rushed out their cover just a month later on Mercury Records. It was reportedly a last-minute, end-of-session lark—they had about 15 minutes left, no drummer, so they improvised with castanets. Lead singer Dave Somerville and the group leaned hard into the nonsense syllables (“ai-ya-yi-yai-yai-ya,” “wella-wella,” etc.), hamming it up with theatrical flair. They even performed it in an over-the-top way on TV (like The Steve Allen Show).

AllMusic critic Stephen Thomas Erlewine noted that the vocal phrases “verge on self-parody… which may be why The Diamonds’ version is superior.” Radio historian John Gilliland (on the Pop Chronicles series) explicitly stated that their recording “was in fact a parody of the genre.” Other sources describe it as a “joke song,” “satire of doo-wop,” or “intended as a satire” with “campy vocal delivery.”

Despite (or because of) the tongue-in-cheek approach, it became a monster hit: #2 on the Billboard Hot 100 for eight weeks (behind only Elvis’s “All Shook Up”), the biggest version by far, and now the definitive classic. The original Gladiolas take is respected by R&B fans but far less known.

It’s a classic example of a white cover version (common in the 1950s) that amplified the style for mainstream appeal—here with humor that stuck. Later, Elvis Presley covered the Diamonds’ arrangement on his final album Moody Blue, and Joan Baez even recorded an actual lyrical parody.

From Wikipedia, we learn that the hit version was actually a re-recording of a song recorded just a month earlier by a black group, the Gladiolas, one of whose members wrote the song. There’s also a brief note about the parody aspect:

It was written by a 16-year-old Maurice Williams with both melody and doo-wop accompaniment strongly emphasizing a calypso rhythm. First recorded in January 1957 by Williams’ group the Gladiolas, it was quickly released as a single on Excello Records, a small swamp blues label owned by Nashville record man Earnie Young, who was responsible for creating the song’s Latin feel, naming the group and ensuring Williams would retain the song’s publishing.

The recording, inspired by a book Williams was writing, and originally called “Little Darlin’/ The Beginning,” is noted for its trademark doo-wop falsetto by Fred Mangum and its spoken bridge by Williams (“My Darlin’ I need you …”). The Gladiolas were from Lancaster, South Carolina, where they had been together since high school.

The Diamonds’ successful cover version followed a month later. The Diamonds were a Canadian pop group that evolved into a doo-wop group. The Diamonds’ version reached number two in sales for eight weeks on the Billboard Hot 100. Billboard ranked this version as the No. 3 song for 1957. In Canada, the song was No. 11 on the premiere CHUM Chart, May 27, 1957.

The Diamonds’ version is generally considered superior. AllMusic critic Stephen Thomas Erlewine argues that the Diamonds “Little Darlin'” is an unusual example of a cover being better than the original:

[T]he Diamonds’ take remained the bigger hit, and over the years, the better-known version. Normally, this would have been an outrage, but there’s a reason why the Diamonds’ version has sustained its popularity over the years: it’s a better, fiercer recording. Both versions are good, even if they’re a little silly, because it’s a good doo wop song, giving each member of the quartet a lot to do. At times, the vocal phrases verge on self-parody — the “ai-ya-yi-yai-yai-ya”‘s or the “wella-wella”‘s — which may be why The Diamonds’ version is superior.

On the Pop Chronicles, host John Gilliland claimed that their version was in fact a parody of the genre. Nonetheless, “Little Darlin'” (primarily the Diamonds’ version, but to some extent the Gladiolas’ version) remains an all-time rock ‘n roll R&B classic.

Here’s the original version by The Gladiolas, and sure enough, it’s pretty much like the parody (or cover), including the talking interlude. It was not a hit. Is this cultural appropriation?

Categories: Science

“The right war is being waged by the wrong people, for the wrong reasons”: Sam Harris on the conflict with Iran

Mon, 03/23/2026 - 7:15am

Sam Harris is widely demonized by the know-nothings, and I’m not quite sure why. Yes, he discussed the possibility of torture in certain circumstances, but this was a philosophical rumination, perfectly justifiable if you have a utilitarian or consequentialist view of ethics; in fact, a similar discussion appears in the Stanford Encyclopedia of Philosophy. His views are speculative and nuanced, and he thinks torture by the American government should remain illegal. Nevertheless, over and over again I’ve seen him damned for advocating willy-nilly torture, as if he were some kind of latter-day Josef Mengele. And that is what I’ve seen people emphasize, though Sam has also discussed the possibility of American preemptive nuclear strikes, the nature of Islam (he’s been deemed an “Islamophobe”), and the profiling of groups of people for airline security. That has been more fodder for going after him, though people always neglect the nuance.

I myself have criticized his position on “objective morality,” but that would never make me dismiss Sam, as his writings are always measured and thoughtful—and largely philosophical. They promote thought, and that is, after all, the goal of philosophy. But I guess when an avowed atheist dips into philosophy, he’s almost automatically damned.  So be it; I will continue to read him.

Yesterday Sam posted his views about our war with Iran, and his overall take is summarized in the title of this post. You can read his discussion by clicking on the link below, or by reading the piece archived here.

I think the moral confusion about the war is the conflict between feeling that it’s a just war and despairing at our administration’s rationalizations for it and at Trump’s repeated lies about what’s going on and what he intends to do. All of this is the cause of the increasing anomie many of us feel about our engagement, as well as of the widespread opposition to the war among Americans. As Sam says, “To think clearly about this war, we need to hold two sets of ideas in our minds at the same moment: the Iranian regime is evil, and the Trump administration is dangerously amoral, corrupt, and incompetent.”

I’ll give a few quotes from Sam (indented). Text that is flush left is mine, as are the bold headings:

Why it’s a morally just war

The Islamic Republic has tormented its own people for forty-seven years. It has hanged dissidents from cranes, crushed peaceful protests with live ammunition, tortured political prisoners, and funded jihadist proxies throughout the Middle East and beyond. When Salman Rushdie was nearly killed by a knife-wielding fanatic, after living for thirty-three years under the shadow of the Ayatollah’s imbecilic curse, this was a direct export from the theocracy in Tehran—which has grown increasingly unpopular with the Iranian people. The protests of 2025 and 2026 reminded the world, yet again, of the Iranian majority’s desperation to be free. The assassination of Ayatollah Khamenei on the first day of this war was greeted with celebrations in Tehran, Isfahan, and among the Iranian diaspora in Los Angeles and New York. Whatever else one thinks about the decision to strike Iran, it is obscene to pretend that there was no moral or pragmatic argument for doing so.

Note that a credible figure for civilians killed by the Iranian regime for protesting is 30,000+. The regime imprisons people for protesting the war, including the families of protestors, and tries to assassinate critics even when they are overseas (e.g., Masih Alinejad). There’s no doubt that the Iranian people, overall, want to regain the freedoms they had before the Revolution. All this, and Iran’s determination to export terrorism throughout the Middle East, makes this a just war.

The lame criticism of this war. (Sam sees this as a failure to recognize moral evil, or a recognition of it followed by ignoring it, as he did when he wrote about the Gaza War.)

And yet, most critics of the war speak as though Iran was a peaceful nation attacked by foreign aggressors. Notions of “sovereignty” and “international law” are invoked as though the Islamic Republic were Sweden. Almost no prominent critic of this war has anything cogent to say about the decades of misery the mullahs have inflicted on their own citizens, the threat that Iran’s network of proxy militias poses to the entire region, or the inconceivability of establishing deterrence once a jihadist death cult acquires nuclear weapons. If your opposition to this war cannot acknowledge the evil we are facing, your opposition is not morally sane.

I’ll add to this something that Sam takes up later: some of the motivation for criticizing this war comes from the fact that we’re allies with Israel, seen as a settler-colonialist and evil apartheid regime, and of course a Jewish state.

The justified criticism of this war.

But there is a serious case to be made against this war. One might believe, along with Damon Linker, that the risk of Iran becoming a failed state—a larger, more dangerous version of post-invasion Iraq—far outweighs the benefits of toppling the regime today. One could point to the apparent absence of a credible plan for what comes next, or to the fact that three weeks of bombardment haven’t produced anything resembling the “unconditional surrender” that Trump once demanded. Iranian state media has reported that the conflict has already killed more than 1,500 people, including over 200 children. Whatever the actual numbers, there can be no doubt that the humanitarian toll is real and mounting. These are intelligible concerns, and they deserve to be taken seriously.

But this is not the argument that most opponents of this war are making. They are making a much lazier set of claims—and often treating any American use of force as inherently unjust. Most critics are simply ignoring the question of what the world should do about a jihadist regime that has spent decades aspiring to commit genocide, views any peace as a temporary interval in which to gain the upper hand, and happens to be on the verge of acquiring nuclear weapons.

Admittedly, it is difficult to separate the ethics of destroying the Iranian regime, and seeking to liberate the Iranian people, from the staggering incompetence and callousness of the reality-television stars who now sit atop our vast machinery of war and wield it for no clearly stated purpose.

Sam then recounts a lot of Trump’s lies (or confusions), including his back-and-forth on whether we’ve destroyed the regime and its nuclear program, whether or not he’ll demand “unconditional surrender”, his failure to include Congress in his deliberations, and his failure to prepare the American people for the conflict.  This leads to feelings of both anomie and impotence:

Congressional Democrats now face the agonizing reality that they cannot stop a war that is already underway—the question is whether they can impose constitutional oversight on an administration that started it without asking anyone’s permission.

How the war increased antisemitism

I think we all know that antisemitism is now a tenacious termite in Western democracies. Though not ubiquitous, we see it growing in America, in Canada, in Europe, and in Australia.  I’m not sure whether the conflict with Iran has actually increased it, or simply given antisemites an excuse to parade their views more openly. Sam blames Trump, whose incompetence has created an explanatory vacuum that’s been filled with Jew-hating.

Finally, all of this bluster, confusion, dishonesty, and strategic incoherence has been a gift to the world’s antisemites. In the three weeks since the war began, antisemitic incidents worldwide have spiked by 34 percent. The resignation of Joe Kent, former director of the National Counterterrorism Center—who blamed “high-ranking Israeli officials and influential members of the American media” for encouraging the conflict—gave mainstream respectability to the conspiracy theory that most of our wars are fought at the behest of perfidious Jews. Far-right figures have seized on the joint U.S.-Israeli operation to push the narrative that American soldiers are dying for a foreign power. And the antiwar Left, rather than reject this framing, has amplified it. While the similarities between the far Right and the far Left may be overstated, for the Jews, “horseshoe theory” is now a daily reality.

And Trump’s chaotic messaging bears much of the blame. When a president cannot clearly articulate why his country has gone to war, he leaves a vacuum that conspiracy theories will fill. A competent leader would have made the case for this war on its merits—the destruction of a terroristic theocracy’s military infrastructure, the elimination of its nuclear program, the liberation of 88 million people from a regime that jails and tortures women for the crime of uncovering their hair. Instead, Trump has offered a jumble of contradictions: Iran’s military is destroyed, but we need more troops; the war is almost over, but we may yet put boots on the ground; Iran wants to talk, but there is no one to talk to. This moral and logical void has become a vessel for antisemitic paranoia on both the Right and the Left. Needless to say, it doesn’t help that Trump and his family have been accepting personal gifts and payments from the Gulf states—a plane from Qatar, secretive investments from the UAE—while American forces protect those same regimes. The man is simultaneously waging a war of choice and running what appears to be an extortion racket.

The resultant ambivalence.

I think the first paragraph below, which involves us believing things that seem at odds with each other, explains at least my ambivalence about the war.  A nutjob is in charge, someone who continually contradicts himself and acts on impulse, and yet he’s waging a just war.

It is possible, even necessary, to believe all of the following at once: the Iranian regime is a monstrosity that should be destroyed; the Iranian people deserve to be free; the risk that this war will end in catastrophe is real, largely because of the character of those who are waging it; and the rising tide of antisemitism that this conflict has unleashed is yet another moral emergency that people on both sides of the debate have a responsibility to confront.

The tragedy of this moment is that the right war is being waged by the wrong people, for the wrong reasons. And the opponents of the war, rather than making this case, have mostly opted for blinkered pacifism and conspiracy theories, while refusing to grapple with the manifest evil of the Iranian regime. Of course, the Iranian people, caught between their own tyrants, a reckless American president, and his feckless critics, will pay the heaviest price.

Categories: Science

Readers’ wildlife photos

Mon, 03/23/2026 - 6:15am

Send ’em in if you got ’em.  The photo situation is dire.

But today we have whale photos by reader Ephriam Heller. His captions and IDs are indented, and you can enlarge his photos by clicking on them.

The best whale watching I have experienced is observing gray whales (Eschrichtius robustus) in the San Ignacio Lagoon on the Pacific coast of the Baja peninsula in Mexico. These whales are curious and “friendly,” often swimming up to boats to observe their occupants and even allowing themselves to be touched. Here is an example of an interaction between two species that each appear to exhibit curiosity and intelligence:

This is what a gray whale looks like (Image courtesy of International Whaling Commission):

They engage in numerous photogenic behaviors, such as rolling, riding the surf, waving their flippers and flukes in the air, and spyhopping to observe their surroundings.

The gray whale has longitudinal double blowholes. People claim that they form a heart shape, but I think a heart with this shape needs immediate treatment:

When the sunlight hits their spray just right, one sees “rainblows”:

The gray whale has the most parasites of any whale, carrying up to 180 kg. At birth, babies have no barnacles or sea lice, but quickly acquire them from their mothers. The older the whale, the more barnacles and lice they collect. The whales rub along the seabed and piers to try to rid themselves of the parasites.

The whales carry one species of barnacle and four species of whale lice. The barnacles are Cryptolepas rhachianecti (whale barnacles) which are specific to gray whale hosts (i.e., they rarely occur on any other species), and they die when the whale dies.

There are four species of “whale lice,” which are not true lice (which are insects) but are amphipods in family Cyamidae: Cyamus scammoni (the most common), Cyamus kessleri, and Cyamus eschrichtii are all found only on gray whales. Cyamus ceti is found on gray and bowhead whales. These cause minor irritation to healthy whales. Researchers view cyamid coverage and distribution (e.g., heavy clusters near the blowhole, mouthline, and genital slit) as indicators of stress, nutritional status, and chronic skin disease rather than as a primary cause of these problems.

There are two populations. The larger Eastern North Pacific population migrates along the continental coast between its breeding grounds in Baja, Mexico and its feeding grounds in Alaska. The small Western North Pacific population migrates along the Pacific coast of Asia. Gray whales hold the record for the longest migration of any mammal, with typical round-trip distances of about 20,000 km annually (although this isn’t close to the 70,000 km migration of the arctic tern).

Whales fall into two suborders: baleen (Mysticeti) and toothed (Odontoceti). Gray whales are in Mysticeti and use their baleen to feed on amphipods and plankton on the seafloor. During the six-month summer feeding season, adults consume over 1 ton of food per day. They then fast for the remainder of the year, including the migration and winter birthing/breeding season. They exhibit “handedness,” in that most gray whales feed by scooping up sediments from the seafloor with the right side of their heads, resulting in their right sides having fewer adhering barnacles and sea lice.

They live up to ~70 years. Bigg’s transient killer whales (orcas) kill up to 35% of the calf population annually. Based on scarring, researchers speculate that almost every gray whale has been attacked by orcas. Most attacks occur as the young calves migrate north through Monterey Bay, California and Unimak Pass, Alaska.

The Eastern North Pacific population dropped to ~1,000 individuals around 1885 due to whaling, but has since recovered to ~27,000 in 2015-2016. The Western North Pacific population is tiny, comprising just a few hundred individuals. North Atlantic populations were extirpated (perhaps by whaling at the end of the medieval warm period) on the European coast in the 12th to 14th centuries, and on the American and African Atlantic coasts around the late 17th to early 18th centuries. Remains of gray whales from the time of the Roman empire have been found in the Mediterranean Sea, and they are still rarely seen there in modern times.

The gray whale has a dark slate-gray color and is covered by characteristic gray-white patterns, which are scars left by parasites that drop off in its cold feeding grounds. Individuals can be identified by their pigmentation patterns and their scars. I got this great photo of a whale’s tail; but it was just a fluke:

In case you are the kind of person who is interested in this sort of thing, this is what it looks like when whales mate:

Anyone with a younger brother will recognize this as the “head butt” greeting, a conserved behavior across all mammal species:

And this is the view when you saddle up a gray whale (I use a western saddle):

The eyes of gray whales are unlike the eyes of any other mammal I have seen, with what appear to be tangled filaments. My AI friend assures me that this is not the case and that they do not have any “extra” organs in their eyes: “The ‘tangled filaments’ you’re seeing are structures in the gray whale’s iris and surrounding tissues that become visible because the eye is small, very dark, and strongly three‑dimensional, so you are effectively looking across folded, ridged iris and ciliary tissues rather than through a flat, open pupil as in most mammals you see up close.”


Today’s covert anti-Israel slant on the news

Sun, 03/22/2026 - 8:45am

As usual, I watched the NBC Evening News last night, even though some of its reporting has seemed slanted against Israel.  Since I wrote about the Guardian article yesterday, though, I’ve become more sensitized to how the media uses language to express political opinions—even in supposedly objective news reports.

Here’s a video showing all the NBC Evening News from last night, but you don’t have to watch it all unless you want to see bodycam video of a clearly inebriated Justin Timberlake being arrested for DUI (17:05).  The part that made me prick up my ears is at 4:07, when the news shows cute little Lebanese Muslim kids getting presents at the end of Ramadan. But they are not in their homes.  The narration says this (bolding is mine):

While across the Muslim world, the end of Ramadan means presents for children.  These kids are among the one million people displaced in Lebanon by Israel’s expanding offensive against Hezbollah.

The rest of the short segment seems designed to evoke the viewers’ sympathy for Lebanese people—especially the kids—displaced by the wicked Jewish state.  And indeed, it’s sad that people have to flee their homes. HOWEVER, the report neglects to mention that there had been a cease-fire between Israel and Hezbollah that largely held until March 2 of this year. Then, on March 2, three days after the U.S. and Israel struck Iran, Hezbollah fired a barrage of missiles and drones from Lebanon at northern Israel, explicitly saying that this was in response to the killing of Iran’s Supreme Leader Ayatollah Ali Khamenei and what Hezbollah called “repeated Israeli aggressions” (there were minor attacks by both sides during the ceasefire, with UNIFIL and the Lebanese government failing to rein in Hezbollah, as they are supposed to). Israel responded big time, but to construe that as an “expanding offensive” minimizes the defensive nature of Israel’s attacks, designed to stop Hezbollah’s rockets and drones once and for all.

Again, it’s a small remark, but a telling one. “Expanding offensive” implies that Israel started the attacks in Lebanon going on now.  It didn’t, just as Israel didn’t start the war with Hamas on October 7.

But at the end you might want to see the inevitable “there’s-good-news-tonight” segment (several nice pieces starting at 18:05, with an especially moving bit at 19:54 as a woman is assigned to take the final call from an Air Force officer as he leaves the military—an officer who happens to be her dad). As the world is falling apart, nearly all the major television news networks like to leave viewers with a good taste in their mouths.
