One of the hardest things to accept, especially for people who care about rationality, is that epistemic rigor is rarely applied consistently. Most of us do not give up bad arguments. Instead, we give up standards of evidence when the conclusion becomes socially or morally important to us.
There are well-established psychological reasons why this happens. Decades of research in social psychology show that many of our beliefs are not just opinions we hold, but parts of who we are. They become woven into our identities, our friendships, and often our professional lives.
Put more simply, we build our identities, friendships, and careers around certain beliefs. As a result, challenges to those beliefs are not experienced as abstract disagreements but as personal threats. Our self-preservation mechanism kicks in: We bend reality as far as necessary to preserve a flattering story about ourselves and our ingroup. Denial and aggression toward the outgroup follow naturally.
Psychologists Henri Tajfel and John Turner, who developed Social Identity Theory, showed that people internalize the values and beliefs of the groups they belong to, treating them as extensions of the self. When those beliefs are questioned, the threat is processed much like a threat to your status or belonging. The reaction is often defensive rather than reflective.
More recent work on motivated reasoning helps explain why such a reaction is so persistent. In the 1990s, psychologist Ziva Kunda demonstrated that people selectively evaluate evidence in ways that protect conclusions they are already motivated to believe. When a belief supports your identity or social standing, the mind unconsciously applies stricter standards to disconfirming evidence and looser standards to supporting evidence.
Political scientist Dan Kahan later expanded this idea with what he called “identity-protective cognition.” His research showed that people with higher cognitive ability are often better, not worse, at rationalizing beliefs that align with their cultural or political identities. In other words, intelligence does not necessarily make you more objective; it can make you a more effective advocate for your own side!
This body of research helps explain why challenges to core beliefs can feel existential. If your moral worldview underwrites your relationships, your career, or your sense of being a good person, abandoning it comes with real social and psychological costs. Under those conditions, defending the belief feels like defending your life as it is currently organized.
Seen in this light, the selective abandonment of evidentiary standards is not a moral failing unique to any one group. It is a predictable human response to perceived identity threat. Reasoning shifts from a tool for understanding the world to a mechanism for self-preservation.
I learned this firsthand during my years in the New Atheist movement. What struck me was how selective people’s skepticism could be. In debates about religion, the standards were ruthless. In debates about politics and social issues, those same standards were easily relaxed, and often vanished.
Take prayer. For decades, skeptics have pointed to controlled trials showing no measurable benefit of intercessory prayer. The best-known example is the STEP trial, a randomized study of nearly 1,800 cardiac bypass patients published in The American Heart Journal. It found no improvement in outcomes for patients who were prayed for, and in one group outcomes were slightly worse among patients who knew they were being prayed for. Among the New Atheists, prayer was considered resolved beyond reasonable debate not only because the experimental evidence showed no effect, but because the underlying causal story itself collapsed upon examination.
Philosophically, intercessory prayer fails at the most basic level: It posits an immaterial agent intervening in the physical world in ways that are neither specified nor independently detectable. There is no plausible mechanism, no dose-response relationship, no way to distinguish divine intervention from coincidence, regression to the mean, or natural recovery.
When some studies do claim positive effects of prayer, they almost invariably collapse under close inspection—small sample sizes, multiple uncorrected comparisons, vague outcome measures, post hoc subgroup analyses, or outright publication bias. Some define “answered prayer” so flexibly that any outcome counts as success; others rely on self-reported well-being, which is especially vulnerable to expectancy effects and motivated reasoning.
This is precisely why large, preregistered trials and systematic reviews, such as those published in The American Heart Journal, are treated as decisive: They close off these escape hatches. The conclusion that prayer “doesn’t work” is not dogma; it is the residue left after methodological rigor strips away every alternative explanation.
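The multiple-comparisons escape hatch described above can be made concrete with a small simulation. This is a toy sketch with invented numbers, using a plain-Python Welch-style t statistic rather than a statistics library: when a null study measures twenty uncorrected outcomes, the chance that at least one crosses nominal significance by luck alone is about 1 − 0.95²⁰ ≈ 64%.

```python
import math
import random
import statistics

random.seed(0)

def sham_study(n_outcomes=20, n=30):
    """One simulated 'prayer study' with no true effect: two groups of n
    patients and n_outcomes different noise-only outcome measures.
    Returns True if ANY outcome reaches nominal significance
    (|t| > 2, roughly p < 0.05 at this sample size)."""
    for _ in range(n_outcomes):
        prayed = [random.gauss(0, 1) for _ in range(n)]
        control = [random.gauss(0, 1) for _ in range(n)]
        t = (statistics.mean(prayed) - statistics.mean(control)) / math.sqrt(
            statistics.variance(prayed) / n + statistics.variance(control) / n
        )
        if abs(t) > 2.0:
            return True
    return False

# Fraction of null studies that "find" at least one significant effect.
rate = sum(sham_study() for _ in range(1_000)) / 1_000
print(rate)  # roughly 1 - 0.95**20, i.e. around 0.64
```

Preregistration closes this hatch by forcing the primary outcome to be named before the data are seen.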
Now compare that level of scrutiny to how many people treat evidence in politically favored domains. What matters here is not even whether these conclusions are right or wrong, but how they become insulated from refutation.
In debates over trans healthcare, for example, studies in favor of many invasive medical interventions are based largely on self-reported outcomes, short follow-up periods, and substantial attrition. Despite these limitations, they are frequently treated as definitive. Criticisms that would be routine in almost any other medical context are instead dismissed as bad faith. But the fact that these issues involve real suffering should not exempt them from evidentiary scrutiny; it should raise the bar for it. In this case, the most comprehensive evidence available—multiple systematic reviews—has raised serious concerns about the overall quality of the evidence base, particularly with respect to pediatric interventions.
The UK’s Cass Review, commissioned by the National Health Service and published in stages between 2022 and 2024, concluded that the evidence for puberty blockers and cross-sex hormones in adolescents is generally of low certainty. Similar conclusions were reached by Sweden’s National Board of Health and Welfare and Finland’s Council for Choices in Health Care, both of which revised clinical guidelines after finding the evidence weaker than previously assumed. None of this proves that such treatments never help anyone, especially adults who exhausted other options. It does show that claims of scientific certainty are unjustified.
The same pattern appears at the level of theory. New Atheists made a cottage industry out of attacking unfalsifiable religious claims and god-of-the-gaps reasoning. Yet many of the same people now defend claims about “systemic discrimination” that are structured in exactly the same way: When disparities persist, they are treated as proof. When they shrink, the explanation retreats to subtler and less measurable mechanisms. Evidence against the claim rarely counts against the claim in the way it would in other domains.
Consider policing. It is often treated as a settled fact that racial bias is the primary driver of police shootings. But when Harvard economist Roland Fryer examined multiple large national datasets on police use of force, he found that there were no racial differences in officer-involved shootings once relevant contextual factors—such as crime rates, encounter circumstances, and suspect behavior—were taken into account.
What followed was not a broad reevaluation of the claim, but a shift in how it was framed. Rather than direct bias operating at the level of individual officers, explanations moved toward less specific and harder-to-measure forces: institutional culture, historical legacy, or diffuse forms of “structural” racism. These explanations may or may not be true, but they function differently from the original claim. Because they are more abstract and less tightly specified, they are also far more difficult to test or falsify.
Here’s the key issue: The pattern we can observe in all this is not that evidence resolved the question, but that disconfirming evidence changed the nature of the claim itself. A hypothesis that was once presented as empirically straightforward became broader, more elastic, and increasingly insulated from direct empirical challenge. Sound familiar? It’s the god-of-the-gaps fallacy.
The same pattern appears in debates over wage gaps. Raw differences in average earnings between groups are often presented as straightforward evidence of discrimination. But when researchers such as June O’Neill and later Claudia Goldin showed that simply controlling for factors such as occupation, hours worked, experience, career interruptions, and job risk substantially narrows or eliminates many commonly cited wage disparities, the original claim quietly shifted.
It was no longer argued that some demographics were being paid less than others for the same work under the same conditions. Instead, the explanation moved upstream: Sexism or systemic racism were said to operate on the variables themselves, shaping career choices, work hours, and occupational sorting in ways that produced lower average pay.
Again, these higher-level explanations may be partly true. But they function very differently from the initial claim. A hypothesis that began as a concrete, testable assertion about unequal pay for equal work became broader, more abstract, and harder to falsify. Evidence that would ordinarily count against the claim did not weaken it; it simply pushed the claim into less measurable territory. In other words, evidence that would count against the claim in any other domain instead causes the claim to become broader, more abstract, and less falsifiable. In these cases, disparities function the way miracles once did in theology: as proof of hidden forces.
What bothered me about the New Atheism movement was not disagreement over conclusions. It was the collapse of standards. Arguments once dismissed as unscientific were rehabilitated the moment they became morally fashionable. I focus here on the New Atheism movement because it marked the first time in my life (and, as far as I can tell, the first time in history) that a movement, at least on its surface, explicitly committed itself to applying the highest standards of evidence to some of the most consequential claims about the world, and in doing so successfully and very publicly dismantled societal structures and beliefs that had endured for millennia.
I’ve been thinking about all this for a long time, and I’ve come to suspect that most people—not by choice, but by evolutionary design—do not want or need a fully accurate understanding of how the world works. They want beliefs that protect their identity, signal membership in the right group, and increase their chances of (social) survival. Michael Shermer explained some of the evolutionary processes at hand here rather well in his books How We Believe and Conspiracy. In short, when it comes to patternicity—the human tendency to find meaningful patterns in meaningless noise—making Type 1 errors (i.e., finding nonexistent patterns) carries little evolutionary risk, while the opposite (i.e., missing real patterns) can often be the difference between life and death. This means that natural selection will favor strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.
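The asymmetry behind patternicity can be shown with a toy signal-detection model. All numbers here are invented, and the triangular likelihoods are a deliberate oversimplification: an agent hears a rustle of strength s and must decide “predator” (flee) or “just wind” (stay). Because a miss is catastrophic and a false alarm is cheap, the cost-minimizing agent flees even when a predator is wildly improbable.

```python
P_PREDATOR = 0.05       # prior: real patterns are rare
COST_MISS = 100.0       # Type 2 error: get eaten
COST_FALSE_ALARM = 1.0  # Type 1 error: wasted sprint

def posterior(s):
    """P(predator | signal s): predators tend to make loud rustles,
    wind tends to make quiet ones (crude triangular likelihoods)."""
    like_pred = s / 55.0          # likelihoods sum to 1 over s = 0..10
    like_wind = (10 - s) / 55.0
    num = like_pred * P_PREDATOR
    return num / (num + like_wind * (1 - P_PREDATOR))

# Flee whenever the expected cost of staying exceeds the cost of fleeing.
flee_levels = [s for s in range(11)
               if posterior(s) * COST_MISS > COST_FALSE_ALARM]
print(flee_levels)                            # [2, 3, 4, 5, 6, 7, 8, 9, 10]
print(round(posterior(min(flee_levels)), 3))  # 0.013: flee at ~1% certainty
```

The agent ends up fleeing at almost every signal level, making many Type 1 errors by design, which is exactly the strategy natural selection favors when misses are lethal.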
Under those conditions, reasoning becomes performative. Skepticism is adopted when it flatters the self and abandoned when it threatens a moral narrative. That is why debates on these topics so often drift toward unfalsifiable language and moral imperatives.
A fair question follows: How does anyone know they are not doing the same thing?
I think the real danger we should try to internalize is not that other people do this. It is that all of us do.
The only television show I watch regularly is the NBC Evening News: I watch the whole thing from 5:30-6, completely ignoring phone calls and other disturbances. Last night the lead story was about the disappearance of Nancy Guthrie, the mother of Savannah Guthrie, a well-liked NBC news journalist and co-anchor of the network’s Today show. Mother and daughter were close, with Nancy often appearing on Savannah’s show.
Nancy Guthrie was 84, and simply disappeared from her home in Tucson, Arizona on Sunday. She has limited mobility, and when she didn’t show up for church a friend called the police, who discovered her disappearance. Nancy Guthrie relies on medication that she must take every 24 hours or she might die. An interview with the local sheriff revealed that there were signs of violence, and that Nancy was probably abducted. It’s now Tuesday, so she might already be dead.
The NBC news, both national and local, gave the disappearance not only the lead story, but also lots of air time because Savannah’s a member of the network family. The first paragraph of the NBC national news story is this:
“TODAY” co-anchor Savannah Guthrie is asking for prayers for her mother’s safe return as Arizona authorities continue to investigate her possible abduction.
Savannah also related, on the evening news, that the greatest gift she got from her mother was a deep belief in God, as you see in the plea for prayers above. On the local NBC news, anchor Alison Rosati ended her report on the disappearance by saying that she and other NBCers were also praying for Nancy Guthrie.
This is a tragedy for the Guthrie family, especially because Savannah and her mom were so close, and I won’t be dismissive of the call for prayers by nearly all the reporters. It did, however, get me thinking about people’s views about what prayers are supposed to accomplish, how they’re received by the God people imagine, and how educated people (Savannah has a J.D. from Georgetown Law) come to think that prayers are useful.
It’s clear that all the calls for prayer by newspeople reflect the still-pervasive religiosity of America, though I’m not sure whether, for some, the call for prayer is just a pro forma expression of sympathy. But surely for many, prayers are supposed to work: God is supposed to hear them and do something—in this case intercede to help bring Nancy Guthrie back alive. And that got me thinking about how people connect prayer with the listener: God. Religious Jews are, by the way, among the most fervent pray-ers, with prayer serving as a constant connection with God. And, like prayers in other religions, Jews sometimes use prayer to ask for personal benefits or simply to propitiate God.
The train of thought continued. What kind of God is more likely to effect changes requested in prayer? If God is omniscient, omnipotent, and good, wouldn’t He know that people want things, like Nancy Guthrie’s return, and not need their prayers to find out? (He presumably can read people’s minds.) A god who requires prayers to effect change would be dictatorial and mean-spirited, demanding that obsequious people supplicate and propitiate him. But surely that’s not the kind of God most Christians imagine. (My feeling is that Jews envision a somewhat angrier God—the one in the Old Testament.)
Nevertheless, despite quasi-scientific studies showing that intercessory prayers don’t work, people ignore those data, as of course they would; accepting them would be tantamount to admitting that there’s no personal God who has a relationship with you. Sam Harris has suggested that these studies are weak, and Wikipedia quotes him this way:
Harris also criticized existing empirical studies for limiting themselves to prayers for relatively unmiraculous events, such as recovery from heart surgery. He suggested a simple experiment to settle the issue:
Get a billion Christians to pray for a single amputee. Get them to pray that God regrow that missing limb. This happens to salamanders every day, presumably without prayer; this is within the capacity of God. I find it interesting that people of faith only tend to pray for conditions that are self-limiting.
He has a point, of course, and that experiment would never work. But it’s still intercessory prayer. Perhaps God answers only prayers coming from the afflicted themselves. But that implies that the “thoughts and prayers” of other people, as in the Guthrie case, are useless. In the end, the very idea of petitionary and intercessory prayer being effective implies that God is, as Christopher Hitchens said, like a Celestial Dictator presiding over a divine North Korea, requiring constant propitiation by obsequious believers. How could it be otherwise?
One response by liberal religionists is that one prays not for help, but simply as a form of meditation or rumination. In other words, perhaps putting things into words—even words that nobody is hearing—helps you as a form of therapy, or in sorting out your thoughts and problems. That’s fine, though it’s unclear why rumination alone wouldn’t suffice.
I won’t deny anybody their belief in God, but I don’t want people forcing their beliefs on me, which is what occurs when newspeople ask for my prayers. I have none to give, though I wish people in trouble well, and hope that Nancy Guthrie returns.
These thoughts may sound cold-hearted, but they’re similar to what Dan Dennett wrote in his wonderful essay, “Thank Goodness,” describing who should really have been thanked for saving his life after a near-fatal aortic dissection:
What, though, do I say to those of my religious friends (and yes, I have quite a few religious friends) who have had the courage and honesty to tell me that they have been praying for me? I have gladly forgiven them, for there are few circumstances more frustrating than not being able to help a loved one in any more direct way. I confess to regretting that I could not pray (sincerely) for my friends and family in time of need, so I appreciate the urge, however clearly I recognize its futility. I translate my religious friends’ remarks readily enough into one version or another of what my fellow brights have been telling me: “I’ve been thinking about you, and wishing with all my heart [another ineffective but irresistible self-indulgence] that you come through this OK.” The fact that these dear friends have been thinking of me in this way, and have taken an effort to let me know, is in itself, without any need for a supernatural supplement, a wonderful tonic. These messages from my family and from friends around the world have been literally heart-warming in my case, and I am grateful for the boost in morale (to truly manic heights, I fear!) that it has produced in me. But I am not joking when I say that I have had to forgive my friends who said that they were praying for me. I have resisted the temptation to respond “Thanks, I appreciate it, but did you also sacrifice a goat?” I feel about this the same way I would feel if one of them said “I just paid a voodoo doctor to cast a spell for your health.” What a gullible waste of money that could have been spent on more important projects! Don’t expect me to be grateful, or even indifferent. I do appreciate the affection and generosity of spirit that motivated you, but wish you had found a more reasonable way of expressing it.
In other words, “thoughts” are fine; “prayers,” not so much.
I’m writing this simply to work out my own thoughts about prayer and its ubiquity, but I would appreciate hearing from readers about this issue. What do you think when you hear others asking for prayers? Is prayer a good thing, and what does it presume about God? Any thoughts (but no prayers) are welcome; put them below.
Engaging on social media to discuss pseudoscience can be exhausting, and make one weep for humanity. I have to keep reminding myself that what I am seeing is not necessarily representative. The loudest and most extreme voices tend to get amplified, and people don’t generally make videos just to say they agree with the mainstream view on something. There is massive selection bias. But still, to some extent social media does both reflect the culture and also influence it. So I like to not only address specific pieces of nonsense I find but also to look for patterns, patterns of claims and also of thought or narratives.
Especially on TikTok, but also on YouTube and other platforms, one very common narrative I have seen amounts to denying history, often replacing it with a different story entirely. At the extreme the narrative is – “everything you think you know about history is wrong.” Often this is framed as – “everything you have been told about history is a lie.” Why are so many people, especially young people, apparently susceptible to this narrative? That’s a hard question to research, but we have some clues. I wrote recently about the Moon landing hoax. Belief in this conspiracy in the US has increased over the last 20 years. This may be simply due to social media, but it also correlates with the fact that people who were alive during Apollo are dying off.
Another factor driving this phenomenon is pseudoexperts, who can also use social media to get their message out. Among them are people like Graham Hancock, who presents himself as an expert in ancient history but is actually just a crank. He has plenty of factoids in his head, but he has no formal training in archaeology and is the epitome of a crank – usually a smart person with outlandish ideas who never checks those ideas with actual experts, so they slowly drift off into fantasy land. The chief feature of such cranks is a lack of proper humility, even overwhelming hubris. They casually believe that they are smarter than the world’s experts in a field, and that, based on nothing but their smarts, they can dismiss decades or even centuries of scholarship.
Followers of Hancock believe that the pyramids and other ancient artifacts were not built by the Egyptians but an older and more advanced civilization. There is zero evidence for this, however – no artifacts, no archaeological sites, no writings, no references in other texts, nothing. How does Hancock deal with this utter lack of evidence? He claims that an asteroid strike 12,000 years ago completely wiped out all evidence of their existence. How convenient. There are, of course, problems with this claim. First, the asteroid strike at the end of the last glacial period was in North America, not Africa. Second, even an asteroid strike would not scrub all evidence of an advanced civilization. He must think this civilization lived in North America, perhaps in a single city right where the asteroid struck. But they also traveled to Egypt, built the pyramids, and then came home, without leaving a single tool behind. Even a single iron or steel tool would be something, but he has nothing.
Of course, there is also a logical problem, arguing from a lack of evidence. This emerges from the logical fallacy of special pleading – making up a specific (and usually implausible) explanation to explain away inconvenient evidence or lack thereof.
Core to the alternative history narrative is also the idea that those ancient people could not possibly have built these fantastic artifacts. This is partly a common modern bias – we grossly underestimate what was possible with older technology, and how smart ancient people could be. Even thousands of years ago, in any culture, people were still human. Sure, there has been some genetic change over the last few thousand years, but not dramatic change, and it is mostly a matter of how common certain alleles were, not whether they existed. In other words – every culture could have had their Einstein. Ancient Egypt had genius architects, and in some cases we even know who they were.
People also underestimate the willingness of ancient people to engage in long periods of harsh work in order to accomplish things. Perhaps this is a “modern laziness bias” (I think I just coined that term). We are so used to modern conveniences that the idea of polishing stone for 12 hours a day for a year in order to create one vase seems inconceivable. The pyramids, it is estimated, were constructed by 20,000-30,000 workers over 20 years. This included skilled masons, who likely became very skilled during the project. Egypt had an infrastructure of such skilled workers, supported by many long-term projects over centuries.
Which brings up another point – we underestimate how much time these ancient civilizations existed. My favorite stat is that Cleopatra lived closer in time to the Space Shuttle than to the building of the pyramids. Wrap your head around that. These ancient people were clever, they included highly skilled crafters, and they had centuries, at least, to advance their techniques.
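The Cleopatra stat holds up under rough dates (approximate completion of the Great Pyramid ~2560 BC, Cleopatra’s death 30 BC, first Space Shuttle flight 1981 AD):

```python
# Rough dates as signed years (negative = BC); ignores the missing year zero.
GREAT_PYRAMID = -2560  # approximate completion of the Great Pyramid
CLEOPATRA = -30        # Cleopatra's death
SHUTTLE = 1981         # first Space Shuttle flight (STS-1)

years_back_to_pyramid = CLEOPATRA - GREAT_PYRAMID  # 2530 years
years_forward_to_shuttle = SHUTTLE - CLEOPATRA     # 2011 years
print(years_back_to_pyramid, years_forward_to_shuttle)  # 2530 2011
```

About 2,530 years back to the pyramids versus about 2,011 years forward to the Shuttle: the stat is correct.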
What amazes me is that this narrative of denying history extends to recent events. Again, the Moon landing is an example. But there is also a narrative circulating on TikTok that buildings from the 18th, 19th, and even 20th centuries were not built by the people historians say built them. They were found in place, having been built by an older and more advanced civilization – called Tartaria. Never heard of it? That’s because it does not exist. This civilization was supposedly wiped out by a world-wide mud flood in the 19th century. According to this particularly nutty conspiracy theory, modern governments just occupied the buildings left behind and then conspired together to wipe the history of the mud flood and Tartaria from all records.
What is even more amazing to me is that, in far less time than it took to create a TikTok video spreading this nonsense, someone with even white-belt-level Google-fu could have found convincing evidence that this is wrong. You can find pictures of the buildings being built, or of the cities before they were built, or documentation of them being built, or experts who have already gathered all this information for you. You can also find that “Tartaria” was a medieval label used to denote the “land of the Tartars,” which simply refers to Mongols. It was a nonspecific geographic label, not an actual place or nation.
But of course, none of this matters in a social media world in which narrative is truth, everything “they” say is a lie, and in fact truth or lie is not even really a thing. It’s all narrative, it’s all performance and clicks.
And this is why scholars and scientists need to engage with the world, much more than they currently do. We cannot simply ignore the nonsense with the idea that it will shrivel and die if we don’t give it light. That is such a pre-social media idea (if it was ever true). We have to fight for scholarship, for logic, facts, and evidence. We have to fight for history.
The post Forgetting History first appeared on NeuroLogica Blog.