In 2020 Joe Biden became the first Democratic nominee in 36 years without a degree from the Ivy League. Obama, before him, filled no fewer than two-thirds of all cabinet positions with Ivy League graduates—over half of them drawn from either Harvard or Yale.1 In Congress today, 95 percent of House members and 100 percent of senators are college educated.
According to a recent study published in Nature, 54 percent of “high achievers” across a broad range of fields—law, science, art, business, and politics—hold degrees from the 34 most elite universities in the country.2 The sociologist Lauren Rivera, studying top firms in finance, consulting, and law, found that recruiters are jonesing for applicants from prestigious academic institutions: they typically target just three to five “core” universities—Harvard, Yale, Princeton, Stanford, and MIT, the usual suspects—then identify five to fifteen additional second-tier options—such as Berkeley, Amherst, and Duke—from which they will more tentatively accept resumés.3 Everyone else almost certainly never even gets a reply email. Why? Because, as one lawyer explained the strategy to Rivera, “Number one people go to number one schools.”
Given this new American caste system, it’s no surprise that 63 percent of Americans think that “experts in this country don’t understand the lives of people like me,” or that 69 percent feel the “political and economic elite don’t care about hardworking people.”4 And, I suggest, they’re not wrong. A culture that sanctifies college as the gateway to full citizenship corrodes, over time, the foundations of democratic life. It devalues work that doesn’t come with a degree, licenses contempt for those not formally educated, and locks the working class out of positions of power. The result isn’t just underrepresentation; it’s resentment. As the journalist David Goodhart writes, “We now have a single route into a single dominant cognitive class”; where “an enormous social vacuum cleaner has sucked up status from manual occupations, even skilled ones,” and appropriated it to white-collar jobs, even low-level ones, in “prosperous metropolitan centers and university towns”; and where broad civic contribution has been replaced with narrow intellectual consensus.5 The result is a backlash not against education, but against the assumption that only one kind of education counts.
“At a time when racism and sexism are out of favor,” writes Harvard philosopher Michael Sandel, “credentialism is the last acceptable prejudice.”6 In a cross-national study conducted in the United States, Britain, the Netherlands, and Belgium, a team of social psychologists led by Toon Kuppens found that the college-educated class held a greater bias against less educated people than against other disfavored groups.7 In a list that included Muslims, poor people, obese people, disabled people, and the working class, “stupid people” were the most disliked. Moreover, the researchers found that elites are unembarrassed by the prejudice; unlike homophobia or classism, it isn’t hidden, hedged, or softened—it’s worn openly, with an air of self-congratulation. As the Swedish political scientist Bo Rothstein observes, “The more than 150-year-old alliance between the industrial working class and what one might call the intellectual-cultural Left is over.”8
Today we are living through a strange time in American life in which the numbers have declared victory. By most standard economic measures—employment, wages, even household net worth—the working class is better off than it was a generation ago.9, 10, 11 The average elevator mechanic gets paid over $100,000 per year12; master plumbers can make more than double that.13 Even in Mississippi, our country’s poorest state, workers see higher average wages than in Germany, Britain, or Canada.14
It is, for working-class Americans today, the best of times, objectively—and the worst of times, subjectively. This is not because the spreadsheets are wrong, but because we fail to count the things that history records in tone, not totals: mood, myth, and cultural resolve.
The Service Economy

According to the most recent data available from the United States Bureau of Labor Statistics, nearly four out of five Americans work in the service sector.15 For most Americans in most states, that means retail, fast food, or some other smile-for-hire job located at the end of a check-out line.16 It’s a kind of work where labor isn’t just accomplished, it’s seen—performed under the soft surveillance of the American customer. So, beneath the inflation charts and unemployment rates, if you want to understand the feelings side of the postindustrial economy, you might start with tipping.
It is, today, perhaps our most American habit—tipping for service, whether it be good, bad, or not provided. In restaurants, hair salons, and hotel lobbies, Americans tip over a hundred billion dollars a year—more than any other country on earth, indeed more than all of them combined.17 We tip cab drivers and pool cleaners and dog groomers and coat checkers. We tip the doorman on the way in, the bellhop on the way up, and the concierge on the way out. Americans tip so much that, as one European put it, the whole “approach [has become] completely deranged and out of control.”18
However, it wasn’t always this way. In fact, for much of the early 20th century, it was Americans who mocked Europeans for tipping—seeing it as smug, corrupt, and born of feudal etiquette.19 States such as Iowa, South Carolina, and Tennessee—among others—outlawed the practice entirely20; and wherever it remained legal, businesses proudly posted signs that read “No Tipping Allowed.”21 Some hotels even installed “servidors”—two-way drawers that opened from both the hallway and the room—so staff could deliver laundry without being seen, and without being tipped.22 As the author William R. Scott put it in a book-length critique in 1916:
In an aristocracy a waiter may accept a tip and be servile without violating the ideals of the system. In the American democracy to be servile is incompatible with citizenship … Every tip given in the United States is a blow at our experiment in democracy … Tipping is the price of pride. It is what one American is willing to pay to induce another American to acknowledge inferiority.

Somewhere along the way, however—somewhere between the Marshall Plan and the first McDonald’s Happy Meal—the roles reversed; and we became the punchline. It became the Americans who tipped like royals—and the Europeans who saw it as such.
It was during this time that the gesture was institutionalized—not out of custom or conscience, but because the Pullman Company, the National Restaurant Association, and eventually big tech sold it as part of the deal.23 They lobbied Congress, added tip lines to receipts, and made feudalism feel American—if you’re the one tipping.24 Because on the other end—where the customer is always right—yes, the tip is now expected, and yes, it is now appreciated; but gratuity has never been the same thing as respect, especially not when, for most working-class Americans, IHOP has become the least humiliating option.
The Status Economy

We are signaling-obsessed, hierarchy-calibrated social apes. All of us, according to author Will Storr in The Status Game, walk around like buzzed-up antennas—attuned to the faintest frequency of admiration or disdain, gossip or snicker.25 For most of human history, it wasn’t guns, germs, or steel that mattered most; it was access to the cooperative networks and high-yield alliances of a species where insiders eat first and the gates are closely guarded. And so what governs our decisions—above all else, even when no one’s watching—is the paranoia of social scrutiny. In other words, it’s a cost-benefit analysis where the material outcome barely matters and utility is downstream of reputational impact.
Absent this understanding of human behavior, very little of it makes sense—a core theme in the work of the early 20th-century economist Thorstein Veblen, whose concept of “conspicuous consumption” describes how people often consume products they don’t need—or even want—in order to flaunt status and social class.26 Luxury watches that tell time worse and minimalist chairs you can’t sit on are purchases where the high price is the point.
Of course, it is no major insight to say that people buy things to show off. The anthropological record is rich with lavish feasts and displays of abundance. The famous “potlatch ceremonies” of Pacific Northwest Indian tribes, for example, involved burning immense stores of wealth—copper shields, hand-carved canoes that took years to build, blankets, oil, and food—generations of accumulated capital, in a single afternoon, just to signal status.27
But what about meditating, carrying around a well-worn copy of The New Yorker in your back pocket, or believing in climate change? Veblen’s brilliance was seeing that even our quietest preferences are currency in a market economy of social prestige. As British philosopher Dan Williams puts it:
Much cognition is competitive and conspicuous. People strive to show off their intelligence, knowledge, and wisdom. They compete to win attention and recognition for making novel discoveries or producing rationalizations of what others want to believe. They often reason not to figure out the truth but to persuade and manage their reputation. They often form beliefs not to acquire knowledge but to signal their impressive qualities and loyalties.

It’s the kind of signaling that thrives in what sociologists call “post-material economies” such as contemporary America.28 Because in a society maxed out on comfort—where even the ultrawealthy can’t buy a better Netflix or a softer couch—the only lines left to draw are ideological; and social distinction becomes the new class war. The rub, however, is that unlike the peacock’s tail—a hard-to-fake signal, metabolically costly, and policed by survival—immaterial prestige hierarchies are cultural inventions; often arbitrary, often performative, and almost always enforced from the top down. In other words, social prestige isn’t earned—it’s distributed by those who already have it. As social scientists Johnston and Baumann described in a 2007 paper:
The dominant classes affirm their high social status through consumption of cultural forms consecrated by institutions with cultural authority. Through family socialization and formal education, class-bound tastes for legitimate culture develop alongside aversions for unrefined, illegitimate, or popular culture.29

The elite don’t just consume goods. They consecrate tastes, turning culture into a class barrier such that status is socially assigned rather than materially demonstrated. French sociologist Pierre Bourdieu called it symbolic capital—where opinions double as vocabulary tests and entry fees for membership into the aristocracy.30 As Princeton’s Shamus Khan explains, “Culture is a resource used by elites to recognize one another and distribute opportunities on the basis of the display of appropriate attributes.”31
Observing today’s ruling class, social psychologist Rob Henderson has coined the term “luxury beliefs,” arguing that the experts, the celebrities, and the institutions are all fluent in the same woke-speak, and by their material abundance can afford to focus almost exclusively on social justice issues that, ensconced in their gated communities, have no effect on their own luxurious lives (nor those of the people they profess to be helping).32
The words turn and turn again—testing for status, enforcing the pecking order.33 And now, just as working-class Americans born in the industrial economy once rejected cash tips, those born in the culture-capital economy don’t want the tip either. They want respect. The redneck reluctance to simply “trust the experts” or pronounce it “people of color” instead of “colored people” isn’t about bigotry or Bible verses or disinformation—it’s about refusing the role of grateful recipient in someone else’s moral theater. It’s not anti-intellectualism or anti-love and kindness. It’s anti-elitism.
How is it that a born-rich multibillionaire has become the standard-bearer for the working class? It’s because his favorite food is McDonald’s; and to Nancy Pelosi, George Clooney, and my high school guidance counselor—Trump is trash. They see him the same way they see trailer park America—as tacky, ignorant, and disposable; always on the lowborn side of the tip. It’s a feeling well-known in union organizing circles.34 That when people are angry, it’s rarely about money. It’s about being looked down on.
A New Nationalism

Culture can often be hard to think about because it doesn’t exist in the world of objects—it exists in the world as a perceptual experience. It has no mass, no edge, no location. It’s not made of things; it’s made of meanings—real, but not tangible.
The cultural backlash hypothesis, the status threat hypothesis, the social isolation hypothesis, the political alienation hypothesis, the nostalgic deprivation hypothesis—a growing body of scholarship has emerged to name and quantify the immaterial contours of twenty-first century populist discontent; all circling the drain of an old, half-remembered truth.35, 36, 37, 38, 39
For most of history, kings, philosophers, and statesmen took seriously the idea that civilizations depend on symbolic cohesion—on rituals, traditions, and agreed-upon fictions capable of domesticating our most socially inconvenient biological biases. They understood, whether by insight or instinct, that there’s something important about ceremony and uniform and national character. That propaganda isn’t all bad. That done right, good slogans make good citizens. And good citizens make great nations. As Gidron and Hall put it in a recent paper:
[I]ssues of social integration [must be taken] more seriously in studies of comparative political behavior. Such issues figured prominently in the work of an earlier era … but they fell out of fashion as decades of prosperity seemed to cement social integration.40

In the old economy it was simple. You had the rich, who lunched at steakhouses and voted Republican; the working class, who labored in factories and voted Democrat; and in between, the mass suburban middle class. When it came, the conflict was clear—members of the working class joining forces with progressive intellectuals to oppose the moneyed elite. Yet every once in a while, a new, revolutionary class of citizens comes along and scrambles the whole social order. In the late 20th century it was the scholastic king—and the new culture-laureate class. He is not merely an academic; he is society’s central planner, a warden of elite passage, and the face of the new American aristocracy. As The New York Times columnist David Brooks put it:
If our old class structure was like a layer cake—rich, middle, and poor—the creative class is like a bowling ball that was dropped from a great height onto that cake. Chunks splattered everywhere.41

Outsourcing made economic sense, globalization was in large part inevitable, and cheap goods are always good politics—sure, fine. But for over fifty years now, neither political party has been able to solve the social problem of a postindustrial economy. And no American president has been able to tell a story good enough to replace the one previous generations called true. As sociologist Arlie Hochschild explained in a recent interview with The New York Times:
We keep looking for real policies. That’s not the thing. Trump offers a veneer of policies and a story, and we’ve got to tune in to the effect of that story on people who feel like the world’s melting and sinking … Because whatever the policies, these voters are following the story and the emotional payoff of that anti-shaming ritual. So we have to stop the story, reverse the story: Nobody stole your pride, we’re restoring it together.42

In the same way philanthropy never solves economic inequality, bigger and better information tips will never win the culture war—because it’s not about being rich or poor, stupid or smart; it’s about better than or worse than. And the only thing that can make a rich person feel worse than a poor person—or a smart person worse than a stupid one—is a national story written by poor people and stupid people too. It’s the sort of new nationalism that, in the past, has required several interconnected efforts.
The Bottom Line

Robert F. Kennedy, in March of 1968, in a speech at the University of Kansas, noted: “The gross national product can tell us everything about America except why we are proud that we are Americans.”43
Rubber in Akron. Meat in Chicago. Coal in Scranton. Steel in Gary. It used to be you knew a city by what it made—how it sounded, how it smelled. In 1950 Detroit was the richest city in the world—that’s right, the entire world.44 On Zug Island, they used to make the whole car, start to finish—iron ore mined and smelted on one end, parts shaped and assembled along the way, and a new Ford rolled off the line at the other—no imports, no one else. It was vertical integration—of work, of community, of pride.
But by the 1970s a new day had dawned, the old days were gone, and the unraveling had begun. Over half the manufacturing jobs moved elsewhere, a quarter of the population went too; and with whole neighborhoods left to rot, Detroit, once called “the Paris of the Midwest,” became one of the deadliest cities in the country.45, 46 From 1965 to 1974, homicides quintupled47; the central business district earned the name “zone of decay”; and businesses began installing bulletproof glass—floor to ceiling—to protect storefront clerks.
Just like that—two short decades transformed America’s motor city into America’s murder city. And burnt, bled, and bankrupt, the once shining example rolled out perhaps the saddest, most pitiful ad campaign in American history: “Say Nice Things About Detroit.”48
The bottom line is this. Every new economy produces different winners and losers—it’s just the way it is. What happened in Detroit was, in many ways, what was expected. But when the losses came—when the bottom fell out for the millions of working-class Americans still there, still trying—it was treated not as a national obligation but as an unfortunate footnote to progress. Detroiters were told to retrain, relocate, find a way to adjust—and when they failed, just like the people still living in Akron, Scranton, and Gary, they were humiliated, cast as mascots of ignorance and failure. The problem is that the ignorant and the failed far outnumber those who aren’t. And so, as Franklin Roosevelt said, it’s not “whether we add more to the abundance of those who have much” that matters—“it is whether we provide enough for those who have too little.”
Because when the empire falls—when the American experiment joins the long ledger of civilizations past—it won’t be at the hands of China or Russia or Al Qaeda or anyone else. We are the richest nation in the history of the world; no other society has ever wielded as much global influence; not even a coalition of all the world’s armies could best ours. “If destruction be our lot,” wrote a 28-year-old Abraham Lincoln, “we must ourselves be its author and finisher.”49 As “a nation of freemen, we must live through all time, or die by suicide.”

And if it comes to that—if we choose death, it won’t be about free trade or wages or unemployment rates any more than it was about taxes in 1776. Once again, it will be about respect.
In this Free Press article, Steve Pinker and Marian Tupy (the latter identified as “the founder and editor of HumanProgress.org, a senior fellow at the Cato Institute, and co-author of Superabundance”) once again recount the indubitable progress that humanity has made over the past six or seven centuries. The progress described here will be familiar to you if you’ve read Pinker’s two big books, Better Angels and Enlightenment Now: progress in health, longevity, reduced poverty, better nutrition, less chance of violent death, and almost all indices of “well being”.
Click to read (if you have a subscription):
I’m not sure why Pinker is constantly attacked for touting progress, as the data are irrefutable, but I guess there’s a subgroup of “progressive” historians (and perhaps conservative ones) who like to aver that we’ve made little progress since the Middle Ages. Indeed, perhaps we’ve even regressed, and we’d be better off living in the Middle Ages. This anti-Whiggish view is usually espoused by the religious, who say that the waning of religion has impoverished modern life. Perhaps leftish people don’t like the notion that we’re making progress (e.g., some say we’re worse off in racial relations now than during Jim Crow days), while rightish ones don’t like the palpable loss of faith among people in the West.
A few quotes:
Last month at Yale, the influential political blogger Curtis Yarvin, in a debate against Free Press contributor Jed Rubenfeld, argued that America ought to “end the democratic experiment”—and establish a monarchy. Yarvin has noted that Donald Trump is “biologically suited” to be America’s monarch. The ideas may sound extreme, but they have been influential. J.D. Vance describes Yarvin as “a friend,” and has cited his work. And Yarvin is part of a family of movements—known as the Dark Enlightenment, Techno-authoritarianism, and Neo-Reaction (NRx)—that reject the entire family of Enlightenment values.
Meanwhile, theocracy is making a comeback, in movements known as theoconservatism, Christian Nationalism, and National Conservatism. The “National Conservatism Statement of Principles,” for example, declares that “where a Christian majority exists, public life should be rooted in Christianity and its moral vision, which should be honored by the state and other institutions both public and private.” The list of signatories is a lookbook of influential conservatives, including Charlie Kirk, Peter Thiel, and Trump administration insiders Michael Anton and Russell Vought—as well as our fellow Free Press contributors Christopher Rufo and Rod Dreher.
The latter, a friend of the vice president, has said elsewhere that the West will not “recover until and unless we become re-enchanted and seek a form of Christianity, and indeed of Judaism, that is more mystical, that valorizes this direct perception of the Holy Spirit, of holiness, and of transcendence.”
. . . Of course, humanity has already tried monarchy and theocracy—during the Middle Ages—and sure enough, some of the new reactionaries are saying that those times were not so bad after all. Dreher writes admiringly: “In the mind of medieval Christendom, the spirit world and the material world penetrated each other. . . . Men construed reality in a way that empowered them to harmonize everything conceptually and find meaning amid the chaos.”
Other influential conservatives go further in justifying medieval hierarchies. On his eponymous show, Tucker Carlson recently declared: “Feudalism is so much better than what we have now. Because at least in feudalism, the leader is vested in the prosperity of the people he rules.”
One of the themes of this article is how religion has in fact been an impediment to progress, and this seems to be the strongest attack on religion I’ve seen yet from Pinker (I haven’t read Tupy before). Perhaps Steve is preparing for his debate with Ross Douthat later this month (stay tuned), which will be about God. Douthat’s new book is Believe: Why Everyone Should Be Religious.
Here’s the money quote about progress, which I’ve put in bold:
It’s said that the best explanation for the good old days is a bad memory, and the historical amnesia of the romanticizers of medieval Christendom is near-complete. Among the blessings of modernity is an Everest of data about life in the past, painstakingly collected by economic historians from original sources over many decades. This quantitative scholarship circumvents fruitless back-and-forth about whether the Dark Ages were really all that dark: We can go to the numbers.
I won’t go through the numbers, as you probably know them, but they’re impressive. Here are just a few facts:
Some numbers can shake us out of this spoiled complacency. (For sources, see our respective books Ten Global Trends Every Smart Person Should Know and Enlightenment Now.) In 1800, the European life expectancy was 33 years; today, it is 79 years—which means that we have been granted not just extra life, but an extra life. Much of that gift came from leaps in prosperity that spared the lives of children. Before the turn of the 20th century, a third to a half of European children perished before their 5th birthday. Today that fate befalls three-tenths of one percent. Even the poorest countries today lose a fraction of the children that Europe did until recently. If being spared the agony of losing a child is not “meaningful,” what is?
Do people really want to go back to medieval times if they lose, on average, 46 years of life?
But the other theme of the piece is morality. In short, religious morality impedes human well-being by not giving people an impetus to help humanity, instead telling them to live by this or that religious dictum that will please their God. I agree that religion harms behavior in this way, but I have taken issue not with Pinker and Tupy’s idea that morality can be humanistic, which it can be, but with their claim that humanistic morality is objective rather than subjective. And they seem certain about this:
Our moral purpose, then, is to use knowledge and sympathy to reduce suffering and enhance flourishing: health, freedom, peace, knowledge, beauty, social connection.
. . . The Enlightenment project of grounding morality in reason and well-being left us with a coherent fabric of arguments against the brutality and injustice that had been ubiquitous in human history. These arguments became the foundation of civilized society.
A partial list: Kant’s categorical imperative and his practical prescriptions for peace. The American Founders’ analyses of tyranny, democracy, and fundamental rights. Bentham’s cases against cruelty to animals and the persecution of homosexuals. Astell’s brief against the oppression of women. Voltaire’s arguments against religious persecution. Montesquieu’s case against slavery. Beccaria’s arguments against judicial torture. Rousseau’s case against harsh treatment of children.
In contrast to the Enlightenment’s exaltation of universal well-being, the morality of holy scriptures was dubious at best.
Crucially, these moral conclusions were based on reasons. As Plato pointed out 2,300 years ago, morality can’t be grounded in divine edicts. If a commandment itself has no moral justification, why should we obey it? If it does, why not just appeal to the justification itself?
Such justification is not hard to find. All of us claim a basic right to our own well-being. If we were not alive, healthy, nourished, educated, and embedded in a community, we could not deliberate about morality (or anything else) in the first place. And because we are embedded in a community, where people can affect each other’s well-being, we can’t stop at this basic claim. None of us can coherently demand these conditions for ourselves without granting them to others. I can’t say “I’m allowed to hurt you, but you’re not allowed to hurt me, because I’m me and you’re not,” and expect to be taken seriously.
Now I agree that society will run better if people conduct themselves in a manner that won’t injure other people. But to say that morality is objective, that the moral act is the one that increases “well-being”, is to buy into the fallacies that beset Sam Harris’s identical theory, broached in his book The Moral Landscape. While increasing well-being does jibe with our usual notions of what’s moral, there are problems. I’ve described some of these in a previous post called “The absence of objective morality”, asserting that, in the end, no morality is objective; all forms of morality are based on subjective preferences. I’ll quote myself here:
It’s clear that empirical observation can inform moral statements. If you think that it’s okay to kick a dog because it doesn’t mind it, well, just try kicking a dog. But in the end, saying whether it’s right or wrong to do things depends on one’s preferences. True, most people agree on their preferences, and their concept of morality by and large agrees with Sam’s consequentialist view that the “right” thing to do is what maximizes “well being”. But that is only one criterion for “rightness”, and others, like deontologists such as Kant, don’t agree with that utilitarian concept. And of course people disagree violently about things like abortion—and many other moral issues.
One problem with Sam’s theory, or any utilitarian theory of morality, is how to judge “well being”. There are different forms of well being, even in a given moral situation, and how do you weigh them against one another? There is no common currency of well being, though we know that some things, like torturing or killing someone without reason, clearly do not increase the well being of either that person or of society. Yet there is no objective way to weigh one form of well being against another. Abortion is one such situation: one weighs the well being of the fetus, which will develop into a sentient human, against that of the mother, who presumably doesn’t want to have the baby.
But to me, the real killer of objective morality is the issue of animal rights—an issue that I don’t see as resolvable, at least in a utilitarian way. Is it moral to do experiments on primates to test human vaccines and drugs? If so, how many monkeys can you put in captivity and torture before it becomes wrong? Is it wrong to keep lab animals captive just to answer a scientific question with no conceivable bearing on human welfare, but is just a matter of curiosity? Is it moral to eat meat? Answering questions about animal rights involves, if you’re a Harris-ian utilitarian, being able to assess the well being of animals, something that seems impossible. We do not know what it is like to be a bat. We have no idea whether any creatures value their own lives, and which creatures feel pain (some surely do).
But in the end, trying to find a truly factual answer to the statement, “Is it immoral for humans to eat meat?” or “is abortion wrong?”, or “is capital punishment wrong?” seems a futile effort. You can say that eating meat contributes to deforestation and global warming, and that’s true, but that doesn’t answer the question, for you have to then decide whether those effects are “immoral”. Even deciding whether to be a “well being” utilitarian is a choice. You might instead be a deontologist, adhering to a rule-based and not consequence-based morality.
You can make a rule that “anybody eating meat is acting immorally,” but on what do you base that statement? If you respond that “animals feel pain and it’s wrong to kill them,” someone might respond that “yes, but I get a lot of pleasure from eating meat.” How can you objectively weigh these positions? You can say that culinary enjoyment is a lower goal than animal welfare, but again, that’s a subjective judgment.
By saying I don’t accept the idea of moral claims representing “facts”, I’m not trying to promote nihilism. We need a moral code, if for nothing else, to act as a form of social glue and as a social contract. Without it, society would degenerate into a lawless and criminal enterprise—indeed, the idea of crime and punishment would vanish. All I’m arguing is that such claims rest at bottom on preference alone. It’s generally a good thing that evolution has bequeathed most of us a similar set of moral preferences. I hasten to add, though, that the feelings evolution has instilled in us aren’t necessarily ones we should incorporate into morality, as some of them (widespread xenophobia, for instance) are outmoded in modern society. Others, like caring for one’s children, are good things to do.
In the end, I agree with Hume that there’s no way to derive an “ought” from an “is”. “Oughts” have their own sources, while “is”s may in part reflect evolved behaviors derived from living in small groups of hunter-gatherers. But that doesn’t make them evolutionary “oughts.”
To abortion, meat-eating, and animal rights we can now add “assisted dying.” I favor it because I think it reduces suffering, but others say that it will actually increase net suffering by killing off people who could eventually be happy, or create societies in which people are sacrificed at will. And don’t forget Hiroshima and Nagasaki. If, as the authors claim, “None of us can coherently demand these conditions [of well being] for ourselves without granting them to others,” then we open up a whole can of worms, especially involving war. In the end, saying that “well being” is a guide to objective morality begs the question of ethics: we are supposed to do X because it is more moral, and that’s because it increases “well being”. But why is increasing well being always more moral? If it’s by definition, then that really is begging the question.
I’m clearly not a philosopher, but I don’t see “increasing well being” as an objective guide to what’s moral. It is a preference, based on the subjective choice that a society with more “well being” is the one we should prefer. That is usually true, I think, but not always, and runs into substantial difficulties when you try to do the moral calculus in given situations.
Otherwise, I look forward to Steve’s debate with Douthat in two weeks, which should be great fun, even if nobody changes their minds about God.
Okay, so as this website slowly circles the drain, we’re still going to have cats on Caturday, and three items to boot.
First on deck is Larry, the Chief Mouser to the Cabinet Office at 10 Downing Street; he just turned 19, and has spent 15 of those years in the service of the Prime Minister. He’s in remarkably good shape for such an old cat, and here’s a two-minute video, in his own words, recounting how a careless photographer nearly tripped over him. Fortunately, Larry skittered away, perhaps losing half a life or so:
********************
From Bored Panda we have another large selection of cat memes. I’ll choose a few for your delectation. Click the screenshot to read; the intro says this:
Last year, the estimated expenses of owning a cat were between $830 and $3,000. Clearly, no expense is spared for cat owners when it comes to their beloved fluffballs.
Bored Panda loves cats too. That’s why we are blessing you with a collection of wholesome and cute cat memes, courtesy of the “happycat318” Instagram page. Check out the times kitties cracked up their owners with some diabolical shenanigans!
More info: Instagram [the happycat318 Instagram page], the source of all the memes:
. . . And this is a true cat lover:
********************
A persistent moggy described by the UPI; click on screenshot to read:
The tail:
A cat escaped from his owners’ camper during a stop at a gas station in Spain and reappeared months later less than a mile from their home in France.
Patrick and Evelyne Sire, who live in Olonzac, in the Hérault region of France, said their cat, Filou, apparently jumped out of an open window in their camper during an Aug. 9, 2025, stop at a gas station in Maçanet de la Selva, Spain, located near the French border about 155 miles from home.
Patrick Sire said Filou’s absence wasn’t noticed until the next morning.
Sire said he returned to the gas station twice in the ensuing days and weeks, but no one in the area had seen any signs of the missing feline.
The couple said they started to give up hope as the months passed, but they received a call Jan. 9 from a resident in Homps, less than a mile from their home, reporting Filou had been found.
The woman said she had been feeding the cat outdoors since December, and noticed he was very thin and appeared to be coughing. She took the feline to a local veterinarian, where a microchip scan identified him as Filou.
“Filou traveled all that way to get to us. But how did he do it? Did he follow the highway? Did he go through towns? Did he follow the rivers?” Patrick Sire told France3 News. “We’ll never know.”
Here’s a video in French, which shows the GPS cat and his staff. If you know a bit of French you can probably understand it, but if not you can still see how happy the staff is!:
********************
Lagniappe: A cat makes a deposit:
Happy #Caturday pic.twitter.com/nb4NfkS7CJ
— Larry the Cat (@Number10cat) January 24, 2026
A sneaky and lazy moggy and its exercise wheel:
. . . and a woman talks to her cat, but inadvertently insults it:
h/t: Ginger K., Simon, Merilee
These are the last photos I have, and I’ve gathered singletons in a potpourri of photos. Please send me any good wildlife photos you have—otherwise there will be a LACUNA tomorrow. Captions are indented, and you can enlarge the photos by clicking on them.
From Pratyaydipta Rudra in Oklahoma.
This is a Pine Squirrel [Tamiasciurus sp.], photographed in Rocky Mountain National Park, CO.
From Adrian:
Here’s a picture of a European Pine Marten (Martes martes) from the shores of Loch Duich, near the Isle of Skye, Scotland:

From Guy:
Taken in Lake Saint Clair Metropark in Michigan a few years back by my 12-year-old son Nolan at a bird-banding station where we volunteer. I think it’s a Blackpoll Warbler (Setophaga striata), with the image taken in the fall (so I don’t really know if it’s male or female):
From Robert Lang, whose house and studio burned to the ground during the California fires last year; both are being rebuilt:
Our gardener found this California native tarantula (Aphonopelma sp.) while clearing some fire debris at my former studio and, knowing that my wife had a pet tarantula and was helping the Eaton Canyon Nature Center in its fire recovery, he left it for us at our temporary home in a little plastic bottle. (Umm…the tarantula was in a little plastic bottle. Not our home.) After we determined that ECNC didn’t have a place for one yet, we released it locally, but I took this picture before it wandered away. When we got home from the release, there was another plastic bottle on the porch with another tarantula inside.

A Hummingbird Moth (species unknown) from Marty Riddle:
The hawk moths, aka hummingbird moths, love the nectar in resident-maintained gardens at Brooksby Village, Peabody, Massachusetts:
And a cat/bird encounter from Barry Lyons:
For years now, I’ve had mourning doves [Zenaida macroura] alight on my air conditioner. Some of them are regulars, and what interests me is that they haven’t taken the next obvious step: pecking at the window. What I mean is that a dove arrives and then stares into my apartment, sometimes moving its head back and forth: “Are you in there? Ah, there you are!” And then I get up from my chair and go feed them. But when will a dove start pecking at the window to alert me that he’s there? Why hasn’t it figured out that it’s something it can do? And at no cost to his safety because he can still fly away. And look at this photo. The dove seems to understand windows. Every time a cat goes to the window (I don’t own a cat; I cat-sit) it flares its wings instead of flying off, as if to say, “Ha ha, you can’t get me. I’m out here, you idiot.”

On 14 January 2025, two colliding black holes sent the clearest gravitational wave signal ever recorded rippling across the universe to Earth’s detectors. This remarkably crisp signal, designated GW250114, has allowed physicists to conduct the most stringent test yet of Einstein’s general relativity by measuring multiple “tones” from the collision. The wave passed the test with flying colours, but researchers remain optimistic that future detections might finally reveal where Einstein’s century-old theory breaks down, potentially offering the first glimpses of quantum gravity.
Astronomers have discovered a massive galaxy cluster assembling itself just one billion years after the Big Bang. There’s just one problem… it shouldn’t exist! Current models suggest it shouldn’t have formed when it did. Using NASA’s Chandra X-ray Observatory and the James Webb Space Telescope working in tandem, scientists spotted JADES-ID1, a protocluster containing at least 66 galaxies wrapped in a vast cloud of million-degree gas, forming during what should have been the universe’s infancy.
The Amaterasu particle was detected in 2021 by the Telescope Array experiment in the U.S. It is the second-highest-energy cosmic ray ever observed, carrying around 40 million times more energy than particles accelerated at the Large Hadron Collider. Such particles are exceedingly rare and thought to originate in some of the most extreme environments in the universe.
I have been wondering about the question above for a while, as I’ve read quite a few novels lately that use the word “luncheon”, with seemingly no distinction between that word and “lunch”. I was too lazy to look it up, but, typing it in the search box, I found this short (1.5-minute) YouTube explanation below:
The Oxford English Dictionary agrees (the first meaning is “A large chunk of something, esp. bread, cheese, or some other food; a thick slice, a hunk; = lunch“). The relevant entry:
There you go. But I still would like to be able to invite a friend to a restaurant for an informal luncheon. That’s not correct, but it’s fun to say. And, at any rate, I don’t think I’ve heard anyone say “luncheon” lately, even referring to a formal meal. And in fiction it’s used incorrectly all the time.
Here’s a new article in Nature (click on the title screenshot below to read it); it’s about the dearth of information about the safety of drugs used by pregnant women. Except that Nature refers not to “women” but to “pregnant people”: in the article, that is about the only term used for women who are pregnant. “Women” appears almost exclusively in quotations from others.
Here’s my count:
“Pregnant people”: Used 41 times
“Women”: Used 5 times, 4 of them in quotes from others
Clearly some bowdlerization is going on here.
The sad part is that the article has a lesson worth reading—a dearth of knowledge about how many drugs affect pregnant women—but it’s peppered with annoying, politically correct usages. For example:
The first usage of the “pp” term is in fact in the subtitle, which I’ve highlighted below (again, click the article to read it):
Here’s a screenshot with “pregnant people” highlighted. This is only a small sample of the article:
Need I say more? What this means is that Nature is clearly truckling to the language adopted by extreme gender activists, who consider trans-identified men as “women”. Ergo, the words “pregnant women” are seen as offensive, because “women” would include trans-identified men, who can’t have babies. Voilà: “pregnant people.” Also, as reader Coel says below, “The main problem is trans-IDing women, aka ‘trans men’, who, being women, can get pregnant, but who they regard as ‘men’. Hence ‘pregnant women’ would exclude them, and so amount to erasure of and thus genocide of those ‘trans men’ who are indeed pregnant.”
Here are the five uses of the word “women”, all but the last quotes from other authors (they can’t sanitize other people’s words):
Note that the last usage of woman, not in quotes, is required because they are referring to females who are not pregnant. But the journal still slipped up: they could have used “people with uteruses”, or, like The Lancet, “bodies with vaginas”:
Conclusion: Nature has been ideologically captured. But we already knew that, didn’t we?
The journal should be ashamed of itself.
h/t: Schnoid
Well, this is the last batch of photos I have, and it’s very sad that the tank is empty. Please send some in if you have them. Don’t make me beg!
Today we have photos of ducks—or rather, one female duck—from Aussie reader Keira McKenzie in Perth. Keira’s captions are indented, and you can enlarge her photos by clicking on them.
Here is a series of photos I took of a lone Pacific Black Duck [Anas superciliosa] this afternoon [Feb. 11] at the park. Since the western island in the ponds has been completely cleared of all vegetation and all the undergrowth has been cleared from the eastern island (this is because of the devastation of Perth’s trees by the polyphagous shothole borer), most of the waterbirds have left for areas where they can roost & nest.
The photos are taken in Hyde Park, Perth, Western Australia, on a hot humid afternoon.
I am very fond of them. I rescued one when it flew into the electric wires on the other side of the road one night. I carried it back across the road and into the park, putting it near the water’s edge. It was a pond-smelling little bundle, seemed uninjured and was very calm, and waddled off into the water and sailed into the night.
What a beautiful hen! It makes me eager for Duck Season to arrive at Botany Pond. Keira also sent a picture of her cat:
I shall sign off with a pic of my little Baba (currently zooming around the place for no apparent reason) slothing in the armchair in the heat with one of her favourite toys (the other is a wombat).
The universe is a big place, and tracking down some of the more interesting parts of it is tricky. Some of the most interesting parts, at least from a physics perspective, are merging black holes, so scientists spend a lot of time trying to track those down. One of the most recent attempts was published in The Astrophysical Journal Letters by the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) collaboration. While they didn’t find any clear-cut evidence of continuous gravitational waves from merging black hole systems, they did manage to point out plenty of false alarms, and even dispel some myths about candidates we thought actually existed.
Eclipse season is nigh. The first of two eclipse seasons for 2026 kicks off next week on Tuesday, February 17th, with an annular solar eclipse. And while solar eclipses often inspire viewers to journey to the ends of the Earth in order to stand in the shadow of the Moon, this one occurs over a truly remote stretch of the world, in Antarctica.