The third interstellar object detected in our Solar System (3I/ATLAS) has a unique and continually unfolding story to tell of its nature and origin. In a recent paper, scientists from the i4is show how a spacecraft performing a Solar Oberth Manoeuvre (SOM) could intercept 3I/ATLAS to learn its secrets.
Saturn's tiny moon Enceladus, famous for its water geysers, has been revealed as a giant electromagnetic powerhouse whose influence extends over half a million kilometres through the ringed planet's magnetosphere. Analysis of 13 years of Cassini data shows that the 500-kilometre-wide moon creates a lattice-like structure of crisscrossing electromagnetic waves known as Alfvén wings, which bounce between Saturn's ionosphere and the plasma torus surrounding Enceladus's orbit, reaching distances 2,000 times the moon's own radius. The discovery changes our understanding of how small icy moons can influence their giant planetary hosts, with implications for the moons of Jupiter and perhaps even distant exoplanetary systems.
Scientists have discovered a revolutionary way to measure Earth's radiation budget by observing our planet from the Moon. A team of astronomers has shown that lunar observations capture Earth as a complete disk, filtering out local weather noise and revealing planet-scale radiation patterns dominated by spherical harmonic functions, effectively creating a unique "fingerprint" of Earth's outgoing radiation. This Moon-based perspective overcomes fundamental limitations of satellite observations, which struggle to achieve both temporal continuity and spatial consistency, offering a new tool for understanding global climate change with unprecedented clarity.
Designed for versatility, Ariane 6 can adapt to each mission: flying with two boosters for lighter payloads, or four boosters when more power is needed. In its four-booster configuration, Ariane 6 can carry larger and heavier spacecraft into orbit, enabling some of Europe’s most ambitious missions.
Well, I might as well reveal part of my very long list of “best music”. This time I’ll post my choice of the best “songs about aging or dying” for Baby Boomers. These aren’t necessarily all good (I’m not a fan of Mellencamp, for instance), but they’re all notable. And yes, I realize that “Long May You Run” is really about Neil Young’s car (a 1948 Buick Roadmaster hearse he called Mortimer Hearseburg), but it’s still appropriate. Further, some of the songs are about lost love, but all refer to the sadness of passing time.
Father and Son Cat Stevens
Touch of Grey The Grateful Dead
When I’m Sixty-Four The Beatles
Boys of Summer Don Henley
Cherry Bomb John Mellencamp
Long May You Run Stills-Young Band
All Summer Long The Beach Boys
Caroline No The Beach Boys
Nick of Time Bonnie Raitt
When We Was Fab George Harrison
All Those Years Ago George Harrison
Rockin’ Chair The Band
Taxi Harry Chapin
Cat’s in the Cradle Harry Chapin
Old Friends (Bookends) Simon and Garfunkel
Don’t Fear the Reaper Blue Öyster Cult
Wasted on the Way Crosby, Stills & Nash
I welcome readers’ suggestions, and I’ll put up five of the songs that I think are particularly good and underappreciated:
“Boys of Summer” (1984). For some reason this song absolutely brings back my own teenage years, and quite vividly:
“Caroline, No” (1966), by the great Brian Wilson.
“All Those Years Ago” (1981). Nobody seems to remember this song by George Harrison, but it’s not only great, but a moving tribute to his late fellow Beatle, John Lennon. It’s clear that despite their tiffs, Harrison really loved Lennon.
“Taxi” by Harry Chapin (1972). I’m sure this song is long forgotten, but it’s among the very best ones on the list. The “soprano” part is sung by “Big John” Wallace, Chapin’s bassist; everybody thought that the original record used a female voice. You can end the song at 7:31; it just repeats with the lyrics shown.
“Nick of Time” by Bonnie Raitt (1989). I love this song; the tune is excellent, with a good hook, and the words are wonderful:
I am SO tired of people demonizing J. K. Rowling as a transphobe and a bigot without ever having paid attention to what she’s said and written. In fact, she’s sympathetic to trans people, but, like me, thinks that trans rights on occasion clash with the rights of biological women, and that in those cases the rights of natal women can take precedence (this occurs in sports, prisons, and a few other circumstances). And, like Rowling, I have been somewhat demonized for taking a stand identical to hers (I was, for example, recently branded “anti-trans” by the head of our department’s DEI Committee, clearly by someone who has ignored what I’ve written, too).
But I kvetch. This Substack post by Katie Pinns tries to un-demonize Rowling by actually showing us what she wrote. Now you know that won’t change the minds of those like Emma Watson who have parted ways with Rowling on no good grounds: gender ideologues are impervious to the facts. But at least Pinns has Rowling’s statements down in black and white, and I’ve added one important link. Click screenshot to read:
I’ll give some quotes from Pinns (indented), who in turn quotes Rowling (doubly indented). There are several pages’ worth, so check for yourself if you think I’m cherry-picking.
Few public figures attract as much noise as J.K. Rowling. For many people, the controversy around her name has become so thick with slogans, screenshots, and second‑hand outrage that her actual words have been buried under the reaction to them. People repeat that she “hates trans people,” or that women’s crisis centres are “transphobic,” without ever checking what she has actually said.
So this piece goes back to the source. Not the discourse. Not the memes. Her words.
Rowling’s central point is simple: sex is real, and it matters. She has said:
“If sex isn’t real, there’s no same-sex attraction. If sex isn’t real, the lived reality of women globally is erased… It isn’t hate to speak the truth.”
This is the foundation of her position. She argues that biological sex shapes women’s lives, especially in relation to male violence, discrimination, and safeguarding. She also says explicitly that recognising sex does not erase or demean trans people.
Her concern is that if society stops acknowledging sex, women lose the language they need to describe their experiences. That’s not a fringe view; it’s the basis of decades of women’s rights advocacy.
Rowling has repeatedly said she supports trans people’s right to live free from discrimination:
“I respect every trans person’s right to live any way that feels authentic and comfortable to them. I’d march with you if you were discriminated against on the basis of being trans.”
She also describes feeling “kinship” with trans people because both women and trans people are vulnerable to male violence. Her objection is not to trans people themselves, but to the idea that acknowledging sex is inherently hateful.
And, as Pinns notes, Rowling makes these pronouncements not to “erase” or demonize trans people, but to prompt a discussion about clashes of “rights,” as well as about whether there’s a need for affirmative care, including surgery, for people below the age of consent. As Pinns says, “Much of the public anger directed at her is based on claims she never made. Her insistence on correcting the record is part of why she continues to speak.”
There are more quotes from Rowling, and you can read her longer explanations of her views at places like this one. She has, of course, been subject to a multitude of threats of violence, but she’s stood her ground, responding with humor and no small amount of snark, which makes her enemies even madder. Here’s a quote from her sober and revealing essay linked in the first sentence of this paragraph:
Well, I’ve got five reasons for being worried about the new trans activism, and deciding I need to speak up.
Firstly, I have a charitable trust that focuses on alleviating social deprivation in Scotland, with a particular emphasis on women and children. Among other things, my trust supports projects for female prisoners and for survivors of domestic and sexual abuse. I also fund medical research into MS, a disease that behaves very differently in men and women. It’s been clear to me for a while that the new trans activism is having (or is likely to have, if all its demands are met) a significant impact on many of the causes I support, because it’s pushing to erode the legal definition of sex and replace it with gender.
The second reason is that I’m an ex-teacher and the founder of a children’s charity, which gives me an interest in both education and safeguarding. Like many others, I have deep concerns about the effect the trans rights movement is having on both.
The third is that, as a much-banned author, I’m interested in freedom of speech and have publicly defended it, even unto Donald Trump.
The fourth is where things start to get truly personal. I’m concerned about the huge explosion in young women wishing to transition and also about the increasing numbers who seem to be detransitioning (returning to their original sex), because they regret taking steps that have, in some cases, altered their bodies irrevocably, and taken away their fertility. Some say they decided to transition after realising they were same-sex attracted, and that transitioning was partly driven by homophobia, either in society or in their families.
. . . .Which brings me to the fifth reason I’m deeply concerned about the consequences of the current trans activism.
I’ve been in the public eye now for over twenty years and have never talked publicly about being a domestic abuse and sexual assault survivor. This isn’t because I’m ashamed those things happened to me, but because they’re traumatic to revisit and remember. I also feel protective of my daughter from my first marriage. I didn’t want to claim sole ownership of a story that belongs to her, too. However, a short while ago, I asked her how she’d feel if I were publicly honest about that part of my life, and she encouraged me to go ahead.
I’m mentioning these things now not in an attempt to garner sympathy, but out of solidarity with the huge numbers of women who have histories like mine, who’ve been slurred as bigots for having concerns around single-sex spaces.
Finally, I’ll quote Pinns again:
Much of the backlash against Rowling spills over onto women’s crisis centres, rape support services, and safeguarding charities that maintain female-only spaces. These organisations often base their policies on:
– the reality of male violence
– the needs of traumatised women
– legal exemptions that allow single-sex services
– safeguarding obligations
Rowling’s position aligns with these long-standing principles. Calling such services “transphobic” erases the reasons they exist.
Despite the headlines, Rowling has not said that trans people shouldn’t exist, shouldn’t have rights, or are a threat. She has not argued against healthcare for trans adults. She has not advocated discrimination.
As the West starts to realize that it’s unfair for biological men, however they identify, to enter some women’s spaces, or to compete in women’s sports, or that there are dangers in “affirmative care” doled out to adolescents who aren’t of age, I’m hoping that Rowling will no longer be immediately dismissed by ideologues, but that her arguments will be taken seriously and answered.
Even the title of this New Yorker article is dumb: “faith in atheism” is an oxymoron, for a lack of belief in gods is not a “faith” in any meaningful sense. But of course the New Yorker is uber-progressive, which means it’s soft on religion. And this article, recounting Christopher Beha’s journey from Catholicism to atheism and then back to a watery theism, is a typical NYer article: long on history and intellectual references, but short on substance. In the end I think it can be shortened to simply this:
“Atheism in all its forms is a kind of faith, but it doesn’t ground your life by giving it meaning. This is why I became a theist.”
So far as I can determine, that is all, though the article is tricked out with all kinds of agonized assertions as the author finds he cannot “ground his life” on a lack of belief in God. But whoever said he could? Still, it plays well with the progressive New Yorker crowd (same as the NY Times crowd) in being soft on religion and hard on atheism. The new generation of intellectuals needs God, for to them, as to Beha, only a divine being can give meaning to one’s life.
Christopher Beha, a former editor of Harper’s Magazine, is the author of a new book, Why I am Not an Atheist, with the subtitle Confessions of a Skeptical Believer. The NYer piece is taken from that book.
You can read his article for free as it’s been archived. Click below if you want a lame justification for theism:
Beha, reconsidering nonbelief after he gave up Catholicism in college, decided that there were two forms of atheism: a scientific form and a “romantic” form. Quotes from his article are indented below, though the bold headings are mine.
Scientific atheism
Among other things, this reading taught me that atheists do hold beliefs, not just about morals and ethics but about how the world actually is and how humans fit into it. Of course, not all atheists hold the same beliefs—just as not all theists do—but I found that modern atheist belief tends to cluster into two broad traditions.
The most prevalent atheist world view goes by many names—empiricism, positivism, physicalism, naturalism—but the term that best captures the fullness of its present‑day iteration, as I see it, is scientific materialism. Roughly speaking, this view holds that the material world is all that exists, that humans can know this world through sense perception, that the methods of science allow us to convert the raw data of these perceptions into general principles, and that these principles can be both tested and put to practical use by making predictions about future events.
As world views go, scientific materialism has a lot to say for it. It tells us that humans are capable, without any supernatural aid, of coming to understand, and ultimately to master, all of reality. It tells us that the store of human knowledge is constantly increasing and continuously improving our material conditions. To this end, it points to the astonishing human progress that has occurred in the time of science’s reign. And it encourages us to enjoy the fruits of this progress as much as possible, since our life here on earth is the only one we’ll get.
Most people who subscribe to scientific materialism take it to be so obviously correct that it could not be denied by any rational person who truly understood it. But my reading showed me that this world view has its shortcomings. The most basic is perhaps inherent to any world view at all: it rests on a set of principles which often can’t be proven, even by the standards of proof the world view embraces. The general principle that all real knowledge is derived from sense perception of material facts cannot itself be derived from the perception of facts in the world, and thus can’t really be sanctioned by scientific materialism’s own methods. Indeed, no general principle can be. The very legitimacy of deriving general principles from the particulars of experience can never be established from experience without already having the principle in hand.
Of course I don’t give a rat’s patootie if we can’t establish from first principles that we can understand the world through our senses. The answer to that blockheaded objection is that yes, that’s right, but only the scientific method construed broadly (i.e. empirical work with testing or replication) actually WORKS. If you want to establish where typhoid comes from, and then prevent it or cure it, then you must use a secular, empirical method: science.
Now Beha admits that this world view does “work.” But then he says it has problems. For one thing, it doesn’t give you meaning; nor, he adds, does it explain consciousness:
If by “works” one means that it can be put to good use, this is unquestionably so. But, if we mean that it captures within its frame all the notable features of our experience, that’s a different matter. In fact, what materialism can’t adequately capture is experience itself. Consciousness is not material, not publicly available through sense perception, not subject to the kind of observation that scientific materialism takes as the hallmark of knowledge. By the standards of the materialist world view, it simply doesn’t exist. For me, this limitation proved fatal. I spent far too much time within the confines of my mind to accept a world view that told me whatever was going on in there wasn’t real.
Here the man is deeply confused. Of course subjective experience is “real” to the subject, but it’s very hard (“the hard problem”) to figure out how it arises in the brain. And denying that consciousness arises through materialistic processes in the brain (and elsewhere) is just wrong. We know it’s wrong, for we can affect consciousness by material interventions like anesthesia and psychological tricks, so the phenomenon must, unless it comes from God, be “material” in origin. Beha seems perilously close here to Douthat’s claim that because science can’t explain consciousness, there must be a god.
Romantic atheism
Luckily, I’d by then come into contact with the other great family of modern atheist belief, which I eventually came to call romantic idealism. This is the atheism of Nietzsche and Martin Heidegger and their existentialist descendants, which begins in precisely the place where scientific materialism leaves off, with the will of the subjective, conscious agent. At its most extreme, romantic idealism treats each of us as willing our own world into being, creating the reality in which we live. Even when it does not go quite this far, it treats our subjective experience as the proper subject of knowledge, in fact the only thing we can ever be said to know.
Romantic idealism arose in the post‑Enlightenment era, and it grew in opposition to the principles of Enlightenment rationality as much as it did to religious authority. Although atheism is often associated with hyperrationality, this form of it is unapologetically irrational. In place of reason, observation, and scientific study, it valorizes emotion, imagination, and artistic creativity. The ethics of romantic idealism are an ethics of authenticity: the greatest good is not maximizing pleasure and minimizing pain but living in a way that is true to our subjective reality. The movement rejects religious belief not for being empirically false but for being a ready‑made and inherited response to existential problems that we must work out for ourselves. The appeal of this world view—particularly for a young person engaged in just such a working out—should be obvious, and I soon found myself in thrall to it.
Like scientific materialism, romantic idealism does not have a solid foundation in any provable universal truth. But it revels in this condition: it is the lack of any such foundation that makes it possible for each of us to construct our own truth. This relativism carries clear dangers. Since the time of Locke, empiricism has been closely linked with political liberalism, whereas romantic idealism is associated with rather darker political forces. Jean-Jacques Rousseau, one of the founders of Romanticism, was a great inspiration for the French Revolution’s Reign of Terror. He argued that liberalism’s supposed universal rights were covers for bourgeois self-interest. This argument was later developed at great length by Nietzsche, one of several thinkers in this tradition who inspired the rise of fascism.
But romantic atheism also fails to give us “meaning,” and Beha desperately wants and needs meaning!
A more basic problem with romantic idealism occurs on the personal level: building meaning from scratch turns out to be an incredibly difficult task. The romantic-idealist approach is fraught with fear and trembling, a fact it doesn’t deny. It is not a route to happiness; indeed, it seems to hold the goal of happiness in contempt.
Once again we see Beha desperately looking for a world view that gives his life meaning—and happiness. That much is clear from not only the above, but from other stuff.
Beha wants “meaning,” and that meaning must come from faith (some quotes):
Anyway, I wasn’t really looking for practical guidance. To ask “How am I to live?” is to inquire as to not just what is right but what is good. It is to ask not just “What should I do?” but “How should I be?” The most generous interpretation of the New Atheist view on this question is that people ought to have the freedom to decide for themselves. On that, I agreed completely, but that left me right where I’d started, still in need of an answer.
. . .After nearly twenty years of searching unsuccessfully for a livable atheist world view, I began, in my mid-thirties, to entertain the possibility that atheism itself might be part of the problem. There were many steps from here to my eventual return to robust belief, but I started with the notion that for me the authentic life might be one of faith—one that recognized the existence of both the external material world and the internal ideational world and sought to reconcile them, and one that accepted an absolute foundation to things and attempted to understand, in some provisional and imperfect way, the nature of this foundation and what it wanted from me.
I’m not sure how “faith”—Beha is curiously reticent to tell us what he actually believes—is supposed to provide us with an “absolute foundation”, unless you become a traditional theist who thinks that God interacts with you personally and that it is this God that gives your life meaning. But he won’t say that in clear, explicit terms. One hallmark of the new “liberal” religion is that it’s both fuzzy and slippery.
Beha goes on to argue that “liberals” (aka people who don’t buy Trump) adhere to both forms of atheism, but that, in the end, grounding not just life but also society requires theism, for theism is our only source of “rights”:
Meanwhile, the failure of these traditions to respond adequately to the challenge is bound up with the problem identified by their earliest proponents: they have a very hard time articulating their foundational justification. When liberalism runs smoothly, it does a remarkable job delivering the goods it promises. For most people, this is a sufficient achievement to quiet any worries about its philosophical underpinnings. But when many people within liberal societies do not feel that the system is working, when the practical case for liberalism comes into question, secular liberals don’t have much else to go on.
. . .Locke had the empiricist’s healthy suspicion that we could never have metaphysical certainty about what the Creator’s will was, which meant that no person should impose his answer to that question on another. It is for these reasons that faith must be treated as a matter of personal conscience, but also more generally that a regime grounded in a social contract must be one that respects individual freedoms. Our status as creatures of God confers on us certain rights that can’t be handed over as part of the social contract, rights that are at once natural and inalienable.
“Our status as creatures of God”? How does he know there is a God? Is it because science can’t explain emotions and other subjective experiences—that we don’t understand consciousness? In the end, Beha apparently thinks there’s a God because it makes him feel better, and gives his life meaning.
Well, good for him! But there are plenty of us who derive “meaning” from doing what we find fulfilling and joyful (see this interesting post and thread). I, for one, never pondered the question “What must I do to give my life meaning?” That meaning arose, as it does for many of us, as a post facto rationalization of doing what we found to be fulfilling.
At any rate, this is a curiously anodyne essay, absolutely personal and not generalizable to the rest of humanity. It is the story of a journey, but one that ends with embracing a god for which there’s no evidence. Excuse me if I can’t follow that path.
*************
Beha, clearly flogging his newfound theism, has a guest essay in the Feb. 11 NYT, “My conversion to skeptical belief” (archived here), which emphasizes that his beliefs are inextricably intertwined with doubt, and so he repeats what many believers have said before. An example:
In the face of this I attempt — with varying degrees of success at varying times — to take a page from Montaigne’s book and embrace skeptical belief. I’m well aware that religion has often served as precisely that “one great truth” that people are punished for refusing to accept. But it has also served as an expression of the fundamental mystery at the heart of reality and the radical limitations of human understanding. It is a way of living with skepticism.
What does this mean in practice? Embracing skeptical belief does not mean believing things without “really” believing them. It means understanding your beliefs as limited, contingent and fallible, recognizing that they can’t be proved correct, that someone else’s refusal to come around to them does not indicate stupidity or obstinacy or bad faith.
Similarly, a skeptical believer recognizes doubt as an essential component of belief, rather than its opposite. To a skeptical believer, the great mark of sincerity is the extent to which you attempt to live out your beliefs in your own life despite your own doubts, not the extent to which you silence those doubts or the doubts of others.
. . . To push ahead of someone on the train, to refuse a dollar to the woman selling candy with a baby on her back, to make a snarky remark at the register about my misunderstood coffee order, all while I have ashes on my head, would announce to anyone who cared to notice the disjunction between my supposed beliefs and my life in the world.
What I try instead to do on this day is simply meet each choice I face with my fallible and limited beliefs, and respond to that choice in the way those beliefs actually commend.
Of course the worldview of humanism could yield the same results, except you needn’t ground your acts and beliefs in a Sky Daddy. Why must actions be somehow grounded in the supernatural instead of in a philosophy that you should be kind and helpful to your fellow humans?
h/t Barry
After five years of development and a nail-biting launch from Antarctica, the PUEO experiment has completed a 23-day balloon flight at the edge of space, hunting for some of the most energetic particles in the universe. The instrument flew at 120,000 feet above Antarctica, using the entire continent as a detector to search for ultra-high-energy neutrinos, elusive particles that could reveal secrets about the universe’s most violent events. Now safely back on the ice with 50-60 terabytes of data, scientists are preparing to search through the results to see if they’ve caught these messengers from distant galaxies.
Astronomers have solved the mystery of a star that dimmed dramatically for nearly 200 days, one of the longest stellar dimming events ever recorded. The culprit appears to be either a brown dwarf or a super-Jupiter with an enormous ring system, creating a giant saucer-like structure that blocked 97% of the star’s light as it passed in front. This rare alignment offers scientists a unique opportunity to study planetary-scale ring systems far beyond our Solar System.
New research has revealed that Mars’ most recent volcanoes weren’t formed by simple, one-off eruptions as scientists previously thought. Instead, these volcanic systems evolved over millions of years, fed by complex underground magma chambers that changed and developed over time. By studying surface features and mineral signatures from orbit, researchers have pieced together a far more intricate volcanic story than anyone expected.
A recent study, led by the Center for Astrobiology (CAB), CSIC-INTA and using modelling techniques developed at the University of Oxford, has uncovered an unprecedented richness of small organic molecules in the deeply obscured nucleus of a nearby galaxy, thanks to observations made with the James Webb Space Telescope (JWST). The work, published in Nature Astronomy, provides new insights into how complex organic molecules and carbon are processed in some of the most extreme environments in the Universe.
On Feb. 11th, China successfully conducted a low-altitude demonstration and verification flight test of the Long March-10 rocket and a maximum dynamic pressure escape test of the Mengzhou crewed spaceship system. Credit: Xinhua
In 2020 Joe Biden became the first Democratic nominee in 36 years without a degree from the Ivy League. Obama, before him, filled no less than two-thirds of all cabinet positions with Ivy League graduates, over half of whom were drawn from either Harvard or Yale.1 In Congress today, 95 percent of House members and 100 percent of senators are college educated.
According to a recent study published in Nature, 54 percent of “high achievers” across a broad range of fields (law, science, art, business, and politics) hold degrees from the 34 most elite universities in the country.2 The sociologist Lauren Rivera, studying top firms in finance, consulting, and law, found that recruiters are jonesing for applicants from prestigious academic institutions: typically targeting just three to five “core” universities in their hiring efforts (Harvard, Yale, Princeton, Stanford, and MIT, the usual suspects), then identifying five to fifteen additional second-tier options, such as Berkeley, Amherst, and Duke, from which they will more tentatively accept resumés.3 Everyone else almost certainly never even gets a reply email. Why? Because, as one lawyer explained the strategy to Rivera, “Number one people go to number one schools.”
“If destruction be our lot, we must ourselves be its author and finisher.” —Abraham Lincoln

Given this new American caste system, it’s no surprise that 63 percent of Americans think that “experts in this country don’t understand the lives of people like me,” or that 69 percent feel the “political and economic elite don’t care about hardworking people.”4 And, I suggest, they’re not wrong. A culture that sanctifies college as the gateway to full citizenship corrodes, over time, the foundations of democratic life. It devalues work that doesn’t come with a degree, licenses contempt for those not formally educated, and locks the working class out of positions of power. The result isn’t just underrepresentation; it’s resentment. As the journalist David Goodhart writes, “We now have a single route into a single dominant cognitive class”; where “an enormous social vacuum cleaner has sucked up status from manual occupations, even skilled ones,” and appropriated it to white-collar jobs, even low-level ones, in “prosperous metropolitan centers and university towns”; and where broad civic contribution has been replaced with narrow intellectual consensus.5 The result is a backlash not against education, but against the assumption that only one kind of education counts.
“At a time when racism and sexism are out of favor,” writes Harvard philosopher Michael Sandel, “credentialism is the last acceptable prejudice.”6 In a cross-national study conducted in the United States, Britain, the Netherlands, and Belgium, a team of social psychologists led by Toon Kuppens found that the college-educated class had a greater bias against less educated people than against other disfavored groups.7 In a list that included Muslims, poor people, obese people, disabled people, and the working class, “stupid people” were the most disliked. Moreover, the researchers found that elites are unembarrassed by the prejudice; unlike homophobia or classism, it isn’t hidden, hedged, or softened: it’s worn openly, with an air of self-congratulation. As the Swedish political scientist Bo Rothstein observes, “The more than 150-year-old alliance between the industrial working class and what one might call the intellectual-cultural Left is over.”8
Today we are living through a strange time in American life in which the numbers have declared victory. By most standard economic measures—employment, wages, even household net worth—the working class is better off than it was a generation ago.9, 10, 11 The average elevator mechanic gets paid over $100,000 per year12; master plumbers can make more than double that.13 Even in Mississippi, our country’s poorest state, workers see higher average wages than in Germany, Britain, or Canada.14
It is, for working-class Americans today, the best of times, objectively—and the worst of times, subjectively. This is not because the spreadsheets are wrong, but because we fail to count the things that history records in tone, not totals: mood, myth, and cultural resolve.
The Service Economy

According to the most recent data available from the United States Bureau of Labor Statistics, nearly four out of five Americans work in the service sector.15 For most Americans in most states, that means retail, fast-food, or some other smile-for-hire job located at the end of a check-out line.16 It’s a kind of work where labor isn’t just accomplished, it’s seen—performed under the soft surveillance of the American customer. So, beneath inflation charts and unemployment rates, if you want to understand the feelings side of the postindustrial economy—you might start with tipping.
It is, today, perhaps our most American habit—tipping for service; whether it be good, bad, or not provided. In restaurants, hair salons, and hotel lobbies, Americans tip over a hundred billion dollars a year—indeed, more than any other country on earth, and more than all of them combined.17 We tip cab drivers and pool cleaners and dog groomers and coat checkers. We tip the doorman on the way in, the bellhop on the way up, and the concierge on the way out. Americans tip so much that, as one European put it—the whole “approach [has become] completely deranged and out of control.”18
However, it wasn’t always this way. In fact, for much of the early 20th century, it was Americans who mocked Europeans for tipping—seeing it as smug, corrupt, and born of feudal etiquette.19 States such as Iowa, South Carolina, and Tennessee—among others—outlawed the practice entirely20; and wherever it remained legal, businesses proudly posted signs that read “No Tipping Allowed.”21 Some hotels even installed “servidors”—a two-way drawer that opened from hallway and room—so staff could deliver laundry without being seen, and without being tipped.22 As the author William R. Scott, in a book-length critique, put it in 1916:
In an aristocracy a waiter may accept a tip and be servile without violating the ideals of the system. In the American democracy to be servile is incompatible with citizenship … Every tip given in the United States is a blow at our experiment in democracy … Tipping is the price of pride. It is what one American is willing to pay to induce another American to acknowledge inferiority.

Somewhere along the way, however—somewhere between the Marshall Plan and the first McDonald’s Happy Meal—the parts reversed; and we became the punchline. It became the Americans who tipped like royals—and the Europeans who saw it as such.
It was during this time that the gesture was institutionalized—not out of custom or conscience—but because the Pullman Company, the National Restaurant Association, and eventually big tech sold it as part of the deal.23 They lobbied Congress, added tip lines to receipts, and made feudalism feel American—if you’re the one tipping.24 Because on the other end—where the customer is always right—yes, the tip is now expected and yes, it is now appreciated; but gratuity has never been the same thing as respect, especially not when, for most working-class Americans, IHOP has become the least humiliating option.
The Status Economy

We are signaling-obsessed, hierarchy-calibrated social apes. All of us, according to author Will Storr in The Status Game, walk around like buzzed-up antennas—attuned to the faintest frequency of admiration or disdain, gossip, or snicker.25 For most of human history, it wasn’t guns, germs, or steel that mattered most; it was access to the cooperative networks and high-yield alliances of a species where insiders eat first and the gates are closely guarded. And so what governs our decisions—above all else, even when no one’s watching—is the paranoia of social scrutiny. In other words, it’s a cost-benefit analysis where the material outcome barely matters and utility is downstream of reputational impact.
Absent this understanding of human behavior, very little of it makes sense; this was a core theme in the work of the early 20th-century economist Thorstein Veblen, whose concept of “conspicuous consumption” describes how people often consume products they don’t need—or even want—in order to flaunt status and social class.26 Luxury watches that tell time worse and minimalist chairs you can’t sit on are purchases where the high price is the point.
Of course, it is no major insight to say that people buy things to show off. The anthropological record is rich with lavish feasts and displays of abundance. The famous “potlatch ceremonies” of Pacific Northwest Indian tribes, for example, involved burning immense stores of wealth—copper shields, hand-carved canoes that took years to build, blankets, oil, and food—generations of accumulated capital, in a single afternoon; just to signal status.27
But what about meditating, carrying around a well-worn copy of The New Yorker in your back pocket, or believing in climate change? Veblen’s brilliance was seeing that even our quietest preferences are currency in a market economy of social prestige. As British philosopher Dan Williams puts it:
Much cognition is competitive and conspicuous. People strive to show off their intelligence, knowledge, and wisdom. They compete to win attention and recognition for making novel discoveries or producing rationalizations of what others want to believe. They often reason not to figure out the truth but to persuade and manage their reputation. They often form beliefs not to acquire knowledge but to signal their impressive qualities and loyalties.

It’s the kind of signaling that thrives in what sociologists call “post-material economies” such as contemporary America.28 Because in a society maxed out on comfort—where even the ultrawealthy can’t buy a better Netflix or a softer couch—the only lines left to draw are ideological; and social distinction becomes the new class war. The rub, however, is that unlike the peacock’s tail—a hard-to-fake signal, metabolically costly, and policed by survival—immaterial prestige hierarchies are cultural inventions; often arbitrary, often performative, and almost always enforced from the top down. In other words, social prestige isn’t earned—it’s distributed by those who already have it. As social scientists Johnston and Baumann described in a 2007 paper:
The dominant classes affirm their high social status through consumption of cultural forms consecrated by institutions with cultural authority. Through family socialization and formal education, class‑bound tastes for legitimate culture develop alongside aversions for unrefined, illegitimate, or popular culture.29

The elite don’t just consume goods. They consecrate tastes, turning culture into a class barrier such that status is socially assigned rather than materially demonstrated. French sociologist Pierre Bourdieu called it symbolic capital—where opinions double as vocabulary tests and entry fees for membership into the aristocracy.30 As Princeton’s Shamus Khan explains, “Culture is a resource used by elites to recognize one another and distribute opportunities on the basis of the display of appropriate attributes.”31
Observing today’s ruling class, social psychologist Rob Henderson has coined the term “luxury beliefs,” arguing that the experts, the celebrities, and the institutions are all fluent in the same woke-speak, and by their material abundance can afford to focus almost exclusively on social justice issues that, ensconced in their gated communities, have no effect on their own luxurious lives (nor those of the people they profess to be helping).32
The words turn and turn again—testing for status, enforcing the pecking order.33 And now, just as working-class Americans born in the industrial economy once rejected cash tips—those born in the culture-capital economy don’t want the tip either. They want respect. The redneck reluctance to simply “trust the experts” or pronounce it “people of color” instead of “colored people” isn’t about bigotry or Bible verses or disinformation—it’s about refusing the role of grateful recipient in someone else’s moral theater. It’s not anti-intellectualism or anti-love and kindness. It’s anti-elitism.
How is it that a born-rich multibillionaire has become the standard-bearer for the working class? It’s because his favorite food is McDonald’s; and to Nancy Pelosi, George Clooney, and my high school guidance counselor—Trump is trash. They see him the same way they see trailer park America—as tacky, ignorant, and disposable; always on the lowborn side of the tip. It’s a feeling well-known in union organizing circles.34 That when people are angry, it’s rarely about money. It’s about being looked down on.
A New Nationalism

Culture can often be hard to think about because it doesn’t exist in the world of objects—it exists in the world as a perceptual experience. It has no mass, no edge, no location. It’s not made of things; it’s made of meanings—real, but not tangible.
The cultural backlash hypothesis, the status threat hypothesis, the social isolation hypothesis, the political alienation hypothesis, the nostalgic deprivation hypothesis—a growing body of scholarship has emerged to name and quantify the immaterial contours of twenty-first century populist discontent; all circling the drain of an old, half-remembered truth.35, 36, 37, 38, 39
For most of history, kings, philosophers, and statesmen took seriously the idea that civilizations depend on symbolic cohesion—on rituals, traditions, and agreed-upon fictions capable of domesticating our most socially inconvenient biological biases. They understood, whether by insight or instinct, that there’s something important about ceremony and uniform and national character. That propaganda isn’t all bad. That done right, good slogans make good citizens. And good citizens make great nations. As Gidron and Hall put it in a recent paper:
[I]ssues of social integration [must be taken] more seriously in studies of comparative political behavior. Such issues figured prominently in the work of an earlier era … but they fell out of fashion as decades of prosperity seemed to cement social integration.40

In the old economy it was simple. You had the rich, who lunched at steakhouses and voted Republican; the working class, who labored in factories and voted Democrat; and in between, the mass suburban middle class. When it came, the conflict was clear—members of the working class joining forces with progressive intellectuals to oppose the moneyed elite. Yet every once in a while, a new, revolutionary class of citizens comes along and scrambles the whole social order. In the late 20th century it was the scholastic king—and the new culture-laureate class. He is not merely an academic; he is society’s central planner, a warden of elite passage, and the face of the new American aristocracy. As The New York Times columnist David Brooks put it:
If our old class structure was like a layer cake—rich, middle, and poor—the creative class is like a bowling ball that was dropped from a great height onto that cake. Chunks splattered everywhere.41

Outsourcing made economic sense, globalization was in large part inevitable, and cheap goods are always good politics—sure, fine. But for over fifty years now, neither political party has been able to solve the social problem of a postindustrial economy. And no American president has been able to tell a story good enough to replace the one previous generations called true. As sociologist Arlie Hochschild explained in a recent interview with The New York Times:
We keep looking for real policies. That’s not the thing. Trump offers a veneer of policies and a story, and we’ve got to tune in to the effect of that story on people who feel like the world’s melting and sinking … Because whatever the policies, these voters are following the story and the emotional payoff of that anti-shaming ritual. So we have to stop the story, reverse the story: Nobody stole your pride, we’re restoring it together.42

In the same way philanthropy never solves economic inequality, bigger and better information tips will never win the culture war—because it’s not about being rich or poor, stupid or smart; it’s about better than or worse than. And the only thing that can make a rich person feel worse than a poor person—or a smart person worse than a stupid one—is a national story written by poor people and stupid people too. It’s the sort of new nationalism that, in the past, has required several interconnected efforts.
The Bottom Line

Robert F. Kennedy, in March of 1968, in a speech at the University of Kansas, noted: “The gross national product can tell us everything about America except why we are proud that we are Americans.”43
Rubber in Akron. Meat in Chicago. Coal in Scranton. Steel in Gary. It used to be you knew a city by what it made—how it sounded, how it smelled. In 1950 Detroit was the richest city in the world—that’s right, the entire world.44 On Zug Island, they used to make the whole car, start to finish—iron ore mined and smelted on one end, parts shaped and assembled along the way, and a new Ford rolled off the line at the other—no imports, no one else. It was vertical integration—of work, of community, of pride.
But by the 1970s a new day had dawned, the old days were gone, and the unraveling had begun. Over half the manufacturing jobs moved elsewhere, a quarter of the population went too; and with whole neighborhoods left to rot, Detroit, once called “the Paris of the Midwest,” became one of the deadliest cities in the country.45, 46 From 1965 to 1974, homicides quintupled47; the central business district earned the name “zone of decay”; and businesses began installing bulletproof glass—floor to ceiling—to protect storefront clerks.
Just like that—two short decades transformed America’s motor city into America’s murder city. And burnt, bled, and bankrupt, the once shining example rolled out perhaps the saddest, most pitiful ad campaign in American history: “Say Nice Things About Detroit.”48
The bottom line is this. Every new economy produces different winners and losers—it’s just the way it is. What happened in Detroit was, in many ways, what was expected. But when the losses came—when the bottom fell out for the millions of working-class Americans still there, still trying—it was treated not as a national obligation but as an unfortunate footnote to progress. Detroiters were told to retrain, relocate, find a way to adjust—and when they failed, just like the people still living in Akron, Scranton, and Gary, they were humiliated, cast as mascots of ignorance and failure. The problem is that the ignorant and the failed far outnumber those who aren’t. And so, as Franklin Roosevelt said, it’s not “whether we add more to the abundance of those who have much” that matters—“it is whether we provide enough for those who have too little.”
Because when the empire falls—when the American experiment joins the long ledger of civilizations past—it won’t be at the hands of China or Russia or Al Qaeda or anyone else. We are the richest nation in the history of the world; no other society has ever wielded as much global influence; not even a coalition of all the world’s armies could best ours. “If destruction be our lot,” wrote a 28-year-old Abraham Lincoln, “we must ourselves be its author and finisher.”49 As “a nation of freemen, we must live through all time, or die by suicide.”
And if it comes to that—if we choose death—it won’t be about free trade or wages or unemployment rates any more than it was about taxes in 1776. Once again, it will be about respect.
In this Free Press article, Steve Pinker and Marian Tupy (the latter identified as “the founder and editor of HumanProgress.org, a senior fellow at the Cato Institute, and co-author of Superabundance”) once again recount the undeniable progress that humanity has made over the past six or seven centuries. The progress described here will be familiar to you if you’ve read Pinker’s two big books, Better Angels and Enlightenment Now: the progress has been in health, longevity, reduced poverty, better nutrition, less chance of violent death, and almost all indices of “well being”.
Click to read (if you have a subscription):
I’m not sure why Pinker is constantly attacked by people for touting progress, as the data are irrefutable, but I guess there’s a subgroup of “progressive” historians (and perhaps conservative ones) who like to aver that we’ve made little progress since the Middle Ages. Indeed, perhaps we’ve even regressed, and we’d be better off living in the Middle Ages. This anti-Whiggish view is usually espoused by the religious, who say that the waning of religion has impoverished modern life. Perhaps leftish people don’t like the notion that we’re making progress (e.g., some say we’re worse off in racial relations now than during Jim Crow days), while rightish ones don’t like the palpable loss of faith among people in the West.
A few quotes:
Last month at Yale, the influential political blogger Curtis Yarvin, in a debate against Free Press contributor Jed Rubenfeld, argued that America ought to “end the democratic experiment”—and establish a monarchy. Yarvin has noted that Donald Trump is “biologically suited” to be America’s monarch. The ideas may sound extreme, but they have been influential. J.D. Vance describes Yarvin as “a friend,” and has cited his work. And Yarvin is part of a family of movements—known as the Dark Enlightenment, Techno-authoritarianism, and Neo-Reaction (NRx)—that reject the entire family of Enlightenment values.
Meanwhile, theocracy is making a comeback, in movements known as theoconservatism, Christian Nationalism, and National Conservatism. The “National Conservatism Statement of Principles,” for example, declares that “where a Christian majority exists, public life should be rooted in Christianity and its moral vision, which should be honored by the state and other institutions both public and private.” The list of signatories is a lookbook of influential conservatives, including Charlie Kirk, Peter Thiel, and Trump administration insiders Michael Anton and Russell Vought—as well as our fellow Free Press contributors Christopher Rufo and Rod Dreher.
The latter, a friend of the vice president, has said elsewhere that the West will not “recover until and unless we become re-enchanted and seek a form of Christianity, and indeed of Judaism, that is more mystical, that valorizes this direct perception of the Holy Spirit, of holiness, and of transcendence.”
. . . Of course, humanity has already tried monarchy and theocracy—during the Middle Ages—and sure enough, some of the new reactionaries are saying that those times were not so bad after all. Dreher writes admiringly: “In the mind of medieval Christendom, the spirit world and the material world penetrated each other. . . . Men construed reality in a way that empowered them to harmonize everything conceptually and find meaning amid the chaos.”
Other influential conservatives go further in justifying medieval hierarchies. On his eponymous show, Tucker Carlson recently declared: “Feudalism is so much better than what we have now. Because at least in feudalism, the leader is vested in the prosperity of the people he rules.”
One of the themes of this article is how religion has in fact been an impediment to progress, and this seems to be the strongest attack on religion I’ve seen yet from Pinker (I haven’t read Tupy before). Perhaps Steve is preparing for his debate with Ross Douthat later this month (stay tuned), which will be about God. Douthat’s new book is Believe: Why Everyone Should Be Religious.
Here’s the money quote about progress, which I’ve put in bold:
It’s said that the best explanation for the good old days is a bad memory, and the historical amnesia of the romanticizers of medieval Christendom is near-complete. Among the blessings of modernity is an Everest of data about life in the past, painstakingly collected by economic historians from original sources over many decades. This quantitative scholarship circumvents fruitless back-and-forth about whether the Dark Ages were really all that dark: We can go to the numbers.
I won’t go through the numbers, as you probably know them, but they’re impressive. Here are just a few facts:
Some numbers can shake us out of this spoiled complacency. (For sources, see our respective books Ten Global Trends Every Smart Person Should Know and Enlightenment Now.) In 1800, the European life expectancy was 33 years; today, it is 79 years—which means that we have been granted not just extra life, but an extra life. Much of that gift came from leaps in prosperity that spared the lives of children. Before the turn of the 20th century, a third to a half of European children perished before their 5th birthday. Today that fate befalls three-tenths of one percent. Even the poorest countries today lose a fraction of the children that Europe did until recently. If being spared the agony of losing a child is not “meaningful,” what is?
Do people really want to go back to medieval times if they lose, on average, 46 years of life?
But the other theme of the piece is morality. In short, religious morality impedes human well-being by not giving people an impetus to help humanity, but rather telling them to live by this or that religious dictum that will please their God. I agree with the harm to behavior done by religion, but have taken issue not with Pinker and Tupy’s idea that morality can be humanistic (which it can be), but with their claim that humanistic morality is objective rather than subjective. And they seem certain about this:
Our moral purpose, then, is to use knowledge and sympathy to reduce suffering and enhance flourishing: health, freedom, peace, knowledge, beauty, social connection.
. . .The Enlightenment project of grounding morality in reason and well-being left us with a coherent fabric of arguments against the brutality and injustice that had been ubiquitous in human history. These arguments became the foundation of civilized society.
A partial list: Kant’s categorical imperative and his practical prescriptions for peace. The American Founders’ analyses of tyranny, democracy, and fundamental rights. Bentham’s cases against cruelty to animals and the persecution of homosexuals. Astell’s brief against the oppression of women. Voltaire’s arguments against religious persecution. Montesquieu’s case against slavery. Beccaria’s arguments against judicial torture. Rousseau’s case against harsh treatment of children.
In contrast to the Enlightenment’s exaltation of universal well-being, the morality of holy scriptures was dubious at best.
Crucially, these moral conclusions were based on reasons. As Plato pointed out 2,300 years ago, morality can’t be grounded in divine edicts. If a commandment itself has no moral justification, why should we obey it? If it does, why not just appeal to the justification itself?
Such justification is not hard to find. All of us claim a basic right to our own well-being. If we were not alive, healthy, nourished, educated, and embedded in a community, we could not deliberate about morality (or anything else) in the first place. And because we are embedded in a community, where people can affect each other’s well-being, we can’t stop at this basic claim. None of us can coherently demand these conditions for ourselves without granting them to others. I can’t say “I’m allowed to hurt you, but you’re not allowed to hurt me, because I’m me and you’re not,” and expect to be taken seriously.
Now I agree that society will run better if people conduct themselves in a manner that won’t injure other people. But to say that morality is objective, that the moral act is the one that increases “well-being”, is to buy into the fallacies that beset Sam Harris’s identical theory broached in his book The Moral Landscape. While increasing well-being does jibe with our usual notions of what’s moral, there are problems. I’ve described some of these in a previous post called “The absence of objective morality”, asserting that, in the end, no morality is objective; all forms of morality are based on subjective preferences. I’ll quote myself here:
It’s clear that empirical observation can inform moral statements. If you think that it’s okay to kick a dog because it doesn’t mind it, well, just try kicking a dog. But in the end, saying whether it’s right or wrong to do things depends on one’s preferences. True, most people agree on their preferences, and their concept of morality by and large agrees with Sam’s consequentialist view that what is the “right” thing to do is what maximizes “well being”. But that is only one criterion for “rightness”, and others, like deontologists such as Kant, don’t agree with that utilitarian concept. And of course people disagree violently about things like abortion—and many other moral issues.
One problem with Sam’s theory, or any utilitarian theory of morality, is how to judge “well being”. There are different forms of well being, even in a given moral situation, and how do you weigh them off against one another? There is no common currency of well being, though we know that some things, like torturing or killing someone without reason, clearly do not increase the well being of either that person or of society. Yet there is no objective way to weigh one form of well being against another. Abortion is one such situation: one weighs the well being of the fetus, which will develop into a sentient human, against that of the mother, who presumably doesn’t want to have the baby.
But to me, the real killer of objective morality is the issue of animal rights—an issue that I don’t see as resolvable, at least in a utilitarian way. Is it moral to do experiments on primates to test human vaccines and drugs? If so, how many monkeys can you put in captivity and torture before it becomes wrong? Is it wrong to keep lab animals captive just to answer a scientific question with no conceivable bearing on human welfare, but is just a matter of curiosity? Is it moral to eat meat? Answering questions about animal rights involves, if you’re a Harris-ian utilitarian, being able to assess the well being of animals, something that seems impossible. We do not know what it is like to be a bat. We have no idea whether any creatures value their own lives, and which creatures feel pain (some surely do).
But in the end, trying to find a truly factual answer to the statement, “Is it immoral for humans to eat meat?” or “is abortion wrong?”, or “is capital punishment wrong?” seems a futile effort. You can say that eating meat contributes to deforestation and global warming, and that’s true, but that doesn’t answer the question, for you have to then decide whether those effects are “immoral”. Even deciding whether to be a “well being” utilitarian is a choice. You might instead be a deontologist, adhering to a rule-based and not consequence-based morality.
You can make a rule that “anybody eating meat is acting immorally,” but on what do you base that statement? If you respond that “animals feel pain and it’s wrong to kill them,” someone might respond that “yes, but I get a lot of pleasure from eating meat.” How can you objectively weigh these positions? You can say that culinary enjoyment is a lower goal than animal welfare, but again, that’s a subjective judgment.
By saying I don’t accept the idea of moral claims representing “facts”, I’m not trying to promote nihilism. We need a moral code, if for nothing else than to act as a form of social glue and as a social contract. Without it, society would degenerate into a lawless and criminal enterprise—indeed, the idea of crime and punishment would vanish. All I’m arguing is that such claims rest at bottom on preference alone. It’s generally a good thing that evolution has endowed most of us with a similar set of moral preferences. I hasten to add, though, that the feelings evolution has instilled in us aren’t necessarily ones we should incorporate into morality, as some of them (widespread xenophobia, for instance) are outmoded in modern society. Others, like caring for one’s children, are good things to do.
In the end, I agree with Hume that there’s no way to derive an “ought” from an “is”. “Oughts” have their own sources, while “is”s may represent in part our evolved behaviors, derived from living in small groups of hunter-gatherers. But that doesn’t make them evolutionary “oughts.”
To abortion, meat-eating, and animal rights we can now add “assisted dying.” I favor it because I think it reduces suffering, but others say that it will actually increase net suffering by killing off people who could eventually be happy, or create societies in which people are sacrificed at will. And don’t forget Hiroshima and Nagasaki. If, as the authors claim, “None of us can coherently demand these conditions [of well being] for ourselves without granting them to others” then we open up a whole can of worms, especially involving war. In the end, saying that “well being” is a guide to objective morality begs the question of ethics: we are supposed to do X because it is more moral, and that’s because it increases “well being”. But why is increasing well being always more moral? If it’s by definition, then that really is begging the question.
I’m clearly not a philosopher, but I don’t see “increasing well being” as an objective guide to what’s moral. It is a preference, based on the subjective choice that a society with more “well being” is the one we should prefer. That is usually true, I think, but not always, and runs into substantial difficulties when you try to do the moral calculus in given situations.
Otherwise, I look forward to Steve’s debate with Douthat in two weeks, which should be great fun, even if nobody changes their minds about God.
Okay, so as this website slowly circles the drain, we’re still going to have cats on Caturday, and three items to boot.
First on deck is Larry, the Chief Mouser to the Cabinet Office at 10 Downing Street; he just turned 19, and has spent 15 of those years in the service of the Prime Minister. He’s in remarkably good shape for such an old cat, and here’s a two-minute video, in his own words, recounting how a careless photographer nearly tripped over him. Fortunately, Larry skittered away, perhaps losing half a life or so:
********************
From Bored Panda we have another large selection of cat memes. I’ll choose a few for your delectation. Click the screenshot to read; the intro says this:
Last year, the estimated expenses of owning a cat were between $830 and $3,000. Clearly, no expense is spared for cat owners when it comes to their beloved fluffballs.
Bored Panda loves cats too. That’s why we are blessing you with a collection of wholesome and cute cat memes, courtesy of the “happycat318” Instagram page. Check out the times kitties cracked up their owners with some diabolical shenanigans!
More info: Instagram [the happycat318 Instagram page], the source of all the memes:
. . . And this is a true cat lover:
********************
A persistent moggy described by the UPI; click on screenshot to read:
The tail:
A cat escaped from his owners’ camper during a stop at a gas station in Spain and reappeared months later less than a mile from their home in France.
Patrick and Evelyne Sire, who live in Olonzac, in the Hérault region of France, said their cat, Filou, apparently jumped out of an open window in their camper during an Aug. 9, 2025, stop at a gas station in Maçanet de la Selva, Spain, located near the French border about 155 miles from home.
Patrick Sire said Filou’s absence wasn’t noticed until the next morning.
Sire said he returned to the gas station twice in the ensuing days and weeks, but no one in the area had seen any signs of the missing feline.
The couple said they started to give up hope as the months passed, but they received a call Jan. 9 from a resident in Homps, less than a mile from their home, reporting Filou had been found.
The woman said she had been feeding the cat outdoors since December, and noticed he was very thin and appeared to be coughing. She took the feline to a local veterinarian, where a microchip scan identified him as Filou.
“Filou traveled all that way to get to us. But how did he do it? Did he follow the highway? Did he go through towns? Did he follow the rivers?” Patrick Sire told France3 News. “We’ll never know.”
Here’s a video in French, which shows the GPS cat and his staff. If you know a bit of French you can probably understand it, but if not you can still see how happy the staff is!:
********************
Lagniappe: A cat makes a deposit:
Happy #Caturday pic.twitter.com/nb4NfkS7CJ
— Larry the Cat (@Number10cat) January 24, 2026
A sneaky and lazy moggy and its exercise wheel:
. . . and a woman talks to her cat, but inadvertently insults it:
h/t: Ginger K., Simon, Merilee
These are the last photos I have, and I’ve gathered singletons in a potpourri of photos. Please send me any good wildlife photos you have—otherwise there will be a LACUNA tomorrow. Captions are indented, and you can enlarge the photos by clicking on them.
From Pratyaydipta Rudra in Oklahoma:
This is a Pine Squirrel [Tamiasciurus sp.], photographed in Rocky Mountain National Park, CO.
From Adrian:
Here’s a picture of a European Pine Marten (Martes martes) from the shores of Loch Duich, near the Isle of Skye, Scotland:
From Guy:
Taken in Lake Saint Clair Metropark in Michigan a few years back by my 12-year-old son Nolan at a bird-banding station where we volunteer. I think it’s a Blackpoll Warbler (Setophaga striata), with the image taken in the fall (so I don’t really know if it’s male or female):
From Robert Lang, whose house and studio burned to the ground during the California fires last year; both are being rebuilt:
Our gardener found this California native tarantula (Aphonopelma sp.) while clearing some fire debris at my former studio and, knowing that my wife had a pet tarantula and was helping the Eaton Canyon Nature Center in its fire recovery, he left it for us at our temporary home in a little plastic bottle. (Umm…the tarantula was in a little plastic bottle. Not our home.) After we determined that ECNC didn’t have a place for one yet, we released it locally, but I took this picture before it wandered away. When we got home from the release, there was another plastic bottle on the porch with another tarantula inside.
A Hummingbird Moth (species unknown) from Marty Riddle:
The hawk moths, aka hummingbird moths, love the nectar in the resident-maintained gardens at Brooksby Village, Peabody, Massachusetts:
And a cat/bird encounter from Barry Lyons:
For years now, I’ve had mourning doves [Zenaida macroura] alight on my air conditioner. Some of them are regulars, and what interests me is that they haven’t taken the next obvious step: pecking at the window. What I mean is that a dove arrives and then stares into my apartment, sometimes moving its head back and forth: “Are you in there? Ah, there you are!” And then I get up from my chair and go feed them. But when will a dove start pecking at the window to alert me that he’s there? Why hasn’t it figured out that it’s something it can do? And at no cost to his safety because he can still fly away. And look at this photo. The dove seems to understand windows. Every time a cat goes to the window (I don’t own a cat; I cat-sit) it flares its wings instead of flying off, as if to say, “Ha ha, you can’t get me. I’m out here, you idiot.”

On 14 January 2025, two colliding black holes sent the clearest gravitational wave signal ever recorded rippling across the universe to Earth’s detectors. This remarkably crisp signal, designated GW250114, has allowed physicists to conduct the most stringent test yet of Einstein’s general relativity by measuring multiple “tones” from the collision. The wave passed the test with flying colours, but researchers remain optimistic that future detections might finally reveal where Einstein’s century-old theory breaks down, potentially offering the first glimpses of quantum gravity.