Results are coming in from China’s Chang’e-6 sample return mission to the far side of the Moon. The samples offer our first close-up look at the geology and history of the far side, and a recent paper published in Science Advances by researchers at the Chinese Academy of Sciences offers intriguing insights into the impact history of the Moon itself, and even some about the solar system at large.
As I am out of photos, and readers are withholding theirs, I once again steal the lovely photos of Australian Scott Ritchie from Cairns, whose Facebook page is here. Scott’s captions and IDs are indented, and you can enlarge his photos by clicking on them.
I went to Melbourne in mid-January to visit friends. Of course, birds are my feathered friends. This report covers a visit to the WTP, the Western Treatment Plant, at Werribee, Victoria. These names pass mythically from the lips of Australian birders. I’ve been there before and really enjoyed it, but this past trip was wild. The WTP is a series of quite large secondary sewage treatment ponds and lagoons. These abut the Southern Ocean, so you get a wonderful interaction of freshwater and saltwater wetlands and their associated birds. The ponds are used, particularly in our summer, as overwintering sites for migratory shorebirds, but there are plenty of resident waterfowl and waders there too. The WTP is so valued that you need a key to the gate for access to the site. Fortunately, my friend David was a key-carrying twitcher.

The weather was crazy, with 45 km/h winds. One of the first things I discovered was that strong winds can really mess with a telephoto lens: mine was being buffeted by the gusty easterly winds to the point where I had to remove the lens hood to stabilise the camera. But a few interesting things happened because of the wind. It was a great opportunity for BIF (birds in flight) shots. Birds generally take off and land into the wind, and because it was so strong, they were moving quite slowly, so I got nice shots of normally very fast birds such as terns and sandpipers as they came in to land. Attached are some fun pics.
The next day I did a short walk through Banyule Flats Reserve, an urban Melbourne wetland. The highlight was seeing the oh-so-cute Owlet-Nightjar, as well as a family of Tawny Frogmouths. Shout out to Lyn Easton for leading the tour.
A Black Swan beach. The high winds packed the east-facing beach with seagrass, and the Black Swans [Cygnus atratus] made for the buffet:
A beach of Black Swans, necks writhing like snakes:
Amazing!
“Ahh, now that feels good.” An Australian Spotted Crake [Porzana fluminea] enjoys the breeze up its bum:
“Bugger off!” But he’s not happy with the hordes of shore flies:
An immature Black-shouldered Kite [Elanus axillaris] gives us the eye:
Whiskered Terns [Chlidonias hybrida] flew slowly against the wind, providing good views for the camera:
. . . and another:
A Black Kite [Milvus migrans] swings down to pick up a little dead bird that had been hit by a car:
A large flock of Australian Shelducks [Tadorna tadornoides] flies into the WTP. It was great to see large numbers of waterfowl darken the skies:
I had fun shooting small shorebirds, as they say, coming in to land against the wind at an adjacent pool. This is a Red-necked Stint [Calidris ruficollis]:
And here comes a Sharp-tailed Sandpiper [Calidris acuminata]:
A family of Tawny Frogmouths [Podargus strigoides] greet the day at Banyule Flats Reserve:
But he poses stoically, “You can’t see me!” Frogmouths sit still, imitating dead branches and stumps:
A bit of a loose feather gives him away:
An Owlet-Nightjar [Aegotheles sp.] peeks out of his hollow. He stared down at all the photographers below. We must’ve startled him, because then he just disappeared. But we waited and waited:
“Come on, take your pictures!” He suddenly popped up, posing nicely:
Using local resources will be key to any mission to either the Moon or Mars, in large part because of how expensive it is to bring those resources up from Earth to our newest outposts. But Mars in particular has one local resource that has long been thought of as a negative: perchlorates. These chemicals, which are toxic to almost all life, make up between 0.5% and 1% of Martian soil and have long been considered a hindrance rather than a help to colonization efforts. But a new paper from researchers at the Indian Institute of Science and the University of Florida shows that, when it comes to making the bricks that will build the outpost, perchlorates actually help.
Advertising wants your attention, not your soul; and it’s not nearly as good at getting either as you might think.
Enshittification, also known as crapification and platform decay, is a process in which two-sided online products and services decline in quality over time.

As some of you may be aware, I was an Infectious Disease (ID) physician for almost 40 years, retiring 3 years ago. My practice was almost entirely concerned with taking care of patients in several acute care hospitals. So […]
The post MD Enshittification first appeared on Science-Based Medicine.

According to the researchers from the University of Pennsylvania, some of the amino acids found in the asteroid Bennu likely formed in a different way than was previously thought, effectively challenging what we thought we knew about the origins of life.
A complex web of interrelated factors makes Earth a life-supporting planet, and some of those factors are chemical. New research shows how oxygen abundance regulates the availability of the important elements phosphorus and nitrogen on planets, and that few planets get it right. While discouraging, the result could help us optimize our search for habitable worlds.
In this week’s news-and-snark segment, Bill Maher offers a piece that may be controversial, for it’s about how men need to be “men” again. He avers that the loss of masculinity in males is one reason why women are disappointed in men, and why people are having less sex. The data are eye-opening; for example, 44% of Gen Z men say they’ve had no relationship experience at all during their teen years. That means up to age 20! And you might be interested in the new genre of literature he describes: “romantasy”, in which women get involved with animals or half-animals like centaurs.
His solution? Men should “man up”. His example: Taylor Swift being engaged to football star Travis Kelce (“old-school wood”) after writing songs about all the lame men she was once involved with. (He describes songs by other women.) Is he right? I have no idea.
The guests are Jonathan Haidt (not shown), Stephanie Ruhle and Lt. Gen. H.R. McMaster (Retired).
The Moon has a busy two weeks ahead of it. Fresh off Tuesday’s annular solar eclipse, the Moon begins an evening tour of the planets in the last half of February 2026. The waxing Moon slides by every planet except Mars over the next week. As a highlight, the waxing crescent Moon occults the planet Mercury in a rare celestial event on the night of Wednesday, February 18th.
It’s not easy being a futurist (which I guess I technically am, having written a book about the future of technology). It never was, judging by the predictions of past futurists, but it seems to be getting harder as the future is moving more and more quickly. Even if we don’t get to something like “The Singularity”, the pace of change in many areas of technology is speeding up. Actually it’s possible this may, paradoxically, be good for futurists. We get to see fairly quickly how wrong our predictions were, and so have a chance at making adjustments and learning from our mistakes.
We are now near the beginning of many transformative technologies – genetic engineering, artificial intelligence, nanotechnology, additive manufacturing, robotics, and brain-machine interface. Extrapolating these technologies into the future is challenging. How will they interact with each other? How will they be used and accepted? What limitations will we run into? And (the hardest question) what new technologies not on that list will disrupt the future of technology?
While we are dealing with these big questions, let’s focus on one specific technology – controllable robotic prosthetics. I have been writing about this for years, and it is an area that is advancing more quickly than I had anticipated. The reason, briefly, is AI. Recent advances in AI allow for far better brain-machine interface control than was previously achievable, because modern AI is really good at picking out patterns from tons of noisy data. This includes picking out patterns in the EEG signals from a noisy human brain.
This matters when the goal is having a robotic prosthetic limb controlled by the user through some sort of BMI (from nerves, muscles, or directly from the brain). There are always two components to this control – the software driving the robotic limb has to learn what the user wants, and the user has to learn how to control the limb. Traditionally this takes weeks to months of training, in order to achieve a moderate but usable degree of control. By adding AI to the computer-learning end of the equation, this training time is reduced to days, with far better results. This is what has accelerated progress by a couple of decades beyond where I thought it would be.
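To give a feel for the kind of pattern-matching involved, here is a toy sketch in Python. Everything in it – the gesture names, the four-channel “feature” vectors, the noise level – is invented for illustration; real BMI decoders work on far richer signals with learned models, but the core idea of matching a noisy window against learned templates is the same:

```python
import random

random.seed(0)

# Hypothetical 4-channel "EEG feature" templates for two intended gestures.
# These numbers are purely illustrative, not from any real BMI system.
TEMPLATES = {
    "open_hand":  [1.0, 0.2, 0.8, 0.1],
    "close_hand": [0.1, 0.9, 0.3, 1.0],
}

def noisy_window(gesture, noise=0.3):
    """Simulate one noisy feature window for an intended gesture."""
    return [v + random.gauss(0, noise) for v in TEMPLATES[gesture]]

def decode(window):
    """Nearest-centroid decoding: pick the template closest to the window."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TEMPLATES, key=lambda g: dist(TEMPLATES[g], window))

# With moderate noise, most windows decode to the intended gesture.
hits = sum(decode(noisy_window("open_hand")) == "open_hand" for _ in range(200))
print(hits / 200)
```

Even this crude nearest-template scheme recovers the intended gesture from most noisy windows; the point of modern AI decoders is to do the same thing with many more classes, drifting signals, and far less training time for the user.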
But it turns out this AI-assisted control can be a double-edged sword. To understand why, we need to quickly review how the human brain adapts to artificial bodies or body parts. The short answer is – quite well. The reason is that our sense of ownership and control is a constructed illusion of the brain in the first place. Circuits in our brain create the subjective sensation that each part of our body is part of us, that we own that body part (the sense of ownership) and that we control that body part (a sense of agency). We know about this largely from studying patients with damage in one or more of these circuits, which causes them to feel that a body part is not theirs or that they don’t control it.
This means that this circuitry can be hacked to make the brain create the sensation that you own and control a robotic or virtual limb. Luckily, this hacking is actually pretty simple. The brain compares different sensory inputs to see if they match, while also comparing motor intentions with motor outputs. So – if you see and feel a limb being touched, your brain will interpret that as you owning the limb. It can be that simple. If you intend to make a movement, and you see and feel the limb make that movement, then you feel as if you control the limb. So a robotic limb with some sensation, with some haptic feedback, and that does what we want it to do, will feel as if it is naturally part of us. The research is moving now in this direction, to close these loops as much as possible.
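The comparator logic the brain is running can be caricatured in a few lines of code. This is purely an illustrative sketch – the function names, the tolerance, and the “joint angles” representation are all invented for the example, not a model anyone actually uses:

```python
def sense_of_agency(intended, observed, tol_deg=3.0):
    """Toy comparator: agency arises when the limb does (roughly) what
    was intended. Inputs are lists of joint angles in degrees."""
    return all(abs(i - o) <= tol_deg for i, o in zip(intended, observed))

def sense_of_ownership(saw_touch, felt_touch):
    """Toy comparator: ownership arises when visual and tactile
    input agree about the limb being touched."""
    return saw_touch == felt_touch

# A robotic hand that closely tracks the intended grasp, with matched
# visual and haptic feedback, passes both checks.
print(sense_of_agency([30.0, 45.0], [31.0, 44.0]))  # small mismatch: agency holds
print(sense_of_ownership(saw_touch=True, felt_touch=True))
```

The design point is simply that each sense is a match test: closing the sensory and motor loops means keeping the “intended vs. observed” and “seen vs. felt” comparisons in agreement.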
This, however, is where we run into a snag with AI-controlled robotic limbs. Part of the advance is that AI can add fine motor control to, say, an artificial hand. Broadly, robotic movement tends to fall into one of three categories: you can directly control the robot, the robot can carry out a pre-programmed sequence of movements, or the robot can determine its movements in real time based on sensory feedback. When seeing a robotic demonstration you should always ask – what type of control is being demonstrated?
For robotic limbs what we want is direct control of the robot. While this is advancing, it is still somewhat limited and clumsy. So we can refine the direct control by adding one or both of the other two types of control. This means to some extent the robotic limb is carrying out the desired movements of the user with internal control. This can greatly increase the functionality of the robotic limb, but it comes at a cost of the user’s sense of embodiment and agency. Imagine if your hand were executing movements all by itself. It would feel uncanny and unnerving.
This is a long windup to a new study that tries to address this issue. The researchers looked at the movement speed of an AI-controlled robotic limb to see how it affected the user’s sense of ownership and agency. What they found was not surprising, but it is good to know that this variable matters and needs to be taken into consideration. They varied the execution time of an AI-controlled movement from 125 ms to 4 seconds. A moderate speed, about 1 second, resulted in the best sense of ownership and agency (or, we could say, the least interference with these senses). The further toward either extreme, the more the user felt an uncanny sense of unease, as if they did not own or control the robotic limb. This is a Goldilocks effect – too fast or too slow is no bueno, but just right results in a good outcome.
This result also makes sense from the perspective that prior neurological research shows that our brains also evaluate the world by how it moves. We separate agents from non-agents by how they move (the latter moves in an inertial frame while the former does not). Neurologists also know this because diseases that are movement disorders can often be diagnosed (and sometimes at a glance) by how the patient moves. Our brains are finely tuned to what constitutes normal human movement. Too fast or too slow, hypokinetic or hyperkinetic, and our brains immediately register that something is wrong.
So if we see our robotic limb moving at a normal human pace, doing what we want it to do (even though the fine movements are enhanced by AI), that can still be good enough for us to accept the limb as belonging to us and under our control. There is likely a Goldilocks zone here as well – too much AI control will break the illusion of control, while too little is of no use, but just right will be the best compromise between functionality and acceptance.
The nuances of neurological control, through a brain-machine interface, of an AI-enhanced robotic limb are one of those futurism problems that would have been difficult to anticipate.
The post The Future of AI-Powered Prosthetics first appeared on NeuroLogica Blog.
Dark energy is one of those cosmological features we are still learning about. While we can’t see it directly, we can observe its effects on the universe, most famously how it is causing the expansion of the universe to speed up. But recently, physicists have begun to question even that narrative, pointing to results showing that the expansion isn’t happening at the rate our math would have predicted. In essence, dark energy might be changing over time, and that would have a huge impact on the universe’s expansion and on cosmological physics in general. A new paper, available in pre-print on arXiv from Dr. Slava Turyshev, who is also famously the most vocal advocate of the Solar Gravitational Lens mission, explores an alternative possibility: that our data are simply messy, thanks to inaccuracies in how we measure particular cosmological features, like supernovae.