What can Venus atmospheric samples returned to Earth teach us about the varied evolution of both planets? This is the question addressed by a recent study presented at the American Geophysical Union (AGU) Fall 2024 Meeting, which discussed a compelling mission concept called VATMOS-SR (Venus ATMOSphere – Sample Return), designed to collect samples from Venus’ atmosphere and return them to Earth for further study. This mission has the potential to help scientists gain greater insights into the formation and evolution of Venus and how its evolution diverged so far from Earth’s, despite both planets being approximately the same size.
Here, Universe Today discusses this incredible mission concept with Dr. Guillaume Avice, a permanent researcher at the National Centre for Scientific Research (CNRS) based at the Paris Institute of Global Physics and lead author of the mission concept study. We asked about the motivation behind VATMOS-SR, its advantages and limitations, the significant results the team hopes to achieve, the steps being taken to address specific sampling concerns, the next steps in making VATMOS-SR a reality, and what VATMOS-SR could potentially teach us about finding life in Venus’ atmosphere. Therefore, what was the motivation behind the VATMOS-SR mission concept?
“The scientific motivation concerns the origin and evolution of planetary atmospheres,” Dr. Avice tells Universe Today. “We know very well the Earth’s atmosphere and we have some insights about the ancient Earth’s atmosphere. For Venus, there are measurements done in the 70’s but we have only very partial data. Returning a sample from the Venus atmosphere would allow us to put strong constraints on the delivery of volatile elements to terrestrial planets soon after solar system formation. Indeed, the two planets are very similar in terms of size, position relative to the Sun etc. Yet, their respective evolution diverged, and it remains a mystery why. Another motivation is that we would return for the first time (if we do it before Mars Sample Return) a sample from another planet than Earth.”
For VATMOS-SR, the researchers aim to accomplish three primary scientific objectives: identifying the sources of volatile elements in Venus’ atmosphere, comparing today’s volatile abundances with those when the planet first formed billions of years ago, and examining the gases that transferred from Venus’ interior to its atmosphere throughout the planet’s history (a process called outgassing). To accomplish this, VATMOS-SR is designed to collect several liter-sized atmospheric samples approximately 110 kilometers (68 miles) above the surface of Venus while traveling at more than 10 kilometers per second (6 miles per second).
VATMOS-SR builds off a previous mission concept called Cupid’s Arrow, which was presented at the 49th Lunar and Planetary Science Conference in 2018, with the primary difference being VATMOS-SR will return the samples to Earth whereas Cupid’s Arrow was slated to analyze the samples while still at Venus. Like all mission concepts, the authors note there are advantages and limitations for VATMOS-SR.
“The great advantage is that instruments in our laboratories are very precise for determining the abundance and isotopic composition of volatile elements,” Dr. Avice tells Universe Today. “This is a much better situation compared to in-situ measurements by an instrument onboard a space probe which has numerous limitations. The limitation of the mission is that, in order to return the sample back to Earth, sampling will happen at high velocity (10-13 km/s) meaning that the gas will be fractionated. We can correct for this effect but this is a limitation of the mission. Another one is that sampling gas means that measurements have to be done quickly when back on Earth because any sampling device you could imagine will have a leak rate. We can use high-tech technology to preserve the gas but ideally the preliminary science will have to be done quickly after return.”
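To illustrate what “correcting for this effect” can involve, here is a minimal sketch of a mass-dependent fractionation correction for a single isotope pair, assuming a simple power-law kinetic fractionation. The isotope ratio, masses, and exponent below are illustrative placeholders, not VATMOS-SR mission values.

```python
# Minimal sketch: inverting a mass-dependent kinetic fractionation of the form
#   R_measured = R_true * (m_light / m_heavy)**beta
# All numbers are illustrative, not mission or instrument values.

def correct_ratio(measured_ratio: float, m_light: float, m_heavy: float,
                  beta: float) -> float:
    """Recover the true isotope ratio from a fractionated measurement."""
    return measured_ratio / (m_light / m_heavy) ** beta

measured = 5.2    # hypothetical measured 36Ar/38Ar after high-velocity sampling
beta = 0.5        # hypothetical fractionation exponent, calibrated pre-flight
true_ratio = correct_ratio(measured, m_light=35.97, m_heavy=37.96, beta=beta)
print(f"fractionation-corrected 36Ar/38Ar ~ {true_ratio:.2f}")
```

In practice the exponent would be calibrated on the ground by sampling gases of known composition at representative velocities, which is presumably what the NASA collaboration tests described below aim to pin down.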
As noted, Earth and Venus are approximately the same size, with Venus’ diameter about 95 percent of Earth’s. Despite this, the two planets are starkly different in their characteristics, specifically surface temperatures and pressures. While Earth’s average surface temperature is a livable 15 degrees Celsius (59 degrees Fahrenheit), Venus’ average surface temperature is a scorching 462 degrees Celsius (864 degrees Fahrenheit), hot enough to melt lead.
While Earth’s average surface pressure is measured at 14.7 pounds per square inch (psi), Venus’ average surface pressure is approximately 92 times higher, equivalent to the pressure roughly 900 meters (3,000 feet) underwater on Earth. This is because Venus’ atmosphere is extremely dense and composed almost entirely of carbon dioxide (~96.5 percent), driving a runaway greenhouse effect. In contrast, while the atmosphere of Mars is also composed largely of carbon dioxide (~95 percent), it is much thinner, resulting in a significantly lower average surface pressure. Therefore, despite the vast differences between Earth and Venus, what are the most significant results the team hopes to achieve with VATMOS-SR?
“To understand the origin and evolution of the atmosphere of Venus to better understand Earth’s sister planet but also to understand what makes a planet habitable or not,” Dr. Avice tells Universe Today. “This is also extremely important to understand exoplanets because atmospheres of exoplanets are the only reservoir that can be measured remotely with telescopes. Understanding exoplanets thus requires to understand the composition of planetary atmospheres in our solar system.”
Regarding the fractionation concerns about obtaining the samples at such high speeds, Dr. Avice notes that statistical studies conducted in collaboration with NASA have shown promising results, and that the next steps will involve similar tests with improved probe designs.
Going from a concept to becoming an actual mission and delivering groundbreaking science often takes years to decades to happen, often involving several stages of ideas, scientific implications, systems analysis, designs, prototypes, re-designs, and funding availability. Once components and hardware are finally built, they are tested and re-tested to ensure maximum operational capacity since they can’t be fixed after launch. This ensures all systems function independently and together to achieve maximum mission success, including science data collection and transmitting data back to Earth in a timely and efficient manner.
For example, while NASA’s New Horizons spacecraft conducted its famous flyby of Pluto in July 2015, the mission concept was first proposed in August 1992, accepted as a concept in June 2001, and given funding approval in November 2001. It finally launched in January 2006 and endured a nine-year journey to Pluto, sending back breathtaking images of the dwarf planet in July 2015. Therefore, what are the next steps to making VATMOS-SR a reality?
Dr. Avice tells Universe Today, “We gathered a European team of scientists and engineers together with American and Japanese colleagues to propose VATMOS-SR to the coming ESA call for F-class (fast) mission. The CNES (French space agency) is supporting VATMOS-SR and is providing a lot of help with engineers and specialists to build a strong case to answer this call. This call will be released next month and, if selected, VATMOS-SR will be under consideration by the European Space Agency with developing activities starting as soon as 2026.”
The VATMOS-SR concept comes as debate continues to rage over whether the atmosphere of Venus is capable of hosting life as we know it, since the upper atmosphere has been shown to exhibit Earth-like temperatures and pressures, in stark contrast to the surface of Venus. It is estimated that the habitable zone of Venus’ atmosphere lies between 51 kilometers (32 miles) and 62 kilometers (38 miles) above the surface, where temperatures range from about 65 degrees Celsius (149 degrees Fahrenheit) at the lower boundary to -20 degrees Celsius (-4 degrees Fahrenheit) at the upper boundary. As noted, VATMOS-SR is slated to collect samples at approximately 110 kilometers (68 miles) above the surface, well above the estimated atmospheric habitable zone. Despite this, what can VATMOS-SR teach us about finding life in Venus’ atmosphere?
Dr. Avice tells Universe Today, “Nothing directly (and no chance to have live organisms in the gas samples) but VATMOS-SR will tell us why Venus became such an inhabitable place. This is of course linked to the question, ‘Is it possible that life appeared on Venus at some point in its history?’”
For now, VATMOS-SR remains a very intriguing mission concept with the goal of helping us unravel the history of Venus and potentially the solar system, along with being an international collaboration between the United States, Europe (CNES), and Japan. While Dr. Avice is designated as the principal investigator, it was Dr. Christophe Sotin, who is a Co-PI, professor at the University of Nantes, former senior research scientist at NASA JPL, and lead author of the Cupid’s Arrow study, who first proposed measuring Venus’ atmosphere.
What new insights into Venus’ evolutionary history could VATMOS-SR provide scientists in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Unlocking Venus’ Secrets with VATMOS-SR Mission Concept appeared first on Universe Today.
Type Ia supernovae are extremely powerful events that occur in binary systems containing at least one white dwarf star – the core remnant of a Sun-like star. Sometimes, the white dwarf’s powerful gravity will siphon material from its companion star until it reaches critical mass and explodes. In another scenario, a binary system of two white dwarfs will merge, producing the critical mass needed for a supernova. Unlike regular core-collapse supernovae, which occur roughly every fifty years in the Milky Way, Type Ia supernovae happen roughly once every five hundred years.
In addition to being incredible events, Type Ia supernovae are also useful distance-measuring tools. As part of the Cosmic Distance Ladder, these explosions allow astronomers to measure the distances to objects millions or billions of light-years away. This is vital to measuring the rate at which the Universe is expanding, otherwise known as the Hubble Constant. Thanks to an international team of researchers, a catalog of Type Ia supernovae has just been released that could change what we know of the fundamental physics of supernovae and the expansion history of the Universe.
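Since the article leans on SNe Ia as standard candles, it may help to see the distance arithmetic spelled out. Below is a minimal sketch assuming a fiducial peak absolute magnitude of about -19.3 (a commonly quoted value for SNe Ia) and ignoring extinction and cosmological corrections.

```python
# Standard-candle distance from the distance modulus m - M = 5*log10(d / 10 pc).
# M_PEAK is a commonly quoted fiducial value; extinction and cosmological
# corrections are ignored in this sketch.

M_PEAK = -19.3

def distance_parsecs(m_apparent: float) -> float:
    """Distance in parsecs to a SN Ia observed at apparent peak magnitude m."""
    return 10 ** ((m_apparent - M_PEAK + 5) / 5)

d_pc = distance_parsecs(19.0)   # a SN Ia peaking at apparent magnitude 19.0
print(f"distance ~ {d_pc / 1e6:.0f} Mpc (~{d_pc * 3.26 / 1e9:.1f} billion light-years)")
```

Run the relation the other way and a survey reaching magnitude 20.5 can see a SN Ia at peak out to roughly 3 billion light-years, so the 1.5-billion-light-year detection horizon quoted later leaves comfortable margin for catching events before they peak.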
This new catalog constitutes the second data release (DR2) from the Zwicky Transient Facility (ZTF), a wide-field astronomical survey that began in 2018. The survey relies on the ZTF camera on the 1.2-meter (4-foot) Samuel Oschin Telescope at the Palomar Observatory near San Diego, California. It has classified over 8,000 supernovae, including 3,628 nearby Type Ia supernovae (SNe Ia), more than doubling the number of SNe Ia discovered over the previous 30 years. Despite the rarity of these events, the ZTF’s depth and survey strategy have allowed the collaboration to detect nearly four per night.
This catalog contains those 3,628 nearby SNe Ia and is the first large and homogeneous dataset of its kind that astrophysicists can access. The release is detailed in a paper published on February 14th in Astronomy & Astrophysics, alongside a Special Issue containing 21 related publications. The paper’s lead authors are Dr. Mickael Rigault, head of the ZTF Cosmology Science working group and a research scientist at the Centre National de la Recherche Scientifique (CNRS) and the Université Claude Bernard Lyon, and Dr. Matthew Smith, a Lecturer in Astrophysics at Lancaster University. As Dr. Rigault said:
“For the past five years, a group of thirty experts from around the world have collected, compiled, assembled, and analyzed these data. We are now releasing it to the entire community. This sample is so unique in terms of size and homogeneity that we expect it to significantly impact the field of Supernovae cosmology and to lead to many additional new discoveries in addition to results we have already published.”
The key component of the ZTF system is its 47-square-degree, 600-megapixel cryogenic CCD mosaic science camera. The camera scans the entire northern sky daily in three optical bands down to a limiting magnitude of 20.5, allowing it to detect nearly all supernovae within 1.5 billion light-years of Earth. Co-author Prof. Kate Maguire of Trinity College Dublin said, “Thanks to ZTF’s unique ability to scan the sky rapidly and deeply, we have captured multiple supernovae within days—or even hours—of [the] explosion, providing novel constraints on how they end their lives.”
The ultimate purpose of the survey is to determine the expansion rate of the Universe (aka. the Hubble Constant). Since the late 1990s and the Hubble Deep Fields observations, which used SNe Ia to measure cosmic expansion, astronomers have known that the expansion rate is accelerating. This effectively demonstrated that the Hubble Constant is not constant and gave rise to the theory of Dark Energy. In addition, the ability to observe the Universe all the way back to roughly 1 billion years after the Big Bang led to the “Crisis in Cosmology.”
Also known as the “Hubble Tension,” this crisis arose when astronomers noticed that distance measurements along the Cosmic Distance Ladder produced different values depending on the method used. Since then, cosmologists have been looking for explanations for this Tension, which include the possibility of Early Dark Energy (EDE). A key part of this search is obtaining truly accurate measurements of cosmic distances. Co-author Professor Ariel Goobar, the Director of the Oskar Klein Centre in Stockholm (one of the founding institutions of ZTF), was also a member of the team that discovered the accelerated expansion of the Universe in 1998.
“Ultimately, the aim is to address one of our time’s biggest questions in fundamental physics and cosmology, namely, what is most of the Universe made of?” he said. “For that, we need the ZTF supernova data.” One of the biggest takeaways from this catalog and the studies behind it is that Type Ia supernovae vary with their host environment more than previously thought. As a result, the correction mechanism used to date needs revising, which could change how we measure the expansion rate of the Universe.
This could have consequences for the Standard Model of Cosmology – aka. the Lambda Cold Dark Matter (Lambda-CDM) model – and issues arising from it like the Hubble Tension. This data will be essential when the Nancy Grace Roman Space Telescope (RST) launches into space and begins making observations leading to the first wide-field maps of the Universe. Combined with observations by the ESA’s Euclid mission, these maps could finally resolve the mystery of Dark Matter and cosmic expansion. As Dr Rigault said:
“With this large and homogeneous dataset, we can explore Type Ia supernovae with an unprecedented level of precision and accuracy. This is a crucial step toward honing the use of Type Ia Supernovae in cosmology and assess[ing] if current deviations in cosmology are due to new fundamental physics or unknown problem[s] in the way we derive distances.”
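To make the stakes concrete, here is a toy illustration (not the collaboration’s analysis) of how a systematic bias in standard-candle distances propagates one-to-one into the inferred Hubble constant, since H0 = v/d. The numbers are illustrative.

```python
# Toy illustration: a systematic bias in standard-candle distances propagates
# one-to-one into the inferred Hubble constant (H0 = v / d). Illustrative numbers.

H0_TRUE = 70.0                          # km/s/Mpc, assumed true value
true_distance = 100.0                   # Mpc
velocity = H0_TRUE * true_distance      # km/s, the observed recession velocity

for bias in (0.95, 1.00, 1.05):         # -5%, none, +5% distance error
    inferred_h0 = velocity / (true_distance * bias)
    print(f"{(bias - 1) * 100:+.0f}% distance bias -> H0 = {inferred_h0:.1f} km/s/Mpc")
```

A five-percent distance bias spans roughly the gap between the discrepant measurements at the heart of the Hubble tension, which is why environment-dependent corrections to SNe Ia matter so much.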
Further Reading: Lancaster University, Astronomy & Astrophysics
The post Huge Release of Type 1a Supernovae Data appeared first on Universe Today.
What’s the story of our Moon’s early history? Despite all we know about our closest natural satellite, scientists are still figuring out bits of its history. New measurements of rocks gathered during the Apollo missions now show it solidified some 4.43 billion years ago. It turns out that’s about the time Earth became a habitable world.
University of Chicago scientist Nicolas Dauphas and a team of researchers made the measurements. They looked at the proportions of different elements inside Moon rocks, which provide a window into the Moon’s early epochs. The Moon started out as a fully molten blob after a collision between two early solar system bodies.
As it cooled and crystallized, the molten proto-moon separated into layers. Eventually, about 99% of the lunar magma ocean had solidified. The rest was a unique residual liquid called KREEP. That acronym stands for the elements potassium (K), rare earth elements (REE), and phosphorus (P).
Dauphas and his team analyzed this KREEP and found that it formed about 140 million years after the birth of the Solar System. It’s present in the Apollo rocks, and scientists hope to find it in samples from the South Pole-Aitken basin, the region that Artemis astronauts will eventually explore. If analysis confirms its presence there, that would indicate a uniform distribution of this KREEP layer across the lunar surface.
Understanding KREEP’s History on the Moon

The clues to the Moon’s ultimate “cooling off period” lie in a faintly radioactive rare earth element called lutetium. Over time, it decays to become hafnium. In the early Solar System, all rocks had about the same amounts of lutetium. Its decay process helps determine the age of the rocks where it exists.
However, the Moon’s solidification and subsequent formation of KREEP reservoirs didn’t result in a lot of lutetium compared to other rocks created at the same time. So, the scientists wanted to measure the proportions of lutetium and hafnium in Moon rocks and compare them to other bodies created around the same time—such as meteorites. That would allow them to calculate a more precise time for when the KREEP formed on the Moon.
They tested tiny samples of Moon rocks and measured hafnium isotope ratios in embedded lunar zircons. Through that analysis, they found that the rock ages are consistent with formation in a KREEP-rich reservoir. Those ages point to the formation of KREEP reservoirs about 140 million years after the birth of the solar system, or about 4.43 billion years ago. “It took us years to develop these techniques, but we got a very precise answer for a question that has been controversial for a long time,” said Dauphas.
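As a rough illustration of the radiometric arithmetic behind such ages (a sketch, not the team’s actual zircon methodology), the standard decay law with the commonly used ¹⁷⁶Lu decay constant shows how little lutetium has decayed even over billions of years, which is why high-precision techniques were needed.

```python
import math

# 176Lu -> 176Hf decay arithmetic. The decay constant is a commonly used
# literature value; this is a sketch, not the team's zircon methodology.
LAMBDA_LU176 = 1.867e-11   # per year

def fraction_decayed(age_years: float) -> float:
    """Fraction of an initial 176Lu inventory converted to 176Hf after a given time."""
    return 1.0 - math.exp(-LAMBDA_LU176 * age_years)

age = 4.43e9   # years: the reported lunar solidification age
f = fraction_decayed(age)
print(f"176Lu decayed after {age / 1e9:.2f} Gyr: {f * 100:.1f}%")

# Inverting the decay law recovers an age from a measured decayed fraction:
recovered = -math.log(1.0 - f) / LAMBDA_LU176
print(f"recovered age: {recovered / 1e9:.2f} Gyr")
```

Only about 8% of the original ¹⁷⁶Lu has decayed over 4.43 billion years, so resolving ages to tens of millions of years requires exactly the kind of painstaking isotope-ratio measurements Dauphas describes.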
Placing KREEP in Perspective

Interestingly, the team’s results showed that lunar magma ocean crystallization occurred while leftover planetary embryos and planetesimals were still bombarding the Moon. Those objects were the birth “seeds” of the planets and Moon, formed after the Sun coalesced some 4.6 billion years ago. What remained from the formation of the planets continued to batter the already-formed planets.
The formation of the Moon itself began some 60 million years after the solar system itself was born. The most likely event was the collision of a Mars-sized world called Theia with the infant Earth. That sent molten debris into space and it began to coalesce to make the Moon. “We must imagine a big ball of magma floating in space around Earth,” said Dauphas. Shortly thereafter, that ball began to cool. That process eventually resulted in the formation of the lunar KREEP layers.
An artist’s conception of the cooling lunar magma ocean. Courtesy ESA.

The study of the decay of lutetium to hafnium in samples of those KREEP rocks is a big step forward in understanding the most ancient epoch of lunar history. More rock samples brought back from the South Pole-Aitken basin will help fill in the remaining blanks and help researchers clarify the timeline of both the cooling of the lunar rock and the subsequent creation of such rock deposits as the mare basalts. Those rock layers were created when impactors slammed into the lunar surface, generating lava flows that filled the impact basins.
The mare formed as a result of impacts later in the early history of the Moon, some 240 million years after the birth of the Solar System. Those impacts stimulated lava flows that covered less than 20 percent of the lunar surface and engulfed the oldest surfaces.
Timing is Everything

Fixing the date of lunar cooling not only tells us about the history of the Moon but also helps scientists understand Earth’s evolution. That’s because the impact that formed the Moon was probably also the last major impact on Earth. It could well mark the time when Earth began its transformation into a stable world, an important step toward becoming a place hospitable for life.
“This finding aligns nicely with other evidence—it’s a great place to be in as we prepare for more knowledge about the Moon from the Chang’e and Artemis missions,” said Dauphas. “We have a number of other questions that are waiting to be answered.”
For More Information

Lunar Rocks Help Scientists Pinpoint When the Moon Crystallized
Completion of Lunar Magma Ocean Solidification at 4.43 Ga
Moon Formation
The post The Moon Solidified 4.43 Billion Years Ago appeared first on Universe Today.
When it comes to particles, only photons are more abundant than neutrinos, yet detecting neutrinos is extremely difficult. Scientists have gone to extreme lengths to detect them, including building neutrino observatories in deep, underground mines and in the deep, clear ice of Antarctica.
One of their latest efforts is KM3NeT, a neutrino telescope still under construction at the bottom of the Mediterranean Sea. Though the telescope isn’t yet complete, it has already detected the most energetic neutrino ever observed.
The Universe is flooded with them, yet they’re extremely difficult to detect. They’re like tiny, abundant ghosts and are sometimes called “ghost particles.” They have no electric charge, which limits the ways they interact with matter. The fact that they only interact through gravity and the weak nuclear force explains their elusiveness.
Neutrinos can’t be seen and are only detected indirectly on the rare occasions when they interact with matter through the weak force. These interactions produce charged particles that emit Cherenkov radiation, which detectors can sense. Detectors have to be very large to catch these rare interactions. KM3NeT (Cubic Kilometre Neutrino Telescope) features thousands of individual detectors in each of its two sections. KM3NeT was only about 10% complete when, on February 13th, 2023, it detected an extraordinarily energetic neutrino.
The detection is presented in new research in Nature titled “Observation of an ultra-high-energy cosmic neutrino with KM3NeT.” The KM3NeT Collaboration is credited with authorship.
“The detection of cosmic neutrinos with energies above a teraelectronvolt (TeV) offers a unique exploration into astrophysical phenomena,” the paper states. “Here we report an exceptionally high-energy event observed by KM3NeT, the deep-sea neutrino telescope in the Mediterranean Sea, which we associate with a cosmic neutrino detection.”
This is an artist’s impression of a KM3NeT installation in the Mediterranean. Underwater neutrino detectors take advantage of their location to track these fast particles. Image Courtesy Edward Berbee/Nikhef.

Though neutrinos themselves are undetectable, the muons created by their rare interactions with matter are detectable. In this detection, the muon’s energy was estimated at 120 (+110/-60) petaelectronvolts (PeV). High-energy neutrinos like these are produced when “ultra-relativistic cosmic-ray protons or nuclei interact with other matter or photons,” according to the paper.
Because neutrinos seldom interact with matter and aren’t affected by magnetic fields, they could originate from extremely distant places in the Universe. These are called cosmogenic neutrinos rather than solar neutrinos, the more plentiful type that comes from the Sun. Cosmogenic neutrinos are more energetic than solar neutrinos because they’re created by cosmic rays from high-energy astrophysical phenomena like active galactic nuclei and gamma-ray bursts. Since they travel virtually unimpeded from distant sources, they can provide insights into their sources.
In terms of origin and energy, detected neutrinos fall into two broad classes: atmospheric neutrinos, produced when cosmic rays strike Earth’s atmosphere, and cosmic neutrinos from astrophysical sources. Cosmic neutrinos are more energetic and less plentiful than atmospheric neutrinos. “The neutrino energy is thus a crucial parameter for establishing a cosmic origin,” the paper states.
“The energy of this event is much larger than that of any neutrino detected so far,” the paper states. This could be the first detection of a cosmogenic neutrino and it could be the result of ultra-high energy cosmic rays that interact with background photons.
“Of interest in this article are neutrino interactions that produce high-energy muons, which can travel several kilometres in seawater before being absorbed,” the paper states. As these muons travel through the water, they lose energy. The amount of energy lost in each unit of travel is proportional to the muon’s energy level. By recording the signals and their time of arrival at different individual detectors in the KM3NeT array, scientists can then reconstruct the muon’s initial energy level and its direction.
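The claim that the energy lost per unit path is proportional to the muon’s energy can be made concrete with the standard parameterization dE/dx ≈ a + bE, where a covers ionization and bE covers radiative losses. The coefficients below are rough literature-scale values for water, used purely as a sketch of the idea rather than KM3NeT’s calibration.

```python
# Toy muon energy-loss integration in water using dE/dx ~ a + b*E.
# a and b are rough literature-scale values, not KM3NeT's calibration.

A_ION = 0.2       # GeV per metre (ionization term, approximate)
B_RAD = 3.3e-4    # per metre (radiative term, approximate)

def propagate(energy_gev: float, distance_m: float, step_m: float = 1.0) -> float:
    """Step a muon through water, subtracting (a + b*E) * step at each step."""
    travelled = 0.0
    while travelled < distance_m and energy_gev > 0:
        energy_gev -= (A_ION + B_RAD * energy_gev) * step_m
        travelled += step_m
    return max(energy_gev, 0.0)

e0 = 120e6   # the reported ~120 PeV muon energy, in GeV
print(f"energy remaining after 3 km of water: {propagate(e0, 3000) / 1e6:.0f} PeV")
```

At PeV energies the radiative term dominates, so the loss rate tracks the energy itself; measuring how much light the muon sheds along its track is what lets the array work backwards to the initial energy.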
This figure shows side and top views of the event in (a), with the Eiffel Tower shown for scale. The red line shows the reconstructed trajectory of the muon created by the neutrino interaction. The hits of individual photomultiplier tubes (PMTs) are represented by spheres stacked along the direction of the PMT orientations. Only the first five hits on each PMT are shown. The spheres are colour-coded relative to the first initial detection, and the larger they are, the more photons were detected, equating to energy level. Image Credit: The KM3NeT Collaboration, 2025.

“The muon trajectory is reconstructed from the measured times and positions of the first hits recorded on the PMTs, using a maximum-likelihood algorithm,” the paper states. The new detection is referred to as KM3-230213A. The 21 detection lines registered 28,086 hits, and by counting the number of PMTs that were triggered, the researchers can estimate the muon energy at the detector.
This figure shows the number of detections in a simulation of the KM3-230213A event. The simulation helps researchers determine the true muon energy. “The normalized distributions of the number of PMTs participating in the triggering of the event for simulated muon energies of 10, 100 and 1,000 PeV,” the authors write. The vertical dashed line indicates the observed value in KM3-230213A with 3,672 PMT detections. Image Credit: The KM3NeT Collaboration, 2025.

The KM3NeT Collaboration detected the most energetic neutrino ever recorded while its detector was still incomplete, and that bodes well for the future. However, the incomplete facility did limit one aspect of the detection: there’s uncertainty about the direction the neutrino came from. “A dedicated sea campaign is planned in the future to improve the knowledge of the positions of the detector elements on the seafloor,” the authors write. Once that campaign is complete, the data from KM3-230213A will be recalibrated.
Still, the researchers learned something about the direction of its source, albeit with an uncertainty estimated to be 1.5°. At the vast distances involved, that’s a significant uncertainty. “The probability that KM3-230213A is of cosmic origin is much greater than any hypothesis involving an atmospheric origin,” the paper states.
The researchers identified some candidate sources.
“Extragalactic neutrino sources should be dominated by active galactic nuclei, and blazars are of particular interest considering the very-high energy of KM3-230213A,” the paper states. “To compile a census of potential blazar counterparts within the 99% confidence region of KM3-230213A, archival multiwavelength data were also explored.”
The researchers identified 12 potential source blazars in different survey catalogues.
The red star in this figure shows KM3-230213A. The three concentric red circles show the error regions within R(68%), R(90%) and R(99%). Selected source candidates and their directions are shown as coloured markers. The colours and marker type indicate the criterion according to which the source was selected, e.g. VLBI is Very Long Baseline Interferometry. The sources are numbered according to their proximity to KM3-230213A. Image Credit: The KM3NeT Collaboration, 2025.

Neutrinos are abundant yet elusive. They pass right through the Earth unimpeded, and about 100 trillion of them pass through our bodies every second. Detecting them is important because of what they can tell us about the Universe.
The extraordinary energy level of this neutrino is significant in neutrino astrophysics. It shows that nature can generate ultra-high-energy neutrinos, possibly from blazars, which are active galactic nuclei with jets pointed right at us.
“This suggests that the neutrino may have originated in a different cosmic accelerator than the lower-energy neutrinos, or this may be the first detection of a cosmogenic neutrino, resulting from the interactions of ultra-high-energy cosmic rays with background photons in the Universe,” the paper states.
The post An Unfinished Detector has Already Spotted the Highest-Energy Neutrino Ever Seen appeared first on Universe Today.
In 1974, science fiction author Larry Niven wrote a murder mystery with an interesting premise: could you kill a man with a tiny black hole? I won’t spoil the story, though I’m willing to bet most people would argue the answer is clearly yes. Intense gravity, tidal forces, and the event horizon would surely lead to a messy end. But it turns out the scientific answer is a bit more interesting.
On the one hand, it’s clear that a large enough black hole could kill you. On the other hand, a black hole with the mass of a single hydrogen atom is clearly too small to be noticed. The real question is the critical mass. At what minimum size would a black hole become deadly? That’s the focus of a new paper on the arXiv.
The study begins with primordial black holes. These are theoretical bodies that may have formed in the earliest moments of the Universe and would be much smaller than stellar-mass black holes, anywhere from the mass of an atom to several times the mass of Earth. Although astronomers have never found any primordial black holes, observations do rule out several mass ranges. For example, any primordial black hole smaller than 10¹² kg would have already evaporated thanks to Hawking radiation. Anything larger than 10²⁰ kg would gravitationally lens stars in the Milky Way. Since we haven’t detected these lensing effects, they must at the very least be exceedingly rare. If they exist at all.
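The 10¹² kg evaporation cutoff can be sanity-checked with the leading-order Hawking lifetime formula, t = 5120 π G² M³ / (ħ c⁴). The sketch below ignores particle-physics corrections, which shift the threshold by factors of a few.

```python
import math

# Leading-order Hawking evaporation time, t = 5120*pi*G^2*M^3 / (hbar*c^4).
# Particle-physics corrections (which species the hole can radiate) are ignored.

G = 6.674e-11             # m^3 kg^-1 s^-2
HBAR = 1.055e-34          # J s
C = 2.998e8               # m/s
AGE_UNIVERSE_S = 4.35e17  # ~13.8 billion years, in seconds

def lifetime_s(mass_kg: float) -> float:
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# Mass whose lifetime equals the age of the Universe:
m_crit = (AGE_UNIVERSE_S * HBAR * C**4 / (5120 * math.pi * G**2)) ** (1 / 3)
print(f"lifetime of a 1e12 kg hole: {lifetime_s(1e12) / 3.15e7:.1e} years")
print(f"mass that evaporates in one universe age: {m_crit:.1e} kg")
```

The leading-order threshold comes out near 2 × 10¹¹ kg, within an order of magnitude of the 10¹² kg figure quoted above.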
Some theoretical models argue that primordial black holes could be the source of dark matter. If that’s the case, observational limits constrain their masses to the 10¹³ – 10¹⁹ kg range, which is similar to the mass range for asteroids. Therefore, the study focuses on this range and looks at two effects: tidal forces and shock waves.
Tidal forces occur because the closer you get to a mass, the stronger its gravity. This means a black hole exerts a force differential on you as it gets near. So the question is whether this force differential is strong enough to tear flesh. Asteroid-mass black holes are less than a micrometer across, so their tidal forces would act over a tiny area. If one passed through your midsection or one of your limbs, there might be some local damage, but nothing fatal. It would be similar to a needle passing through you.
But if the black hole passed through your head, that would be a different story. Tidal forces could tear apart brain cells, which would be much more serious. Since brain cells are delicate, even a force differential of 10 – 100 nanonewtons might kill you. But that would take a black hole at the highest end of our mass range.
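A sketch of the tidal arithmetic, assuming the Newtonian differential force 2GMmr/d³ across a single cell; the cell size and mass below are rough illustrative values.

```python
# Newtonian tidal force differential across one cell: F ~ 2*G*M*m*r / d^3.
# Cell size and mass are rough illustrative values.

G = 6.674e-11   # m^3 kg^-1 s^-2

def tidal_force_newtons(bh_mass_kg: float, distance_m: float,
                        cell_size_m: float = 1e-5,
                        cell_mass_kg: float = 1e-12) -> float:
    """Difference in gravitational pull across a cell of size r at distance d."""
    return 2 * G * bh_mass_kg * cell_mass_kg * cell_size_m / distance_m**3

# A black hole at the top of the allowed window, passing one metre from a neuron:
f = tidal_force_newtons(bh_mass_kg=1e19, distance_m=1.0)
print(f"tidal force on a cell: {f * 1e9:.0f} nN")
```

Even at 10¹⁹ kg, the top of the allowed range, the differential at a metre is of order 10 nN, the scale the article cites as dangerous for delicate brain cells.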
Shockwaves would be much more dangerous. In this case, as a black hole entered your body, it would create a density wave that would ripple through you. These shockwaves would physically damage cells and transfer heat energy that would do further damage. To create a shockwave of energy similar to that of a .22-caliber bullet, the black hole would only need a mass of 1.4 × 10¹⁴ kg, which is well within the range of possible primordial black holes.
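The bullet comparison can be roughed out with a dynamical-friction style energy-deposition estimate, dE/dx ≈ 4πG²M²ρ lnΛ / v². This is a back-of-the-envelope in the spirit of the paper’s shockwave argument, not its exact calculation; the transit speed and Coulomb logarithm are assumptions.

```python
import math

# Back-of-the-envelope energy deposition as a black hole crosses tissue:
#   dE/dx ~ 4*pi*G^2*M^2*rho*ln(Lambda) / v^2
# The speed and Coulomb logarithm are order-of-magnitude assumptions.

G = 6.674e-11        # m^3 kg^-1 s^-2
RHO_TISSUE = 1000.0  # kg/m^3, roughly water
V_BH = 2e5           # m/s, a typical galactic encounter speed (assumption)
LN_LAMBDA = 30.0     # Coulomb logarithm (assumption)

def energy_deposited_j(bh_mass_kg: float, path_m: float) -> float:
    dedx = 4 * math.pi * G**2 * bh_mass_kg**2 * RHO_TISSUE * LN_LAMBDA / V_BH**2
    return dedx * path_m

e = energy_deposited_j(1.4e14, path_m=0.2)   # a 20 cm transit through the torso
print(f"deposited energy: {e:.0f} J (a .22 bullet carries roughly 100-150 J)")
```

With these assumptions the deposition lands in the hundred-joule range, consistent with the article’s bullet comparison; the exact figure is sensitive to the assumed transit speed.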
So yes, a primordial black hole could kill you.
While that makes for a great story, it would never happen in real life. Even if asteroid-mass primordial black holes exist, the number of them out there compared to the vastness of space means that the odds of it happening to anyone in their lifetime are less than one in 10 trillion.
Reference: Niven, Larry. “The Hole Man.” Analog Science Fiction/Science Fact (1974): 93-104.
Reference: Robert J. Scherrer. “Gravitational Effects of a Small Primordial Black Hole Passing Through the Human Body.” arXiv preprint arXiv:2502.09734 (2025).
The post What Would Happen if a Tiny Black Hole Passed Through Your Body? appeared first on Universe Today.
A lot of what the Trump administration is doing is aimed at health and science, and not necessarily in a good way. The most obvious blunder is the appointment of Robert F. Kennedy Jr., a palpably unqualified man with some bizarre views, as Secretary of Health and Human Services, the person who advises the President on all health matters. Given Trump’s abysmal ignorance of science, having someone like RFK Jr. guiding government policy is scary.
There’s a lot of beefing as well about the government cutting the “overhead” (money given to universities, supposedly to support the infrastructure of grants) uniformly to 15%, down from over 60% in some cases (each university negotiates its rate with the government). This slashing will reduce university budgets substantially. But when a university has a huge endowment, like Harvard’s ($53 billion), I can’t shed many tears over that. Given that in many cases we simply don’t know where overhead goes, the assumption has been that many schools simply use it as a source of money for almost anything, which means that taxpayers are unwittingly subsidizing not just scientific research, but universities in general.
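For a sense of scale (a toy illustration with made-up numbers, since real negotiated rates and grant sizes vary): overhead is charged as a percentage on top of a grant’s direct costs, so dropping the rate from 60% to 15% removes most of that stream.

```python
# Toy illustration of the indirect-cost ("overhead") cut. Numbers are made up;
# real negotiated rates and grant sizes vary by institution.

direct_costs = 1_000_000   # dollars of direct research costs on a hypothetical grant

for rate in (0.60, 0.15):
    overhead = direct_costs * rate
    print(f"rate {rate:.0%}: overhead ${overhead:,.0f}, "
          f"total award ${direct_costs + overhead:,.0f}")
# Per $1M of direct costs, the university's overhead falls from $600,000 to
# $150,000 -- a $450,000 reduction.
```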
At any rate, the potential damage that the Trump administration will do to American science is outlined in this new Atlantic article by Katherine Wu. It doesn’t cohere like a good science piece should, but it at least lays out some scary things in store for American science. To me, the scariest is the hiding of already-obtained scientific results, financed by taxpayers, that were publicly available but are no longer so.
Click below to see the article, or find it archived here.
First, the payoff for funding science. I hope this is accurate, as it characterizes science as “research and development”:
Every dollar invested in research and development has been estimated to return at least $5 on average—billions annually.
It also looks as if the National Science Foundation is on the chopping block:
The administration’s actions have also affected scientific pursuits in ways that go beyond those orders. The dismantling of USAID has halted clinical trials abroad, leaving participants with experimental drugs and devices still in their bodies. Last week, NIH announced that it would slash the amount its grants would pay for administrative costs—a move that has since been blocked by a federal judge but that would substantially hamper entire institutions from carrying out the day-to-day activities of research. The administration is reportedly planning to cut the budget for the National Science Foundation. Mass layoffs of federal workers have also begun, and two NIH scientists (who asked not to be identified for fear of professional repercussions) told me they participated in a meeting this morning in which it was announced that thousands of staff across the Department of Health and Human Services would be let go starting today. Robert F. Kennedy Jr. has now become the head of that department, after two confirmation hearings in which he showed a lack of basic understanding of the U.S. health system and a flagrant disregard for data that support the safety and effectiveness of various lifesaving vaccines. (The White House did not return repeated requests for comment.)
It’s not clear whether the DEI restrictions described in the previous post will severely impede science. Wu says this:
Many also expect that the moratorium on DEI-focused programming will have severe impacts on who is able to do the work of science—further impeding women, people of color, and other groups underrepresented in the field from entering and staying in it.
But it’s not clear the restrictions will have that effect, nor that making science more “diverse” (not just via race, but in other traits) will improve our understanding of nature.
There are restrictions on Social-Justice-aimed projects, but again, many of these have been a waste of money and effort, performative efforts not aimed at understanding science, and we will simply have to see how this shakes out. But those who do such work are beefing about what the government did. Here’s an example of a peeved but woke scientist whose work I’ve often criticized (click the screenshot to go to the thread). Most of the commenters don’t support Fuentes’s griping:
One problem is that the government is looking for suspicious grants by doing word searches, and those searches include terms like “environment,” “climate,” and “race.” It’s a quick way to find suspicious grants, but you have to evaluate their quality, not simply defund them because they come up in a keyword search.
Here’s what I find most distressing about what the government did (besides appointing RFK Jr.):
In yesterday’s executive order, Trump highlighted the importance of “protecting expert recommendations from inappropriate influence and increasing transparency regarding existing data.” But that is exactly what the administration’s critics have said it is already failing to do. At the end of last month, the CDC purged its website of several decades’ worth of data and content, including an infectious-disease-surveillance tool as well as surveys tracking health-risk behaviors among youths. (On Tuesday, a federal judge ordered the government to restore, for now, these and other missing data and webpages to their pre-purge state.) And as soon as the Trump administration started pulling data sets from public view, scientists started worrying that those data would reappear in an altered form, or that future scientific publications would have to be modified.
I’m not as worried about the reappearance of data in altered form as I am about the simple removal of data—data funded by us, the American taxpayers—from public view. Fortunately, a judge stopped the data removal, but that may be temporary.
What will be the outcome? While Wu thinks this will reduce trust in science, I’m not so sure about that, especially given that trust in science fell strongly during the Biden administration, and trust is reduced simply because science is getting mixed up with politics in every administration. What worries me more is the vulnerability of science to the whims of the administration—an administration that seems to care more about key words than about research itself. My view is that the government is entitled to vet science funding and cut waste if it wants, but that governments are poorly equipped to judge scientific merit. A grant that looks wasteful may come up with useful results, though of course there are some that simply look like government-funded virtue flaunting. It’s best if a generous dollop of money is allocated to science, and then scientists themselves decide how to dole it out, for they are the best equipped people to do so. In this I agree with Wu’s conclusion:
There will undoubtedly be periods, in the coming weeks and months, when the practice of science feels normal. Many scientists are operating as they usually do until they are told otherwise. But that normalcy is flimsy at best, in part because the Trump administration has shown that it may not care what data, well collected or not, have to say. During his Senate confirmation hearings, Kennedy repeatedly refused to acknowledge that vaccines don’t cause autism, insisting that he would do so only “if the data is there.” Confronted by Senator Bill Cassidy with decades of data that were, in fact, there, he continued to equivocate, at one point attempting to counter with a discredited paper funded by an anti-vaccine group.

In all likelihood, more changes are to come—including, potentially, major budgetary cuts to research, as Congress weighs this year’s funding for the nation’s major research agencies. Trump and his administration are now deciding how deep a rift to make in America’s scientific firmament. How long it takes to repair the damage, or whether that will be possible at all, depends on the extent of the damage they inflict now.
I’m just glad that I don’t have to apply for science grants any more.