News Feeds

Air inside your home may be more polluted than outside due to everyday chemical products

Matter and energy from Science Daily Feed - Mon, 02/17/2025 - 9:07pm
Bringing aromas indoors with the help of chemical products -- yes, air fresheners, wax melts, floor cleaners, deodorants and others -- rapidly fills the air with nanoscale particles that are small enough to get deep into your lungs, engineers have found over a series of studies.
Categories: Science

Unlocking Venus’ Secrets with VATMOS-SR Mission Concept

Universe Today Feed - Mon, 02/17/2025 - 9:04pm

What can Venus atmospheric samples returned to Earth teach us about the varied evolution of both planets? This is what a recent study presented at the American Geophysical Union (AGU) Fall 2024 Meeting hopes to address with a compelling mission concept called VATMOS-SR (Venus ATMOSphere – Sample Return), which is designed to collect samples from Venus’ atmosphere and return them to Earth for further study. This mission has the potential to help scientists gain greater insights into the formation and evolution of Venus and how its evolution diverged so far from Earth’s, despite both planets being approximately the same size.

Here, Universe Today discusses this incredible mission concept with Dr. Guillaume Avice, a National Centre for Scientific Research (CNRS) permanent researcher at the Paris Institute of Global Physics and lead author of the mission concept, regarding the motivation behind VATMOS-SR, its advantages and limitations, the significant results the team hopes to achieve, the steps being taken to address specific sampling concerns, the next steps in making VATMOS-SR a reality, and what VATMOS-SR could potentially teach us about finding life in Venus’ atmosphere. Therefore, what was the motivation behind the VATMOS-SR mission concept?

“The scientific motivation concerns the origin and evolution of planetary atmospheres,” Dr. Avice tells Universe Today. “We know very well the Earth’s atmosphere and we have some insights about the ancient Earth’s atmosphere. For Venus, there are measurements done in the 70’s but we have only very partial data. Returning a sample from the Venus atmosphere would allow us to put strong constraints on the delivery of volatile elements to terrestrial planets soon after solar system formation. Indeed, the two planets are very similar in terms of size, position relative to the Sun etc. Yet, their respective evolution diverged, and it remains a mystery why. Another motivation is that we would return for the first time (if we do it before Mars Sample Return) a sample from another planet than Earth.”

For VATMOS-SR, the researchers aim to accomplish three primary scientific objectives: identifying the sources of volatile elements in Venus’ atmosphere, comparing today’s abundances of volatile elements to those when the planet first formed billions of years ago, and examining the gases that transferred from Venus’ interior to its atmosphere throughout the planet’s history (also called outgassing). To accomplish this, VATMOS-SR is designed to collect several liter-sized atmospheric samples approximately 110 kilometers (68 miles) above the surface of Venus while traveling at more than 10 kilometers per second (6 miles per second).

VATMOS-SR builds off a previous mission concept called Cupid’s Arrow, which was presented at the 49th Lunar and Planetary Science Conference in 2018, with the primary difference being VATMOS-SR will return the samples to Earth whereas Cupid’s Arrow was slated to analyze the samples while still at Venus. Like all mission concepts, the authors note there are advantages and limitations for VATMOS-SR.

“The great advantage is that instruments in our laboratories are very precise for determining the abundance and isotopic composition of volatile elements,” Dr. Avice tells Universe Today. “This is a much better situation compared to in-situ measurements by an instrument onboard a space probe which has numerous limitations. The limitation of the mission is that, in order to return the sample back to Earth, sampling will happen at high velocity (10-13 km/s) meaning that the gas will be fractionated. We can correct for this effect but this is a limitation of the mission. Another one is that sampling gas means that measurements have to be done quickly when back on Earth because any sampling device you could imagine will have a leak rate. We can use high-tech technology to preserve the gas but ideally the preliminary science will have to be done quickly after return.”

As noted, Earth and Venus are approximately the same size, with Venus’ diameter approximately 95 percent of Earth’s. Despite this, the two planets differ starkly in their characteristics, particularly their surface temperatures and pressures. While Earth’s average surface temperature is a livable 15 degrees Celsius (59 degrees Fahrenheit), Venus’ average surface temperature is a scorching 462 degrees Celsius (864 degrees Fahrenheit), which is hot enough to melt lead.

While Earth’s average surface pressure is measured at 14.7 pounds per square inch (psi), Venus’ average surface pressure is approximately 92 times higher, which is equivalent to experiencing the pressure at 900 meters (3,000 feet) underwater on Earth. This is due to Venus’ atmosphere being extremely dense and composed of carbon dioxide (~96.5 percent), leading to a runaway greenhouse effect. In contrast, while the atmosphere of the planet Mars is also composed largely of carbon dioxide (~95 percent), its atmosphere is much thinner, resulting in significantly lower average surface pressure.
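
That underwater comparison is easy to verify with a rough hydrostatic estimate. The short Python sketch below converts the difference between Venus’ and Earth’s surface pressures into an equivalent ocean depth; the seawater density and gravity values are round illustrative numbers, not anything from the mission study.

    # Rough check: convert Venus' ~92-atmosphere surface pressure into an
    # equivalent depth of seawater on Earth using the hydrostatic relation
    # P = rho * g * h. Density and gravity are round illustrative values.
    EARTH_SURFACE_PRESSURE_PA = 101_325                        # 1 atm, i.e. ~14.7 psi
    VENUS_SURFACE_PRESSURE_PA = 92 * EARTH_SURFACE_PRESSURE_PA
    SEAWATER_DENSITY = 1_025                                   # kg per cubic meter
    G = 9.81                                                   # m/s^2

    # Depth at which the water column alone supplies the extra pressure
    depth_m = (VENUS_SURFACE_PRESSURE_PA - EARTH_SURFACE_PRESSURE_PA) / (SEAWATER_DENSITY * G)
    print(f"Equivalent ocean depth: {depth_m:.0f} m")          # roughly 900 m, or about 3,000 ft

Therefore, despite the vast differences between Earth and Venus, what are the most significant results the team hopes to achieve with VATMOS-SR?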

“To understand the origin and evolution of the atmosphere of Venus to better understand Earth’s sister planet but also to understand what makes a planet habitable or not,” Dr. Avice tells Universe Today. “This is also extremely important to understand exoplanets because atmospheres of exoplanets are the only reservoir that can be measured remotely with telescopes. Understanding exoplanets thus requires to understand the composition of planetary atmospheres in our solar system.”

Regarding the fractionation concerns about obtaining the samples at such high speeds, Dr. Avice notes statistical studies have been conducted in collaboration with NASA showing promising results and notes the next steps will involve similar tests but with better probe designs.

Going from a concept to becoming an actual mission and delivering groundbreaking science often takes years to decades to happen, often involving several stages of ideas, scientific implications, systems analysis, designs, prototypes, re-designs, and funding availability. Once components and hardware are finally built, they are tested and re-tested to ensure maximum operational capacity since they can’t be fixed after launch. This ensures all systems function independently and together to achieve maximum mission success, including science data collection and transmitting data back to Earth in a timely and efficient manner.

For example, while NASA’s New Horizons spacecraft conducted its famous flyby of Pluto in July 2015, the mission concept was first proposed in August 1992, accepted as a concept in June 2001, and received funding approval in November 2001. It was finally launched in January 2006 and endured a nine-and-a-half-year journey to Pluto, sending back breathtaking images of the dwarf planet in July 2015. Therefore, what are the next steps to making VATMOS-SR a reality?

Dr. Avice tells Universe Today, “We gathered a European team of scientists and engineers together with American and Japanese colleagues to propose VATMOS-SR to the coming ESA call for F-class (fast) mission. The CNES (French space agency) is supporting VATMOS-SR and is providing a lot of help with engineers and specialists to build a strong case to answer this call. This call will be released next month and, if selected, VATMOS-SR will be under consideration by the European Space Agency with developing activities starting as soon as 2026.”

The VATMOS-SR concept comes as debate continues to rage regarding whether the atmosphere of Venus is capable of hosting life as we know it, since the upper atmosphere has been shown to exhibit Earth-like temperatures and pressures, a stark contrast to the surface of Venus. It is estimated that the habitable zone of Venus’ atmosphere lies between 51 kilometers (32 miles) and 62 kilometers (38 miles) above the surface, where temperatures range from 65 degrees Celsius (149 degrees Fahrenheit) down to -20 degrees Celsius (-4 degrees Fahrenheit), respectively. As noted, VATMOS-SR is slated to collect samples at approximately 110 kilometers (68 miles) above the surface, well above the estimated atmospheric habitable zone. Despite this, what can VATMOS-SR teach us about finding life in Venus’ atmosphere?

Dr. Avice tells Universe Today, “Nothing directly (and no chance to have live organisms in the gas samples) but VATMOS-SR will tell us why Venus became such an inhabitable place. This is of course linked to the question, ‘Is it possible that life appeared on Venus at some point in its history?’”

For now, VATMOS-SR remains a very intriguing mission concept with the goal of helping us unravel the history of Venus, and potentially the solar system, through an international collaboration between the United States, Europe (CNES), and Japan. While Dr. Avice is designated as the principal investigator, it was Dr. Christophe Sotin (a co-PI, professor at the University of Nantes, former senior research scientist at NASA JPL, and lead author of the Cupid’s Arrow study) who first proposed measuring Venus’ atmosphere.

What new insights into Venus’ evolutionary history could VATMOS-SR provide scientists in the coming years and decades? Only time will tell, and this is why we science!

As always, keep doing science & keep looking up!

The post Unlocking Venus’ Secrets with VATMOS-SR Mission Concept appeared first on Universe Today.

Categories: Science

Huge Release of Type 1a Supernovae Data

Universe Today Feed - Mon, 02/17/2025 - 6:02pm

Type Ia supernovae are extremely powerful events that occur in binary systems containing at least one white dwarf star – the core remnant of a Sun-like star. Sometimes, the white dwarf’s powerful gravity will siphon material from its companion star until it reaches critical mass and explodes. In another scenario, a binary system of two white dwarfs will merge, producing the critical mass needed for a supernova. Unlike regular supernovae, which occur every fifty years in the Milky Way, Type Ia supernovae happen roughly once every five hundred years.

In addition to being incredible events, Type Ia supernovae are useful tools for measuring cosmic distances. As part of the Cosmic Distance Ladder, these explosions allow astronomers to measure the distances to objects millions or billions of light-years away. This is vital to measuring the rate at which the Universe is expanding, otherwise known as the Hubble Constant. Thanks to an international team of researchers, a catalog of Type Ia supernovae has just been released that could change what we know of the fundamental physics of supernovae and the expansion history of the Universe.
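
To see why these explosions are such useful distance indicators, it helps to spell out the arithmetic of a “standard candle.” The sketch below is a minimal illustration, assuming a standardized peak absolute magnitude of about -19.3 and invented input numbers; none of it is drawn from the ZTF analysis itself.

    # Minimal standard-candle arithmetic: a standardized peak absolute magnitude M
    # plus an observed apparent magnitude m gives a luminosity distance via the
    # distance modulus, and a host-galaxy redshift then gives a Hubble-law estimate.
    # M_PEAK and the example inputs are illustrative assumptions, not ZTF results.
    M_PEAK = -19.3            # assumed standardized SN Ia peak absolute magnitude
    C_KM_S = 299_792.458      # speed of light in km/s

    def luminosity_distance_mpc(m_apparent: float) -> float:
        """Distance from the distance modulus m - M = 5*log10(d_pc) - 5."""
        d_pc = 10 ** ((m_apparent - M_PEAK + 5) / 5)
        return d_pc / 1e6

    def hubble_constant_estimate(m_apparent: float, redshift: float) -> float:
        """Low-redshift Hubble-law estimate H0 = c*z / d, in km/s/Mpc."""
        return C_KM_S * redshift / luminosity_distance_mpc(m_apparent)

    print(f"{luminosity_distance_mpc(16.0):.0f} Mpc")               # ~115 Mpc for m = 16
    print(f"{hubble_constant_estimate(16.0, 0.027):.0f} km/s/Mpc")  # roughly 70 km/s/Mpc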

This new catalog constitutes the second data release (DR2) from the Zwicky Transient Facility (ZTF), a wide-field astronomical survey that began in 2018. This survey relies on the ZTF camera on the 1.2-meter (4-foot) Samuel Oschin Telescope at the Palomar Observatory near San Diego, California. It has classified over 8,000 supernovae, including 3,628 nearby Type Ia supernovae (SNe Ia), more than doubling the number of SNe Ia discovered in the past 30 years. Despite these events being rare, ZTF’s depth and survey strategy have allowed the ZTF Collaboration to detect nearly four per night.

This catalog contains 3,628 nearby SNe Ia and is the first large and homogeneous dataset of its kind that astrophysicists can access. The release is detailed in a paper published on February 14th in Astronomy & Astrophysics, alongside a Special Issue containing 21 related publications. The paper’s lead authors are Dr. Mickael Rigault, head of the ZTF Cosmology Science working group and a research scientist at the Centre National de la Recherche Scientifique (CNRS) based at the Université Claude Bernard Lyon, and Dr. Matthew Smith, a Lecturer in Astrophysics at Lancaster University. As Dr. Rigault said:

“For the past five years, a group of thirty experts from around the world have collected, compiled, assembled, and analyzed these data. We are now releasing it to the entire community. This sample is so unique in terms of size and homogeneity that we expect it to significantly impact the field of Supernovae cosmology and to lead to many additional new discoveries in addition to results we have already published.”

The key component of the ZTF system is its 47-square-degree, 600-megapixel cryogenic CCD mosaic science camera. The camera scans the entire northern sky daily in three optical bands down to a limiting magnitude of 20.5, allowing it to detect nearly all supernovae within 1.5 billion light-years of Earth. Co-author Prof. Kate Maguire of Trinity College Dublin said, “Thanks to ZTF’s unique ability to scan the sky rapidly and deeply, we have captured multiple supernovae within days—or even hours—of [the] explosion, providing novel constraints on how they end their lives.”

The ultimate purpose of the survey is to determine the expansion rate of the Universe (aka the Hubble Constant). Since the late 1990s, when observations of distant SNe Ia were first used to measure cosmic expansion, astronomers have known that the expansion rate is accelerating. This demonstrated that the expansion rate changes over time and gave rise to the theory of Dark Energy. In addition, the ability to observe the Universe all the way back to roughly 1 billion years after the Big Bang led to the “Crisis in Cosmology.”

Also known as the “Hubble Tension,” this crisis arose when astronomers noticed that different methods of measuring the expansion rate, including the Cosmic Distance Ladder, produced different values. Since then, cosmologists have been looking for explanations for this Tension, which include the possibility of Early Dark Energy (EDE). A key part of this is obtaining truly accurate measurements of cosmic distances. Co-author Professor Ariel Goobar, the Director of the Oskar Klein Centre in Stockholm, one of the founding institutions of ZTF, was also a member of the team that discovered the accelerated expansion of the Universe in 1998.

“Ultimately, the aim is to address one of our time’s biggest questions in fundamental physics and cosmology, namely, what is most of the Universe made of?” he said. “For that, we need the ZTF supernova data.” One of the biggest takeaways from this catalog and the studies that went into creating it is that Type Ia supernovae vary with their host environment more than previously thought. As a result, the correction mechanism used to date needs revising, which could change how we measure the expansion rate of the Universe.

This could have consequences for the Standard Model of Cosmology – aka. the Lambda Cold Dark Matter (Lambda-CDM) model – and issues arising from it like the Hubble Tension. This data will be essential when the Nancy Grace Roman Space Telescope (RST) launches into space and begins making observations leading to the first wide-field maps of the Universe. Combined with observations by the ESA’s Euclid mission, these maps could finally resolve the mystery of Dark Matter and cosmic expansion. As Dr Rigault said:

“With this large and homogeneous dataset, we can explore Type Ia supernovae with an unprecedented level of precision and accuracy. This is a crucial step toward honing the use of Type Ia Supernovae in cosmology and assess[ing] if current deviations in cosmology are due to new fundamental physics or unknown problem[s] in the way we derive distances.”

Further Reading: Lancaster University, Astronomy & Astrophysics

The post Huge Release of Type 1a Supernovae Data appeared first on Universe Today.

Categories: Science

The Moon Solidified 4.43 Billion Years Ago

Universe Today Feed - Mon, 02/17/2025 - 3:40pm

What’s the story of our Moon’s early history? Despite all we know about our closest natural satellite, scientists are still figuring out bits of its history. New measurements of rocks gathered during the Apollo missions now show it solidified some 4.43 billion years ago. It turns out that’s about the time Earth became a habitable world.

University of Chicago scientist Nicolas Dauphas and a team of researchers made the measurements. They looked at the proportions of different elements inside Moon rocks, which provide a window into the Moon’s early epochs. The Moon started out as a fully molten blob after a collision between two early solar system bodies.

As it cooled and crystallized, the molten proto-moon separated into layers. Eventually, about 99% of the lunar magma ocean had solidified. The rest was a unique residual liquid called KREEP. That acronym stands for the elements potassium (K), rare earth elements (REE), and phosphorus (P).

Dauphas and his team analyzed this KREEP and found that it formed about 140 million years after the birth of the Solar System. It’s in the Apollo rocks and scientists hope to find it in samples from the South Pole-Aitken basin. This is the region where Artemis astronauts will eventually explore. If analysis confirms it there, then it indicates a uniform distribution of this KREEP layer across the lunar surface.

Understanding KREEP’s History on the Moon

The clues to the Moon’s ultimate “cooling off period” lie in a faintly radioactive rare earth element called lutetium. Over time, lutetium-176 decays into hafnium-176. In the early Solar System, all rocks had about the same amounts of lutetium. Its decay helps determine the age of the rocks where it exists.

However, the Moon’s solidification and subsequent formation of KREEP reservoirs didn’t result in a lot of lutetium compared to other rocks created at the same time. So, the scientists wanted to measure the proportions of lutetium and hafnium in Moon rocks and compare them to other bodies created around the same time—such as meteorites. That would allow them to calculate a more precise time for when the KREEP formed on the Moon.
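
The underlying arithmetic of such a chronometer is straightforward, even though real Lu-Hf work relies on isochrons and model ages rather than a single sample. The sketch below uses a commonly quoted lutetium-176 half-life of about 37 billion years and an invented daughter-to-parent ratio purely for illustration.

    # Basic decay arithmetic behind a parent-daughter chronometer such as Lu-Hf:
    # lutetium-176 decays to hafnium-176, so the ratio of radiogenic daughter to
    # remaining parent encodes elapsed time. Real analyses use isochrons and model
    # ages; the half-life and example ratio below are illustrative values only.
    import math

    LU176_HALF_LIFE_YR = 3.71e10                       # commonly quoted, ~37 billion years
    DECAY_CONSTANT = math.log(2) / LU176_HALF_LIFE_YR  # per year

    def decay_age_yr(radiogenic_daughter: float, remaining_parent: float) -> float:
        """Age from t = (1/lambda) * ln(1 + D*/P), assuming no initial radiogenic daughter."""
        return math.log(1.0 + radiogenic_daughter / remaining_parent) / DECAY_CONSTANT

    # An invented daughter/parent ratio of ~0.086 corresponds to roughly 4.4 billion years.
    print(f"{decay_age_yr(0.086, 1.0) / 1e9:.2f} billion years")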

They tested tiny samples of Moon rocks and looked at the ratio of hafnium in embedded lunar zircons. Through that analysis, they found that the rock ages are consistent with formation in a KREEP-rich reservoir. Those ages are consistent with the formation of KREEP reservoirs about 140 million years after the birth of the solar system, or about 4.43 billion years ago. “It took us years to develop these techniques, but we got a very precise answer for a question that has been controversial for a long time,” said Dauphas.

Placing KREEP in Perspective

Interestingly, the team’s results showed that lunar magma ocean crystallization occurred while leftover planetary embryos and planetesimals bombarded the Moon. Those objects were the birth “seeds” of the planets and the Moon, which formed after the Sun coalesced some 4.6 billion years ago. What remained from the formation of the planets continued to batter the already-formed planets.

The formation of the Moon itself began some 60 million years after the solar system itself was born. The most likely event was the collision of a Mars-sized world called Theia with the infant Earth. That sent molten debris into space and it began to coalesce to make the Moon. “We must imagine a big ball of magma floating in space around Earth,” said Dauphas. Shortly thereafter, that ball began to cool. That process eventually resulted in the formation of the lunar KREEP layers.

An artist’s conception of the cooling lunar magma ocean. Courtesy ESA.

The study of the decay of lutetium to hafnium in samples of those KREEP rocks is a big step forward in understanding the most ancient epoch of lunar history. More rock samples brought back from the South Pole-Aitken basin will help fill in the remaining blanks and help researchers clarify the timeline of both the cooling of the lunar rock and the subsequent creation of such rock deposits as the mare basalts. Those rock layers were created when impactors slammed into the lunar surface, generating lava flows that filled the impact basins.

The mare formed as a result of impacts later in the Moon’s early history, some 240 million years after the birth of the Solar System. Those impacts stimulated lava flows that covered less than 20 percent of the lunar surface and engulfed the oldest surfaces.

Timing is Everything

Fixing the dating of lunar cooling not only tells us about the history of the Moon but helps scientists understand Earth’s evolution. That’s because the impact that formed the Moon was probably also the last major impact on Earth. It could well mark a time when the Earth may have begun its transformation into a stable world. That’s an important step toward evolving into a place hospitable for life.

“This finding aligns nicely with other evidence—it’s a great place to be in as we prepare for more knowledge about the Moon from the Chang’e and Artemis missions,” said Dauphas. “We have a number of other questions that are waiting to be answered.”

For More Information

Lunar Rocks Help Scientists Pinpoint When the Moon Crystallized
Completion of Lunar Magma Ocean Solidification at 4.43 Ga
Moon Formation

The post The Moon Solidified 4.43 Billion Years Ago appeared first on Universe Today.

Categories: Science

An Unfinished Detector has Already Spotted the Highest-Energy Neutrino Ever Seen

Universe Today Feed - Mon, 02/17/2025 - 1:12pm

When it comes to particles, only photons are more abundant than neutrinos, yet detecting neutrinos is extremely difficult. Scientists have gone to extreme lengths to detect them, including building neutrino observatories in deep, underground mines and in the deep, clear ice of Antarctica.

One of their latest efforts to detect neutrinos is KM3NeT, which is still under construction at the bottom of the Mediterranean Sea. Though the neutrino telescope isn’t yet complete, it has already detected the most energetic neutrino ever observed.

The Universe is flooded with them, yet they’re extremely difficult to detect. They’re like tiny, abundant ghosts and are sometimes called “ghost particles.” They have no electric charge, which limits the ways they interact with matter. The fact that they only interact through gravity and the weak nuclear force explains their elusiveness.

Neutrinos can’t be seen and are only detected indirectly on the rare occasions when they interact with matter through the weak force. These interactions release Cherenkov radiation that detectors can sense. Detectors have to be very large to catch these rare interactions. KM3NeT (Cubic Kilometre Neutrino Telescope) features thousands of individual detectors in each of its two sections. Even at the end of 2024, KM3NeT was only about 10% complete, yet on February 13th, 2023, it had already detected an extraordinarily energetic neutrino.
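
The geometry behind those light signals is simple to sketch. A charged particle moving faster than light travels in water emits Cherenkov light along a cone whose opening angle depends only on the water’s refractive index and the particle’s speed; the short example below uses a typical deep-seawater refractive index of about 1.35 as an illustrative value.

    # Cherenkov angle: cos(theta) = 1 / (n * beta). For an ultra-relativistic muon
    # (beta ~ 1) in seawater (n ~ 1.35, a typical illustrative value) the light cone
    # opens at roughly 42 degrees, which is what lets the pattern and timing of
    # photomultiplier hits encode the particle's direction.
    import math

    def cherenkov_angle_deg(refractive_index: float = 1.35, beta: float = 1.0) -> float:
        return math.degrees(math.acos(1.0 / (refractive_index * beta)))

    print(f"{cherenkov_angle_deg():.1f} degrees")      # about 42 degrees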

The detection is presented in new research in Nature titled “Observation of an ultra-high-energy cosmic neutrino with KM3NeT.” The KM3NeT Collaboration is credited with authorship.

“The detection of cosmic neutrinos with energies above a teraelectronvolt (TeV) offers a unique exploration into astrophysical phenomena,” the paper states. “Here we report an exceptionally high-energy event observed by KM3NeT, the deep-sea neutrino telescope in the Mediterranean Sea, which we associate with a cosmic neutrino detection.”

This is an artist’s impression of a KM3NeT installation in the Mediterranean. Underwater neutrino detectors take advantage of location to track these fast particles. Image Courtesy Edward Berbee/Nikhef.

Though neutrinos themselves are undetectable, the muons created by their rare interactions with matter are detectable. In this detection, the muon’s energy level was 120 (+110/-60) petaelectronvolts (PeV). High-energy neutrinos like these are produced when “ultra-relativistic cosmic-ray protons or nuclei interact with other matter or photons,” according to the paper.

Because neutrinos seldom interact with matter and aren’t affected by magnetic fields, they could originate from extremely distant places in the Universe. These are called cosmogenic neutrinos rather than solar neutrinos, the more plentiful type that comes from the Sun. Cosmogenic neutrinos are more energetic than solar neutrinos because they’re created by cosmic rays from high-energy astrophysical phenomena like active galactic nuclei and gamma-ray bursts. Since they travel virtually unimpeded from distant sources, they can provide insights into their sources.

In terms of energy level, there are two types of neutrinos: atmospheric and cosmogenic. Cosmogenic neutrinos are more energetic and less plentiful than atmospheric neutrinos. “The neutrino energy is thus a crucial parameter for establishing a cosmic origin,” the paper states.

“The energy of this event is much larger than that of any neutrino detected so far,” the paper states. This could be the first detection of a cosmogenic neutrino and it could be the result of ultra-high energy cosmic rays that interact with background photons.

“Of interest in this article are neutrino interactions that produce high-energy muons, which can travel several kilometres in seawater before being absorbed,” the paper states. As these muons travel through the water, they lose energy. The amount of energy lost in each unit of travel is proportional to the muon’s energy level. By recording the signals and their time of arrival at different individual detectors in the KM3NeT array, scientists can then reconstruct the muon’s initial energy level and its direction.
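
A common way to describe that energy loss is the average relation dE/dx ≈ a + bE, where a covers ionization and b the radiative processes that dominate at high energy. The sketch below uses rough textbook-style values for water; the real KM3NeT reconstruction is a likelihood fit over stochastic losses, so this averaged picture is only illustrative.

    # Average muon energy loss in water, dE/dx ~ a + b*E, integrated analytically:
    # E(x) = (E0 + a/b) * exp(-b*x) - a/b. The coefficients are approximate
    # textbook-style values for water, used purely for illustration.
    import math

    A_IONIZATION = 0.27     # GeV lost per metre of water via ionization (approximate)
    B_RADIATIVE = 3.4e-4    # per metre of water (pair production, bremsstrahlung, photonuclear)

    def energy_after(e0_gev: float, distance_m: float) -> float:
        """Average remaining muon energy after a given path length in water."""
        ratio = A_IONIZATION / B_RADIATIVE
        return (e0_gev + ratio) * math.exp(-B_RADIATIVE * distance_m) - ratio

    # A muon starting near 100 PeV (1e8 GeV) still carries tens of PeV
    # after several kilometres of seawater in this averaged picture.
    for km in (1, 3, 5):
        print(km, "km:", f"{energy_after(1e8, km * 1000):.3g} GeV")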

This figure shows side and top views of the event in (a), with the Eiffel Tower shown for scale. The red line shows the reconstructed trajectory of the muon created by the neutrino interaction. The hits of individual photomultiplier tubes (PMTs) are represented by spheres stacked along the direction of the PMT orientations. Only the first five hits on each PMT are shown. The spheres are colour-coded relative to the first initial detection, and the larger they are, the more photons were detected, equating to energy level. Image Credit: The KM3NeT Collaboration, 2025.

“The muon trajectory is reconstructed from the measured times and positions of the first hits recorded on the PMTs, using a maximum-likelihood algorithm,” the paper states. The new detection is referred to as KM3-230213A. The 21 detection lines registered 28,086 hits, and by counting the number of PMTs that are triggered, the researchers can estimate the muon energy at the detector.

This figure shows the number of detections in a simulation of the KM3-230213A event. The simulation helps researchers determine the true muon energy. “The normalized distributions of the number of PMTs participating in the triggering of the event for simulated muon energies of 10, 100 and 1,000 PeV,” the authors write. The vertical dashed line indicates the observed value in KM3-230213A with 3,672 PMT detections. Image Credit: The KM3NeT Collaboration, 2025.

The KM3NeT Collaboration detected the most energetic neutrino ever observed while its detector was still incomplete, and that bodes well for the future. However, the incomplete facility did limit one aspect of the detection. There’s uncertainty about the direction it came from. “A dedicated sea campaign is planned in the future to improve the knowledge of the positions of the detector elements on the seafloor,” the authors write. Once that campaign is complete, the data from KM3-230213A will be recalibrated.

Still, the researchers learned something about the direction of its source, albeit with an uncertainty estimated to be 1.5°. At the vast distances involved, that’s a significant uncertainty. “The probability that KM3-230213A is of cosmic origin is much greater than any hypothesis involving an atmospheric origin,” the paper states.

The researchers identified some candidate sources.

“Extragalactic neutrino sources should be dominated by active galactic nuclei, and blazars are of particular interest considering the very-high energy of KM3-230213A,” the paper states. “To compile a census of potential blazar counterparts within the 99% confidence region of KM3-230213A, archival multiwavelength data were also explored.”

The researchers identified 12 potential source blazars in different survey catalogues.

The red star in this figure shows KM3-230213A. The three concentric red circles show the error regions within R(68%), R(90%) and R(99%). Selected source candidates and their directions are shown as coloured markers. The colours and marker type indicate the criterion according to which the source was selected, e.g. VLBI is Very Long Baseline Interferometry. The sources are numbered according to their proximity to KM3-230213A. Image Credit: The KM3NeT Collaboration, 2025.

Neutrinos are abundant yet elusive. They pass right through the Earth unimpeded, and about 100 trillion of them pass through our bodies every second. Detecting them is important because of what they can tell us about the Universe.

The extraordinary energy level of this neutrino is significant in neutrino astrophysics. It shows that nature can generate ultra-high-energy neutrinos, possibly from blazars, which are active galactic nuclei with jets pointed right at us.

As the paper concludes: “This suggests that the neutrino may have originated in a different cosmic accelerator than the lower-energy neutrinos, or this may be the first detection of a cosmogenic neutrino, resulting from the interactions of ultra-high-energy cosmic rays with background photons in the Universe.”

The post An Unfinished Detector has Already Spotted the Highest-Energy Neutrino Ever Seen appeared first on Universe Today.

Categories: Science

What Would Happen if a Tiny Black Hole Passed Through Your Body?

Universe Today Feed - Mon, 02/17/2025 - 12:53pm

In 1974, science fiction author Larry Niven wrote a murder mystery with an interesting premise: could you kill a man with a tiny black hole? I won’t spoil the story, though I’m willing to bet most people would argue the answer is clearly yes. Intense gravity, tidal forces, and the event horizon would surely lead to a messy end. But it turns out the scientific answer is a bit more interesting.

On the one hand, it’s clear that a large enough black hole could kill you. On the other hand, a black hole with the mass of a single hydrogen atom is clearly too small to be noticed. The real question is the critical mass. At what minimum size would a black hole become deadly? That’s the focus of a new paper on the arXiv.

The study begins with primordial black holes. These are theoretical bodies that may have formed in the earliest moments of the Universe and would be much smaller than stellar-mass black holes. Anywhere from atom-massed to a mass several times that of Earth. Although astronomers have never found any primordial black holes, observations do rule out several mass ranges. For example, any primordial black hole smaller than 10^12 kg would have already evaporated thanks to Hawking radiation. Anything larger than 10^20 kg would gravitationally lens stars in the Milky Way. Since we haven’t detected these lensing effects, they must at the very least be exceedingly rare. If they exist at all.

Some theoretical models argue that primordial black holes could be the source of dark matter. If that’s the case, observational limits constrain their masses to the 10^13 – 10^19 kg range, which is similar to the mass range for asteroids. Therefore, the study focuses on this range and looks at two effects: tidal forces and shock waves.

Tidal forces occur because the closer you get to a mass, the stronger its gravity. This means a black hole exerts a force differential on you as it gets near. So the question is whether this force differential is strong enough to tear flesh. Asteroid-mass black holes are less than a micrometer across, so even the tidal forces would cover a tiny area. If one passed through your midsection or one of your limbs, there might be some local damage, but nothing fatal. It would be similar to a needle passing through you.

But if the black hole passed through your head, that would be a different story. Tidal forces could tear apart brain cells, which would be much more serious. Since brain cells are delicate, even a force differential of 10 – 100 nanonewtons might kill you. But that would take a black hole at the highest end of our mass range.
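
Two numbers make this setup easier to picture: how small such a black hole actually is, and how the tidal pull across a single cell scales with mass and distance. The sketch below computes the Schwarzschild radius and the raw Newtonian force differential across a roughly 10-micrometre, nanogram-scale cell; the masses, distances, and cell properties are illustrative choices, not the paper’s inputs, and the real damage estimate also depends on how briefly the force acts as the hole streaks past at hundreds of kilometres per second.

    # Schwarzschild radius r_s = 2*G*M/c^2 and the Newtonian tidal-force differential
    # dF ~ 2*G*M*m_cell*dr / r^3 across a cell of size dr at miss distance r.
    # All specific numbers below are illustrative, not taken from the paper.
    G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8              # speed of light, m/s

    def schwarzschild_radius_m(mass_kg: float) -> float:
        return 2.0 * G * mass_kg / C**2

    def tidal_force_n(mass_kg: float, distance_m: float,
                      cell_size_m: float = 1e-5, cell_mass_kg: float = 1e-12) -> float:
        return 2.0 * G * mass_kg * cell_mass_kg * cell_size_m / distance_m**3

    print(schwarzschild_radius_m(1e15))     # ~1.5e-12 m: far smaller than an atom
    print(tidal_force_n(1e17, 0.01))        # differential force in newtons at 1 cm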

Shockwaves would be much more dangerous. In this case, as a black hole entered your body, it would create a density wave that would ripple through you. These shockwaves would physically damage cells and transfer heat energy that would do further damage. To create a shockwave of energy similar to that of a 22-caliber bullet, the black hole would only need a mass of 1.4 x 10^14 kg, which is well within the range of possible primordial black holes.

So yes, a primordial black hole could kill you.

While that makes for a great story, it would never happen in real life. Even if asteroid-mass primordial black holes exist, the number of them out there compared to the vastness of space means that the odds of it happening to anyone in their lifetime are less than one in 10 trillion.

Reference: Niven, Larry. “The Hole Man.” Analog Science Fiction/Science Fact (1974): 93-104.

Reference: Robert J. Scherrer. “Gravitational Effects of a Small Primordial Black Hole Passing Through the Human Body.” arXiv preprint arXiv:2502.09734 (2025).

The post What Would Happen if a Tiny Black Hole Passed Through Your Body? appeared first on Universe Today.

Categories: Science

Novel catalyst development for sustainable ammonia synthesis

Matter and energy from Science Daily Feed - Mon, 02/17/2025 - 10:34am
A groundbreaking study explores Ba-Si orthosilicate oxynitride-hydride (Ba3SiO5-xNyHz) as a sustainable catalyst for ammonia synthesis, offering a potential alternative to traditional transition metal-based systems. Synthesized through low-temperature solid-state reactions and enhanced with ruthenium nanoparticles, these compounds demonstrated improved catalytic performance under milder conditions, providing a more energy-efficient route to ammonia production. This approach also addresses the environmental challenges associated with conventional methods, signaling a shift toward greener industrial practices in ammonia production.
Categories: Science

Breaking the pattern: How disorder toughens materials

Matter and energy from Science Daily Feed - Mon, 02/17/2025 - 10:34am
Researchers have found that adding just the right amount of disorder to the structure of certain materials can make them more than twice as resistant to cracking.
Categories: Science

AI-generated optical illusions can sort humans from bots

New Scientist Feed - Mon, 02/17/2025 - 10:00am
Artificial intelligences fail to identify optical illusions in images created by other AIs – so these images could form the basis of a new kind of CAPTCHA test
Categories: Science

The Atlantic on the government’s attacks on science

Why Evolution is True Feed - Mon, 02/17/2025 - 9:15am

A lot of what the Trump administration is doing is aimed at health and science, and not necessarily in a good way.  The most obvious blunder is the appointment of Robert F. Kennedy Jr., a palpably unqualified man with some bizarre views, as Secretary of Health and Human Services,  the person who advises the President on all health matters. Given Trump’s abysmal ignorance of science, having someone like RFK Jr. guiding government policy is scary.

There’s a lot of beefing as well about the government cutting the “overhead” (money given to universities, supposedly to support the infrastructure of grants) uniformly to 15%, down from over 60% in some cases (each university negotiates its rate with the government). This slashing will reduce university budgets substantially. But in cases in which a university has a huge endowment, like Harvard ($53 billion), I can’t shed many tears over that. Given that in many cases we simply don’t know where overhead goes, the assumption has been that many schools simply use it as a source of money for almost anything, and that means that the taxpayers are unwittingly subsidizing not just scientific research, but universities in general.

At any rate, the potential damage that the Trump administration will do to American science is outlined in this new Atlantic article by Katherine Wu. It doesn’t cohere like a good science piece should, but it at least lays out some scary things in store for American science. To me, the scariest is the hiding of already-obtained scientific results, financed by taxpayers, that were publicly available but are no longer so.

Click below to see the article, or find it archived here.

First, the payoff for funding science. I hope this is accurate as it’s characterizing science as “research and development”:

Every dollar invested in research and development has been estimated to return at least $5 on average—billions annually.

It also looks as if the National Science Foundation is on the chopping block:

The administration’s actions have also affected scientific pursuits in ways that go beyond those orders. The dismantling of USAID has halted clinical trials abroad, leaving participants with experimental drugs and devices still in their bodies. Last week, NIH announced that it would slash the amount its grants would pay for administrative costs—a move that has since been blocked by a federal judge but that would substantially hamper entire institutions from carrying out the day-to-day activities of research. The administration is reportedly planning to cut the budget for the National Science Foundation. Mass layoffs of federal workers have also begun, and two NIH scientists (who asked not to be identified for fear of professional repercussions) told me they participated in a meeting this morning in which it was announced that thousands of staff across the Department of Health and Human Services would be let go starting today. Robert F. Kennedy Jr. has now become the head of that department, after two confirmation hearings in which he showed a lack of basic understanding of the U.S. health system and a flagrant disregard for data that support the safety and effectiveness of various lifesaving vaccines. (The White House did not return repeated requests for comment.)

It’s not clear whether the DEI restrictions described in the previous post will severely impede science. Wu says this:

Many also expect that the moratorium on DEI-focused programming will have severe impacts on who is able to do the work of science—further impeding women, people of color, and other groups underrepresented in the field from entering and staying in it.

But it’s not clear the restrictions will have that effect, nor that making science more “diverse” (not just via race, but in other traits) will improve our understanding of nature.

There are restrictions on Social-Justice-aimed projects, but again, many of these have been a waste of money and effort, performative efforts not aimed at understanding science, and we will simply have to see how this shakes out. But those who do such work are beefing about what the government did. Here’s an example of a peeved but woke scientist whose work I’ve often criticized (click screenshot to go to thread). Most of the commenters don’t support Fuentes’s griping:

 

One problem is that the government is looking for suspicious grants by doing word searches, and those searches include terms like “environment,” “climate,” and “race.” It’s a quick way to find suspicious grants, but you have to evaluate their quality, not simply defund them because they come up in a keyword search.

Here’s what I find most distressing about what the government did (besides appointing RFK Jr.):

In yesterday’s executive order, Trump highlighted the importance of “protecting expert recommendations from inappropriate influence and increasing transparency regarding existing data.” But that is exactly what the administration’s critics have said it is already failing to do. At the end of last month, the CDC purged its website of several decades’ worth of data and content, including an infectious-disease-surveillance tool as well as surveys tracking health-risk behaviors among youths. (On Tuesday, a federal judge ordered the government to restore, for now, these and other missing data and webpages to their pre-purge state.) And as soon as the Trump administration started pulling data sets from public view, scientists started worrying that those data would reappear in an altered form, or that future scientific publications would have to be modified.

I’m not as worried about the reappearance of data in altered form as I am about the simple removal of data—data funded by us, the American taxpayers—from public view. Fortunately, a judge stopped the data removal, but that may be temporary.

What will be the outcome? While Wu thinks this will reduce trust in science, I’m not so sure about that, especially given that trust in science fell strongly during the Biden administration, and trust is reduced simply because science is getting mixed up with politics in every administration. What worries me more is the vulnerability of science to the whims of the administration—an administration that seems to care more about key words than about research itself. My view is that the government is entitled to vet science funding and cut waste if it wants, but that governments are poorly equipped to judge scientific merit. A grant that looks wasteful may come up with useful results, though of course there are some that simply look like government-funded virtue flaunting. It’s best if a generous dollop of money is allocated to science, and then scientists themselves decide how to dole it out, for they are the best equipped people to do so. In this I agree with Wu’s conclusion:

There will undoubtedly be periods, in the coming weeks and months, when the practice of science feels normal. Many scientists are operating as they usually do until they are told otherwise. But that normalcy is flimsy at best, in part because the Trump administration has shown that it may not care what data, well collected or not, have to say. During his Senate confirmation hearings, Kennedy repeatedly refused to acknowledge that vaccines don’t cause autism, insisting that he would do so only “if the data is there.” Confronted by Senator Bill Cassidy with decades of data that were, in fact, there, he continued to equivocate, at one point attempting to counter with a discredited paper funded by an anti-vaccine group.

In all likelihood, more changes are to come—including, potentially, major budgetary cuts to research, as Congress weighs this year’s funding for the nation’s major research agencies. Trump and his administration are now deciding how deep a rift to make in America’s scientific firmament. How long it takes to repair the damage, or whether that will be possible at all, depends on the extent of the damage they inflict now.

I’m just glad that I don’t have to apply for science grants any more.

Categories: Science

CAR T-cells enable record-breaking 18-year nerve cancer remission

New Scientist Feed - Mon, 02/17/2025 - 8:00am
A person with neuroblastoma, which occurs when developing nerve cells in children turn cancerous, has remained tumour-free for over 18 years thanks to CAR T-cell therapy
Categories: Science

From headaches to tics, how mass nocebo effects spread real symptoms

New Scientist Feed - Mon, 02/17/2025 - 8:00am
Social media is enabling health symptoms and mass psychogenic illnesses to spread quickly around the world. But by knowing how it happens, you can protect yourself
Categories: Science

Most DEI endeavors in higher education are declared illegal

Why Evolution is True Feed - Mon, 02/17/2025 - 7:30am

The time has come that many have feared but many will celebrate: DEI (“diversity, equity, and inclusion”) is effectively gone from campuses by federal order.

Inside Higher Ed reports; click headline to read:

An excerpt:

The Education Department’s Office for Civil Rights declared all race-conscious student programming, resources and financial aid illegal over the weekend and threatened to investigate and rescind federal funding for any institution that does not comply within 14 days.

In a Dear Colleague letter [JAC: see below] published late Friday night, acting assistant secretary for civil rights Craig Trainor outlined a sweeping interpretation of the Supreme Court’s 2023 ruling in Students for Fair Admissions v. Harvard, which struck down affirmative action. While the decision applied specifically to admissions, the Trump administration believes it extends to all race-conscious spending, activities and programming at colleges.

. . . . .The letter mentions a wide range of university programs and policies that could be subject to an OCR investigation, including “hiring, promotion, compensation, financial aid, scholarships, prizes, administrative support, discipline, housing, graduation ceremonies, and all other aspects of student, academic, and campus life.”

“Put simply, educational institutions may neither separate or segregate students based on race, nor distribute benefits or burdens based on race,” Trainor writes.

Backlash to the letter came swiftly on Saturday from Democratic lawmakers, student advocates and academic freedom organizations.

“This threat to rip away the federal funding our public K-12 schools and colleges receive flies in the face of the law,” Senator Patty Murray, Democrat of Washington, wrote in a statement Saturday. “While it’s anyone’s guess what falls under the Trump administration’s definition of ‘DEI,’ there is simply no authority or basis for Trump to impose such a mandate.”

But most college leaders have, so far, remained silent.

Since virtually every institution of higher learning depends on some federal funding, this gives colleges the choices of abandoning DEI or abandoning federal money. You know which they’ll prefer. The former, of course, but they’ll try to have both, sometimes by duplicitous practices.

Since the Supreme Court has declared that universities can’t use race as a basis for admitting students, but will allow them to identify their race in essays (this is a backdoor many colleges use to promote affirmative action), the letter also deals with that:

The Dear Colleague letter also seeks to close multiple exceptions and potential gaps left open by the Supreme Court ruling on affirmative action and to lay the groundwork for investigating programs that “may appear neutral on their face” but that “a closer look reveals … are, in fact, motivated by racial considerations.”

Chief Justice John Roberts wrote that colleges could legally consider a student’s racial identity as part of their experience as described in personal essays, but the OCR letter rejects that.

“A school may not use students’ personal essays, writing samples, participation in extracurriculars, or other cues as a means of determining or predicting a student’s race and favoring or disfavoring such students,” Trainor wrote.

It would be hard to determine, though, whether colleges are actually doing this. Essays and the like aren’t banned—only their use for race-based admissions, and that would be a lot harder to prove than what Harvard did, which was give Asian applicants lower “personality scores” in a way that could be statistically affirmed. Further, the elimination of standardized tests as a requirement for application—another backdoor approach to promoting affirmative action—is also now banned:

Going even further beyond the scope of the SFFA decision, the letter forbids any race-neutral university policy that could conceivably be a proxy for racial consideration, including eliminating standardized test score requirements.

The department has never revoked a college or state higher education agency’s federal funding over Title VI violations. If the OCR follows through on its promises, it would be an unprecedented exercise of federal influence over university activities.

The letter is likely to be challenged in court, but in the meantime it could have a ripple effect on colleges’ willingness to continue funding diversity programs and resources for underrepresented students.

On top of that, there will be no more race or gender-based graduation ceremonies (Harvard had at least ten “affinity graduations”), no more ethnically-segregated dormitories, no more segregation of any type. As the letter notes (my emphasis):

Although SFFA addressed admissions decisions, the Supreme Court’s holding applies more broadly. At its core, the test is simple: If an educational institution treats a person of one race differently than it treats another person because of that person’s race, the educational institution violates the law. Federal law thus prohibits covered entities from using race in decisions pertaining to admissions, hiring, promotion, compensation, financial aid, scholarships, prizes, administrative support, discipline, housing, graduation ceremonies, and all other aspects of student, academic, and campus life. Put simply, educational institutions may neither separate or segregate students based on race, nor distribute benefits or burdens based on race

Of course this will be challenged in court, though I don’t see a clear reason why the executive branch can’t make such a policy since the Supreme Court has disallowed race-based admissions.  In the meantime, you can find the whole letter at this site (this one was sent to Harvard, but they’re all the same), or click on the screenshots below, where I’ve given just a short excerpt.  Colleges will be poring over the whole four-page letter.

My Chicago colleague Dorian Abbot, who’s opposed to DEI, wrote a short piece about this on Heterodox Substack with this information about how to report violations:

If you want to report something but are concerned about potential retaliation, Jonathan Mitchell at Faculty, Alumni, & Students Opposed to Racial Preferences (FASORP) has offered to file the complaints with OCR. You can give information anonymously at the FASORP website, including any documents, websites, or other relevant information. The website does not track IP addresses and you can use a VPN before navigating to it if you want to be extra safe.

If you have any information about ongoing illegal discrimination, it is essential to report it as soon as possible. General Counsel at every educational institution needs to quickly understand and advise their administration that discrimination really is illegal and must stop immediately.

As for me, I have mixed feelings, and have gone back and forth on this issue in the past few years. On the one hand, I’m strongly opposed to requiring DEI statements for hiring or promotion.  This is illegal compelled speech and, in fact, is banned by the University of Chicago’s 1970 Shils Report. Nor do I think that there should be preferential admission on the basis of race, nor the elimination of standardized tests as a sneaky way to increase “diversity”, though I have suggested that when two candidates are equally qualified, the minority candidate might be favored.

The fact is that, historically, minorities have been disadvantaged by bias in a way that has affected them over the long term. In my view, the way to remedy this is not through “equity”—a misguided claim that groups should be represented in all institutions in the same proportion as in the general population.  The proper remedy is equal opportunity, but of course that is a much harder remedy than simply forcing equity on institutions through preferential treatment. But equal opportunity from birth is the only way to guarantee that groups are truly treated equally now, and seems the fairest solution.

Besides the possibility of preferential admission when students have equal records (this is of course illegal under the present “Dear Colleague” letter), the only DEI that I think colleges and universities need is a small office—or even just a procedure—for dealing with reported instances of bias against students or university members, and those reports cannot be anonymous. In the meantime, DEI should consist of promulgating these two statements:

1.) All students should be treated equally regardless of ethnicity, religion, disability, ideology, and so on

2.) Any instances of bias or harassment of students can be reported here (give link or location).

It will be interesting to see what happens in the next three years, but we can be sure that once the Democrats re-assume power, all of the above will be deep-sixed.

Categories: Science

Readers’ wildlife photos

Why Evolution is True Feed - Mon, 02/17/2025 - 6:15am

Today we have volume IV of Robert Lang’s 13-set series of photos from his recent trip to the Pantanal, today featuring birds. Robert’s captions are indented, and you can enlarge the photos by clicking on them.

Readers’ Wildlife Photos: The Pantanal, Part IV: Birds

Continuing the account of our recent journey to the Pantanal in Brazil: by far the largest category of observation and photography was birds. We saw over 100 different species of birds (and this was not even a birding-specific trip, though the outfitter also organizes those for the truly hard core).

Not all of what we saw was so gracious as to pose sufficiently close, still, and well-lighted to get a good photo, but the Pantanal still offered much better photo opportunities than did the Amazon a few years ago, where most of the birds presented as a tiny black silhouette high in a distant tree. Although I usually try to say a few words about each photo in my RWP contributions, there are just too many here, so for most I’ll just give the name and species and move on, proceeding alphabetically by common name. (Species identifications are courtesy of our guide, augmented sometimes by Merlin Bird ID. Corrections gratefully accepted.)

A female anhinga (Anhinga anhinga), in its characteristic holding-out-the-wings-to-dry pose:

Bare-faced curassows (Crax fasciolata), male on the left, female on the right:

And a female with its crest up:

A bare-faced ibis (Phimosus infuscatus):

Black-backed water tyrant (Fluvicola albiventer). Quite a scary name for such a small, unassuming bird:

Black-bellied whistling ducks (Dendrocygna autumnalis):

Black-collared hawk (Busarellus nigricollis), this one flying:

A black-crowned night heron (Nycticorax nycticorax):

A black-fronted nunbird (Monasa nigrifrons):

And that’s all for this installment. We’re not even out of the B’s. (Heck, we’re not even out of the “black-“s!) More to come soon!

Categories: Science

Breaking the Curse of the Habitable Zone

Universe Today Feed - Mon, 02/17/2025 - 6:13am

The Habitable Zone is a central concept in our search for life beyond Earth. Is it time to abandon it?

The Habitable Zone is defined as the region around a star where liquid water can exist on the surface of a planet. At first glance, that seems like a good starting place to hunt for alien life in other systems. After all, there’s only one kind of life known in the universe (ours) and it exists in the Habitable Zone of the Sun.
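
The usual way to make that definition quantitative is through a planet’s equilibrium temperature, which depends only on the star’s temperature and radius, the orbital distance, and the albedo. The toy calculation below (with standard round values for the Sun and an assumed albedo of 0.3) ignores atmospheres entirely, which is exactly the kind of simplification the rest of this piece pushes back on.

    # Equilibrium temperature: T_eq = T_star * sqrt(R_star / (2*d)) * (1 - A)^0.25.
    # This ignores any greenhouse effect, so it is only a first-pass version of the
    # "liquid water on the surface" criterion. All inputs are round illustrative values.
    T_SUN = 5772.0        # K
    R_SUN = 6.957e8       # m
    AU = 1.496e11         # m

    def equilibrium_temp_k(distance_au: float, albedo: float = 0.3,
                           t_star: float = T_SUN, r_star: float = R_SUN) -> float:
        d = distance_au * AU
        return t_star * (r_star / (2.0 * d)) ** 0.5 * (1.0 - albedo) ** 0.25

    for d_au in (0.72, 1.0, 1.52):      # roughly Venus, Earth, and Mars
        print(d_au, "AU:", f"{equilibrium_temp_k(d_au):.0f} K")

By this crude measure Earth itself sits at about 255 K, well below freezing; it is the atmosphere that closes the gap, which already hints at why the boundaries of the Habitable Zone are fuzzier than the simple definition suggests.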

But researchers have long noted that the Habitable Zone concept is far too restrictive. Besides the examples of the icy moons in our own solar system, life itself is able to alter the chemistry of a planet, shifting its ability to retain or remove heat, meaning that the nominally uninhabitable regions of a distant system might be more clement than we thought.

Even if we restrict ourselves to the basic biochemistry that makes Earthly life possible, we have many more options than we naively thought. Hycean worlds, planets thought to be covered by global oceans beneath thick hydrogen atmospheres, once thought to be too toxic for any kind of life, might be even more suitable for it than terrestrial worlds.

What about tidally-locked planets around red dwarf stars, like our nearest neighbor Proxima b and the intriguing system of TRAPPIST-1? Conditions on those planets might be hellish, with one side facing the incessant glare of its star and the other locked in permanent night. Neither of those extremes seem suitable for life as we know it. But even those worlds can support temperate atmospheres if the conditions are just right. A delicate balancing act for sure, but a balancing act that every life-bearing planet must walk.

Our galaxy contains billions of dead stars, the white dwarfs and neutron stars. We know of planets in those systems. Indeed, the first exoplanets were discovered around a pulsar. Sometimes those dead stars retain planets from their former lives; other times the planets assemble anew from the stellar wreckage. In either case, the stars, though dead, are still warm, providing a source of energy for any life that might find a home there. And considering the sheer longevity of those stars and the incredibly long history of our galaxy, life has had many chances to appear – and sustain itself – in systems that are now dead.

Who needs planets, anyway? Methanogens could take advantage of the exotic, cold chemistry of molecular clouds, feasting on chemicals processed by millennia of distant high-energy starlight. It might even be possible for life to sustain itself in a free-floating biological system, with the gravity of its own mass holding on to an atmosphere. It's a wild concept, but all the foundational functions of a free-floating habitat – scaffolding, energy capture and storage, semi-permeable membranes – are found in terrestrial life.

We should absolutely continue our current searches – after all, they’re not groundless. But before we invest in the next generation of super-telescopes, we should pause and reconsider our options. We should invest in research that pushes the edges of what life means and where it can exist, and we should explore pathways to identifying and observing those potential habitats. Only after we have extended research along these lines can we decide on a best-case strategy.

In other words, we should replace the goal of finding life like our own with the broader vision of finding life wherever we can. Nature has surprised us many times in the past, and we shouldn't let our biases and assumptions get in the way of our path of discovery.

The post Breaking the Curse of the Habitable Zone appeared first on Universe Today.

Categories: Science

Eight habits that could keep your heart healthy

New Scientist Feed - Mon, 02/17/2025 - 6:00am
From staying active to getting plenty of sleep, there are many ways to keep your heart healthy
Categories: Science

How to Visualize a Wave Function

Science blog of a physics theorist Feed - Mon, 02/17/2025 - 5:45am

Before we knew about quantum physics, humans thought that if we had a system of two small objects, we could always know where they were located — the first at some position x₁, the second at some position x₂. And after Isaac Newton's breakthroughs in the late 17th century, we believed that by combining this information with knowledge of the objects' motions and the forces acting upon them, we could calculate where they would be in the future.

But in our quantum world, this turns out not to be the case. Instead, in Erwin Schrödinger's 1925 view of quantum physics, our system of two objects has a wave function which, for every possible x₁ and x₂ that the objects could have, gives us a complex number Ψ(x₁, x₂). The absolute-value-squared of that number, |Ψ(x₁, x₂)|², is proportional to the probability for finding the first object at position x₁ and the second at position x₂ — if we actually choose to measure their positions right away. If instead we wait, the wave function will change over time, following Schrödinger's wave equation. The updated wave function's square will again tell us the probabilities, at that later time, for finding the objects at those particular positions.

The set of all possible object locations x₁ and x₂ is what I am calling the “space of possibilities” (also known as the “configuration space”), and the wave function Ψ(x₁, x₂) is a function on that space of possibilities. In fact, the wave function for any system is a function on the space of that system’s possibilities: for any possible arrangement X of the system, the wave function will give us a complex number Ψ(X).
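To make “a function on the space of possibilities” concrete, here is a minimal numerical sketch (my own illustration, not from the post, with made-up centers and widths): a two-object wave function built as a product of two Gaussian factors, evaluated on a grid of possible positions (x₁, x₂), with |Ψ(x₁, x₂)|² giving the relative probabilities.

```python
import numpy as np

# Hypothetical two-object wave function on the space of possibilities (x1, x2):
# each object sits in its own Gaussian packet, purely for illustration.
def psi(x1, x2, center1=-2.0, center2=3.0, width=1.0):
    """Complex-valued Psi(x1, x2); not normalized."""
    return (np.exp(-(x1 - center1)**2 / (4 * width**2))
            * np.exp(-(x2 - center2)**2 / (4 * width**2))).astype(complex)

# Evaluate on a grid of possible configurations and form probabilities ~ |Psi|^2.
x1, x2 = np.meshgrid(np.linspace(-8, 8, 200), np.linspace(-8, 8, 200), indexing="ij")
prob = np.abs(psi(x1, x2))**2
prob /= prob.sum()                     # normalize over the grid

# The most likely configuration is where |Psi|^2 peaks.
i, j = np.unravel_index(np.argmax(prob), prob.shape)
print(f"most likely configuration: x1 = {x1[i, j]:.2f}, x2 = {x2[i, j]:.2f}")
```

The same pattern extends to any system: the wave function assigns one complex number to each possible arrangement, and its squared absolute value assigns the corresponding probability.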

Drawing a wave function can be tricky. I’ve done it in different ways in different contexts. Interpreting a drawing of a wave function can also be tricky. But it’s helpful to learn how to do it. So in today’s post, I’ll give you three different approaches to depicting the wave function for one of the simplest physical systems: a single object moving along a line. In coming weeks, I’ll give you more examples that you can try to interpret. Once you can read a wave function correctly, then you know your understanding of quantum physics has a good foundation.

For now, everything I’ll do today is in the language of 1920s quantum physics, Schrödinger style. But soon we’ll put this same strategy to work on quantum field theory, the modern language of particle physics — and then many things will change. Familiarity with the more commonly discussed 1920s methods will help you appreciate the differences.

Complex Numbers

Before we start drawing pictures, let me remind you of a couple of facts from pre-university math about complex numbers. The fundamental imaginary number i is the square root of minus one, i = √(−1), which we can multiply by any real number to get another imaginary number, such as 4i or −17i. A complex number is the sum of a real number and an imaginary number, such as 6 + 4i or 11 − 17i.

More abstractly, a complex number w always takes the form u + i v, where u and v are real numbers. We call u the “real part” of w and we call v the “imaginary part” of w. And just as we can draw a real number using the real number line, we can draw a complex number using a plane, consisting of the real number line combined with the imaginary number line; in Fig. 1 the complex number w is shown as a red dot, with the real part u and imaginary part v marked along the real and imaginary axes.

Figure 1: Two ways of representing the complex number w, either as u + i v or as |w|e^(iφ).

Fig. 1 shows another way of representing w. The line from the origin to w has length |w|, the absolute value of w, with |w|² = u² + v² by the Pythagorean theorem. Defining φ as the angle between this line and the real axis, and using the following facts

  • u = |w| cos φ
  • v = |w| sin φ
  • e^(iφ) = cos φ + i sin φ

we may write w = |w|e^(iφ), which indeed equals u + i v.

Terminology: φ is called the “argument” or “phase” of w, and in math is written φ = arg(w).
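As a quick sanity check of these relations (a sketch of my own, not from the post), Python’s built-in complex numbers and the cmath module let you move between the u + i v form and the |w|e^(iφ) form directly:

```python
import cmath

w = 6 + 4j                          # u = 6, v = 4 (the example from above)

r = abs(w)                          # |w| = sqrt(u^2 + v^2), about 7.21
phi = cmath.phase(w)                # arg(w), about 0.588 radians

# The polar form |w| * e^(i*phi) should reproduce u + i*v.
w_polar = r * cmath.exp(1j * phi)
print(abs(w - w_polar) < 1e-12)     # True
print(r**2, w.real**2 + w.imag**2)  # both about 52: |w|^2 = u^2 + v^2
```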

One Object in One Dimension

We’ll focus today only on a single object moving around on a one-dimensional line. Let’s put the object in a “Gaussian wave-packet state” of the sort I discussed in this post’s Figs. 3 and 4 and this one’s Figs. 6 and 7. In such a state, neither the object’s position nor its momentum [a measure of its motion] is completely definite, but the uncertainty is minimized in the following sense: the product of the uncertainty in the position and the uncertainty in the momentum is as small as Heisenberg’s uncertainty principle allows.

We’ll start with a state in which the uncertainty on the position is large while the uncertainty on the momentum is small, shown below (and shown also in Fig. 3 of this post and Fig. 6 of this post.) To depict this wave function, I am showing its real part Re[Ψ(x)] in red and its imaginary part Im[Ψ(x)] in blue. In addition, I have drawn in black the square of the wave function:

  • |Ψ(x)|² = (Re[Ψ(x)])² + (Im[Ψ(x)])²

[Note for advanced readers: I have not normalized the wave function.]

Figure 2: For an object in a simple Gaussian wave packet state with near-definite momentum, a depiction of the wave function for that state, showing its real and imaginary parts in red and blue, and its absolute-value squared in black.
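If you want to build such a state yourself, here is a minimal numpy/matplotlib sketch (my own construction, with made-up parameter values and units in which ħ = 1, not the code behind the actual figure): a broad Gaussian envelope multiplied by an oscillating factor e^(i p₀ x), from which the red, blue, and black curves follow.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-20, 20, 2000)
x0, sigma, p0 = -5.0, 3.0, 2.0     # center, width (large: position uncertain), momentum (nearly definite)

# Gaussian wave packet: broad envelope times the "waving" phase factor e^(i*p0*x).
psi = np.exp(-(x - x0)**2 / (4 * sigma**2)) * np.exp(1j * p0 * x)

re, im = psi.real, psi.imag        # the red and blue curves
prob = np.abs(psi)**2              # the black curve
assert np.allclose(prob, re**2 + im**2)   # |Psi|^2 = (Re Psi)^2 + (Im Psi)^2

plt.plot(x, re, "r", label="Re Psi")
plt.plot(x, im, "b", label="Im Psi")
plt.plot(x, prob, "k", label="|Psi|^2")
plt.legend(); plt.xlabel("x"); plt.show()
```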

But as wave functions become more complicated, this way of doing things isn’t so convenient. Instead, it is sometimes useful to represent the wave function in a different way, in which we plot |Ψ(x)| as a curve whose color reflects the value of φ = arg[Ψ(x)], the argument of Ψ(x). In Fig. 3, I show the same wave function as in Fig. 2, depicted in this new way.

Figure 3: The same wave function as in Fig. 2; the curve is the absolute value of the wave function, colored according to its argument.

As φ cycles from 0 to π/2 to π to 3π/2 and back to 2π (the same as φ = 0), the color cycles from red to yellow-green to cyan to blue-purple and back to red.

Compare Figs. 2 and 3; it’s the same information, depicted differently. That the wave function is actually waving is clear in Fig. 2, where the real and imaginary parts have the shape of waves. But it is also represented in Fig. 3, where the cycling through the colors tells us the same thing. In both cases, the waving tells us that the object’s momentum is non-zero, and the steadiness of that waving tells us that the object’s momentum is nearly definite.
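One rough way to reproduce this phase-colored style yourself (again a sketch under my own assumptions, not the code behind the figures) is to plot the points of the |Ψ(x)| curve and color each one by arg Ψ(x) using a cyclic colormap, so that a phase of 0 and a phase of 2π get the same color:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-20, 20, 2000)
psi = np.exp(-(x + 5.0)**2 / (4 * 3.0**2)) * np.exp(1j * 2.0 * x)   # same packet as before

amplitude = np.abs(psi)            # height of the curve
phase = np.angle(psi)              # arg Psi(x), in (-pi, pi]

# "hsv" is a cyclic colormap, so phi and phi + 2*pi map to the same color.
plt.scatter(x, amplitude, c=phase, cmap="hsv", vmin=-np.pi, vmax=np.pi, s=4)
plt.colorbar(label="arg Psi(x)")
plt.xlabel("x"); plt.ylabel("|Psi(x)|")
plt.show()
```

The exact colors will differ from the figures; what matters is that the phase cycles smoothly along the curve while the envelope stays put.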

Finally, if I’m willing to give up the information about the real and imaginary parts of the wave function, and just want to show the probabilities that are proportional to its squared absolute value, I can sometimes depict the state in a third way. I pick a few spots where the object might be located, and draw the object there using grayscale shading, so that it is black where the probability is large and becomes progressively lighter gray where the probability is smaller, as in Fig. 4.

Figure 4: The same wave function as in Figs. 2 and 3, here showing only the probabilities for the object’s location; the darker the gray, the more likely the object is to be found at that location.

Again, compare Fig. 4 to Figs. 2 and 3; they all represent information about the same wave function, although there’s no way to read off the object’s momentum using Fig. 4, so we know where it might be but not where it is going. (One could add arrows to indicate motion, but that only works when the uncertainty in the momentum is small.)
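For completeness, here is one way to mock up the grayscale picture numerically (a sketch under the same assumptions as the code above): choose a handful of candidate locations and shade each one by the probability there, black where |Ψ|² is largest.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-20, 20, 2000)
psi = np.exp(-(x + 5.0)**2 / (4 * 3.0**2)) * np.exp(1j * 2.0 * x)
prob = np.abs(psi)**2
prob /= prob.max()                         # scale so the most likely spot is fully black

samples = x[::100]                         # a few evenly spaced candidate locations
shade = 1.0 - np.interp(samples, x, prob)  # 0 = black (likely), 1 = white (unlikely)

plt.scatter(samples, np.zeros_like(samples), c=shade, cmap="gray", vmin=0, vmax=1, s=200)
plt.yticks([]); plt.xlabel("x")
plt.show()
```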

Although this third method is quite intuitive when it works, it often can’t be used (at least, not as I’ve described it here.) It’s often useful when we have just one object to worry about, or if we have multiple objects that are independent of one another. But if they are not independent — if they are correlated, as in a “superposition” [more about that concept soon] — then this technique usually does not work, because you can’t draw where object number 1 is likely to be located without already knowing where object number 2 is located, and vice versa. We’ve already seen examples of such correlations in this post, and we’ll see more in future.

So now we have three representations of the same wave function — or really, two representations of the wave function’s real and imaginary parts, and two representations of its square — which we can potentially mix and match. Each has its merits.

How the Wave Function Changes Over Time

This particular wave function, which has almost definite momentum, does indeed evolve by moving at a nearly constant speed (as one would expect for something with near-definite momentum). It spreads out, but very slowly, because its speed is only slightly uncertain. Here is its evolution using all three representations. (The first was also shown in this post’s Fig. 6.)
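To see that motion and slow spreading numerically, here is a minimal free-particle evolution sketch (my own, with ħ = m = 1 and made-up parameters), using the standard Fourier trick: each momentum component k of the wave function just picks up a phase e^(−i k² t / 2) as time passes.

```python
import numpy as np

N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)     # momentum grid matching the FFT convention

# Gaussian packet centered at x = -30, width 3, moving right with momentum about 2.
psi0 = np.exp(-(x + 30.0)**2 / (4 * 3.0**2)) * np.exp(1j * 2.0 * x)

def evolve_free(psi, t):
    """Free-particle Schrodinger evolution of psi forward by time t (hbar = m = 1)."""
    return np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi))

for t in (0.0, 10.0, 20.0):
    prob = np.abs(evolve_free(psi0, t))**2
    prob /= prob.sum()
    mean = np.sum(prob * x)                          # average position
    spread = np.sqrt(np.sum(prob * (x - mean)**2))   # position uncertainty
    print(f"t = {t:4.1f}: <x> = {mean:6.1f}, spread = {spread:.2f}")
# The packet's center moves at a steady speed (~2) while its spread grows only slowly.
```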

I hope that gives your intuition some things to hold onto as we head into more complex situations.

Two More Examples

Below are two simple wave functions for a single object. They differ somewhat from the one we’ve been using in the rest of this post. What do they describe, and how will they evolve with time? Can you guess? I’ll give the full answer tomorrow as an addendum to this post.

Two different wave functions; in each case the curve represents the absolute value |Ψ(x)| and the color represents arg[Ψ(x)], as in Fig. 3. What does each wave function say about the object’s location and momentum, and how will each of them change with time?
Categories: Science

China launches hunt for ways to protect data from quantum computers

New Scientist Feed - Mon, 02/17/2025 - 2:00am
Efforts to develop next-generation cryptography algorithms that can't be broken by quantum computers are already underway in the US, but now China has announced it will seek its own solutions
Categories: Science

Pompeii’s streets show how the city adapted to Roman rule

New Scientist Feed - Mon, 02/17/2025 - 12:00am
Pompeii only came under Roman control around 160 years before its destruction – and its traffic-worn streets show how the locals adjusted their business operations
Categories: Science

So it begins: Robert F. Kennedy Jr. is confirmed as HHS Secretary and immediately starts dismantling US federal science infrastructure

Science-based Medicine Feed - Mon, 02/17/2025 - 12:00am

The nightmare has come true. Robert F. Kennedy Jr. has been confirmed as HHS Secretary and didn't wait long to start dismantling federal science and health programs. The White House even formed a "MAHA commission" to draw up a battle plan.

The post So it begins: Robert F. Kennedy Jr. is confirmed as HHS Secretary and immediately starts dismantling US federal science infrastructure first appeared on Science-Based Medicine.
Categories: Science
