The giant-impact hypothesis posits that billions of years ago a Mars-sized body named Theia collided with the early Earth.
The immense energy from this impact not only significantly altered Earth’s rotational dynamics but also resulted in debris being ejected into space. Over time, this debris coalesced to form the Moon.
We cannot know for certain whether Theia existed and collided with the young proto-Earth, but the evidence is compelling.
For one, Earth is the only rocky planet with a substantial moon. Mercury and Venus have none, while Mars lays claim to just two small moons that are likely captured asteroids. The very existence of our large moon demands explanation.
Second, there’s spin. The Earth-Moon system carries a surprising amount of angular momentum: the Earth spins quickly, and the Moon orbits around us at a surprisingly swift pace. Something deep in our past must have provided all that energy, and a collision with another protoplanet explains it with ease.
Lastly, we have an unexpected piece of evidence from our human adventures to the Moon. The Apollo missions were more than pursuits of glory; they were scientific enterprises. The Apollo astronauts, beginning with Armstrong and Aldrin, were trained by expert geologists to search for and extract interesting samples.
What they returned to Earth revealed an enormous wealth of scientific knowledge of the Moon’s composition, because for the first time we were able to acquire large amounts of regolith – the generic term for the loose material that makes up the lunar surface – and return it to Earth for further study. All told, the six successful Apollo missions brought back 2,200 samples totaling almost 400 kilograms of material.
The regolith returned by the Apollo missions displayed a remarkable property: the lunar surface is strikingly similar in composition to the Earth’s crust, with matching ratios of elements. The most natural conclusion is that the two bodies share a common origin.
So while we will never be able to turn back the clock and witness the formation of the Earth and Moon, we can use the clues scattered around us to understand this cataclysmic event that took place over four billion years ago.
The post Why We Think Theia Existed appeared first on Universe Today.
What can Venus atmospheric samples returned to Earth teach us about the varied evolution of both planets? This is the question behind a recent study presented at the American Geophysical Union (AGU) Fall 2024 Meeting, which discussed a compelling mission concept called VATMOS-SR (Venus ATMOSphere – Sample Return), designed to collect samples from Venus’ atmosphere and return them to Earth for further study. This mission has the potential to help scientists gain greater insights into the formation and evolution of Venus and why it diverged so far from Earth’s evolutionary path, despite both planets being approximately the same size.
Here, Universe Today discusses this incredible mission concept with Dr. Guillaume Avice, a Permanent Researcher with the National Centre for Scientific Research (CNRS) at the Paris Institute of Global Physics and lead author of the mission concept. We discuss the motivation behind VATMOS-SR, its advantages and limitations, the most significant results the team hopes to achieve, the steps being taken to address specific sampling concerns, the next steps in making VATMOS-SR a reality, and what VATMOS-SR could potentially teach us about finding life in Venus’ atmosphere. Therefore, what was the motivation behind the VATMOS-SR mission concept?
“The scientific motivation concerns the origin and evolution of planetary atmospheres,” Dr. Avice tells Universe Today. “We know very well the Earth’s atmosphere and we have some insights about the ancient Earth’s atmosphere. For Venus, there are measurements done in the 70’s but we have only very partial data. Returning a sample from the Venus atmosphere would allow us to put strong constraints on the delivery of volatile elements to terrestrial planets soon after solar system formation. Indeed, the two planets are very similar in terms of size, position relative to the Sun etc. Yet, their respective evolution diverged, and it remains a mystery why. Another motivation is that we would return for the first time (if we do it before Mars Sample Return) a sample from another planet than Earth.”
For VATMOS-SR, the researchers aim to accomplish three primary scientific objectives: identifying the sources of volatile elements in Venus’ atmosphere, comparing today’s abundances of volatile elements to those when the planet first formed billions of years ago, and examining the gases that transferred from Venus’ interior to its atmosphere throughout the planet’s history (a process called outgassing). To accomplish this, VATMOS-SR is designed to collect several liter-sized atmospheric samples approximately 110 kilometers (68 miles) above the surface of Venus while traveling at more than 10 kilometers per second (6 miles per second).
VATMOS-SR builds on a previous mission concept called Cupid’s Arrow, which was presented at the 49th Lunar and Planetary Science Conference in 2018, with the primary difference being that VATMOS-SR will return the samples to Earth, whereas Cupid’s Arrow was slated to analyze the samples while still at Venus. As the authors note, like all mission concepts, VATMOS-SR has both advantages and limitations.
“The great advantage is that instruments in our laboratories are very precise for determining the abundance and isotopic composition of volatile elements,” Dr. Avice tells Universe Today. “This is a much better situation compared to in-situ measurements by an instrument onboard a space probe which has numerous limitations. The limitation of the mission is that, in order to return the sample back to Earth, sampling will happen at high velocity (10-13 km/s) meaning that the gas will be fractionated. We can correct for this effect but this is a limitation of the mission. Another one is that sampling gas means that measurements have to be done quickly when back on Earth because any sampling device you could imagine will have a leak rate. We can use high-tech technology to preserve the gas but ideally the preliminary science will have to be done quickly after return.”
As noted, Earth and Venus are approximately the same size, with Venus’ diameter approximately 95 percent of Earth’s. Despite this, the two planets differ starkly in their characteristics, particularly surface temperature and pressure. While Earth’s average surface temperature is a livable 15 degrees Celsius (59 degrees Fahrenheit), Venus’ average surface temperature is a scorching 462 degrees Celsius (864 degrees Fahrenheit), hot enough to melt lead.
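Those Celsius-to-Fahrenheit figures are easy to double-check with the standard conversion formula (a quick sketch, not taken from the study itself):

```python
def c_to_f(celsius: float) -> float:
    """Standard conversion: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

print(c_to_f(15))   # 59.0 F, Earth's average surface temperature
print(c_to_f(462))  # 863.6 F, Venus' average, rounded to 864 F in the text
```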
While Earth’s average surface pressure is measured at 14.7 pounds per square inch (psi), Venus’ average surface pressure is approximately 92 times higher, which is equivalent to experiencing the pressures at 900 meters (3,000 feet) underwater on Earth. This is due to Venus’ atmosphere being extremely dense and composed of carbon dioxide (~96.5 percent), leading to a runaway greenhouse effect. In contrast, while the atmosphere of the planet Mars is also composed of largely carbon dioxide (~95 percent), its atmosphere is much thinner, resulting in significantly lower average surface pressure. Therefore, despite the vast differences between Earth and Venus, what are the most significant results the team hopes to achieve with VATMOS-SR?
“To understand the origin and evolution of the atmosphere of Venus to better understand Earth’s sister planet but also to understand what makes a planet habitable or not,” Dr. Avice tells Universe Today. “This is also extremely important to understand exoplanets because atmospheres of exoplanets are the only reservoir that can be measured remotely with telescopes. Understanding exoplanets thus requires to understand the composition of planetary atmospheres in our solar system.”
Regarding the fractionation concerns about obtaining the samples at such high speeds, Dr. Avice notes statistical studies have been conducted in collaboration with NASA showing promising results and notes the next steps will involve similar tests but with better probe designs.
Going from a concept to becoming an actual mission and delivering groundbreaking science often takes years to decades to happen, often involving several stages of ideas, scientific implications, systems analysis, designs, prototypes, re-designs, and funding availability. Once components and hardware are finally built, they are tested and re-tested to ensure maximum operational capacity since they can’t be fixed after launch. This ensures all systems function independently and together to achieve maximum mission success, including science data collection and transmitting data back to Earth in a timely and efficient manner.
For example, while NASA’s New Horizons spacecraft conducted its famous flyby of Pluto in July 2015, the mission concept was first proposed in August 1992, accepted as a concept in June 2001, and given funding approval in November 2001. It finally launched in January 2006 and endured a nine-and-a-half-year journey to Pluto, sending back breathtaking images of the dwarf planet in July 2015. Therefore, what are the next steps to making VATMOS-SR a reality?
Dr. Avice tells Universe Today, “We gathered a European team of scientists and engineers together with American and Japanese colleagues to propose VATMOS-SR to the coming ESA call for F-class (fast) mission. The CNES (French space agency) is supporting VATMOS-SR and is providing a lot of help with engineers and specialists to build a strong case to answer this call. This call will be released next month and, if selected, VATMOS-SR will be under consideration by the European Space Agency with developing activities starting as soon as 2026.”
The VATMOS-SR concept comes as debate continues to rage over whether the atmosphere of Venus is capable of hosting life as we know it, since the upper atmosphere has been shown to exhibit Earth-like temperatures and pressures, in stark contrast to the planet’s surface. The habitable zone of Venus’ atmosphere is estimated to lie between 51 kilometers (32 miles) and 62 kilometers (38 miles) above the surface, where temperatures range from 65 degrees Celsius (149 degrees Fahrenheit) to -20 degrees Celsius (-4 degrees Fahrenheit), respectively. As noted, VATMOS-SR is slated to collect samples at approximately 110 kilometers (68 miles) above the surface, roughly twice the altitude of the estimated atmospheric habitable zone. Despite this, what can VATMOS-SR teach us about finding life in Venus’ atmosphere?
Dr. Avice tells Universe Today, “Nothing directly (and no chance to have live organisms in the gas samples) but VATMOS-SR will tell us why Venus became such an inhabitable place. This is of course linked to the question, ‘Is it possible that life appeared on Venus at some point in its history?’”
For now, VATMOS-SR remains a very intriguing mission concept with the goal of helping us unravel the history of Venus, and potentially the solar system, through an international collaboration between the United States, Europe (CNES), and Japan. While Dr. Avice is the designated principal investigator, it was Dr. Christophe Sotin (Co-PI, professor at the University of Nantes, former senior research scientist at NASA JPL, and lead author of the Cupid’s Arrow study) who first proposed measuring Venus’ atmosphere.
What new insights into Venus’ evolutionary history could VATMOS-SR provide scientists in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Unlocking Venus’ Secrets with VATMOS-SR Mission Concept appeared first on Universe Today.
Type Ia supernovae are extremely powerful events that occur in binary systems containing at least one white dwarf star – the core remnant of a Sun-like star. Sometimes, the white dwarf’s powerful gravity will siphon material from its companion star until it reaches critical mass and explodes. In another scenario, a binary system of two white dwarfs will merge, producing the critical mass needed for a supernova. Unlike regular (core-collapse) supernovae, which occur roughly every fifty years in the Milky Way, Type Ia supernovae happen only about once every five hundred years.
In addition to being incredible events, Type Ia supernovae are useful astronomical tools. As rungs on the Cosmic Distance Ladder, these explosions allow astronomers to measure the distances to objects millions or billions of light-years away. This is vital to measuring the rate at which the Universe is expanding, otherwise known as the Hubble Constant. Thanks to an international team of researchers, a catalog of Type Ia supernovae has just been released that could change what we know of the fundamental physics of supernovae and the expansion history of the Universe.
This new catalog constitutes the second data release (DR2) from the Zwicky Transient Facility (ZTF), a wide-field astronomical survey that began in 2018. The survey relies on the ZTF camera on the 1.2-meter (4-foot) Samuel Oschin Telescope at the Palomar Observatory near San Diego, California. It has classified over 8,000 supernovae, including 3,628 nearby Type Ia supernovae (SNe Ia), more than doubling the number of SNe Ia discovered in the past 30 years. Despite their rarity, the ZTF’s depth and survey strategy have allowed the ZTF Collaboration to detect nearly four per night.
This catalog contains 3,628 nearby SNe Ia and is the first large, homogeneous dataset of its kind that astrophysicists can access. The release is detailed in a paper published on February 14th in Astronomy & Astrophysics, alongside a Special Issue containing 21 related publications. The paper’s lead authors are Dr. Mickael Rigault, head of the ZTF Cosmology Science working group and a Research Scientist at the Centre National de la Recherche Scientifique (CNRS) and the Université Claude Bernard Lyon 1, and Dr. Matthew Smith, a Lecturer in Astrophysics at Lancaster University. As Dr. Rigault said:
“For the past five years, a group of thirty experts from around the world have collected, compiled, assembled, and analyzed these data. We are now releasing it to the entire community. This sample is so unique in terms of size and homogeneity that we expect it to significantly impact the field of Supernovae cosmology and to lead to many additional new discoveries in addition to results we have already published.”
The key component of the ZTF system is its 47-square-degree, 600-megapixel cryogenic CCD mosaic science camera. The camera scans the entire northern sky daily in three optical bands down to a limiting magnitude of 20.5, allowing it to detect nearly all supernovae within 1.5 billion light-years of Earth. Co-author Prof. Kate Maguire of Trinity College Dublin said, “Thanks to ZTF’s unique ability to scan the sky rapidly and deeply, we have captured multiple supernovae within days—or even hours—of [the] explosion, providing novel constraints on how they end their lives.”
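That detection horizon can be sanity-checked with the standard distance-modulus relation. The sketch below assumes a typical SN Ia peak absolute magnitude of about -19.4 (a literature value, not stated in the article) and computes the distance at which such a supernova at peak brightness just reaches magnitude 20.5. Since well-sampled light curves require catching events well before peak, the practical survey horizon is smaller than this upper bound, consistent with the 1.5-billion-light-year figure.

```python
# Distance modulus: m - M = 5*log10(d_pc) - 5
M_PEAK = -19.4   # typical SN Ia peak absolute magnitude (assumed literature value)
m_limit = 20.5   # ZTF limiting magnitude

mu = m_limit - M_PEAK
d_pc = 10 ** ((mu + 5) / 5)      # distance (parsecs) at which the peak hits the limit
d_gly = d_pc * 3.2616 / 1e9      # 1 pc = 3.2616 light-years

print(f"{d_pc / 1e6:.0f} Mpc, ~{d_gly:.1f} Gly")  # roughly 955 Mpc, ~3.1 Gly at peak
```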
The ultimate purpose of the survey is to determine the expansion rate of the Universe (aka. the Hubble Constant). Since the late 1990s, when dedicated supernova surveys first used SNe Ia to measure cosmic expansion, astronomers have known that the expansion rate is accelerating. This effectively demonstrated that the expansion rate is not constant and gave rise to the theory of Dark Energy. In addition, the ability to observe the Universe all the way back to roughly 1 billion years after the Big Bang led to the “Crisis in Cosmology.”
Also known as the “Hubble Tension,” this crisis arose when astronomers noted that distance measurements along the Cosmic Distance Ladder produced different values for the expansion rate. Since then, cosmologists have been looking for explanations for this Tension, which include the possibility of Early Dark Energy (EDE). A key part of this is obtaining truly accurate measurements of cosmic distances. Co-author Professor Ariel Goobar, the Director of the Oskar Klein Centre in Stockholm (one of the founding institutions of ZTF), was also a member of the team that discovered the accelerated expansion of the Universe in 1998.
“Ultimately, the aim is to address one of our time’s biggest questions in fundamental physics and cosmology, namely, what is most of the Universe made of?” he said. “For that, we need the ZTF supernova data.” One of the biggest takeaways from this catalog and the studies that went into creating it is that Type Ia supernovae vary with their host environment more than previously thought. As a result, the correction mechanism used to date needs revising, which could change how we measure the expansion rate of the Universe.
This could have consequences for the Standard Model of Cosmology – aka. the Lambda Cold Dark Matter (Lambda-CDM) model – and issues arising from it like the Hubble Tension. This data will be essential when the Nancy Grace Roman Space Telescope (RST) launches into space and begins making observations leading to the first wide-field maps of the Universe. Combined with observations by the ESA’s Euclid mission, these maps could finally resolve the mystery of Dark Matter and cosmic expansion. As Dr Rigault said:
“With this large and homogeneous dataset, we can explore Type Ia supernovae with an unprecedented level of precision and accuracy. This is a crucial step toward honing the use of Type Ia Supernovae in cosmology and assess[ing] if current deviations in cosmology are due to new fundamental physics or unknown problem[s] in the way we derive distances.”
Further Reading: Lancaster University, Astronomy & Astrophysics
The post Huge Release of Type 1a Supernovae Data appeared first on Universe Today.
What’s the story of our Moon’s early history? Despite all we know about our closest natural satellite, scientists are still figuring out bits of its history. New measurements of rocks gathered during the Apollo missions now show it solidified some 4.43 billion years ago. It turns out that’s about the time Earth became a habitable world.
University of Chicago scientist Nicolas Dauphas and a team of researchers made the measurements. They looked at the proportions of different elements inside Moon rocks, which provide a window into the Moon’s early epochs, when it started out as a fully molten blob after a collision between two early solar system bodies.
As it cooled and crystallized, the molten proto-moon separated into layers. Eventually, about 99% of the lunar magma ocean had solidified. The rest was a unique residual liquid called KREEP. That acronym stands for the elements potassium (K), rare earth elements (REE), and phosphorus (P).
Dauphas and his team analyzed this KREEP and found that it formed about 140 million years after the birth of the Solar System. It is present in the Apollo rocks, and scientists hope to find it in samples from the South Pole-Aitken basin, the region Artemis astronauts will eventually explore. If analysis confirms its presence there, that would indicate a uniform distribution of the KREEP layer across the lunar surface.
Understanding KREEP’s History on the Moon

The clues to the Moon’s ultimate “cooling off period” lie in a faintly radioactive rare earth element called lutetium. Over time, it decays into hafnium. In the early Solar System, all rocks started with about the same amounts of lutetium. Its decay helps determine the age of the rocks in which it is found.
However, the Moon’s solidification and subsequent formation of KREEP reservoirs didn’t result in a lot of lutetium compared to other rocks created at the same time. So, the scientists wanted to measure the proportions of lutetium and hafnium in Moon rocks and compare them to other bodies created around the same time—such as meteorites. That would allow them to calculate a more precise time for when the KREEP formed on the Moon.
They tested tiny samples of Moon rocks, looking at hafnium isotope ratios in embedded lunar zircons. Through that analysis, they found that the rock ages are consistent with formation in a KREEP-rich reservoir about 140 million years after the birth of the solar system, or about 4.43 billion years ago. “It took us years to develop these techniques, but we got a very precise answer for a question that has been controversial for a long time,” said Dauphas.
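The decay arithmetic behind this dating technique is straightforward. Assuming the commonly cited lutetium-176 half-life of about 37.1 billion years (a literature value, not given in the article), only a small fraction of the original lutetium has decayed into hafnium over the Moon’s lifetime, which is part of why such precise measurements were needed:

```python
import math

HALF_LIFE_LU176_GYR = 37.1   # lutetium-176 half-life, ~3.71e10 years (assumed)
decay_const = math.log(2) / HALF_LIFE_LU176_GYR   # decay constant per billion years

age_gyr = 4.43   # crystallization age reported by the study
# Exponential decay: fraction of the original 176Lu converted to 176Hf by now
fraction_decayed = 1 - math.exp(-decay_const * age_gyr)

print(f"{fraction_decayed:.1%}")  # about 8% of the lutetium has decayed
```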
Placing KREEP in Perspective

Interestingly, the team’s results showed that lunar magma ocean crystallization occurred while leftover planetary embryos and planetesimals were still bombarding the Moon. Those objects were the birth “seeds” of the planets, left over from the era of planet formation that began as the Sun coalesced some 4.6 billion years ago. The material remaining after the planets formed continued to batter them.
The formation of the Moon began some 60 million years after the solar system was born. The most likely trigger was the collision of a Mars-sized world called Theia with the infant Earth, which sent molten debris into space that then coalesced to make the Moon. “We must imagine a big ball of magma floating in space around Earth,” said Dauphas. Shortly thereafter, that ball began to cool, a process that eventually resulted in the formation of the lunar KREEP layers.
An artist’s conception of the cooling lunar magma ocean. Courtesy ESA.

The study of the decay of lutetium to hafnium in samples of those KREEP rocks is a big step forward in understanding the most ancient epoch of lunar history. More rock samples brought back from the South Pole-Aitken basin will help fill in the remaining blanks and help researchers clarify the timeline of both the cooling of the lunar rock and the subsequent creation of such rock deposits as the mare basalts. Those rock layers were created when impactors slammed into the lunar surface, generating lava flows that filled the impact basins.
The mare formed as a result of impacts later in the Moon’s early history, some 240 million years after the birth of the Solar System. Those impacts stimulated lava flows that covered less than 20 percent of the lunar surface and engulfed some of the oldest surfaces.
Timing is Everything

Fixing the date of lunar cooling not only tells us about the history of the Moon but also helps scientists understand Earth’s evolution. That’s because the impact that formed the Moon was probably also the last major impact on Earth. It could well mark the time when Earth began its transformation into a stable world, an important step toward becoming a place hospitable for life.
“This finding aligns nicely with other evidence—it’s a great place to be in as we prepare for more knowledge about the Moon from the Chang’e and Artemis missions,” said Dauphas. “We have a number of other questions that are waiting to be answered.”
For More Information

Lunar Rocks Help Scientists Pinpoint When the Moon Crystallized
Completion of Lunar Magma Ocean Solidification at 4.43 Ga
Moon Formation
The post The Moon Solidified 4.43 Billion Years Ago appeared first on Universe Today.
When it comes to particles, only photons are more abundant than neutrinos, yet detecting neutrinos is extremely difficult. Scientists have gone to extreme lengths to detect them, including building neutrino observatories in deep, underground mines and in the deep, clear ice of Antarctica.
One of their latest efforts to detect neutrinos is KM3NeT, which is still under construction at the bottom of the Mediterranean Sea. Though the neutrino telescope isn’t yet complete, it has already detected the most energetic neutrino ever detected.
The Universe is flooded with them, yet they’re extremely difficult to detect. They’re like tiny, abundant ghosts and are sometimes called “ghost particles.” They have no electric charge, which limits the ways they interact with matter. The fact that they only interact through gravity and the weak nuclear force explains their elusiveness.
Neutrinos can’t be seen and are only detected indirectly on the rare occasions when they interact with matter through the weak force. These interactions release Cherenkov radiation that detectors can sense. Detectors have to be very large to catch these rare interactions. KM3NeT (Cubic Kilometre Neutrino Telescope) features thousands of individual detectors in each of its two sections. At the end of 2024, KM3NeT was only about 10% complete, yet it had already detected an extraordinarily energetic neutrino on February 13th, 2023.
The detection is presented in new research in Nature titled “Observation of an ultra-high-energy cosmic neutrino with KM3NeT.” The KM3NeT Collaboration is credited with authorship.
“The detection of cosmic neutrinos with energies above a teraelectronvolt (TeV) offers a unique exploration into astrophysical phenomena,” the paper states. “Here we report an exceptionally high-energy event observed by KM3NeT, the deep-sea neutrino telescope in the Mediterranean Sea, which we associate with a cosmic neutrino detection.”
This is an artist’s impression of a KM3NeT installation in the Mediterranean. Underwater neutrino detectors take advantage of their location to track these fast particles. Image Courtesy Edward Berbee/Nikhef.

Though neutrinos themselves are not directly detectable, the muons created by their rare interactions with matter are. In this detection, the muon’s energy was 120 (+110/-60) petaelectronvolts (PeV). High-energy neutrinos like these are produced when “ultra-relativistic cosmic-ray protons or nuclei interact with other matter or photons,” according to the paper.
Because neutrinos seldom interact with matter and aren’t affected by magnetic fields, they can originate from extremely distant places in the Universe. These are called cosmogenic neutrinos, as opposed to solar neutrinos, the more plentiful type that comes from the Sun. Cosmogenic neutrinos are more energetic than solar neutrinos because they’re created by cosmic rays from high-energy astrophysical phenomena like active galactic nuclei and gamma-ray bursts. Since they travel virtually unimpeded across the cosmos, they can provide insights into the distant phenomena that produced them.
In terms of energy level, there are two types of neutrinos: atmospheric and cosmogenic. Cosmogenic neutrinos are more energetic and less plentiful than atmospheric neutrinos. “The neutrino energy is thus a crucial parameter for establishing a cosmic origin,” the paper states.
“The energy of this event is much larger than that of any neutrino detected so far,” the paper states. This could be the first detection of a cosmogenic neutrino and it could be the result of ultra-high energy cosmic rays that interact with background photons.
“Of interest in this article are neutrino interactions that produce high-energy muons, which can travel several kilometres in seawater before being absorbed,” the paper states. As these muons travel through the water, they lose energy. The amount of energy lost in each unit of travel is proportional to the muon’s energy level. By recording the signals and their time of arrival at different individual detectors in the KM3NeT array, scientists can then reconstruct the muon’s initial energy level and its direction.
This figure shows side and top views of the event in (a), with the Eiffel Tower shown for scale. The red line shows the reconstructed trajectory of the muon created by the neutrino interaction. The hits of individual photomultiplier tubes (PMTs) are represented by spheres stacked along the direction of the PMT orientations. Only the first five hits on each PMT are shown. The spheres are colour-coded relative to the first initial detection, and the larger they are, the more photons were detected, equating to energy level. Image Credit: The KM3NeT Collaboration, 2025.

“The muon trajectory is reconstructed from the measured times and positions of the first hits recorded on the PMTs, using a maximum-likelihood algorithm,” the paper states. The new detection is referred to as KM3-230213A. The 21 detection lines registered 28,086 hits, and by counting the number of PMTs that are triggered, the researchers can estimate the muon energy at the detector.
This figure shows the number of detections in a simulation of the KM3-230213A event. The simulation helps researchers determine the true muon energy. “The normalized distributions of the number of PMTs participating in the triggering of the event for simulated muon energies of 10, 100 and 1,000 PeV,” the authors write. The vertical dashed line indicates the observed value in KM3-230213A with 3,672 PMT detections. Image Credit: The KM3NeT Collaboration, 2025.

The KM3NeT Collaboration detected the most energetic neutrino ever recorded while the facility was still incomplete, and that bodes well for the future. However, the incomplete facility did limit one aspect of the detection: there’s uncertainty about the direction it came from. “A dedicated sea campaign is planned in the future to improve the knowledge of the positions of the detector elements on the seafloor,” the authors write. Once that campaign is complete, the data from KM3-230213A will be recalibrated.
Still, the researchers learned something about the direction of its source, albeit with an uncertainty estimated to be 1.5°. At the vast distances involved, that’s a significant uncertainty. “The probability that KM3-230213A is of cosmic origin is much greater than any hypothesis involving an atmospheric origin,” the paper states.
The researchers identified some candidate sources.
“Extragalactic neutrino sources should be dominated by active galactic nuclei, and blazars are of particular interest considering the very-high energy of KM3-230213A,” the paper states. “To compile a census of potential blazar counterparts within the 99% confidence region of KM3-230213A, archival multiwavelength data were also explored.”
The researchers identified 12 potential source blazars in different survey catalogues.
The red star in this figure shows KM3-230213A. The three concentric red circles show the error regions within R(68%), R(90%) and R(99%). Selected source candidates and their directions are shown as coloured markers. The colours and marker type indicate the criterion according to which the source was selected, e.g. VLBI is Very Long Baseline Interferometry. The sources are numbered according to their proximity to KM3-230213A. Image Credit: The KM3NeT Collaboration, 2025.

Neutrinos are abundant yet elusive. They pass right through the Earth unimpeded, and about 100 trillion of them pass through our bodies every second. Detecting them is important because of what they can tell us about the Universe.
The extraordinary energy level of this neutrino is significant in neutrino astrophysics. It shows that nature can generate ultra-high-energy neutrinos, possibly from blazars, which are active galactic nuclei with jets pointed right at us.
“This suggests that the neutrino may have originated in a different cosmic accelerator than the lower-energy neutrinos, or this may be the first detection of a cosmogenic neutrino, resulting from the interactions of ultra-high-energy cosmic rays with background photons in the Universe.”
The post An Unfinished Detector has Already Spotted the Highest-Energy Neutrino Ever Seen appeared first on Universe Today.
In 1974, science fiction author Larry Niven wrote a murder mystery with an interesting premise: could you kill a man with a tiny black hole? I won’t spoil the story, though I’m willing to bet most people would argue the answer is clearly yes. Intense gravity, tidal forces, and the event horizon would surely lead to a messy end. But it turns out the scientific answer is a bit more interesting.
On the one hand, it’s clear that a large enough black hole could kill you. On the other hand, a black hole with the mass of a single hydrogen atom is clearly too small to be noticed. The real question is the critical mass. At what minimum size would a black hole become deadly? That’s the focus of a new paper on the arXiv.
The study begins with primordial black holes. These are theoretical bodies that may have formed in the earliest moments of the Universe and would be much smaller than stellar-mass black holes, with masses anywhere from that of an atom to several times that of Earth. Although astronomers have never found any primordial black holes, observations do rule out several mass ranges. For example, any primordial black hole smaller than 10^12 kg would have already evaporated thanks to Hawking radiation. Anything larger than 10^20 kg would gravitationally lens stars in the Milky Way. Since we haven’t detected these lensing effects, such black holes must at the very least be exceedingly rare, if they exist at all.
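The 10^12 kg evaporation bound can be reproduced to order of magnitude from the textbook Hawking-lifetime formula, t = 5120·pi·G²·M³/(ħc⁴). That formula counts only photon emission; real black holes radiating more particle species evaporate somewhat faster, so treat this as a rough sketch:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
T_UNIVERSE = 13.8e9 * 3.156e7   # age of the Universe in seconds

# Invert t = 5120*pi*G^2*M^3/(hbar*c^4) for the mass that evaporates
# in exactly the age of the Universe:
m_crit = (T_UNIVERSE * HBAR * C**4 / (5120 * math.pi * G**2)) ** (1 / 3)

print(f"{m_crit:.2e} kg")  # a few times 1e11 kg, within an order of
# magnitude of the ~1e12 kg bound quoted in the article
```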
Some theoretical models argue that primordial black holes could be the source of dark matter. If that’s the case, observational limits constrain their masses to the 10^13–10^19 kg range, which is similar to the mass range for asteroids. Therefore, the study focuses on this range and looks at two effects: tidal forces and shock waves.
Tidal forces occur because the closer you get to a mass, the stronger its gravity. This means a black hole exerts a force differential on you as it gets near. So the question is whether this force differential is strong enough to tear flesh. Asteroid-mass black holes are less than a micrometer across, so even the tidal forces would cover a tiny area. If one passed through your midsection or one of your limbs, there might be some local damage, but nothing fatal. It would be similar to a needle passing through you.
But if the black hole passed through your head, that would be a different story. Tidal forces could tear apart brain cells, which would be much more serious. Since brain cells are delicate, even a force differential of 10 – 100 nanonewtons might kill you. But that would take a black hole at the highest end of our mass range.
Shockwaves would be much more dangerous. In this case, as a black hole entered your body, it would create a density wave that would ripple through you. These shockwaves would physically damage cells and transfer heat energy that would do further damage. To create a shockwave of energy similar to that of a .22-caliber bullet, the black hole would only need a mass of 1.4 × 10^14 kg, which is well within the range of possible primordial black holes.
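For scale, that .22-caliber benchmark works out to only about a hundred joules. A quick check, where the bullet mass and muzzle velocity are typical .22 LR values I’m assuming rather than figures from the paper:

```python
# Kinetic energy of a typical .22 LR round: KE = 1/2 m v^2
bullet_mass = 2.6e-3     # kg (~40 grain), assumed typical value
muzzle_velocity = 330.0  # m/s, assumed typical value

kinetic_energy = 0.5 * bullet_mass * muzzle_velocity**2
print(f"~{kinetic_energy:.0f} J")  # roughly 140 J
```

So “lethal” here means depositing on the order of 100 J along a narrow track through the body.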
So yes, a primordial black hole could kill you.
While that makes for a great story, it would never happen in real life. Even if asteroid-mass primordial black holes exist, the number of them out there compared to the vastness of space means that the odds of it happening to anyone in their lifetime are less than one in 10 trillion.
Reference: Niven, Larry. “The Hole Man.” Analog Science Fiction/Science Fact (1974): 93-104.
Reference: Robert J. Scherrer. “Gravitational Effects of a Small Primordial Black Hole Passing Through the Human Body.” arXiv preprint arXiv:2502.09734 (2025).
The post What Would Happen if a Tiny Black Hole Passed Through Your Body? appeared first on Universe Today.
The Habitable Zone is a central concept in our explorations for life outside the Earth. Is it time to abandon it?
The Habitable Zone is defined as the region around a star where liquid water can exist on the surface of a planet. At first glance, that seems like a good starting place to hunt for alien life in other systems. After all, there’s only one kind of life known in the universe (ours) and it exists in the Habitable Zone of the Sun.
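That “just right” condition can be made quantitative with a planet’s equilibrium temperature, T_eq = T_star √(R_star/2d) (1−A)^(1/4), where d is the orbital distance and A the albedo. Here is a minimal sketch using solar values – a standard textbook estimate, not a calculation tied to any particular survey:

```python
import math

# Solar parameters (SI units)
T_SUN = 5778.0   # effective temperature, K
R_SUN = 6.957e8  # solar radius, m
AU = 1.496e11    # astronomical unit, m

def equilibrium_temp(distance_m, albedo=0.3, t_star=T_SUN, r_star=R_SUN):
    """Blackbody equilibrium temperature of a planet at the given orbital distance."""
    return t_star * math.sqrt(r_star / (2 * distance_m)) * (1 - albedo) ** 0.25

# Earth comes out near 255 K -- below freezing, which shows the greenhouse
# effect matters as much as orbital distance.
print(f"Earth: {equilibrium_temp(1 * AU):.0f} K")
# Venus' orbital distance, assuming an Earth-like albedo for illustration
print(f"Venus' orbit: {equilibrium_temp(0.72 * AU):.0f} K")
```

The very fact that the bare formula puts Earth below freezing hints at why the Habitable Zone is only a starting point.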
But researchers have long noted that the Habitable Zone concept is far too restrictive. Besides the examples of the icy moons in our own solar system, life itself is able to alter the chemistry of a planet, shifting its ability to retain or remove heat, meaning that the uninhabitable regions of a distant system might be more clement than we thought.
Even if we restrict ourselves to the basic biochemistry that makes Earthly life possible, we have many more options than we naively thought. Hycean worlds – planets englobed by oceans beneath thick hydrogen atmospheres – were once thought too toxic for any kind of life, but might be even more suitable than terrestrial worlds.
What about tidally-locked planets around red dwarf stars, like our nearest neighbor Proxima b and the intriguing system of TRAPPIST-1? Conditions on those planets might be hellish, with one side facing the incessant glare of its star and the other locked in permanent night. Neither of those extremes seems suitable for life as we know it. But even those worlds can support temperate atmospheres if the conditions are just right. A delicate balancing act for sure, but a balancing act that every life-bearing planet must walk.
Our galaxy contains billions of dead stars, the white dwarfs and neutron stars. We know of planets in those systems. Indeed, the first exoplanets were discovered around a pulsar. Sometimes those dead stars retain planets from their former lives; other times the planets assemble anew from the stellar wreckage. In either case, the stars, though dead, are still warm, providing a source of energy for any life that might find a home there. And considering the sheer longevity of those stars and the incredibly long history of our galaxy, life has had many chances to appear – and sustain itself – in systems that are now dead.
Who needs planets, anyway? Methanogens could take advantage of the exotic, cold chemistry of molecular clouds, feasting on chemicals processed by millennia of distant high-energy starlight. It might even be possible for life to sustain itself in a free-floating biological system, with the gravity of its own mass holding on to an atmosphere. It’s a wild concept, but all the foundational functions of a free-floating habitat – scaffolding, energy capture and storage, semi-permeable membranes – are found in terrestrial life.
We should absolutely continue our current searches – after all, they’re not groundless. But before we invest in the next generation of super-telescopes, we should pause and reconsider our options. We should invest in research that pushes the edges of what life means and where it can exist, and we should explore pathways to identifying and observing those potential habitats. Only after we have extended research along these lines can we decide on a best-case strategy.
In other words, we should replace a goal, that of finding life like our own, with a vision of finding life wherever we can. Nature has surprised us many times in the past, and we shouldn’t let our biases and assumptions get in the way of our path of discovery.
The post Breaking the Curse of the Habitable Zone appeared first on Universe Today.
The Habitable Worlds Observatory, NASA’s planned successor to the James Webb Space Telescope, will be a monster of an instrument. Using the same origami-like technique pioneered by the James Webb, the HWO will unfold a mirror spanning 6-8 meters across. Among its many science goals, its primary mission will be to directly image promising nearby exoplanets to hunt for biosignatures, which are signs of life as we know it.
The HWO is expected to cost $11 billion and launch in the first half of the 2040s. But if the tortured history of the James Webb is any indication, then those numbers are highly optimistic lower bounds.
After all those resources, all that money and time and talent devoted to one single telescope, designers of the HWO hope to survey a grand total of 25 potentially habitable Earth-like worlds.
Surely there’s a better way.
We need to heavily invest in a program of diversification to have the best – and cheapest – chances of success when it comes to finding life outside the Earth. That means we need to search for life in all the places where we least expect it.
Right now our life-hunting programs focus on Earth-like planets orbiting their parent stars within the so-called Habitable Zone, the band where the star’s radiation is just right to allow for liquid water on the surface. On one hand, these expectations are built on a solid foundation. The only known life to exist in the universe – ours – thrives in exactly that environment. And we know what our kind of life looks like and what it does to planetary atmospheres, increasing the chances of a confirmed detection of a biosignature.
But on the other hand, our preconceived notions have been challenged in the past, and assuming that nature is as limited as our current thinking could be a costly mistake, as we spend billions on future programs with little chance of success.
Take the methanogens, a broad group of Archaea that “eat” hydrogen and emit methane as a by-product. Mars might be a suitable home for them. Not on the surface, but kilometers underground.
Additionally, the last place you might think to look for life is in the outer reaches of the solar system, home to the giant planets and their icy moons. And yet many of those moons host liquid water oceans vaster than the Earth’s – and they are now prime candidates for extraterrestrial life in our own solar system. If we had forged ahead with Habitable Zone searches in our own solar system, we would have spent decades fruitlessly digging in the Martian dirt, ignoring the potential watery goldmines of the outer moons.
We should take the lesson offered by our own backyard and extend that thinking to the wider galaxy. There have already been researchers exploring the edges of what life could be and where it could thrive, pulling their examples from extreme lifeforms on Earth and cutting-edge research into the definition of habitability. Before we invest billions of dollars in a next-generation mega-observatory, we should carefully consider all the options.
The post Is the Habitable Worlds Observatory a Good Idea? appeared first on Universe Today.
The asteroid belt beckons – it contains enough resources for humans to expand into the entire rest of the solar system and has no biosphere to speak of. Essentially, it is a giant mine just waiting to be exploited. So, a student team from the University of Texas at Austin has devised a plan to exploit it as part of the Revolutionary Aerospace System Concepts – Academic Linkage (RASC-AL), a competition sponsored by NASA to encourage undergraduate and graduate students to develop innovative ideas to solve some of space exploration’s challenges. UT Austin’s submission to the competition last year, known as the Autonomous Exploration Through Extraterrestrial Regions (AETHER) project, certainly fits that bill.
AETHER was submitted to the AI-Powered Self-Replicating Probes sub-section of RASC-AL 2024, which solicited ideas that would advance John von Neumann’s idea of a self-replicating space probe. AETHER addresses those challenges in two distinct ways.
First, it combines a spring-loaded landing system and a metal-burning rocket engine to hop between different asteroids in the belt. To fuel its rocket, it uses a system to harvest water and metal (specifically aluminum) from the surface of the asteroid it’s currently on, splits them into their components, and then dumps them into a fuel tank that can be used to power its next trip to a different asteroid. All of this is powered by a Kilopower Reactor Using Stirling TechnologY (KRUSTY) nuclear reactor that has been undergoing NASA and DoE testing for over a decade.
Fraser discusses the concept of von Neumann probes.
The springs in AETHER’s legs have a two-fold purpose. First, they allow for a soft landing on the surface of a gravitationally weak asteroid and can transfer some of the energy of that landing into stored energy, which can be used to launch the system from its landing place later. The craft also has a set of wheels to navigate around the asteroid’s surface. When it’s time to jump off again, it replants its legs and springs back into space – with a little help from its rocket engine.
The rocket engine designed as part of AETHER can burn metal, such as aluminum, that the craft harvests from the asteroid to use as fuel. It is the primary system designed to take the craft from asteroid to asteroid, and it is meant to be a high-delta-v option for doing so quickly.
AETHER also tries to mimic a von Neumann probe by using a machine-learning algorithm to improve its resource-harvesting efforts. It would take data from various sensors, including synthetic aperture radar and a spectrometer, and estimate where the best spot would be to land to refuel. While collecting that additional fuel material, it would communicate back with Earth via a high-speed optical communication link, allowing an Earth-based server to update the machine learning parameters and improve the algorithm’s outcome for the next hop.
Fraser’s interest in self-replicating robots goes back a long way – here’s his explanation on HeroX about the concept.
The original mission design for AETHER has it stopping at two specific asteroids before moving on to as-yet-undetermined ones. The first, which is probably no surprise, is Psyche, the big metallic asteroid that is about to be visited by its own dedicated probe. Data from that probe will help inform the first iteration of AETHER’s learning algorithm, and the input the sensors provide from its visit will update it before its next step – Themis. That asteroid, though smaller, is expected to contain a large amount of water ice, which is a necessary component for AETHER’s rocket engines.
After visiting the first two asteroids, the mission moves on to places unknown, as completing those steps would be considered a success. But given the longevity of the KRUSTY reactor and the craft’s ability to refill its own fuel tank, it is possible, or even likely, that AETHER could continue operating well past its rendezvous with Themis.
The UT Austin team was composed entirely of undergraduate students, though it’s unclear what year of study they were in. Given their experience with the 2024 version of RASC-AL, they would seem well-placed to submit a proposal for the recently announced 2025 version. If they do, hopefully their next idea will be just as innovative as AETHER.
Learn More:
Flores et al – AETHER
UT – Miniaturized Jumping Robots Could Study An Asteroid’s Gravity
UT – NASA Funds the Development of a Nuclear Reactor on the Moon That Would Last for 10 Years
UT – Engineers Design a Robot That Can Stick To, Crawl Along, and Sail Around Rubble Pile Asteroids
Lead Image:
Landing and take-off depiction of AETHER.
Credit – Flores et al.
The post Spring-loaded Robot Could Explore the Asteroid Belt Almost Indefinitely appeared first on Universe Today.
RCW 38 is a molecular cloud of ionized hydrogen (HII) roughly 5,500 light-years from Earth in the direction of the constellation Vela. Located in this cloud is a massive star-forming cluster populated by young stars, short-lived massive stars, and protostars surrounded by clouds of brightly glowing gas. The European Southern Observatory (ESO) recently released a stunning 80-million-pixel image of the star cluster that features the bright streaks and swirls of RCW 38, the bright pink of its gas clouds, and its many young stars (which appear as multi-colored dots).
The image was captured by the Visible and Infrared Survey Telescope for Astronomy (VISTA), located at the ESO’s Paranal Observatory in the Atacama Desert of Chile. The world’s largest survey telescope, VISTA features a 4.1-meter (~13.5-foot) mirror, the most highly curved of its size. The extremely high curvature reduces the focal length, making the telescope’s structure extremely compact. This design enables VISTA to map large areas of the southern sky quickly, deeply, and systematically.
The telescope also has a wide field of view and a huge camera weighing three metric tons (3.3 U.S. tons) with 16 state-of-the-art infrared-sensitive detectors. VISTA’s surveys in the near-infrared (NIR) spectrum have revealed completely new views of the southern sky. Star clusters are often called “stellar nurseries” since they contain all the ingredients for star formation, including dense gas clouds and opaque clumps of cosmic dust.
When clumps of this gas and dust collect to the point that they undergo gravitational collapse, new stars are born. The strong radiation produced by these newborn stars causes the gas shrouding the star cluster to glow brightly, creating the colorful display we see in this image. Despite that, many of the cluster’s stars cannot be observed in visible light because they are obscured by dust. However, these stars are still visible in infrared light, which passes through clouds of dust unimpeded.
This allowed the VISTA telescope and the VISTA InfraRed CAMera (VIRCAM) to capture the interior of the RCW 38 stellar cluster and reveal the true extent of its beauty. Visible in the cluster’s interior are young stars within dusty cocoons and colder “failed” stars known as brown dwarfs. The roughly 2000 stars in RCW 38 are very young, less than a million years old compared to our Sun (4.6 billion years old). Through its six public surveys, the telescope has mapped small patches of sky for long periods to detect extremely faint objects.
These range from distant galaxies, red dwarf stars, and brown dwarfs to small bodies in our Solar System. The newly-released infrared image was taken as part of the VISTA Variables in the Vía Láctea (VVV) survey, which studied the central parts of the Milky Way in five near-infrared bands. This survey took over 200,000 images of our galaxy and captured more than 355 open and 33 globular clusters. The data was used to create the most detailed infrared map of our home galaxy ever made. In fact, this map contains 10 times more objects than a previous one released by the same team back in 2012.
A catalog is also being created from VISTA data that will contain about a billion point sources and will be used to create a three-dimensional map of the central bulge of the Milky Way. Since the image of RCW 38 was taken, the VIRCAM camera has been retired after seventeen years of service. Later this year, it will be replaced by a new instrument, the 4-meter Multi-Object Spectrograph Telescope (4MOST). This second-generation instrument will give new life to the VISTA telescope, allowing it to obtain spectra of 2400 objects at once over a large area of the sky.
Further Reading: ESO
The post Stunning 80 Megapixel Image of a Stellar Nursery appeared first on Universe Today.
When astronomers detected the first known interstellar object, ‘Oumuamua, in 2017, it sparked a host of new studies trying to understand the origin and trajectory of the galactic sojourner.
‘Oumuamua’s unique properties – unlike anything orbiting our sun – had scientists pondering how such an object could have formed. Now, a pair of researchers, Xi-Ling Zheng and Ji-Lin Zhou, are using numerical simulations to test out possible solar system configurations that could result in ‘Oumuamua-like objects. Their findings show that solar systems with a single giant planet have the necessary orbital mechanics at work to create such an object – but that other explanations may still be required.
Zheng and Zhou published their findings in the Monthly Notices of the Royal Astronomical Society in February 2025.
They began their study by working backward from the known properties of ‘Oumuamua.
When it was visible to Earth’s telescopes for just a few months in 2017, it showed an intensely variable brightness, changing from bright to dim every four hours. Astronomers interpreted this variability as the signature of an elongated, cigar-shaped object tumbling through space.
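The jump from “variable brightness” to “elongated object” is simple geometry: reflected brightness tracks the projected cross-section, so the ratio of maximum to minimum flux bounds the axis ratio from below. A sketch using the roughly 2.5-magnitude light-curve amplitude commonly quoted for ‘Oumuamua (my assumption, not a figure from this study):

```python
# A light curve amplitude of dm magnitudes implies a max/min flux ratio of
# 10**(dm / 2.5). For a cigar-shaped body seen roughly equator-on, that flux
# ratio is a lower bound on the long-to-short axis ratio.
def min_axis_ratio(amplitude_mag):
    return 10 ** (amplitude_mag / 2.5)

print(f"~{min_axis_ratio(2.5):.0f}:1")  # about 10:1
```

The bound is a lower limit because a tilted spin axis hides part of the brightness swing, which is why published estimates for ‘Oumuamua range from about 6:1 upward.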
Two other things made ‘Oumuamua unique. First, it appeared to have a dry, rocky surface, akin to the asteroids known in our solar system. But it also changed its orbit in a way that could not purely be explained by the laws of gravity – something else made it change direction.
Redirections like this are sometimes seen in icy comets. As they approach the Sun, off-gassing released from the heated ice acts like a thruster, changing the comet’s trajectory.
An artist’s depiction of the interstellar comet ‘Oumuamua, as it warmed up in its approach to the sun and outgassed hydrogen (white mist), which slightly altered its orbit. (Image credit: NASA, ESA and Joseph Olmsted and Frank Summers of STScI)
Somehow, ‘Oumuamua displayed a mix of both comet-like and asteroid-like properties.
One plausible explanation, proposed in 2020, is that ‘Oumuamua-like objects are formed by tidal fragmentation. That’s when a ‘volatile-rich’ parent body (like a large comet) passes too close to its star at high speeds, shattering it into long, thin shards. The heating process in these extreme interactions causes the formation of an elongated rocky shell, but preserves an interior of subsurface ice. This unique combination, not seen in our own solar system, would explain ‘Oumuamua’s orbital maneuvers despite its rocky composition.
It also explains why we don’t tend to see them in our solar system, because “ejected planetesimals experienced tidal fragmentation at more than twice the rate of surviving planetesimals (3.1% versus 1.4%),” the authors write. In other words, if the orbital forces are strong enough for tidal fragmentation to happen, it also means they’re strong enough to kick the object out of the system entirely.
Interstellar space may therefore be full of dagger-shaped shards of rock and ice (an exaggeration, but a fun quote for dinner parties nonetheless).
The white dwarf Sirius B compared to Earth. Credit: ESA and NASA
The simplest star systems that could cause this type of tidal fragmentation are those home to white dwarfs. These are the extremely dense, burned-out cores of old stars. A white dwarf, encircled by a belt of distant comet-like objects similar to the Sun’s Oort cloud, could spawn ‘Oumuamua clones with regular frequency.
But the process is enhanced in systems that host Jupiter-sized planets.
The exception is ‘Hot Jupiters’ that orbit close to their star. These are less likely to interact with objects subject to tidal fragmentation.
Jupiter-sized planets distant from their host star, however, are very effective at producing ‘Oumuamua clones, especially if they have eccentric orbits. But even here, it’s not a perfect match for the origin of ‘Oumuamua, because these interactions tend to produce shards that are not as elongated, and at a rate lower than what is expected for ‘Oumuamua-type objects.
The authors conclude that the planetary systems most likely to have spawned ‘Oumuamua are those with many planets, which are more “efficient at producing interstellar objects,” the authors say, though they propose a few other possibilities too.
So while there is now a strong, plausible explanation for the process that birthed ‘Oumuamua, the type of solar system that produced it is still very much an open question.
Xi-Ling Zheng and Ji-Lin Zhou, “Configuration of single giant planet systems generating ‘Oumuamua-like interstellar asteroids.” Monthly Notices of the Royal Astronomical Society.
The post Many Stars Could Have Sent Us ‘Oumuamua appeared first on Universe Today.
NASA continues to progress with the development of the Nancy Grace Roman Space Telescope (RST), the next-generation observatory with a target launch date of 2027. As the direct successor to the venerable Hubble Space Telescope, Roman will build on the successes of Hubble and the James Webb Space Telescope (JWST). Named after NASA’s first chief astronomer, the “mother of the Hubble,” the Nancy Grace Roman Space Telescope will have a panoramic field of view 200 times greater than Hubble’s infrared view, enabling the first wide-field maps of the Universe.
Combined with observations by the ESA’s Euclid mission, these maps will help astronomers resolve the mystery of Dark Matter and cosmic expansion. The development process reached another milestone as the mission team at NASA’s Goddard Space Flight Center successfully integrated the mission’s sunshade—a visor-like aperture cover—into the outer barrel assembly. This deployable structure will shield the telescope from sunlight and keep it at a stable temperature, allowing it to take high-resolution optical and infrared images of the cosmos.
Similar in function to Webb‘s sunshield, Roman’s is designed to make its instruments more sensitive to faint light sources, allowing the telescope to resolve distant galaxies, dimmer stars, brown dwarfs, and the gas and dust that permeate the interstellar medium (ISM). The shield consists of two layers of reinforced thermal blankets that will remain folded during launch, allowing the telescope to fit inside its payload fairing. It will deploy once the telescope has reached space using a system of three booms that are triggered electronically.
NASA’s Nancy Grace Roman Space Telescope, named after NASA’s first Chief of Astronomy.
The integration took a few hours, during which the technicians joined the sunshield and outer barrel assembly in the largest clean room at NASA Goddard. In addition to protecting the telescope from micrometeoroid impacts, the outer barrel assembly will also prevent light contamination and keep the telescope at a stable temperature. This will be accomplished by a series of heaters that prevent the telescope’s mirrors from experiencing temperature swings that would cause them to expand and contract. Said Brian Simpson, Roman’s deployable aperture cover lead at NASA Goddard, in a NASA press release:
“We’re prepared for micrometeoroid impacts that could occur in space, so the blanket is heavily fortified. One layer is even reinforced with Kevlar, the same thing that lines bulletproof vests. By placing some space in between the layers we reduce the risk that light would leak in, because it’s unlikely that the light would pass through both layers at the exact same points where the holes were.”
With this integration complete, the mission has now passed the Key Decision Point-D (KDP-D) milestone, marking the transition from fabrication to the assembly phase. This will be followed by the integration and testing phases, which Roman is on track to complete by fall 2026, followed by launch no later than May 2027. The sunshade and outer barrel assembly were built by Goddard engineers and have been individually tested many times. Following the integration, the engineers conducted a deployment test that verified that the components function together.
Since the sunshade was designed to deploy in space, the system isn’t powerful enough to deploy in Earth’s gravity, so the test involved a gravity negation system to offset its weight. Next, the team will conduct a thermal vacuum test to ensure the components function in the temperature and pressure environment of space. After that, they will put the assembled components through a shake test to simulate the intense vibrations they will experience during launch.
The view from below the baffles of the Roman Space Telescope’s Outer Barrel Assembly towards the deployed Deployable Aperture Cover. Credit: NASA/Chris Gunn
In the coming months, technicians will attach the telescope’s solar panels (which completed testing this past summer) to the outer barrel assembly and sunshade. The team expects to have these components integrated with the rest of the observatory by the end of the year. Said Laurence Madison, a mechanical engineer at NASA Goddard:
“Roman is made up of a lot of separate components that come together after years of design and fabrication. The deployable aperture cover and outer barrel assembly were built at the same time, and up until the integration the two teams mainly used reference drawings to make sure everything would fit together as they should. So the successful integration was both a proud moment and a relief!”
In addition to surveying billions of galaxies and investigating the mystery of Dark Energy, Roman will use its wide-field imagers and advanced suite of spectrometers to directly image exoplanets and planet-forming disks, supermassive black holes (SMBHs), stellar nurseries, and small bodies in our Solar System. Said Sheri Thorn, an aerospace engineer working on Roman’s sunshade at NASA Goddard:
“It’s been incredible to see these major components go from computer models to building and now integrating them. Since it’s all coming together at Goddard, we get a front row seat to the process. We’ve seen it mature, kind of like watching a child grow up, and it’s a really gratifying experience.”
Further Reading: NASA
The post Construction of Roman Continues With the Addition of its Sunshade appeared first on Universe Today.
Our neighbour, the Large Magellanic Cloud (LMC), is rich in gas and dust and hosts regions of extremely robust star formation. It contains about 700 open clusters, groups of gravitationally bound stars that all formed from the same giant molecular cloud. The clusters can contain thousands of stars, all emitting vibrant energy that lights up their surroundings.
One of these clusters is NGC 2040 in the constellation of Dorado, and the Gemini South Telescope captured its portrait.
NGC 2040 is noteworthy because it contains so many O-type and B-type stars. They’re hot, massive stars that tend to live fast and die young as explosive supernovae. The cluster contains more than a dozen of these stars.
There are two things at play in this image. Supernova explosions buffet the gas and dust and help shape the nebula while the young stars light it up. The explosions also create shock waves that compress the surrounding gas, leading to the formation of the next generation of stars.
A press release describes the nebula as a “Valentine’s Day rose.” What we’re really seeing is oxygen and hydrogen atoms energized by UV light from young stars and emitting light at different wavelengths. However, since it’s Valentine’s Day, we’ll concede to their more poetic description.
Human eyes can never see something like this naturally. The light spans wavelengths from the ultraviolet to the optical to the infrared. Instead, the Gemini South telescope captures the light at wavelengths beyond our range. The telescope employs filters to manage the light, showing us the deep red and orange colours from hydrogen and the light blue of oxygen. Bright white regions are abundant in both. It’s a nice partnership between telescope technology and human vision.
NGC 2040 is part of a larger structure called LH 88, one of the LMC’s largest star formation regions. The stars in the cluster are moving together, though they’re widely separated. They’re ensconced in gas and dust, some left behind by stars that have already exploded as supernovae. The gas and dust are further shaped by the strong stellar winds from so many young stars.
Our Sun likely formed in a cluster similar to NGC 2040. However, since that happened about five billion years ago, the stars have dispersed, and so have the gas and dust. There’s no more nebula.
The Hubble Space Telescope captured this image of NGC 2040 back in 2012 with its Wide Field Planetary Camera 2. Image Credit: ESA/Hubble, NASA and D. A Gouliermis. Acknowledgement: Flickr user Eedresha Sturdivant
It might not seem like it in our busy lives here on Earth’s surface, but this image tells a story we’re all wrapped up in: the cyclical nature of birth, death, and rebirth. When stars die and explode as supernovae, their material is expelled into space and taken up in the next round of star formation. And who knows, some of that material may be taken up in planet formation, maybe even rocky planets in the habitable zones of some of the new stars. Perhaps life will take root on one of those planets.
A zoom-in of the main image. Are planets forming in here somewhere? Rocky ones in habitable zones? Image Credit: International Gemini Observatory/NOIRLab/NSF/AURA
Nothing lasts forever. Everything has a beginning and an end. One day, our Sun will become a red giant, Earth will be destroyed, and humanity may be destroyed with it. Though it’s a bleak proposition, it seems likely. But so is a kind of rebirth in a Universe that constantly recycles matter.
“Death is certain for one who has been born, and rebirth is inevitable for one who has died,” the Bhagavad Gita tells us. “Therefore, you should not lament over the inevitable.”
The post A Flaming Flower in the Large Magellanic Cloud appeared first on Universe Today.
New research on locomotion techniques that could be used in space exploration is constantly coming out. A lab from UCLA known as the Robotics and Mechanisms Laboratory (RoMeLa) is presenting a paper at the upcoming IEEE Aerospace Conference in March that details a unique system. The Space and Planetary Limbed Intelligent Tether Technology Exploration Robot (SPLITTER) consists of two miniaturized jumping robots tethered together.
Such a system might sound like a recipe for chaos and bring back memories of ladder ball games where no amount of control seems to make the tether go where you want it to. But, according to the paper, that system is actually quite stable, even in airless environments.
Mechanically, their system consists of two four-legged robots designed for jumping and tied together at their tops by a tether. Jumping is much more effective than “roving” on the surface of an asteroid because of all the jagged obstacles that need to be avoided. It is also more effective than flying, since in many space environments there is no atmosphere to push against. Jumping robots have been around for a while, however; the real secret sauce is in the controls the RoMeLa team has developed.
Video describing some of the underlying tech of the SPLITTER robot.
The concept they used is called inertial morphing. In the case of SPLITTER, the robots “adjust inertia with changes in limb configurations and tether length,” according to lead author Yusuke Tanaka in an interview with TechXplore. The researchers turned to a technique called Model Predictive Control (MPC) to determine how each variable needs to be adjusted.
MPC is used in various industries and works as advertised: there is a model (a mathematical representation of the robots) and a prediction, which reflects what the software estimates will happen to the model next. With the model’s current state and expected next state, a controller can adjust the variables that affect the model’s state. Those adjustments result in a stable flight path, allowing SPLITTER to soar even without air. The system also exploits a physical phenomenon known as the Tennis Racket Theorem, which describes how an object rotating around its intermediate axis can spontaneously flip its rotation. Most famously, this was demonstrated on the ISS with a t-handle. It looks chaotic, but the mathematics behind the motion are well understood.
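The intermediate-axis instability behind the Tennis Racket Theorem is easy to reproduce numerically. The sketch below integrates Euler's rigid-body equations with arbitrary, illustrative moments of inertia (it is a generic demonstration, not the SPLITTER team's code): a spin started about the intermediate axis with a tiny wobble periodically reverses direction, just like the t-handle on the ISS.

```python
# Euler's rigid-body equations: a body spun about its intermediate axis
# (I1 < I2 < I3, axis 2 here) is unstable and periodically flips its spin.
def deriv(w, I):
    w1, w2, w3 = w
    I1, I2, I3 = I
    return ((I2 - I3) / I1 * w2 * w3,
            (I3 - I1) / I2 * w3 * w1,
            (I1 - I2) / I3 * w1 * w2)

def rk4_step(w, I, dt):
    """One fourth-order Runge-Kutta step of the angular velocity vector."""
    k1 = deriv(w, I)
    k2 = deriv([w[i] + 0.5 * dt * k1[i] for i in range(3)], I)
    k3 = deriv([w[i] + 0.5 * dt * k2[i] for i in range(3)], I)
    k4 = deriv([w[i] + dt * k3[i] for i in range(3)], I)
    return [w[i] + dt / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(3)]

I = (1.0, 2.0, 3.0)      # arbitrary moments of inertia, I1 < I2 < I3
w = [0.01, 1.0, 0.01]    # mostly spinning about axis 2, with a small wobble
history = [w[1]]
for _ in range(8000):    # integrate 40 time units at dt = 0.005
    w = rk4_step(w, I, 0.005)
    history.append(w[1])

# The spin about the intermediate axis reverses sign during the run.
print(min(history))
```

The perturbation grows exponentially until the body tumbles through a full flip, then the cycle repeats, which is exactly why the motion looks chaotic while being fully deterministic.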
Implementing it in a tethered robotic system is another matter altogether, though. While SPLITTER is flying, it looks a lot like a bola used in ladder ball, except instead of round spheres on each end, it’s a robot body with four legs splayed out in different directions. The orientation of how those legs are spread out and the length of the tether connecting the two ends are the variables the MPC controls to stabilize its flight. SPLITTER can operate without heavy attitude control hardware, like reaction wheels or thrusters.
Famous video of the Tennis Racket Effect on the ISS.
The tether also allows the system to perform other actions, like spelunking, where one robot is anchored firmly to the top of a cave system while the other rappels using the tether. Each robot weighs only about 10 kg on Earth, which would make the pair even more agile on a lower-gravity world like the Moon or an asteroid.
This isn’t the first robot system the RoMeLa lab has designed for this purpose. They initially worked on the Spine-enhanced Climbing Autonomous Legged Exploration Robot (SCALER), but found the limbed climbing robot was too slow.
With SPLITTER, the research team thinks they have a better concept that can both traverse terrain faster and collect data that a robot tied to the ground would be unable to do. Unfortunately, for now, at least, SPLITTER is best described as a computer model, though some preliminary work has been done on the physics of MPC controlling a reaction wheel. Researchers at the lab intend to continue working on the concept, so maybe soon we’ll see a bola robot test jumping near Los Angeles.
Learn More:
TechXplore – Modular robot design uses tethered jumping for planetary exploration
Tanaka, Zhu, & Hong – Tethered Variable Inertial Attitude Control Mechanisms through a Modular Jumping Limbed Robot
UT – Miniaturized Jumping Robots Could Study An Asteroid’s Gravity
UT – A Jumping Robot Could Leap Over Enceladus’ Geysers
Lead Image:
Depiction of one SPLITTER robot descending into a crater while the other anchors on the rim.
Credit – Yusuke Tanaka, Alvin Zhu, & Dennis Hong
The post A Bola Robot Could Provide Stable Jumping Capability on Low-Gravity Bodies appeared first on Universe Today.
White dwarfs are the remnants of once brilliant main sequence stars like our Sun. They’re extremely dense and no longer perform any fusion. The light they radiate is from remnant heat only.
Astronomers have doubted that white dwarfs could host habitable planets, partly because of the tumultuous path they follow to become white dwarfs, but new research suggests otherwise.
White dwarfs are so small that their habitable zones are correspondingly small, ranging from only 0.0005 to 0.02 AU from the star. At that range, any planets would be tidally locked. One side of the planet could suffer from a runaway greenhouse effect, while the other could be frigid. Another problem concerns the existence of white dwarf planets themselves: there are indications that they exist, but how common they are is unknown.
There are about 10 billion white dwarfs (WDs) in the Milky Way, and new research in The Astrophysical Journal suggests that some of them could harbour life-supporting planets. The research is titled “Increased Surface Temperatures of Habitable White Dwarf Worlds Relative to Main-sequence Exoplanets.” The lead author is Aomawa Shields, associate professor of physics and astronomy at UC Irvine.
“These results suggest that the white dwarf stellar environment, once thought of as inhospitable to life, may present new avenues for exoplanet and astrobiology researchers to pursue.”
Aomawa Shields, lead author, UC Irvine
“Discoveries of giant planet candidates orbiting white dwarf (WD) stars and the demonstrated capabilities of the James Webb Space Telescope bring the possibility of detecting rocky planets in the habitable zones (HZs) of WDs into pertinent focus,” the authors write. If we do find more WD planets with the JWST or other telescopes, how likely is it that they’re habitable?
This research sought to find out by simulating two Earth-like aqua planets (ocean worlds) orbiting two different stars. They’re both tidally locked, follow circular orbits, and have Earth’s mass, atmospheric composition, and surface pressure. One is in the HZ of a main sequence star named Kepler-62, and the other is in the HZ of a hypothetical WD. Astronomers have already discovered large planets around WDs, so this simulation is based on real situations.
The researchers created synthetic spectra for both Kepler-62 and the white dwarf based on what is known about both. This image shows the spectral energy distribution of the modelled WD with an effective temperature of 5000 K (red) and a synthetic spectrum of Kepler-62 (4859 K, purple). Image Credit: Shields et al. 2025.
“While white dwarf stars may still give off some heat from residual nuclear activity in their outer layers, they no longer exhibit nuclear fusion at their cores. For this reason, not much consideration has been given to these stars’ ability to host habitable exoplanets,” lead author Shields said in a press release. “Our computer simulations suggest that if rocky planets exist in their orbits, these planets could have more habitable real estate on their surfaces than previously thought.”
Shields and her co-researchers used a 3D climate model to simulate planets around the stars. Both planets are tidally locked to their stars. Although both stars have similar effective temperatures, the results show that the planets’ climates differ considerably. The HZ around the white dwarf is much closer in, meaning its planet is closer too. That proximity gives the planet a higher surface temperature and a much faster rotation period, which is critical to the results.
“The synchronously rotating WD planet’s global mean surface temperature is 25 K higher than that of the synchronously rotating planet orbiting K62 due to its much faster (10 hr) rotation and orbital period,” the authors explain in their paper.
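The quoted 10-hour period follows directly from Kepler's third law once a planet sits in the white dwarf's tight habitable zone. The quick check below uses illustrative values: a typical white dwarf mass of 0.6 solar masses and an orbital distance of 0.0096 AU (within the 0.0005–0.02 AU HZ range mentioned earlier), plus roughly 0.5 AU around Kepler-62 for comparison. These are assumptions, not the paper's exact parameters.

```python
import math

def orbital_period_days(a_au, m_solar):
    """Kepler's third law: P[yr] = sqrt(a^3 / M), a in AU, M in solar masses."""
    return math.sqrt(a_au**3 / m_solar) * 365.25

# Hypothetical WD planet: a = 0.0096 AU, M_wd = 0.6 M_sun (assumed values)
p_wd = orbital_period_days(0.0096, 0.6)
print(f"WD planet: {p_wd:.2f} days ({p_wd * 24:.1f} hours)")

# A planet at ~0.5 AU around Kepler-62 (M ~ 0.69 M_sun, assumed values)
p_k62 = orbital_period_days(0.5, 0.69)
print(f"Kepler-62 planet: {p_k62:.0f} days")
```

With these inputs the WD planet comes out near half a day, matching the ~10-hour rotation and orbital period in the paper, while the Kepler-62 planet's period lands near the 155 days quoted for the simulation.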
The simulated planet orbiting K62 had a much longer orbital period, which allowed a large mass of water vapour clouds to accumulate on the dayside. These clouds cooled more of the planet’s surface, subtracting habitable surface area. “The planet orbiting Kepler-62 has so much cloud cover that it cools off too much, sacrificing precious habitable surface area in the process,” Shields said.
“On the other hand, the planet orbiting the white dwarf is rotating so fast that it never has time to build up nearly as much cloud cover on its dayside, so it retains more heat, and that works in its favor,” Shields said.
The WD planet’s faster rotation circulated the atmosphere more effectively, avoiding the runaway greenhouse effect. “This ultrafast rotation generates strong zonal winds and meridional flux of zonal momentum, stretching out and homogenizing the scale of atmospheric circulation and preventing an equivalent buildup of thick, liquid water clouds on the dayside of the planet compared to the synchronous planet orbiting K62,” the paper states. The authors also explain that this transports heat from higher latitudes toward the equator and that this pattern is seen in other simulations of short-period planets.
The simulations show that zonal winds are weaker on the K62 planet (left) than on the WD planet (right). The WD planet’s more powerful winds create a more habitable planet. Image Credit: Shields et al. 2025
“We expect synchronous rotation of an exoplanet in the habitable zone of a normal star like Kepler-62 to create more cloud cover on the planet’s dayside, reflecting incoming radiation away from the planet’s surface,” Shields said. “That’s usually a good thing for planets orbiting close to the inner edge of their stars’ habitable zones, where they could stand to cool off a bit rather than lose their oceans to space in a runaway greenhouse. But for a planet orbiting squarely in the middle of the habitable zone, it’s not such a good idea.”
This figure shows surface temperatures on the K62 planet (left), which has a 155-day orbit, and the WD planet (right), which has a 0.44-day orbit. The planet orbiting K62 “shows a characteristic, oval-shaped temperature pattern,” the authors write, with the hottest point at the substellar point on the dayside and a cold nightside. The WD planet has stretched-out scales of circulation across the planet and midlatitude jets. The hottest surface temperatures are located in the midlatitude jets, similar to simulations of other short-period planets. Image Credit: Shields et al. 2025.
Fewer clouds on the dayside of WD planets, combined with a stronger greenhouse effect on the night side, would create warmer, more habitable conditions than on the Kepler-62 planet, despite the fact that WD energy outputs slowly decline over time. If these results hold up, they could be game-changing in our search for exoplanets in habitable zones.
“White dwarfs may, therefore, present amenable environments for life on planets formed within or migrated to their HZs, generating warmer surface environments than those of planets with main-sequence hosts to compensate for an ever-shrinking incident stellar flux,” the authors explain.
“These results suggest that the white dwarf stellar environment, once thought of as inhospitable to life, may present new avenues for exoplanet and astrobiology researchers to pursue,” Shields said.
What’s not clear is how many planets there are around WDs. The transition from a red giant to WD isn’t a peaceful process. When red giants expand, they engulf and destroy nearby planets. Our Sun will one day become a red giant, and it will engulf Mercury, Venus, and probably Earth. Maybe even Mars.
Artist’s impression of a red giant star. When red giants expand, they engulf and destroy nearby planets. Planets further away could migrate inwards and orbit the star when it’s a white dwarf. Image Credit: NASA/Walt Feimer
These destroyed planets can form a debris disk around the white dwarf, from which a new generation of planets could emerge. Or planets further away from the red giant could survive and move closer to the star as it undergoes its changes. More research is needed to understand these possibilities.
“As it is likely that many of the planets orbiting WD progenitors will have been engulfed during the red giant phase, WD planets may be few within their systems and possibly orbiting alone in single-planet systems,” the authors write.
Our knowledge of exoplanet habitability is incomplete. Yet, it’s a critical issue in understanding the Universe and one of our biggest questions: Is there other life? We can’t answer the big one without a much better understanding of habitability and what conditions it exists in. The only way to gain that knowledge is with more powerful observations.
“As powerful observational capabilities to assess exoplanet atmospheres and astrobiology have come on line, such as those associated with the James Webb Space Telescope, we could be entering a new phase in which we’re studying an entirely new class of worlds around previously unconsidered stars.”
Press Release: UC Irvine astronomers gauge livability of exoplanets orbiting white dwarf stars
Research: Increased Surface Temperatures of Habitable White Dwarf Worlds Relative to Main-sequence Exoplanets
The post White Dwarfs Could Be More Habitable Than We Thought appeared first on Universe Today.
We all know that asteroids are out there, that some of them come dangerously close to Earth, and that they’ve struck Earth before with catastrophic consequences. The recent discovery of asteroid 2024 YR4 reminds us of the persistent threat that asteroids present. There’s an organized effort to find dangerous space rocks and determine how far away they are and where their orbits will take them.
A team of scientists has developed a method that will help us more quickly determine an asteroid’s distance, a critical part of determining its orbit.
Our asteroid concern is centred on NEOs or Near-Earth Objects. These are asteroids whose closest approach to the Sun is less than 1.3 astronomical units (AU). (A small number of NEOs are comets.) There are more than 37,000 NEOs, and while potential impacts are rare, the results can be catastrophic. Considering what happened to the dinosaurs, there’s not much room for complacency or hubris.
Large asteroids in the Main Asteroid Belt (MAB) are easier to study. Their large sizes mean they produce a bigger signal when observed, and astronomers can more easily determine their orbits. However, the MAB holds many smaller asteroids around 100-200 meters. There could be hundreds of millions of them. They’re big enough to devastate entire cities if they strike Earth, and they’re more difficult to track. The first step in determining their orbits is determining their distances, which is challenging and takes time.
Recent research submitted to The Astronomical Journal presents a new method of determining asteroid distances in much less time. It’s titled “Measuring the Distances to Asteroids from One Observatory in One Night with Upcoming All-Sky Telescopes” and is available at arxiv.org. The lead author is Maryann Fernandes from the Department of Electrical and Computer Engineering at Duke University.
The Vera Rubin Observatory (VRO) should see its first light in July 2025. One of its scientific objectives is to find more small objects in the Solar System, including asteroids, by scanning the entire visible southern sky every few nights. If it moves and reflects light, the VRO has a good chance of spotting it. However, it won’t automatically determine the distance to asteroids.
The Vera Rubin Observatory is poised to begin observations in 2025. It could detect 130 Near Earth Objects each night. Image Credit: Rubin Observatory/NSF/AURA/B. Quint
“When asteroids are measured with short observation time windows, the dominant uncertainty in orbit construction is due to distance uncertainty to the NEO,” the authors of the new paper write. They claim their method can shorten the time it takes to determine an asteroid’s distance to one night of observations. It’s based on a technique called topocentric parallax.
Topocentric parallax is based on the rotation of the Earth. In a 2022 paper by some of the same researchers, the authors wrote that “Topocentric parallax comes from the diversity of the observatory positions with respect to the center of the Earth in an inertial reference frame. Observations from multiple observatories or a single observatory can measure parallax because the Earth rotates.”
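The size of the effect is easy to estimate: as the Earth rotates, a single observatory is carried across a baseline of up to roughly one Earth diameter, and the asteroid's apparent position shifts by about baseline/distance radians against the background stars. The numbers below are round illustrative values, not figures from the paper.

```python
import math

EARTH_DIAMETER_KM = 12742   # maximum single-site baseline from Earth's rotation
AU_KM = 1.496e8             # kilometres per astronomical unit
RAD_TO_ARCSEC = 206265      # arcseconds per radian

def parallax_arcsec(distance_au, baseline_km=EARTH_DIAMETER_KM):
    """Small-angle topocentric parallax for a given observer baseline."""
    return baseline_km / (distance_au * AU_KM) * RAD_TO_ARCSEC

for d in (0.05, 0.3, 2.4):  # the synthetic sample's distance range
    print(f"{d:>4} AU: ~{parallax_arcsec(d):.0f} arcsec")
```

Even at 2.4 AU the shift is several arcseconds, well above typical astrometric uncertainties, which is why a single night of well-spaced observations from one site can constrain the distance.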
In the two years since that paper, the researchers have refined their method. The research expands on previous algorithms and tests the technique using both synthetic data and real-world observations.
“In this paper, we further develop and evaluate this technique to recover distances in as quickly as a single night,” the authors write in the new paper. “We first test the technique on synthetic data of 19 different asteroids ranging from ~ 0.05 AU to ~ 2.4 AU.”
The figure below shows the results of the test with synthetic data. Each asteroid was observed six times in one night, and two different equations were employed to process the data.
This figure shows the measured and true distances to 19 asteroids as part of the method’s test. In this test, each asteroid was observed six times in one night. The top shows measured distance (AU) versus true distance (AU) for all 19 asteroids considered in this analysis. Each panel is based on a separate equation that can be employed in the method. “We see the fit from Eq. 1 for the group of asteroids yielding precise distances with relatively good agreement with true distances,” the authors write. Image Credit: Fernandes et al. 2025.
The researchers also tested their method by taking 15 observations of each asteroid over five nights (3 per night). In this test, Equation 1 performed poorly, while Equation 2 performed well.
This scenario featured 15 observations taken over 5 nights, with three observations per night. Equation 1 produces poor distance agreement, while with Equation 2, the distance recovery improves. Image Credit: Fernandes et al. 2025.
Of course, the distance to the asteroid affected the accuracy of the measurements. The closer the object was, the more precise the measurement was. The paper notes that the method was able to recover distances “with uncertainties as low as the ~ 1.3% level for more nearby objects (about 0.3 AU or less) assuming typical astrometric uncertainties.”
After these tests with synthetic data, the team acquired their own single-night observations of two asteroids using a different algorithm. The real observations produced a less precise result, but it was still a meaningful improvement. The authors explain that they were able to recover distances “to the 3% level.”
So, what do all these tests, equations, and algorithms boil down to?
When we hear of an asteroid that could potentially strike Earth in a few years, people wonder why the situation is so uncertain. Shouldn’t we know if an asteroid is heading straight for us? Trying to determine the orbit of these small rocks from tens of millions of km away is extremely difficult. An AU is almost 150 million km (93 million miles), and 2024 YR4, the latest asteroid of concern, is only 40 to 90 metres (130 to 300 ft) in diameter. Those numbers illustrate the problem.
If this method can improve the accuracy of our distance measurements and do it based on a single night of observations, that’s a big improvement.
The technique can be applied to data generated by the Vera Rubin Observatory and the Argus Array. According to the authors, “distances to NEOs on the scale of ~ 0.5 AU can be constrained to below the percent level within a single night.” As the study shows, the accuracy of those measurements from a single-site observatory depends heavily on the spacing between individual observations. If multiple observatories at different sites are used on the same night, the accuracy increases.
The Argus Array is a planned astronomical survey instrument that will be unique in its ability to observe the entire visible sky simultaneously. It will consist of 900 small telescopes, each with its own camera. It’s currently under construction, but its location isn’t being publicized. The researchers say their method can work with Argus’ data. Image Credit: Argus Array
Though larger asteroids, like the one that wiped out the dinosaurs, tend to remain stable in the main asteroid belt, smaller asteroids are more easily perturbed and can become part of the NEO population. An impact from a smaller asteroid might not spell the end of civilization, but it can still be extremely destructive.
Anything humanity can do to understand the asteroid threat is wise. Many asteroids have struck Earth in the past, and it’s only a matter of time before another one comes our way. If we can see it coming in advance, we can try to do something about it.
Research: Measuring the Distances to Asteroids from One Observatory in One Night with Upcoming All-Sky Telescopes
The post Dramatically Decreasing the Time it Takes to Measure Asteroid Distances appeared first on Universe Today.
It’s a familiar sight to see astronauts on board the ISS using exercise equipment to minimise muscle and bone loss from weightlessness. A new study suggests that jumping workouts could help astronauts prevent cartilage damage during long missions to the Moon and Mars. The researchers found that knee cartilage in mice seems to grow stronger after jumping exercises, potentially counteracting the effects of low gravity on joint health. If effective in humans, this approach could be included in pre-flight routines or adapted for space missions.
In space, astronauts experience significant loss of bone and muscle mass due to microgravity. Without Earth’s gravitational pull, bones lose density, increasing fracture risk, while muscles, especially in the lower body and spine, weaken from reduced use. This deterioration can impair mobility back on Earth and affect overall health. To combat this, astronauts follow rigorous exercise routines, including resistance and cardiovascular training, to maintain strength and bone integrity.
ESA astronaut Alexander Gerst gets a workout on the Advanced Resistive Exercise Device (ARED). Credit: NASA
The next obvious step as we reach out into the Solar System is the red planet Mars. Heading that far out will demand long periods in space, since it’s a nine-month journey there. Permanent bases on the Moon will also test our physiology to its limits, so managing the slow degradation is a big challenge for space agencies. A paper published by lead author Marco Chiaberge from Johns Hopkins University has explored the knee joints of mice and how their cartilage grows thicker if they jump. They suggest astronauts should embed jumping activities into their exercise regimen.
Mars seen before, left, and during, right, a global dust storm in 2001. Credit: NASA/JPL/MSSS
Cartilage cushions the joints between bones and decreases friction, allowing for pain-free movement. Unlike many other tissues in the body, cartilage does not regenerate quickly, so it is important to protect it. Prolonged periods of inactivity, from bed rest and especially from long-duration space flight, can accelerate the degradation. It’s also been shown that radiation from space can accelerate the effect.
To maintain a strong, healthy body, astronauts spend a lot of time, up to two hours a day, running on treadmills. This has previously been shown to slow the breakdown of cartilage, but the new study shows that jumping-based movements are particularly effective.
The team of researchers found that, over a nine-week program of reduced movement, mice experienced a 14% reduction in cartilage thickness in their joints. Other mice performed jumping movements three times a week, and their cartilage showed a 26% increase in thickness compared to a control group. Compared to the group with restricted movement, the jumping mice had 110% thicker cartilage. The study also showed that jumping increased bone strength, with the jumping mice having 15% higher bone density than the control group.
It’s an interesting piece of research, and further work is needed to see whether jumping would deliver the same benefits in humans, but the study is promising. If so, jumping exercises are likely to become part of pre-flight and in-flight exercise programs for astronauts. For this to work in a microgravity environment, astronauts would likely need to be attached to strong elasticated material to simulate the pull of gravity.
Source : Jumping Workouts Could Help Astronauts on the Moon and Mars, Study in Mice Suggests.
The post Should Astronauts Add Jumping to their Workout Routine? appeared first on Universe Today.
One of the basic principles of cosmology is the Cosmological Principle. It states that, no matter where you go in the Universe, it will always be broadly the same. Given that we have only explored our own Solar System, there is currently no empirical way to measure this. A new study proposes that we can test the Cosmological Principle using weak gravitational lensing. The team suggests that by measuring tiny distortions in light as it passes through these lenses, it may be possible to find out if there are differences in density far away.
The Cosmological Principle is a fundamental assumption stating that the universe is homogeneous on a large scale. In other words, regardless of location or direction, the universe appears uniform, and this assumption underpins many cosmological models, including the Big Bang theory. Assuming that physical laws apply consistently everywhere makes calculations and predictions about the universe’s structure and evolution far simpler, but researchers have been testing its validity by searching for potential anomalies.
This illustration shows the “arrow of time” from the Big Bang to the present cosmological epoch. Credit: NASA
A paper has been published by a team of astrophysicists, led by James Adam from the University of the Western Cape in South Africa, explaining that the Standard Model of Cosmology predicts the Universe has no centre and no preferred directions (isotropy). The paper, published in the Journal of Cosmology and Astroparticle Physics, articulates a new way to test the isotropy of the Universe using the Euclid space telescope.
The Euclid telescope is a European Space Agency mission to explore dark matter and dark energy. It was launched in 2023 and maps the positions and movements of billions of galaxies. It’s using this instrument that the team hope to search for variations in the structure of the Universe that might challenge the Cosmological Principle.
Artist impression of the Euclid mission in space. Credit: ESA
Previous studies have found such anomalies before: there are conflicting measurements of the expansion rate of the Universe, in the cosmic microwave background radiation, and in various other cosmological data. Further independent observations are required, though, to provide more data and determine whether those observations were the result of measurement errors.
The team explore using weak gravitational lenses, which occur when matter sits between us and a distant galaxy, slightly bending the galaxy’s light. Analysis of this distortion can be separated into two components: E-mode shear, caused by the distribution of matter in an isotropic and homogeneous Universe, and B-mode shear, which is weak and would not appear at large scales in an isotropic Universe.
Detecting large-scale B-modes by itself wouldn’t be enough to confirm anisotropy, since the measurements are tiny and prone to errors. To properly test the Cosmological Principle, correlations with E-mode shear need to be detected as well. Finding correlated E-mode and B-mode shear would suggest the expansion of the Universe is anisotropic.
Ahead of the Euclid observations, the team simulated the effects of anisotropic universal expansion on a computer. They were able to use the model to describe the resulting weak lensing effect and predict that Euclid data would be sufficient to complete the study.
Source : Does the universe behave the same way everywhere? Gravitational lenses could help us find out
The post Do We Live in a Special Part of the Universe? Here’s How to Find Out appeared first on Universe Today.
Hypervelocity stars have been seen before, but NASA scientists have just identified a potential record-breaker: a hypervelocity star with a super-Neptune exoplanet in orbit around it. This discovery could reshape our understanding of planetary and orbital mechanics, and understanding more about these fascinating high-velocity stars challenges current models of stellar evolution. However it formed, it’s amazing that the star has somehow managed to hang on to its planet through the process!
High-velocity stars travel through space at extraordinarily high speeds, often in excess of hundreds of kilometres per second. These rapidly moving stars are usually expelled from their galaxies due to gravitational forces, perhaps from close encounters with supermassive black holes or other stars. Some of them move so fast that they can break free from the Milky Way’s gravitational pull. It’s important to study them as they offer crucial insights into the dynamics of our Galaxy, interactions with black holes, and even the distribution of dark matter across the cosmos.
The positions and reconstructed orbits of 20 high-velocity stars, represented on top of an artistic view of our Galaxy, the Milky Way. Credit: ESA (artist’s impression and composition); Marchetti et al. 2018 (star positions and trajectories); NASA / ESA / Hubble (background galaxies)
Details of the discovery were published in a paper authored by lead astronomer Sean Terry in The Astronomical Journal. It tells of the discovery of what the team thinks is a super-Neptune world in orbit around a low-mass star. The system is travelling at an estimated 540 kilometres per second! If it were aligned with our own Solar System, with the star where our Sun is, the planet would sit somewhere between the orbits of Venus and Earth. Terry, a researcher at the University of Maryland, said “it will be the first planet ever found orbiting a hypervelocity star.”
Finding objects like this in space is tricky. This object was first seen in 2011 following analysis of data from the Microlensing Observations in Astrophysics survey that had been conducted by the University of Canterbury in New Zealand. The study had been on the lookout for evidence for exoplanets around distant stars.
The star-filled sky in this NASA/ESA Hubble Space Telescope photo lies in the direction of the Galactic centre. The light from stars is monitored to see if any change in their apparent brightness is caused by a foreground object drifting in front of them. The warping of space by the interloper would momentarily brighten the appearance of a background star, an effect called gravitational lensing. One such event is shown in the four close-up frames at the bottom. The arrow points to a star that momentarily brightened, as first captured by Hubble in August 2011. This was caused by a foreground black hole drifting in front of the star, along our line of sight. The star brightened and then subsequently faded back to its normal brightness as the black hole passed by. Because a black hole doesn’t emit or reflect light, it cannot be directly observed. But its unique thumbprint on the fabric of space can be measured through these so-called microlensing events. Though an estimated 100 million isolated black holes roam our galaxy, finding the telltale signature of one is a needle-in-a-haystack search for Hubble astronomers.
The presence of a mass between Earth and a distant object creates these microlensing events. As such a mass passes between us and a star, its presence can be revealed through analysis of the background star’s light curve. In the 2011 data, the signals revealed a pair of celestial bodies and allowed the researchers to calculate that one was about 2,300 times heavier than the other.
The 2011 study suggested either a star about 20 percent as massive as the Sun with a planet 29 times heavier than Earth, or a nearer planet about four times the mass of Jupiter, perhaps even with a moon. To learn more about the object, the team searched through data from the Keck Observatory and the Gaia satellite. They found the star about 24,000 light years away, so still within the Milky Way. By comparing the star’s location in 2011 with its location ten years later in 2021, the team were able to calculate its speed.
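Turning a decade of apparent motion on the sky into a speed is straightforward geometry: the angular shift per year, multiplied by the distance, gives the tangential velocity. A minimal sketch, with an illustrative angular shift chosen so the numbers land near the reported figure (the actual published astrometry is not quoted here):

```python
KM_PER_LY = 9.4607e12        # kilometres in one light year
ARCSEC_PER_RAD = 206_264.8   # arcseconds per radian
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def tangential_speed_kms(shift_mas, years, distance_ly):
    """Tangential speed implied by an apparent shift on the sky.
    shift_mas: total motion in milliarcseconds over `years`."""
    mu_rad_per_yr = (shift_mas / 1000.0) / ARCSEC_PER_RAD / years
    return mu_rad_per_yr * distance_ly * KM_PER_LY / SECONDS_PER_YEAR

# A shift of ~155 milliarcseconds between 2011 and 2021, at a
# distance of 24,000 light years, works out to roughly 540 km/s.
print(f"{tangential_speed_kms(155, 10, 24_000):.0f} km/s")
```

Note this only recovers the sideways component of the motion; any velocity along the line of sight has to come from other measurements, which is one reason follow-up observations matter.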
Having calculated the speed of the star to be around 540 kilometres per second, the team are keen to secure more observations in the years ahead. If it is around the 600 kilometres per second mark then it’s likely to escape the gravity of the Milky Way and enter intergalactic space millions of years in the future.
Source : NASA Scientists Spot Candidate for Speediest Exoplanet System
The post A Hyper Velocity Star Found with an Exoplanet Hanging on for Dear Life appeared first on Universe Today.
Finding alien life may have just got easier! If life does exist on other worlds in our Solar System then it’s likely to be tiny, primitive bacteria. It’s not so easy to send microscopes to other worlds, but chemistry may have just come to the rescue. Scientists have developed a test that detects microbial movement triggered by an amino acid known as L-serine. In lab testing, three different types of microbes all moved towards this chemical, a response that could be a strong indicator of life.
The search for primitive alien life focuses on finding simple organisms, like microbes or bacteria, that can survive in extreme environments. Scientists target places like Mars or moons of the outer planets like Europa (Jupiter) and Enceladus (Saturn), where liquid water and energy sources might exist. By studying extremophiles on Earth—organisms that thrive in harsh conditions—researchers can gain clues about where and how to look for extraterrestrial life. Advanced technologies, including chemical sensors and microscopic imaging, are being developed to detect signs of life on future space missions.
Europa captured by Juno

One of the great challenges is knowing exactly what to look for. One aspect of life, be it primitive or advanced, is the ability to move independently. The process whereby a chemical causes an organism to move in response is known as chemotaxis, and it is this that a team of researchers in Germany are interested in. They have developed a new method for inducing chemotactic movement in some of the most basic forms of life here on Earth. The team published their results in Frontiers in Astronomy and Space Sciences.
The team undertook experiments with three different types of microbe: two bacteria and one archaeon, a type of single-celled microorganism. Each is capable of surviving in the kinds of extreme environments that might be found in space. One of the microbes, Bacillus subtilis, is known to survive temperatures up to 100°C, while others can survive down to -2.5°C. Each of the microbes responded, moving toward the chemical L-serine. This positive response gives scientists valuable insight into how to search for living organisms on other worlds in our Solar System.
Image of a tardigrade, a microscopic species and one of the most well-known extremophiles, having been observed to survive some of the most extreme environments, including outer space. (Credit: Katexic Publications, unaltered, CC2.0)

The scientists used a microscope slide containing two chambers separated by a thin membrane. The sample microbes were placed on one side, with L-serine on the other. The concept is simple: if the microbes are alive, they will move toward the chemical. On a future space mission, however, the technique may need some refinement; chiefly, it would need to work without human interaction.
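The logic of the two-chamber assay can be illustrated with a toy model: living, chemotactic microbes behave like random walkers with a slight bias toward the attractant, while dead or unresponsive samples wander with no net drift. This is a simplified sketch of that idea, not the team’s actual analysis:

```python
import random

def run_assay(bias, steps=200, n_microbes=100, seed=42):
    """Toy 1-D model of the two-chamber slide: microbes start at
    position 0 and the attractant (L-serine) sits at positive x.
    `bias` is the extra probability of stepping toward the chemical;
    bias=0 models an unresponsive sample (a pure random walk)."""
    rng = random.Random(seed)
    final_positions = []
    for _ in range(n_microbes):
        x = 0
        for _ in range(steps):
            x += 1 if rng.random() < 0.5 + bias else -1
        final_positions.append(x)
    return sum(final_positions) / len(final_positions)  # mean drift

# A chemotactic population drifts clearly toward the L-serine side,
# while the unbiased control hovers near the start line.
print(run_assay(bias=0.1), run_assay(bias=0.0))
```

The measurable signal is exactly this asymmetry: a significant net shift of the population toward the attractant side of the membrane.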
This is not the first time the chemical has been used to trigger movement in primitive life, and L-serine is thought to exist beyond the confines of Earth. Its presence beyond our home planet suggests it may also be useful in the search for alien life: if L-serine does exist on other worlds in our Solar System, it may induce movement in any microbes there and so help us to find that life.
Source : Efforts to find alien life could be boosted by simple test that gets microbes moving
The post Efforts to Detect Alien Life Advanced by Simple Microbe Mobility Test appeared first on Universe Today.