In 1971, the Soviet Mars 3 lander became the first spacecraft to land softly on Mars, though it failed after only a couple of minutes. More than 50 years later, it’s still there at Terra Sirenum. The HiRISE camera on NASA’s Mars Reconnaissance Orbiter may have imaged some of its hardware, inadvertently taking part in what could become an effort to document our Martian artifacts.
Is it time to start cataloguing and even preserving these artifacts so we can preserve our history?
Some anthropologists think so.
Justin Holcomb is an assistant research professor of anthropology at the University of Kansas. He and his colleagues argue that it’s time to take Martian archaeology seriously, and the sooner we do, the better and more thorough the results will be. Their research commentary, “The emerging archaeological record of Mars,” was recently published in Nature Astronomy.
Artifacts of the human effort to explore the planet are scattered across its surface. According to Holcomb, these artifacts and our effort to reach Mars are connected to the original human dispersal from Africa.
“Our main argument is that Homo sapiens are currently undergoing a dispersal, which first started out of Africa, reached other continents and has now begun in off-world environments,” said lead author Holcomb. “We’ve started peopling the solar system. And just like we use artifacts and features to track our movement, evolution and history on Earth, we can do that in outer space by following probes, satellites, landers and various materials left behind. There’s a material footprint to this dispersal.”
Tracks from Opportunity stretch across this vista taken by the rover on Sol 3,781 in September 2014. This is from only ten years ago, but those missions already seem historical. Credit: NASA/JPL-Caltech/Cornell Univ./Arizona State Univ.

It’s tempting to call debris from failed missions wreckage or even space junk like we do the debris that orbits Earth. But things like spent parachutes and heat shields are more than just wreckage. They’re artifacts the same way other cast-offs are artifacts. In fact, what archaeologists often do in the field is sift through trash. “Trash is a proxy for human behaviour,” said one anthropologist.
In any case, one person’s trash can be another person’s historical artifact.
Spacecraft that land on Mars have to eject equipment – like this protective shell from Perseverance, imaged by Ingenuity – on their way to the Martian surface. Spacecraft can’t reach the surface without protection. As time passes, trash and debris like this become important artifacts. Credit: NASA/JPL-Caltech

“These are the first material records of our presence, and that’s important to us,” Holcomb said. “I’ve seen a lot of scientists referring to this material as space trash, galactic litter. Our argument is that it’s not trash; it’s actually really important. It’s critical to shift that narrative towards heritage because the solution to trash is removal, but the solution to heritage is preservation. There’s a big difference.”
Fourteen missions to Mars have left their mark on the red planet in the form of artifacts. According to the authors, this is the beginning of the planet’s archaeological record. “Archaeological sites on the Red Planet include landing and crash sites, which are associated with artifacts including probes, landers, rovers and a variety of debris discarded during landing, such as netting, parachutes, pieces of the aluminum wheels (for example, from the Curiosity rover), and thermal protection blankets and shielding,” they write.
This figure from the research shows fourteen missions to Mars, along with key sites and examples of artifacts. MER A and B are NASA’s Spirit and Opportunity. The basemap was generated from data derived from the Mars Orbiter Laser Altimeter (MOLA) and the High-Resolution Stereo Camera (HRSC).

Other features include rover tracks and rover drilling and sampling sites.
Curiosity captured this self-portrait at the ‘Windjana’ drilling site in 2014. The right panel shows its work. Image Credit: NASA/JPL-Caltech/MSSS

We’re already partway to taking our abandoned artifacts seriously. The United Nations keeps a list of objects launched into space called the Register of Objects Launched into Outer Space. It’s a way of identifying which countries are liable and responsible for objects in space (but not which private billionaires). The Register was first implemented in 1976, and about 88% of the crewed spacecraft, elements of the ISS, satellites, probes, and landers launched into space have been registered.
UNESCO also keeps a register of heritage sites, including archaeological and natural sites. The same could be done for Mars.
This UNESCO list of heritage sites shows both natural and cultural heritage sites, including ones that are considered to be in danger. Image Credit: UNESCO

There’s already one attempt to start documenting and mapping sites on Mars. The Perseverance Rover team is documenting all of the debris they encounter to make sure it can’t contaminate sampling sites. There are also concerns that debris could pose a hazard to future missions.
According to one researcher, there is over 7,000 kg (about 16,000 pounds) of debris on Mars, not including working spacecraft. While much of it is just scraps being blown around by the wind and broken into smaller pieces, there are also larger pieces of debris and nine intact yet inoperative spacecraft.
So far, there have been only piecemeal attempts to document these Martian artifacts.
“Despite efforts from the USA’s Perseverance team, there exists no systematic strategy for documenting, mapping and keeping track of all heritage on Mars,” the authors write. “We anticipate that cultural resource management will become a key objective during planetary exploration, including systematic surveying, mapping, documentation, and, if necessary, excavation and curation, especially as we expand our material footprint across the Solar System.”
Holcomb and his co-authors say we must understand that our spacecraft debris is the archaeological record of our attempt to explore not just Mars but the entire Solar System. Our effort to understand Mars is also part of our effort to understand our own planet and how humanity arose. “Any future accidental destruction of this record would be permanent,” they point out.
The authors say there’s a crucial need to preserve things like Neil Armstrong’s first footsteps on the Moon, the first impact on the lunar surface by the USSR’s Luna 2, and even the USSR’s Venera 7 mission, the first spacecraft to land on another planet. This is our shared heritage as human beings.
A bootprint in the lunar regolith, taken during Apollo 11 in 1969. Credit: NASA.

“These examples are extraordinary firsts for humankind,” Holcomb and his co-authors write. “As we move forward during the next era of human exploration, we hope that planetary scientists, archaeologists and geologists can work together to ensure sustainable and ethical human colonization that protects cultural resources in tandem with future space exploration.”
There are many historical examples of humans getting this type of thing wrong, particularly during European colonization of other parts of the world. Since we’re still at (we hope) the beginning of our exploration of the Solar System, we have an opportunity to get it right from the start. It will take a lot of work and many discussions to determine what this preservation and future exploration can look like.
“Those discussions could begin by considering and acknowledging the emerging archaeological record on Mars,” the authors conclude.
The post Archaeology On Mars: Preserving Artifacts of Our Expansion Into the Solar System appeared first on Universe Today.
In 2019, astronomers observed an unusual gravitational chirp. Known as GW190521, it was the last scream of gravitational waves as a black hole of 66 solar masses merged with a black hole of 85 solar masses to become a 142 solar mass black hole. The data were consistent with all the other black hole mergers we’ve observed. There was just one problem: an 85 solar mass black hole shouldn’t exist.
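A quick back-of-the-envelope check shows where those numbers lead. Using the rounded masses quoted above (so this is only an illustration, not the published analysis), the mass that goes missing in the merger is the energy radiated away as gravitational waves:

```python
# Rough mass budget for GW190521, using the rounded masses quoted in the text.
# The "missing" mass was radiated away as gravitational-wave energy (E = m c^2).
# Illustration only; the published analysis works with full posterior distributions.

M_SUN_KG = 1.989e30   # one solar mass in kilograms
C = 2.998e8           # speed of light in m/s

m1, m2, m_final = 85.0, 66.0, 142.0        # solar masses
m_radiated = (m1 + m2) - m_final           # ~9 solar masses
energy_joules = m_radiated * M_SUN_KG * C**2

print(f"Mass radiated as gravitational waves: ~{m_radiated:.0f} solar masses")
print(f"Equivalent energy: ~{energy_joules:.1e} J")
```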
All the black hole mergers we’ve observed involve stellar mass black holes. These form when a massive star explodes as a supernova and its core collapses to become a black hole. An old star needs to be at least ten times the mass of the Sun to become a supernova, which can create a black hole of about 3 solar masses. Larger stars can create larger black holes, up to a point.
The first generation of stars in the cosmos were likely hundreds of solar masses. For a star above 150 solar masses or so, the core becomes so hot that it undergoes what is known as pair instability: the gamma rays produced in the core are energetic enough to convert into electron-positron pairs. The resulting runaway explosion rips the star apart before gravity can collapse its core into a black hole. To overcome the pair instability, a progenitor star would need a mass of 300 Suns or more. This means that the mass range of stellar black holes has a “pair-instability gap.” Black holes from 3 solar masses to about 65 solar masses can form from regular supernovae, and black holes above 130 solar masses could form from stellar collapse, but black holes between 65 and 130 solar masses shouldn’t exist.
For GW190521, the 66 solar mass black hole is close enough to the limit that it likely formed from a single star. The 85 solar mass black hole, on the other hand, is smack-dab in the middle of the forbidden range. Some astronomers have argued that the larger black hole might have formed from a hypothetical boson star known as a Proca star, but if that’s true, then GW190521 is the only evidence that Proca stars exist. More likely, the 85 solar mass black hole formed from the merger of two smaller black holes, making GW190521 a hierarchical, or staged, merger. The difficulty with that idea is that black hole mergers are often asymmetrical, such that the resulting black hole receives a recoil kick that can eject it from its region of origin. Multiple black hole mergers would only occur under certain circumstances, which is where a new study in The Astrophysical Journal comes in.
The authors looked at how the mass, spin, and motion of a merging black hole pair determine the mass, spin, and recoil velocity of the resulting black hole. By creating a statistical distribution of outcomes, the team could then work backwards. Given the mass, spin, and velocity of a “forbidden” black hole relative to its environment, what were the properties of its black hole ancestors? When the authors applied this to the progenitors of GW190521, they found that the only possible ancestors would have given a relatively large recoil velocity. This means that the merger must have occurred within the region of an active galactic nucleus, where the gravitational well would be strong enough to hold the system together.
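The “work backwards” step can be sketched in a few lines of code. The toy example below draws candidate ancestor binaries, pushes them through a remnant model, and keeps only the pairs consistent with the observed black hole. The remnant formula here is a crude placeholder for illustration, not the numerical-relativity calibrated fits for mass, spin, and kick that the study actually uses.

```python
# Toy version of backwards inference for a "forbidden" black hole's ancestors.
# Draw candidate progenitor pairs, apply a remnant model, and keep the pairs
# that reproduce the observed mass. The remnant model below is a placeholder,
# NOT the numerical-relativity fits used in the actual study.
import random

def remnant_mass(m1, m2):
    # Placeholder assumption: a merger radiates roughly 5% of the total mass.
    return 0.95 * (m1 + m2)

target_mass, tolerance = 85.0, 2.0   # the GW190521 progenitor to be explained
ancestors = []
for _ in range(100_000):
    m1 = random.uniform(20, 65)      # draw masses from the "allowed" range
    m2 = random.uniform(20, 65)
    if abs(remnant_mass(m1, m2) - target_mass) < tolerance:
        ancestors.append((m1, m2))

print(f"{len(ancestors)} candidate ancestor pairs out of 100,000 draws")
```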
This work has implications for what are known as intermediate mass black holes (IMBHs), which can have masses of hundreds or thousands of Suns. It has been thought that IMBHs form within globular clusters, but if the recoil velocities of black hole mergers are large, this would be unlikely. As this study shows, GW190521 could not have occurred in a globular cluster.
Reference: Araújo-Álvarez, Carlos, et al. “Kicking Time Back in Black Hole Mergers: Ancestral Masses, Spins, Birth Recoils, and Hierarchical-formation Viability of GW190521.” The Astrophysical Journal 977.2 (2024): 220.
The post Building the Black Hole Family Tree appeared first on Universe Today.
Telling time in space is difficult, but it is absolutely critical for applications ranging from testing relativity to navigating down the road. Atomic clocks, such as those used on the Global Navigation Satellite System network, are accurate, but only up to a point. Moving to even more precise navigation tools would require even more accurate clocks. There are several solutions at various stages of technical development, and one from Germany’s DLR, COMPASSO, plans to prove quantum optical clocks in space as a potential successor.
There are several problems with existing atomic clocks – one has to do with their accuracy, and one has to do with their size, weight, and power (SWaP) requirements. Current atomic clocks used in the GNSS are relatively compact, coming in at around 0.5 kg and 125 x 100 x 40 mm, but they lack accuracy. In the terminology of precision timekeeping, they have a “stability” of about 10⁻⁹ over 10,000 seconds. That sounds absurdly accurate, but it is not good enough for a more precise GNSS.

Alternatives, such as optical lattice clocks, are far more accurate, reaching stabilities of around 10⁻¹⁸ over 10,000 seconds. However, they can measure 0.5 x 0.5 x 0.5 m and weigh hundreds of kilograms. Given satellite space and weight constraints, they are far too large to be adopted as a basis for satellite timekeeping.
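To get a feel for what those stability numbers mean, a clock’s fractional stability roughly sets how much timing error it accumulates over an averaging interval, and every nanosecond of timing error corresponds to about 30 cm of ranging error at the speed of light. The sketch below is an order-of-magnitude illustration only; real GNSS clocks are corrected from the ground far more often than every 10,000 seconds.

```python
# Order-of-magnitude link between clock stability, timing drift, and range error.
# Real GNSS clocks are regularly corrected from the ground, so these numbers
# illustrate scale rather than actual positioning errors.

C = 2.998e8  # speed of light in m/s

def drift_and_range_error(stability, interval_s):
    """Timing error accumulated over an interval and the equivalent range error."""
    timing_error_s = stability * interval_s
    return timing_error_s, timing_error_s * C

for label, stability in [("current GNSS atomic clock", 1e-9),
                         ("optical lattice clock", 1e-18)]:
    dt, dr = drift_and_range_error(stability, 10_000)
    print(f"{label}: {dt:.1e} s drift over 10,000 s -> ~{dr:.1e} m range error")
```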
Rendering of a passive hydrogen maser atomic clock.

To find a middle ground, ESA has developed a technology development roadmap focused on improving clock stability while keeping the clock small enough to fit on a satellite. One example of a technology on the roadmap is a cesium-based clock cooled by lasers and combined with a hydrogen maser, the microwave analogue of a laser. NASA is not missing out on the fun either, with its work on a mercury-ion clock that has already been tested in orbit for a year.
COMPASSO hopes to surpass them all. Three key technologies enable the mission: two iodine frequency references, a “frequency comb,” and a “laser communication and ranging terminal.” Ideally, the mission will be launched to the ISS, where it will sit in space for two years, constantly keeping time. The accuracy of those measurements will be compared to alternatives over that time frame.
Lasers are the key to the whole system. The iodine frequency references exploit the very distinct absorption lines of molecular iodine, which serve as a frequency reference for the frequency comb, a specialized laser whose output spectrum looks like it has comb teeth at specific frequencies. Those frequencies can be tuned to the frequency of the iodine reference, allowing for the correction of any drift in the comb.
engineerguy explains how atomic clocks work with the GNSS.

The comb then provides a method for phase locking a microwave oscillator, a key part of a standard atomic clock. Overall, this means that the stability of the iodine frequency reference is transferred to the frequency comb, which is in turn transferred to the microwave oscillator and, therefore, the atomic clock. In COMPASSO’s case, the laser communication terminal is used to transmit frequency and timing information back to a ground station while the mission is active.
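The chain from optical reference to microwave output can be written compactly. Each comb tooth sits at f_n = f_ceo + n × f_rep, so locking one tooth to the iodine line (with the offset frequency f_ceo also stabilized) pins down the repetition rate f_rep, which is a microwave frequency. The numbers below are illustrative round figures, not COMPASSO’s actual design values:

```python
# Minimal sketch of how an optical reference disciplines a microwave frequency
# through a frequency comb. Comb teeth sit at f_n = f_ceo + n * f_rep, so locking
# tooth n to a known optical line fixes f_rep. Illustrative values only, not the
# COMPASSO design parameters.

f_iodine = 563.2e12   # Hz, roughly the iodine transition used by 532 nm references
f_ceo    = 20e6       # Hz, carrier-envelope offset, assumed measured and locked
n        = 2_252_800  # comb tooth index chosen so f_rep lands near 250 MHz

f_rep = (f_iodine - f_ceo) / n
print(f"Repetition rate disciplined by the optical reference: {f_rep / 1e6:.3f} MHz")
```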
COMPASSO began in 2021, and a paper describing its details and some breadboard prototypes was released this year. It will hop on a ride to the ISS in 2025 to start its mission to make the world a more accurately timed place – and maybe improve our navigation abilities as well.
Learn More:
Kuschewski et al – COMPASSO mission and its iodine clock: outline of the clock design
UT – Atomic Clocks Separated by Just a few Centimetres Measure Different Rates of Time. Just as Einstein Predicted
UT – Deep Space Atomic Clocks Will Help Spacecraft Answer, with Incredible Precision, if They’re There Yet
UT – A New Atomic Clock has been Built that Would be off by Less than a Second Since the Big Bang
Lead Image:
Benchtop prototype of part of the COMPASSO system.
Credit – Kuschewski et al
The post Need to Accurately Measure Time in Space? Use a COMPASSO appeared first on Universe Today.
Binary stars are common throughout the galaxy. Roughly half the stars in the Milky Way are part of a binary or multiple system, so we would expect to find them almost everywhere. However, one place we wouldn’t expect to find a binary is at the center of the galaxy, close to the supermassive black hole Sagittarius A*. And yet, that is precisely where astronomers have recently found one.
There are several stars near Sagittarius A*. For decades, we have watched as they orbit the great gravitational well. The motion of those stars was the first strong evidence that Sag A* was indeed a black hole. At least one star orbits so closely that we can see it redshift as it reaches peribothron.
But we also know that stars should be ever wary of straying too close to the black hole. The closer a star gets to the event horizon of a black hole, the stronger the tidal forces on the star become. There is a point where the tidal forces are so strong a star is ripped apart. We have observed several of these tidal disruption events (TDEs), so we know the threat is very real.
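The tidal threat can be put in rough numbers. A star is torn apart once it strays inside the tidal radius, roughly r_t ≈ R_star × (M_BH / M_star)^(1/3). Here is a back-of-the-envelope estimate for a Sun-like star near Sagittarius A*, using approximate values purely for illustration:

```python
# Back-of-the-envelope tidal disruption radius for a Sun-like star near Sgr A*.
# r_t ~ R_star * (M_bh / M_star)^(1/3). Approximate values, illustration only.

R_SUN_KM = 6.96e5
AU_KM = 1.496e8

M_bh_solar = 4.3e6      # approximate mass of Sagittarius A* in solar masses
M_star_solar = 1.0
R_star_km = R_SUN_KM

r_tidal_km = R_star_km * (M_bh_solar / M_star_solar) ** (1.0 / 3.0)
r_horizon_km = 2.95 * M_bh_solar   # Schwarzschild radius is ~2.95 km per solar mass

print(f"Tidal disruption radius: ~{r_tidal_km / AU_KM:.2f} AU")
print(f"Event horizon radius:    ~{r_horizon_km / AU_KM:.3f} AU")
```

Because the tidal radius lies well outside the event horizon, the disruption happens in full view, which is why these flares can be observed at all.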
Tidal forces also pose a threat to binary stars. It wouldn’t take much for the tidal pull of a black hole to disrupt binary orbits, causing the stars to separate forever. Tidal forces would also tend to disrupt the formation of binary stars in favor of larger single stars. Therefore astronomers assumed the formation of binary stars near Sagittarius A* wasn’t likely, and even if a binary formed, it wouldn’t last long on cosmic timescales. So astronomers were surprised when they found the binary system known as D9.
Distance and age of D9 in the context of basic dynamical processes and stellar populations in the Galactic center. Credit: Peißker et al.

The D9 system is young, only about 3 million years old. It consists of one star of about 3 solar masses and another with a mass about 75% that of the Sun. The orbit of the system puts it within 6,000 AU of Sag A* at its closest approach, which is surprisingly close. Simulations of the D9 system estimate that in about a million years, the black hole’s gravitational influence will cause the two stars to merge into a single star. But even this short lifetime is unexpected, and it shows that the region near a supermassive black hole is much less destructive than we thought.
It’s also pretty amazing that the system was discovered at all. The center of our galaxy is shrouded in gas and dust, meaning that we can’t observe the area in the visible spectrum. We can only see stars in the region with radio and infrared light. The binary stars are too close together for us to identify them individually, so the team used data from the Enhanced Resolution Imager and Spectrograph (ERIS) on the ESO’s Very Large Telescope, as well as archive data from the Spectrograph for INtegral Field Observations in the Near Infrared (SINFONI). This gave the team data covering a 15-year timespan, which was enough to watch the light of D9 redshift and blueshift as the stars orbit each other every 372 days.
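Those orbital numbers also give a feel for how tight the binary is. Plugging the article’s approximate masses and the 372-day period into Kepler’s third law (a rough sketch with rounded values, not the fitted orbit from the paper) gives a separation of well under 2 AU:

```python
# Rough size of the D9 binary's orbit from Kepler's third law: a^3 = M_total * P^2
# (a in AU, P in years, M in solar masses). Rounded values from the article,
# not the fitted orbital solution from the paper.

M_total = 3.0 + 0.75        # approximate component masses in solar masses
P_years = 372 / 365.25      # orbital period in years

a_au = (M_total * P_years**2) ** (1.0 / 3.0)
print(f"Approximate binary separation: ~{a_au:.1f} AU")
```

That puts the two stars thousands of times closer to each other than their roughly 6,000 AU closest approach to the black hole, which helps explain how the pair can hold together at all.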
Now that we know the binary system D9 exists, astronomers can look for other binary stars. This could help us solve the mystery of how such systems can form so close to the gravitational beast at the heart of our galaxy.
Reference: Peißker, Florian, et al. “A binary system in the S cluster close to the supermassive black hole Sagittarius A.” Nature Communications 15.1 (2024): 10608.
The post A Binary Star Found Surprisingly Close to the Milky Way's Supermassive Black Hole appeared first on Universe Today.
Jupiter’s moon Io is the most volcanically active body in the Solar System, with roughly 400 active volcanoes regularly ejecting magma into space. This activity arises from Io’s eccentric orbit around Jupiter, which produces incredibly powerful tidal interactions in the interior. In addition to powering Io’s volcanism, this tidal energy is believed to support a global subsurface magma ocean. However, the extent and depth of this ocean remain the subject of debate, with some supporting the idea of a shallow magma ocean while others believe Io has a more rigid, mostly solid interior.
In a recent NASA-supported study, an international team of researchers combined data from multiple missions to measure Io’s tidal deformation. According to their findings, Io does not possess a magma ocean and likely has a mostly solid mantle. Their findings further suggest that tidal forces do not necessarily lead to global magma oceans on moons or planetary bodies. This could have implications for the study of exoplanets that experience tidal heating, including Super-Earths and exomoons similar to Io that orbit massive gas giants.
The study was led by Ryan Park, a Senior Research Scientist and Principal Engineer at NASA’s Jet Propulsion Laboratory (JPL). He was joined by colleagues from NASA JPL, the Centro Interdipartimentale di Ricerca Industriale Aerospaziale (CIRI) at the Università di Bologna, the National Institute for Astrophysics (INAF), the Sapienza Università di Roma, the Southwest Research Institute (SwRI), NASA’s Goddard Space Flight Center, and multiple universities. Their findings were described in a paper that appeared in the journal Nature.
An amazingly active Io, Jupiter’s “pizza moon,” shows multiple volcanoes and hot spots, as seen with Juno’s infrared camera. Credit: NASA/JPL-Caltech/SwRI/ASI/INAF/JIRAM/Roman Tkachenko

As they explain in their paper, two types of analysis have predicted the existence of a global magma ocean. On the one hand, magnetic induction measurements conducted by the Galileo mission suggested the existence of a magma ocean within Io, approximately 50 km [~30 mi] thick and located near the surface. These results also implied that about 20% of the material in Io’s mantle is melted. However, these results were debated for many years. In recent years, NASA’s Juno mission conducted multiple flybys of Io and the other Jovian moons and obtained data that supported this conclusion.
In particular, the Juno probe conducted a global mapping campaign of Io’s volcanoes, which suggested that the distribution of volcanic heat flow is consistent with the presence of a global magma ocean. However, these discoveries have led to considerable debate about whether such techniques can really determine if a shallow global magma ocean drives Io’s volcanic activity. This is the question Park and his colleagues sought to address in their study:
“In our study, Io’s tidal deformation is modeled using the gravitational tidal Love number k2, which is defined as the ratio of the imposed gravitational potential from Jupiter to the induced potential from the deformation of Io. In short, if k2 is large, there is a global magma ocean, and if k2 is small, there is no global magma ocean. Our result shows that the recovered value of k2 is small, consistent with Io not having a global magma ocean.”
The significance of these findings goes far beyond the study of Io and other potentially volcanic moons. Beyond the Solar System, astronomers have discovered countless bodies that (according to current planetary models) experience intense tidal heating. This includes rocky exoplanets that are several times the size and mass of Earth (Super-Earths) and tidally locked planets like those of the TRAPPIST-1 system. These findings are also relevant for the study of exomoons that experience intense tidal heating (similar to the Jovian moons). As Park explained:
“Although it is commonly assumed among the exoplanet community that intense tidal heating may lead to magma oceans, the example of Io shows that this need not be the case. Our results indicate that tidal forces do not universally create global magma oceans, which may be prevented from forming due to rapid melt ascent, intrusion, and eruption, so even strong tidal heating – like that expected on several known exoplanets and super-Earths – may not guarantee the formation of magma oceans on moons or planetary bodies.”
Further Reading: Nature
The post New Research Suggests Io Doesn’t Have a Shallow Ocean of Magma appeared first on Universe Today.
The star HD 65907 is not what it appears to be. It’s a star that looks young, but on closer inspection is actually much, much older. What’s going on? Research suggests that it is a resurrected star.
Astronomers employ different methods to measure a star’s age. One is based on its brightness and temperature. All stars follow a particular path in life, known as the main sequence. The moment they begin fusing hydrogen in their cores, they maintain a strict relationship between their brightness and temperature. By measuring these two properties, astronomers can roughly pin down the age of a star. But there are other techniques, like measuring the amount of heavy elements in a stellar atmosphere. Older stars tend to have fewer of these elements, because they were born at a time before the galaxy had become enriched with them.
Going by its temperature and brightness, HD 65907 is relatively young, with an age of right around 5 billion years. And yet it contains very few heavy elements. Plus, its path through the galaxy isn’t in line with that of other young stars, which tend to serenely orbit around the center. HD 65907’s motion is much more erratic, suggesting that it only recently moved here from somewhere else.
In a recent paper, an international team of astronomers dug into archival data to see if they could resolve the mystery, and they believe that HD 65907 is a kind of star known as a blue straggler, and that it has its strange combination of properties because of a violent event in its past, causing it to be resurrected.
If two low-mass stars collide, the merged remnant can sometimes survive as a star in its own right. At first that newly merged star will be both massive and large, with its outer surface flung far from the core due to the enormous rotation imparted by the collision. But eventually some astrophysical process (perhaps strong magnetic fields are to blame) drags down the rotation rate of the star, causing it to slow down and settle into equilibrium. In this new state the star will appear massive and incredibly hot: a blue straggler.
However they form, blue straggler stars get a second chance at life. Those mergers transform small stars into big stars, and those new stars are just beginning their hydrogen-burning main sequence lives.
The astronomers believe this is the case for HD 65907. What makes this star especially unusual is that it’s not a member of a cluster, where frequent mergers can easily lead to blue stragglers. Instead, it’s a field star, wandering the galaxy on its own. It must have cannibalized a companion around five billion years ago, leading to its apparently youthful age.
Work like this is essential to untangling the complicated lives of stars in the Milky Way, and it shows how the strangest stars hold the keys to unlocking the evolution of elements that lead to systems like our own.
The post The Mysterious Case of the Resurrected Star appeared first on Universe Today.
It’s axiomatic that the Universe is expanding. However, the rate of expansion hasn’t remained the same. It appears that the Universe is expanding more quickly now than it did in the past.
Astronomers have struggled to understand this and have wondered whether the discrepancy between measurements and theory is due to instrument errors. The JWST has now put that question to rest.
American astronomer Edwin Hubble is widely credited with discovering the expansion of the Universe. But the idea actually stemmed from the equations of relativity and was pioneered by Russian scientist Alexander Friedmann. Hubble’s Law bears Edwin’s name, though, and he was the one who confirmed the expansion and put a more precise value on its rate, now called the Hubble constant. It measures how rapidly galaxies that aren’t gravitationally bound are moving away from one another. The movement of objects due solely to this expansion is called the Hubble flow.
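In its simplest form, Hubble’s law says that recession velocity grows linearly with distance, v = H0 × d. A quick illustration using a representative value of H0 (the precise value is, of course, exactly what is under dispute):

```python
# Hubble's law: recession velocity grows linearly with distance, v = H0 * d,
# for galaxies carried along by the Hubble flow (peculiar motions ignored).
# H0 here is a representative value, not a measured result.

H0 = 70.0  # km/s per megaparsec

for d_mpc in (10, 40, 100):
    v_km_s = H0 * d_mpc
    print(f"A galaxy {d_mpc:>3} Mpc away recedes at roughly {v_km_s:,.0f} km/s")
```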
Measuring the Hubble constant means measuring distances to far-flung objects. Astronomers use the cosmic distance ladder (CDL) to do that. However, the ladder has a problem.
This illustration shows the three basic steps astronomers use to calculate how fast the universe expands over time, a value called the Hubble constant. All the steps involve building a strong “cosmic distance ladder” by starting with measuring accurate distances to nearby galaxies and then moving to galaxies farther and farther away. Image Credit: NASA, ESA and A. Feild (STScI)

The first rungs on the CDL are fundamental measurements that can be observed directly. Parallax measurement is the most important fundamental measurement. But the method breaks down at great distances.
Beyond that, astronomers use standard candles, things with known intrinsic brightness, like supernovae and Cepheid variables. Those objects and their relationships help astronomers measure distances to other galaxies. This has been tricky to measure, though advancing technology has made progress.
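Both of those rungs boil down to simple relations: for parallax, the distance in parsecs is just the reciprocal of the parallax angle in arcseconds, and for a standard candle, the inverse-square law turns a known luminosity and a measured flux into a distance. A toy illustration with made-up but plausible numbers:

```python
# The first two rungs of the distance ladder in their simplest form.
import math

# Rung 1: parallax. Distance in parsecs = 1 / parallax angle in arcseconds.
parallax_arcsec = 0.01                     # a star with a 10 milliarcsecond parallax
d_parallax_pc = 1.0 / parallax_arcsec      # 100 pc

# Rung 2: a standard candle. Known luminosity plus measured flux gives distance
# via the inverse-square law, F = L / (4 * pi * d^2). Values here are illustrative.
L_watts = 3.8e28                           # roughly 100,000 solar luminosities
F_observed = 1.0e-12                       # measured flux in W/m^2
d_candle_m = math.sqrt(L_watts / (4 * math.pi * F_observed))
d_candle_pc = d_candle_m / 3.086e16        # metres per parsec

print(f"Parallax rung:        {d_parallax_pc:.0f} pc")
print(f"Standard-candle rung: {d_candle_pc:,.0f} pc")
```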
Another pair of problems plagues the effort, though. The first is that different telescopes and methods produce different distance measurements. The second is that our measurements of distances and expansion don’t match up with the Standard Model of Cosmology, also known as the Lambda Cold Dark Matter (LCDM) model. That discrepancy is called the Hubble tension.
The question is, can the mismatch between the measurements and the LCDM be explained by instrument differences? That possibility has to be eliminated, and the trick is to take one large set of distance measurements from one telescope and compare them to another.
New research in The Astrophysical Journal tackles the problem by comparing Hubble Space Telescope measurements with JWST measurements. It’s titled “JWST Validates HST Distance Measurements: Selection of Supernova Subsample Explains Differences in JWST Estimates of Local H0.” The lead author is Adam Riess, a Bloomberg Distinguished Professor and Thomas J. Barber Professor of Physics and Astronomy at Johns Hopkins University. Riess is also a Nobel laureate, winning the 2011 Nobel Prize in Physics “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae,” according to the Nobel Institute.
As of 2022, the Hubble Space Telescope had gathered the most numerous sample of homogeneously measured standard candles, covering distances out to about 40 Mpc, or roughly 130 million light-years. “As of 2022, the largest collection of homogeneously measured SNe Ia is complete to D less than or equal to 40 Mpc or redshift z less than or equal to 0.01,” the authors of the research write. “It consists of 42 SNe Ia in 37 host galaxies calibrated with observations of Cepheids with the Hubble Space Telescope (HST), the heritage of more than 1000 orbits (a comparable number of hours) invested over the last ~20 yrs.”
In this research, the astronomers used the powerful JWST to cross-check the Hubble’s work. “We cross-check the Hubble Space Telescope (HST) Cepheid/Type Ia supernova (SN Ia) distance ladder, which yields the most precise local H0 (Hubble flow), against early James Webb Space Telescope (JWST) subsamples (~1/4 of the HST sample) from SH0ES and CCHP, calibrated only with NGC 4258,” the authors write. SH0ES and CCHP are different observing efforts aimed at measuring the Hubble constant. SH0ES stands for Supernova H0 for the Equation of State of Dark Energy, and CCHP stands for Chicago-Carnegie Hubble Program, which uses the JWST to measure the Hubble constant.
“JWST has certain distinct advantages (and some disadvantages) compared to HST for measuring distances to nearby galaxies,” Riess and his co-authors write. It offers a 2.5 times higher near-infrared resolution than the HST. Despite some of its disadvantages, the JWST “is able to provide a strong cross-check of distances in the first two rungs,” the authors explain.
Observations from both telescopes are closely aligned, which effectively rules out instrument error as the cause of the discrepancy between observations and the Lambda CDM model.
There’s a lot to digest in this figure from the research. It shows “Comparisons of H0 between HST Cepheids and other measures (JWST Cepheids, JWST JAGB, and JWST NIR-TRGB) for SN Ia host subsamples selected by different teams and for the different methods,” the authors explain. JAGB stands for J-region Asymptotic Giant Branch, and TRGB stands for Tip of the Red Giant Branch. Both JAGB and TRGB are ways of measuring distance to specific types of stars. Basically, coloured circles represent Hubble measurements, and squares represent JWST measurements. “The HST Cepheid and JWST distance measurements themselves are in good agreement,” the authors write. Image Credit: Riess et al. 2024.

“While it will still take multiple years for the JWST sample of SN hosts to be as large as the HST sample, we show that the current JWST measurements have already ruled out systematic biases from the first rungs of the distance ladder at a much smaller level than the Hubble tension,” the authors write.
This research covered about one-third of the Hubble’s data set, with the known distance to a galaxy called NGC 4258 serving as a reference point. Even though the data set was small, Riess and his co-researchers achieved impressively precise results. They showed that the measurement differences were less than 2%. That’s much less than the 8% to 9% in the Hubble tension discrepancy.
NGC 4258 is significant in the cosmic distance ladder because it contains Cepheid variables similar in metallicity to both the Milky Way’s Cepheids and those of other galaxies. Astronomers use it to calibrate distances to Cepheids with different metallicities. A new composite of NGC 4258 features X-rays from Chandra (blue), radio waves from the VLA (purple), optical data from Hubble (yellow and blue), and infrared from Spitzer (red). Image Credit: Chandra

That means that our Lambda CDM model is missing something. The standard model yields an expansion rate of about 67 to 68 kilometres per second per megaparsec. Telescope observations yield a slightly higher rate: between 70 and 76 kilometres per second per megaparsec. This work shows that the discrepancy can’t be due to the different telescopes and methods.
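The size of the disagreement is easy to check from those figures. Taking representative midpoints (chosen here just to make the percentage concrete):

```python
# How big is the Hubble tension? Representative midpoints of the values quoted
# above, chosen only to make the percentage concrete.
h0_model = 67.5     # km/s/Mpc, roughly the Lambda CDM prediction
h0_measured = 73.0  # km/s/Mpc, roughly the middle of the observed range

tension_pct = (h0_measured - h0_model) / h0_model * 100
print(f"Discrepancy: ~{tension_pct:.0f}%")  # ~8%, versus the ~2% cross-check difference
```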
“The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete. With two NASA flagship telescopes now confirming each other’s findings, we must take this [Hubble tension] problem very seriously—it’s a challenge but also an incredible opportunity to learn more about our universe,” said lead author Riess.
What could be missing from the Lambda CDM model?
Marc Kamionkowski is a Johns Hopkins cosmologist who helped calculate the Hubble constant and recently developed a possible new explanation for the tension. Though not part of this research, he commented on it in a press release.
“One possible explanation for the Hubble tension would be if there was something missing in our understanding of the early universe, such as a new component of matter—early dark energy—that gave the universe an unexpected kick after the big bang,” said Kamionkowski. “And there are other ideas, like funny dark matter properties, exotic particles, changing electron mass, or primordial magnetic fields that may do the trick. Theorists have license to get pretty creative.”
The door is open; theorists just have to walk in.
The post The JWST Looked Over the Hubble’s Shoulder and Confirmed that the Universe is Expanding Faster appeared first on Universe Today.