Putting humans on Mars has been one of NASA’s driving ambitions for years, but the agency is still in the early stages of deciding what that mission architecture will look like. One major factor is where to get the propellant to send the astronauts back to Earth. Advocates of space exploration often suggest harvesting it from Mars itself – local materials can be used to create liquid oxygen and methane, two commonly used propellants. To support this effort, a group from NASA’s COMPASS team detailed several scenarios for the infrastructure and technologies it would take to build an in-situ resource utilization (ISRU) system capable of providing enough propellant to get astronauts back to a Mars orbit, where they could meet up with an Earth return vehicle. However, such a system faces significant challenges that must be addressed before the 8-9-year process of getting it up and running can even begin.
To understand these challenges, it’s first essential to understand some of the requirements the team was trying to meet. The goal was to provide 300 tons of liquid oxygen and liquid methane to a Mars Ascent and Landing Vehicle (MALV) being developed elsewhere at NASA. That much propellant is necessary to get a crew of astronauts back into orbit, where they can be met by an orbiting Earth return vehicle.
Creating liquid oxygen and methane requires many ISRU subsystems – pumps, electrolyzers, dryers, and scrubbers – plus significant power systems to run them all. Some raw materials, such as CO2, can be pulled from the Martian atmosphere. However, the system will also require 150 tons of water, which could be trucked in from Earth or harvested from Mars.
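The article doesn’t spell out the chemistry, but the standard route for this kind of ISRU plant is water electrolysis feeding a Sabatier methanation reactor. As a rough, back-of-the-envelope sketch (in Python, with illustrative numbers rather than the COMPASS team’s actual mass budget), the stoichiometry shows why roughly 150 tons of water pairs with 300 tons of propellant:

```python
# Rough stoichiometric sketch of the standard Mars ISRU propellant route
# (water electrolysis + Sabatier methanation). The article doesn't spell out
# the exact chemistry, so treat this as a back-of-the-envelope check, not the
# COMPASS team's actual mass budget.
#
#   Electrolysis:  2 H2O       -> 2 H2 + O2
#   Sabatier:      CO2 + 4 H2  -> CH4 + 2 H2O   (the produced water is recycled)
#   Net cycle:     CO2 + 2 H2O -> CH4 + 2 O2

M = {"H2O": 18.0, "CO2": 44.0, "CH4": 16.0, "O2": 32.0}  # molar masses, g/mol

propellant_total_t = 300.0   # tons of LOX + LCH4 required by the ascent vehicle
o2_to_ch4_ratio = 4.0        # the net cycle yields O2:CH4 = 64:16 = 4:1 by mass

ch4_t = propellant_total_t / (1 + o2_to_ch4_ratio)   # ~60 t of methane
o2_t = propellant_total_t - ch4_t                    # ~240 t of oxygen

# Net water consumed: 2 mol H2O per mol CH4 (after recycling the Sabatier water)
water_t = ch4_t * (2 * M["H2O"]) / M["CH4"]
co2_t = ch4_t * M["CO2"] / M["CH4"]

print(f"CH4: {ch4_t:.0f} t, O2: {o2_t:.0f} t")
print(f"Net water needed: {water_t:.0f} t, atmospheric CO2 needed: {co2_t:.0f} t")
# -> roughly 135 t of water, in the same ballpark as the 150 t quoted above.
```

Real systems need margins for leaks, boil-off, and an engine mixture ratio that doesn’t exactly match the idealized cycle’s 4:1 oxygen-to-methane output, which is presumably where the extra tonnage comes from.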
Fraser discusses how ISRU can provide resources to use for exploration.
Designing the overall system architecture is the first step in determining the best way to produce enough propellant to get the astronauts back off of Mars. A paper from the group compares five different approaches to that problem and details three of them, each centered on a different method of getting water to use in the creation of liquid propellants on the surface of Mars.
Let’s first look at the two options for extracting water locally on Mars. One architecture uses a borehole drill to melt subsurface ice and pump it back to the surface, where it can be used for electrolysis. The other uses surface harvesting techniques, in which soil with a high frozen-water content is sorted and the water melted out to build sufficient stockpiles for creating propellant.
Drilling a borehole deep enough to access subsurface ice has never been done before. It does have some advantages over other water collection methods, including taking less time and requiring one less MALV delivery of equipment (i.e., making it lower cost). However, it does require more power plants and some specialized equipment to be developed.
Fraser speculates on how a real Mars mission could play out.
Collecting water from surface regolith utilizes some technologies already being developed at NASA – including the RASSOR surface mining system that could be used on the Moon or Mars. However, it requires as much time and as many launches as shipping water from Earth, with many possible unknown failure points in the architecture.
By comparison, sending 150 tons of water directly from Earth, while it might be expensive in terms of launch costs, simplifies the overall architecture significantly. There would still technically be ISRU in this scenario, as the water would still be used to create propellant from local Martian resources. However, the added step of getting that water locally would be eliminated.
Even that is a more complicated process than the final two options the team considered, which get less detail in the paper than the actual ISRU setups. Mission designers could send either the methane alone, or both the methane and the oxygen, directly from Earth, bypassing the need for any ISRU at all. While these options could require more MALV landers, their overall risk is minimized, as the necessary chemicals would be available at any point the astronauts needed them. However, they would take longer to set up – especially the option of sending all of the propellant directly from Earth, which could take upwards of 10 years.
Fraser interviews Dr. Michael Hecht, an expert in ISRU on Mars.
Other challenges abound for utilizing Martian resources to create propellants – including the limited locations where the necessary water may be found. This geographical restriction might not overlap with where astronauts would want to do exciting science, so the architects would have to prioritize either scientific discovery or derisking the ISRU equipment – they likely couldn’t do both.
So, all things considered, if the purpose is to send people to Mars and back safely, it seems like the best, most reliable option is to send the total amount of propellant from Earth. However, in the long run, if humanity plans to make a sustainable presence on Mars, we will need to utilize local resources. The paper from the COMPASS team clearly defines a few strategies that could do that, and someday, it will become the better option – just maybe not quite yet.
Learn More:
Oleson et al – Kiloton Class ISRU Systems for LO2/LCH4 Propellant Production on the Mars Surface
UT – A Single Robot Could Provide a Mission To Mars With Enough Water and Oxygen
UT – Resources on Mars Could Support Human Explorers
UT – Mars Explorers are Going to Need air, and Lots of it. Here’s a Technology That Might Help Them Breath Easy
Lead Image:
Architecture Design of the water from Earth delivery option.
Credit – Oleson et al. / NASA
The post Scaling Propellant Production on Mars is Hard appeared first on Universe Today.
There are good reasons to keep an eye on the Leonid meteors this year.
It’s still one of the coolest things I ever saw. I was in the U.S. Air Force in the 90s, and November 1998 saw me deployed to the dark skies of Kuwait. That trip provided an unexpected treat, as the Leonid meteors hit dramatic storm levels on the morning of the 17th. Meteors came fast and furious towards local sunrise, often lighting up the desert floor like celestial photoflashes in the sky.
Once every 33 years or so, the ‘lion roars,’ as Leonid meteors seem to rain down from the Sickle asterism of the constellation Leo. And while the last outbreak was centered on the years around 1999, there’s some interesting discussion about possible encounters with past Leonid streams in 2024.
The Leonids in 2024
To be sure, 2024 is otherwise slated to be an off year for the shower. The normal annual maximum for 2024 is expected to occur on Sunday, November 17th at around ~4:00 Universal Time (UT), with an expected Zenithal Hourly Rate (ZHR) of 15-20 meteors per hour seen under ideal conditions. This favors Europe in the early dawn hours.
The Leonid radiant, looking east at 2AM local. Credit: Stellarium.
A Leonid Outburst in 2024?
But there are also a few other streams that may arrive earlier this week and are worth watching for. Jérémie Vaubaillon of the Paris Observatory IMCCE notes that Earth may encounter three older streams from periodic comet 55P/Tempel-Tuttle, the source of the Leonids. The comet is on a 33.8-year orbit, and a meteor shower occurs when the Earth plows headlong into the stream of dust and debris laid down by the comet.
The three suspect trails are:
-A trail laid down in 1633 (the source of the 2001 meteor storm). Earth is near this trail on November 14th at 16:37 UT, favoring northwestern North America in the early morning hours.
-A dust trail from 1733, peaking on November 19/20th at 23:53 to 00:54 UT, favoring north/central Asia.
-And finally, an encounter with a string of older (more than a millennium old) streams on November 14th at 16:37 UT (the same time as the 1633 stream). It is worth noting that the 1733 stream was the suspected source of the 1866 Leonid meteor storm.
A bright green Leonid from 2023. Credit: Frankie Lucena.
What happens this Thursday morning on the 14th could be a harbinger as to whether or not we’re in for a show. Unfortunately, the Moon is waxing gibbous and headed towards Full this week on November 15th, meaning that it will provide increasing illumination and cut down observed meteor rates.
In recent years, the Leonids have held steady at predicted rates of about 20 per hour. It’s worth noting that another encounter with the 1699 stream and a possible outburst is predicted for next year, 2025.
Leonid TEFF (Total Effective observation time) rate versus meteors over the years. Credit: the International Meteor Organization (IMO).
Meteor Shower… or Storm?
Meteor storms occur when the zenithal hourly rate tops 500 or more per hour. Keep in mind, a ZHR of a thousand or higher means that you’re seeing a meteor every few seconds. The October Draconids and the December Andromedids are also prone to great outbursts, but the Leonids are the most notorious and well-known. The 1966 shower seen over the U.S. southwest topped an amazing ZHR of up to 150,000 per hour(!).
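For context on what those numbers mean for an actual observer: the ZHR is a standardized, best-case figure, and the rate you personally count falls off under a moonlit sky (lower limiting magnitude) and when the radiant sits low. Here’s a rough sketch of the standard correction, assuming a Leonid population index of about 2.5 – treat the outputs as ballpark figures, not predictions:

```python
import math

def observed_rate(zhr, limiting_mag, radiant_alt_deg, r=2.5):
    """Rough hourly rate a single observer might actually count, given the
    standard ZHR correction factors (population index r ~ 2.5 assumed for the
    Leonids). Inverts the usual formula ZHR = HR * r**(6.5 - lm) / sin(alt)."""
    return zhr * math.sin(math.radians(radiant_alt_deg)) * r ** (limiting_mag - 6.5)

# Pristine dark sky (limiting magnitude ~6.5), radiant 40 degrees up, ordinary year:
print(f"Dark sky, ZHR 20:      ~{observed_rate(20, 6.5, 40):.0f} per hour")

# A bright waxing-gibbous Moon washing the sky down to magnitude ~5.0:
print(f"Moonlit sky, ZHR 20:   ~{observed_rate(20, 5.0, 40):.0f} per hour")

# A storm-level outburst still shines through the moonlight:
storm = observed_rate(1000, 5.0, 40)
print(f"Moonlit sky, ZHR 1000: ~{storm:.0f} per hour (one every ~{3600/storm:.0f} s)")
```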
A depiction of the 1833 outburst over Niagara Falls. Credit: Mechanic’s Magazine/Public Domain.
Observing and Imaging the Leonids
Early morning hours are best to see meteors, as you’re standing on the swath of the Earth’s surface that’s turned forward into the stream. Pinpoint meteors will occur near the shower radiant, while long streaks will stand out in stark profile about 45 to 90 degrees away on either side of the radiant. I like to aim my tripod-mounted DSLR at these regions, set the lens to the widest field of view possible, and simply let it run taking auto-exposures to see what turns up. An intervalometer is a great device to automate this process. This allows me to just sit back with a steaming hot cup of tea (a must for cold November mornings) and simply watch the show as meteors slide by.
A Leonid pierces the night sky over southern Arizona. Credit: Eliot Herman.
Perhaps we’ll simply have to wait for the 2030s to see strong activity from the Leonids again. But do you really want to risk missing a surprise show? To quote hockey player Wayne Gretzky: “You miss 100% of the shots you don’t take.” The same holds true for missing versus catching meteor storms: you just have to show up and watch.
The post Is an ‘Off-Year’ Leonid Outburst in the Cards For November? appeared first on Universe Today.
Rarely does something come along that is a real game changer in space exploration. One example is the Skylon reusable single-stage-to-orbit spaceplane. Powered by the hypersonic SABRE engine, it operates like a jet engine at low altitude and more like a conventional rocket higher up. Sadly, Reaction Engines, the company that designed the engine, has filed for bankruptcy.
Launching rockets into space is an expensive business, and that expense has often been a significant barrier to space exploration. This is largely because traditional rockets include a significant proportion of expendable elements. A typical launch into low Earth orbit, for example, can cost anything from tens to hundreds of millions of dollars due to those single-use components. Progress has, however, been made with reusable rockets like the Falcon 9 and Starship, which are refurbished and reflown for multiple launches. This has helped drive down the cost of a rocket launch, but at roughly $2,000 per kilogram, there is still much to do to reduce the cost of space exploration.
A SpaceX Falcon 9 rocket sends the European Space Agency’s Hera spacecraft into space from its Florida launch pad. (Credit: SpaceX)
The idea of a fully reusable single-stage-to-orbit (SSTO) spaceplane is one such development and was the brainchild of Reaction Engines Limited. The Skylon spaceplane was designed to take off and land like a conventional aircraft, significantly reducing launch costs. Instead of relying upon multiple expendable stages during ascent, Skylon’s Synergetic Air-Breathing Rocket Engine (SABRE) combines jet and rocket propulsion technology to reach orbit. Rather than being fuelled entirely by propellant carried aloft, it utilises atmospheric oxygen during the early part of the climb, reducing the amount of heavy oxygen it must carry and therefore drastically improving fuel efficiency. Once at sufficient altitude, the SABRE engine switches to rocket mode and only then starts to use onboard oxygen to reach final orbit.
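The reason skipping the oxidizer matters so much comes straight from the Tsiolkovsky rocket equation: propellant requirements grow exponentially with the delta-v the rocket phase must supply, and liquid oxygen makes up most of the mass of a hydrogen/oxygen propellant load. The sketch below uses purely illustrative numbers – not Skylon’s actual mass budget – to show the effect:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_mass(delta_v_m_s, isp_s, burnout_mass_kg):
    """Tsiolkovsky rocket equation: propellant needed to give `burnout_mass_kg`
    a velocity change of `delta_v_m_s` at specific impulse `isp_s`."""
    return burnout_mass_kg * (math.exp(delta_v_m_s / (isp_s * G0)) - 1)

# Purely illustrative numbers -- not Skylon's real figures:
dry_mass = 50_000.0      # kg, vehicle plus payload at burnout
isp = 450.0              # s, typical of a LOX/LH2 rocket engine
dv_to_orbit = 9_300.0    # m/s, ~orbital velocity plus gravity and drag losses

# Case 1: a pure rocket all the way up.
all_rocket = propellant_mass(dv_to_orbit, isp, dry_mass)

# Case 2: an air-breathing phase supplies the first ~2 km/s, so the rocket
# phase only has to cover the remainder. (The air-breathing leg still burns
# hydrogen, which this sketch ignores, but it carries no LOX for that leg --
# and LOX is most of a LOX/LH2 propellant load by mass.)
rocket_phase = propellant_mass(dv_to_orbit - 2_000.0, isp, dry_mass)

print(f"All-rocket propellant:        {all_rocket/1e3:.0f} t")
print(f"Rocket-phase propellant only: {rocket_phase/1e3:.0f} t")
```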
An artist’s conception of Reaction Engines’ Skylon spacecraft. Credit: Reaction Engines
Reaction Engines Limited was formed in the UK back in 1989 and focussed its attention on propulsion technology, in particular addressing access to space and hypersonic flight. The SABRE engine it developed successfully showed that a dual-mode engine could efficiently transition between high-speed flight within the atmosphere and rocket-powered flight in space. It relies upon a pre-cooler system that cools incoming air from over 1,000°C to room temperature in fractions of a second, enabling high speeds without the engine overheating.
The company is based in Oxfordshire and has, to date, secured significant investment from BAE Systems, Boeing and the European Space Agency. Unfortunately, it has been struggling to source the funding needed to continue operations and formally entered administration on 31 October 2024. An eight-week process is now underway to develop plans to restructure, sell the company or liquidate its assets. Most of its 200 employees have now been laid off.
Source : Reaction Engines Limited
The post Reaction Engines Goes Into Bankruptcy, Taking the Hypersonic SABRE Engine With it appeared first on Universe Today.
The dream of traversing the depths of space and planting the seed of human civilization on another planet has existed for generations. For as long as we’ve known that most stars in the Universe are likely to have their own system of planets, there have been those who advocated that we explore them (and even settle on them). With the dawn of the Space Age, this idea was no longer just the stuff of science fiction and became a matter of scientific study. Unfortunately, the challenges of venturing beyond Earth and reaching another star system are myriad.
When it comes down to it, there are only two ways to send crewed missions to exoplanets. The first is to develop advanced propulsion systems that can achieve relativistic speeds (a fraction of the speed of light). The second involves building spacecraft that can sustain crews for generations – aka. a Generation Ship (or Worldship). On November 1st, 2024, Project Hyperion launched a design competition for crewed interstellar travel via generation ships that would rely on current and near-future technologies. The competition is open to the public and will award a total of $10,000 (USD) for innovative concepts.
Project Hyperion is an international, interdisciplinary team composed of architects, engineers, anthropologists, and urban planners. Many of them have worked with agencies and institutes like NASA, the ESA, and the Massachusetts Institute of Technology (MIT). Their competition is sponsored by the Initiative for Interstellar Studies (i4is), a non-profit organization incorporated in the UK dedicated to research that will enable robotic and human exploration and the settlement of exoplanets around nearby stars.
Artist’s concept of a generation ship. Credit: Maciej Rebisz/Michel Lamontagne
While concepts for an interstellar spacecraft go back to the early Space Age, interest in the field has grown considerably in the past two decades. This is largely due to the recent explosion in the number of known exoplanets in our galaxy, which currently stands at 5,787 confirmed planets in 4,325 star systems. This is illustrated by concepts like Breakthrough Starshot, Swarming Proxima Centauri, and the Genesis Project. These concepts leverage gram-scale spacecraft, directed energy (lasers), and lightsails to achieve speeds of up to 20% of the speed of light, allowing them to make the journey in decades rather than centuries or millennia.
However, sending crewed spacecraft to other star systems with enough passengers to settle on another planet is far more challenging. As addressed in a previous article, a spacecraft relying on known or technically feasible propulsion methods would take between 1,000 and 81,000 years to reach even the nearest star (Proxima Centauri). While some advanced concepts like Project Orion, Daedalus, and Icarus could theoretically reach Proxima Centauri in 36 to 85 years, the costs and amount of propellant needed are prohibitive.
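The arithmetic behind those daunting numbers is simple: divide the distance to Proxima Centauri (about 4.25 light-years) by the cruise speed. A quick sketch, with illustrative speeds and ignoring acceleration and deceleration time:

```python
PROXIMA_LY = 4.25  # approximate distance to Proxima Centauri in light-years

def travel_time_years(fraction_of_c):
    """Cruise time in years, ignoring acceleration and deceleration phases."""
    return PROXIMA_LY / fraction_of_c

# Illustrative cruise speeds (rounded, not tied to any specific design):
for label, frac in [
    ("Voyager-class probe (~17 km/s)", 17.0 / 299_792.458),
    ("0.1% of light speed", 0.001),
    ("Starshot-style lightsail (20% of c)", 0.20),
    ("Speed needed for a 250-year generation-ship crossing", PROXIMA_LY / 250),
]:
    print(f"{label}: {travel_time_years(frac):,.0f} years at {frac:.2%} of c")
```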
The alternative to these “go fast” concepts is to settle in for the long ride, which may last centuries or even millennia. This necessitates a spacecraft of sufficient size to accommodate hundreds (or thousands) of human beings over multiple generations. To save room and reduce cargo mass, the crews will need to grow much of their food and rely on life support systems that are bioregenerative in nature. In short, the ship would need to be self-sustaining so the passengers could live comfortable, healthy lives until they reached their destination.
Andreas Hein, an Associate Professor of Aerospace Engineering at the University of Luxembourg and the Chief Scientist at the Interdisciplinary Centre for Security, Reliability and Trust, is part of the Hyperion Project’s Organizing Committee. As he told Universe Today via email:
“Think about the difference between a drone and an ocean liner. Previous designs for interstellar spacecraft, such as Orion, Daedalus, and Icarus, focused on uncrewed probes with the primary objective of gathering scientific data from target star systems, including searching for signs of life. In contrast, generation ships are designed to transport a crew, with the primary goal of settling an exoplanet or other celestial body in the target star system. They also tend to be much larger than interstellar probes, though they would likely use similar propulsion systems, such as fusion-based propulsion.”
Generation Ships
The first known description of a generation ship was made by rocketry engineer Robert H. Goddard, one of the “forefathers of modern rocketry,” for whom NASA’s Goddard Space Flight Center is named. In his 1918 essay, “The Ultimate Migration,” he described an “interstellar ark” leaving the Solar System in the distant future after the Sun reached the end of its life cycle. The passengers would be cryogenically frozen or kept in a state of induced torpor for much of the journey, except for the pilot, who would be awakened periodically to steer the ship.
Goddard recommended that the ship be powered by atomic energy if the technology were realized. If not, a combination of hydrogen, oxygen, and solar energy would suffice. Goddard calculated that these power sources would allow the vessel to achieve velocities of 4.8 to 16 km/s (3 to 10 mi/s), or roughly 57,936 km/h (36,000 mph). This was followed by famed Russian rocket scientist and cosmologist Konstantin E. Tsiolkovsky, also recognized as one of the “forefathers of modern rocketry.” In 1928, he wrote an essay titled “The Future of Earth and Mankind” that described an interstellar “Noah’s Ark.”
In Tsiolkovsky’s version, the spaceship would be self-sufficient, and the crew would be awake for the journey, which would last for thousands of years. In 1964, NASA scientist Dr. Robert Enzmann proposed the most detailed concept to date for a generation ship, known as an “Enzmann Starship.“ The proposal called for a ship measuring 600 meters (2,000 feet) in length powered by a fusion thruster that uses deuterium as a propellant. According to Enzmann, this ship would house an initial crew of 200 people with room for expansion along the way.
In recent years, the concept has been explored from various angles, from biological and psychological to ethical. This included a series of studies (2017-2019) conducted by Dr. Frederic Marin of the Astronomical Observatory of Strasbourg using tailor-made numerical software (called HERITAGE). In the first two studies, Dr. Marin and colleagues conducted simulations that showed that a minimum crew of 98 (max. 500) would need to be coupled with a cryogenic bank of sperm, eggs, and embryos to ensure genetic diversity and good health upon arrival.
In the third study, Dr. Marin and another group of scientists determined that the ship carrying them would need to measure 320 meters (1050 feet) in length, 224 meters (735 feet) in radius, and contain 450 m² (~4,850 ft²) of artificial land to grow enough food to sustain them. In short, these proposals and studies establish that a generation ship and its crew must bring “Earth with them” and rely on bioregenerative systems to replenish their food, water, and air throughout generations.
Credit: Maciej Rebisz/Michel Lamontagne
As noted, most studies regarding interstellar exploration have focused on probes or ships and tended to emphasize speed over ensuring passengers could make the journey. As Hein explained, this makes the Hyperion Project the first competition to focus on generation ships and ensuring the interstellar voyagers remain healthy and safe until they arrive in a nearby star system:
“This competition is unprecedented—a true first. To our knowledge, it marks the first time a design competition specifically focused on generation ships has been launched. It builds on our team’s prior research, conducted since 2011, which addresses fundamental questions such as the required population size. This competition uniquely explores the complex interplay between generation ship technologies and the dynamics of a highly resource-constrained society.
“Most studies have focused on the technological aspects, such as propulsion and life support, while often treating the ship’s technology and onboard society as separate issues. This approach is understandable given the challenge of analyzing these interdependencies. We even got the advice to stay away. Our goal is to take an initial step toward exploring and envisioning these interdependencies. We aim to be Cayley instead of Da Vinci. Da Vinci imagined aircraft, but Cayley conceived their basic design principles, which paved the way for the Wright Brothers.”
The Competition
Registration for the competition will remain open until December 15th, 2024, and all participating teams must pay a $20 registration fee. The top three winning entries will be announced on June 2nd, 2025, and awarded $5000 for first place, $3000 for second, and $2000 for third. In addition, ten teams will receive honorary mentions for creative and innovative ideas. For more information, check out Project Hyperion’s website and the Mission Brief.
Per their mission statement, Project Hyperion is a preliminary study and feasibility assessment for crewed interstellar flight using current and near-future technologies. The goal is to inform the public about the future possibility of interstellar space travel and to guide future research and technology development. As they state on their website, the competition has the following theme:
“Humanity has overcome the great sustainability crisis in the 21st century and has transitioned into an era of sustainable abundance, both on Earth and in space. Humanity has now reached the capacity to develop a generation ship without major sacrifices. An Interstellar Starship flies by an icy planet in a nearby solar system. Going beyond the classical examination of the problem of Interstellar propulsion and structural design for a voyage lasting multiple centuries, what might be the ideal type of habitat architecture and society in order to ensure a successful trip?”
Credit: Maciej Rebisz/Michel Lamontagne
Participants will be tasked with designing the ship, its habitat, and its subsystems, including details on its architecture and society. The Project Brief describes other important Boundary Conditions, including the duration of the mission, its destination, and other important considerations. The mission duration is 250 years from launch to arrival at the target star system, consistent with the ship having advanced propulsion capable of achieving a fraction of the speed of light.
To ensure the health and safety of the crew, the ship’s habitat must have atmospheric conditions similar to Earth’s, along with protection from galactic cosmic rays, micrometeorites, and interstellar dust (a necessity for relativistic space travel). The ship must also provide artificial gravity via rotating sections, though “parts of the habitat can have reduced gravity.” The habitat must also provide accommodation and decent living conditions for 1000 plus or minus 500 people throughout the trip, and it will need to be designed so that it can be modified to meet changing needs.
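For a sense of what “artificial gravity via rotating sections” implies for the design: spin gravity follows from a = ω²r, and often-cited comfort guidelines keep the spin rate to a few rpm at most, which pushes habitats toward large radii. A rough sketch follows; the radii are illustrative (the brief doesn’t fix one), with the 224-meter figure from the HERITAGE study mentioned above included for comparison:

```python
import math

G_EARTH = 9.81  # m/s^2

def rpm_for_gravity(radius_m, g_fraction=1.0):
    """Spin rate (revolutions per minute) for a rotating habitat of the given
    radius to produce g_fraction of Earth gravity at the rim: a = omega^2 * r."""
    omega = math.sqrt(g_fraction * G_EARTH / radius_m)  # rad/s
    return omega * 60 / (2 * math.pi)

def rim_speed(radius_m, g_fraction=1.0):
    """Tangential speed of the rotating rim in m/s."""
    return math.sqrt(g_fraction * G_EARTH * radius_m)

# Illustrative radii; often-cited comfort guidelines suggest staying under
# roughly 2-4 rpm, which is why larger radii are attractive.
for r in (50, 224, 500):
    print(f"radius {r:3d} m: {rpm_for_gravity(r):4.1f} rpm for 1 g, "
          f"rim speed {rim_speed(r):5.1f} m/s, "
          f"{rpm_for_gravity(r, 0.38):4.1f} rpm for Mars-level gravity")
```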
The society’s structure must allow for cultural variations, including language, ethics, family structure, beliefs, aesthetics, and other social factors. The competition also considers knowledge retention and loss relative to Earth, which they describe as “almost inevitable.” Cameron Smith, an anthropologist at Portland State University and the University of Arizona’s Center for Human Space Exploration (CHaSE), is also a member of Project Hyperion’s Organizing Committee. As he explained to Universe Today:
“[T]he situation of a population, let’s say thousands or even 1500 people, traveling in isolation for centuries would be unique to the human experience. So just as we plan for the health of the architecture and the hardware, maintaining them to keep them in a good state over this time span, we can plan for the health and maintenance of both biology and culture. And we have an excellent guide which is evolution.
“Evolution is at the heart of all life sciences, and it also, in many ways, applies to cultural change through time. Biology evolves, and cultures evolve. And we have learned how to manage our cultures on Earth to fit a wide variety of situations.”
“The idea, however, is to get people thinking about how culture might be adjusted for the unusual conditions I’ve outlined. Separation from Earth, separation from other populations of humans, except by radio or video communication – which will become less and less as they get farther from Earth – what could change through time of the voyage that would require cultural adjustment?”
Credit: Maciej Rebisz/Michel Lamontagne
Throughout the trip, the population must also have access to basic products (clothing, shelter, etc.). The habitat is to be as low-mass as possible, reliable over the entire duration of the journey, and equipped with redundant systems. The generation ship’s target destination is a rocky planet in a nearby star system (like Proxima b). In an interesting twist, the competition stresses that this planet will have an artificial ecosystem created by a precursor probe, à la Project Genesis. As a result, the crews will not require any significant genetic or biological adaptations to survive in that ecosystem. As Hein explained:
“250 years in a tin can and staying happy, aka. can a society thrive in a severely resource-constrained environment? Answering this question is essential for designing a generation ship and may also offer insights into sustainable futures on Earth. From my perspective, there has been a significant lack of imaginative solutions to this challenge.
“We also hope to raise awareness of the complexities underlying today’s technologies. Which technologies could or should be preserved on a generation ship, and which may be lost? Research shows that a society’s population size affects the diversity and complexity of its technologies. Most modern technologies require intricate supply chains involving numerous companies, infrastructure, and regulatory systems. Therefore, a generation ship will likely rely on low-tech solutions unless disruptive technologies, like molecular manufacturing or Standard Template Constructs (as depicted in Warhammer 40k), become feasible.”
An Interdisciplinary Approach
A major focus of the competition is interdisciplinary research, reflective of the organizing committee itself. This has become a trend in space research, thanks in large part to the rise of the commercial space industry. For many companies and non-profits today, traditional research is expanding beyond aerospace engineering and incorporating architecture and interior design, biology, sociology, psychology, agriculture, and other disciplines to create concepts that will allow for healthy and sustainable living in space.
Credit: Maciej Rebisz/Michel Lamontagne
Per the rules, teams must consist of at least one architectural designer, engineer, and social scientist (a sociologist, anthropologist, etc.). As Yazgi Demirbas Pech, an architect and designer with the Organizing Committee, explained:
“We hope this competition will inspire greater interdisciplinary collaboration, emphasizing the value of fields such as architecture and social sciences—especially critical in planning for long-duration, long-distance missions. A holistic approach that integrates these diverse fields can contribute to more sustainable and human-centered solutions for space exploration.
“Unlike traditional architectural practices on Earth, space architecture requires a delicate balance between strict technical constraints—such as limited physical space, extreme environmental conditions, and restricted resources—and the essential human needs for comfort, safety, and psychological well-being. Here, architecture becomes a life-sustaining element, enabling people to live, work, and thrive across vast distances and timescales.
“Through this competition, we invite teams to challenge conventional design principles and redefine what “home” means among the stars. Including architects or architecture students on teams will undoubtedly add fresh perspectives to this thought-provoking competition.”
Solving for Space Solves for Earth
Another important aspect of the competition is the desire to inspire ideas that will also have applications and benefits here on Earth. This is another crucial aspect of the future of space exploration, which includes plans for creating outposts on the Moon, Mars, and beyond. Like a generation ship, missions operating farther from Earth cannot rely on regular resupply missions sent from Earth. This means that habitats must be as self-sufficient as possible and ensure that inhabitants have enough air, water, and food to live comfortably.
For decades, scientists and planners have looked to Earth’s natural environment for inspiration. This was the purpose of the Biosphere 2 project, which conducted two experiments between 1991 and 1994 in which volunteers lived in a sealed biome that mimics Earth’s many environments. Since 2007, the University of Arizona has used the facility to conduct research through its CHaSE program while remaining open to the public.
Credit: Maciej Rebisz/Michel Lamontagne
“Since the 1990s, [Biosphere 2] has been a research center for closed ecosystems as though on a starship, and the research here continues. [I am] actually residing at the biosphere until January, and I am looking at the stars and engaged in all of this right now,” said Smith, who wrote to Universe Today from the facility. As he went on to note, research from this experiment and similar studies have significant applications for life here on Earth, mainly because there is no margin for error in space:
“[T]he planning and preparation going into the starship in terms of its culture and biological protections for the offspring would be very carefully designed to give the greatest protections to them, perhaps in ways more specifically tailored to their survival and good health than in any culture ever on Earth. On the interstellar voyage, things must go just right to survive over multiple generations in the closed ecosystem, so planning and preparation would have to be very thorough.”
Since failure in space often means death, especially when people are stationed far from Earth where rescue missions would take too long to reach them, the technologies future explorers and settlers rely on must be regenerative, fail-proof, and sustainable over time. This research and development will have direct benefits when it comes to the most pressing problems we face here on Earth: climate change, overpopulation, poverty and hunger, and the need for sustainable living. As Pech emphasized:
“I believe that thinking beyond Earth can offer valuable insights into how we might improve life here on ‘spaceship Earth.’ Just as in space, where we face numerous challenges, our planet requires innovative approaches to foster harmony and resilience amidst current global conflicts and challenges.”
There’s also the added benefit of stimulating questions about life in the Universe and where extraterrestrial civilizations (ETCs) could already be traveling among the stars. For decades, scientists have explored these questions as part of the Fermi Paradox. As Hein explained:
“Finally, just as Project Daedalus demonstrated the theoretical feasibility of interstellar travel, we aim to establish a similar ‘existence proof’ for human travel to the stars. Achieving this will add new perspectives to the Fermi Paradox: if we can envision crewed interstellar travel today, a more advanced civilization should have achieved it already. So, where are they?”
Those who are interested in the competition or have more questions are encouraged to contact the Initiative for Interstellar Studies at info@i4is.org. The i4is will remain open to Q&A until December 1st, 2024.
Further Reading: Project Hyperion
The post Project Hyperion is Seeking Ideas for Building Humanity’s First Generation Ship appeared first on Universe Today.
Supermassive Black Holes (SMBHs) can have billions of solar masses, and observational evidence suggests that all large galaxies have one at their centres. However, the JWST has revealed a foundational cosmic mystery. The powerful space telescope, with its ability to observe ancient galaxies in the first billion years after the Big Bang, has shown us that SMBHs were extremely massive even then. This contradicts our scientific models explaining how these behemoths became so huge.
How did they get so massive so early?
Black holes of all masses are somewhat mysterious. We know that massive stars can collapse and form stellar-mass black holes late in their lives. We also know that pairs of stellar-mass black holes can merge, and we’ve detected the gravitational waves from those mergers. So, it’s tempting to think that SMBHs also grow through mergers when galaxies merge together.
The problem is, in the early Universe, there wasn’t enough time for black holes to grow large enough and merge often enough to produce the SMBHs. The JWST has shown us the errors in our models of black hole growth by finding quasars powered by black holes of 1-10 billion solar masses less than 700 million years after the Big Bang.
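One standard way to quantify the “not enough time” problem (textbook accretion physics, not figures from the paper discussed below) is Eddington-limited growth: a black hole feeding as fast as radiation pressure allows grows exponentially, with an e-folding time of roughly 50 million years for a typical 10% radiative efficiency. Working backwards shows how heavy a seed would have to be:

```python
import math

SALPETER_TIME_MYR = 450.0   # ~ sigma_T * c / (4 * pi * G * m_p), in megayears

def seed_mass(final_mass_msun, time_myr, efficiency=0.1, duty_cycle=1.0):
    """Seed mass needed to reach `final_mass_msun` by continuous
    Eddington-limited accretion: M(t) = M_seed * exp(t / t_fold), where
    t_fold = (efficiency / (1 - efficiency)) * Salpeter time / duty_cycle.
    Efficiency and duty cycle here are typical assumed values."""
    t_fold = (efficiency / (1.0 - efficiency)) * SALPETER_TIME_MYR / duty_cycle
    return final_mass_msun / math.exp(time_myr / t_fold)

# A 10^9 solar-mass quasar black hole ~700 million years after the Big Bang:
print(f"Seed needed (non-stop Eddington accretion): {seed_mass(1e9, 700):,.0f} Msun")

# If accretion is interrupted (say a 50% duty cycle), the required seed
# balloons far beyond any ordinary stellar remnant:
print(f"Seed needed (50% duty cycle):               {seed_mass(1e9, 700, duty_cycle=0.5):,.1e} Msun")
```

Even feeding flat-out with no interruptions, an ordinary stellar-mass remnant of a few tens of solar masses falls well short, which is exactly the gap the seed models below try to close.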
Astrophysicists are busy trying to understand how SMBHs became so massive so soon in the Universe. New research titled “Primordial black holes as supermassive black holes seeds” attempts to fill in the gap in our understanding. The lead author is Francesco Ziparo from the Scuola Normale Superiore di Pisa, a public university in Italy.
This artist’s conception illustrates a supermassive black hole (central black dot) at the core of a young, star-rich galaxy. Observational evidence suggests all large galaxies have one. Image credit: NASA/JPL-Caltech
There are three types of black holes: stellar-mass black holes, intermediate-mass black holes (IMBHs), and SMBHs. Stellar-mass black holes have masses ranging from about five solar masses up to several tens of solar masses. SMBHs have masses ranging from hundreds of thousands of solar masses up to millions or billions of solar masses. IMBHs are in between, with masses ranging from about one hundred to one hundred thousand solar masses. Researchers have wondered if IMBHs could be the missing link between stellar-mass black holes and SMBHs. However, we only have indirect evidence that they exist.
This is Omega Centauri, the largest and brightest globular cluster that we know of in the Milky Way. An international team of astronomers used more than 500 images from the NASA/ESA Hubble Space Telescope spanning two decades to detect seven fast-moving stars in the innermost region of Omega Centauri. These stars provide compelling new evidence for the presence of an intermediate-mass black hole. Image Credit: ESA/Hubble & NASA, M. Häberle (MPIA)
There’s a fourth type of black hole that is largely theoretical, and some researchers think it can help explain how the early SMBHs got so massive. They’re called primordial black holes (PBHs). Conditions in the very early Universe were much different than they are now, and astrophysicists think that PBHs could’ve formed by the direct collapse of dense pockets of subatomic matter. PBHs formed before there were any stars, so they aren’t limited to the rather narrow mass range of stellar-mass black holes.
Artist illustration of primordial black holes. Credit: NASA’s Goddard Space Flight Center
“The presence of supermassive black holes in the first cosmic Gyr (gigayear) challenges current models of BH formation and evolution,” the researchers write. “We propose a novel mechanism for the formation of early SMBH seeds based on primordial black holes (PBHs).”
Ziparo and his co-authors explain that in the early Universe, PBHs would’ve clustered and formed in high-density regions, the same regions where dark matter halos originated. Their model takes into account PBH accretion and feedback, the growth of dark matter halos, and dynamical gas friction.
In this model, the PBHs are about 30 solar masses and sit in the central regions of dark matter (DM) halos. As the halos grow, baryonic matter settles into their potential wells as cooled gas. “PBHs both accrete baryons and lose angular momentum as a consequence of the dynamical friction on the gas, thus gathering in the central region of the potential well and forming a dense core,” the authors explain. Once the PBHs are clustered together, a runaway collapse occurs, producing a massive black hole whose mass depends on the initial conditions.
Planted soon enough, these seeds can explain the early SMBHs the JWST has observed.
This figure from the research illustrates how PBHs could form the seeds for SMBHs. (Left) As the gas cools, it settles into the center of the dark matter gravitational potential, and the PBHs become embedded at the center. (Middle) The PBHs lose angular momentum due to the gas’s dynamic friction and concentrate in the core of the DM halo. (Right) PBH binaries form and merge rapidly because of their high density. The end result is a runaway merger process that creates the seeds of SMBHs. Image Credit: Ziparo et al. 2024.
There’s a way to test this model, according to the authors.
“During the runaway phase of the proposed seed formation process, PBH-PBH mergers are expected to copiously emit gravitational waves. These predictions can be tested through future Einstein Telescope observations and used to constrain inflationary models,” they explain.
The Einstein Telescope or Einstein Observatory is a proposal from several European research agencies and institutions for an underground gravitational wave (GW) observatory that would build on the success of the laser-interferometric detectors Advanced Virgo and Advanced LIGO. The Einstein Telescope would also be a laser interferometer but with much longer arms. While LIGO has arms four km long, Einstein would have arms 10 km long. Those longer arms, combined with new technologies, would make the Telescope much more sensitive to GWs.
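In the simplest picture, the benefit of longer arms is just geometry: a passing gravitational wave of strain h changes an arm of length L by δL = h × L, so a 10 km arm turns the same wave into two and a half times the displacement that LIGO’s 4 km arms register. (Actual sensitivity depends on those “new technologies” too – this sketch captures only the arm-length part.)

```python
def arm_displacement_m(strain, arm_length_m):
    """Change in arm length produced by a passing gravitational wave of the
    given strain amplitude: delta_L = h * L. This is the simplest picture;
    real detector sensitivity depends on far more than arm length."""
    return strain * arm_length_m

h = 1e-21  # an illustrative strain amplitude for a compact-binary signal

for name, arm in [("Advanced LIGO (4 km arms)", 4_000),
                  ("Einstein Telescope (10 km arms)", 10_000)]:
    print(f"{name}: delta_L ~ {arm_displacement_m(h, arm):.1e} m for strain {h:.0e}")
```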
The Einstein Telescope should open up a GW window into the entire population of stellar and intermediate-mass black holes over the entire history of the Universe. “The Einstein Telescope will make it possible, for the first time, to explore the Universe through gravitational waves along its cosmic history up to the cosmological dark ages, shedding light on open questions of fundamental physics and cosmology,” the Einstein website says.
A thorough understanding of SMBHs is still a ways away, but it’s important to pursue one because of the role they play in the Universe. They help shape the Universe’s large-scale structure by influencing how matter is distributed on the largest scales. The fact that they appeared so much earlier in the Universe than we thought possible shows that we have a lot to learn about SMBHs and how the Universe evolved to the state it’s in now.
The post How Did Supermassive Black Holes Get So Big, So Early? They Might Have Had a Head Start appeared first on Universe Today.
A town in the Austrian Alps might not seem like the most conducive place to come up with daring space missions. But for the last 40 years, students and professors have been gathering to do just that in Alpbach, a village in the Austrian Tyrol. One outcome of the Alpbach Summer School this year was an idea for a combined Neptune/Triton explorer mission that takes advantage of existing technology developed for the JUICE mission. Before we get into the technical details of the mission, though, let’s dive into why scientists should care about the Neptunian system in the first place.
The last time we visited Neptune was with Voyager 2 back in 1989 – a spacecraft launched 12 years earlier, in 1977. Technology has advanced significantly since then, and the limited amount of data Voyager collected at Neptune provided exciting insights into the planet. For example, its magnetosphere is tilted by 47 degrees relative to the planet’s rotation axis. Neptune’s interior also remains opaque to us, with our best guess being that it differs from the other gas giants. However, a lack of data makes further speculation difficult.
Triton, Neptune’s moon, is also interesting in its own right. It has a retrograde orbit, which implies that it is a captured Trans-Neptunian Object rather than a moon that formed from some violent event on Neptune itself. It shows a significant amount of geological activity and, during Voyager’s flyby, shot a series of dark plumes – whose composition remains unknown – into space.
There are plenty of mission ideas for visiting Neptune and Triton – including the Trident mission at NASA.
Visiting these faraway worlds requires plenty of foresight, and many missions have been proposed. The “Blue” team at the Alpbach summer school developed a two-pronged approach for this mission design: the Triton Unveiler & Neptune Explorer (TUNE), an orbiter that would carry most of the mission’s primary instrumentation, and the Probe for Inner Atmospheric Neptune Observations (PIANO). One of the classes at the Summer School was space exploration acronym training.
TUNE, the orbiter, would be placed into a trajectory allowing it to orbit Neptune 600 times, using 40 flybys of Triton for course corrections along the way. Its payload would include a standard suite of sensors, including a radiometer, spectrometer, altimeter, and many other meters. These instruments would help it complete its nine science objectives, which range from measuring temperature and pressure differences in Neptune’s atmosphere to determining Triton’s surface composition.
A second craft would help with several of those objectives. PIANO has its own suite of meters, including a nephelometer and a helium sensor. It is designed to be shot into Neptune’s atmosphere and send data back to TUNE during its descent, giving scientists a first glimpse into the interior of this enigmatic world.
Fraser discusses the data Voyager collected at Neptune.
Thanks to the Jupiter Icy Moons Explorer (JUICE) mission from ESA, most of the mission’s technologies already exist and have been flight-proven. While that lowers the overall development cost of the mission, other factors create a sense of urgency for launch. In the 2070s, the part of Triton that emits those dark plumes will enter a night phase that it will not leave for years, making it necessary to get there before that nuance of orbital mechanics makes the mission goals more difficult.
Given the long development time for some missions and the decade-plus journey to reach the outermost planet in the Solar System, the sooner scientists and engineers start working toward the mission, the better. But so far, none of the big space agencies have picked up the idea as a fully-fledged mission concept. Though we will eventually send another probe to Neptune, unless one of them does pick up this mission, TUNE-PIANO might remain only a dream of one summer in the Austrian Alps.
Learn More:
M. Acurcio et al – The TUNE & PIANO Mission
UT – 10 Interesting Facts About Neptune
UT – What Is The Surface of Neptune Like?
UT – An Ambitious Mission to Neptune Could Study Both the Planet and Triton
Lead Image:
Global color mosaic of Neptune’s largest moon, Triton, taken by NASA’s Voyager 2 in 1989.
Credit: NASA/JPL-Caltech/USGS
The post A Mission to Triton and Neptune Would Unlock Their Mysteries appeared first on Universe Today.