On September 11, 2001, as part of a planned terrorist attack, commercial planes were hijacked and flown into each of the two towers of the World Trade Center in New York. A third plane was flown into the Pentagon, and a fourth crashed after the passengers fought back. This, of course, was a huge world-affecting event. It is predictable that after such events, conspiracy theorists will come out of the woodwork and begin their anomaly hunting, breathing in the chaos that inevitably follows such events and spinning their sinister tales, largely out of their warped imaginations. It is also not surprising that the theories that result, just like any pseudoscience, never truly die. They may fade to the fringe, but they will not go away completely, waiting for a new generation to bamboozle. In the age of social media, everything also has a second and third life as a YouTube or TikTok video.
But still I found it interesting, after not hearing 911 conspiracy theories for years, to get an e-mail out of the blue promoting the same-old 911 conspiracy that the WTC towers fell due to planned demolition, not the impact of the commercial jets. The e-mail pointed to this recent video, by dedicated conspiracy theorist Jonathan Cole. The video has absolutely nothing new to say, but just recycles the same debunked line of argument.
The main idea is that experts and engineers cannot fully explain the sequence of events that led to the collapse of the towers and also explain exactly how the towers fell as they did. To do this, Cole uses the standard conspiracy theory playbook – look for anomalies and then insert your preferred conspiracy theory into the apparent gap in knowledge that you open up. The unstated major premise of this argument is that experts should be able to explain, to an arbitrary level of detail, exactly how a complex, one-off event unfolded – and they should be able to do this from whatever evidence happens to be available.
The definitive official report on the cause of the collapse of the two towers is in the NIST report, which concludes:
“In WTC 1, the fires weakened the core columns and caused the floors on the south side of the building to sag. The floors pulled the heated south perimeter columns inward, reducing their capacity to support the building above. Their neighboring columns quickly became overloaded as columns on the south wall buckled. The top section of the building tilted to the south and began its descent. The time from aircraft impact to collapse initiation was largely determined by how long it took for the fires to weaken the building core and to reach the south side of the building and weaken the perimeter columns and floors.”
The process in WTC 2 was similar, just with different details. Essentially, the impact of the commercial jets dislodged the fireproofing from the core columns. The subsequent fires then heated and weakened the steel, reducing the columns’ ability to bear their load until they ultimately failed, initiating collapse. Once the collapse was initiated, the extra load of the falling floors was more than the lower floors could bear, so they also collapsed.
There really is no mystery here – a careful and thorough analysis by many experts, using all available video evidence, the engineering designs of the building, and computer simulations, has provided an adequate and highly plausible explanation. But Cole believes you can just look at the videos and contradict the experts – he explicitly argues for this position, even claiming that it is “obvious” what is happening and that all the experts are wrong. He then cherry picks reasons for not accepting the expert conclusion, such as: why haven’t we seen this before? Where are the pancaked floors? But again, he is just anomaly hunting. What he fails to consider is that the WTC towers were the largest structures ever to collapse in this way, and you cannot simply scale up smaller building collapses and expect to understand or predict what should have happened with the towers. The energies involved are different, and therefore the relevant physics will behave differently. This is like trying to understand what will happen when a person falls from a height based on your experience of watching small insects fall from similar heights.
The two main anomalies he focuses on are the absence of recognizable debris and the apparent “explosions”. He says – where are the pancaked floors? Meanwhile, lower Manhattan was covered with a layer of concrete dust. Where do you think that dust came from? Again – at this scale and these energies the concrete was mostly pulverized into powder. This is not a mystery.
His second line of evidence (again, nothing new) is the apparent series of explosions ahead of the collapse. However, these explosions are simply the result of the air pressure and immense power of the building collapsing down, producing an explosive sound as the collapse encountered each floor and blowing air out through the windows. This is not an incredibly precise sequence of explosions ahead of the collapse – it is the collapse. I love how the “controlled demolition” advocates argue that the collapse looked like such a demolition. But actually look at videos of controlled demolitions – they look nothing like the collapse of the towers. In such cases you see the explosions, usually happening at roughly the same time, a moment before the collapse. The sequence is – explosions, then collapse. But with the WTC collapses the collapse comes first, and the apparent “explosions” (which do not look like any demolition video I have seen) are at the leading edge of the collapse. This would require a fantastically timed sequence of demolitions that is virtually impossible.
In essence, Cole and other die-hard 911 conspiracy theorists are replacing a well-modeled and evidenced explanation for the collapse with wild speculation that creates far more problems than the imaginary ones they conjure up.
There is also the fact that conspiracy theorists rarely provide any positive evidence for their conspiracy. They only try to poke holes in the official explanation, then insert a sinister interpretation. But here we are, 23 years later, and still there isn’t a lick of evidence (even through multiple subsequent administrations) for a conspiracy. The conspiracy narrative also doesn’t make sense. Why would they arrange to have commercial jets laden with fuel crash into the towers, then also take on the risk of rigging the buildings for controlled demolition, and then set off the demolition in front of the world and countless cameras? And then take the risk that an official investigation, even under a later administration, would not reveal the truth? This is a bad movie plot, one that would break my suspension of disbelief.
There is no evidence for an inside job. There is no evidence that a massive project to plant explosives in both towers (or three buildings, if you include WTC7) ever occurred. There is no evidence from actual expert analysis that the towers fell due to controlled demolition. Cole’s analysis is unconvincing, to say the least – I find it childish and simplistic. But it is easy to use anomaly hunting to create the impression that something fishy is going on, and that is partly why these conspiracy theories persist, long past their expiration date.
The post 911 Conspiracy Theories Persist first appeared on NeuroLogica Blog.
Rarely does something come along that is a real game changer in space exploration. One example is the Skylon reusable single-stage-to-orbit spaceplane. Powered by the hypersonic SABRE engine, it operates like a jet engine at low altitude and more like a conventional rocket at high altitude. Sadly, Reaction Engines, the company that designed the engine, has filed for bankruptcy.
Launching rockets into space is an expensive business, and the cost has often been a significant barrier to space exploration. This is largely because traditional rockets include a significant proportion of expendable elements. A typical launch into low Earth orbit, for example, can cost anything from tens to hundreds of millions of dollars due to those single-use components. Progress has, however, been made with reusable rocket technology like the Falcon 9 and Starship, which are refurbished and reflown for multiple launches. This has helped to drive down the cost of a launch, but at roughly $2,000 per kilogram there is still much to do to bring down the cost of space exploration.
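For a sense of where a per-kilogram figure like that comes from, here is a minimal sketch of the arithmetic; the price and payload values below are approximate public figures used purely as assumptions, not official numbers.

```python
# Rough cost-per-kilogram arithmetic for a partially reusable launcher.
# Both inputs are approximate public figures used as assumptions for
# illustration; actual prices and payloads vary by mission.

launch_price_usd = 67_000_000    # assumed list price of a Falcon 9 launch
payload_to_leo_kg = 22_800       # assumed maximum payload to low Earth orbit

cost_per_kg = launch_price_usd / payload_to_leo_kg
print(f"~${cost_per_kg:,.0f} per kilogram to LEO")   # roughly $2,900/kg
```

Flying with booster recovery trims the usable payload, and refurbishment adds its own costs, so real-world figures land in the low thousands of dollars per kilogram – the same ballpark as the rough number quoted above.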
A SpaceX Falcon 9 rocket sends the European Space Agency’s Hera spacecraft into space from its Florida launch pad. (Credit: SpaceX)
The idea of a fully reusable single-stage-to-orbit (SSTO) spaceplane is one such development, and it was the brainchild of Reaction Engines Limited. The Skylon spaceplane was designed to take off and land like a conventional aircraft, significantly reducing launch costs. Instead of relying upon multiple expendable stages during ascent, Skylon’s Synergetic Air-Breathing Rocket Engine (SABRE) combines jet and rocket propulsion technology to reach orbit. Rather than burning only rocket propellant carried aloft, it utilises atmospheric oxygen during the early part of the ascent, reducing the need to carry heavy liquid oxygen and therefore drastically improving fuel efficiency. Once at sufficient altitude, the SABRE engine switches to rocket mode and only then starts to use onboard oxygen to reach final orbit.
An artist’s conception of Reaction Engines’ Skylon spacecraft. Credit: Reaction Engines
Reaction Engines Limited was formed in the UK back in 1989 and focussed its attention on propulsion technology, in particular on addressing access to space and hypersonic flight. The SABRE engine they developed showed that a dual-mode engine could efficiently transition between high-speed flight within the atmosphere and rocket-powered flight in space. It relies upon a pre-cooler system that cools incoming air from over 1,000°C to room temperature in fractions of a second, enabling high speeds without the engine overheating.
The company, based in Oxfordshire, had to date secured significant investment from the likes of BAE Systems, Boeing, and the European Space Agency. Unfortunately, it has been struggling to source the funding needed to continue operations, so it formally entered administration on 31 October 2024. An eight-week process is now underway to develop plans to restructure the company, sell it, or liquidate its assets. Most of its 200 employees have now been laid off.
Source: Reaction Engines Limited
The post Reaction Engines Goes Into Bankruptcy, Taking the Hypersonic SABRE Engine With it appeared first on Universe Today.
Bats are scary and rabies is deadly, but do you need to worry about you or your pets catching the disease from them?
The dream of traversing the depths of space and planting the seed of human civilization on another planet has existed for generations. For as long as we’ve known that most stars in the Universe are likely to have their own systems of planets, there have been those who advocated that we explore them (and even settle on them). With the dawn of the Space Age, this idea was no longer just the stuff of science fiction and became a matter of scientific study. Unfortunately, the challenges of venturing beyond Earth and reaching another star system are myriad.
When it comes down to it, there are only two ways to send crewed missions to exoplanets. The first is to develop advanced propulsion systems that can achieve relativistic speeds (a fraction of the speed of light). The second involves building spacecraft that can sustain crews for generations – aka. a Generation Ship (or Worldship). On November 1st, 2024, Project Hyperion launched a design competition for crewed interstellar travel via generation ships that would rely on current and near-future technologies. The competition is open to the public and will award a total of $10,000 (USD) for innovative concepts.
Project Hyperion is an international, interdisciplinary team composed of architects, engineers, anthropologists, and urban planners. Many of them have worked with agencies and institutes like NASA, the ESA, and the Massachusetts Institute of Technology (MIT). Their competition is sponsored by the Initiative for Interstellar Studies (i4is), a non-profit organization incorporated in the UK dedicated to research that will enable robotic and human exploration and the settlement of exoplanets around nearby stars.
Artist’s concept of a generation ship. Credit: Maciej Rebisz/Michel Lamontagne
While concepts for an interstellar spacecraft go back to the early Space Age, interest in the field has grown considerably in the past two decades. This is largely due to the recent explosion in the number of known exoplanets in our galaxy, which currently stands at 5,787 confirmed planets in 4,325 star systems. This is illustrated by concepts like Breakthrough Starshot, Swarming Proxima Centauri, and the Genesis Project. These concepts leverage gram-scale spacecraft, directed energy (lasers), and lightsails to achieve speeds of up to 20% of the speed of light, allowing them to make the journey in decades rather than centuries or millennia.
However, sending crewed spacecraft to other star systems with enough passengers to settle on another planet is far more challenging. As addressed in a previous article, a spacecraft relying on known or technically feasible propulsion methods would take between 1,000 and 81,000 years to reach even the nearest star (Proxima Centauri). While some advanced concepts like Project Orion, Daedalus, and Icarus could theoretically reach Proxima Centauri in 36 to 85 years, the costs and amount of propellant needed are prohibitive.
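Those travel times follow from simple arithmetic – distance divided by cruise speed – as the sketch below shows for Proxima Centauri (about 4.24 light-years away). The specific speeds are illustrative assumptions, not values taken from the studies mentioned above.

```python
# Crude one-way travel times to Proxima Centauri, ignoring acceleration,
# deceleration, and relativistic effects. Speeds are illustrative assumptions.

C_KM_S = 299_792       # speed of light in km/s
DISTANCE_LY = 4.24     # distance to Proxima Centauri in light-years

def years_to_proxima(speed_km_s: float) -> float:
    """One-way cruise time in years at a constant speed."""
    return DISTANCE_LY / (speed_km_s / C_KM_S)

for label, speed_km_s in [
    ("Voyager-class probe (~17 km/s)", 17),
    ("1% of light speed", 0.01 * C_KM_S),
    ("10% of light speed (Daedalus-like)", 0.10 * C_KM_S),
    ("20% of light speed (lightsail concepts)", 0.20 * C_KM_S),
]:
    print(f"{label}: ~{years_to_proxima(speed_km_s):,.0f} years")
# ~75,000 years, ~424 years, ~42 years, and ~21 years respectively
```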
The alternative to these “go fast” concepts is to settle in for the long ride, which may last centuries or even millennia. This necessitates a spacecraft of sufficient size capable of accommodating hundreds (or thousands) of human beings over multiple generations. To save room and reduce the mass of cargo space, the crews will need to grow much of their food and rely on life support systems that are bioregenerative in nature. In short, the ship would need to be self-sustaining so the passengers could live comfortable, healthy lives until they reached their destination.
Andreas Hein, an Associate Professor of Aerospace Engineering at the University of Luxembourg and the Chief Scientist at the Interdisciplinary Centre for Security, Reliability and Trust, is part of the Hyperion Project’s Organizing Committee. As he told Universe Today via email:
“Think about the difference between a drone and an ocean liner. Previous designs for interstellar spacecraft, such as Orion, Daedalus, and Icarus, focused on uncrewed probes with the primary objective of gathering scientific data from target star systems, including searching for signs of life. In contrast, generation ships are designed to transport a crew, with the primary goal of settling an exoplanet or other celestial body in the target star system. They also tend to be much larger than interstellar probes, though they would likely use similar propulsion systems, such as fusion-based propulsion.”
Generation Ships
The first known description of a generation ship was made by rocketry engineer Robert H. Goddard, one of the “forefathers of modern rocketry,” for whom NASA’s Goddard Space Flight Center is named. In his 1918 essay, “The Ultimate Migration,” he described an “interstellar ark” leaving the Solar System in the distant future, after the Sun reached the end of its life cycle. The passengers would be cryogenically frozen or kept in a state of induced torpor for much of the journey, except for the pilot, who would be awakened periodically to steer the ship.
Goddard recommended that the ship be powered by atomic energy if the technology were realized. If not, a combination of hydrogen, oxygen, and solar energy would suffice. Goddard calculated that these power sources would allow the vessel to achieve velocities of 4.8 to 16 km/s (3 to 10 mi/s), or roughly 57,936 km/h (36,000 mph). This was followed by famed Russian rocket scientist and cosmologist Konstantin E. Tsiolkovsky, also recognized as one of the “forefathers of modern rocketry.” In 1928, he wrote an essay titled “The Future of Earth and Mankind” that described an interstellar “Noah’s Ark.”
In Tsiolkovsky’s version, the spaceship would be self-sufficient, and the crew would be awake for the journey, which would last for thousands of years. In 1964, NASA scientist Dr. Robert Enzmann proposed the most detailed concept to date for a generation ship, known as an “Enzmann Starship.” The proposal called for a ship measuring 600 meters (2,000 feet) in length, powered by a fusion thruster that uses deuterium as a propellant. According to Enzmann, this ship would house an initial crew of 200 people with room for expansion along the way.
In recent years, the concept has been explored from various angles, from biological and psychological to ethical. This included a series of studies (2017-2019) conducted by Dr. Frederic Marin of the Astronomical Observatory of Strasbourg using tailor-made numerical software (called HERITAGE). In the first two studies, Dr. Marin and colleagues conducted simulations that showed that a minimum crew of 98 (max. 500) would need to be coupled with a cryogenic bank of sperm, eggs, and embryos to ensure genetic diversity and good health upon arrival.
In the third study, Dr. Marin and another group of scientists determined that the ship carrying them would need to measure 320 meters (1050 feet) in length, 224 meters (735 feet) in radius, and contain 450 m² (~4,850 ft²) of artificial land to grow enough food to sustain them. In short, these proposals and studies establish that a generation ship and its crew must bring “Earth with them” and rely on bioregenerative systems to replenish their food, water, and air throughout generations.
Credit: Maciej Rebisz/Michel Lamontagne
As noted, most studies regarding interstellar exploration have focused on probes or ships and tended to emphasize speed over ensuring passengers could make the journey. As Hein explained, this makes the Hyperion Project the first competition to focus on generation ships and ensuring the interstellar voyagers remain healthy and safe until they arrive in a nearby star system:
“This competition is unprecedented—a true first. To our knowledge, it marks the first time a design competition specifically focused on generation ships has been launched. It builds on our team’s prior research, conducted since 2011, which addresses fundamental questions such as the required population size. This competition uniquely explores the complex interplay between generation ship technologies and the dynamics of a highly resource-constrained society.
“Most studies have focused on the technological aspects, such as propulsion and life support, while often treating the ship’s technology and onboard society as separate issues. This approach is understandable given the challenge of analyzing these interdependencies. We even got the advice to stay away. Our goal is to take an initial step toward exploring and envisioning these interdependencies. We aim to be Cayley instead of Da Vinci. Da Vinci imagined aircraft, but Cayley conceived their basic design principles, which paved the way for the Wright Brothers.”
The Competition
Registration for the competition will remain open until December 15th, 2024, and all participating teams must pay a $20 registration fee. The top three winning entries will be announced on June 2nd, 2025, and awarded $5000 for first place, $3000 for second, and $2000 for third. In addition, ten teams will receive honorary mentions for creative and innovative ideas. For more information, check out Project Hyperion’s website and the Mission Brief.
Per their mission statement, Project Hyperion is a preliminary study and feasibility assessment for crewed interstellar flight using current and near-future technologies. The goal is to inform the public about the future possibility of interstellar space travel and to guide future research and technology development. As they state on their website, the competition has the following theme:
“Humanity has overcome the great sustainability crisis in the 21st century and has transitioned into an era of sustainable abundance, both on Earth and in space. Humanity has now reached the capacity to develop a generation ship without major sacrifices. An Interstellar Starship flies by an icy planet in a nearby solar system. Going beyond the classical examination of the problem of Interstellar propulsion and structural design for a voyage lasting multiple centuries, what might be the ideal type of habitat architecture and society in order to ensure a successful trip?”
Credit: Maciej Rebisz/Michel Lamontagne
Participants will be tasked with designing the ship, its habitat, and its subsystems, including details on its architecture and society. The Project Brief describes other important Boundary Conditions, including the duration of the mission, its destination, and other important considerations. The mission duration is 250 years from launch to arrival at the target star system, consistent with the ship having advanced propulsion capable of achieving a fraction of the speed of light.
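A quick back-of-the-envelope check shows what that fraction works out to for a nearby target. The distance used below is that of Proxima Centauri, roughly 4.24 light-years – an assumption for illustration, since the brief only specifies a nearby star system.

```python
# Average cruise speed implied by a 250-year trip to a star ~4.24 light-years
# away (the distance is an illustrative assumption; the brief only specifies
# a nearby star system).

C_KM_S = 299_792        # speed of light in km/s
DISTANCE_LY = 4.24      # assumed distance to the target system
MISSION_YEARS = 250     # duration specified in the competition brief

fraction_of_c = DISTANCE_LY / MISSION_YEARS      # ~0.017, i.e. ~1.7% of c
speed_km_s = fraction_of_c * C_KM_S              # ~5,100 km/s

print(f"Required average speed: {fraction_of_c:.1%} of light speed (~{speed_km_s:,.0f} km/s)")
```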
To ensure the health and safety of the crew, the ship’s habitat must have atmospheric conditions similar to Earth’s, along with protection from galactic cosmic rays, micrometeorites, and interstellar dust (necessary for relativistic space travel). The ship must also provide artificial gravity via rotating sections, though “parts of the habitat can have reduced gravity.” The habitat must provide accommodation and decent living conditions for 1,000 (plus or minus 500) people throughout the trip, and it will need to be designed in such a way that it can be modified to meet changing needs.
The society’s structure must allow for cultural variations, including language, ethics, family structure, beliefs, aesthetics, and other social factors. The competition also considers knowledge retention and loss relative to Earth, which they describe as “almost inevitable.” Cameron Smith, an anthropologist at Portland State University and the University of Arizona’s Center for Human Space Exploration (CHaSE), is also a member of Project Hyperion’s Organizing Committee. As he explained to Universe Today:
“[T]he situation of a population, let’s say thousands or even 1500 people, traveling in isolation for centuries would be unique to the human experience. So just as we plan for the health of the architecture and the hardware, maintaining them to keep them in a good state over this time span, we can plan for the health and maintenance of both biology and culture. And we have an excellent guide which is evolution.
“Evolution is at the heart of all life sciences, and it also, in many ways, applies to cultural change through time. Biology evolves, and cultures evolve. And we have learned how to manage our cultures on Earth to fit a wide variety of situations.”
“The idea, however, is to get people thinking about how culture might be adjusted for the unusual conditions I’ve outlined. Separation from Earth, separation from other populations of humans, except by radio or video communication – which will become less and less as they get farther from Earth – what could change through time of the voyage that would require cultural adjustment?”
Credit: Maciej Rebisz/Michel Lamontagne
Throughout the trip, the population must also have access to basic products (clothing, shelter, etc.). The habitat’s mass is to be as low as possible, and the habitat itself must be reliable over the entire duration of the journey and include redundant systems. The generation ship’s target destination is a rocky planet in a nearby star system (like Proxima b). In an interesting twist, the competition stresses that this planet will have an artificial ecosystem created by a precursor probe, à la the Genesis Project. As a result, the crews will not require any significant genetic or biological adaptations to survive in that ecosystem. As Hein explained:
“‘250 years in a tin can and staying happy’ – aka, can a society thrive in a severely resource-constrained environment? Answering this question is essential for designing a generation ship and may also offer insights into sustainable futures on Earth. From my perspective, there has been a significant lack of imaginative solutions to this challenge.”
“We also hope to raise awareness of the complexities underlying today’s technologies. Which technologies could or should be preserved on a generation ship, and which may be lost? Research shows that a society’s population size affects the diversity and complexity of its technologies. Most modern technologies require intricate supply chains involving numerous companies, infrastructure, and regulatory systems. Therefore, a generation ship will likely rely on low-tech solutions unless disruptive technologies, like molecular manufacturing or Standard Template Constructs (as depicted in Warhammer 40k), become feasible.”
An Interdisciplinary Approach
A major focus of the competition is interdisciplinary research, reflective of the organizing committee itself. This has become a trend in space research, thanks in large part to the rise of the commercial space industry. For many companies and non-profits today, traditional research is expanding beyond aerospace engineering and incorporating architecture and interior design, biology, sociology, psychology, agriculture, and other disciplines to create concepts that will allow for healthy and sustainable living in space.
Credit: Maciej Rebisz/Michel Lamontagne
Per the rules, teams must consist of at least one architectural designer, engineer, and social scientist (a sociologist, anthropologist, etc.). As Yazgi Demirbas Pech, an architect and designer with the Organizing Committee, explained:
“We hope this competition will inspire greater interdisciplinary collaboration, emphasizing the value of fields such as architecture and social sciences—especially critical in planning for long-duration, long-distance missions. A holistic approach that integrates these diverse fields can contribute to more sustainable and human-centered solutions for space exploration.
“Unlike traditional architectural practices on Earth, space architecture requires a delicate balance between strict technical constraints—such as limited physical space, extreme environmental conditions, and restricted resources—and the essential human needs for comfort, safety, and psychological well-being. Here, architecture becomes a life-sustaining element, enabling people to live, work, and thrive across vast distances and timescales.
“Through this competition, we invite teams to challenge conventional design principles and redefine what “home” means among the stars. Including architects or architecture students on teams will undoubtedly add fresh perspectives to this thought-provoking competition.”
Solving for Space Solves for Earth
Another important aspect of the competition is the desire to inspire ideas that will also have applications and benefits here on Earth. This is another crucial aspect of the future of space exploration, which includes plans for creating outposts on the Moon, Mars, and beyond. Like a generation ship, missions operating far from home cannot rely on regular resupply missions sent from Earth. This means that habitats must be as self-sufficient as possible and ensure that inhabitants have enough air, water, and food to live comfortably.
For decades, scientists and planners have looked to Earth’s natural environment for inspiration. This was the purpose of the Biosphere 2 project, which conducted two experiments between 1991 and 1994 in which volunteers lived in a sealed biome that mimics Earth’s many environments. Since 2007, the University of Arizona has used the facility to conduct research through its CHaSE program while remaining open to the public.
Credit: Maciej Rebisz/Michel Lamontagne
“Since the 1990s, [Biosphere 2] has been a research center for closed ecosystems as though on a starship, and the research here continues. [I am] actually residing at the biosphere until January, and I am looking at the stars and engaged in all of this right now,” said Smith, who wrote to Universe Today from the facility. As he went on to note, research from this experiment and similar studies has significant applications for life here on Earth, mainly because there is no margin for error in space:
“[T]he planning and preparation going into the starship in terms of its culture and biological protections for the offspring would be very carefully designed to give the greatest protections to them, perhaps in ways more specifically tailored to their survival and good health than in any culture ever on Earth. On the interstellar voyage, things must go just right to survive over multiple generations in the closed ecosystem, so planning and preparation would have to be very thorough.”
Since failure in space often means death, especially when people are stationed far from Earth where rescue missions would take too long to reach them, the technologies future explorers and settlers rely on must be regenerative, fail-proof, and sustainable over time. This research and development will have direct benefits when it comes to the most pressing problems we face here on Earth: climate change, overpopulation, poverty and hunger, and the need for sustainable living. As Pech emphasized:
“I believe that thinking beyond Earth can offer valuable insights into how we might improve life here on ‘spaceship Earth.’ Just as in space, where we face numerous challenges, our planet requires innovative approaches to foster harmony and resilience amidst current global conflicts and challenges.”
There’s also the added benefit of stimulating questions about life in the Universe and whether extraterrestrial civilizations (ETCs) could already be traveling among the stars. For decades, scientists have explored these questions as part of the Fermi Paradox. As Hein explained:
“Finally, just as Project Daedalus demonstrated the theoretical feasibility of interstellar travel, we aim to establish a similar ‘existence proof’ for human travel to the stars. Achieving this will add new perspectives to the Fermi Paradox: if we can envision crewed interstellar travel today, a more advanced civilization should have achieved it already. So, where are they?”
Those interested in the competition or who have further questions are encouraged to contact the Initiative for Interstellar Studies at info@i4is.org. The i4is will remain open to Q&A until December 1st, 2024.
Further Reading: Project Hyperion
The post Project Hyperion is Seeking Ideas for Building Humanity’s First Generation Ship appeared first on Universe Today.
Supermassive Black Holes (SMBHs) can have billions of solar masses, and observational evidence suggests that all large galaxies have one at their centres. However, the JWST has revealed a foundational cosmic mystery. The powerful space telescope, with its ability to observe ancient galaxies in the first billion years after the Big Bang, has shown us that SMBHs were extremely massive even then. This contradicts our scientific models explaining how these behemoths became so huge.
How did they get so massive so early?
Black holes of all masses are somewhat mysterious. We know that massive stars can collapse and form stellar-mass black holes late in their lives. We also know that pairs of stellar-mass black holes can merge, and we’ve detected the gravitational waves from those mergers. So, it’s tempting to think that SMBHs also grow through mergers when galaxies merge together.
The problem is, in the early Universe, there wasn’t enough time for black holes to grow large enough and merge often enough to produce the SMBHs. The JWST has shown us the errors in our models of black hole growth by finding quasars powered by black holes of 1-10 billion solar masses less than 700 million years after the Big Bang.
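A back-of-the-envelope calculation shows why time is the problem. Under Eddington-limited accretion, a black hole’s mass grows exponentially with an e-folding (Salpeter) time of roughly 50 million years for a typical radiative efficiency of about 10%. The sketch below uses those textbook values – assumptions for illustration, not figures from the paper discussed here – to estimate the growth time from a stellar-mass seed.

```python
import math

# Eddington-limited black hole growth: M(t) = M_seed * exp(t / t_Salpeter).
# A Salpeter e-folding time of ~50 Myr (radiative efficiency ~10%) is a
# textbook assumption used here for illustration.

SALPETER_TIME_MYR = 50.0

def growth_time_myr(seed_mass_msun: float, final_mass_msun: float) -> float:
    """Time in Myr to grow from seed to final mass at the Eddington limit."""
    return SALPETER_TIME_MYR * math.log(final_mass_msun / seed_mass_msun)

print(f"{growth_time_myr(10, 1e9):.0f} Myr")    # ~920 Myr from a 10-solar-mass seed
print(f"{growth_time_myr(1e4, 1e9):.0f} Myr")   # ~575 Myr from a heavy 10,000-solar-mass seed
```

Growing a 10-solar-mass seed to a billion solar masses takes roughly 900 million years of continuous Eddington-limited accretion – longer than the roughly 700 million years available – which is why astronomers look for ways to start with much heavier seeds or to grow them faster.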
Astrophysicists are busy trying to understand how SMBHs became so massive so soon in the Universe. New research titled “Primordial black holes as supermassive black holes seeds” attempts to fill in the gap in our understanding. The lead author is Francesco Ziparo from the Scuola Normale Superiore di Pisa, a public university in Italy.
This artist’s conception illustrates a supermassive black hole (central black dot) at the core of a young, star-rich galaxy. Observational evidence suggests all large galaxies have one. Image credit: NASA/JPL-Caltech
There are three types of black holes: stellar-mass black holes, intermediate-mass black holes (IMBHs), and SMBHs. Stellar-mass black holes have masses ranging from about five solar masses up to several tens of solar masses. SMBHs have masses ranging from hundreds of thousands of solar masses up to millions or billions of solar masses. IMBHs are in between, with masses ranging from about one hundred to one hundred thousand solar masses. Researchers have wondered if IMBHs could be the missing link between stellar-mass black holes and SMBHs. However, we only have indirect evidence that they exist.
This is Omega Centauri, the largest and brightest globular cluster that we know of in the Milky Way. An international team of astronomers used more than 500 images from the NASA/ESA Hubble Space Telescope spanning two decades to detect seven fast-moving stars in the innermost region of Omega Centauri. These stars provide compelling new evidence for the presence of an intermediate-mass black hole. Image Credit: ESA/Hubble & NASA, M. Häberle (MPIA)
There’s a fourth type of black hole that is largely theoretical, and some researchers think it can help explain how the early SMBHs were so massive. They’re called primordial black holes (PBHs). Conditions in the very early Universe were much different than they are now, and astrophysicists think that PBHs could’ve formed by the direct collapse of dense pockets of subatomic matter. PBHs formed before there were any stars, so they aren’t limited to the rather narrow mass range of stellar-mass black holes.
Artist illustration of primordial black holes. Credit: NASA’s Goddard Space Flight Center
“The presence of supermassive black holes in the first cosmic Gyr (gigayear) challenges current models of BH formation and evolution,” the researchers write. “We propose a novel mechanism for the formation of early SMBH seeds based on primordial black holes (PBHs).”
Ziparo and his co-authors explain that in the early Universe, PBHs would’ve clustered and formed in high-density regions, the same regions where dark matter halos originated. Their model takes into account PBH accretion and feedback, the growth of dark matter halos, and dynamical gas friction.
In this model, the PBHs are about 30 solar masses and are in the central region of dark matter (DM) halos. As the halos grow, baryonic matter settles in their wells as cooled gas. “PBHs both accrete baryons and lose angular momentum as a consequence of the dynamical friction on the gas, thus gathering in the central region of the potential well and forming a dense core,” the authors explain. Once clustered together, a runaway collapse occurs that ends up as a massive black hole. Its mass depends on the initial conditions.
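For orientation, the strength of dynamical friction follows a well-known textbook scaling (this is the classic Chandrasekhar-type form, shown here for intuition; it is not an equation quoted from the paper, whose gaseous treatment differs in detail):

```latex
% Chandrasekhar-type dynamical friction scaling (textbook form, for intuition):
% a body of mass M moving at speed v through a background of density \rho
% feels a drag force of order
F_{\mathrm{df}} \sim \frac{4\pi G^{2} M^{2} \rho \,\ln\Lambda}{v^{2}},
% where \ln\Lambda is the Coulomb logarithm.
```

Because the drag grows with the square of the black hole’s mass and with the local gas density, the PBHs in the dense halo core lose orbital energy quickly, sink toward the centre, and set up the runaway merging the authors describe.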
Planted soon enough, these seeds can explain the early SMBHs the JWST has observed.
This figure from the research illustrates how PBHs could form the seeds for SMBHs. (Left) As the gas cools, it settles into the center of the dark matter gravitational potential, and the PBHs become embedded at the center. (Middle) The PBHs lose angular momentum due to the gas’s dynamical friction and concentrate in the core of the DM halo. (Right) PBH binaries form and merge rapidly because of their high density. The end result is a runaway merger process that creates the seeds of SMBHs. Image Credit: Ziparo et al. 2024.
There’s a way to test this model, according to the authors.
“During the runaway phase of the proposed seed formation process, PBH-PBH mergers are expected to copiously emit gravitational waves. These predictions can be tested through future Einstein Telescope observations and used to constrain inflationary models,” they explain.
The Einstein Telescope, or Einstein Observatory, is a proposal from several European research agencies and institutions for an underground gravitational wave (GW) observatory that would build on the success of the laser-interferometric detectors Advanced Virgo and Advanced LIGO. The Einstein Telescope would also be a laser interferometer, but with much longer arms. While LIGO has arms 4 km long, the Einstein Telescope would have arms 10 km long. Those longer arms, combined with new technologies, would make the telescope much more sensitive to GWs.
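The payoff from longer arms follows from one line of arithmetic: a gravitational wave with strain h changes an arm of length L by ΔL = h × L, so the displacement the instrument must measure scales directly with arm length. The strain value below is a typical illustrative figure, not a specification of either detector.

```python
# Arm-length change produced by a passing gravitational wave: delta_L = h * L.
# The strain value (1e-21) is a typical illustrative figure, not a detector spec.

STRAIN = 1e-21

for name, arm_length_m in [("LIGO (4 km arms)", 4_000),
                           ("Einstein Telescope (10 km arms)", 10_000)]:
    delta_l = STRAIN * arm_length_m
    print(f"{name}: arm length change ~ {delta_l:.0e} m")
# ~4e-18 m versus ~1e-17 m -- both far smaller than a proton, but the longer
# arms give the instrument 2.5 times more displacement to work with.
```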
The Einstein Telescope should open up a GW window into the entire population of stellar and intermediate-mass black holes over the entire history of the Universe. “The Einstein Telescope will make it possible, for the first time, to explore the Universe through gravitational waves along its cosmic history up to the cosmological dark ages, shedding light on open questions of fundamental physics and cosmology,” the Einstein website says.
A thorough understanding of SMBHs is still a long way off, but it is important to pursue because of the role they play in the Universe. They help explain the Universe’s large-scale structure by influencing the distribution of matter on large scales. The fact that they appeared so much earlier in the Universe than we thought possible shows that we have a lot to learn about SMBHs and how the Universe has evolved into the state it’s in now.
The post How Did Supermassive Black Holes Get So Big, So Early? They Might Have Had a Head Start appeared first on Universe Today.
A town in the Austrian Alps might not seem like the most conducive place to come up with daring space missions. But for the last 40 years, students and professors have been gathering to do just that in Alpbach, a village in the Austrian Tyrol. One outcome of this year’s Alpbach Summer School was an idea for a combined Neptune/Triton explorer mission that takes advantage of existing technology developed for the JUICE mission. Before we get into the technical details of the mission, though, let’s dive into why scientists should care about the Neptunian system in the first place.
The last time we visited Neptune was with Voyager 2 back in 1989 – and that spacecraft was launched 12 years earlier, in 1977. Technology has advanced significantly since then, and the limited amount of data Voyager collected at Neptune provided exciting insights into the planet. For example, its magnetic field is tilted by 47 degrees relative to the planet’s rotation axis. Also, Neptune’s interior remains opaque to us, with our best guess being that it differs from those of the other gas giants. However, a lack of data makes further speculation difficult.
Triton, Neptune’s largest moon, is also interesting in its own right. It has a retrograde orbit, which implies that it is a captured Trans-Neptunian Object rather than a moon that formed in place around Neptune. It shows a significant amount of geological activity, and during Voyager’s flyby it shot a series of dark plumes, whose composition remains unknown, into space.
There are plenty of mission ideas for visiting Neptune and Triton – including the Trident mission at NASA.
Visiting these faraway worlds requires plenty of foresight, and many missions have been proposed. The “Blue” team at the Alpbach summer school developed a two-pronged approach for this mission design: the Triton Unveiler & Neptune Explorer (TUNE), an orbiter that would carry most of the mission’s primary instrumentation, and the Probe for Inner Atmospheric Neptune Observations (PIANO). One of the classes at the Summer School was space exploration acronym training.
TUNE, the orbiter, would be placed into a trajectory allowing it to orbit Neptune 600 times, using its 40 flybys of Triton for course corrections. Its payload would include a standard suite of sensors, including a radiometer, spectrometer, altimeter, and many other meters. These instruments would help it complete its nine science objectives, which range from measuring temperature and pressure differences in Neptune’s atmosphere to determining Triton’s surface composition.
A second craft would help with several of those objectives. PIANO has its own suite of meters, including a nephelometer and a helium sensor. It is designed to be released into Neptune’s atmosphere and send data back to TUNE during its descent, allowing scientists to get a first glimpse into the interior of this enigmatic world.
Fraser discusses the data Voyager collected at Neptune.
Thanks to the Jupiter Icy Moons Explorer (JUICE) mission from ESA, most of the mission’s technologies already exist and have been flight-proven. While that lowers the overall development cost of the mission, other factors play into a sense of urgency for launch. In the 2070s, the part of Triton that emits those dark plumes will enter a night phase that it will not leave for years, making it necessary to get there before that nuance of orbital mechanics makes the mission goals more difficult.
Given the long development time for some missions and the decade-plus journey to reach the outermost planet in the Solar System, the sooner scientists and engineers start working toward the mission, the better. But so far, none of the big space agencies have picked up the idea as a fully fledged mission concept. We will probably send another probe to Neptune eventually, but unless one of the agencies does pick up this mission, TUNE-PIANO might remain only a dream of one summer in the Austrian Alps.
Learn More:
M. Acurcio et al – The TUNE & PIANO Mission
UT – 10 Interesting Facts About Neptune
UT – What Is The Surface of Neptune Like?
UT – An Ambitious Mission to Neptune Could Study Both the Planet and Triton
Lead Image:
Global color mosaic of Neptune’s largest moon, Triton, taken by NASA’s Voyager 2 in 1989.
Credit: NASA/JPL-Caltech/USGS
The post A Mission to Triton and Neptune Would Unlock Their Mysteries appeared first on Universe Today.