It’s time to take a more thorough and serious look at using geoengineering to protect the planet’s ice sheets, according to a group of scientists who have released a new report examining the issue. Glacial geoengineering is an emerging field of study that holds some hope for Earth’s diminishing glaciers and ice sheets.
Glaciers and ice sheets are major components of the cryosphere, which plays an important role in the water cycle. They’re massive water reservoirs that release their water into rivers, lakes, and oceans when temperatures rise. They cover about 10% of the Earth’s land surface and provide agricultural water for about two billion people.
There’s a dire consequence to not protecting Earth’s glaciers and ice sheets: global sea-level rise. The IPCC (Intergovernmental Panel on Climate Change) doesn’t pull punches when it comes to our planet’s melting ice sheets and glaciers. In its Special Report on the Ocean and Cryosphere in a Changing Climate, published in 2019, the IPCC said that global mean sea level would probably rise between 0.95 feet (0.29 m) and 3.61 feet (1.1 m) by the end of the 21st century. Those estimates may actually be on the conservative side, but they still put vast numbers of people in small island states and coastal cities right in the crosshairs of the unfolding cryosphere disaster.
A team of five scientists has released a new white paper on glacial geoengineering, “Glacial Climate Intervention: A Research Vision.” In it, they argue that glaciological research should focus on ice-sheet preservation to slow down or prevent sea level rise. They write that we need to determine “if engineered interventions applied to critical icesheet regions may reduce sea-level rise.”
In their paper, they focus on ice sheets rather than glaciers. The world’s glaciers are remote, relatively small, and scattered around the world, so they’re not realistic targets for geoengineering. Conversely, Antarctica and Greenland feature massive, continent-size ice sheets that are accessible and are the main source of the meltwater that is raising sea levels.
The authors don’t advocate for any particular geoengineering intervention. Instead, they present their vision of a vigorous effort to determine which interventions should or could be used.
“We need vigorous public debate of potential benefits and harms, informed by research that creates evidence regarding those concerns.”
From Glacial Climate Intervention: A Research Vision

“Everyone who is a scientist hopes that we don’t have to do this research,” said Douglas MacAyeal, a professor of geophysical sciences at the University of Chicago who has studied glaciers for nearly 50 years and is a co-author of the white paper. “But we also know that if we don’t think about it, we could be missing an opportunity to help the world in the future.”
Every major ice sheet and glacier system in the world is undergoing critical changes. As their melting accelerates, they’ll contribute more and more water to the oceans. The global sea level has already risen by about 8 or 9 inches (20 to 23 cm) since the late 1800s, and the rise will only accelerate.
Most of the water will come from regions in the Antarctic and Arctic, basically Greenland and the Antarctic Ice Sheet, a continental ice sheet that covers almost the entirety of Antarctica. Could limiting the melt in these key regions help slow the global sea level rise? How could it be achieved, and what undesirable effects would the effort have on ecosystems? According to the authors of the report, it’s time to tackle these questions seriously and with a sustained effort.
In the last couple of decades, scientists have focused on two questions about the melting cryosphere. One asks what processes cause the loss of ice that contributes to global sea-level rise, and the other asks how climate change is driving or affecting those processes. For decades, glaciologists have also been informally discussing what interventions might be possible to slow the rise.
For the authors of this report, it’s time to take the next step and ask what can be done. “We cannot stop sea-level rise, but we may be able to slow it while humanity makes the necessary shift away from carbon-based energy systems,” they write.
Their white paper is organized around three guiding questions, and it lays out a research agenda aimed at answering them. The agenda goes beyond geoengineering and also considers “social license and justice, governance, ethics, and the wisdom of any research into glacial climate intervention.”
There are two prominent approaches to limiting melt and global sea-level rise (GSLR). One involves intervening in the ocean’s heat transport mechanisms, and the other involves basal-hydrology interventions. Basal hydrology refers to the conditions at the base of the ice. A third, less prominent approach involves pumping seawater onto the surface of the ice.
The issue is extremely complex. In Antarctica, for example, different ice sheets respond differently to warmer temperatures. They have different structures and contact the ocean in different ways. Some are relatively protected from the melt, while others are in far more peril. No single type of intervention will work everywhere.
The Larsen ice shelf is situated on the east coast of the Antarctic Peninsula. It’s been breaking up since the 1990s. Could geoengineering slow or stop Antarctic ice shelves from fracturing and melting faster? Image: By A. J. Cook and D. G. Vaughan, CC BY 3.0, https://commons.wikimedia.org/w/index.php?curid=30463195

In some cases, geoengineering would have to prevent warm water from reaching the underside of ice shelves. This could be done by constructing sediment berms on the ocean bottom or placing fibrous curtains there. Colder water could be directed toward the underside of the shelves instead, limiting and delaying the melting. This could also thicken and lengthen the ice shelves. This is an example of ocean heat transport interventions. “This would stabilize the ice sheet and slow the rate of collapse,” the authors explain. Modelling studies show that modest curtains covering only a fraction of the water column could have an outsized effect on melting.
The obvious question is, what happens to the ecosystem? It would be a tough sell if the environmental destruction was severe.
Basal hydrology interventions are aimed at the base of ice sheets where they contact the ground. Ice streams are fast-flowing corridors of ice that discharge ice and sediment into the ocean from under an ice sheet and contribute to GSLR. In the past, some of them have stopped on their own. The Kamb Ice Stream suddenly shut down about 200 years ago from natural causes. Could we recreate those causes with geoengineering? “Better understanding of why the Kamb Ice Stream shut down of its own accord will tell us whether there are human interventions that could make it happen again,” the authors write.
The authors point out that the Kamb Ice Stream likely slowed down because it lost water content. Water acts as a lubricant that lets ice streams flow faster and discharge more ice into the ocean.
One idea is to drill a field of holes through ice sheets and extract water from the basal region. That would reduce the lubrication effect and slow down the ice streams. “These holes would be used to extract either water or heat from the subglacial system, possibly using passive, unpowered thermosyphons,” the authors explain. Another similar method would involve creating channels under the ice sheet where water could drain away.
One advantage of these basal hydrology interventions is that they could have less ecological impact.
There are a handful of other potential interventions that haven’t been as well studied. For example, windbreaks could be employed on the surface to help snow build up on the top of ice sheets. We could place reflective materials on the surface of ice sheets to reduce ablation. Another one is to use cables and anchors to prevent ice sheets from breaking up. Yet another one is to pump seawater onto the surface of ice sheets during winter to create more ice.
The eastern coast of Antarctica has lost most of the Glenzer and Conger ice shelves, as seen in these satellite images taken between November 15, 1989, and January 9, 2022. Credit: NASA GSFC/UMBC JCET.

“It will take 15 to 30 years for us to understand enough to recommend or rule out any of these interventions,” said co-author John Moore, a professor with the Arctic Centre at the University of Lapland.
There are many uncertainties. Altering the flow of water with berms or curtains could have unintended consequences elsewhere that might work against our geoengineering efforts. Basal hydrology interventions could cause the grounding line, the place where the ice stops resting on bedrock and begins to float, to retreat. Pumping seawater onto the top of an ice sheet could create new fractures or exacerbate existing ones, hastening the sheet’s breakup.
The authors acknowledge how uncertain this all is. “All glacial climate interventions are scientifically new and not yet proven to work, and are technically and socially complex projects with multiple uncertain impacts,” they write. It’ll take a coordinated and committed effort to reduce these uncertainties.
“Our argument is that we should start funding this research now so that we aren’t making panicked decisions down the road when the water is already lapping at our ankles.”
Douglas MacAyeal, Professor of Geophysical Sciences, University of Chicago

There are arguments against the effort, of course.
One is that this type of research could end up disincentivizing efforts to reduce GHG emissions. But for the authors, reducing emissions is always the top priority. “We can never say often enough that that is the first priority,” said Moore.
Some say it might create an overreliance on technological solutions. Others argue that there might be too many unintended and adverse reactions.
There might be a moral hazard, too, with the actions of one generation imperilling the next. That’s already happening with GHG emissions. Another argument against geoengineering points out that it will be the developed nations that undertake it, and they may optimize the effort for their own desired outcomes, ones that benefit them unevenly. An additional argument is that the community of scientists involved is small, and if they’re the only ones discussing this, valuable perspectives might be missed.
In the end, the authors are calling for a vigorous debate on all aspects of the issue, not just the engineering methods themselves. “We need vigorous public debate of potential benefits and harms, informed by research that creates evidence regarding those concerns,” they write. “We need to know and discuss how such interventions will affect people across the globe, natural systems, perceptions of “nature,” and pressure to reduce anthropogenic climate change.”
In August 2021, it rained on the summit of Greenland for the first time on record. Image Credit: Contains modified Copernicus Sentinel data (2021) and GEUS weather station data processed by ESA. ESA Standard Licence

They say that the overall goal is to engage as many stakeholders as possible in discussion and research.
Our carbon emissions are still climbing. The rate isn’t the same across countries and economies because more developed economies have more resources to combat emissions. But ultimately, that doesn’t really matter. The problem is global, and the solution will be, too.
It’s possible that the world’s glaciers and ice sheets have a tipping point. We may have already reached it. “Humans have already released so much carbon dioxide that we are seeing profound changes in every glacier system around the world,” said MacAyeal. “Many of these are likely to have a tipping point where even if we were to stop emitting all carbon worldwide tomorrow, the system would still collapse. And we are not in a position now to say that we haven’t already crossed those points.”
The detailed approach that the authors recommend will take time to develop. If we implement these types of solutions, it will take time to see any benefits. As that time passes, ice sheets will continue to melt, and the seas will continue to rise. There’s a sense of panic, but that can’t drive our decisions. “Without research, we cannot know if there are viable interventions,” the authors write. Without research we also can’t know if there are tipping points.
This is another familiar refrain from scientists, one in a long line of refrains that were unheeded at first and pushed aside in the face of more pressing, short-term concerns. We’ve wasted time and have to stop wasting more. “Without the concurrent practical planning, engineering, and consultation, there will be an unconscionable delay in action, should there be a solution,” the authors explain.
They envision a large-scale expansion of the science and engineering behind glaciers and the measures we can take to slow their melt.
“We are proposing such an ambitious program because we see examining options for reducing sea-level rise from icesheet melting as a global imperative,” they write.
“Our argument is that we should start funding this research now so that we aren’t making panicked decisions down the road when the water is already lapping at our ankles,” said MacAyeal.
The post Can Geoengineering Protect Earth’s Icesheets? appeared first on Universe Today.
As I’m doing a lot of preparation for my trip to South Africa, I have neither the time nor the will to dissect the article below, a piece that appeared in The Journal of Chemical Education. As is so often the case with these articles that try to use science education to create what they call “Social Justice”, it’s poorly written, illustrated with childish and uninformative figures, and—worse—so poorly argued that I can’t even see its main point. It has something to do with teaching chemistry in a more “inclusive” way, but gives no serious methodology for doing so beyond talking about social justice in chemistry class. In the end, it’s simply a performative act that says, “Hey, there’s real structural racism in chemistry, and we two chemists are on the side of the minoritized.” Click below to read, or download the pdf here.
Below there’s also a critique of this article by Jordan Beck, published in Heterodox STEM.
Just a few excerpts from the article above to give a sense of its inanity and AI-style boilerplate:
Sexism, racism, queerphobia, and ableism (among many other forms of discrimination) continue to permeate society and culture. Existing as a multiply marginalized individual exacerbates these inequities. Intersectionality as a concept was described in the academic literature by Crenshaw in 1989, explaining how individuals could experience specific, compounded discrimination, not simply additive.
These societal inequities are reflected and reproduced in chemistry. Stereotypes about who can and cannot succeed in chemistry persist, in combination with inequality of participation and research funding success statistics, leading to homogeneity in groups communicating and conducting scientific research. Important work has highlighted the contributions of racially minoritized chemists in curricula, which is a key aspect of teaching chemistry both in schools and in postcompulsory education. Chemistry-specific inequities also include privileging only certain, narrow forms of “expert” scientific knowledge, e.g., prioritizing academic language which advantages the dominant cultural groups of chemistry students, graduates, and academics–an “untranslatable code” for those outside. This leads to individuals who do not see themselves as “properly” scientific or think that genuine fears of chemistry and/or chemicals will be dismissed, developing chemophobic attitudes. Therefore, when trying to challenge chemophobia, we have to consider these structural factors to avoid reinforcing existing views of being excluded, patronized, or dismissed. This social justice lens builds on previous models of chemophobia to explicitly identify these structures, highlighting additional challenges faced by marginalized groups.

The sense of this, insofar as it has any sense, is that the emphasis on merit in chemistry, and the use of language that conveys chemical concepts, is bigoted and creates “chemophobia.”
There’s more:
However, very little literature on chemophobia specifically considers structural factors, e.g., systemic racism, sexism, or unequal access to education, and where research identifies that certain marginalized subgroups in a population are more likely to endorse chemophobic attitudes, this is rarely interrogated or explained.
Maybe there isn’t that much literature on systemic racism in chemistry because there’s not that much systemic racism (i.e. formally codified discrimination) in chemistry.
And here’s how to fix chemophobia (there’s a long list given as well, but you can read it for yourself). The upshot: we need more DEI!
However, a small but growing number of papers integrate social justice considerations, including Goeden and colleagues, who describe a community-based inquiry that improved critical thinking in allied health biochemistry. Livezey and Gerdon both describe teaching practices that integrate DEI (Diversity, Equity, and Inclusion) practices and explicitly link chemistry with social justice; these authors found that the social justice focus of the teaching promoted student engagement from those who were already involved in STEMM courses and those who, in their own words, “honestly did not like science”, and improved learners’ understanding of chemistry and wider scientific issues through course content that was relevant to their experiences and interests.
Again, the authors are using chemistry to advance their notion of social justice, which includes effacing the dubious “systemic racism” of the field. I think it would be better just to bring more minorities into chemistry by widening the net, furthering equal opportunity, and teaching chemistry—real chemistry—in an interesting way.
Like me, Jordan Beck is weary of papers like this. Click to read, and I’ll give one excerpt below. There is no branch of science immune from this kind of performative virtue signaling:
From Beck:
Thus, I really struggle when articles like this chemophobia paper come through because when these topics come up, journals seem to lose any pretense of rigor and relevance—anything goes under the DEI flag. Such papers also promote ideas that I consider to be detrimental to the science. The chemophobia article is only a commentary, but it still bothers me.
The remainder of this post consists of select passages from the commentary with my commentary in response. All quotes are from the commentary.
The Palmer and Sarju paper starts with a figure that I’ve put below along with Beck’s analysis.
The figure constitutes an insult to the intelligence of not just academics but anyone, and it adds nothing beyond what’s said in the paper’s text:
Beck’s take:
It is difficult to summarize exactly what the figure is meant to convey, but it seems like the idea is that we need some sort of rainbow lens to disrupt the uniformity of the people in the sciences. It is better, in this view, to label each scientist with a particular label so that we can understand how “differential access to education” is leading to “cognitive overload”. I maintain the notion, which for one reason or another now seems to be outdated or taboo, that I really don’t care about the sexual orientation of the authors of a journal article that I am reading. In fact, if you can believe it, I didn’t even think about trying to determine the gender or sexual orientation of the authors of the article that I just reviewed. The top picture, where all the scientists are the same, has some merit. They can be judged simply by what they contribute.
Frankly, I’m losing my willingness to take apart papers like this because they’re all the same. I can suggest only two things to the authors. First, if you want more diversity in chemistry, work on giving children more opportunities to encounter chemistry rather than DEI-izing the way it is taught. Second, learn to write, as your prose is turgid and, surprisingly, laden with jargon that obscures the meaning of your text.
Wikipedia is my go-to site for checking facts quickly, as it is for many. But I’ve seen enough wonky stuff on it that I wouldn’t trust it on controversial matters, and that’s the topic of this post. (I have long wanted to go through its “Evolution” page to check for accuracy, but I’ve never gotten around to it.)
Tablet is a pretty reliable source, and in this piece, Izabella Tabarovsky argues that Wikipedia has distorted facts and material about Judaism and Israel, all in a way hostile to Israel—and the truth. Even Larry Sanger, the co-founder of Wikipedia, claims that the site’s new leadership (yes, there are authorities above the editors) are “clowns” and that its vaunted neutrality is a sham.
Tabarovsky is identified this way:
Izabella Tabarovsky is a scholar of antizionism and contemporary left antisemitism. She is a Senior Fellow with the Z3 Institute for Jewish Priorities and a Research Fellow with the London Centre for the Study of Contemporary Antisemitism and ISGAP.
Click the headline to read:
The thesis:
Wikipedia’s key principles are codified in “five pillars,” which include writing from a neutral point of view and using reliable sources to document key arguments. Another pillar urges editors to treat each other with respect and seek consensus on contentious topics. Disputes are resolved by volunteer administrators and can be escalated all the way to the Wikipedia Arbitration Committee (aka Wikipedia’s “Supreme Court”). Punishment can include bans varying in severity and length of time.
Today, Jewish people and the Jewish story are under an unprecedented global assault, and Wikipedia is being used as a weapon in this war.
. . . Wikipedia also prides itself on radical transparency: Every edit can be seen by everyone on a specially designated page.
Closer to home, what’s clear is that Wikipedia’s articles are now badly distorted, feeding billions of people—and large-language models that regularly train on the site, such as ChatGPT—with inaccurate research and dangerously skewed narratives about Jews, Jewish history, Israel, Zionism, and contemporary threats to Jewish lives.
The first sign of the problem to Tablet:
In June, a group of Wikipedia editors and administrators rated the Anti-Defamation League as “generally unreliable” on the Israeli-Palestinian conflict and “roughly reliable” on antisemitism “when Israel and Zionism are not concerned.” They also evaluated the ADL’s database of hate symbols, deeming it as “reliable for the existence of a symbol and for straightforward facts about it, but not reliable for more complex details, such as symbols’ history.”
The anonymous editors, with unknown backgrounds or academic credentials, accused the ADL of “conflating” anti-Zionism with antisemitism and relying on the International Holocaust Remembrance Alliance’s definition of antisemitism, which, they claimed, brands all criticism of Israel as antisemitic and stifles pro-Palestinian speech. They also accused the ADL of “smearing” Students for Justice in Palestine by calling on universities to investigate whether the group provided material support to Hamas, a U.S.-designated terrorist organization.
You can read the linked articles, and also the Wikipedia article on the Anti-Defamation League, which beefs that the ADL conflates antisemitism with anti-Zionism, a claim that no longer holds much water for me. (It’s a pity that the Munk Debate on this issue, in which Douglas Murray and Natasha Hausdorff, taking the side of equivalence, trounced their opponents, is no longer free online, though bits of it are; see here and here.)
Apparently the Wikipedia editors most persistent on Jewish matters are those who are anti-Israel, and they have simply worn down their opponents. Especially in foreign-language articles, which have some influence, the errors persist for years and years. For example:
In 2004, a spokesperson for the Polish branch of the Wikimedia Foundation created an article in English describing an extermination camp in Warsaw, where the Nazis supposedly gassed 212,000 Poles. The story—a fiction—remained on the site for 15 years before the Israeli newspaper Haaretz revealed the problem in 2019. By then, the article had been translated into multiple languages, and its claims incorporated into multiple other Wikipedia articles. An estimated half a million people were exposed to the lie.
Last year two historians published a bombshell paper demonstrating how a group of ideologically driven editors spent years systematically distorting Polish Jewish history across multiple Wikipedia articles to align it with far-right Polish nationalist preferences. [JAC: It is now against the law in Poland to argue that the Poles helped the Nazis exterminate the Jews, even though that’s true.] Working in concert, the group falsified evidence, promoted marginal self-published sources, created fake references, and advanced antisemitic stereotypes. It whitewashed “the role of Polish society in the Holocaust,” “minimize[d] Polish antisemitism, exaggerate[d] the Poles’ role in saving Jews,” blamed Jews for the Holocaust, and generally steered “Wikipedia’s narrative on Holocaust history away from sound, evidence-driven research, toward a skewed version of events,” wrote the authors, Jan Grabowski and Shira Klein.
Wikipedia’s mechanisms proved entirely inadequate in the face of this motivated, organized assault. Working “as a monolith,” the group manipulated the procedures, coordinated edits, and rallied to each other’s support when challenged. Users seeking to correct the group’s edits found themselves outnumbered and outmaneuvered. “Challenging the distortionists takes a monumental amount of time, more than most people can invest in a voluntary hobby,” wrote Grabowski and Klein. The distortionists exhausted their opponents with endless debates, aggressive “battleground behavior,” rudeness, and “mass deletions,” leading some to simply give up on editing the topic. Volunteer administrators called upon to resolve conflicts were unqualified to adjudicate content issues and unwilling to invest the hours required to sort through sources.
. . . . The most incomprehensible part about this is that it took the Wikimedia Foundation 14 years from the time the first complaints began to surface to do something about it.
Tabarovsky also argues that the “reliable” sources for matters Judaic on Wikipedia are liberal sources known for being anti-Israel, including the NYT, BBC, The New Yorker, and The Guardian. Those are in fact the very sources that I consider most dubious on Israel news. Conservative sources like the New York Post and Fox News are rated unreliable, even though news that makes Israel look bad (like the false claim that an Israeli strike demolished the Al-Ahli Arab Hospital in Gaza) is often loudly promoted by the “reliable” organs of the MSM. More:
This ranking tells us what kind of slant we can expect in Wikipedia’s articles about Israel, Zionism, and anti-Zionist antisemitism. In the wake of Oct. 7, “generally reliable” sources have trafficked in disinformation, as when The New York Times splashed the Al Ahli hospital bombing hoax over its front page, helping spark violent anti-Jewish riots across the world; or when The New Yorker legitimized Holocaust inversion—a long-running staple of anti-Zionist propaganda originating in the 1960s USSR. Conservative outlets, on the other hand, have produced reporting that tells Israel’s side of the story and have looked far more critically at the anti-Israel campus protests. The “generally unreliable” Washington Free Beacon has arguably produced the most extensive reporting on the protests. Wikipedia editors, however, are warned against using the Beacon as a source, which is why of the 353 references accompanying Wikipedia’s article on the pro-Palestinian campus protests, the overwhelming majority is to liberal and far-left sources plus Al Jazeera.
Here’s how it works: as we know, among “progressive” Leftists, who are the most anti-Israel group in politics save groups like the Black Muslims, it is the loudest and most persistent who triumph. One example:
One-sided sources are just one among a host of problems in Wikipedia articles related to Oct. 7 and the war that followed. In a World Jewish Congress report released in March, Dr. Shlomit Aharoni Nir documents numerous ways in which relevant Wikipedia entries have become de facto anti-Israel propaganda. From biased framing to omissions of key facts to stressing anti-Israel examples while ignoring the Israeli side of the story, to promoting fringe academic perspectives on Zionism—Wikipedia’s editors and administrators have actively worked to subvert the site’s neutrality policy on this topic. As in other instances, conflicts and bullying behavior predominate, with Israeli editors describing uniquely “hostile and disrespectful” treatment. Israeli users, who are most knowledgeable about the Oct. 7 events, often found themselves locked out of editing key articles, which were open for editing only to users who’d made over 500 edits. Several editors told Aharoni Nir that there were a number of activists who operated anonymously and were “responsible for the anti-Israel tone.”
Among some of the most troubling instances Aharoni Nir documented were calls for deletions of crucial articles. These included articles describing individual massacres on Oct. 7, such as those at Netiv HaAsara, Nir Yitzhak, Yakhini, and other kibbutzim and moshavim, as well as articles describing Hamas beheadings. Some of the calls succeeded. So did the call to erase the article about Nazism in Palestinian society (a “documented historical and sociological phenomenon,” notes Aharoni Nir). By contrast, the article normalizing equations between Israel and Nazi Germany—a propagandistic concept that has been weaponized against Jews for decades––remains on the site. Meanwhile, Wikipedia’s Arabic site openly abandoned the principle of neutrality last December when it temporarily went dark in solidarity with the Palestinians, then added the Palestinian flag to its logo and posted a pro-Palestinian statement at the top. Israel’s Wikipedia community protested. Wikimedia Foundation—you guessed it—did nothing.
There are other subtle distortions in articles about Israel, including the one about the Six-Day War in 1967. As Malgorzata noted,
“There is not a word about the threats from both Egyptian and Syrian authorities and media about obliterating Israel. The falsification is very subtly done – as if Israel didn’t have a genuine reason to launch a preemptive strike.”

There’s a lot more to read and, in the end, Tabarovsky argues that one of the world’s most-consulted sources of information is biased to the extent that it’s turning itself “into the Great Soviet Encyclopedia.”
Maybe I should have a look at Wikipedia’s “evolution” article, though I’m pretty sure that it hasn’t been ideologically captured by creationists or IDers. There are enough pro-science editors out there to prevent any gross distortions from happening. But do be aware of Wikipedia’s coverage of things about Israel.
***********
UPDATE: Oops, I made the mistake of looking at the Evolution article in Wikipedia and found this right at the beginning:
Evolution by natural selection is established by observable facts about living organisms: (1) more offspring are often produced than can possibly survive; (2) traits vary among individuals with respect to their morphology, physiology, and behaviour; (3) different traits confer different rates of survival and reproduction (differential fitness); and (4) traits can be passed from generation to generation (heritability of fitness).[7] In successive generations, members of a population are therefore more likely to be replaced by the offspring of parents with favourable characteristics for that environment.
This is a hypothesis that doesn’t really establish the fact of natural selection, but suggests its likelihood. To establish natural selection’s existence, you must document it empirically. Also, the fact that “more offspring are often produced than can possibly survive” is, as pointed out by Ronald Fisher, more a result of natural selection than an observation that leads one to conclude that natural selection must occur. (You have a lot of kids because there are many things going after them.) I wouldn’t have begun that article using this as evidence for evolution by natural selection. Further, evolution can occur by means other than natural selection, including genetic drift (which they do mention) and meiotic drive (which they don’t). Overall, however, the article looks pretty good, and of course every evolutionist will have a beef that their favorite topics aren’t covered properly.
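The drift point is easy to demonstrate for yourself. Here is a minimal sketch of the textbook neutral Wright-Fisher model (a toy illustration, not anything from the Wikipedia article): with no fitness differences at all, an allele’s frequency still wanders from generation to generation and is eventually fixed or lost by chance alone.

```python
import random

def wright_fisher(pop_size: int, p0: float, generations: int, seed: int = 1) -> list[float]:
    """Neutral Wright-Fisher model: each generation, the new allele count
    is a binomial draw at the current frequency -- no selection at all."""
    random.seed(seed)
    p = p0
    freqs = [p]
    for _ in range(generations):
        # Each of the 2N gene copies is drawn at random from the parental pool
        count = sum(random.random() < p for _ in range(2 * pop_size))
        p = count / (2 * pop_size)
        freqs.append(p)
    return freqs

trajectory = wright_fisher(pop_size=50, p0=0.5, generations=200)
print(f"start: {trajectory[0]:.2f}, end: {trajectory[-1]:.2f}")  # often ends at 0.0 or 1.0
```

Run it with different seeds and the allele fixes or vanishes unpredictably, which is exactly why establishing natural selection requires documenting differential fitness empirically rather than just observing change.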
Today Doug Hayes of Richmond, Virginia, has sent a batch of pictures showing the primate H. sapiens engaged in a ritual of unknown evolutionary significance; it’s called “dancing”. Doug’s captions and IDs are indented, and you can enlarge the photos by clicking on them.
Here’s my latest photoshoot done for Starr Foster Dance to publicize their next show, Page To Stage III at Richmond, Virginia’s Firehouse Theatre December 5th – 8th, 2024. This will be the third Page To Stage presentation, a cross-disciplinary project combining the art of writing and dance. Writers were asked to submit short works of prose or poetry, which would be interpreted in dance by choreographer Starrene Foster and her dancers.
The dancers pictured here are the core members of the company: Shannon Comerford, Molly Huey, Fran Beaumont, Angela Palmisano and Madison Ernstes. Other dancers, drawn from the Richmond dance community, will be added as needed. The show will consist of eight pieces, of which three pieces have been choreographed and are in rehearsal.
In addition to the choreography, Starrene is also designing and sewing the costumes as well as collaborating on the original music and lighting design! For more information on the company (and more of my photos), visit www.starrfosterdance.org. The Firehouse Theatre’s website is www.firehousetheatre.org.
Core members of Starr Foster Dance (L to R) Fran Beaumont, Shannon Comerford, Molly Huey, Angela Palmisano and Madison Ernstes:
For this group shot, the flash units were aimed at the ceiling and bounced down on the dancers. Starr loves shadowy and mysterious looking lighting similar to stage lighting. Dresses designed and sewn by Starrene Foster:
Another group shot, this time, Angela and Molly were asked to come up with a unique move while Shannon, Madison and Fran jumped in the background:
Angela Palmisano executing a leap. Each dancer was encouraged to execute a different jump for the individual photos:
Angela Palmisano, Molly Huey and Madison Ernstes:
Fran Beaumont and Shannon Comerford:
Molly Huey in flight!:
Madison Ernstes levitating:
Angela Palmisano, Molly Huey and Madison Ernstes execute a group jump:
Shannon Comerford:
Another double jump by Fran Beaumont and Shannon Comerford. They are actually off the ground, not kneeling:
Fran Beaumont does her jump:
Photo info: All photos were shot with the Sony A1 camera body and Sony FE 24-105 lens – usually set at 35mm for group and duo photos, 50-90mm for solo shots. The camera was set to do a burst of 15 shots per second for the action photos.
Lighting was with two Westcott FJ400WS battery-powered monolights set to “Freeze” mode, which allows burst shooting at up to 20 frames per second, enabling the flash to keep up with the camera’s burst speed. The camera’s flash sync speed was 1/350th of a second, which eliminates motion blur from ambient light – a problem with older cameras and flash units when photographing fast action. Until recent improvements in camera technology and flash units’ electronics, sync speed was limited to only 1/60th of a second.
The flashes were triggered by a wireless FJ-X3 S (for Sony cameras) radio transmitter capable of controlling 20 flash units per channel simultaneously (32 channels, with the capability of independently programming six groups of up to 20 flash units). Photos were shot at ISO 400 and edited in Adobe Photoshop and Topaz Photo AI (noise reduction and enhancement).
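To put rough numbers on the sync-speed point: blur from ambient light is simply subject speed multiplied by exposure time. The sketch below assumes a hand or foot moving at 5 m/s during a leap (an illustrative figure, not measured from these photos).

```python
def motion_blur_mm(speed_m_per_s: float, shutter_s: float) -> float:
    """Distance the subject moves during one exposure, in millimetres."""
    return speed_m_per_s * shutter_s * 1000

# 5 m/s is an assumed speed for a fast dance move, for illustration only
for denom in (350, 60):
    blur = motion_blur_mm(5.0, 1 / denom)
    print(f"1/{denom} s sync: subject moves ~{blur:.0f} mm during the exposure")
```

At 1/350 s the subject moves about 14 mm, versus roughly 83 mm at 1/60 s, which is the difference between a crisp edge and a smear whenever ambient light contributes to the exposure.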
While black holes are known as the most destructive objects in the universe, their evolution is largely shrouded in mystery. Astronomers are familiar with the supermassive black holes that exist at the centers of galaxies like our own, and with black holes whose masses are less than 100 times that of our Sun, but intermediate-mass black holes (IMBHs) have largely eluded discovery. That might change with the recent discovery of a black hole candidate within the globular cluster Omega Centauri, one that holds the potential to be the “missing link” in scientists’ understanding of black hole evolution.
For the discovery, an international team of researchers used the Hubble Space Telescope, examining more than 500 images spanning approximately 20 years to trace seven fast-moving stars within Omega Centauri, which is located just over 17,000 light-years from Earth and estimated to be just over 11.5 billion years old. One reason for the lengthy research time is that Omega Centauri contains approximately 10 million stars, each with an average mass of about four Suns, packed into a cluster approximately 150 light-years in diameter.
“We discovered seven stars that should not be there,” said Maximilian Häberle, who is a PhD student at the Max Planck Institute for Astronomy in Germany and the investigation lead. “They are moving so fast that they should escape the cluster and never come back. The most likely explanation is that a very massive object is gravitationally pulling on these stars and keeping them close to the center. The only object that can be so massive is a black hole, with a mass at least 8200 times that of our Sun.”
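The logic behind that mass floor can be sketched with basic physics (a back-of-the-envelope illustration, not the team’s actual dynamical modelling): for a star at radius r moving at speed v to stay bound, the enclosed mass M must satisfy v ≤ √(2GM/r), which rearranges to M ≥ v²r/2G. The speed and radius below are assumed values chosen only to show the scale of the answer.

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
PARSEC = 3.086e16    # metres per parsec

def min_bound_mass(v_kms: float, r_pc: float) -> float:
    """Minimum enclosed mass (solar masses) required to keep a star
    moving at v_kms km/s bound at radius r_pc: M >= v^2 * r / (2G)."""
    v = v_kms * 1e3          # km/s -> m/s
    r = r_pc * PARSEC        # parsecs -> metres
    return v**2 * r / (2 * G) / M_SUN

# Assumed, illustrative numbers -- not values from the team's paper
print(f"~{min_bound_mass(38, 0.05):,.0f} solar masses")  # roughly 8,400
```

A star moving at 38 km/s just 0.05 parsecs from the centre already demands thousands of solar masses within that tiny radius, far more than ordinary stars could supply there, which is the crux of the black hole argument.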
For context, the supermassive black hole at the center of our galaxy is approximately 4.3 million times the mass of our Sun, so this candidate, at a minimum of 8,200 solar masses, would sit squarely in the intermediate range between stellar-mass black holes (under 100 solar masses) and supermassive ones. However, the only other study suggesting the existence of an IMBH within Omega Centauri was published in 2008, so this latest discovery will require further examination, as the researchers have yet to determine the candidate’s exact mass and position within Omega Centauri.
“This discovery is the most direct evidence so far of an IMBH in Omega Centauri,” said Dr. Nadine Neumayer, who is a scientist at the Max Planck Institute for Astronomy and who began the study. “This is exciting because there are only very few other black holes known with a similar mass. The black hole in Omega Centauri may be the best example of an IMBH in our cosmic neighborhood.”
This new candidate continues a long list of potential IMBH discoveries dating back to 2004, none of which has been definitively confirmed to this day. Additionally, given that this potential IMBH is located just under 17,000 light-years from Earth, it could, if confirmed, be the closest massive black hole to Earth, beating out the supermassive black hole residing at the center of our galaxy, which is approximately 27,000 light-years away.
As noted, black holes are among the most destructive, yet most mysterious, objects in the universe. They can’t be directly observed and are typically detectable only when they consume an object in their surrounding environment, most often a star. However, once detected, astronomers can learn much about their behavior, including the production of gravitational waves when two black holes merge, as was first observed in 2015.
Additionally, their evolution remains a mystery, as astronomers have debated for decades regarding how black holes form, evolve, and even die. Therefore, confirming the existence of the first IMBH could bring astronomers one step closer to better understanding black hole evolution throughout the universe.
How will IMBHs help astronomers better understand black hole evolution in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Finally! Astronomers Find the Missing Link Between Stellar and Supermassive Black Holes appeared first on Universe Today.
Are inflatable habitats the future of human space exploration? This is what space-tech company Sierra Space hopes to demonstrate: it recently conducted a successful Ultimate Burst Pressure test on June 18 with its Large Integrated Flexible Environment (LIFE®) technology at NASA’s Marshall Space Flight Center. The goal of these tests is to inflate the test article until it explodes and determine whether the burst pressure meets NASA’s strict safety requirement of 60.8 psi (the 15.2 psi maximum operating pressure times a safety factor of four). The recent test article burst at 74 psi, exceeding NASA’s safety standard by 22 percent.
“We are 100 percent committed to maintaining U.S. leadership in Low Earth Orbit. Sierra Space is leading the way with the first commercial space station to replace the International Space Station when it is decommissioned and ensure there is no gap in LEO,” said Sierra Space CEO Tom Vice. “Our revolutionary, expandable space station technology reinvents the space station. Our technology, for the first time, will enable the right unit economics that will usher in the full commercialization of space. Our biotech and industrial partners will utilize our factories of the future to innovate new products that will massively disrupt terrestrial markets and benefit life on Earth.”
This recent test marks the technology’s second full-scale structural test and seventh key validation test, coming after Sierra Space successfully conducted its first full-scale burst test in December 2023, which achieved 77 psi and exceeded NASA’s safety standard by 27 percent. Both test units stood more than 6 meters (20 feet) tall and had volumes of 300 m3 (10,594 ft3), approximately one-third of the pressurized volume of the International Space Station (ISS). Sierra Space is now planning the first test of its 500 m3 (17,657 ft3) space station technology in 2025, which will be 55 percent of the pressurized volume of the ISS.
December 2023 burst test of the Large Integrated Flexible Environment (LIFE®)

“No other company is moving at the speed of Sierra Space to develop actual hardware, stress-tested at full scale, and demonstrate repeatability. We’ve taken a softgoods system that very few companies around the world have been able to design, and now we have consistent, back-to-back results,” said Shawn Buckley, VP of Earthspace Systems, Space Stations, at Sierra Space. “A second successful full-scale test is an absolute game changer. We now know it’s possible to equal or surpass the total habitable volume of the entire International Space Station, in a single launch.”
While these two recent tests were conducted at full scale, Sierra Space also conducted two sub-scale burst tests, in July 2022 and November 2022, achieving maximum pressures of 192 and 204 psi, respectively. NASA’s safety standard for the sub-scale sizes was 182.4 psi, so both tests exceeded it as well.
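Putting the four tests side by side makes the pass/fail arithmetic explicit: in each case the requirement is the maximum operating pressure times a safety factor of four, and the margin is how far the measured burst pressure lands above that threshold. Here’s a quick sketch using only the figures quoted in this article (the percentages in the text are rounded):

```python
def margin_pct(burst_psi: float, required_psi: float) -> float:
    """Percent by which a measured burst pressure exceeds the requirement."""
    return (burst_psi / required_psi - 1) * 100

# (test, burst pressure, required threshold); the full-scale requirement
# is 15.2 psi x 4 = 60.8 psi, the sub-scale requirement is 182.4 psi
tests = [
    ("Jul 2022 sub-scale ", 192.0, 182.4),
    ("Nov 2022 sub-scale ", 204.0, 182.4),
    ("Dec 2023 full-scale", 77.0, 60.8),
    ("Jun 2024 full-scale", 74.0, 60.8),
]
for name, burst, required in tests:
    print(f"{name}: burst {burst:5.1f} psi -> +{margin_pct(burst, required):.0f}% vs requirement")
```

The two full-scale results come out to the article’s 27 and 22 percent margins, while the sub-scale tests cleared their higher threshold by about 5 and 12 percent.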
July 2022 sub-scale burst test

November 2022 sub-scale burst test

Sierra Space stated in June 2023 that they hope to launch a “pathfinder” version of the LIFE® habitat in 2026, with the goal of the technology being an essential piece of the Orbital Reef commercial space station, which is scheduled to be operational in 2027. Given its size, Sierra Space estimates the LIFE® habitat can comfortably accommodate four astronauts, with the remaining volume used for science experiments, exercise equipment, small medical facilities, and the Astro Garden® system, which can potentially grow food in space and has previously undergone testing at Sierra Space’s facility in Madison, Wisconsin.
This comes as numerous commercial space companies are attempting to launch their own space stations, including the Axiom Station, Starlab Space Station, Haven-1, and the aforementioned Orbital Reef. It also comes as NASA has announced plans to “retire” the ISS in 2030, although the agency said in a July 2024 white paper that it will evaluate extending the ISS’s lifetime if no commercial space stations are able to accommodate space-based research at that time.
How will Sierra Space’s Large Integrated Flexible Environment (LIFE®) technology help advance human space exploration in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Watch an Inflatable Habitat Burst in Super Slo-Mo appeared first on Universe Today.
The topic of genetically modified organisms (GMOs) is a great target for science communication because public attitudes have largely been shaped by deliberate misinformation, and the research suggests that those attitudes can change in response to more accurate information. It is the topic where the disconnect between scientists and the public is the greatest, and it is the most amenable to change.
The misinformation comes in several forms, and one of those forms is the umbrella claim that GMOs have been bad for farmers in various ways. But this is not true, which is why I have often said that people who believe the misinformation should talk to farmers. The idea is that the false claims against GMOs are largely based on a fundamental misunderstanding of how modern farming works.
There is another issue here, which falls under another anti-GMO strategy – blaming GMOs for any perceived negative aspects of the economics of farming. As in many industries, farm sizes have grown, and small family farms (analogous to mom-and-pop stores) have given way to large corporate-owned agricultural conglomerates. This is largely due to consolidation, which has been happening for over a century (long before GMOs). It happens because larger farms have economies of scale – they can afford more expensive, high-technology farm equipment. They can spread out their risk more. They are more productive. And when a small farm owner retires without a family to leave the farm to, it tends to be consolidated into a larger farm. Also, government subsidies tend to favor larger farms.
Some small family farms have found a business model that works, and they do well. We have many local farms where I live, who do agricultural events, pick your own, pumpkin picking, sell many heirloom vegetables, sell wine, have a corn maze, and do other things to stay profitable.
You will notice that none of this has anything to do with GMOs. But let’s get back to the first strategy and see why it is flawed and can be fixed by talking to a farmer.
One anti-GMO claim is that farmers don’t like having to buy their seeds from big companies. They would rather save their seeds to plant the next year. There are two reasons why this argument fails as an anti-GMO argument. One is that farmers have been buying their seeds for decades, again – long before GMOs came on the scene. In the 1990s (before GMOs), more than 90% of all seeds planted in developed countries were hybrids, and hybrid seeds are patented and owned by seed companies. GMOs literally changed nothing. Also, you cannot replant hybrid seeds because the hybrid traits do not breed true. They are worthless for replanting.
Further, farmers generally don’t want to save their seeds and store them over winter, keeping them dry and vermin-free. This is a lot of work. It’s easier just to buy fresh seeds each season. Again, there will be exceptions for some crops and some farmers, but they can still save seeds if they wish. They can preserve their heirloom crops in this way; no one is stopping them.
Another claim is that GMOs hurt farmers, because they are expensive. But farmers buy GMOs because they are more profitable. In a 2022 study:
Over the period 1996 to 2020, the economic benefits have been significant with farm incomes for those using the technology having increased by $261.3 billion US dollars. This equates to an average farm income gain across all GM crops grown in this period of about $112/hectare.
So the narrative that farmers are forced to buy GMO seeds they don’t want, that doing so costs them money, and that they are prevented from saving and replanting seeds is completely false. A basic understanding of the farming industry would correct this false narrative – i.e., talk to an actual farmer. Further, use of some GMO crops makes farming outcomes more predictable, which is critical to farmers. They are less likely to lose their crop to pests or drought. GMOs make it easier to do no-till farming and to use less pesticide, which saves on labor.
In terms of the effect of GMOs on small-scale farmers specifically:
The most significant advantages of GM crops include being independent to farm size, environment protection, improvement of occupational health issues, and the potential of bio-fortified crops to reduce malnutrition. Challenges faced by small-scale farmers for adoption of GM crops comprise availability and accessibility of GM crop seeds, seed dissemination and price, and the lack of adequate information.
As with all technologies, smaller farms are less able to afford GM crops and reap their benefits. Of course, this is no reason to ban the technology (there is no discussion of banning other high-tech agricultural technology because it is difficult for small-scale farmers to afford). Again, they have to find a business model that works for their scale. There are challenges for any small business to compete with the big boys.
Finally, there is one thing that some farmers worry about when it comes to GMOs – they can be controversial. But this is a circular argument – they are controversial because they are controversial. This is precisely the strategy of the anti-GMO lobby – demonize GMOs and make them controversial so that the controversy becomes a negative unto itself.
We can talk about the economics of farming, how to protect small farmers, and we can have this discussion in the context of business as a whole. Again – the trend toward consolidation is a general trend across most industries for reasons generic to business, capitalism, and scale. But let’s remove the fake arguments from the GMO discussion. This is not about forcing farmers to buy GMO seeds – they do so voluntarily because they are profitable. Let’s also not pretend that most farmers want to save their seeds but can’t. This is simply not a good business model, and most farmers have abandoned the practice of saving seeds long before GMOs entered the scene. Also, most seeds were patented long before GMO technology.
Most farmers recognize that GMO technology is simply an extension of breeding and cultivation and has resulted in some extremely useful cultivars, with more to come. Abandoning GMOs will hurt farmers, will reduce food production, and will hurt the environment.
The post GMOs – Ask a Farmer first appeared on NeuroLogica Blog.
Academic research on solar system objects has increased dramatically over the last twenty years. However, information on most of the estimated 1.2 million objects discovered in our solar system has been spread throughout various databases and research papers. Putting all that data into a single data store and making it easy to access would allow researchers to focus on their research rather than on where to collect data. That is the idea behind the Solar System Open Database Network (SsODNet), a project by data scientists at the Observatoire de Paris.
So why is this important? Ease of access to data can lower barriers to entry into the field of researching solar system objects (SSOs), allowing more people to participate in that research. The more people that research SSOs, the more likely we are to spot a potentially dangerous or economically interesting one.
Additionally, even for researchers already involved in the field, collecting data relevant to their current research can be a time-consuming, manual process. Introducing machine-readable tools like SsODNet can dramatically speed up the time it takes to produce new research on SSOs, enabling those researchers to do better-quality work.
One potential application of the database is AI – how useful can that be?

What parameters would those researchers be looking at? The database includes data on diameter, taxonomy, thermal inertia, rotational period, albedo, and more – the most exciting characteristics scientists want to know about an SSO. To collect this data, the developers, led by Jerome Berthier, combined data from several publicly available databases, such as the Jet Propulsion Laboratory Small Bodies Database and the Lowell Observatory Minor Planet Services, with manual data published in dozens of papers on solar system objects. Many of these preexisting databases also didn’t have machine-friendly systems, meaning that Dr. Berthier and his co-authors had to manually scrape data from them and the manuscripts to include it in SsODNet.
When building out SsODNet itself, machine interfaces were a central tenet of development. It is designed as a web service that accepts standard machine-interfacing protocols, such as REST, for queries. The team also implemented a Python interface called “rocks,” which can be invoked from a command-line interface.
These simple interfaces combine features the SSO research community has asked for, such as standardizing the names of the 1.2 million objects in the database (called quaero in the program) and providing statistical analyses of a set of objects (ssoBFT). There are also several estimates for what properties might be correct if conflicting data points are found in the literature.
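To make the machine-readability concrete, here is a minimal sketch of what a programmatic name-resolution query might look like. The endpoint URL, query parameter, and response layout below are assumptions for illustration based on the services named above (the quaero resolver), not verified API documentation; consult the SsODNet paper or site for the actual interface.

```python
import requests

# Hypothetical endpoint for the quaero name resolver described above;
# the URL and parameters are assumptions, not verified documentation.
QUAERO_URL = "https://api.ssodnet.imcce.fr/quaero/1/sso/search"

def resolve_sso(name: str) -> dict:
    """Ask the name resolver for the standardized record of one object."""
    response = requests.get(QUAERO_URL, params={"q": name}, timeout=30)
    response.raise_for_status()
    return response.json()

print(resolve_sso("Ceres"))  # inspect the record for the canonical name, aliases, etc.
```

The point of such an interface is that a survey pipeline can resolve thousands of designations and pull physical parameters without a human ever opening a web page.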
Finding asteroids is the first step in stopping the deadly ones – SsODNet can help with that.

Even with as much data as the team could gather on the 1.2 million objects in the database, it is still incomplete. The authors admit that most of the data points currently in the database are for asteroids, since that is the type of object they are most interested in studying. And while asteroids make up a large percentage of SSOs, the database doesn’t yet include comets, satellites, or even planets, though these are planned for future releases.
However, perhaps the most impressive part of this data collection effort is its ongoing commitment to support. The authors have committed to updating the database with new SSO data weekly (at least to the quaero name resolver) and releasing major monthly updates to the other applications. Doing so would include adding new data from new papers released during that time. Getting the data into a sustainable format to launch the database in the first place was a Herculean effort, and maintaining it for the foreseeable future will be another one. The SSO research community will undoubtedly thank them for it.
Learn More:
Berthier et al. – SsODNet: Solar system Open Database Network
Solar System Portal – SsODNet
UT – Solar System Guide
UT – Want to be an asteroid miner? There’s a database for that.
Lead Image:
Illustration of an interstellar object approaching our solar system.
Credit: Rubin Observatory/NOIRLab/NSF/AURA/J. daSilva
The post The Properties of 1.2 Million Solar System Objects Are Now Contained In A Machine-Readable Database appeared first on Universe Today.
Is your phone really tracking your driving habits and selling the data? Maybe more so than you know.