Archive for the 'Astronomy' Category

Jul 15 2021

Methane in Enceladus Plumes

Published by under Astronomy

Enceladus is the sixth-largest moon of Saturn, about 500 km in diameter. It is completely covered with mostly fresh ice, making it highly reflective (in fact, it has the highest albedo of any object in the solar system, reflecting almost 100% of the light that hits it). Given its small size, astronomers assumed it was likely frozen solid. This small chunk of ice, however, became significantly more interesting in 2005, when Cassini first observed plumes erupting from its south pole. This suggested that Enceladus has liquid water beneath that surface crust of ice – and any place with liquid water is a potential candidate location for life.

Over the next 10 years Cassini made many Enceladus flybys, collecting data that is still being analyzed. NASA now estimates that there is an ocean beneath the moon's south pole, under 30-40 km of surface ice and about 10 km deep. Further, analysis of the misty plumes finds that they consist of salty water with a higher level of organic material than predicted.

Now we have a new analysis of Cassini data looking at the methane content of the Enceladus plumes. Methane is of particular interest to exobiologists looking for telltale signs of life, for two good reasons. One is that methane is a highly reactive gas and will not persist for long in an atmosphere or in liquid water, so if it is present in significant amounts it must be constantly replenished. It shares this feature with oxygen, which is why oxygen is also a significant sign of potential life. Further (and again like oxygen), methane is known to be a byproduct of the metabolism of certain kinds of critters. On Earth, deep-sea vents harbor methanogenic archaea, bacteria-like single-celled organisms that live by chemosynthesis.

Continue Reading »

No responses yet

Jul 12 2021

The Dawn of Space Tourism

Published by under Astronomy,Technology

A common theme that emerges when writing about science and technology is that often the most important factor in determining if and how a technology is adopted is not the tech itself. Economics is often the overriding factor. People will tend to take the most efficient and least expensive route to any goal. We don't usually do things just because we can. This is why it is so important that the market places a fair and proper price on goods and services without significant distortion. Distorted market forces (like allowing companies to externalize the real costs of their business) will produce distorted outcomes. (Government regulation is used when efficiency is not the only desired outcome. We also want a clean environment, justice, and protection of minors, for example.)

This is why recent developments have been exciting for space enthusiasts, who have long accepted that the route to a robust space infrastructure requires commercialization of space. Big government programs will pave the way, bootstrapping the technology, but will likely not be able to sustain a space industry. The moment going to space becomes profitable, we will truly enter the space age. And one industry well positioned to be on the leading edge of commercialization is tourism.

All this is why the recent trip to the edge of space by Virgin Galactic is noteworthy. The ship is designed as a space plane that takes off horizontally like a traditional jet. There is a carrier portion, named White Knight Two, which carries the actual ship, SpaceShipTwo, beneath its middle section (the mission itself was dubbed Unity 22). At 15 km the ships separated, and SpaceShipTwo then rocketed up to an altitude of 80 km. "Space" is considered to begin at the Kármán line, 100 km up, so this was technically not into space (hence the oft-cited "edge of space"). This was also a suborbital flight; with a maximum altitude of about 90 km, SpaceShipTwo is not capable of orbital flight.
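For a sense of why suborbital is so much easier than orbital, here is a quick back-of-envelope Python sketch (my illustration, not anything from Virgin Galactic; it ignores air drag and gravity losses). Climbing to 90 km costs roughly g·h per kilogram, while orbit additionally demands about 7.8 km/s of horizontal velocity:

```python
# Back-of-envelope specific-energy comparison (illustrative only; ignores
# air drag and gravity losses, and treats g as constant below ~200 km).
g = 9.81            # m/s^2, near-surface gravitational acceleration
h_apogee = 90e3     # m, approximate SpaceShipTwo maximum altitude
v_orbit = 7800.0    # m/s, approximate speed for low Earth orbit
h_orbit = 200e3     # m, a representative low-Earth-orbit altitude

e_climb = g * h_apogee                  # energy per kg just to reach 90 km
e_orbit = v_orbit**2 / 2 + g * h_orbit  # kinetic + potential energy per kg

print(f"climb to 90 km : {e_climb / 1e6:5.2f} MJ/kg")
print(f"low Earth orbit: {e_orbit / 1e6:5.2f} MJ/kg (~{e_orbit / e_climb:.0f}x more)")
```

Even by this generous accounting, orbit takes dozens of times more energy per kilogram than a vertical hop to 90 km.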

Continue Reading »

No responses yet

Jun 17 2021

Betelgeuse Mystery Solved

Published by under Astronomy

In 2019 and 2020 the red supergiant star Betelgeuse was dimming, and by a significant amount. Betelgeuse is the right shoulder of the constellation Orion and so is one of the easier stars to find. As a red supergiant it is also in the later stages of its life. Such stars might dim like this when they are getting close to going supernova, so that possibility was excitedly proposed. Not only would it be incredibly cool to see a supernova that close by – it would be visible during the day for weeks and light up the night sky – but the opportunity to study a supernova up close (just 724 light years away) would be a scientific boon. But astronomers were too cautious to jump to conclusions, and therefore explored other options as well.

It didn't take long before the evidence started pointing in a different direction. First, the star started to brighten again, which would not happen if it were about to go kablooie. Betelgeuse, it seems, is not in its final stage, ready to collapse and explode, and likely still has tens of thousands of years of life left. The dimming was likely due to a large dust cloud. But that would be unprecedented, and astronomers had no ready explanation for why a dust cloud would suddenly obscure such a large star to such a degree. Now a new study may have solved the mystery, thanks to high-resolution observations of the star. They report:

Here we report high-angular-resolution observations showing that the southern hemisphere of Betelgeuse was ten times darker than usual in the visible spectrum during its Great Dimming. Observations and modelling support a scenario in which a dust clump formed recently in the vicinity of the star, owing to a local temperature decrease in a cool patch that appeared on the photosphere.

Continue Reading »

No responses yet

May 28 2021

New Dark Matter Map Mystery

Published by under Astronomy

Scientists have published the most extensive map of dark matter in the universe to date, based on a survey of 100 million galaxies. The findings don’t quite match with predictions made by computer models, suggesting that there is some physics at work which scientists do not yet understand. This, of course, is exciting for physicists.

As I discussed previously, we don’t know what dark matter is, but we are pretty confident it’s there. Dark matter does not give off any radiation, but it does have gravity, so we can see its gravitational effects. Based on these observations it seems that 80% of the matter in the universe is dark matter. This is a major area of research, because we do not know what dark matter is made of. It is probably some new particle we have not identified so far. This is where scientists live – on the edge of our current knowledge, peering into the unknown.

Part of that "peering" is gathering lots of data, and that is what the current study does. They used gravitational lensing to map the mass of the universe, 80% of which is dark matter. Visible galaxies and dark matter cluster together, creating an overall structure to the universe. There are vast black voids with nothing, and there are tendrils of matter with galaxies, gas, and stars. The goal is to map this distribution, to see where all the stuff in the universe is.

They then compared this map to what we would predict based on our current understanding of the laws of physics. They started with a map of where all the matter was 350,000 years after the Big Bang, created by examining the cosmic microwave background radiation. Then they modeled where that matter should have gone over the last 13.8 billion years based upon relativity and other physical laws. The map and the model were off by a few percent – the universe is more evenly distributed than the models predict. This may not sound like a lot, but physicists are used to dealing with high levels of precision. Physical laws tend to be very reliable. This is why we can make calculations and send a probe to Pluto 5 billion km away and have it arrive precisely where we predicted. If the New Horizons probe had been off course by a few percent – and a few percent of 5 billion km is on the order of 100 million km – that would have been a disaster, both for the mission and for our understanding of the relevant laws of physics.

This is why physicists love discrepancies between predicted and observed phenomena, even tiny ones. It means something is going on that we are not aware of. This could be an effect we have not considered, an error in the experimental design or method of observation, or, occasionally, something that requires a tweak to our understanding of the laws of physics. The first two need to be thoroughly ruled out before new physics can be confidently postulated, and it is an increasingly rare event, but that is what physicists live for.

Continue Reading »

No responses yet

Apr 19 2021

SpaceX Awarded Lunar Lander Contract

Published by under Astronomy,Technology

I've been watching For All Mankind – a very interesting series that imagines an alternate history in which the Soviets beat the US to landing on the Moon, triggering an extended space race that puts us decades ahead of where we are now. By the 1980s we had a permanent lunar base and a reusable lunar lander, not to mention spacecraft with nuclear engines. Meanwhile, back in reality, we are approaching 50 years since any human has set foot on the Moon.

But NASA does plan on returning to the Moon, and staying there this time, with its Artemis program. (In Greek mythology Artemis was the twin sister of Apollo.) Originally NASA planned to return to the Moon by 2028; then Trump asked them to move up the timeline to 2024. NASA dutifully complied, but this was never realistic, and anyone who has been following Artemis knew it was not going to happen. Now NASA is admitting they will not be ready by 2024. But sometime in the latter half of this decade, we will likely return to the Moon.

One of the last pieces to put into place is a lunar lander – something to get people from lunar orbit down to the surface of the Moon. NASA has finally awarded the contract to build this lander, to SpaceX. They are making no secret of the reason: SpaceX gave the lowest bid, by far. This is partly because the entire mission of SpaceX is to make space travel cheaper, mainly by using as many reusable parts as possible. Toward this end they perfected the technology for landing rockets vertically – the videos of Falcon rockets landing after launching satellites are still stunning. SpaceX also achieved a rating for their Crew Dragon capsule to actually carry people into space, and they have delivered astronauts to the ISS. Finally, SpaceX has already been developing their Starship design, which will be the basis of the new lander, which NASA is calling the Human Landing System (HLS).

Interestingly, a recent independent analysis found that the most efficient landing system using non-reusable parts (looking only at efficiency) was the Apollo system – a two-stage approach with a landing module and an ascent module. However, if you use a reusable lander, then the one-stage approach makes the most sense. That is in keeping with SpaceX's philosophy, so it's not surprising that they are taking that approach. I do wonder whether they are going to use an actual Starship outfitted for lunar landing, or make a new, smaller version. If the former, then it seems a bit odd that the HLS part of the system is a ship capable (theoretically) of doing the entire mission, from the Earth's surface to the lunar surface. That is Musk's vision: single stage to destination, for maximal reusability.

Continue Reading »

No responses yet

Mar 29 2021

How Confident Are We That Dark Matter Is Real?

Published by under Astronomy

In the 1970s astronomer Vera Rubin was observing the Andromeda galaxy and discovered something very curious. Andromeda is a spiral galaxy, like our own, and spins like a pinwheel. The "spinning" consists of all the individual stars (and gas and dust, but the stars are what we can see) revolving around all the mass within their orbits. If you run the numbers, as stars get farther away from the galactic center they should revolve more slowly. The relationship between distance from the galactic center and each star's velocity is called the galactic rotation curve. Rubin and others predicted the curve should steadily decline with distance (after an initial rise reflecting the increasing enclosed mass near the galactic center), since once most of a galaxy's mass lies inside a star's orbit, orbital speed should fall off as the inverse square root of distance.

What Rubin found, however, was that the rotation curve of Andromeda increased at first, as expected, but then did not decrease with distance – it remained largely flat. This difference between prediction and observation was a genuine anomaly and required an explanation. The results were verified with other large spiral galaxies, and yes, they all have flat rotation curves. The stars on the outskirts of these galaxies, according to Newton, should be flying away. They are moving too fast to be held by the gravity of the galaxies they orbit.

Unless…

Perhaps there is more mass in the galaxy than we can observe – matter that is not giving off light like stars, and not reflecting or glowing from the light of stars like gas clouds. There must be matter we cannot see: some dark matter. How much dark matter would it take to explain the observed rotation curves? Quite a lot – about six times the mass of the stuff we can see. If true, then some 84% of the matter in the universe is dark matter. And we do not know what dark matter is – we don't know what most of the universe is made of.
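To see how stark the mismatch is, here is a minimal Python sketch (my illustration, using round placeholder numbers, not figures from Rubin's data). The Keplerian prediction v = √(GM/r) falls off with distance, while holding the curve flat at a typical ~220 km/s requires the enclosed mass M(<r) = v²r/G to grow linearly with radius, far beyond what the visible stars supply:

```python
import math

# Contrast the predicted (Keplerian) rotation curve with the enclosed
# mass a flat curve demands. Round illustrative numbers only.
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30          # solar mass, kg
KPC = 3.086e19            # one kiloparsec, m

M_visible = 1e11 * M_SUN  # rough visible mass of a large spiral galaxy
V_FLAT = 220e3            # m/s, a typical observed flat rotation speed

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    v_kepler = math.sqrt(G * M_visible / r)  # falls as 1/sqrt(r)
    m_needed = V_FLAT**2 * r / G             # M(<r) required for a flat curve
    print(f"r = {r_kpc:2d} kpc: predicted v = {v_kepler / 1e3:3.0f} km/s, "
          f"M(<r) for a flat curve = {m_needed / M_SUN:.1e} M_sun")
```

By 40 kpc the flat curve requires several times the visible mass enclosed within the orbit, which is the gap dark matter is invoked to fill.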

Continue Reading »

No responses yet

Mar 23 2021

Removing Space Debris

Published by under Astronomy,Technology

Right now there are about 3,000 active satellites in Earth orbit. About 1,000 of those are part of the Starlink project to provide internet access everywhere on the planet, with a planned 42,000 total when complete – a massive increase in the number of active satellites. At the same time there are another 3,000 defunct satellites that are no longer operational but remain in orbit. There are about 9,000 tonnes of total orbital debris, and we are tracking 30,000 objects 10 cm or larger. But estimates are that there are millions of smaller objects in orbit.

In other words – usable Earth orbit is becoming crowded and hazardous. This is a risk to operational satellites, space stations, and any spacecraft hoping to get off Earth. Much of this debris is moving very fast relative to other objects with intersecting orbits; a lost bolt could destroy a satellite or punch a hole in the International Space Station (ISS). There is a concern that a serious collision, say between two satellites, would generate enough debris to cause a cascading event of further collisions (the so-called Kessler syndrome).

There are now international agreements that make states responsible for anything they put into orbit for its lifetime. Companies and nations are supposed to arrange for the deorbiting of anything they put into orbit within 25 years of the end of its functional lifetime. However, the agreements have little teeth and compliance is low. This is just another example of allowing entities to externalize the costs of their own waste or downstream effects. It is also another example of the assumption that natural resources are so gigantic we don't have to worry about sustaining them. Space is really big, so who cares if we leave a lot of junk up there? Well, it took only a few decades for us to clutter low Earth orbit with enough debris to be a serious hazard.

Continue Reading »

No responses yet

Mar 18 2021

Oumuamua Explained

In 2017 astronomers spotted a very unusual object approaching Earth. What was most unusual about it was that it was on a trajectory that would take it out of the solar system. Given its path it could only have come from outside the solar system – our first ever discovered extrasolar visitor, named Oumuamua. For an extrasolar object, it came improbably close to the Earth and the Sun, which gave us a great opportunity to take a close look at it. And then, as it passed by the Sun and headed out of the solar system, it became even more unusual. First, we could see that it was a very elongated, flat object, not typical for a comet or asteroid. Second, it accelerated as it moved away from the Sun, as a comet would from the sublimation of ice into gas acting like a rocket. But we could not see a comet-like tail, and the albedo was off. Curiouser and curiouser.

This led some to speculate wildly that Oumuamua may be an alien artifact – most famously Avi Loeb, a Harvard scientist who has now even published a book, Extraterrestrial: The First Sign of Intelligent Life Beyond Earth. This is a clear case of the "aliens of the gap" fallacy – any astronomical phenomenon we do not currently fully understand must be evidence of alien technology. Of course, all natural explanations must first be excluded. But even then, we don't have aliens; we have an unknown phenomenon that needs further exploration.

Oumuamua is now yet another great case in point. Two Arizona State University astrophysicists, Steven Desch and Alan Jackson, have come up with a plausible explanation for Oumuamua's funky properties. Perhaps, they hypothesized, our attempts so far to explain the object's behavior and properties failed because we were making false assumptions about what kind of ice it might contain. We assumed it would have a profile of ice similar to the comets we know. But what if the ice is made of something else, because Oumuamua is not a typical comet? When they looked at the properties of nitrogen ice – bingo. It nicely fits the data, including the combination of the rate of acceleration from ice sublimation near the Sun and the low albedo – not as much reflective ice would have been necessary to cause the acceleration.

Continue Reading »

No responses yet

Mar 12 2021

Planet with Secondary Atmosphere

Published by under Astronomy

The discovery and exploration of exoplanets over the last three decades has been an exciting addition to astronomy. In 1990 we knew of no planets outside our solar system; now there are more than 4,000 confirmed exoplanets, and thousands more candidates awaiting confirmation. This is still just a tiny sample of the planets even in our small corner of the galaxy. One of the questions going into this enterprise was: how typical are the planets we know in our own system, and how typical is our system in terms of the number and arrangement of planets? So far the answer seems to be that there is no typical. We are finding all kinds of planets in all kinds of arrangements.

We can now add, potentially, one additional planetary phenomenon to the list – a planet with an apparent secondary atmosphere. The planet is GJ 1132 b, 41 light years away (in the naming convention, the star itself has the designation "a", so "b" is the first planet discovered in the system). The star is a red dwarf, and the planet is very close to it – so close that it is tidally locked, meaning the same side faces the star at all times. Its year is only a day and a half. Planets this close to their parent stars tend to have their atmospheres stripped by the heat and the stellar wind from the star.
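As a sanity check on how close that is, Kepler's third law, a³ = GMP²/4π², gives the orbital distance from the period. Here is a minimal Python sketch (the ~0.18 solar mass stellar value is my assumed ballpark for a small red dwarf, not a figure from the post):

```python
import math

# Kepler's third law: a^3 = G * M * P^2 / (4 * pi^2).
# The stellar mass is an assumed ballpark for a small red dwarf,
# not a measured value quoted in the post.
G = 6.674e-11           # m^3 kg^-1 s^-2
M_SUN = 1.989e30        # kg
AU = 1.496e11           # m

M_star = 0.18 * M_SUN   # assumed red dwarf mass
P = 1.5 * 86400         # orbital period: a day and a half, in seconds

a = (G * M_star * P**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"orbital distance ~ {a / AU:.3f} AU")  # ~0.015 AU, ~4% of Mercury's distance
```

Under these assumptions the planet orbits at roughly 0.015 AU – about 4% of Mercury's distance from the Sun – which is why atmosphere stripping is the expected outcome.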

Astronomers believe that GJ 1132 b was once a sub-Neptune – a planet with a rocky core about the size of Earth but a thick hydrogen-helium atmosphere making it a gas giant. Soon after formation, however, that atmosphere would have been blown away, leaving behind the rocky core. So astronomers expected to see little to no atmosphere on GJ 1132 b; instead they found evidence of an atmosphere about as thick as Earth's. This and other evidence has led them to believe this is a secondary atmosphere.

Stars tend to be hotter when they are very young and then cool down a bit. Red dwarfs change even more. They are unstable when young, giving off lots of coronal mass ejections, constantly blasting any nearby planets. This does not bode well for the prospect of life on any such planets. Unfortunately, any planet in the habitable zone is also in this blast radius and would have its atmosphere stripped. Red dwarfs are the most common star type in the galaxy, making up 70% of the stars, so this has implications for the probability of life in the galaxy.

Continue Reading »

No responses yet

Mar 11 2021

Technosignatures

Published by under Astronomy

Recently experts gathered online for a digital conference in which they discussed possibilities for detecting signs of alien technological civilizations – so-called "technosignatures". Being an enthusiast, I had heard of many of these before, but there were a lot of new ideas coming out of that meeting as well. Here is the preprint, with all the technical information. I think collectively this is a compelling case for NASA and other agencies to include the search for technosignatures as part of their mission.

The most obvious technosignature is radio signals, most likely deliberately broadcast, either out into the universe or even directed at Earth (if they have detected our own technosignatures). This is the object of SETI (the Search for Extraterrestrial Intelligence). NASA briefly funded a SETI project but then pulled the funding. The effort continues, however, with other funding. I have interviewed Seth Shostak from SETI several times, and he makes a couple of points worth emphasizing here. One is that SETI is simultaneously doing a lot of non-SETI astronomy. They are essentially doing radio astronomy, but analyzing the data in such a way that it could also detect an alien signal.

The members of the recent meeting also considered what they call “synergies” or ancillary benefits for each of the techniques they discuss. This is critical, I think, because it means even if we never detect an alien technosignature, the effort would not have been wasted. We will have accomplished a lot of meaningful astronomy in the meantime. I would argue it wouldn’t be wasted in any case – negative results from an experiment are still results. We would have gathered lots of data about how rare technological civilizations are in the universe. But if you look at the table of discussed techniques in the paper, each one has a listed potential synergy. In fact, they also list existing astronomical data that can be searched for technosignatures.

Seth also pointed out that if we did detect an alien message hiding in radio signals, we would very likely not be able to read it. Radio signals get weaker with distance (as the inverse square of distance, for an isotropic broadcast), and at some point they are lost in the background radio noise – and actually decoding the information in a message requires a much stronger signal than merely detecting that the signal exists. The more powerful the initial radio signal, the greater its range. The current authors call this the "cosmic footprint" of each technosignature and coin the term "ichnoscale" to measure it. For example, they give radio signals a 10 kpc (kiloparsec) ichnoscale for detecting alien signals.
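The inverse-square falloff behind this point is easy to make concrete. Here is a minimal Python sketch (standard physics, not the paper's ichnoscale formula; the transmitter power and the printed values are purely illustrative):

```python
import math

# Received flux from an isotropic transmitter falls as S = P / (4 * pi * d^2).
# The transmitter power is a hypothetical figure; this is just the underlying
# inverse-square behavior, not the paper's ichnoscale calculation.
KPC = 3.086e19   # one kiloparsec, m

def received_flux(power_watts: float, distance_m: float) -> float:
    """Flux in W/m^2 at the given distance from an isotropic broadcast."""
    return power_watts / (4 * math.pi * distance_m**2)

P_TX = 1e12  # assumed 1 TW transmitter (hypothetical)
for d_kpc in (0.1, 1.0, 10.0):
    flux = received_flux(P_TX, d_kpc * KPC)
    print(f"{d_kpc:5.1f} kpc: flux = {flux:.2e} W/m^2")

# Every factor of 10 in distance costs a factor of 100 in flux, which is why
# decoding a message is so much harder than merely detecting a carrier.
```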

Continue Reading »

No responses yet

Next »