Archive for November, 2021

Nov 30 2021

Self-Replicating Xenobots

Published by under Skepticism

Placing “self-replicating” and any kind of “bots” in the same sentence immediately raises red flags, conjuring images of the world’s surface being reduced to gray goo. But that is not a concern here, for reasons that will become clear. There is a lot to unpack, so let’s start with what xenobots are. They are biological machines, little “robots” assembled from living cells. In this case the source cells are embryonic pluripotent stem cells taken from the frog species Xenopus laevis. Researchers at the Allen Discovery Center at Tufts University have been experimenting with assembling these cells into functional biological machines, and have now added self-replication to their list of abilities.

Further, these xenobots replicate in a unique way, by what is known as kinematic self-replication. This is the first instance of this type of replication at the cell or organism level. The researchers point out that life has many ways of replicating itself: “fission, budding, fragmentation, spore formation, vegetative propagation, parthenogenesis, sexual reproduction, hermaphroditism, and viral propagation.” However, all these forms of self-replication have one thing in common – they happen through growth within or on the organism itself. By contrast, kinematic self-replication occurs entirely outside the organism itself, through the assemblage of external source material.

This process has been known at the molecular level, where molecules (like proteins) can guide the assembly of identical molecules from external resources. Until now, however, it had never been observed at the cellular level or above.

In the case of xenobots, the researchers placed them in an environment containing many individual stem cells. The xenobots spontaneously gathered these stem cells into copies of themselves. However, these copies were not able to replicate in turn, so the process ended after one or a very limited number of generations. In a new study, the researchers set out to design an optimal xenobot that could sustain many generations of self-replication. They did not do this the old-fashioned way, through extensive trial and error. Rather, they used an AI simulation, which ran for literally months, testing billions of possible configurations. It came up with a simple shape – a sphere with a mouth, looking remarkably like Pac-Man. These xenobots are composed of about 3,000 cells. The researchers assembled their xenobot Pac-Men, and when placed in an environment with available stem cells, the xenobots spontaneously herded the loose cells into spheres and then into copies of themselves. These copies were also able to make more copies, and so on for many generations.
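
To give a sense of how such a design search works: the AI was, in broad strokes, an evolutionary search run inside a simulator, repeatedly mutating candidate body shapes and keeping whichever replicated best. Here is a minimal sketch of that loop in Python. The grid representation, the mutation step, and the scoring function are invented stand-ins for the study’s detailed physics simulation, which modeled cell behavior directly.

    import random

    # A candidate xenobot body: a small 2D grid of cells (1) and empty space (0).
    GRID = 5

    def random_shape():
        return [[random.randint(0, 1) for _ in range(GRID)] for _ in range(GRID)]

    def mutate(shape):
        # Flip one randomly chosen voxel: add or remove a cell.
        child = [row[:] for row in shape]
        i, j = random.randrange(GRID), random.randrange(GRID)
        child[i][j] = 1 - child[i][j]
        return child

    def simulate_replication(shape):
        # Stand-in fitness function. The real study scored each shape by
        # simulating it and counting generations of copies produced. This
        # toy proxy just rewards mostly solid shapes with an empty "mouth".
        filled = sum(map(sum, shape))
        mouth = 1 - shape[GRID // 2][GRID - 1]  # empty notch on one side
        return filled + 5 * mouth

    # The evolutionary loop: mutate, keep the better design, repeat.
    best = random_shape()
    for generation in range(10_000):
        challenger = mutate(best)
        if simulate_replication(challenger) >= simulate_replication(best):
            best = challenger

    print("best score:", simulate_replication(best))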

Continue Reading »

No responses yet

Nov 29 2021

Get Ready for Omicron

Experts knew, and had been warning, that delta was not going to be the last Greek letter to sweep across the world. The World Health Organization (WHO) tracks variants of SARS-CoV-2, the virus that causes COVID-19. They track variants of interest (VOI), which have been identified as potentially problematic, and variants of concern (VOC), which have been demonstrated to have increased infectivity, increased illness severity, and/or evasion of preventive measures (such as vaccines or masks). These variants are given a Greek letter designation as they are added to the list. What is now called the omicron variant has been added to the list of VOCs. Here’s what we know so far.

The variant appears to have originated in South Africa. Fortunately, South Africa has a robust surveillance system and labs that can grow the virus and perform whole-genome sequencing. They were therefore able to identify the variant quickly and share their information with the world. This isn’t the first variant to originate in South Africa, which raises the question of why that is. Increased surveillance may be part of the answer, but cannot fully explain it. Some scientists speculate that South Africa’s large population of HIV-infected and inadequately treated people provides a fertile breeding ground for new variants.

Variants are caused by mutations in the virus genome, some of which may alter proteins and therefore viral functions. SARS-CoV-2 does not have a particularly high mutation rate, but because of the worldwide pandemic there are lots of opportunities for new mutations to occur. It’s possible that when a person has a prolonged infection, the viruses in their system are under selective pressure, so any mutation that might partly evade the immune system will be favored. Those with untreated HIV have an impaired immune response. This may be just enough to provide some selective pressure but not enough to fight off the infection, creating a breeding ground for new variants.

Continue Reading »

No responses yet

Nov 23 2021

DART Asteroid Deflection Mission Ready for Launch

Published by under Astronomy

Why is NASA planning on deliberately crashing a spacecraft into a small asteroid that poses no threat to the Earth? It’s a test of an asteroid deflection system – DART (Double Asteroid Redirection Test). Why the “double”? Most articles on the topic don’t say, and I had two hypotheses. The first is that the mission is targeting two asteroids, or actually a binary asteroid, Didymos (Greek for “twin”). The system has a primary asteroid that’s 780 meters across, and a smaller secondary asteroid 160 meters across that orbits the primary and is therefore called a “moonlet”. The second is that the mission was originally supposed to be one of a pair, with the ESA sending their AIM probe to orbit and monitor Didymos during the DART mission. The ESA cancelled AIM, however, and Didymos will instead be monitored by ground telescopes. It turns out the “double” refers to the twin asteroids.

In any case, the purpose of the mission is to test an asteroid defense system known as a kinetic impactor. The course of an asteroid can be altered by ramming something into it very fast. At first this seems like a crude method, but sometimes simple is best. The mission is run by NASA’s Planetary Defense Coordination Office. The European Space Agency (ESA) is also engaged in planetary defense, although its cancelling of AIM was disappointing. There are also international meetings on planetary defense, with calls for the US, Russia, and China to work together on this project. Russia, for its part, has proposed repurposing old ICBMs as asteroid busters. This would not be a kinetic impactor, but would actually use nuclear weapons to blow up asteroids.

The DART mission is the first real test of an asteroid defense system. The spacecraft uses electric propulsion powered by solar panels, and will be traveling 6.6 km/s when it impacts the smaller Didymos asteroid. The impact will change the moonlet’s speed by less than one percent, but that will be enough to shift its orbital period around the larger asteroid by several minutes, a change that can be measured from Earth. The craft is scheduled to launch tonight, November 23rd, at 10:21 pm PST aboard a SpaceX Falcon 9 rocket. It will intercept the Didymos system in late September 2022.
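
A back-of-the-envelope momentum calculation shows why such a small craft can measurably move a 160-meter asteroid. This is a rough sketch with assumed round numbers: the spacecraft and moonlet masses below are approximations, and the factor beta (extra push from ejecta blasted off the surface) is one of the things the mission itself is meant to measure.

    # Hedged back-of-the-envelope for the DART impact (all inputs approximate).
    m_craft = 600.0          # spacecraft mass at impact, kg (assumed)
    v_craft = 6.6e3          # impact speed, m/s (from the mission profile)
    m_moonlet = 5.0e9        # moonlet mass, kg (rough published estimate)
    beta = 1.0               # momentum enhancement from ejecta; >1 in practice

    # Conservation of momentum: the moonlet gains the craft's momentum (times beta).
    delta_v = beta * m_craft * v_craft / m_moonlet
    print(f"delta-v ~ {delta_v * 1000:.2f} mm/s")   # about 0.8 mm/s

Against the moonlet’s roughly 17 cm/s orbital speed, that is a change of well under one percent, yet accumulated over every 12-hour orbit it adds up to a period shift of minutes.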

Continue Reading »

No responses yet

Nov 22 2021

The Efficiency of Data Storage

Published by under Technology

As our world becomes increasingly digital, math becomes more and more important (not that it wasn’t always important). Even in ancient times, math was a critical technology improving our ability to predict the seasons, design buildings and roads, and have a functioning economy. In recent decades our world has been becoming increasingly virtual and digital, run by mathematical algorithms, simulations, and digital representations. We are increasingly building our world using methods that are driven by computers, and the clear trend in technology is toward a greater meshing of the virtual with the physical. One possible future destination of this trend is programmable matter, in which the physical world literally becomes a manifestation of a digital creation.

What this means is that the impact of even tiny incremental improvements in the efficiency of the underlying technology, computers, has increasingly powerful reverberations throughout our economy and our world. The nerds have truly inherited the Earth. This is why it is interesting science news that computer scientists at MIT have developed a tweak that may improve the efficiency with which computers store and retrieve data. William Kuszmaul and his team have demonstrated a way to improve what are known as linear-probing hash tables. The underlying concept is interesting, at least for those curious about how increasingly ubiquitous computer technology works.

Hash tables were developed in 1954 as a way for computers to store and locate data. When given a piece of data x to store, the computer calculates a hash function h(x), which generates an essentially random slot number, say from 1 to 10,000. The computer then goes to that location in the sequential data array and stores the data there. If that location is already occupied, it probes forward until it finds an open slot and puts the data there. Retrieval works the same way: the computer goes to the assigned location, and if the data is not there it probes forward until it finds it. If it encounters an open slot first, it concludes the data is not in the table (perhaps it was deleted).
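
Here is a minimal sketch of that scheme in Python. The fixed table size and the use of Python’s built-in hash() are simplifications; real implementations also resize the array and handle deletions with “tombstone” markers, and managing those efficiently is where the recent improvements come in.

    class LinearProbingTable:
        """Toy hash table using linear probing, as described above."""

        EMPTY = None

        def __init__(self, size=10_000):
            self.slots = [self.EMPTY] * size
            self.size = size

        def _home(self, key):
            # h(x): map the key to an essentially random slot index.
            return hash(key) % self.size

        def insert(self, key):
            # (Assumes the table never completely fills up.)
            i = self._home(key)
            while self.slots[i] is not self.EMPTY:
                i = (i + 1) % self.size   # probe forward to the next slot
            self.slots[i] = key

        def contains(self, key):
            i = self._home(key)
            while self.slots[i] is not self.EMPTY:
                if self.slots[i] == key:
                    return True
                i = (i + 1) % self.size   # keep probing forward
            # Hit an open slot first: the key is not stored here.
            return False

    table = LinearProbingTable()
    table.insert("spam")
    print(table.contains("spam"))   # True
    print(table.contains("eggs"))   # False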

Continue Reading »

No responses yet

Nov 16 2021

Russia Shoots Down Satellite

Published by under Astronomy

In the movie Gravity (one of my favorite movies, highly recommended), the Russians shoot down one of their own satellites in order to test their anti-satellite system. The debris from this satellite crashes into other satellites, causing a cascade of debris that travels around the Earth, eventually striking the ISS and a space shuttle in low Earth orbit. I have to point out that the orbital mechanics in the movie are terrible. One big problem is that objects in the same orbit are moving at the same velocity, by definition, so the debris would not have been flying by so fast. But putting all that aside, the core concept – that space debris is a huge problem, and that blowing up satellites in orbit is a horrifically bad idea – is valid.
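
The physics behind that complaint is straightforward: for a circular orbit, speed is fixed by orbital radius alone, v = sqrt(GM/r), so two objects sharing an orbit share a speed. A quick illustration in Python (the ISS altitude is a round number for the example):

    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.972e24   # Earth mass, kg
    R_EARTH = 6.371e6    # Earth radius, m

    def circular_orbit_speed(altitude_km):
        r = R_EARTH + altitude_km * 1e3
        return math.sqrt(G * M_EARTH / r)

    # Anything in a circular orbit at ISS altitude (~420 km) moves at the
    # same ~7.7 km/s, so debris in the same orbit isn't whizzing past you.
    print(f"{circular_orbit_speed(420) / 1000:.2f} km/s")

Debris is dangerous precisely because real debris clouds end up in crossing orbits with different inclinations, where relative speeds can be enormous; what the movie gets wrong is debris overtaking objects in its own orbit.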

Which is why it is head-scratching that, 8 years after Gravity came out, Russia would blow up one of its own satellites in orbit to test its anti-satellite system. Didn’t anyone in Russia see this movie? More seriously, they should know that this is a terrible idea, contributing significantly to the problem of space debris. The US and other space-faring nations are not happy. A State Department release said:

“The test has so far generated over 1,500 pieces of trackable orbital debris and hundreds of thousands of pieces of smaller orbital debris that now threaten the interests of all nations.”

The astronauts aboard the ISS had to shelter in their capsules for safety as a result of the debris. Our goal is to reduce space debris, not significantly increase it. Russia is not the first country to do this. In 2007 China destroyed one of its defunct weather satellites, producing more than 2,000 pieces of trackable debris. After nearly 65 years of putting satellites into orbit, there are now over a million pieces of debris between 1 and 10 cm orbiting the Earth, and NASA is tracking 27,000 pieces of larger debris. While space may seem big, low Earth orbit is finite and valuable real estate. Having more than a million pieces of debris flying around is a significant risk. They can damage satellites and threaten crewed platforms such as the ISS; in fact, the ISS frequently has to adjust its orbit to avoid tracked debris.

Continue Reading »

No responses yet

Nov 15 2021

What Came Out of COP26

Published by under General Science

The world met in Glasgow at the COP26 summit to see if politicians could put their heads together and work out a deal to limit global warming. The outcome was as much of a mess as you probably think it was. I suspect this is because the entire approach to the problem was flawed. That doesn’t mean it was hopeless or fruitless, just that it was highly problematic, and yielded highly problematic outcomes.

As an analogy, the psychological literature indicates that the best way to achieve a goal is not to focus on the goal itself but on the steps needed to achieve it. Richard Wiseman points this out in his book, 59 Seconds. Imagining yourself having achieved your goal is not helpful, and may even be counterproductive. Rather, you should outline the precise steps you need to take in order to achieve your goal. Chart a path, don’t just indicate your destination. I might humbly suggest that our world leaders take this advice when they approach the problem of climate change.

As far as I can tell from all the reporting on COP26 (and the Paris agreement, and other climate agreements), the focus is primarily on the goal. We want to limit warming to 1.5 C above pre-industrial levels. That’s a great goal – now how are we going to achieve it? This latest agreement goes a bit further, laying out several actual steps that are critical to achieving the temperature goal. One is to “phase down” coal. At the last minute India objected to language agreeing to “phase out” coal, and watering the language down was necessary to get an agreement. There was also an agreement to reduce deforestation, an admirable goal that is vital to tackling climate change.

While this language is one step closer to laying out actual policy changes, it is still not quite there. Politicians are straining to portray the deal in a positive light, preferring the word “progress” over all others. But few are willing to use the word “success”, which is an indication that many think the conference was essentially a failure. I think the failure was baked into the process, because you can’t just tell a country like India, which is banking on coal to develop its economy and reduce poverty, to phase out coal.

Continue Reading »

No responses yet

Nov 11 2021

Current Warming Unprecedented

Published by under Pseudoscience

While the world debates how best to reverse the trend of anthropogenic global warming (AGW), scientists continue to refine their data on historical global temperatures. A recent study published in Nature adds a high-resolution picture of average surface temperatures over the last 24,000 years, since the last glacial maximum. The study reinforces the conclusion that the last century of warming is unprecedented over this time frame, and reflects not any natural cycle but rather the effects of human forcing.

To reconstruct past temperatures, the researchers combined two methods. They used a dataset of chemical analyses of marine sediments, which are affected by local average temperatures, and combined it with a dataset based on computer-simulated climate models. The idea was to leverage the strengths of each approach to arrive at a map of historical surface temperatures more accurate than either method could produce alone.
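
The general statistical idea behind merging two imperfect data sources is to weight each by its reliability. As a toy illustration only (the study’s actual assimilation method is far more sophisticated, and these numbers are invented), here is how two independent, noisy temperature estimates can be combined by inverse-variance weighting so that the more certain source counts for more:

    # Toy illustration of combining two noisy estimates of the same quantity.

    def combine(est_a, var_a, est_b, var_b):
        # Inverse-variance weighting: the more certain estimate gets more weight.
        w_a, w_b = 1 / var_a, 1 / var_b
        estimate = (w_a * est_a + w_b * est_b) / (w_a + w_b)
        variance = 1 / (w_a + w_b)
        return estimate, variance

    # Hypothetical inputs: proxy says 14.2 +/- 0.8 C; model says 13.6 +/- 0.4 C.
    t, v = combine(14.2, 0.8**2, 13.6, 0.4**2)
    print(f"combined: {t:.2f} C, sigma = {v**0.5:.2f} C")

Note that the combined uncertainty is smaller than either input’s, which is the sense in which the merged reconstruction can beat both methods alone.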

Of course, no one study is ever the final word, but this reconstruction is in line with other research using independent methods and data. The authors also draw two other main conclusions. There has been a debate about whether the last 10,000 years included a small warming trend, and this reconstruction supports the existence of such a trend. Further, the authors conclude that the main driver of the large warming trend starting around 17,000 years ago was the retreat of the glacial ice sheets, but that the main driver of the rapid warming over the last 150 years is increasing greenhouse gases. The rate of this recent warming is also out of proportion to any natural cycle detected in the last 24,000 years.

Continue Reading »

No responses yet

Nov 09 2021

Brain Stimulation for Cognitive Control

Published by under Neuroscience

A newly published study presents a proof of concept for using deep brain stimulation, controlled by artificial intelligence (AI) in a closed-loop system, to enhance cognitive control, suggesting it might be effective for a number of mental illnesses. That’s a lot to unpack, so let’s go back to the beginning. The most fundamental concept necessary to understand what is going on here is that your brain is a machine. It’s a really complicated machine, but it’s a machine nonetheless, and we can alter the function of that machine by altering its physical state.

This may seem obvious, but actually people are generally psychologically biased against this view. This may, in fact, be a consequence of brain function itself, which evolved to create a seamless stream of consciousness, an illusion of self unaware of all the subconscious processes that make up brain function. This is why we tend to interpret people’s behavior in terms of personality and conscious choice, when in fact much of our behavior is a consequence of subconscious processes. We are also biased to believe that people can think or will-power their way out of mental illness.

The more we understand about how the brain functions, however, the more it becomes apparent that the brain is a glitchy machine, and lots can go wrong. Even when it is functioning within healthy parameters, there are many trade-offs in brain function, with strengths often coming at the price of weaknesses. We need to look out for our own interests, for example, but this capacity comes at the price of anxiety and paranoia. But some brain functions are so basic that they are almost universally useful, and impairment of them can cause a host of problems. One such basic function is cognitive control, which is essentially the ability to determine which thoughts and actions will be the focus of your brain’s attention.
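
In a closed-loop system of this kind, the logic is simple even if the neuroscience isn’t: read neural activity, have a decoder estimate whether cognitive control is flagging, and stimulate only when it is. Here is a minimal sketch of that loop in Python; every function name, the random placeholder signals, and the threshold are invented for illustration (the study’s actual decoder was trained on each patient’s own recordings).

    import random
    import time

    # --- Hypothetical stand-ins for the hardware and decoder interfaces ---

    def read_neural_features():
        # Placeholder for features extracted from implanted electrodes.
        return [random.random() for _ in range(8)]

    def decode_control_level(features):
        # Placeholder decoder; the real system maps neural features to a
        # per-patient estimate of cognitive control.
        return sum(features) / len(features)

    def deliver_stimulation():
        print("stimulating...")

    # --- The closed loop: sense, decode, stimulate only on lapses ---

    THRESHOLD = 0.4   # invented; tuned per patient in practice

    for _ in range(5):                      # a few iterations for the demo
        level = decode_control_level(read_neural_features())
        if level < THRESHOLD:               # control appears to be flagging
            deliver_stimulation()
        time.sleep(0.1)                     # run the loop at a fixed cadence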

Continue Reading »

No responses yet

Nov 08 2021

Hypervelocity Dust Impacts

Published by under Astronomy,Technology

Space is an incredibly hostile environment, and the more we study it the more we learn about the challenges of living and traveling there. Apart from the obvious near-vacuum and near-absolute-zero temperatures, space is full of harmful radiation. We live comfortably beneath a blanket of protective atmosphere and a magnetic shield, but in space we are exposed.

Traveling through space adds another element – not only would radiation be passing through us, but the faster our ship is traveling, the more stuff we would be plowing through. Space is not empty; it is full of gas and dust. In our own solar system, most of the dust is confined to the plane of the ecliptic, in what’s called the zodiacal cloud. But of course, if we are traveling from one planet to another, that is the plane we would be traveling in. At interplanetary velocities, assuming we want to get to our destination quickly (which we do, to minimize exposure to all that radiation), our craft would be plowing through the zodiacal cloud.

We now have some measurements from the Parker Solar Probe regarding the effects of dust impacts at high velocity. The Parker probe is the fastest human-made object, traveling at up to 180 kilometers per second. It is also the closest probe ever to the Sun and the one able to operate at the highest temperature. To accomplish this it must keep its heat shield oriented toward the Sun. Meanwhile it is encountering thousands of dust particles, tiny grains between 2 and 20 microns in diameter (less than that standard measure of all things tiny, the width of a human hair). Dust grains strike the probe at hypervelocity, greater than 10,800 km per hour. When they hit, they are instantly heated and vaporized, along with a small portion of the surface of the probe. The resulting cloud of debris is also hot enough to become ionized, turning into a plasma. Smaller grains are entirely vaporized in less than a thousandth of a second. Larger grains also give off a cloud of debris that expands away from the craft.
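
To see why such tiny grains are so destructive, consider the kinetic energy involved: per kilogram, it rivals chemical explosives. A rough calculation, assuming a 20-micron silicate grain at the quoted hypervelocity threshold (the grain density is an assumption, and actual impact speeds on the probe run higher, so this is conservative):

    import math

    v = 10_800 / 3.6          # 10,800 km/h converted to 3,000 m/s
    radius = 10e-6            # 20-micron-diameter grain, m
    density = 3000.0          # typical silicate, kg/m^3 (assumed)

    mass = density * (4 / 3) * math.pi * radius**3
    energy = 0.5 * mass * v**2

    print(f"grain mass: {mass:.2e} kg")
    print(f"impact energy: {energy:.2e} J")
    # Specific energy vs TNT (~4.2e6 J/kg):
    print(f"energy per kg: {0.5 * v**2:.2e} J/kg")

At 3 km/s the kinetic energy works out to about 4.5 million joules per kilogram, slightly more than TNT, so each grain effectively detonates on contact.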

The authors report that the effect of this is:

Some of the impactors encountered by Parker Solar Probe are relatively large, resulting in plasma plumes dense enough to (i) refract natural plasma waves away from the spacecraft, (ii) produce transient magnetic signatures, (iii) and drive plasma waves during plume expansion.  Further, some impacts liberate clouds of macroscopic spacecraft material which can result in electrostatic disturbances near the spacecraft that can linger for up to a minute, which is ~10,000 times longer than the transient plasma plume.

Continue Reading »

No responses yet

Nov 04 2021

Securing Data with the Laws of Physics

Published by under Technology

Data security sounds like a boring topic. However, it is quickly becoming one of the most important technologies in our modern world. Our data, communications, and transactions are increasingly digital, and they are all vulnerable to hacking. It’s estimated that hacking costs the world about $6 trillion per year as of 2021, and the figure is rising. Slightly more than half of data breaches are due to hacking (the rest to some form of social engineering, like phishing). Cyberwarfare is the new warfare between developed nations, and critical infrastructure may be vulnerable to attack. Companies are under constant assault by ransomware. Individuals may have their digital identities stolen, losing their savings and having their lives disrupted.

About half of the problem is individual behavior, which can be mitigated through education, company and governmental policies, and improved tools. But the other half is not due to any failure of personal behavior, but rather to straight-up hacking. This problem requires new technology to fix (in addition to institution-level responsibility to secure systems as much as possible). One aspect of hacking resistance is authentication – you need a code to get into a system. This is the focus of a potential incremental advance in authentication systems, but let’s give some further background first.

Authentication involves a prover and a verifier. They might, for example, share a code, and the prover needs to provide the code to the verifier to confirm their identity. The inherent problem with this system is that the prover, by necessity, has to give up secret information (such as their code) during the verification process, and this is a point of attack for a hacker. To solve this problem, in the 1980s cryptographers developed so-called “zero-knowledge proofs”. The idea is that the prover can demonstrate they have the code without giving up the code itself, so it remains secure.
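
One classic concrete example (not necessarily the scheme used in the new research) is Schnorr’s identification protocol: the prover knows a secret exponent x behind a public value y = g^x mod p, and convinces the verifier of that fact without ever transmitting x. A toy sketch in Python with deliberately tiny, insecure numbers (real deployments use enormous primes or elliptic curves):

    import secrets

    # Toy public parameters (insecure, for illustration only).
    p, q, g = 23, 11, 4      # g generates a subgroup of prime order q mod p

    # Prover's secret and the matching public key.
    x = 7                    # the secret "code" -- never transmitted
    y = pow(g, x, p)         # public key: y = g^x mod p

    # One round of Schnorr identification:
    r = secrets.randbelow(q)         # prover: random nonce
    t = pow(g, r, p)                 # prover -> verifier: commitment
    c = secrets.randbelow(q)         # verifier -> prover: random challenge
    s = (r + c * x) % q              # prover -> verifier: response

    # Verifier checks g^s == t * y^c (mod p). This only works out if the
    # prover really knows x, yet the transcript (t, c, s) reveals nothing
    # about x beyond that fact.
    assert pow(g, s, p) == (t * pow(y, c, p)) % p
    print("verified without revealing the secret")

The check works because g^s = g^(r + cx) = g^r * (g^x)^c = t * y^c mod p, and the random nonce r masks the secret in the response s.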

Continue Reading »

No responses yet

Next »