Archive for December, 2022

Dec 23 2022

A Quick Review of Facilitated Communication

Published under Technology

Facilitated communication (FC) is a technique that involves a facilitator supporting the hand or arm of a person with severe communication disabilities, such as autism or cerebral palsy, as they type on a keyboard or communicate through other means. The theory behind FC is that the facilitator’s physical support allows the person to overcome any motor impairments and communicate more effectively. However, FC has been the subject of considerable controversy and skepticism within the scientific community.

One major issue with FC is that there is little scientific evidence to support its effectiveness. Despite being used for decades, FC has never been rigorously tested in controlled, double-blind studies. This is problematic because it is impossible to determine whether the messages being communicated through FC are actually coming from the person with disabilities or from the facilitator. Some researchers have suggested that FC may be susceptible to the ideomotor effect, in which unconscious movements or responses are influenced by a person’s thoughts or beliefs. This means that the facilitator’s own thoughts and beliefs could be influencing the messages that are being communicated.

Another issue with FC is that there have been numerous cases where the messages communicated through FC have been shown to be incorrect or misleading. For example, in one well-known case, a woman with severe communication disabilities was believed to have communicated through FC that she had been sexually abused as a child. However, subsequent investigations revealed that the allegations were not true and that the facilitator had likely influenced the woman’s responses.

Given these concerns, it is important to be cautious about the validity of FC as a means of communication. While it may be tempting to believe that FC can provide a way for people with severe communication disabilities to express themselves, the lack of scientific evidence and the potential for misleading or false messages make it difficult to rely on FC as a reliable source of information. Instead, it may be more productive to focus on other, more established communication methods, such as augmentative and alternative communication (AAC) devices or sign language.

In conclusion, while FC may be a well-intentioned approach to helping people with severe communication disabilities communicate, the lack of scientific evidence and the potential for misleading or false messages make it difficult to rely on as a reliable source of information. Until there is more rigorous scientific evidence to support the effectiveness of FC, it is important to approach it with skepticism and consider alternative methods for communication.

Now…

Continue Reading »


Dec 22 2022

Can Misinformation Cause Cancer?

What are the known factors that increase the risk of getting cancer? Most people know about smoking, but can probably only guess at other factors, and are likely to endorse things that do not contribute to cancer risk. The known contributors to cancer risk include: smoking, consuming alcohol, low levels of physical activity, getting sunburnt as a child, family history of cancer, HPV infection, and being overweight. But there are also a number of “mythic” causes that do not contribute to cancer risk but are widely believed to: artificial sweeteners or additives and genetically modified food; using microwave ovens, aerosol containers, mobile phones, and cleaning products; living near power lines and feeling stressed.

These are all lifestyle factors that people can influence by changing their behavior. Therefore there is a direct utility to informing the public about the true causes of cancer and identifying the factors that they should not worry about. I see the effects of misinformation and poor communication on a regular basis. Often my patients will express to me that they are highly motivated to get healthier by changing their lifestyle, and then they rattle off a list of things they are doing, most of which are useless or counterproductive. Forget all that – just stop smoking and let’s talk about a healthy and practical exercise routine for you.

A recent study seeks to shed light on why there is so much misinformation about the modifiable causes of cancer. This is a complex question, and any one study is only going to look at a tiny slice of potential contributing factors. Also, this is the type of question that is hard to look at in a controlled experiment, so we will have to make do with observational data that can have a lot of confounding factors. The authors did a survey of several English and Spanish language forums, assessing knowledge of true and mythic causes of cancer, and correlating them with belief in conspiracies, preference for alternative medicine, and lack of COVID-19 vaccination. The results are pretty much what you would expect, but let’s dive into some details.

Continue Reading »


Dec 20 2022

Best Science News 2022

It’s always fun and interesting to look back at the science news of the previous year, mainly because of how much of it I have forgotten. What makes a science news item noteworthy? Ultimately it’s fairly subjective, and we don’t yet have enough distance to really see what the long-term impact of any particular discovery or incremental advance will be. So I am not going to give any ranked list, just reminisce about some of the cool science and technology news from the past year, in no particular order. I encourage you to extend the discussion to the comments – let me know what you thought had, or will have, the most impact from the past year.

Fusion

I have to start with the fusion breakthrough, mainly because it is the most recent in my memory and I suspect it will top a lot of lists. The National Ignition Facility managed to achieve what they call “ignition” by producing fusion that created more energy than the energy put into the fuel. This is clearly a milestone. However, this particular setup, referred to as inertial confinement, which uses 192 high-power lasers to implode a container holding the fuel, is likely a dead end when it comes to commercial energy production. It was never really designed to be that, just an experiment in fusion. I doubt this will be the method we ultimately use for commercial fusion, which I also predict is still many decades away. We will see in a generation how this news is looked back upon, if at all.

Space

This was a good year for space exploration. The successful launch of Artemis I marks the beginning of our return to the moon. The SLS rocket worked, and it’s more powerful than even the Saturn V. It carried the Orion capsule past the moon and back again, successfully returning to the Earth. Returning to the moon now seems inevitable. Artemis II will launch in 2024 and carry people to the moon but not land. Artemis III will land people on the moon, in 2025 or 2026. It’s going to be exciting to watch.

The other big space news, of course, was the James Webb Space Telescope (JWST), which is already sending back mind-blowing pictures of the universe. We are just at the beginning of its career, which will likely last 20 years. Can’t wait to see what else it sends us.

AI

Continue Reading »


Dec 13 2022

Genetically Engineering Sex Selection

Published under Technology

I wrote one year ago about a potential technique to create offspring of just one sex in animals used for research or food. For example, the chicken industry culls several billion male chicks each year because they have no commercial value. They don’t lay eggs and they are not optimal for growing for meat. Now Israeli researchers have announced that they have developed a second, entirely different method of sex selection in chicks.

Sex selection in animals can be extremely useful, to avoid unnecessary culling and also to increase efficiency. Often research requires either all male or all female animals of a specific genetic strain, and so those in the litter of the unwanted sex are simply culled. Of course the chicken industry dwarfs this practice with billions culled each year. The technique described a year ago involves inserting a CRISPR kill-switch into the DNA of the parent animals. Half of the kill switch is implanted in the male and half in the female. If the two halves come together in the offspring, then the embryo never develops beyond the 16-32 cell stage. For a chicken, the egg will never hatch.

For mammals, like mice and rats, females have XX chromosomes while males have XY. The female half of the kill switch is inserted into both X chromosomes of the mother, while the other half is inserted in the X chromosome of the father if you want all male offspring, or the Y chromosome if you want all female offspring. Birds are the opposite – the females have WZ chromosomes, while males have ZZ, but the principle is the same.
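
To make the inheritance logic concrete, here is a minimal Python sketch of the split kill-switch scheme described above (the names and structure are my own illustration, not taken from the study), enumerating the possible offspring and showing which ones inherit both halves:

```python
# Minimal sketch of the split CRISPR "kill switch" logic described above, for
# mammals (XX females, XY males). All names are illustrative, not from the study.

def cross(mother_gametes, father_gametes):
    """Combine gametes; an embryo arrests only if it inherits BOTH halves."""
    for m_chrom, m_half in mother_gametes:
        for f_chrom, f_half in father_gametes:
            genotype = m_chrom + f_chrom
            lethal = m_half and f_half
            yield genotype, lethal

# Every egg carries half A, because both of the mother's X chromosomes do.
mother = [("X", True)]

# Father's half B on his X -> daughters get both halves -> all-male offspring.
father_for_males = [("X", True), ("Y", False)]
# Father's half B on his Y -> sons get both halves -> all-female offspring.
father_for_females = [("X", False), ("Y", True)]

for label, father in [("all-male", father_for_males), ("all-female", father_for_females)]:
    print(f"{label} scheme:")
    for genotype, lethal in cross(mother, father):
        sex = "male" if "Y" in genotype else "female"
        print(f"  {genotype} ({sex}): {'arrests early' if lethal else 'develops normally'}")
```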

The Israeli team has developed a different method. They only have to genetically engineer the female parent, inserting a gene for a protein on either the W or Z chromosome for chickens. For female-only offspring they insert it on the Z chromosome of the mother, so that only the ZZ males will carry it. The resulting eggs are then exposed to a blue light for several hours, which activates the protein and prevents further development. Therefore, only the female eggs hatch. Also, the resulting females are not genetically altered in any way, since the gene was only inserted into the male chromosome. The researchers still need to publish their results so that they can be independently verified. The technique should work, but there are details we would want to see, such as the success rate. Even if a small percentage of males survive, that could still be a huge problem for the production process.
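
The Israeli approach can be sketched the same way, assuming (as described above) a mother hen carrying the light-activated gene on her Z chromosome and an unmodified father, so that only the ZZ (male) embryos inherit the marker:

```python
# Minimal sketch of the blue-light method described above: the mother (ZW) carries
# a hypothetical light-activated gene on her Z chromosome; the father (ZZ) is
# unmodified. Only ZZ (male) embryos can inherit the marker.

mother_gametes = [("Z", True), ("W", False)]   # (chromosome, carries marker gene)
father_gametes = [("Z", False)]                # both of his Z's are unmodified

for m_chrom, m_marker in mother_gametes:
    for f_chrom, f_marker in father_gametes:
        chroms = m_chrom + f_chrom
        sex = "male" if chroms == "ZZ" else "female"
        carries = m_marker or f_marker
        fate = "blue light halts development" if carries else "hatches, genetically unmodified"
        print(f"{chroms} ({sex}): {fate}")
```

The same logic with the marker placed on the W chromosome instead would flag the female embryos, for male-only production.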

Continue Reading »


Dec 12 2022

Fusion Breakthrough – Ignition

Published under Technology

Much of the discussion about how we are going to rapidly change over our energy infrastructure to low carbon energy involves existing technology, or at most incremental advancements. The problem is, of course, that we are up against the clock and the best solutions are ones that we can implement immediately. Even next generation fission reactors are controversial because they are not a tried-and-true technology, even though fission technology itself is. It certainly would not be prudent to count on an entirely new technology as our solution. If some game-changing technology emerges, great, but until then we will make do with what we know works.

The ultimate game-changing energy technology is, I think, fusion. Fusion technology replicates the processes that power stars, mostly fusing hydrogen into other forms of hydrogen and ultimately into helium. Massive enough stars can then fuse helium into heavier elements, with more massive stars fusing heavier elements until we get to iron which cannot be fused to produce net energy. But even fusing the lightest elements takes a tremendous amount of heat and pressure, which has proved technologically difficult to achieve on Earth. We have been inching closer to this goal, however, and recently the National Ignition Facility at the Lawrence Livermore National Laboratory in California has inched over a significant milestone – ignition.

I wrote just last year about the NIF achieving another milestone, burning plasma. The pace of advancement seemed pretty brisk, and I speculated about how long it would be to achieve the next milestone, ignition. Well, here we are. You can read that article for background, but quickly, the NIF uses a fusion method called inertial confinement – an array of 192 powerful lasers to produce inward pressure sufficient to cause a vessel to implode, with the implosion causing sufficient heat and pressure to produce fusion. The NIF was built in 2009, but it took significant upgrades before it was powerful enough to achieve fusion in 2021. Some of the energy from fusion contributed to further fusion, a process called burning plasma. But in that experiment fusion contributed only 70% of the energy necessary to sustain fusion. That means that the fusion process was still a net energy loss. (Those powerful lasers require a lot of energy.)
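
For a sense of the bookkeeping, here is a quick sketch of the “target gain” arithmetic. The 70% figure is from the 2021 burning-plasma shot described above; the roughly 2.05 MJ of laser energy and 3.15 MJ of fusion yield for the December 2022 shot are the widely reported approximate values, used here only for illustration:

```python
# Quick sketch of the "target gain" arithmetic. The 70% figure comes from the 2021
# burning-plasma shot described in the text; the ~2.05 MJ laser energy and ~3.15 MJ
# fusion yield for the December 2022 shot are widely reported approximate values.

def target_gain(fusion_yield_mj: float, laser_energy_mj: float) -> float:
    """Fusion energy released divided by laser energy delivered to the target."""
    return fusion_yield_mj / laser_energy_mj

# 2021: fusion output was only ~70% of the laser energy delivered (net loss).
print("2021 burning plasma: gain ~0.7 (less than 1, so no ignition)")

# December 2022: output exceeded the laser energy delivered to the target.
print(f"2022 ignition shot: gain ~{target_gain(3.15, 2.05):.2f} (greater than 1)")

# Caveat: the hundreds of megajoules of electricity needed to charge the lasers
# are not counted here, so this is far from net energy in a power-plant sense.
```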

Continue Reading »


Dec 08 2022

Ancient Environmental DNA

Published under Evolution

Our ability to detect, amplify, and sequence tiny amounts of DNA has led to a scientific revolution. We can now take a small sample of water from a lake and, by analyzing the environmental DNA in that water, determine all of the things that live in the lake. This is an amazingly powerful tool. My favorite application of this technique was to demonstrate the absence of DNA in Loch Ness from any giant reptile or aquatic dinosaur. So-called eDNA is perhaps the most powerful evidence of a negative, the absence of a creature in an environment – you can’t hide your eDNA.

The ultimate limiting factor on eDNA is how long such DNA will survive. DNA has a half-life: it spontaneously degrades and sheds information until it is no longer useful for sequencing. Previously scientists extracted DNA from ice cores in Greenland, and were able to sequence DNA up to 800,000 years old. The oldest DNA ever recovered was probably 1.1-1.2 million years old. Based on this, scientists estimated that the ultimate lifespan of usable DNA was about 1 million years. This put the final nail in the coffin of any dreams of a Jurassic Park. Non-avian dinosaurs died out 65 million years ago, so none of their DNA should still be left on Earth (the closest we can get is related DNA in birds). But no T. rex DNA in amber.

According to a new study from the most northern region of Greenland, however, we have to push back the estimate of how long DNA can survive to at least 2 million years. That is a significant increase (but still a long way from T. rex). The site is the Kap København Formation, located in Peary Land in north Greenland. This is now a barren frozen desert. There are also very few macrofossils here, mostly from a boreal forest and insects, with the only vertebrate being a hare’s tooth. Conditions there are apparently not conducive to fossilization. We do know that 2 million years ago Greenland was much warmer, about 10 degrees C warmer than present. So there is no reason it should not have been teeming with life.

The new analysis of eDNA finds that, in fact, it was. They found DNA from hares, but also other rodents, reindeer, geese, and mastodons. They also found DNA from poplars, birch trees, and thuja trees (a type of coniferous tree), as well as a rich assortment of bushes, herbs, and other flora. Basically this was a mixed forest with a rich ecosystem. In addition they found marine species including horseshoe crab and green algae, confirming the warmer climate.

This ancient eDNA gives us a much more complete picture of the ecosystem than was provided by macrofossils alone. But perhaps more importantly – it demonstrates that eDNA can survive for up to two million years, doubling the previous estimate. The researchers speculate that minerals in the soil bound to the DNA and stabilized it, slowing its degradation. DNA is negatively charged. This property is used to separate out chunks of DNA in a sample by size: you apply an electric field, which pulls the DNA fragments through a gel toward the positive electrode, with smaller fragments traveling farther than larger ones. In this case the negatively charged DNA bound to positively charged minerals in the soil. I guess this is the DNA version of fossilization.

The question is – in such environments where DNA is stabilized by binding to minerals, how much is the degradation process slowed down, and therefore how long can DNA survive? DNA breaks down due to “microbial enzymatic activity, mechanical shearing and spontaneous chemical reactions such as hydrolysis and oxidation.” DNA breaks down faster with warmer temperature, so the fact that this DNA remained frozen for so long is crucial. But freezing alone was not enough, which is why scientists think that binding to minerals also played a role.

They measured the “thermal age” of the DNA – how long it would have taken to degrade to its current state at a constant temperature of 10 degrees C – at 2.7 thousand years, 741 times less than its actual age of 2 million years. Therefore it degraded 741 times more slowly than exposed DNA would have at 10 degrees C. The average temperature at the site is -17 degrees C. They further found that the DNA was bound mostly to clay minerals, and specifically smectite (and to a lesser degree, quartz).
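
The arithmetic here is simple enough to check directly, using only the figures quoted above:

```python
# Checking the thermal-age arithmetic quoted above, using only figures from the text.

actual_age_years = 2_000_000     # approximate age of the Kap København eDNA
thermal_age_years = 2_700        # equivalent degradation at a constant 10 degrees C

slowdown_factor = actual_age_years / thermal_age_years
print(f"Degradation slowed by a factor of roughly {slowdown_factor:.0f}")  # ~741
```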

Perhaps this is the limit of DNA survival – although we thought the previous record of 1.1-1.2 million years was the limit. It is possible there may be environmental conditions elsewhere in the world that could slow DNA degradation even further. Slow DNA degradation by a factor of 30 or so beyond the Kap København Formation and we are getting into the time of dinosaurs. This is probably unlikely. Constant freezing temperatures are required, in addition to geological stability and optimal soil conditions. But I don’t think we can say now that it is impossible, just highly unlikely. I did not see any estimate in the study about the ultimate upper limit of DNA lifespan, but I suspect we will see such analyses based on this latest information.

The best evidence, however, will come from simply looking in new locations for eDNA, especially those that likely have the optimal conditions for maximal DNA longevity. But for now, being able to reconstruct ecosystems from 2 million years ago is still pretty cool.


Dec 06 2022

Mars More Volcanically Active Than We Thought

Published under Astronomy

Mars is perhaps the best candidate world in our solar system for a settlement off Earth. Venus is too inhospitable. The Moon is a lot closer, but its extremely low gravity (0.166 g) is a problem for long-term habitation. Mars’ gravity is 0.38 g, still low by Earth standards but better than the Moon’s. But there are some other differences between Earth and Mars. Mars has only a very thin atmosphere, less than 1% that of Earth’s. That’s just enough to cause annoying dust storms, but not enough to avoid the need for pressure suits. Mars lost its atmosphere because it was stripped away by the solar wind – Mars does not have a global magnetic field to protect it. The thin atmosphere and lack of a magnetic field also expose the surface to lots of radiation.

Mars’ smaller size also means that it cooled faster than the Earth. While there are ancient volcanoes on Mars, the surface crust looks solid, without plate tectonics. This has led astronomers to believe that Mars is a quiet planet, with heat at the core, but a solid crust and mantle and no geological activity. That also means there are no recent volcanic eruptions that might replenish its depleted atmosphere. However – that view is changing.

There is one region of Mars, Elysium Planitia, which may be geologically active. In fact, there is now good evidence of a giant mantle plume under the surface. A mantle plume occurs when hot material from deep in the mantle, near the core, rises up and pushes against the overlying crust. There are more than 18 such mantle plumes on Earth. One is right below the Hawaiian islands – as the Pacific plate moves over this plume it creates a chain of volcanoes and resulting volcanic islands. What is the evidence for a mantle plume beneath Elysium Planitia?

Continue Reading »


Dec 05 2022

Square Kilometer Array

Published under Astronomy

Construction begins this week on what will be the largest radio telescope in the world – the Square Kilometer Array (SKA). This project began more than 30 years ago, in 1991, as an idea, with an international working group forming in 1993. It took three decades to flesh out the concept, create a detailed design, secure the land rights, and secure government funding. The first antennas will go online by 2024 with more added through 2028 (which will complete the first phase – about 10% of the total planned project). This will result in a radio telescope array with a total collecting area of one square kilometer.

There are actually two components to the total array. One is being built in Australia, the SKA-Low, for low frequency. These will use antennas that look like two-meter tall metal Christmas trees. There will be 512 arrays of 256 antennas for a total of about 131,000 antennas. This will be the low frequency array, able to detect radio waves between 50 megahertz and 350 megahertz. There will also be SKA-Mid in South Africa, which will be an array of 197 dishes sensitive between 350 megahertz and 15.4 gigahertz. The whole thing will be connected together, with the bulk of the computing power located in the UK.

Why do astronomers connect radio receivers together? This has to do with interferometry – the ability to combine two signals so that they can simulate a single receiver with a diameter equal to the distance between the two receivers. It’s not the same as having one giant dish, however. An array increases the resolution of the received image, but the sensitivity is still a function of the total receiving area (not the distance). The Very Large Array (VLA) in New Mexico has radio dishes on rails, so that they can be moved into different configurations. By moving the dishes apart you can achieve greater resolution, but by moving them closer together you get better sensitivity to faint, extended structures – so there is a trade-off in moving receivers farther apart. There is no substitute for total collecting area, which is why the SKA will have so many individual receivers.
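
As a rough illustration of the resolution-versus-baseline relationship, here is a small sketch using the standard diffraction-limit approximation (angular resolution ≈ wavelength / baseline). The frequencies come from the text; the baseline lengths are illustrative assumptions, not official SKA specifications:

```python
# Rough illustration of resolution vs. baseline using the diffraction-limit
# approximation (angular resolution ~ wavelength / baseline). Frequencies are
# from the text; the baseline lengths are illustrative assumptions only.

import math

C = 299_792_458.0  # speed of light in m/s

def resolution_arcsec(freq_hz: float, baseline_m: float) -> float:
    """Approximate angular resolution, in arcseconds, of a baseline at a frequency."""
    wavelength_m = C / freq_hz
    return math.degrees(wavelength_m / baseline_m) * 3600.0

examples = [
    ("SKA-Low at 100 MHz, assumed 65 km maximum baseline", 100e6, 65_000.0),
    ("SKA-Mid at 1.4 GHz, assumed 150 km maximum baseline", 1.4e9, 150_000.0),
]
for label, freq, baseline in examples:
    print(f"{label}: ~{resolution_arcsec(freq, baseline):.2f} arcsec")
```

The takeaway matches the paragraph above: longer baselines sharpen the image, but only more collecting area makes the array more sensitive.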

Continue Reading »


Dec 02 2022

Evolution Is Not a Straight Line

Published under Evolution

Yesterday I wrote about the fact that technological development is not a straight line, with superior technology replacing older technology. That sometimes happens, but so do many other patterns of change. Often competing technologies have a suite of relative strengths and weaknesses, and it’s hard to predict which one will prevail. Also, competing technologies may exist side-by-side for long periods of time. Sometimes, after experimenting with new technologies, people may revert to older and simpler methods because they prefer a different set of tradeoffs.

Similarly, biological evolution is not a simple straight line with “more advanced” species replacing more primitive ones. Adaptation to the local environment is a relative thing, and many biological features have a complex set of tradeoffs. With technological evolution (and cultural evolution generally) ideas can come from anywhere and spread in any pattern (although some are more likely than others). Biological evolution is more constrained. It can only work with the material it has at hand, and information is passed down mostly vertically, from parent to child. But there is also horizontal gene transfer in evolution, as well as hybridization and even back mutations. The overall pattern is a complex branching bush, spreading out in many directions. Any long term directionality in evolution is likely just an epiphenomenon.

Paleontologists try to reverse engineer the multitudes of complex branching bushes of evolutionary relationships using an incomplete fossil record and, more recently, genetic analysis. But this can be extremely difficult because it may not always be obvious how to draw the lines to connect the dots. The simplest or most obvious pattern may not be true. A recent discovery involving bird evolution highlights this fact. It is now pretty well established that birds evolved from theropod dinosaurs. The evidence is overwhelming and convincing. Creationists, who predicted that birds would forever remain an isolated group, have egg on their face.

Continue Reading »


Dec 01 2022

Ancient Shipwreck Reveals Complex Trade Network

Published under Technology

People tend to understand the world through the development of narratives – we tell stories about the past, the present, ourselves, others, and the world. That is how we make sense of things. I always find it interesting, the many and often subtle ways in which our narratives distort reality. One common narrative imagines the past as simpler and more primitive than it actually was, and progress as linear, objective, and inevitable. I remember watching The Day the Universe Changed with James Burke when in one episode he declared that the Dark Ages were a time of great technological advancement. This seemed at odds with what I had been told, but I later confirmed that the so-called “Dark Ages” were maligned by later Renaissance writers congratulating themselves on their own progress.

The same is true of our image of technological advancement, that it’s objective and inevitable. This became clearer to me when researching my latest book, The Skeptics’ Guide to the Future. One story in particular is the sequence of the material ages – the stone age giving way to the copper age, then bronze age, and finally iron age. Metallurgy was clearly a huge technological advance, and did progress significantly over time. But this sequence was not strictly linear: older technologies persisted alongside newer technologies for different applications, and sometimes technological shifts were more of a lateral move than a clear advance.

The biggest example from the sequence above is the transition from relying mainly on bronze for tools and weapons to iron. Iron, it turns out, is not objectively better than bronze for many applications. Bronze is actually a very useful metal – it can be cast, it is easy to work with, it is strong, and it doesn’t rust. That last feature, not rusting, makes it superior to iron for many applications, even into the Renaissance (until the development of stainless steel). Bronze is actually stronger than iron and can be worked more easily, at a lower temperature. Until the development of carbon steel, there was no reason to favor iron over bronze. Why, then, did the change happen?

Continue Reading »
