Archive for September, 2020

Sep 29 2020

COVID – Not Close to Herd Immunity

One question weighing on the minds of many people today is – when will this all end? And by “this” (well, one of the “thises”) I mean the pandemic. Experts have been saying all along that we need to buckle up and get ready for a long ride on the pandemic express. This is a marathon, and we need to be psychologically prepared for what we are doing now being the new normal for a long time. The big question is – what will it take to end the pandemic?

Many people are pinning their hopes on a vaccine (or several). This is probably our best chance, and the world-wide effort to quickly develop possible vaccines against SARS-CoV-2 has been impressive. There are currently 11 vaccines in late-stage Phase 3 clinical trials. There are also 5 vaccines approved for limited early use. No vaccines are yet approved for general use. If all goes well, we might expect one or more vaccines to have general approval by the end of the year, which means wide distribution by the end of 2021. That is, if all goes well. This is still new, and we are fast-tracking these vaccines. That is not a bad thing and does not necessarily mean we are rushing them, but it means we won’t know until we know. Scientists need to confirm how much immunity any particular vaccine produces, and how long it lasts. We also need to track them carefully for side effects.

Early on there was much speculation about the pandemic just burning itself out, or being seasonal and so going away in the summer. Neither of these things happened. In fact, the pandemic is giving the virus lots of opportunity to mutate, and a new more contagious strain of the virus has been dominating since July. Pandemics do eventually end, but that’s not the same as them going away. Some viruses just become endemic in the world population, and they come and go over time. We now, for example, just live with the flu, and with HIV. So perhaps COVID will just be one more chronic illness plaguing humanity that we have to deal with.

But what about herd immunity? The point of an aggressive vaccine program is to create herd immunity – giving so many people resistance that the virus has difficulty finding susceptible hosts and cannot easily spread. The percentage of the population that needs to be immune for this to happen depends on how contagious the infectious agent is, and ranges from about 50% to 90%. We don’t know yet where COVID-19 falls, but this is a contagious virus, so it will probably be closer to 90%. One question is, how much immunity is the pandemic itself causing, and will we naturally get to herd immunity, even without a vaccine? The results of a new study suggest the answer is no.
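To make the link between contagiousness and the herd immunity threshold concrete, here is a minimal sketch using the standard textbook approximation (threshold ≈ 1 − 1/R0, where R0 is the average number of people each infected person goes on to infect in a fully susceptible population). The R0 values below are illustrative assumptions, not measured figures for SARS-CoV-2.

```python
# Minimal sketch of the classic herd immunity threshold approximation:
# once a fraction 1 - 1/R0 of the population is immune, each infection
# produces on average less than one new infection and the outbreak fades.
# The R0 values here are illustrative assumptions, not measured values.

def herd_immunity_threshold(r0: float) -> float:
    """Return the fraction of the population that must be immune for a given R0."""
    return 1.0 - 1.0 / r0

for r0 in (2.0, 3.0, 6.0):
    print(f"R0 = {r0:.0f} -> ~{herd_immunity_threshold(r0):.0%} immune")

# Output:
# R0 = 2 -> ~50% immune
# R0 = 3 -> ~67% immune
# R0 = 6 -> ~83% immune
```

The takeaway from this rough arithmetic is simply that more contagious pathogens require a much larger immune fraction before spread stalls, which is why estimates for a contagious respiratory virus tend toward the high end of the 50–90% range.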

Continue Reading »

Sep 28 2020

Y Chromosomes of Humans, Neanderthals, and Denisovans

Published under Evolution

Recent human ancestry remains a complex puzzle, although we are steadily filling in the pieces. For the first time scientists have looked at the Y chromosomes of modern humans, Neanderthals, and Denisovans, giving us yet one more piece of the puzzle. But first, here’s some background.

The common ancestor of humans (by which I mean modern humans), Neanderthals, and Denisovans lived between 500,000 and 800,000 years ago. This common ancestor species almost certainly lived in Africa, but we are not sure exactly which species it was. Homo heidelbergensis is a candidate, but it’s not clear if the timeline matches up well enough. In any case, this split probably occurred when the ancestors of Neanderthals and Denisovans migrated from Africa to Europe. They then split from each other, with the Neanderthals remaining in Europe and the Denisovans going to Asia. This split happened around 600,000 years ago, but could be more recent.

Why is there so much confusion about the exact time of the split? There are two main reasons. The first is that these splits were not single events. Populations of early humans became relatively separated from each other, enough so that genetic differences could start piling up. But they also continued to interbreed, sharing genes throughout their history. There is a specimen that is likely the child of a Neanderthal and a Denisovan from just 50,000 years ago. Humans and Neanderthals interbred right up until the end. All non-African humans living today have 2-3% Neanderthal DNA.

The other reason the divide is difficult to pin down is that there are numerous methods for estimating when it occurred, and they give slightly different answers. We can look at teeth and other fossil bones, at genomic DNA, mitochondrial DNA, or specifically the X and Y chromosomes. We can also look at their culture via their tools. We can see when certain tools start appearing in different locations. Culture, by the way, can also be shared among populations, so that picture may be confused as well.

Continue Reading »

Sep 25 2020

Climate Change and Wildfires

Published under General Science

Psychological research confirms what I have observed anecdotally – that people prefer simple answers to complex ones, and will often settle on a single cause for even complex events. This is why I often jokingly answer questions of, “Is the cause A, B, or C,” with “yes.” That is usually the correct (if unsatisfactory) answer – all of the options are correct to some degree. Assuming there is “one true cause” can also be considered a false choice fallacy, or a false dichotomy.

I most recently did this when asked if the increase in wildfires we are currently experiencing on the West Coast of the US is caused by global warming or bad forest management. The experts agree that both contribute, and a new review of the literature sheds some additional light on this question. The authors reviewed over 100 studies published since 2013. This same group published an earlier review on the causes of the Australian wildfires last year. The conclusion of the new review is that global warming has had an “unequivocal and pervasive” role in increasing the conditions that contribute to wildfires.

As they say, this is not rocket surgery. As the weather gets warmer we are experiencing a greater portion of the year with high temperatures, lower humidity, decreased rain, and increased winds. These are all conditions that contribute to starting and spreading wildfires, making them more likely and more intense when they occur. The result has been the worst fire season on record, with three of the four worst individual fires occurring this year.

Stepping back a bit to the bigger question – is there global warming – this fire season adds to the growing evidence that there clearly is. Average temperatures are increasing, with the top 10 warmest years on record all occurring since 1998, and 2016 being the warmest. It is too early to tell for sure, but 2020 is on track to be one of the warmest years on record as well, and may even break the record as the warmest. Further, global ice is decreasing steadily. Hurricanes are getting stronger. Flooding is increasing. And of course, wildfires are increasing. You could claim that any one of these is a coincidence, or has a separate explanation. But given the totality of evidence, that amounts to little more than special pleading. Climate models predicted all of these things, and they are all happening. Trying to write off each individual item (and others I didn’t mention) may work rhetorically with some, but only when each is looked at in isolation. That so many events predicted by climate scientists as a result of global warming are actually happening is not some grand coincidence or conspiracy. The Earth is warming.

Continue Reading »

Sep 24 2020

Study Finds More Adulterated Supplements

Drugs are regulated in most countries for a reason. They can have powerful effects on the body, can be particularly risky for, or incompatible with, certain diseases, and can interact with other drugs. Dosages also need to be determined and monitored. So most countries have concluded that prescriptions of powerful drugs should be monitored by physicians who have expertise in their effects and interactions. Some drugs are deemed safe enough that they can be taken over the counter without a prescription, usually in restricted doses, but most require a prescription. Further, before going on the market drug manufacturers have to thoroughly study a drug’s pharmacological activity and determine that it is safe and effective for the conditions claimed. There is a balance here of risk vs benefit – making useful drugs available while taking precautions to minimize the risk of negative outcomes.

But in the US and some other countries there is a parallel system of drug regulation that does not do this. Companies are able to sell drugs without studying their pharmacology, and without providing evidence of safety or effectiveness. Many studies have shown that these alternate drugs are commonly adulterated with unlisted ingredients, often lack the key ingredient listed on the label or contain substitutions, are contaminated with sometimes toxic substances, and have little quality control in terms of dosing. Often the active ingredients, if any, are not even known, and we have little knowledge about organ toxicity or drug-drug interactions.

This alternate drug regulatory scheme refers to supplements. In the US, since the 1994 DSHEA law, supplements can include herbal drugs and can make health claims, with the only restriction that they cannot name a disease specifically. So the industry has become expert at making sort-of claims, like boosting the immune system or supporting healthy brain function. There is no evidence that granting expanded over-the-counter access to a wide list of herbal drugs has been a positive thing for Americans’ health. Rather, this is a multi-billion dollar predatory industry that, if anything, has worsened overall public health (because its products carry risk with no proven benefit). Real nutritional supplements, like vitamins, minerals, and other micronutrients, were already on the market. What DSHEA added was open access to these dirty and poorly controlled drugs.

Continue Reading »

Sep 22 2020

GMO Crops and Yield

The issue of genetically modified organisms is interesting from a science communication perspective because it is the one controversy that apparently most follows the old knowledge deficit paradigm. The question is – why do people reject science and accept pseudoscience? The knowledge deficit paradigm states that they reject science in proportion to their lack of knowledge about science, which should therefore be fixable through straight science education. Unfortunately, most pseudoscience and science denial does not follow this paradigm, and is due to other factors such as lack of critical thinking, ideology, tribalism, and conspiracy thinking. But opposition to GMOs does appear to largely result from a knowledge deficit.

A 2019 study, in fact, found that as opposition to GM technology increased, scientific knowledge about genetics and GMOs decreased, but self-assessment increased. GMO opponents think they know the most, but in fact they know the least. Other studies show that consumers have generally low scientific knowledge about GMOs. There is also evidence that fixing the knowledge deficit, for some people, can reduce their opposition to GMOs (at least temporarily). We clearly need more research, and different people oppose GMOs for different reasons, but at the very least there is a huge knowledge deficit here, and reducing it may help.

So in that spirit, let me reduce the general knowledge deficit about GMOs. I have been tackling anti-GMO myths for years, but the same myths keep cropping up (pun intended) in any discussion about GMOs, so there is still a lot of work to do. To briefly review – no farmer has been sued for accidental contamination, farmers don’t generally save seeds anyway, there are patents on non-GMO hybrid seeds, GMOs have been shown to be perfectly safe, GMOs did not increase farmer suicide in India, and use of GMOs generally decreases land use and pesticide use.

Continue Reading »

Sep 21 2020

The Holocaust and Losing History

Published under Skepticism

In the movie Interstellar, which takes place in a dystopian future where the Earth is challenged by progressive crop failures, children are taught in school that the US never went to the Moon, that it was all a hoax. This is a great thought experiment – could a myth, even a conspiracy theory, rise to the level of accepted knowledge? In the context of religion the answer is, absolutely. We have seen this happen in recent history, such as with Scientology, and going back even a little further with Mormonism and Christian Science. But how broad is the range of contexts in which a culture can rewrite its own history? Or, we can frame the question as – are there any limits to such rewriting of history?

I think it is easy to make a case for the conclusion that there are no practical limits. Religion is also not the only context in which myth can become belief. The more totalitarian the government, the more they will be able to rewrite history any way they want (“We’ve always been at war with Eastasia”). It is also standard belief, and I think correctly, that the victors write the history books, implying that they write it from their perspective, with themselves as the heroes and the losers as the villains.

But what about just culture? I think the answer here is an unqualified yes also. In many Asian cultures belief in chi, a mysterious life force, is taken for granted, for example. There are many cultural mythologies, stories we tell ourselves and each other that become accepted knowledge. These can be the hardest false beliefs to challenge in oneself, because they become part of your identity. Doubting these stories is equivalent to tearing out a piece of yourself, questioning your deeper world-view.

These cultural beliefs can also be weaponized, for political purposes or just for marketing. Less than a century ago Chairman Mao decided to manufacture a new history of Traditional Chinese Medicine (TCM). He took parts of various previous TCM traditions, even ones that were mutually exclusive and at ideological war with each other, and then grafted them into a new TCM, altering basic concepts to make them less barbaric and more palatable to a modern society. Now, only decades later, nearly everyone believes this manufactured fiction as if it were real history. Only skeptical nerds or certain historians know, for example, that acupuncture as practiced today is less than a century old, and not thousands of years old as proponents claim. Mao’s propaganda has become history.

Continue Reading »

Sep 18 2020

Review of The Social Dilemma

I just watched the Netflix documentary, The Social Dilemma, and found it extremely interesting, if flawed. The show is about the inside operation of the big social media tech companies and the impact they are having on society. Like all documentaries – this one has a particular narrative, and that narrative is a choice made by the filmmakers. These narratives never reflect the full complexity of reality, and often drive the viewer to a certain conclusion. In short, you can never take them at face value and should try to understand them in a broader context.

Having said that, there is a lot of useful insight in the film. What it does well is interview tech insiders who expose the thinking on the part of these corporations. We already know many of the pitfalls of social media, and I have discussed many of them here. Social media can be addictive, can lead to depression and low self-esteem, and can foster FOMO (fear of missing out). We definitely need to explore the psychological aspects of social media, and this is still a new and active area of research.

Also, social media lends itself to information bubbles. When we rely mostly on social media for our news and information, over time that information is increasingly curated to cater to a particular point of view. We can go down rabbit holes of subculture, conspiracy theories, and radical political perspectives. Social media algorithms have essentially convinced people that the Earth is flat, that JFK Jr. is alive and secretly working for Trump, and that the experts are all lying to us.

This is where I think the documentary was very persuasive and the conclusions resonated. They argued that increasingly people of different political identities are literally living in different worlds. They are cocooned in an information ecosystem that not only has its own set of opinions but its own set of facts. This makes a conversation between different camps impossible. There is no common ground of a shared reality. In fact, the idea of facts, truth, and reality fades away and is replaced entirely with opinion and perspective, and a false equivalency that erases expertise, process, and any measure of validity. At least, this is what happens in the extreme (and I think we have all experienced this).

Continue Reading »

Sep 17 2020

Ice Age Bear Found in Melting Permafrost

Published under General Science

On the surface this is a story of a fantastic paleontological find. Reindeer herders discovered a well-preserved brown bear in the Russian Arctic, released from melting permafrost. The bear is intact, with lots of preserved soft-tissue, and is therefore of extreme scientific value.

But behind the story there is a deeper and concerning one – wait, isn’t “melting permafrost” an oxymoron? Isn’t permafrost supposed to be permanent? Not exactly. The technical definition of permafrost is any ground that is frozen for at least two years straight. Less than that and it is considered seasonally frozen. But much of the permafrost in the world has been frozen for hundreds of thousands of years. The oldest ice is in Antarctica, believed to be 1.5 million years old.

The bear is estimated to be between 22,000 and 39,500 years old. That is also the age of the ice in which it was frozen, and that ice is melting. The bear is also not the first ice-age remains to be discovered in the Arctic permafrost. In recent years scientists have also found dogs and woolly mammoths melting out of the ice. Of course, fluctuations in the extent of the permafrost are nothing new when we consider a long time frame. But the Arctic permafrost in particular appears to be melting at an alarming rate. And of course, this is thought to be due to global warming.

Continue Reading »

Sep 15 2020

Life on Venus?

Published under Astronomy

This is definitely the big news of the week – scientists have detected phosphine gas in the clouds of Venus. This is a big deal because phosphine gas is a potential marker for life. This adds Venus to the list of worlds in our solar system that are candidate hosts of life, along with Mars, Europa, Enceladus and others. Europa and Enceladus are moons with an icy shell and definitely liquid water underneath. The presence of liquid water is what makes them intriguing candidates for potential life. Mars is currently dry and desolate, but in the past was warmer and wetter. Life could have evolved on Mars, and we may find the fossil evidence of such life. Or, unlikely but possible, life could have barely clung to some ecosystems in the Martian soil.

But Venus was not a serious contender for life, at least not after we sent probes there. Prior to the first probe in 1962, scientists and science-fiction writers fantasized about life on Venus. It is our nearest neighbor, almost the same size as Earth, and all those clouds might contain water vapor. Perhaps Venus was a jungle planet. But now we have sent multiple probes to map the planet, and Soviet probes even landed on Venus (surviving for only a short period of time). Here is NASA’s summary of the planet:

Venus has a thick, toxic atmosphere filled with carbon dioxide and it’s perpetually shrouded in thick, yellowish clouds of mostly sulfuric acid that trap heat, causing a runaway greenhouse effect. It’s the hottest planet in our solar system, even though Mercury is closer to the Sun. Venus has crushing air pressure at its surface – more than 90 times that of Earth – similar to the pressure you’d encounter a mile below the ocean on Earth.

Crushing heat, pressure, and sulfuric acid do not make for a hospitable world. However, hope for life on Venus was never completely abandoned. Optimists pointed out that in the upper atmosphere of Venus there is a sweet spot where the temperatures are comfortable for organic reactions and the pressure is much lower. Sure, there would still be an acidic atmosphere, but there are extremophiles on Earth that thrive in high acidity (acidophiles). I don’t think this was considered a high probability – more of a footnote in the quest for life in our solar system – but Venus could not be completely ruled out as a host for life.

Continue Reading »

Sep 14 2020

Who Invented the Lightbulb?

Published under Technology

The question of who should get credit for inventing the lightbulb is deceptively complex, and reveals several aspects of the history of science and technology worth exploring. Most people would probably answer the question – Thomas Edison. However, this is more than just overly simplistic. It is arguably wrong. This question has also become political, made so when presidential candidate Joe Biden claimed that a black man invented the lightbulb, not Edison. This too is wrong, but is perhaps as correct as the claim that Edison was the inventor.

The question itself betrays an underlying assumption that is flawed, and so there is no one correct answer. Instead, we have to confront the underlying assumption – that one person or entity mostly or entirely invented the lightbulb. Rather, creating the lightbulb was an iterative process with many people involved and no clear objective demarcation line. However, there was a sort-of demarcation line – the first marketable lightbulb. That is really what people are referring to with Edison – not that he invented the lightbulb but that he brought the concept over the finish line to a marketable product.  Edison sort-of did that, and he does deserve credit for the tweak he did develop at Menlo Park.

The real story of the lightbulb begins in 1802 with Humphry Davy. He developed an electric arc lamp by connecting Volta’s electric pile (basically a battery) to charcoal electrodes. The electrodes made a bright arc of light, but it burned too brightly for everyday use and burned out too quickly to be practical. But still, Davy gets credit as the first person to use electricity to generate light. Arc lamps of various designs were used for outdoor lighting, such as street lights and lighthouses, and for stage lighting until fairly recently.

In 1841 Frederick de Moleyns received the first patent for a light bulb – a glass bulb with a vacuum containing platinum filaments. The bulb worked, but the platinum was expensive. Further, the technology for making vacuums inside bulbs was still not efficient. The glass also had a tendency to blacken, reducing the light emitted over time. So we are not commercially viable yet, but all the elements of a modern incandescent bulb are already there. The technology for evacuating bulbs without disturbing the filaments improved over time. In 1865, German chemist Hermann Sprengel developed the mercury vacuum pump, which was soon adopted by lightbulb inventors.

Continue Reading »
