Aug 24 2023

Should Japan Release Radioactive Water Into The Pacific?

Japan is planning on releasing treated radioactive water from the Fukushima nuclear accident into the ocean. Officials claim this will be completely safe, but there are protests going on in both Japan and South Korea, and China has just placed a ban on seafood from Japan. In a perfect world we would just have a calm and transparent discussion about the relevant scientific facts, make a reasonable decision, and go forward without any drama. That, of course, is not the world we live in. But let's pretend it is – what are the relevant facts?

In 2011 a tsunami (and poor safety decisions) caused several reactors at the Fukushima Daiichi nuclear power plant to melt down. These reactors were flooded with water to cool them, but heat from continued radioactive decay means they need to be cooled continuously. The water used has become contaminated with 64 different radioactive isotopes. In the past 12 years 350 million gallons of contaminated water have been stored in over 1,000 tanks on site, but they are simply running out of room, which is why there is urgency to do something with the stored contaminated water. How unsafe is this water?

Over the last 12 years the short half-life isotopes have lost most of their radioactivity, but some long half-life isotopes remain. This is good because the shorter the half-life, the more intense the radioactivity per unit mass, by definition. Really long half-life isotopes, like carbon-14 (half-life about 5,700 years), have much lower intensity. Also, the contaminated water has been treated with several processes, such as filtration and sedimentation. Most of the remaining radioactive isotopes have been removed (to levels below acceptable limits) by this process, although carbon-14 and tritium remain. How much radioactivity is left in this contaminated but treated water? That is the key question.
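To make that inverse relationship concrete, here is a quick back-of-the-envelope calculation of specific activity (decays per second per gram of a pure isotope). This is my own illustrative sketch – the half-lives and molar masses are standard reference values, not figures from the article:

```python
import math

AVOGADRO = 6.022e23       # atoms per mole
SECONDS_PER_YEAR = 3.156e7

def specific_activity(half_life_years, molar_mass_g):
    """Becquerels (decays per second) per gram of a pure isotope: A = lambda * N."""
    decay_constant = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    atoms_per_gram = AVOGADRO / molar_mass_g
    return decay_constant * atoms_per_gram

tritium = specific_activity(12.3, 3.02)    # half-life ~12.3 years
carbon14 = specific_activity(5700, 14.0)   # half-life ~5,700 years

# The shorter-lived tritium is roughly 2,000x more intense per gram
print(f"tritium: {tritium:.2e} Bq/g, carbon-14: {carbon14:.2e} Bq/g")
```

The same mass of tritium is over three orders of magnitude closer in activity to a short-lived isotope than carbon-14 is, which is why dilution standards for the two differ so much.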


Aug 22 2023

For Movies – Animals Don’t Sound Real Enough

What does a majestic eagle sound like, or the hoot of a spider monkey, or the roar of a bear? Unless you have an interest in movie tropes, or listen regularly to the SGU, you may have a complete misconception about the sounds these and many other animals make. Eagles, for example, do not make that cool-sounding screech that is almost always paired with a video of an eagle. That is the sound of a red-tailed hawk, which has become the standard sound movies use for any raptor. Eagles make a high-pitched chirping sound. If you have seen a bear roar in a movie, chances are the sound you heard was that of a tiger. All primates hoot like a chimp, all frogs “ribbit” like that one species whose range includes Hollywood.

The same is true of soundscapes. If there is a scene of a jungle, then you will hear a classic jungle soundscape, even if it includes animals from a different continent. If you are in a more foreboding or swampy area, you will hear a loon – it doesn't matter where the actual location is supposed to be.

I understand why this is the case. Modern moviemaking is, in part, an agreed-upon cultural language. The writer/director/costumer/set-dresser/editor/music director are all communicating to the audience. They are trying to efficiently create a mood, or convey a situation, or signal to the audience something about a location or a character. There are ways to do this using a pre-existing movie language. If someone just came back from grocery shopping, the paper bags they are carrying will have a baguette sticking out the top, and/or carrot tops. This is not because, statistically, that is what a grocery bag is likely to contain, but because it instantly lets the audience know what they are.

Similarly, with animals sounds, a majestic animal must sound majestic. A large predator must roar like a large predator. The problem for movie makers is that often reality does not sound real enough. It doesn’t convey the emotion or danger that a scene might require.


Aug 21 2023

Gradient Nanostructured Steel

Science fiction writers, who have to think deeply about the possible nature of future technology, often invent new sci-fi materials in order to make their future technology seem plausible. They seem to understand the critical role that material science plays in advancing technology. This is why sci-fi is full of fictional materials such as unobtainium, vibranium, adamantium, and carbonite (to name just some of the most famous ones). New materials change the limits of what’s possible. There is only so much that technology can do within the limits of existing materials.

In fact, the early stages of human technology are defined by the materials used, from the stone age to the iron age. Today we live in the steel age, more than 3,000 years after steel production came into existence. There are many advanced materials with different applications, but in many ways steel still defines the limits of our technology. This is why research looking for ways to improve the characteristics of modern steel is still going on. A recent study might point the way to one method of pushing the limits of steel.

Steel is simply an alloy of iron combined with a small percentage of carbon. Carbon atoms bind with the iron atoms to make crystals of steel that are harder and stronger than iron by itself. The properties of steel can be adjusted by changing the percentage of carbon in the alloy, and altered further by alloying in other elements as well: molybdenum, manganese, nickel, chromium, vanadium, silicon, and boron, for example. These can make the steel stronger, tougher, more ductile, heat resistant, or rust resistant.


Aug 18 2023

Localizing Hidden Consciousness

What’s going on in the minds of people who appear to be comatose? This has been an enduring neurological question from the beginning of neurology as a discipline. Recent technological advances have completely changed the game in terms of evaluating comatose patients, and now a recent study takes our understanding one step further by teasing apart how different systems in the brain contribute to conscious responsiveness.

This has been a story I have been following closely for years, both as a practicing neurologist and as a science communicator. For background, when evaluating patients who have a reduced ability to execute aspects of the neurological exam, there is an important question to address in terms of interpretation – are they unable to perform a given task because of a focal deficit directly affecting that task, are they generally cognitively impaired, do they have a decreased level of conscious awareness, or are other focal deficits getting in the way of carrying out the task? For example, if I ask a patient to raise their right arm and they don't, is that because they have right arm weakness, because they are not awake enough to process the command, or because they are deaf? Perhaps they have a frozen shoulder, or they are just tired of being examined. We have to be careful in interpreting a failure to respond or carry out a requested action.

One way to deal with this uncertainty is to do a thorough exam. The more different types of examination you do, the better you are able to put each piece into the overall context. But this approach has its limits, especially when dealing with patients who have a severe impairment of consciousness, which gets us to the context of this latest study. For further background, there are different levels of impaired consciousness, but we are talking here about two in particular. A persistent vegetative state is defined as an impairment of consciousness in which the person has zero ability to respond to or interact with their environment. If there is any flicker of responsiveness, then we have to upgrade them to a minimally conscious state. The diagnosis of persistent vegetative state, therefore, is partly based on demonstrating the absence of a finding, which means it is only as reliable as the thoroughness with which one has looked. This is why coma specialists will often do an enhanced neurological exam, looking really closely and for a long time for any sign of responsiveness. Doing this picks up a percentage of patients who would otherwise have been diagnosed as persistent vegetative.


Aug 15 2023

A Lifecycle Analysis of Electric Vehicles

This article is part of my informal series on EVs, sorting through the claims, reality, and propaganda. There are many complicated factors to sort through, but overall, in my opinion, most concerns about EVs are outdated or overblown. There are definitely locations and use scenarios that still favor ICE (internal combustion engine) vehicles (or at least hybrids) for now, but battery and EV technology is still on the steep part of the curve, and infrastructure is being built. The percentage of the population for whom EVs make sense will slowly expand, until it's the best choice for 95% or so. Over this same time period (about 20 years) we should also be decarbonizing our energy production. We can also be reducing dependence on individual cars by building mass transit, making more walkable living spaces, and eventually developing cars as a service (with self-driving cars). For trains and long-haul trucking, hydrogen may eventually be the best bet. For short-haul flights, electric planes are increasingly plausible, while long-distance jet travel will need biofuels to keep its carbon footprint down.

After my recent articles on EVs, and companion discussions on the SGU, one of the questions that has been raised that I want to dive deeper into is this – are EVs still better than ICE vehicles even when we consider everything that goes into vehicle production? Spoiler – I think the consensus is that yes, EVs are still better. They have a lower total carbon footprint over the lifetime of use than ICE vehicles. But this is not a universal opinion. For example, I was pointed to this podcast in which the businessman and physicist host claims: "Pushing an all-EV world is likely to increase CO2 emissions."


Aug 10 2023

The Alzheimer’s Revolution

Decades of complex research, and perseverance through repeated disappointment, appear to be finally paying off for the diagnosis and treatment of Alzheimer's disease (AD). In 2021 Aduhelm became the first drug approved by the FDA (granted contingent accelerated approval) that is potentially disease-modifying in AD. This year two additional drugs received FDA approval. All three are monoclonal antibodies that target amyloid protein. Each seems to have an overall modest clinical effect, but they are the first drugs to actually slow down progression of AD, which represents important confirmation of the amyloid hypothesis. Until these drugs, attempts at slowing the disease by targeting amyloid had failed.

Three drugs in as many years is no coincidence – this is the result of decades of research into a very complex disease, combined with monoclonal antibody technology coming into its own as a therapeutic option. AD is a form of dementia, a chronic degenerative disease of the brain that causes the slow loss of cognitive function and memory over years. There are over 6 million people in the US alone with AD, and it represents a massive health care burden. More than 10% of the population over 65 have AD.

Experts attribute the rapid crossing of the threshold to detectable clinical effect to two main factors – treating people earlier in the disease, and giving more aggressive treatment (essentially pushing dosing to a higher level). The higher dosing comes with a downside of significant side effects, including brain swelling and bleeding. But that is what it took to show even a modest clinical benefit. The fact that three drugs, which target different aspects of amyloid protein, show promising or demonstrated clinical benefit helps confirm that amyloid protein, and the plaques it forms in the brain, are to some extent driving AD. They are not just a marker for brain cell damage; they are at least partly responsible for that damage. Until now, this was not clear.


Aug 04 2023

Using Cement for Energy Storage

Imagine if every house, every building, came with 1-2 days (or possibly more) of energy storage. What if every wind turbine could store a day's worth of the energy it produces on average? How beneficial would it be if the most common building material in the world could be used to store energy? This prospect is not far-fetched, and a new study by MIT scientists provides a proof of concept.

The material is cement, and what these researchers have demonstrated is that they can turn cement into a supercapacitor with another common material – carbon black. A capacitor is a device in which two conductive plates are separated by an insulating layer. This allows positive charge to build up on one plate and negative charge on the other, with the separated charges storing electrical energy. Capacitors have several advantages as energy storage devices – they can store and discharge energy very quickly, they are relatively simple to construct, and they can endure almost endless charge and discharge cycles. But they also have one major disadvantage: they store very little energy per unit volume or mass.
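The energy a capacitor holds follows directly from its capacitance and voltage, E = ½CV², which makes the energy-density problem easy to see. A quick illustrative calculation – the 3,000 F at 2.7 V rating is a typical commercial supercapacitor, my own example rather than a figure from the study:

```python
def capacitor_energy_wh(capacitance_farads, voltage):
    """Energy stored in a capacitor, E = 1/2 * C * V^2, converted from joules to watt-hours."""
    joules = 0.5 * capacitance_farads * voltage ** 2
    return joules / 3600.0

# Even a sizeable 3000 F supercapacitor rated at 2.7 V stores only a few watt-hours
e = capacitor_energy_wh(3000, 2.7)
print(f"{e:.2f} Wh")
```

Three watt-hours is roughly a fifth of a phone battery, from a device the size of a soda can – which is why capacitors supplement batteries rather than replace them.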

As I discussed previously, current lithium ion batteries have energy densities up to 265 Wh/kg. The new Amprius lithium ion batteries with silicon anodes have an energy density of 500 Wh/kg. A typical supercapacitor (just a high energy density capacitor) has an energy density of 16 Wh/kg. My previous article was about a carbon nanofiber supercapacitor with energy density up to 73 Wh/kg – very high for a supercapacitor, but still tiny compared to cutting edge lithium ion batteries.
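To put those energy densities in perspective, here is a rough sketch of how much mass each technology would need in order to store a day of household electricity – the 30 kWh daily figure is my own assumption, and the densities are the ones quoted above:

```python
ENERGY_KWH = 30  # assumed daily household consumption

# Energy densities in Wh/kg, from the figures quoted above
densities = {
    "Li-ion (current best)": 265,
    "Amprius silicon-anode Li-ion": 500,
    "typical supercapacitor": 16,
    "carbon nanofiber supercapacitor": 73,
}

for name, wh_per_kg in densities.items():
    kg = ENERGY_KWH * 1000 / wh_per_kg
    print(f"{name}: {kg:,.0f} kg")
```

A typical supercapacitor needs almost two tonnes of material for what a battery does in about a hundred kilograms – but if that mass is the foundation of the house anyway, the penalty disappears.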

For an application like a car, a supercapacitor simply holds too little energy. But they could supplement the battery, and would increase the efficiency of regenerative braking. Future EVs might combine a battery and a supercapacitor to get the best of both worlds. Also, if the body of the car itself could be constructed out of a supercapacitor, that could add range and efficiency without adding any weight to the car.


Aug 03 2023

New Whale Fossil – Possibly Heaviest Animal Ever


The largest and heaviest animal ever to live on the Earth, as far as we know, is the blue whale, which is extant today. The blue whale is larger than any dinosaur, even the giant sauropods. The average weight of a blue whale is 160 tons, with the largest specimen weighing 190 tons and measuring 110 feet (33.58 m) long. The largest sauropod, Argentinosaurus, weighed up to 110 tons. The reason the largest whales are bigger than the largest dinosaurs is simple – whales swim in the ocean, so they have buoyancy to help carry their incredible heft. The ancestors of whales were land mammals of modest size. It was only when they adapted to the water that they grew very large, and the age of gigantism among whales started about 4.5 million years ago.

At least that is what we thought from existing evidence. That is one of the interesting things about paleontology – a single specimen can upend our phylogenetic charts, the history of what evolved into what and when. Essentially we have scattered puzzle pieces that we try to fit together into a branching tree of evolutionary relationships. One specimen that fits outside of the branches of this tree forces scientists to redraw some of the lines, or add new ones.

That is what has happened with a new extinct whale species, discovered in Peru in 2010 but only recently described in detail. The species is appropriately named Perucetus colossus, and it is a whopper. Scientists estimate its weight at 85 to 320 tonnes, depending on assumptions about soft tissue like organs and blubber. If we take the middle of that range, 180 tonnes, that puts it at the upper range for blue whales. If we assume this is an average specimen (statistically likely but not guaranteed), then its size range may exceed that of the blue whale. Perucetus is not, however, longer than the blue whale; it's a little shorter. But its bones are a lot heavier – they are denser and overgrown, which is an adaptation found in other shallow-water mammals. It's the heavy bones that make it potentially heavier than the blue whale, and regardless, this species has the heaviest skeleton known.


Jul 31 2023

The Superconductor Flap of 2023

If you are at all interested in science and technology news, you have probably heard that a team from South Korea claims to have developed a material that is a superconductor at room temperature and ambient pressure. Interestingly, if you are someone who does not follow such news, you probably haven't heard about it at all. As is often the case, I am as interested in how certain science news gets received and reported as in the news itself, and this is an interesting case.

First, the claim being made here is beyond massive. If true (and that’s still a big if) this is the biggest science news so far this century. I would rank it above even CRISPR. This is a technological “holy grail” if ever there were one. A superconductor is a material that conducts electricity without resistance, so there is no energy loss or waste heat produced. It doesn’t take much of an imagination to figure out how useful this would be. We dedicated an entire chapter to this idea in our recent book on future technology. We have a massively and increasingly electrified civilization, and a practical superconducting material would benefit almost every aspect of it. It also makes some extreme technologies more plausible, such as fusion power. We (the world) are about to (hopefully) invest billions if not trillions into upgrading our electrical power grids, and this is the material we would use if these claims are true.

We already have superconductors. You might remember back in the 1980s when scientists discovered the first "high temperature" superconducting class of materials. This was only relatively high temperature – raising the highest critical temperature (the temperature below which a substance is superconducting) above 77 K, the boiling point of liquid nitrogen. While still extremely cold, cooling with liquid nitrogen is much cheaper and more practical than cooling with liquid hydrogen or helium. Since then other classes of materials have been found with critical temperatures as high as 250 K, but these require extremely high pressures. They are basically not practical for anything, and are only useful for superconductivity research.

Here is the new paper – it's a preprint, which means it has not been peer-reviewed. I think that's one of the reasons the news isn't headlined everywhere. The researchers claim to have produced what is essentially a copper-doped lead apatite compound that is a superconductor at room temperature – actually up to 127°C (261°F) – at ambient pressure. Further, this is not some brittle ceramic; it's a ductile metal. They report:


Jul 27 2023

More On Electric Vehicles

I recently wrote about electric vehicles, which sparked a lively discussion in the comments. There was enough discussion that I wanted to pull my responses together into a new post. Before I get to the details, some general observations. The conversation, in my opinion, nicely demonstrates a couple of general critical thinking principles. The first is that basically well-meaning people (meaning they are not paid shills) can look at essentially the same collection of facts and come to different opinions. This relates partly to another post I wrote recently, about how we can subjectively define "true" in order to support pre-existing narratives.

The other principle on clear display in the comments is our old friend confirmation bias (this cuts in all directions, although not necessarily symmetrically). We tend to seek out, accept, and remember bits of information that seem to support what we already believe or want to believe, while finding reasons to dismiss or ignore information that contradicts our narrative. The result is a powerful illusion of knowledge, that what we feel in our guts (or aligns with our ideology) is objectively and obviously true. Therefore, those who disagree with us must be suffering some catastrophic personal failing.

There are also external factors at play, because we are not living in a neutral or disinterested information ecosystem. Not only are we biased in how we gather facts, information is being curated for us with the specific purpose of influencing what we believe to be true. This is also a self-reinforcing phenomenon, because acceptance of curated information leads us to increasingly curated and extreme sources of information, sometimes leading to the infamous “information bubble”.

