Jan 13 2026

Is Donut Lab’s Solid State Battery Legit?

The tech world is buzzing with the claims of a startup battery company out of Finland called Donut Lab. They claim to have created the world’s first production solid-state battery. At first blush the claims are exciting, and seem in line with the promises we have been hearing about solid-state batteries for years. So it may seem that a company has finally cracked the technical issues with the technology and gotten a product across the finish line. But let’s take a closer look.

First let’s review their claims. The CEO claims that their battery has a specific energy of 400 watt-hours per kilogram. This is great, considering that current lithium-ion batteries in production are in the 175-250 Wh/kg range. The Amprius silicon-anode Li-ion battery has 370 Wh/kg, so 400 sounds plausibly incremental, but make no mistake, this would still be a huge breakthrough. Meanwhile the CEO also claims 100,000 charge-discharge cycles and an operating temperature range of -30°C to 100°C. In addition he claims his battery is cheaper than standard Li-ion, does not use any geopolitically sensitive raw materials, and is already in production (for motorcycles). Further, it can be fully recharged in 5 minutes, and is incredibly stable with no risk of catching fire.
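To see what a specific energy number means in practice, here is a quick back-of-the-envelope sketch. The pack capacity below is an illustrative example, not a figure from Donut Lab:

```python
# Back-of-the-envelope: how specific energy (Wh/kg) translates into
# battery pack mass for a given capacity. The 100 kWh pack is a
# hypothetical example, not a Donut Lab specification.

def pack_mass_kg(capacity_wh: float, specific_energy_wh_per_kg: float) -> float:
    """Cell-level mass needed to store capacity_wh at the given specific energy."""
    return capacity_wh / specific_energy_wh_per_kg

# A hypothetical 100 kWh EV pack at typical Li-ion vs the claimed 400 Wh/kg:
typical = pack_mass_kg(100_000, 250)   # 400.0 kg of cells
claimed = pack_mass_kg(100_000, 400)   # 250.0 kg of cells

print(f"At 250 Wh/kg: {typical:.0f} kg; at 400 Wh/kg: {claimed:.0f} kg")
```

In other words, the claimed specific energy alone would shave roughly 150 kg of cell mass off a large EV pack, before even considering the cycle-life and charging claims.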

As I have pointed out previously, battery technology is tricky because a useful EV battery needs a suite of features all at the same time, while reality often requires trade-offs. So you can get your high capacity, but at increased expense, for example (like the Amprius battery). Claiming to have every critical feature of an EV battery improve all at once is beyond a huge deal. That in itself starts to get into implausibility territory, but it’s not impossible. My reaction appears to be similar to that of most people in the tech world – show me the money. At CES, where Donut rolled out its battery claims, they did not, in short, do that.

Continue Reading »

Comments: 0

Jan 05 2026

Challenging the Acceleration of the Universe

Published under Astronomy

South Korean astronomers are challenging the notion that the universe’s expansion is accelerating, an observation made in the 1990s that led to the theory of dark energy. This challenge is currently very controversial, and may simply fizzle away, or it may change our understanding of the fate of the universe.

In the 1990s astronomers used data from Type Ia supernovae to determine the rate of the expansion of the universe. Type Ias are known as standard candles because they all put out essentially the same amount of light. The reason for this is the way they form. They are caused by white dwarfs in a double star system – the white dwarf can pull gas from its partner, and when that gas reaches a critical amount, the white dwarf’s gravity is sufficient to cause it to explode. Because the explosions occur at the same mass, the size of the explosion, and therefore its absolute brightness, is the same. If we know the absolute brightness of an object, and we can measure its apparent brightness, then we can calculate its distance.
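The relationship described above is the distance modulus. A minimal sketch of the calculation (the observed apparent magnitude used below is hypothetical, chosen only for illustration):

```python
# Distance modulus: m - M = 5 * log10(d / 10 pc), solved for distance d.
# Type Ia supernovae peak near absolute magnitude M ~ -19.3, which is
# what makes them useful as standard candles.

def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from apparent (m) and absolute (M) magnitude."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Sanity check: if m == M, the object sits at the 10 pc reference distance.
print(distance_parsecs(-19.3, -19.3))  # 10.0

# A hypothetical Type Ia observed at apparent magnitude 19.3:
d = distance_parsecs(19.3, -19.3)
print(f"{d / 1e6:.0f} Mpc")
```

The dimmer a supernova appears relative to its known absolute brightness, the farther away it must be – which is exactly the measurement at the heart of the acceleration result.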

The astronomers used data from many Type Ia supernovae to essentially map the expansion of the universe over time. Remember – when we look out into space we are also looking back in time. They found that the expansion was slower in the distant past than it is today – the universal expansion itself is accelerating over time. This discovery won them the 2011 Nobel Prize in Physics. The problem was, we did not know what force would cause such an acceleration, so astronomers hypothesized the existence of dark energy as a placeholder for whatever is pushing galaxies away from each other. This dark energy force would have to be significant – stronger, on cosmic scales, than the gravitational force pulling galaxies together.

Continue Reading »

Comments: 0

Dec 29 2025

Biological vs Artificial Consciousness

Definitely the most fascinating, and perhaps the most controversial, topic in neuroscience, and one of the most intense debates in all of science, is the ultimate nature of consciousness. What is consciousness, specifically, and what brain functions are responsible for it? Does consciousness require biology, and if not, what is the path to artificial consciousness? This is a debate that possibly cannot be fully resolved through empirical science alone (for reasons I have stated and will repeat here shortly). We also need philosophy, and an intense collaboration between philosophy and neuroscience, informing each other and building on each other.

A new paper hopes to push this discussion further – On biological and artificial consciousness: A case for biological computationalism. Before we delve into the paper, let’s set the stage a little bit. By consciousness we mean not only the state of being wakeful and conscious, but the subjective experience of our own existence and at least a portion of our cognitive state and function. We think, we feel things, we make decisions, and we experience our sensory inputs. This itself provokes many deep questions, the first of which is – why? Why do we experience our own existence? Philosopher David Chalmers asked an extremely provocative question – could a creature have evolved that is capable of all of the cognitive functions humans have but not experience their own existence (a creature he termed a philosophical zombie, or p-zombie)?

Part of the problem with this question is – how could we know whether an entity was experiencing its own existence? If a p-zombie could exist, then any artificial intelligence (AI), even one capable of duplicating human-level intelligence, could be a p-zombie. If so, what is the difference between the AI and biological consciousness? At this point we can only ask these questions; some of them may need to wait until we actually develop human-level AI.

Continue Reading »

Comments: 0

Dec 15 2025

Animals Adapting to Humans

As human civilization spreads into every corner of the world, human and animal territories are butting up against each other more intensely. This often doesn’t end well for the animals. This is also causing evolutionary pressures that are adapting some species to living in close proximity to humans.

Humans cause significant changes to the environment – we may, for example, clear forests in order to plant crops. We also convert a lot of land to human living spaces. We alter the ecosystem with lots of light pollution. We are also now warming the planet.

Humans also produce a lot of food, and along with it a lot of food waste. One of the common rules of evolution is that if a resource exists, something will adapt to exploit it. Perhaps the most versatile species in terms of adapting to human sources of food is the rat. Rats follow humans everywhere we go, and prosper in our shadow. New York City is experiencing this phenomenon first hand – there is basically no effective way to deal with the city’s rat problem as long as it has a waste problem. The city will need to significantly reduce the availability of food waste if it wants to make any dent in the rat population.

There is another way that humans provide a selective pressure on the animals that live close to us – we kill aggressive animals. A recent study shows this effect in a population of brown bears living in Italy, close to humans. This isolated population has become its own genetic subpopulation of brown bears with distinctive features, including a genetic profile associated with less aggressiveness. Make no mistake, these are still wild animals, and brown bears are dangerous animals. But they are less aggressive than other brown bears.

Continue Reading »

Comments: 0

Dec 11 2025

Mining Asteroids

We are not close to mining asteroids, but the idea is intriguing enough to cause some serious study of the potential. The idea is simple enough – our solar system is full of chunks of rock with valuable minerals. If we could make it economically viable to mine even a tiny percentage of these asteroids the potential would be immense, a game changer for many types of resources. How valuable are asteroids?

The range of potential value is extreme, but at the high end we have a large metal-rich asteroid like 16 Psyche in the asteroid belt. Astronomers estimate that the iron in 16 Psyche alone is worth about $10,000 quadrillion at today’s market prices. By comparison, the world’s current annual economic output is just over $100 trillion, so the asteroid’s iron is worth roughly 100,000 times the world’s annual economic output. Of course, the cost of extraction would be high and the market value would likely be dramatically affected by such a resource, but it shows the dramatic potential of mining asteroids. Some asteroids are rich in platinum-group metals or rare earths, which would be even more valuable. But even the more common carbonaceous asteroids would likely have minerals worth quadrillions.
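The arithmetic behind that comparison is straightforward; a quick sketch using the figures quoted above:

```python
# $10,000 quadrillion = 10,000 * 10^15 = 10^19 dollars of iron (estimated
# market value of 16 Psyche's iron, per the figures quoted above).
psyche_iron_value = 10_000 * 10**15

# World annual economic output: just over $100 trillion = 10^14 dollars.
world_annual_output = 100 * 10**12

# Ratio: how many years of global economic output the asteroid represents.
print(psyche_iron_value / world_annual_output)  # 100000.0
```

The point is not that anyone would realize $10^19 in revenue – dumping that much iron on the market would crash its price – but that the raw numbers dwarf any terrestrial resource comparison.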

Again, these figures are likely not the actual profit that would be realized from mining asteroids, but they indicate that it is very likely economically viable to do so. I am reminded of the fact that aluminum was more expensive than gold in the 19th century. Then a process for extracting and refining aluminum from its ore was found, and now it is worth about $1.30 a pound. Still, the aluminum industry is worth about $300 billion today. Mining asteroids would have a similar effect on many industries.

Continue Reading »

Comments: 0

Dec 08 2025

New Study on the COVID-19 mRNA Vaccines

A new study reinforces the evidence for the safety and efficacy of the mRNA COVID-19 vaccines. That’s the TLDR, but let’s dive into the details.

Medical evidence is always rolled out in stages. First there is what we would consider preclinical evidence, or basic science. This could be initial uncontrolled clinical observations, or mechanistic animal or in vitro research. At some point we have sufficient evidence to generate a hypothesis that a specific treatment could be effective in treating a specific disease – enough to progress to human research. For FDA-qualifying research, there are four specific phases. Phase I trials look at the safety of the intervention, usually in healthy volunteers, while also answering basic questions about mechanism and effects. If there are no safety red flags then the research progresses to a phase II trial, which looks for preliminary evidence of efficacy and further safety data. Again, if that data continues to look encouraging we can progress to a phase III trial, which is a larger and more rigorous trial designed to be definitive. Usually the FDA requires several phase III trials to grant approval of a drug for a specific indication. Then, once the drug is on the market, there are phase IV trials, which look at data from more widespread use to confirm safety and effectiveness in the real world.

Looked at another way, we do research in the lab, then on dozens of people, then scores to hundreds of people, then hundreds to thousands of people, and then finally on thousands to millions of people. Each step of the way we gain the ability to detect less and less common side effects in a broader set of people. Further, the types of evidence are designed to be complementary. Phase III trials, for example, are rigorously experimental, with highly defined populations and randomization to control as many variables as possible. Phase IV trials, on the other hand, are generally observational, designed to look at very large numbers of people in an uncontrolled setting – to determine how safe and effective the treatment is in real-world conditions.

Continue Reading »

Comments: 0

Dec 01 2025

Cognitive Legos

We have all likely had the experience that when we learn a task it becomes easier to learn a distinct but related task. Learning to cook one dish makes it easier to learn other dishes. Learning how to repair a radio helps you learn to repair other electronics. Even more abstractly – when you learn anything it can improve your ability to learn in general. This is partly because primate brains are very flexible – we can repurpose knowledge and skills to other areas. This is related to the fact that we are good at finding patterns and connections among disparate items. Language is also a good example of this – puns and witty linguistic humor are often based on making a connection between words in different contexts (I tried to tell a joke about chemistry, but there was no reaction).

Neuroscientists are always trying to understand what we call the “neuroanatomical correlates” of cognitive function – what part of the brain is responsible for specific tasks and abilities? There is no simple one-to-one correlation. I think the best current summary of how the brain is organized is that it is made of networks of modules. Modules are nodes in the brain that do specific processing, but they participate in multiple different networks or circuits, and may even have different functions in different networks. Networks can also be more or less widely distributed, with higher cognitive functions tending to involve more widely distributed networks than simple, specific tasks.

What, then, is happening in the brain when we exhibit this cognitive flexibility, repurposing elements of one learned task to help learn a new task? To address this question Princeton researchers looked at rhesus macaques. Specifically they wanted to know if primates engage in what is called “compositionality” – breaking down a task into specific components that can then be combined to perform the task. Those components can then be combined in new arrangements to compose a new task, like building with legos.

Continue Reading »

Comments: 0

Nov 25 2025

Is Climate Science “Post Normal” Science – Part II

Yesterday I started a response to this article, which seems to me to fit cleanly into a science-denial format. The author is making a lawyer’s case against the notion of climate change, using classic denialist strategies. Yesterday I focused on his denial that scientists can ever form a meaningful consensus about the evidence, and his conflation of consensus with the straw man that it is mere opinion, rather than being based on the totality of the evidence. Today I am going to focus on the notion of “post-normal” science. Macrae gives this summary of what post-normal science is:

“The conclusions of post-normal science aren’t ultimately based, then, on empirical data, with theories that can be rigorously tested and falsified, but on “quality as assessed by internal and extended peer communities,” i.e., “consensus,” i.e., informed guesses.”

This is another straw man. He is creating a false dichotomy here, based on his misunderstanding of science (he is a journalist, not a scientist). Yesterday I gave this summary of how science works:

“Science is not a simple matter of proof. There are many different kinds of evidence – observational, experimental, theoretical, and modeling (computer modeling, animal models, etc.). Scientific evidence can use deduction, induction, can start with observation or start with a hypothesis, can use theoretical constructs, can make observations about the past and make predictions about the future. All of these various activities are part of the regular operation of science. No one type of evidence is supreme or perfect – they all represent different tradeoffs. Scientific conclusions are always a matter of inference – scientists make the best inference they can to the most probable explanation given all of the available evidence. This always involves judgement, and some opinion. How are different kinds of evidence weighted when they appear to conflict?”

Continue Reading »

Comments: 0

Nov 24 2025

Is Climate Science “Post Normal” Science?

This article is from a year ago, but it was just sent to me as it is making the rounds in climate change denying circles. It is by Paul Macrae, who is an ex-journalist who now seems to be primarily engaged in climate change denial. The article (a chapter from his book on the subject) is full of the standard climate denial tropes – for the sake of space, I would like to focus on three specific points. The first is the claim that climate science is “settled”, the second is the notion of “post-normal science”, and the third is a factual claim about the accuracy of prior climate models.

Of course, if there is a consensus among climate scientists that global warming (I will get into more details on what this means) is “settled”, that makes it difficult, especially for a non-scientist, to question the conclusion. So, order of business number one – deny that there is a consensus, deny that consensus is even a thing in science, and deny that science can ever be settled. I don’t expect that I will ever be able to slay this dragon, it is simply too useful rhetorically, but for those who are open to argument, here is my analysis.

First – consensus is absolutely a thing in the regular operations of science. A consensus can be built in a number of ways, but often panels of recognized world experts are assembled to review all existing scientific data and make a consensus statement about what the data shows. This is often done when there is a policy or practice question. For example, in medicine, practitioners need to know how to practice, and these consensus statements are used as practice guidelines. They also set the standard of care, so as a practitioner you should definitely be aware of them and not violate them unless you have a good reason. Obviously, the question of global warming is a serious policy question, and so providing scientific guidance to policy makers is the point, such as with the IPCC. Consensus is also used to set research and funding priorities, to establish terminology, and resolve controversies. But to be clear – these mechanisms of consensus do not determine what the science says. That is determined by the actual science. The point is to provide clarity regarding complex scientific evidence, especially when a practice or policy is at issue.

Continue Reading »

Comments: 0

Nov 17 2025

The Future of the Mind

I am currently in Dubai at the Future Forum conference, and later today I am on a panel about the future of the mind with two other neuroscientists. I expect the conversation to be dynamic, but here is the core of what I want to say.

As I have been covering here over the years in bits and pieces, there seem to be several technologies converging on at least one critical component of research into consciousness and sentience. The first is the ability to image the functioning of the brain, in addition to the anatomy, in real time. We have functional MRI scanning, PET, and EEG mapping, which enable us to see cerebral blood flow, metabolism, and electrical activity. This allows researchers to ask questions such as: what parts of the brain light up when a subject is experiencing something or performing a specific task? The data is relatively low resolution (compared to the neuronal level of activity) and noisy, but we can pull meaningful patterns from it to build our models of how the brain works.

The second technology which is having a significant impact on neuroscience research is computer technology, including but not limited to AI. All the technologies I listed above are dependent on computing, and as the software improves, so does the resulting imaging. AI is now also helping us make sense of the noisy data. But the computing technology flows in the other direction as well – we can use our knowledge of the brain to help us design computer circuits, whether in neural networks or even just virtually in software. This creates a feedback loop whereby we use computers to understand the brain, and the resulting neuroscience to build better computers.

Continue Reading »

Comments: 0

Next »