Archive for the 'Technology' Category

Nov 22 2021

The Efficiency of Data Storage

Published by under Technology

As our world becomes increasingly digital, math becomes more and more important (not that it wasn’t always important). Even in ancient times, math was a critical technology, improving our ability to predict the seasons, design buildings and roads, and run a functioning economy. In recent decades our world has become increasingly virtual, run by mathematical algorithms, simulations, and digital representations. We are building our world using methods driven by computers, and the clear trend in technology is toward a greater meshing of the virtual with the physical. One possible future destination of this trend is programmable matter, in which the physical world literally becomes a manifestation of a digital creation.

What this means is that even tiny incremental improvements in the efficiency of the underlying technology – computers – have increasingly powerful reverberations throughout our economy and our world. The nerds have truly inherited the Earth. This is why it is interesting science news that computer scientists at MIT have developed a tweak that may improve the efficiency with which computers store and retrieve data. William Kuszmaul and his team have demonstrated a way to improve what are known as linear probing hash tables. The underlying concept is interesting, at least for those curious about how this increasingly ubiquitous computer technology works.

Hash tables were developed in 1954 as a way for computers to store and locate data. When given a piece of data x to store, the computer calculates a hash function, h(x), which generates an essentially random location in a sequential data array (say, a number from 1 to 10,000). The computer then goes to that location in the array and stores the data there. If that location is already occupied, the computer probes forward until it finds an open slot and puts the data there. When searching for the data to retrieve, it does the same thing – it goes to the assigned location, and if the data is not there it probes forward until it finds it. If it encounters an open position first, it concludes the data is not in the table (for example, because it has been deleted).
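To make the mechanism concrete, here is a minimal sketch of classic linear probing in Python (an illustration only – the table size and hash function are arbitrary choices, and this is the textbook scheme, not the MIT team’s improved variant):

```python
class LinearProbingTable:
    """A bare-bones linear probing hash table (illustrative sketch)."""

    def __init__(self, size=10_000):
        self.size = size
        self.slots = [None] * size  # None marks an open slot

    def _home(self, key):
        # h(x): maps the key to an essentially random starting location
        return hash(key) % self.size

    def insert(self, key, value):
        i = self._home(key)
        for _ in range(self.size):
            # Use the assigned slot, or probe forward to the next open one
            if self.slots[i] is None or self.slots[i][0] == key:
                self.slots[i] = (key, value)
                return
            i = (i + 1) % self.size
        raise RuntimeError("table is full")

    def search(self, key):
        i = self._home(key)
        for _ in range(self.size):
            if self.slots[i] is None:
                return None  # open slot reached: key is not in the table
            if self.slots[i][0] == key:
                return self.slots[i][1]
            i = (i + 1) % self.size
        return None
```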

Continue Reading »

No responses yet

Nov 08 2021

Hypervelocity Dust Impacts

Published by under Astronomy,Technology

Space is an incredibly hostile environment, and the more we study it, the more we learn about the challenges of living and traveling there. Apart from the obvious near-vacuum and near-absolute-zero temperatures, space is full of harmful radiation. We live comfortably beneath a blanket of protective atmosphere and a magnetic shield, but in space we are exposed.

Traveling through space adds another element – not only would radiation be passing through us, but the faster our ship is traveling, the more stuff we would be plowing through. Space is not empty; it is full of gas and dust. In our own solar system, most of the dust is confined to the plane of the ecliptic, in what’s called the zodiacal cloud. But of course, if we are traveling from one planet to another, that is the plane we would be traveling in. At interplanetary velocities – assuming we want to get to our destination quickly (which we do, to minimize exposure to all that radiation) – our craft would be plowing through the zodiacal cloud.

We now have some measurements from the Parker Solar Probe regarding the effects of impacts with dust at high velocity. The Parker probe is the fastest human-made object, traveling at 180 kilometers per second. It is also the closest probe ever to the Sun, and the one able to operate at the highest temperature. To accomplish this it must keep its heat shield oriented toward the Sun. Meanwhile it is encountering thousands of dust particles – tiny grains between 2 and 20 microns in diameter (less than that standard measure of all things tiny, the width of a human hair). Dust grains are striking the probe at hypervelocity, greater than 10,800 km per hour. When they hit, they are instantly heated and vaporized, along with a small portion of the surface of the probe. The resulting cloud of debris is also hot enough to become ionized, turning into a plasma. Smaller grains are entirely vaporized in less than a thousandth of a second. Larger grains also give off a cloud of debris that expands away from the craft.
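A quick back-of-envelope calculation shows why even microscopic grains are so destructive at these speeds. This sketch assumes a silicate-like grain density of about 2,500 kg/m³ and takes the probe’s own speed as the relative impact speed (both my assumptions, not figures from the probe data):

```python
import math

def grain_impact_energy(diameter_um, speed_km_s=180.0, density=2500.0):
    """Kinetic energy (joules) of a spherical dust grain at impact."""
    r = diameter_um * 1e-6 / 2                 # radius in meters
    mass = density * (4 / 3) * math.pi * r**3  # assumed silicate density
    v = speed_km_s * 1e3                       # speed in m/s
    return 0.5 * mass * v**2

for d in (2, 20):
    print(f"{d} micron grain: ~{grain_impact_energy(d):.1e} J")
# A 20-micron grain delivers roughly a tenth of a joule into a
# microscopic spot, more than enough to vaporize and ionize the grain.
```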

The authors report that the effect of this is:

Some of the impactors encountered by Parker Solar Probe are relatively large, resulting in plasma plumes dense enough to (i) refract natural plasma waves away from the spacecraft, (ii) produce transient magnetic signatures, (iii) and drive plasma waves during plume expansion.  Further, some impacts liberate clouds of macroscopic spacecraft material which can result in electrostatic disturbances near the spacecraft that can linger for up to a minute, which is ~10,000 times longer than the transient plasma plume.

Continue Reading »

No responses yet

Nov 04 2021

Securing Data with the Laws of Physics

Published by under Technology

Data security sounds like a boring topic. However, it is quickly becoming one of the most important technologies in our modern world. Our data, communications, and transactions are increasingly digital, and they are all vulnerable to hacking. It’s estimated that hacking costs the world about $6 trillion per year as of 2021, a figure that is still rising. Slightly more than half of data breaches are due to hacking (the rest to some form of social engineering, like phishing). Cyberwarfare is now the new warfare between developed nations, and critical infrastructure may be vulnerable to hacks. Companies are under constant attack by ransomware. Individuals may have their digital identities stolen, losing their savings and having their lives disrupted.

About half of the problem is individual behavior, and this can be mitigated through education, company and governmental policies, and improved tools. But the other half is not due to any failure of personal behavior, but rather to straight-up hacking. This problem requires new technology to fix (in addition to institution-level responsibility to secure systems as much as possible). One aspect of hacking-resistance is authentication – you need a code to get into a system. This is the focus of a potential incremental advance in authentication systems, but let’s give some further background first.

Authentication involves a prover and a verifier. They might, for example, share a code, and the prover needs to provide the code to the verifier to confirm their identity. The inherent problem with this system is that the prover, by necessity, has to give up personal information (such as their code) during the verification process, and this is a point of attack for a hacker. To solve this problem, in the 1980s cryptographers developed so-called “zero-knowledge proofs”. The idea is that the prover can demonstrate they have the code without giving up the code itself, and so it remains secure.
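A classic concrete example is the Schnorr identification protocol. The sketch below uses toy numbers (real deployments use groups hundreds of bits wide, and this is a textbook illustration, not necessarily the scheme in the research discussed here) to show a prover demonstrating knowledge of a secret x without ever transmitting it:

```python
import secrets

# Toy Schnorr identification protocol: the prover convinces the verifier
# it knows the secret x behind y = g^x mod p without ever revealing x.

p, q, g = 23, 11, 4       # demo parameters: p = 2q + 1, g has order q
x = 7                     # the prover's secret "code"
y = pow(g, x, p)          # public key, known to the verifier

# One round of the interactive protocol:
r = secrets.randbelow(q)  # prover picks a random nonce...
t = pow(g, r, p)          # ...and sends the commitment t = g^r mod p
c = secrets.randbelow(q)  # verifier replies with a random challenge
s = (r + c * x) % q       # prover's response blends x with the nonce

# Verifier accepts iff g^s == t * y^c (mod p); x itself never appears
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("Proof accepted; the secret was never transmitted.")
```

An impostor who doesn’t know x can only pass a round by luck, so repeating the challenge quickly drives the odds of successful cheating toward zero.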

Continue Reading »

No responses yet

Oct 25 2021

A Synergistic Hydrogen Economy

Published by under Technology

I have been writing a lot about energy recently, partly because there is a lot of energy news, and also because I find it helpful to focus on a topic for a while to see how all the various pieces fit together. It is often easy to see how individual components of energy and other technology might operate in isolation, and more challenging to see how they will all work together as part of a complete system. But the system approach is critical. We can calculate the carbon and energy efficiency of installing solar panels on a home and easily understand how they would function. A completely different analysis is required to imagine solar power supplying 50% or more of our energy – now we have to consider things like grid function, grid storage, energy balancing, regulations for consumers and industry, sourcing raw materials, and disposal and recycling.

This is why just letting the market sort everything out will likely not result in optimal outcomes. Market decisions tend to be individual and short term. When EVs and solar panels are cost effective, everyone will want one. Demand is likely to outstrip supply. Supply chains could bottleneck. The grid won’t support rooftop solar beyond the early adopters. And where is everyone going to charge their EVs? At scale, widespread change in technology often requires new infrastructure and sometimes systems planning.

This can create an infrastructure dilemma – what future technology do you build for? You can take the “build it and they will come” approach, which assumes that infrastructure investment will affect and even determine the future direction of the market. California discovered the limits of this approach when it tried to bootstrap a hydrogen vehicle revolution by building a hydrogen infrastructure. Or you can backfill infrastructure as technology requires it, but this doesn’t quite work either – people won’t buy cars until there are roads, and no one wants to invest in roads until lots of people have cars. At some point you have to bet on future technology – just be flexible and willing to change course as technology evolves.

Continue Reading »

No responses yet

Oct 14 2021

Lack of Infrastructure Killed Early Electric Car

Published by under Technology

At the turn of the 20th century there were three relatively equal contenders for automobile technology: electric cars, steam power, and the internal combustion engine (ICE). It was not obvious at the time which technology would emerge dominant, or even whether they would all continue to share the market. By 1905, however, the ICE began to dominate, and by 1920 electric cars had fallen out of production. The last steam car company ended production in 1930, perhaps later than you might have guessed.

This provides an excellent historical case for debate over which factors ultimately determined the winner of this marketplace competition (right up there with VHS vs Betamax). We will never definitively know the answer – we can’t rerun history with different variables to see what happens. Also, the ICE won out the world over because the international industry consolidated around that choice, meaning that other countries were not truly independent experiments.

The debate comes down to internal vs external factors – the inherent attributes of each technology vs infrastructure. Each technology had its advantages and disadvantages. Steam engines worked just fine, and had the advantage of being flexible in terms of fuel. These were external combustion engines, as the combustion took place separately, outside the engine itself. But they also needed a boiler, which produced the steam to power the engine. Steam cars were more powerful than ICE cars, and also quieter and (depending on their configuration) produced less pollution. They had better torque characteristics, obviating the need for a transmission. The big disadvantage was that they needed water for the boiler, which required either a condenser or frequent topping off. They could also take a few minutes to get up to operating temperature, but this problem was solved in later models with a flash boiler.

Continue Reading »

No responses yet

Oct 04 2021

Incremental Advance for Quantum Computing

Published by under Technology

Quantum computing is an exciting technology with tremendous potential. However, at present that is exactly what it remains – a potential, without any current application. It’s actually a great example of the challenges of trying to predict the future. If quantum computing succeeds, the implications could be enormous. But at present there is no guarantee that quantum computing will become a reality, nor any way to know how long it might take. So if we try to imagine the world 50 or 100 years in the future, quantum computing is a huge variable we can’t really predict at this point.

The technology is moving forward, but significant hurdles remain. I suspect that for the next 2-3 decades the “coming quantum computer revolution” will be similar to the “coming hydrogen economy,” in that it never quite arrives. But the technology continues to progress, and it may yet come.

What is quantum computing? Here is the quick version – a quantum computer exploits the weird properties of quantum mechanics to perform computing operations. Instead of classical “bits,” where a unit of information is either a “1” or a “0”, a quantum bit (or qubit) is in a state of quantum superposition – a blend of 0 and 1 that only resolves to a definite value when measured. This means that a collection of qubits can encode vastly more information than the same number of classical bits, and the gap grows exponentially as they scale up in number. A theoretical quantum computer with one million qubits could perform operations in minutes that would take a universe full of classical supercomputers billions of years to perform (in other words, operations that are essentially impossible for classical computers). It’s no wonder that IBM, Google, China, and others are investing heavily in this technology.
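The exponential growth is easy to see if you try to describe a quantum state classically – a sketch (this simulates, rather than builds, a tiny quantum register):

```python
import numpy as np

def uniform_superposition(n_qubits):
    """State vector of n qubits in an equal superposition of all values."""
    dim = 2 ** n_qubits  # the description doubles with every added qubit
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

print(uniform_superposition(3))  # 3 qubits already need 8 amplitudes

for n in (10, 30, 300):
    print(f"{n} qubits -> 2**{n} = {float(2**n):.3g} amplitudes")
# 30 qubits take over a billion numbers to describe classically; 300
# qubits would need more numbers than there are atoms in the universe.
```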

But there are significant technological hurdles that remain. Quantum computer operations leverage quantum entanglement (where the physical properties of two particles are linked) among the qubits in order to get to the desired answer, but that answer is only probabilistic. In order to know that the quantum computer is working at all, researchers check the answers with a classical computer. Current quantum computers are running at about a 1% error rate. That sounds low, but for a computer it’s huge, essentially rendering the computer useless for any large calculations (the ones that quantum computers would be useful for).
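The arithmetic behind that judgment is unforgiving: if each operation succeeds with probability 0.99, the odds that an entire computation runs error-free collapse exponentially with its length. A quick illustration:

```python
# Chance that a computation of n operations completes with no errors,
# given a 1% error rate per operation.
for gates in (10, 100, 1_000, 10_000):
    p_ok = 0.99 ** gates
    print(f"{gates:>6} operations -> {p_ok:.3%} chance of no error")
# Past a few hundred operations the output is almost certainly corrupted.
```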

Continue Reading »

No responses yet

Sep 23 2021

Will We Respect a Robot’s Authority?

The robots are coming. Of course, they are already here, mostly in the manufacturing sector. Robots designed to function in the much softer and more chaotic environment of a home, however, are still in their infancy (mainly toys and vacuum cleaners). Slowly but surely, robots are spreading out of the factory and into places where they interact with humans. As part of this process, researchers are studying how people socially react to robots, and how robot behavior can be tweaked to optimize this interaction.

We know from prior research that people react to non-living things as if they are real people (technically, as if they have agency) if they act as if they have a mind of their own. Our brains sort the world into agents and objects, and this categorization seems to depend entirely on how something moves. Further, emotion can be conveyed with minimalistic cues. This is why cartoons and ventriloquist dummies work.

A humanoid robot that can speak and has basic facial expressions, therefore, is way more than enough to trigger in our brains the sense that it is a person. The fact that it may be plastic, metal, and glass does not seem to matter. But still, intellectually, we know it is a robot. Let’s further assume for now we are talking about robots with narrow AI only, no general AI or self-awareness. Cognitively the robot is a computer and nothing more. We can now ask a long list of questions about how people will interact with such robots, and how to optimize their behavior for their function.

Continue Reading »

No responses yet

Sep 21 2021

Virtual Phobia Treatment

Are you afraid of spiders? I mean, really afraid, to the point that you will alter your plans and your behavior in order to specifically reduce the chance of encountering one of these multi-legged creatures? Intense fears, or phobias, are fairly common, affecting 3-15% of the population. The technical definition of phobia (from the DSM-5) contains a number of criteria, but basically it is a fear or anxiety provoked by a specific object or situation that is persistent, unreasonable, and debilitating. In order to be considered a disorder:

“The fear, anxiety, or avoidance causes clinically significant distress or impairment in social, occupational, or other important areas of functioning.”

The most effective treatment for phobias is exposure therapy, which gradually exposes the person suffering from a phobia to the thing or situation that provokes fear and anxiety. This allows them to slowly build up a tolerance to the exposure (desensitization), to learn that their fears are unwarranted, and to reduce their anxiety. Exposure therapy works – reviews of the research show that it is effective and superior to other treatments, such as cognitive therapy alone.

But there can be practical limitations to exposure therapy, one of which is the inability to find an initial exposure scenario that the person suffering from a phobia will accept. For example, you may be so phobic of spiders that any exposure is unacceptable, and so there is no way to begin the process of exposure therapy. For this reason there has been a great deal of interest in using virtual/augmented reality for exposure therapy for phobias. A 2019 systematic review including nine studies found that VR exposure therapy was as effective as “in vivo” exposure therapy for agoraphobia (fearing situations, like crowds, that trigger panic) and specific phobias, but not quite as effective for social phobia.

Continue Reading »

No responses yet

Sep 16 2021

Another Fusion Breakthrough

Published by under Technology

About a month ago I wrote about a milestone achieved by the National Ignition Facility (NIF) which is using the inertial confinement method to achieve fusion of hydrogen into helium. Briefly, they achieved “burning plasma” where heat from the fusion provides energy for further fusion. They are only about 70% of the way to ignition, where the fusion is self-sustaining with only its own energy.

The NIF uses lasers to compress the hydrogen plasma to sufficient heat and density for fusion to occur. The other approach to achieving fusion is magnetic confinement, using powerful magnetic fields to squeeze the plasma to incredible density and heat, so that the hydrogen atoms are moving fast enough that occasionally two will collide with enough force to fuse. The magnetic confinement approach is all about the magnets – if we have magnets that are powerful and efficient enough, we can make fusion. It’s that simple. After decades of plasma research there are multiple labs around the world that can use magnetic confinement to get hydrogen to fuse. But we have yet to achieve “ignition”. And ignition is not the final goal, just one more milestone along the way. We need to go beyond ignition, to where the fusion process produces more energy than is needed to sustain itself, so that the excess can be siphoned off and used to make electricity for the grid. That’s the whole idea.

MIT’s Plasma Science and Fusion Center (PSFC), in collaboration with Commonwealth Fusion Systems (CFS), has its own magnetic confinement fusion experiment called SPARC (Soonest/Smallest Private-Funded Affordable Robust Compact). This is a demonstration reactor based on the tokamak design first developed by Soviet physicists – the plasma is confined in a doughnut shape with a “D”-shaped cross-section. Three years ago they determined that if they could build a magnet able to produce a 20 tesla magnetic field, the SPARC reactor would be able to produce excess fusion energy. It’s all about the magnets. The news is that they just achieved that very goal, on time despite the challenges of the intervening pandemic.
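Why does hitting 20 tesla matter so much? For a tokamak of fixed size and plasma pressure ratio, fusion power density scales roughly as the fourth power of the magnetic field strength. A rough sketch (the comparison field values are illustrative, not SPARC specifications):

```python
def relative_power_density(b_new, b_ref):
    """Tokamak fusion power density scales roughly as B**4 at fixed size."""
    return (b_new / b_ref) ** 4

for b_field in (5, 10, 20):
    gain = relative_power_density(b_field, 5)
    print(f"{b_field:>2} T -> {gain:5.0f}x the power density of a 5 T machine")
# Doubling the field buys ~16x the power density, which is why compact
# high-field reactors can aim for net energy at a fraction of the size.
```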

Continue Reading »

No responses yet

Sep 13 2021

Using Solar Power to Make Ammonia

Published by under Technology

Ammonia is the second most produced industrial chemical in the world. It is used for a variety of things, but mostly fertilizer:

About 80% of the ammonia produced by industry is used in agriculture as fertilizer. Ammonia is also used as a refrigerant gas, for purification of water supplies, and in the manufacture of plastics, explosives, textiles, pesticides, dyes and other chemicals.

In 2010 the world produced 157.3 million metric tons of ammonia. The process requires temperatures of 400-600 degrees Celsius and pressures of 100-200 atmospheres. This uses a lot of energy – about 1% of world energy production. The nitrogen is sourced from N2 in the atmosphere, and the high pressures and temperatures are necessary to break apart the N2 so that the nitrogen can combine with hydrogen to form NH3 (ammonia). The hydrogen is sourced from natural gas, using up about 5% of the world’s natural gas production. About half of all food production requires fertilizer made using this process (the Haber-Bosch process). This is not a process we can phase out easily or quickly.
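The scale of the hydrogen demand follows directly from the stoichiometry of the reaction (N2 + 3H2 → 2NH3). A back-of-envelope sketch using the production figure above:

```python
# Rough hydrogen requirement implied by world ammonia production.
M_NH3 = 17.03  # molar mass of ammonia, g/mol
M_H2 = 2.016   # molar mass of hydrogen, g/mol

ammonia_tonnes = 157.3e6  # world production, 2010

# N2 + 3 H2 -> 2 NH3 consumes 1.5 mol of H2 per mol of NH3 produced
h2_tonnes = ammonia_tonnes * (1.5 * M_H2) / M_NH3

print(f"~{h2_tonnes / 1e6:.0f} million tonnes of H2 per year")
# Roughly 28 million tonnes of hydrogen, today sourced almost entirely
# from natural gas - hence the ~5% share of world gas consumption.
```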

It does represent, however, a huge opportunity for increased efficiency, saving energy and natural gas, and reducing the massive carbon footprint of the entire process. We know it’s possible because bacteria do it. Some plants have a symbiotic relationship with certain soil bacteria. The plants provide nutrients and energy to the bacteria, and in turn the bacteria fix nitrogen from the atmosphere (at normal pressures and temperatures) and provide it to the plant. These plants (which include, for example, legumes) therefore do not require nitrogen fertilizer. In fact, they can be used to fix nitrogen and put it back into the soil.

Continue Reading »

No responses yet

Next »