Nov 08 2013

Emerging Technologies

Most Fridays I submit a blog post to Swift, the official blog of the JREF. The article I submitted this morning is about a new study demonstrating a brain-machine interface (BMI) that allows a rhesus monkey to control two robotic arms at the same time. This is a technology I have been following here at NeuroLogica, blogging about it whenever a cool breakthrough is made.

The topic simultaneously touches on several areas that I find fascinating – neuroscience, computer technology, virtual reality, and predicting future technology. I make the point, as I often do, that predicting future technology has a terrible track record, with the only reasonable conclusion being that it is very difficult.

It’s fun to look back at past future predictions and see what people generally got right and what they got wrong, and then see if we can learn any general lessons that we can apply to predicting future technology.

Major Hurdles

For example, we are not all flying around with jetpacks or taking our flying cars to work. This has become, in fact, a cliché of failed future technologies. I think the lesson here is that both of these technologies suffer from a major hurdle – fuel is heavy, and if you have to carry your fuel around with you it quickly becomes prohibitive. There just doesn’t seem to be any way to overcome this limitation with chemical fuel or batteries.

In other words, whenever the viability of a technology depends upon making a major breakthrough that changes the game with respect to some major limitation imposed by the laws of physics, you cannot count on that technology succeeding in the short to medium term. Long term – all bets are off.

The coming hydrogen economy is another example. It turns out that safely and efficiently storing large amounts of hydrogen for convenient release is a non-trivial technical problem that will not be solved as a matter of course.

Incremental Advance

By contrast, even in the 1980s, but certainly by the early 1990s, the promise of the coming internet was in the air. I remember reading fiction and popular science articles, and talking about how the world would change when information became digital and ubiquitous. No one predicted eBay and Twitter specifically, but certainly online commerce and communication were anticipated.

The difference here is that computer and electronic technologies had a proven track record of continuous incremental improvement, and that was all that was necessary for the dreams of the internet to become reality. You can extrapolate incremental progress much more reliably than massive breakthroughs.
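The arithmetic behind this point is worth making concrete: an established trend of steady doubling (the Moore's-law pattern in computing) compounds dramatically, which is why extrapolating it was a safer bet than counting on a single physics breakthrough. A minimal sketch, assuming a hypothetical capability that doubles every two years:

```python
# Toy illustration of extrapolating incremental progress.
# The doubling period is an assumption for illustration (in the
# spirit of Moore's law), not a figure from the post.

def extrapolate(start, years, doubling_period=2):
    """Project a quantity that doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Steady doubling every 2 years yields a 32x improvement in a decade,
# and roughly a 1000x improvement in twenty years.
print(extrapolate(1, 10))  # 32.0
print(extrapolate(1, 20))  # 1024.0
```

No comparable calculation exists for a breakthrough technology: until the enabling discovery happens, the growth rate is effectively zero, and there is nothing to extrapolate.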

Not So Fast

Smartphones, also anticipated for decades, are now a reality. The additional lesson here is that sometimes it takes longer than we predict for a technology to mature. I remember people desperately trying to make use of early portable computing devices in the 1990s (like the Newton and other PDAs). I was there, using my PDA, but the functionality was just not sufficient to make it easier than a paper notebook. I’m sure some people made it work for them, but widespread adoption was just not happening.

Now, 20 years later, smartphones have finally achieved the promise of portable personal computing devices. People use smartphones not only for communication, but to quickly look up information, to update their Twitter feed, to listen to music and podcasts, as still and video cameras, and as portable GPS devices. They are still rapidly increasing in power and utility, but they have definitely passed the bar of general adoption.

In the PDA era, carrying around a small computer was not that useful. It took the development of other applications to really make the technology useful, such as GPS, the internet, MP3s, and miniaturized cameras.

Yes, But What Is It Good For?

Perhaps the most difficult prediction involves how a new technology will be used. Microwaves were developed for cooking. It turns out, they are terrible tools for cooking. The technology might have completely died on the vine, except it turns out they are really convenient for heating food – defrosting, rewarming, and, of course, making popcorn. They quickly became indispensable.

Segways were supposed to change the way people move about a city. They utterly failed in this goal. However, they enjoy a niche for security guards to move around malls and airports.

This is, in my opinion, the trickiest part of predicting future technology adoption. Even when the technology itself is viable, it’s hard to predict how millions of people will react to the technology. Why are we not all using video-phones all the time? In the 1980s I would have sworn they would be in wide adoption as soon as the technology was available. Now I could, if I chose, make every phone call a video call, but I choose not to. For most calls, it’s just not worth it. I’d rather not have to look into a camera and worry about what I am doing.

Likewise, who would have thought that people would prefer texting to talking on the phone? That was a real shocker to me.

Sometimes the adoption of a specific technology depends upon someone finding a good use for it. The technology itself may be viable, but utilization really determines whether or not it will be adopted. There is no substitute for the real-world experiment of millions of people getting their hands on a technology or application and seeing if they like it.

The Future

With all this in mind, what are the technologies that I think are likely to have a huge impact on our future? This is a huge topic, and maybe I’ll dedicate a future blog post to exploring this further, but let me name some that come to mind.

Carbon nanotubes and graphene are the plastics and the semiconductors of the 21st century rolled into one. These materials are strong and have interesting and changeable conductive properties that make them potentially usable in small, energy-efficient, and flexible electronics. The major limitation right now is mass producing carbon nanofibers in long lengths and large amounts efficiently and with sufficient quality. This seems to be an area of steady progress, however.

This may seem like an easy one, but stem cells clearly have tremendous potential. However, I would have to file this one under – major breakthrough still necessary in order to achieve the full potential of stem cell technology. I also think this is one that will mature 2-3 decades later than popularly anticipated. Maybe by the middle of the 21st century we will begin to see the promise of growing or regenerating organs, reversing degenerative diseases, and healing major damage and disease with stem cells.

And to bring the article back around to the original topic – brain-machine interfaces in all their manifestations. The ability to affect brain function with electricity, and the ability for the brain to communicate with external devices (going in both directions – sensory input and motor or other output) mediated by a computer chip, has massive implications.

On the one hand, this is a new paradigm in treating the brain by altering its function. Right now the major medical intervention for brain function is pharmacological, but this approach has inherent limits. The brain is not only a chemical organ, however; it is also an electrical organ, and increasingly we are seeing electrical devices, such as deep brain stimulation, used to treat neurological diseases.

Beyond that, the ability to interface a brain and a computer essentially brings neuroscience into the computer age, which further means that applications will benefit from the continued incremental advance of computer technology. It may take a few more decades than we hope or anticipate, but we can now clearly see the day when paralyzed patients can control robotic legs or arms through a BMI, when we can enter a virtual world and not only control but actually mentally occupy an avatar, and when people can control anything technological in their environment through thought alone.

In short, it has been demonstrated that it is possible for humans to merge with their machines. I know this sounds like hyperbole and science fiction, but the science is pretty solid if immature.

This technology is coming. What remains to be seen is what applications will develop, and how will people react.
