Nov 28 2023

The Threat of Technology

In my second book (shameless plug alert) – The Skeptics’ Guide to the Future – my coauthors and I try to imagine both the utopian and dystopian versions of the future, brought about by technology, either individually or collectively. This topic has come up multiple times recently on this blog when discussing technology and trust in science and scientists, so I thought it deserved its own discussion.

The overarching point is that science and technology should not be thought of as a pure, objective good, but rather as tools, and tools can be used for good or evil. I admit I am a science enthusiast and a technophile, and a bit of an optimist, so when I hear about a new discovery or technology my first thoughts go to all the ways it might make life better, or at least cooler. I have to remember to consider all the ways in which the technology can also be abused or exploited, which is why we explicitly did this in our futurism book.

So far, on balance, I think science and technology have been an incredible plus for humanity. For most of human existence life was “nasty, brutish, and short.” Science has given us a greater perspective on ourselves and the universe, freeing us from ignorance and superstition. And technology has given us the power to extend our lives, improve our health, and control our environment. It enables us to peer deep into the universe, and to see for the first time a microscopic world that was always there but that we had no idea existed. It enables us to travel beyond the confines of our planet, and eventually (if we survive) will enable us to become a multi-world, and even multi-system, species.

I do think we have lost touch with how bad life was prior to modern technology. Our period movies, for example, are highly romanticized. A brutally accurate portrayal of life prior to the industrial revolution would show people with horrible dentition, ravaged by disease, and living mostly in drudgery. Most people never saw the world beyond their small village. We get a hint of this sometimes, but never the full reality.

But at the same time, science and technology have brought us the threat of nuclear annihilation. They now threaten all life on Earth through global climate change. Humans are in the process of engineering a sixth mass extinction. The same science and technology that allows us to cure disease could also unleash an engineered pandemic. Technology puts incredible destructive power in the hands of individual terrorists or those who are mentally unwell.

Many technologies have a dark side. Modern transportation makes us much more mobile and cosmopolitan, but it poisons the environment and kills thousands of people every year. Social media unleashed greater communication and connection, but it also unleashed psychopaths on the world and led to incredible social stress. Computer technology has made us much more productive, with greater access to information, but it also enables the surveillance state.

And we are looking at near-future advances in technology that could be profound and transformative. Artificial Intelligence is already increasing the pace of research and proving to be a powerful tool for efficiency and innovation, but it also has the potential to eradicate privacy and liberty. Genetic engineering challenges our notions of what it means to be human. Manufacturing technology is putting greater and greater power in the hands of individual people. One particularly nasty computer virus can crash economies.

What does all this mean? What is the optimal stance toward science and technology? I am not someone who thinks the solution is to push back against advancing science and technology. That is a hopeless game anyway. The genie is out of the bottle, and there are 195 recognized self-governing nations in the world with over 8 billion people. This does not mean we should not think strategically about which research and development we support and fund. I just don’t think that banning new technologies as they emerge is the way to go.

Rather, we need to recognize that science and technology are powerful tools, which can either give us the universe or destroy us. We also have no idea how rare we are in the universe, as a spacefaring technological species. All the greater the tragedy if we ultimately destroy ourselves. What do we do with this recognition?

I think there are two main categories of response to the realization of the power of technology. The first is thoughtful regulation. As new technologies emerge, or are about to emerge, we need to explore their potential uses and abuses. We also need to explore mechanisms to ensure safety, responsibility, and transparency in developing and using the technology. Standards need to be developed, debated, adjusted as needed, and internationally recognized.

There are already many examples of this. Limits on gain-of-function research on infectious organisms are one. There are also regulations regarding genetic modifications in humans that would be introduced into the germ-line. Both of these represent technology that could be developed in one small lab anywhere in the world and yet impact all of humanity. Fissile material is also heavily regulated.

Regulations are also a double-edged sword (nothing is easy). They can be abused, overreach, or be grounded more in fear and ignorance than in responsible risk management. This is not a reason to throw up our hands in futility, but to think of ways to get it right.

All of this is dependent on the second, and arguably more important, type of response we need to have to the threat of technology. We – humanity – need to get our collective shit together. This, of course, can mean a lot of things, but here are some broad brushstrokes.

One of my greatest fears for future technology is what authoritarian governments will do with the technology. Are we handing tyrants and dictators the power for total and permanent control? Will this act like a ratchet – as governments vacillate over time between more and less democracy, will technology lock in authoritarian style governments once they get a toe-hold? If so, then we better figure out ways to make democracies more stable and more functional. We also would do well to have international standards of human rights that are somehow enforceable.

There is also the potential for the tyranny of corporations. Increasingly wealthy and powerful transnational corporations may eventually have more influence over our lives than governments. Think about the tech giants in the US. Think about the effect of the tobacco industry on our health, and the fossil fuel industry on the environment. We need to balance giving corporations the freedom to innovate and thrive with reining in their tendency toward greed and exploitation.

There is, obviously, no one solution. We need to work toward a system that has checks and balances, and a dynamic homeostatic stability. This is perhaps the greatest challenge we collectively face today – forging a stable global civilization in the face of rapidly progressing and destabilizing technology. Perhaps the most important sciences we have today are political science, social science, and social psychology, combined with an understanding of history, philosophy, and ethics. At the least, we need to value and advance these disciplines as much as we advance other science and technology. Without them, a dystopian future looks more and more likely.
