By the Skin of Our Teeth
There are a handful of factors that might put our kind at risk in short order. If by the skin of our teeth we make it through the hazards just ahead, though, the chances of our going on to be a truly persistent intelligence in this little sector of space could go way up.
Nature has a number of means of disposing of us. Large asteroids could strike this planet and destroy much of the biosphere's fauna and flora, and massive volcanic eruptions have at times vastly altered life on Earth. The volcanism beneath Yellowstone is a case in point. At irregular intervals, this geological hot spot has produced huge caldera-forming eruptions. The last was about 640,000 years ago; two more such blasts occurred in the million or so years before that. These violent upheavals can spew ash, magma, and crustal material over much of the North American continent, block sunlight, lower global temperatures, and radically alter the geology across millions of square miles. Nor are the Yellowstone calderas the only volcanic threat of existential catastrophe. Eruptions on the scale of Toba, which is thought to have nearly wiped out our kind the last time it occurred, happen about once every 50,000 years.
Other challenges to human existence include abrupt climate transitions (which can radically lower or raise temperatures over large regions in as little as a decade), gamma ray bursts, pandemics, and mega-tsunamis. All of these are thought to have brought about the demise of earlier varieties of life in the millions or billions. With the possible exception of pandemics, none appears to be an imminent threat within the lifetimes of people alive today. Experts who have speculated on the chances of natural phenomena ending our species in the next hundred years put the odds at less than one-tenth of 1%.
A comprehensive inventory of natural, if unlikely, catastrophes that might one day befall our distant descendants should include the possibility of Mercury's orbit becoming unstable under Jupiter's ongoing gravitational influence. Just as an unmoored planet crashed into the early Earth and formed the Moon, a destabilized Mercury could, in one of its wobbles, cross this planet's path, with Earth-shattering consequences.
The nearer-term risks of man-made threats, of course, are more significant. With the exception thus far of the hydrogen bomb, virtually every weapon ever developed has been used in warfare. World War II was the first atomic war. Given the enormous destructive power now available and its probable climate effects (nuclear winter), the next one could be the final conflict of which our kind is capable.
Besides explosives of horrendous power and radiation, the past several decades have also seen the emergence of bioengineering, including tinkering with the DNA of various life forms, from foodstuffs to microbes. It no longer requires great sophistication or expense to create super-bugs: for instance, pathogens engineered to combine the virulence of bacteria and viruses in a single organism resistant to any known medicines. Given the attraction of nihilist philosophies and radical religions even for highly intelligent individuals, it seems only a matter of time before highly toxic, infectious, resistant, synthetic plagues are created to which humans have no natural immunity. Just as with nuclear warfare, but with far greater ease, people of ill will can now achieve heretofore unparalleled levels of devastation, potentially menacing nearly all of us. The means of mayhem soon to be at the disposal of extremists, whether chemical weapons, biological agents, dirty bombs, or full-fledged nuclear devices, could render prior efforts by alienated individuals or groups trifling by comparison.
Some believe we will be done in by nanotechnologies or by robots of superior artificial intelligence, as if a rogue being of the "HAL," "Data," "Terminator," or "Matrix" sort would or could enslave us, or worse. While I personally think this a far-fetched concern, people smarter than I, Stephen Hawking and Bill Gates among them, believe it is a risk worth considering and protecting ourselves against. Unlike the AI envisioned by Isaac Asimov, whose three laws protected humans from nearly all harm at the hands of intelligent robots, advances in AI today are readily put at the disposal of the military, so that robotic drones already kill at one more remove from direct human intervention. And today's drones are primitive relative to what AI may be like in the next several decades.
Global warming is often not well understood by various pundits, but the temperature of our environment will almost inevitably keep increasing once the oceans become saturated with carbon and rising levels of CO2 and methane trap more of the sun's heat. The outcomes will adversely affect our weather, sea levels in coastal areas, agriculture, the frequency and intensity of fires, health, ice sheets, energy use, military threat levels, potable water reserves, and economies. Some have suggested that if the negative trends are not greatly slowed by the mid-twenty-first century, runaway climate change could result. The exact year when this might happen matters less than the growing risk of an environment we can no longer effectively adapt to or manage.
Since humans became the dominant vertebrate, thousands of species have already gone extinct in this "Sixth Extinction" period. Are we vulnerable as well?
Our blockbuster movies often have themes of alien invasion. This is one threat that seems so remote as to be strictly sci-fi. The distances between stars and galaxies and the time required to travel across them are almost unimaginably great compared with anything with which we are familiar, even given the fastest known current space technologies.
Why would an intelligent being bother expending the vast amounts of effort and energy required to come here and do us harm? A destructive intelligent alien species, if it persisted for very long, would likely find its aggressive impulses expressed in ever more creative means of exterminating its own kind, long before taking an interest in folks on distant worlds. In other words, any beings smart enough, and around long enough, to work out a practical means of reaching Earth would likely by then have matured to a point where unprovoked aggression held no interest for them.
There is another category of threat, however, that cannot be easily discounted. A hundred years ago, there were no nuclear bombs, artificially created viruses or bacteria, nanotechnologies, artificial intelligences, or known risks of global warming. It seems probable that in the coming century there will be new means to advance our kind, but also more as yet unimagined methods to put at risk the survival of our own and other species.
What are the overall risks of human extinction? Who knows? In my view, when all the above factors are taken into account, they are probably greater than 1% but less than fifty-fifty over the next hundred years. For any given year they are no doubt lower; it has been suggested, for instance, that the risk of nuclear weapons next being used is only about 1% in any 12-month period.
Statistics tell us that a 100-year flood has only a 1% risk of occurring in any particular place and year. Nonetheless, the overall risk of one or more 100-year floods occurring in that place in a century is greater than 60%.
A low probability of a catastrophic event occurring in any one year, decade, or century can still mean a substantial chance that it will occur eventually.
Oxford think-tank experts have put the overall risk of human extinction at 19% in the next 100 years. The odds of our destroying civilization as we know it for the foreseeable future, short of extinction, are judged higher still. No doubt others will assess the odds differently, some optimistically, some pessimistically. As we have progressed, the chances that we as an entire species shall succumb to natural adversity have significantly diminished, while those of our doing ourselves great harm by one circumstance or another have grown.
If to lose this lottery means either wiping ourselves out entirely or setting human quality of life back by centuries, then even at 10% or less these are hardly acceptable odds. How, or whether, to respond, though, we must each determine for ourselves.