Martin Rees, the Astronomer Royal of Great Britain (1995-present) and President of the Royal Society (2005-10), has long been a particular hero of mine. Unfortunately I have never met him, though he is a friend of many of my friends among British mathematicians, whom I got to know while on sabbatical at the Maths Institute at the University of Warwick.

This week Martin Rees has an editorial in Science on the subject of risk. Here are three snippets:

"Those fortunate enough to live in the developed world fret too much about minor hazards of everyday life... The main threats to sustained human existence now come from people, not from nature. Ecological shocks that irreversibly degrade the biosphere could be triggered by the unsustainable demands of a growing world population. Fast-spreading pandemics would cause havoc in the megacities of the developing world... Equally worrying are the imponderable downsides of powerful new cyber-, bio-, and nanotechnologies. Indeed, we're entering an era when a few individuals could, via error or terror, trigger societal breakdown..."

"Our interconnected world depends on elaborate networks: electric power grids, air traffic control, international finance, just-in-time delivery, and so forth. Unless these are highly resilient, their manifest benefits could be outweighed by catastrophic (albeit rare) breakdowns cascading through the system..."

"Some would dismiss such concerns as an exaggerated jeremiad: After all, societies have survived for millennia, despite storms, earthquakes, and pestilence. But these human-induced threats are different -- they are newly emergent, so we have a limited time base for exposure to them and cannot be so sanguine that we would survive them for long, or that governments could cope if disaster strikes... It is hard to quantify the potential 'existential' threats from (for instance) bio- or cybertechnology, from artificial intelligence, or from runaway climatic catastrophes. But we should at least start figuring out what can be left in the sci-fi bin (for now) and what has moved beyond the imaginary."

Amen to that. I truly believe that our survival as a sapient life-form depends upon improving our ability to evaluate and adapt to exotic and previously unknown risks. (Yes, I am intentionally avoiding the word "species" -- I seriously doubt that the species Homo sapiens will survive the coming storms of genetic engineering intact.)

All the risks mentioned by Rees are ones that have been discussed repeatedly in the Boulder Futures Group. Based on those discussions, I would rate them very roughly as follows:

- Growing population: rapidly diminishing as a threat.
- Fast-spreading pandemic: small but nasty threat.
- Nanotech infection: still too difficult to evaluate.
- Regional nuclear war: high probability, but not TEOTWAWKI.
- Catastrophic grid failure: small and diminishing.
- Uncontrolled synthetic biology: not a significant threat.
- Peak oil: not now a threat, and never was a threat.
- Asteroid impact: near certainty over the very long term.
- Climate change: near certainty over the next century.

LC