I've been thinking a lot about risk assessment and management since my recent public conversation with Denise Caruso of the Hybrid Vigor Institute. We were talking about her latest book, Intervention: Confronting the Real Risks of Genetic Engineering and Life on a Biotech Planet, a clear-headed look at the importance of risk assessment in the biotech industry, where she found a troubling complacency about the potential for things to go Terribly Wrong when genetically engineered organisms are introduced into the wild.
Who's to tell large companies with big investments in the development of transgenics that they should defer ROI while someone presumably knowledgeable weighs the risks? And who's to say what constitutes an unacceptable risk? Who can tell them no, you can't market this product? With dollars on the table, biotech companies and government agencies are under pressure to approve and release.
People like Denise Caruso are working hard to elucidate the issues related to biotechnology and nanotechnology. Denise's work, it seems to me, is all about overcoming complacency in others.
Complacency about risk is often associated with an unwillingness to spend money and effort. Consider a recent disaster that was exacerbated by poor risk assessment and general complacency:
In his opinion piece "They Saw It Coming," for The New York Times, Mark Fischetti noted major initiatives proposed within the Coast 2050 plan that might have prevented the flood following Katrina. He describes the difficulty in coordinating a coherent effort and getting it funded:
The debate over New Orleans's vulnerability to hurricanes has raged for a century. By the late 1990's, scientists at Louisiana State University and the University of New Orleans had perfected computer models showing exactly how a sea surge would overwhelm the levee system, and had recommended a set of solutions. The Army Corps of Engineers, which built the levees, had proposed different projects.
Yet some scientists reflexively disregarded practical considerations pointed out by the Army engineers; more often, the engineers scoffed at scientific studies indicating that the basic facts of geology and hydrology meant that significant design changes were needed. Meanwhile, local politicians lobbied Congress for financing for myriad special interest groups, from oil companies to oyster farmers. Congress did not hear a unified voice, making it easier to turn a deaf ear.
Fed up with the splintered efforts, Len Bahr, then the head of the Louisiana Governor's Office of Coastal Activities, somehow dragged all the parties to one table in 1998 and got them to agree on a coordinated solution: Coast 2050. Completing every recommended project over a decade or more would have cost an estimated $14 billion, so Louisiana turned to the federal government. While this may seem an astronomical sum, it isn't in terms of large public works; in 2000 Congress began a $7 billion engineering program to refresh the dying Florida Everglades. But Congress had other priorities, Louisiana politicians had other priorities, and the magic moment of consensus was lost.
Disaster focuses our attention on risk and ends our complacency, but for how long? I've been thinking about how we might become more proactive about very real risks of global catastrophe.
There are two kinds of catastrophes: those caused by humans (think the Chernobyl disaster, or the 9/11 attacks) and natural disasters, like tsunamis. It's not always easy to tell which is which: while arson was allegedly the cause of the recent fires in California, they were exacerbated by natural forces like the Santa Ana winds, as well as drought conditions -- though the severity of those might be attributable to climate change, another combination of human cause and "natural" effect. Complacency about global warming seems to be waning: there is broad acceptance of the argument that it's caused by human actions and can be mitigated by reducing our greenhouse gas emissions. But people (sometimes in positions of great power and influence) who realize that this reduction will disrupt global economies, and not in a good way, are more inclined to be complacent -- or at least to espouse complacency -- and to argue for a "business as usual" approach for now... until we run out of oil. (I recently argued with the head of a foundation that recovering every last drop of oil would be detrimental to the environment; he argued back, saying that nothing we do can harm the earth. Complacency.)
Where natural disasters are concerned, even when they're predictable, we seem to think and talk a lot more than we act. Professionals do consider the potential consequences of asteroid strikes and supervolcano eruptions, but it's not clear that we're prepared for either event, or even know what preventive measures might work. The massive forces that created the Southeast Asian tsunami, or that might cause the Yellowstone supervolcano to blow, are almost certainly beyond our control, so prevention is probably out of the question. All the more reason to understand as closely as possible what we would be dealing with, and to devote time, energy, and money to preparation. Complacency here may be the enemy of survival.
I'm not sure we can ever completely solve the problem of complacency, but I've been thinking through a concept that might help us tackle Very Large Problems, like planning a response to an asteroid strike: Complacency Management. The core idea of Complacency Management is this: if we can't end complacency completely, why not at least create the intellectual and emotional space to periodically leave it behind?
It's easier to create a sense of urgency about a Very Large Problem once or twice a month, maybe every week, by committing some time to devising and planning solutions, than to try to keep it in mind every waking minute. While it's not comfortable to focus on worst-case scenarios, doing it in a managed and defined way could lead to insights that in turn spark real contributions to understanding a potential disaster and planning whatever response makes sense.
It would be even better if we undertook Complacency Management at a regular time and place with others -- that is, if we used it as an opportunity for collaboration.
To that end, I'm thinking about how to form online communities for Complacency Management. Imagine a community that starts with a few people hashing out the potential problem of asteroid strikes, and evolves into a body of knowledge and a network infrastructure that feeds into NASA and FEMA-type volunteer networks worldwide.
Having proposed the "what," I ask you for the "how." How could we most effectively network a bunch of people who are volunteering to think about worst-case global scenarios?
Complacency Management - I love that term. It accurately describes what I spend much of my time doing. I'm a security adviser for nongovernmental organizations. Maybe I'll ask for a change of title.
Love your ideas!