Denise Caruso holds a somewhat legendary status among tech journalists. A columnist for the NY Times (her old Information Industries column was a must-read for years, while her new column, Re:Framing, kicked off with a bang with a piece titled Someone (Other Than You) May Own Your Genes) and founder of the Hybrid Vigor Institute (an NGO dedicated to facilitating interdisciplinary and collaborative approaches to scientific problem solving), it's not going too far to say that Caruso's work has helped shape our society's thinking about the future of science.
That future may be riskier than we like to think. In her new book, Intervention: Confronting the Real Risks of Genetic Engineering and Life on a Biotech Planet, Caruso lays out in chilling detail exactly why even (perhaps especially) those of us who are strong supporters of science and innovation ought to be extremely concerned about the unintended consequences of contemporary biotechnological industrial research.
We normally don't cover problems here on Worldchanging. Indeed, our manifesto says "We don't generally offer links to resources which are about problems and not solutions, unless the resource is so insightful that its very existence is a step towards a solution." This book does offer some solutions (about which, more later), but mostly it offers a fervent, well-reasoned call to action. When such an "alarm bell" book offers such clear thinking (I learned more about biotechnology from this book than any other I've read), it becomes a step towards solutions. And when the person ringing the alarm bell is no luddite, but one of our brightest technology writers, the alarm demands our attention.
The problem, Caruso says, is that the release of transgenic organisms presents the risk of new kinds of unintended catastrophes, ones which could "create stewardship challenges for generations into the future that are already far beyond our present scientific knowledge or capabilities."
[W]hat we know from history is that every promise based on discovery or invention, no matter how positive, comes factory-equipped with its own unintended dark-side consequences. ... It is not especially difficult to come up with scenarios whereby mucking around in the genes of living organisms leads to serious biological, social and/or economic disruption. Yet neither knowledge of history nor dark-side scenarios have tempered the zeal or the speed with which the products of genetic engineering are being dispatched into the global marketplace.
Caruso then explores a number of cases in which scientists themselves have done a lousy job of risk assessment, and in which industrial regulatory capture has prevented further exploration of known risks, including the health effects of common plastics, the overuse of antibiotics, and the introduction of invasive species, whether intentional or through "escapes."
One of the root problems, Caruso explains, is that we simply don't know as much as we'd like to believe about the genetic mechanics of life. The ability to sequence and manipulate DNA is a powerful (and useful) technique; but there are other aspects of heredity which are less well understood and which are almost impossible to predict or control outside of a laboratory setting. Heredity can be influenced by gene flow between species, horizontal gene transfers, mutations and threshold effects, the effects of environment on the expression of genes, the complex interactions between DNA and proteins, and a variety of other factors. In aggregate, these factors render our understanding of heredity so incomplete that relying on that understanding to assess risk is extremely dangerous, especially when the results of failure may reproduce or recombine to become genetic pollution. (I have heard this argument made before by some in the field, but she lays it out with great clarity.)
In the face of such uncertainty, scientists and business people (and increasingly the two are inseparable in biotechnology -- the conflict between entrepreneurial self-interest and scientific integrity is one I think is too little explored in our society) have too often presumed that if something can't be demonstrated to be dangerous, it must be safe. Caruso quotes Roger Brent, saying, "Unless you can show me the mechanism for risk, it doesn't exist." To which Caruso adds, "[R]isk isn't about what scientists know. It's about what they don't know. Risk is about uncertainty. And uncertainty is not what scientists do."
As one illustration, Caruso points to the first patent awarded on a living creature, Ananda Chakrabarty's altered Pseudomonas bacterium designed to eat oil spills. Though the patent was granted, the bacterium was never used, because, Chakrabarty said, "The bacteria itself is non-toxic, but once in the open environment it can combine with pathogenic elements and show undesirable results." (To which Caruso responds "Let us keep in mind that the 'open environment' under discussion is ocean water, which covers 70 percent of the planet. Yes, that might present a problem.") In this particular case, disaster was foreseen and averted, but in a future episode we may be less lucky.
We can recognize the clear benefits of biotechnological research -- and Caruso does -- without accepting the risks imposed by poor decision making about when it is safe to release transgenic plants and animals into the world's ecosystems, and, inevitably, our own lives and bodies. We can acknowledge that biotechnology has brought humanity incredible breakthroughs in pharmaceuticals and green chemistry -- even that biotechnology offers incredible opportunities for reasonably-safe-yet-rapid agricultural innovation through smart breeding (which selects for certain traits within a species without using any transgenic materials) -- and still demand that a reasonable precautionary principle be applied to actions (like releasing transgenic organisms into the wild where they might run feral beyond bioconfinement) which cannot be undone. We can accept genetic insight as a useful tool without granting its inventors the right of unregulated transgenesis.
The only means of controlling this "transgenic free-for-all," Caruso argues, is a set of better and stronger national and international regulations based on a new model of cost-benefit analysis.
The old model isn't working, because of both scientific blinders and corporate manipulation. "No matter what industry you're in, if you've got the nerve and the know-how, gaming an official cost-benefit analysis can be irresistible...because cost-benefit analysis in the real world is about power. Those who control what goes into the analysis also control what comes out of it." Revolving-door policies, for-profit university research and punitive litigation (i.e., suing people who say unfavorable things) have all made the balance of power in these investigations even worse.
But it's not working for other reasons as well, which have to do with the fact that we don't know all that much about the world yet, really. To make matters more difficult, we have only select and imperfect measurements of those aspects of the world we do sort of understand, and we tend to misuse even those. (Put another way, as What the Numbers Say has it, "There is always more than one way to measure something; measurements are error-prone; even when correct, measurements are still only an approximation for what you really want to know; measurements change behavior." This last point is particularly worth noting, Caruso says, as it explains why people get stuck in patterns of reliance on unreliable data.)
But we could do better. As Caruso tells it, the seminal National Academies study Understanding Risk "stated flat-out that the results of math-based analytical approaches to risk and innovation are no longer acceptable on their own. Risk assessment is too subjective to be calculated, it said. It is a political, ethical and value-laden activity, period, and it needs to be conducted with the full participation of everyone who stands to be affected by the decision." In other words, risk is political, and ceding control over discussions of risk to scientists is not only profoundly undemocratic, it is intellectually bankrupt.
Instead we need an "analytic deliberative process," one which seeks out uncertainty, evokes foresight and speculation, and attempts to incorporate in its deliberations not just accuracy, but wisdom. Caruso thinks this can be achieved through a process of collaborative risk assessment, exercised with transparency. (Her description of the application of such an open, collaborative approach to judging the riskiness of transplanting pig organs into people -- "xenotransplantation" -- resists easy summary, but is itself worth the book's cover price.) She also says that we need to restore independence and credibility to the regulatory process by resurrecting the U.S. Office of Technology Assessment (OTA -- which got budgeted out of existence by Republicans in 1995) or something like it: a trusted overseer, which could "cast a fresh eye on a regulatory regime for biotechnology." Finally, we need to be willing to demand that those who work on emerging technologies in general be held to clearer and higher standards of usefulness and responsibility. (One researcher, Mary O'Brien, proposed that biotechnology be guided by the sharp question, "What is the least hazard that is necessary to solve the problem?")
While it sounds the softest of the three answers, I think that it may be the most important. We all tend to rise to the expectations that others place on us, and we place on ourselves. If those of us who are actually developing the astonishing new technologies which unfold around us daily can raise the bar of responsibility, we will, I believe, see not only fewer risks of catastrophic mistakes, but greater real benefits to humanity. We need better process and strong regulations, but better still would be those two things combined with a new vision of responsible progress.
In summary, if the biological future we are engineering concerns you, read Intervention. It's not often that a book fundamentally changes the way I look at an important field. Those who value the scientific project will find here a reasoned voice for integrity and caution; those who fear the repercussions of altering living beings will find here a tool for measuring the degrees of gray involved and making more informed decisions.
Great review of a great book.
Now, to throw some things out for readers of this:

Part of the issue is that the philosophical foundation of risk assessment in the US is that it is supposed to be "science based." Which puts scientists in positions where they are supposed to say that something is safe. Usually, safe or not, but sometimes the marching orders are "safe enough." Then the scientists on regulatory bodies go forth and try to figure out "safe enough," and then the politicos have some cover.

The extreme discomfort of being asked to determine "safe enough" has as a consequence a natural human desire to bound the problem very narrowly.

A risk assessment of automobiles conducted in 1910 might well have dealt with automobile accidents. It might not have thought of teenage pregnancy, as children were conceived in automobile back seats, which was recognized as a problem in the 1920s. Should it have?

A risk assessment of agriculture conducted in 6000 BCE might have taken into account population increase. It might have taken into account the probability of evolving much more deadly epidemic diseases. It might not have taken into account the idea that, in Eurasia and Mesoamerica, there appear to be tracks in the human genome suggesting that little children who had cell surface proteins that were going to give them problems digesting barley and wheat (Eurasia) or corn/maize (Mesoamerica) died off -- and that we are the descendants of the children who didn't. Should that risk assessment have spotted that?

The point is not to assert that risk assessment is bad or not worth doing. The question is how to bound problems and conduct the assessments so that they bring in appropriate elements and foresee "enough." Given the inadequacies of the current system, so nicely illustrated by Denise Caruso, how should we as a species do it better? Caruso's book has good ideas on this, but this is, to put it mildly, not a completely solved problem, and the problem, to put it mildly, merits a larger conversation.
While I agree with a lot of what Denise says, her views are a little too alarmist. My biggest fear is that people, especially those in policy-making positions, are likely to think that biotech = transgenics = risky. A large part of the effort today is to understand the details of biological machinery and mechanisms. I am sure there are people who are in a rush to bring products out to market, often failing to understand the potential consequences, but the level of systems-level understanding of organisms, while far from complete, is significantly better than it has ever been. I actually have more faith in scientists to police themselves than in too much regulatory oversight. Innovation is being killed for so many other reasons that additional oversight could really hinder some really good science.
Thanks to Roger Brent for posting here -- Roger is the raison d'être for the book -- the very well-regarded genomic biologist I met who prompted me to start thinking about risk in a new way. I agree totally -- the point of the book was not to seal the deal on the issue, but to try to start a conversation.
And also thanks to Deepak for the comment. So funny to hear this book called alarmist. I don't feel like it's alarmist at all! Except from the standpoint of how badly we assess technological risk, and how little scientists working in this area understand risk.
As Roger points out, and as I point out in the book as well, scientists have a difficult time policing themselves when it comes to risk. They traffic in evidence, not in uncertainty.
Also, I'm afraid I haven't seen much evidence of scientists who are connected to commercializing a discovery being willing to police themselves on these issues.
As for knowing more than ever: absolutely true. But interestingly, what we have learned about biology from a systems level is NOT reflected in the regulations governing the risks of transgenic organisms. The regulations are fully reductionist -- Central Dogmatic*, really, and in my opinion, disingenuously so.
It was exceedingly rare in my experience over the 3.5 years writing this book that I found a molecular biologist who actually considered a systems-level understanding of an organism to include gene-environment interactions, or effects on the ecosystem.
Thanks again for writing - Denise
* The Central Dogma of molecular biology, believed to be true for a brief time after Watson & Crick discovered the structure of DNA, states that one gene yields one protein, or one trait. This was the reason everyone jumped on the "new industry" bandwagon -- it was a very mechanistic and, as it turns out, untrue view of how biology works. Genes work in networks, and are also heavily influenced by environment.
Seems to a degree like Denise is calling for the real-world global equivalent of Doomwatch, an old UK television show in which a government group nips potentially dangerous technology in the bud. Though that strays perilously close to an overpowered counterterrorism squad. Then again, if the fear seems justified, that may appear to be the rational response.
As a kid in the '70s, I remember the shores of Lake Michigan being smothered by piles of dead fish covered in maggots. These were alewives killed by exotic lampreys.
There was (and is) a crazy cascade of invasive species that were intentionally introduced to solve the problem, each introduction of a foreign species creating a new ecological issue that permanently altered the ecology of the Great Lakes.
So the prospect of having genetically modified organisms and micro-organisms invade my personal ecology fills me with dread. Even if measures of accountability are imposed, once the damage is done, there is no turning back.
I won't disagree with you on current regulation being very reductionist in nature. The problem is that I don't trust paper pushers to ever have the vision to properly regulate anything this complicated.