Çağlar Akgüngör

Risky Technologies and Regulation: Memoirs of a Former NRC Head

Confessions of a Rogue Nuclear Regulator by Gregory B. Jaczko, published by Simon and Schuster, New York, 2019

I finally found some time to finish the memoirs of Gregory Jaczko, a former Nuclear Regulatory Commission (NRC) chairman, who was probably the most controversial person to hold that office. The book is an account of Jaczko’s mandate (2009-2012), which coincided with the Fukushima “accident”. It is certainly interesting to see the complex relationship between the nuclear industry, the political elite, and the administration through the eyes of Jaczko, an academic with a background in particle physics and an almost complete outsider to political and industrial circles until his appointment to the NRC. Jaczko considers the latter “an agency overwhelmed by the industry it is supposed to regulate and a political system determined to keep it that way”[1]. He warns the public throughout his book against the risk of nuclear accidents, which he believes will eventually happen because “no one can design a safety system that works perfectly”.[2]



He affirms that following Fukushima, he “found himself moving from his role as a scientist impressed by the nuclear power to a fierce nuclear safety advocate.”[3] His opinion is that “nuclear power is a model of electricity production that dates from the early part of the twentieth century […] characterized by large power plants that produce large amounts of power and transmit it over large distances” and that “advances in technology and manufacturing have made other types of clean electricity feasible.”[4] These are undoubtedly unconventional statements coming from an ex-chairman of the nuclear regulatory authority of the country operating the largest number of commercial nuclear power plants in the world. It comes as no surprise, then, to read negative comments about Jaczko’s tenure (and his memoirs) on the Internet.

Perhaps, as Jacob Darwin Hamblin correctly put it in his review of the book, “Jaczko’s account will become standard reading as an antinuclear book”. He adds, however, that Jaczko’s “reasons have more to do with regulation than nuclear energy per se. Jaczko sees two paths ahead. One has a sustainable future with nuclear reactors that includes widespread recognition that accidents will happen and a greater commitment to safety. The other path is the one he witnessed as NRC chairman, featuring waning public trust in a secretive, uncooperative industry that regards safety regulations as unfair and cumbersome.”[5]

What was most interesting to me, however, was to read, once again, that risk is more a byproduct of perception (shaped by vision, mentality, personal experience, professional culture, etc.) than an objective reality formulated by science, and that it can be negotiated by various actors with different aims and interests. I cannot but fully agree with Jaczko when he says: “…what constitutes ‘safety’ is often determined by political, not just scientific, judgments. […] Safety is a subjective determination made by societies -or their designated representatives- about the acceptable behaviors that companies and individuals can engage in. […] Because safety, especially as it relates to public health, is often informed by medical and scientific data, we tend to believe that safety standards are also determined by science. But societal norms, traditions, customs and politics play a role in establishing those standards. Safety decisions are public policy decisions that can endeavor to balance the interests of all competing elements in a society.”[6] In my eyes, exposure to harmful materials will always remain the best example of the phenomena described above. For instance, could anyone who lived through the early 1990s forget how GSM base stations and handsets were presented as “extremely harmful” …until that technology ceased to be reserved for the wealthy and saw widespread use? Today, “microwaves” are far more present in our built environment than they have ever been before, but the risk of “having one’s brain cooked” is hardly ever mentioned vis-à-vis the benefits of continuous access to information.

Jaczko also complains about a bias I have frequently observed throughout my career: overreliance on physical infrastructure and security systems, coupled with an underestimation of the human factors in safety. I don’t know if there is an emergency manager, health and safety expert or continuity expert who hasn’t been through the following kind of dialogue, which I took from Jaczko’s book:

“The dams won’t fail.”

“But what if they do?”

“They won’t. We inspect them. They’re solidly built. They won’t fail.” [7]

Operator mistakes, missed inspection schedules, equipment service intervals stretched to cut costs, bad hierarchical decisions and other problems are a reality of the world, and none of them can be prevented by automated systems, whose maintenance and operation always depend on humans at some point; and that is without even mentioning design errors. (Oh yes, there is also a “but this disaster doesn’t count, it was beyond the design!” argument, but a closer look may show that such cases are actually economic or political choices rather than “design” issues.)

My usual reaction to the “nothing (bad) will happen, we have the (…) systems” rhetoric is to show a counterexample, but I know that sometimes even that doesn’t work, especially against the cognitive bias that pushes us to treat past data as an indicator for predicting the future. This is the bias that makes us think absurd things such as “I have had no accident on this route for 20 years, so I won’t have one today either”, and to try to reduce the uncertainty of the future by relying on our (or someone else’s) past experience. It seems we tend to forget a simple fact: every accident, every disaster is the result of a unique blend of factors. Jaczko appears to have struggled with the same posture during his time at the NRC.



The following paragraph, an excerpt from a critical review of Jaczko’s book, seems to illustrate this bias: “No perfect energy resource exists, and tradeoffs are inherent. But with nuclear power, the critique is even more absurd, like calling the Yankees a poor team because they’ve lost 7,781 games - ignoring their 10,000-plus victories, 40 pennants, and 27 World Series championships. Over the past six decades, the nuclear industry has safely operated more than 100 reactors, providing roughly 20 percent of U.S. electricity needs. There indeed have been accidents, but only one, at Three Mile Island, actually released radiation. That accident caused no deaths or radiological health problems to the public; in fact, it demonstrated that the U.S. nuclear industry could control an accident safely. Professional baseball has produced several hundred fan fatalities, yet not a single American has died from civilian nuclear power.”[8] Well, I hope that the nuclear industry’s performance remains as good as it is now, because, let’s say, 10 accident-free years out of 17 would not be an acceptable “score” by any measure…

In any case, “Confessions of a Rogue Nuclear Regulator” is a useful resource for those interested in nuclear energy policy. It also contains valuable lessons on the regulation of major risks and hazards, as well as on the three significant nuclear accidents: Three Mile Island, Chernobyl and Fukushima. Finally, there is a bonus in the annexes if you need a short popular-science explanation of how electricity is produced with nuclear power.


[1] JACZKO, Gregory B., Confessions of a Rogue Nuclear Regulator, Simon and Schuster, New York, 2019, p. viii.
[2] Ibid., p. 20.
[3] Ibid., p. 22.
[4] Ibid., p. 166.
[5] https://blogs.sciencemag.org/books/2019/01/15/confessions-of-a-rogue-nuclear-regulator/
[6] Jaczko, pp. 59-60.
[7] Ibid., p. 113.
[8] https://www.nationalreview.com/2019/02/gregory-jaczko-book-confessions-of-a-rogue-nuclear-regulator-anti-nuclear-bias/
