Home/ Open Intelligence / Energy/ Group items tagged report

D'coda Dcoda

WNA reports world market post Fukushima [15Sep11] - 0 views

  • World nuclear fuel markets are likely to suffer very little noticeable impact from events at Fukushima, according to the latest edition of the World Nuclear Association (WNA) biennial supply and demand report.
  • According to the WNA report, Global Nuclear Fuel Market: Supply and Demand 2011-2030, it is still too early to assess the full impact of the Fukushima accident on the world nuclear fuel market. However, it is possible to make some "reasonable deductions," the report notes. Despite the permanent closure of reactors in Japan and Germany and slowdowns in some programs in response to Fukushima, the WNA report notes that the global situation for energy supply and demand remains effectively unchanged. Developments in the USA, China, India and Russia will remain particularly crucial in determining nuclear's overall role in global electricity supply, while prospects for nuclear new build remain strong in China, India, South Korea and the UK.
  • Revised scenarios for nuclear generating capacity in individual countries and areas have been incorporated into the report, feeding into three scenarios for world nuclear capacity up to 2030. The scenarios are based on differing underlying economic and political trends but are all described as "plausible".
  • The reference scenario, which includes an assumption that most countries will continue with their pre-Fukushima nuclear plans, sees nuclear capacity growing at an average 2.3% per year from its current 364 GWe to 411 GWe by 2015, reaching 471 GWe by 2020 and 614 GWe by 2030. The upper scenario sees 416 GWe by 2015, 518 GWe by 2020 and 790 GWe by 2030. Under both scenarios, the figures are slightly lower than in the previous edition of the report published in 2009.
  • More mines needed: The model used by the WNA to forecast reactor requirements has also been updated with reassessments of factors affecting nuclear fuel demand. Presenting the report at the WNA Symposium today, EDF procurement manager Anne Chauvin explained that capacity factors, enrichment levels and burnups are all trending upwards, and uranium requirements are affected accordingly. World uranium requirements, currently estimated at 63,800 tU, are expected to grow at a similar rate to nuclear capacity under the reference scenario, reaching 107,600 tU in 2030. This is little changed from the equivalent scenario in the previous report, although total requirements under the upper scenario, reaching 136,900 tU per year by the end of the period, are significantly down on the previous forecast.
  • Known uranium resources are more than adequate to satisfy reactor requirements to 2030 and beyond, the report notes, but the role of secondary uranium supplies (inventories, stockpile drawdowns and recycled materials) will remain important over the period. Nevertheless, beyond 2020, there is likely to be a substantial need for new primary sources of uranium supply, the report concludes, likening the magnitude of the build-up needed to meet both the reference and upper demand scenarios to the huge historic production expansions of the 1950s and the late 1970s.
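A quick consistency check on the scenario figures above: the average annual growth rate implied by the reported capacity milestones can be computed directly. A minimal sketch (the capacities and years are the report's; the `cagr` helper is ours):

```python
# Implied compound annual growth rate (CAGR) between two capacity milestones.
def cagr(start_gwe, end_gwe, years):
    return (end_gwe / start_gwe) ** (1.0 / years) - 1.0

# WNA reference scenario: 364 GWe in 2011 growing to 614 GWe by 2030.
rate = cagr(364, 614, 2030 - 2011)
print(f"Implied average growth: {rate:.1%} per year")  # about 2.8% per year
```

The 2011-to-2030 milestones imply roughly 2.8% per year over 19 years (about 2.6% if 20 years are assumed), in the same range as the 2.3% average quoted in the text.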
D'coda Dcoda

Short-Termism and Energy Revolutions [30Sep11] - 0 views

  • The calls these days for a technological “energy revolution” are widespread. But how do you spark breakthroughs when the natural bias of businesses, investors and governments is toward the here and now? In governance, politics creates a bias toward the short term. This is why bridges sometimes fall down for lack of maintenance. That’s also why it’s so hard to sustain public investment in the research and intellectual infrastructure required to make progress on the frontiers of chemistry, biology and physics, even though it is this kind of work that could produce leaps in how we harvest, harness, store and move energy. (This is why I asked, “Are Chemists and Engineers on the Green Jobs List?” back in 2008.)
  • To get the idea, you only have to look at the sputtering state of President Obama’s mostly unfunded innovation hubs, or look once again at the energy sliver in the graph showing America’s half-century history of public investment in basic scientific research. (There’s not much difference in research patterns in most other industrialized countries.) You can also look at the first Quadrennial Technology Review produced by the Department of Energy (summarized by Climate Progress earlier this week). The review was conducted after the President’s Council of Advisers on Science and Technology wisely recommended regular reviews of this sort as part of its prescription for accelerating change in energy technologies.
  • This excerpt from the new review articulates the tension pretty transparently for a government report: There is a tension between supporting work that industry doesn't (which biases the department's portfolio toward the long term) and the urgency of the nation's energy challenges. The appropriate balance requires the department to focus on accelerating innovation relevant to today's energy technologies, since such evolutionary advances are more likely to have near- to mid-term impact on the nation's challenges. We found that too much effort in the department is devoted to research on technologies that are multiple generations away from practical use at the expense of analyses, modeling and simulation, or other highly relevant fundamental engineering research activities that could influence the private sector in the nearer term.
  • In finding that balance, I’m not sure it’s possible to overcome the political pressures tugging agencies and officials to stress refinement and deployment of known and maturing technologies (even though that’s where industry and private investors are most focused).
  • On the left, the pressure is for resources to deploy today’s “green” technology. On the right, as illustrated in a Heritage Foundation report on ways to cut President Obama’s budget for the Energy Department, the philosophy seems to be to discourage all government spending on basic inquiry related to energy.
  • According to Heritage, science “in service of a critical national interest that is not being met by the private sector” is fine if that interest is national defense, but not fine if it’s finding secure and sustainable (environmentally and economically) sources of energy.
  • I solicited reactions to the Energy Department review from a variety of technology and innovation analysts. The first to weigh in are Daniel M. Kammen, an energy technology researcher at the University of California, Berkeley, who is on leave working for the World Bank, and Robert D. Atkinson, the founder and president of the Information Technology and Innovation Foundation. Here's Kammen: The idea of a regular review and status report on both energy innovation and deployment spending is a good one. Some of the findings in the QTR review are useful, although little is new. Overall, though, this is a useful exercise, and one that should be a requirement for any major programmatic effort.
  • The real need in the R&D sector is continuity and matching an increasing portfolio of strategic research with market expansion. My former student and colleague Greg Nemet has written consistently on this:
  - U.S. energy research and development: Declining investment, increasing need, and the feasibility of expansion
  - Reversing the Incredible Shrinking Energy R&D Budget
  • Perhaps the biggest worry in this report, however, is the missing logic and value of a 'shift to near-term priorities in energy efficiency and in electric vehicles.' This may be a useful deployment of some resources, but a range of questions is simply never addressed. Among the questions that need firmer answers are:
  • Following the record levels of funding made available to the energy industry through the [stimulus package of spending], what are the clearly identified market failures in this area that added funding will solve? Funding is always welcome, but energy efficiency in particular can be strongly driven by regulation and standards, and because good energy efficiency innovations have such rapid payback times, would regulatory approaches, or state-federal partnerships in regulation and incentives, not accomplish a great deal of what can be done in this area? Congressman Holt raises a number of key questions on related issues, while pointing to some very hopeful experiences, notably in the Apollo program, in his 16 September editorial in Science.
  • Given the state-by-state laboratories we already have of differing approaches to energy efficiency, the logic of spending in this area remains to be proven (as much as we all rightly love, value and benefit from energy efficiency).
  • Near-term electric vehicle deployment. A similar story could be told here. As the director of the University of California at Berkeley's Transportation Sustainability Research Center (http://tsrc.berkeley.edu) I am a huge believer in electric vehicles [EVs]. However, the review does not make clear what advances in this area are already supported through [the Advanced Research Projects Agency for Energy], and what areas of near-term research are better driven not by funding but through regulation, such as low-carbon fuel standards, R&D tax credits, or 'feebates' that transfer funds from individuals who purchase inefficient vehicles to those who purchase efficient ones. As with energy efficiency, we already have an important set of state-by-state experiments that have been in place for some time, and these warrant an assessment of how much innovation they have driven, and which ones do and do not have an application in scale-up at the federal level.
  • Finally, the electric vehicle landscape is already very rich in terms of plans for deployment by automakers. What are the barriers five-plus years out that the companies see research-versus-deployment and market-expansion support as the most effective way to drive change in the industry? Where will this focus put the U.S. industry relative to China?
  • There are some very curious omissions from the report, such as more detail on the need to both generate and report on jobs created in this sector — a political ‘must’ these days (see, e.g., the “green jobs” review by the Renewable and Appropriate Energy Laboratory at Berkeley) — and straightforward comparisons in the way of ‘report cards’ on how the US is stacking up relative to other key players (e.g. China, Germany…).
  • Here's Robert Atkinson: If DOE is shifting toward a more short-term focus, this is quite disturbing. It would mean that DOE has given up on addressing the challenge of climate change and instead is just focused on the near-term goals of reducing oil imports and modestly slowing the expansion of coal-fired power plants. If DOE thinks it is still focused on climate change, do they think they are fighting "American warming"?
  • If so, cutting the growth of our emissions makes sense. But it's global warming, and solving it means supporting the development of scalable, cheap, low- or no-carbon energy so that every country, rich and poor, will have an economic incentive to transition to cheap energy. Increasing building efficiency, modernizing the electric grid, alternative hydrocarbon fuels, and increasing vehicle efficiency do virtually nothing to meet this goal. They are "American warming" solutions.
  • This is also troubling because (as you point out) who else is going to invest in long-term, more fundamental, high-risk, breakthrough research than the U.S. government? It certainly won't be VCs. And it won't be the Chinese, who are principally interested in cutting their energy imports and exporting current-generation clean energy, not developing technology to save the planet. Of course, all the folks out there who have been pushing the mistaken view that we have all the clean technologies we need will hail this as the right direction. But it's doing what the rest of the market has been doing in recent years: shifting from high-risk, long-term research to short-term, low-risk work. If the federal government is doing this it is troubling to say the least.
  • For those seeking more, here are the slides used by Steven Koonin, the physicist and former BP scientist who is now under secretary for science at the department, in presenting the review earlier this week:
  • Rolling Out the Quadrennial Technology Review Report
D'coda Dcoda

Fukushima highly radiated United States water, food cover-up by feds continues [14Jul11] - 0 views

  • UPDATE: July 13, 2011, 11:11pm: The peaches reported on July 12 were bought at "a local market," not Santa Monica Market, according to Environews on Wednesday. An investigation into the source of the peaches is underway.
  • Right to health denied when United States government hides high levels of Fukushima radiation: In KING 5 TV's report Tuesday on high levels of radiation detected in Northwest rainwater, the United States government is accused of continuing to fail to tell the public about dangerous Fukushima radiation blanketing parts of the United States, a cover-up that led grassroots projects and independent reporters to gather and present data for public well-being. The University of California Nuclear Engineering Department Forum began asking on Tuesday for people in the Los Angeles area to come forward with any dangerous radiation readings detected after local peaches were found to be highly radioactive. "Our government said no health levels, no health levels were exceeded, when in fact, the rain water in the Northwest is reaching levels 130 times the drinking water standards," said Gerry Pollet of the non-governmental watchdog Heart of America Northwest.
  • The call from the University of California Nuclear Engineering Department Forum for public radiation readings in Los Angeles came after a report on Friday, July 8, 2011 that two peaches from the popular Santa Monica local market were confirmed to have sustained radiation levels of 81 CPM or greater.
  • "The market's background radiation was said to be about 39 CPMs. The two peaches, thus, had significantly high radiation contamination equaling over two times site background levels," stated EnviroReporter, the independent news source created by Michael Collins and Denise Anne Duffield in May 2006, featuring the work of Collins, a multi-award-winning investigative journalist who specializes in environmental issues and has served six years as a Director of the Los Angeles Press Club and five years as its Judging Chair.
  • "What makes this discovery especially significant is that the 2X background radioactivity detected in these peaches was likely significantly attenuated by their water content; when eaten the exposure rate may be significantly higher. Even worse, it is likely that the detected radioactivity is from a longer half life radionuclide; which when eaten, would irradiate a person from the inside out for potential years to come." (@Potrblog, July 10th, 2011, at 8:05 pm, www.enviroreporter.com/2011/03/enviroreporter-coms-radiation-station/)
  • Pollet reviewed Iodine 131 numbers released by the Environmental Protection Agency last spring and reported to KING5 TV, "The level that was detected on March 24 was 41 times the drinking water standard." 
  • The EPA says this was a brief period of elevated radiation in rainwater, and that safe drinking water standards are based on chronic exposure to radiation over a lifetime. Independent radiation experts disagree, including Dr. Helen Caldicott, a leading international advocate for the prevention of nuclear injury; Joseph Mangano; Cindy Folkers, a radiation and health specialist at Beyond Nuclear; Erich Pica, president of Friends of the Earth; and Dr. Alexey Yablokov.
  • In light of the government's ongoing failure to provide critically important Fukushima radiation news, each of the above-named experts has recommended that, to survive Fukushima, the public seek out information provided by activists and by websites such as Beyond Nuclear and EnviroReporter.
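The peach readings quoted above are simple count-rate arithmetic; a small sketch of the comparison (the 81 and 39 CPM figures come from the article; the helper function is ours):

```python
# Compare a sample's count rate against site background, as in the peach readings.
def times_background(sample_cpm, background_cpm):
    return sample_cpm / background_cpm

ratio = times_background(81, 39)  # peaches vs. market background
net_cpm = 81 - 39                 # net counts per minute above background
print(f"{ratio:.2f}x background, {net_cpm} CPM net")  # 2.08x background, 42 CPM net
```

Note that CPM is an instrument-dependent count rate, not a dose or a concentration; a ratio over background indicates contamination but by itself says nothing about which radionuclide is present, which is why the quoted commenter stresses the half-life question.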
D'coda Dcoda

Impacts of the Fukushima Nuclear Power Plants on Marine Radioactivity - Environmental S... - 0 views

  • The impacts on the ocean of releases of radionuclides from the Fukushima Dai-ichi nuclear power plants remain unclear. However, information has been made public regarding the concentrations of radioactive isotopes of iodine and cesium in ocean water near the discharge point. These data allow us to draw some basic conclusions about the relative levels of radionuclides released which can be compared to prior ocean studies and be used to address dose consequences as discussed by Garnier-Laplace et al. in this journal.(1) The data show peak ocean discharges in early April, one month after the earthquake and a factor of 1000 decrease in the month following. Interestingly, the concentrations through the end of July remain higher than expected implying continued releases from the reactors or other contaminated sources, such as groundwater or coastal sediments. By July, levels of 137Cs are still more than 10 000 times higher than levels measured in 2010 in the coastal waters off Japan. Although some radionuclides are significantly elevated, dose calculations suggest minimal impact on marine biota or humans due to direct exposure in surrounding ocean waters, though considerations for biological uptake and consumption of seafood are discussed and further study is warranted.
  • There was no large explosive release of core reactor material, so most of the isotopes reported to have spread thus far via atmospheric fallout are primarily the radioactive gases plus fission products such as cesium, which are volatilized at the high temperatures in the reactor core, or during explosions and fires. However, some nonvolatile activation products and fuel rod materials may have been released when the corrosive brines and acidic waters used to cool the reactors interacted with the ruptured fuel rods, carrying radioactive materials into the ground and ocean. The full magnitude of the release has not been well documented, nor are there data on many of the possible isotopes released, but we do have significant information on the concentrations of several isotopes of Cs and I in the ocean near the release point, which have been publicly available since shortly after the accident started.
  • We present a comparison of selected data made publicly available by a Japanese company and agencies and compare these to previously published radionuclide concentrations in the oceans. The primary sources included TEPCO (Tokyo Electric Power Company), which reported data in regular press releases(3) that are compiled here (Supporting Information Table S1). These TEPCO data were obtained by sampling 500 mL of surface ocean water from shore and direct counting on high-purity germanium gamma detectors for 15 min at laboratories at the Fukushima Dai-ni NPPs. They initially reported results for 131I (t1/2 = 8.02 days), 134Cs (t1/2 = 2.065 years) and 137Cs (t1/2 = 30.07 years). Data from MEXT (Ministry of Education, Culture, Sports, Science and Technology—Japan) were also released on a public Web site(4) and are based on similar direct counting methods. In general, MEXT data were obtained by sampling 2000 mL of seawater and direct counting on high-purity germanium gamma detectors for 1 h in a 2 L Marinelli beaker at laboratories in the Japan Atomic Energy Agency. The detection limits of the 137Cs measurements are about 20 000 Bq m–3 for TEPCO data and 10 000 Bq m–3 for MEXT data. These measurements were conducted based on a guideline described by MEXT.(5) Both sources are considered reliable, given the common activity ratios, prior studies, and the expertise evident in the several Japanese groups involved in making these measurements. These early monitoring activities were motivated by concern for immediate health effects, so results were often reported relative to statutory limits adopted by Japanese authorities (as scaling factors above "normal") rather than in concentration units. Here we convert values from both sources to the radionuclide activity units common to prior ocean studies of fallout (Bq m–3) for ease of comparison to previously published data.
  • We focus on the most complete time-series records from the north and south discharge channels at the Dai-ichi NPPs, and two sites to the south that were not considered sources, namely the north discharge channels at the Dai-ni NPPs about 10 km to the south and Iwasawa beach, 16 km south of the Dai-ichi NPPs (Figure 1). The levels at the discharge point are exceedingly high, with a peak 137Cs concentration of 68 million Bq m–3 on April 6 (Figure 2). What is significant is not just the elevated concentrations but the timing of the peak release, approximately one month after the earthquake. This delayed release is presumably due to the complicated pattern of discharge of seawater and fresh water used to cool the reactors and spent fuel rods, interactions with groundwater, and intentional and unintentional releases of mixed radioactive material from the reactor facility.
  • the concentrations of Cs in sediments and biota near the NPPs may be quite large, and will continue to remain so for at least 30–100 years due to the longer half-life of 137Cs which is still detected in marine and lake sediments from 1960s fallout sources.
  • If the source at Fukushima had stopped abruptly and ocean mixing processes had continued at the same rates, one would have expected the 137Cs activities to decrease by an additional factor of 1000 from May to June, but that was not observed. The break in slope in early May implies that a steady, albeit lower, source of 137Cs continued to discharge to the ocean at least through the end of July at this site. With reports of highly contaminated cooling waters at the NPPs and complete melt-through of at least one of the reactors, this is not surprising. As we have no reason to expect a change in the mixing rates of the ocean, which would also affect this dilution rate, this change in slope of 137Cs in early May is clear evidence that the Dai-ichi NPPs remain a significant source of contamination to the coastal waters off Japan. There are currently no data that allow us to distinguish between several possible sources of continued releases, but these most likely include some combination of direct releases from the reactors or storage tanks, or indirect releases from groundwater beneath the reactors or coastal sediments, both of which are likely contaminated from the period of maximum releases.
  • It is prudent to point out though what is meant by “significant” to both ocean waters and marine biota. With respect to prior concentrations in the waters off Japan, all of these values are elevated many orders of magnitude. 137Cs has been tracked quite extensively off Japan since the peak weapons testing fallout years in the early 1960s.(13) Levels in the region east of Japan have decreased from a few 10s of Bq m–3 in 1960 to 1.5 Bq m–3 on average in 2010 (Figure 2; second x-axis). The decrease in 137Cs over this 50 year record reflects both radioactive decay of 137Cs with a 30 year half-life and continued mixing in the global ocean of 137Cs to depth. These data are characteristic of other global water masses.(14) Typical ocean surface 137Cs activities range from <1 Bq m–3 in surface waters in the Southern Hemisphere, which are lower due to lower weapons testing inputs south of the equator, to >10–100 Bq m–3 in the Irish Sea, North Sea, Black Sea, and Baltic Seas, which are elevated due to local sources from the intentional discharges at the nuclear fuel reprocessing facilities at Sellafield in the UK and Cape de la Hague in France, as well as residual 137Cs from Chernobyl in the Baltic and Black Seas. Clearly then on this scale of significance, levels of 137Cs 30 km off Japan were some 3–4 orders of magnitude higher than existed prior to the NPP accidents at Fukushima.
  • Finally though, while the Dai-ichi NPP releases must be considered “significant” relative to prior sources off Japan, we should not assume that dose effects on humans or marine biota are necessarily harmful or even will be measurable. Garnier-Laplace et al.(1) report a dose reconstruction signal for the most impacted areas to wildlife on land and in the ocean. Like this study, they are relying on reported activities to calculate forest biota concentrations,
    From Woods Hole; note that calculations are based on reports from TEPCO and other Japanese agencies. Quite a bit more to read on the site.
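The 50-year decline in 137Cs described in the annotations mixes radioactive decay with ocean mixing. The decay component alone follows from the 30.07-year half-life; a sketch (our calculation, not from the paper):

```python
import math

# Fraction of 137Cs activity remaining after t years, given its half-life.
def fraction_remaining(t_years, half_life=30.07):
    return math.exp(-math.log(2) * t_years / half_life)

# Decay alone over the 1960-2010 record discussed in the paper:
f = fraction_remaining(50)
print(f"{f:.2f} of 1960s-fallout 137Cs remains after 50 years")  # about 0.32
```

Decay alone accounts for only about a threefold drop, so the observed decline from a few tens of Bq m–3 to 1.5 Bq m–3 must mostly reflect mixing of 137Cs to depth, matching the paper's attribution.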
D'coda Dcoda

How safe is India's nuclear energy programme? [23Aug11] - 0 views

  • The March nuclear disaster in Fukushima in Japan led countries with nuclear power plants to revisit safety measures. The International Atomic Energy Agency constituted a global expert fact-finding mission to the island nation. The purpose of the mission was to ascertain facts and identify initial lessons to be learned for sharing with the nuclear community.
  • The mission submitted its report in June, stating in clear terms that “there were insufficient defence for tsunami hazards. Tsunami hazards that were considered in 2002 were underestimated. Additional protective measures were not reviewed and approved by the regulatory authority. Severe accident management provisions were not adequate to cope with multiple plant failures”.
  • Further, on the regulatory environment the report states: “Japan has a well organized emergency preparedness and response system as demonstrated by the handling of the Fukushima accident. Nevertheless, complicated structures and organizations can result in delays in urgent decision making.” The inability to foresee such extreme scenarios is a forewarning to countries that are expanding nuclear capacity at a frenzied pace.
  • For India, this is a lesson and an exceptional opportunity to relook at the protected structures of the department of atomic energy (DAE), and establish more transparent processes and procedures.
  • In the past, the Three Mile Island incident (1979) and the Chernobyl accident (1986) provided similar opportunities to evaluate nuclear safety and regulatory systems. India, in response to these incidents, constituted safety audits to assess the safety of nuclear power plants. However, A. Gopalakrishnan, a former chairman of the Atomic Energy Regulatory Board, said in a recent article: “DAE management classified these audit reports as ‘top secret’ and shelved them. No action was taken on the committee’s findings.”
  • If this is so, these reports, or at least action-taken reports, ought to have been published and made available. Such steps could have guaranteed DAE considerable public faith in the functioning of regulatory authorities and given significant confidence in engaging with stakeholders in the present expansion plan.
  • Post-Fukushima, Nuclear Power Corp. of India Ltd has undertaken a safety evaluation of 20 nuclear power plants, both operating and under construction. The interim report, titled Safety Evaluation of Indian Nuclear Power Plants Post Fukushima Incident, suggested a series of safety measures that must be incorporated in all the audited nuclear power plants in a time-bound manner. The measures pertain to strengthening technical and power systems, automatic reactor shutdown on sensing seismic activity, enhancement of tsunami bunds at all coastal stations, etc.
  • However, in the same breath, the report provides assurance by stating that “adequate provisions exist at Indian nuclear power plants to handle station blackout situations and maintain continuous cooling of reactor cores for decay heat removal”. Further, the report recalls that “the incidents at Indian nuclear power plants, like prolonged loss of power supplies at Narora plant in 1993, flood incident at Kakrapar plant in 1994 and tsunami at Madras (Chennai) plant in 2004 were managed successfully with existing provisions.”
  • DAE’s official response, post-Fukushima, has been cautious while providing assurance. Separately, DAE has made it clear the nuclear energy programme will continue as planned after incorporating the additional safety features identified by the safety audit report.
  • Prime Minister Manmohan Singh in his speech two days ago in West Bengal was emphatic about the future of India’s nuclear energy programme. He said that “there would be no looking back on nuclear energy. We are in the process of expanding our civil nuclear energy programme. Even as we do so, we have to ensure that the use of nuclear energy meets the highest safety standards. This is a matter on which there can be no compromise”.
  • S. Banerjee, chairman of Atomic Energy Commission and secretary DAE at the International Atomic Energy Agency Ministerial Conference on Safety, categorically said: “India’s effort has been to achieve continuous improvement and innovation in nuclear safety with the basic principle being, safety first, production next.” This is important at a time when we are in the process of expanding nuclear capacity at an incredible pace.
  • Currently, there are several domestic and international power projects in the pipeline. DAE has projected 20,000 MWe (megawatt electric) by 2020, up from the present 4,780 MWe, a more than fourfold increase over current production. Going further, Banerjee stated that India hopes to achieve targets exceeding 30,000 MWe by 2020 and 60,000 MWe by 2032. This is a tall order, considering our experience in executing major infrastructure projects; DAE has struggled in the past to achieve targets.
  • Execution of these targets is to be achieved by importing high-capacity reactors and through DAE’s own programme. As we see greater activity in the nuclear energy sector, which has traditionally not been transparent in engaging with the public, the trust deficit could only widen as the programme expands.
  • Land acquisition is already a major concern for infrastructure projects and has become an issue at the proposed Jaitapur nuclear power plant as well. However, the biggest challenge in this expansion would be to convince the public of the safety and security of nuclear power plants and also to arrive at a comprehensive information and communication package for states in whose territory projects are being executed. Because of the nature of India’s nuclear programme, the combined existence of civilian and military programmes, the nation may not be in a position to achieve the kind of regulatory autonomy, process and engagement that has been witnessed in many European countries and in the US.
  • The bifurcation of India’s nuclear establishment into civilian and military spheres, subsequent to its commitments under India-US civil nuclear cooperation, has provided the prospect of an empowered regulatory system.
  • Incidents in Jaitapur and the Fukushima nuclear disaster have further pushed the government to commit to establishing an independent nuclear regulator, a Bill for which is expected in Parliament this year. The nuclear programme is likely to face more complex environmental, social and health issues in the future. Neighbouring countries may also join the chorus soon, since some of the proposed nuclear power plant sites are close to our borders.
D'coda Dcoda

Nuclear Stress tests take on Fukushima lessons [19Sep11] - 0 views

  • European regulators have been publishing progress reports on the program of stress tests being carried out at nuclear power plants in response to the Fukushima accident. In the weeks following the Fukushima accident, the European Council (EC) requested a review of safety at European nuclear power plants when faced by challenging situations. The criteria for the reviews, now known as stress tests, were produced for the European Commission by the European Nuclear Safety Regulatory Group (ENSREG). Progress reports were due to be submitted to the European Commission by 15 September, and many nuclear regulators and in some cases plant operators have published summaries, including regulators in Belgium, France, Hungary, Romania, Slovakia, Slovenia, Spain, Sweden and the UK.
  • The reports vary from country to country, but the take-home story emerging from them is that Europe's nuclear plants are generally well placed to withstand beyond-design-basis events. Some plants have already put into practice initial measures to improve safety in response to Fukushima, and the tests are bringing to light more measures that need to be taken to improve resilience on a plant-by-plant basis. Some measures that have already been identified are simple to put into place: for example, housekeeping routines have been changed to reduce the potential for seismic interactions (where non-safety-related equipment could impact or fall onto seismically qualified equipment) at UK power plants.
  • The stress tests focus on three areas highlighted by events in Japan: external threats from earthquake and flooding, specifically tsunami; the consequences of loss of safety functions, that is, a total loss of electricity supply (also referred to as station black-out, or SBO), the loss of the ultimate heat sink (UHS), or both; and issues connected with the management of severe accidents. The UHS is a medium to which the residual heat from the reactor is transferred, for example the sea or a river.
  • ...3 more annotations...
  • While tsunamis are not foreseen as a problem in Europe, the plants have been obliged to consider other external and internal initiating events that could trigger a loss of safety functions. In France, a total of 150 nuclear facilities, including operating reactors, reactors under construction, research reactors and other nuclear facilities, are affected. In its progress report, French regulator Autorité de Sûreté Nucléaire (ASN) notes that the risk of phenomena similar to those that triggered the Fukushima accident is negligible, and says that it prefers to submit a more comprehensive report for all of its affected installations later in the year. However, reports for the 80 facilities identified as priorities have been submitted, and those for the country's 58 operating power reactors have already been published on the ASN's web site.
  • No fundamental weaknesses in the definition of design-basis events, or in the safety systems to withstand them, have been revealed for UK nuclear power plants by either the stress tests or earlier national reviews, according to the progress report from the UK's Office for Nuclear Regulation (ONR). However, lessons are being learnt about improving resilience for beyond-design-basis events and removing or reducing cliff-edges, and these will be applied in a timely manner, the regulator says.
  • Measures under consideration in the UK include the provision of additional local flood protection to key equipment and the provision of further emergency back-up equipment to provide cooling and power, while EDF Energy, operator of the country's AGRs and single PWR plant, is preparing additional studies to reconsider flood modelling for specific sites and to review recent climate change information that arrived subsequent to recent routine ten-yearly safety reviews. The main focus for the country's Magnox reactors will be to improve the reliability of cooling systems in the face of a variety of beyond-design-basis faults to reduce or minimise the potential for cliff-edges. Evaluations of findings are still ongoing. Operators have up to 31 October to make their full report back to their national regulator, and regulators have until 31 December to make their full reports to the European Commission.
D'coda Dcoda

The Dispatch Queue - An Alternative Means of Accounting for External Costs? [28Sep11] - 0 views

  • Without much going on recently that hasn’t been covered by other blog posts, I’d like to explore a topic not specifically tied to nuclear power or to activities currently going on in Washington, D.C. It involves an idea I have about a possible alternative means of having the electricity market account for the public health and environmental costs of various energy sources, and encouraging the development and use of cleaner sources (including nuclear) without requiring legislation. Given the failure of Congress to take action on global warming, as well as environmental issues in general, non-legislative approaches to accomplishing environmental goals may be necessary.

The Problem
  • One may say that the best response would be to significantly tighten pollution regulations, perhaps to the point where no sources have significant external costs. There are problems with this approach, however, above and beyond the fact that the energy industry has successfully blocked, and will likely continue to block, the legislation that would be required. Significant tightening of regulations raises issues such as how expensive compliance will be, and whether or not viable alternative (cleaner) sources would be available. The beauty of simply placing a cost (or tax) on pollution that reflects its costs to public health and the environment is that those issues need not be addressed. The market just decides between sources based on the true, overall cost of each, resulting in the minimum overall (economic + environmental) cost generation portfolio.
  • The above reasoning is what led to policies like cap-and-trade or a CO2 emissions tax being proposed as a solution for the global warming problem. This has not flown politically, however. Policies that attempt to have external costs included in the market cost of energy have been labeled a “tax increase.” This is particularly true given that the associated pollution taxes (or emissions credit costs) would have largely gone to the government.
  • ...15 more annotations...
  • One final idea, which does not involve money going to or from government, is simply requiring that cleaner sources provide a certain fraction of our overall power generation. The many state Renewable Portfolio Standards (that do not include nuclear) and the Clean Energy Standard being considered by Congress and the Obama administration (which does include nuclear) are examples of this policy. While better than nothing, such policies are not ideal in that they are crude, and don’t involve a quantitative incentive based on real external costs. An energy source is either defined as “clean,” or it is not. Note that the definition of “clean” would be decided politically, as opposed to objectively based on tangible external costs determined by scientific studies (nuclear’s exclusion from state Renewable Portfolio Standards policies being one outrageous example). Finally, there is the fact that any such policy would require legislation.
  • Well, if we can’t tax pollution, how about encouraging the use of clean sources by giving them subsidies? This has proved to be more popular so far, but this idea has also recently run into trouble, given the current situation with the budget deficit and national debt. Events like the Solyndra bankruptcy have put government clean energy subsidies even more on the defensive. Thus, it seems that neither policies involving money flowing to the government nor policies involving money flowing from the government are politically viable at this point.
  • All of the above raises the question of whether there is a policy available that will encourage the use of cleaner energy sources that is revenue-neutral (i.e., does not involve money flowing to or from the government), does not involve the outright (political) selection of certain energy sources over others, and does not require legislation.

Enter the Dispatch Queue
  • There must be enough power plants in a given region to meet the maximum load (or demand) expected to occur. In fact, total generation capacity must exceed maximum demand by a specified “reserve margin,” to address the possibility of a plant going offline, or other possible considerations. Due to the fact that demand varies significantly with time, a significant fraction of the generation capacity remains offline, some or most of the time. The dispatch queue is a means by which utilities, or independent regional grid operators, decide which power plants will operate in order to meet demand at any given instant. A good discussion of dispatch queues and how they operate can be found in this Department of Energy report.
  • The general goal of the methodology used to set the dispatch queue order is to minimize overall generation cost, while staying in compliance with all federal or state laws (environmental rules, etc.). This is done by placing the power plants with the lowest “variable” cost first in the queue. Plants with the highest “variable” cost are placed last. The “variable” cost of a plant represents how much more it costs to operate the plant than it costs to leave it idle (i.e., it includes the fuel cost and maintenance costs that arise from operation, but does not include the plant capital cost, personnel costs, or any fixed maintenance costs). Thus, one starts with the least expensive plants, and moves up (in cost) until generation meets demand. The remaining, more expensive plants are not fired up. This ensures that the lowest-operating-cost set of plants is used to meet demand at any given time.
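The merit-order logic described above can be sketched in a few lines. This is an illustrative toy model, not an actual ISO dispatch algorithm; the plant names, capacities, and variable costs below are all hypothetical.

```python
# Merit-order dispatch: run plants in ascending order of variable cost
# until demand is met. All plant data below are hypothetical.

def dispatch(plants, demand_mw):
    """Return {plant name: dispatched MW}, filling the cheapest plants first."""
    queue = sorted(plants, key=lambda p: p["variable_cost"])  # $/MWh, ascending
    schedule = {}
    remaining = demand_mw
    for plant in queue:
        if remaining <= 0:
            break  # demand met; more expensive plants stay offline
        mw = min(plant["capacity_mw"], remaining)
        schedule[plant["name"]] = mw
        remaining -= mw
    return schedule

plants = [
    {"name": "nuclear",    "capacity_mw": 1000, "variable_cost": 12},
    {"name": "coal",       "capacity_mw": 800,  "variable_cost": 25},
    {"name": "gas_cc",     "capacity_mw": 600,  "variable_cost": 30},
    {"name": "gas_peaker", "capacity_mw": 400,  "variable_cost": 80},
]

print(dispatch(plants, 2200))
# The low-cost nuclear and coal plants run at full capacity, the combined-cycle
# gas plant covers the remaining 400 MW, and the expensive peaker never fires up.
```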
  • As far as who makes the decisions is concerned, in many cases the local utility itself runs the dispatch for its own service territory. In most of the United States, however, there is a large regional grid (covering several utilities) that is operated by an Independent System Operator (ISO) or Regional Transmission Organization (RTO), and those organizations, which are independent of the utilities, set the dispatch queue for the region.

The Idea
  • As discussed above, a plant’s place in the dispatch queue is based upon variable cost, with the lowest variable cost plants being first in the queue. As discussed in the DOE report, all the dispatch queues in the country base the dispatch order almost entirely on variable cost, with the only possible exceptions being issues related to maximizing grid reliability. What if the plant dispatch methodology were revised so that environmental costs were also considered? Ideally, the public health and environmental costs would be objectively and scientifically determined and cast in terms of an equivalent economic cost (as has been done in many scientific studies such as the ExternE study referenced earlier). The calculated external cost would be added to a plant’s variable cost, and its place in the dispatch queue would be adjusted accordingly. The net effect would be that dirtier plants would be run much less often, resulting in greatly reduced pollution.
  • This could have a huge impact in the United States, especially at the current time. Currently, natural gas prices are so low that the variable costs of combined-cycle natural gas plants are not much higher than those of coal plants, even without considering environmental impacts. Also, there is a large amount of natural gas generation capacity sitting idle.
  • More specifically, if dispatch queue ordering methods were revised to even place a small (economic) weight on environmental costs, there would be a large switch from coal to gas generation, with coal plants (especially the older, dirtier ones) moving to the back of the dispatch queue, and only running very rarely (at times of very high demand). The specific idea of putting gas plants ahead of coal plants in the dispatch queue is being discussed by others.
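A minimal sketch of the proposed reordering, again with hypothetical cost figures: even a small economic weight on a coal plant's external cost is enough to move it behind a combined-cycle gas plant in the queue.

```python
# Environment-adjusted merit order: sort plants by variable cost plus a
# weighted external (public health/environmental) cost. The figures here are
# hypothetical; real external costs would come from studies such as ExternE.

plants = [
    {"name": "coal",   "variable_cost": 25, "external_cost": 40},  # $/MWh
    {"name": "gas_cc", "variable_cost": 30, "external_cost": 10},
]

def queue_order(plants, weight):
    """Dispatch order when external costs carry the given economic weight."""
    return [p["name"] for p in sorted(
        plants, key=lambda p: p["variable_cost"] + weight * p["external_cost"])]

print(queue_order(plants, weight=0.0))  # pure variable cost: coal runs first
print(queue_order(plants, weight=0.2))  # a small weight flips the order: gas first
```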
  • The beauty of this idea is that it does not involve any type of tax or government subsidy. It is revenue-neutral. Also, depending on the specifics of how it’s implemented, it can be quantitative in nature, with the environmental costs of various power plants being objectively weighed, as opposed to certain sources simply being chosen, by government/political fiat, over others. It also may not require legislation (see below). Finally, dispatch queues and their policies and methods are a rather arcane subject and are generally below the political radar (many folks haven’t even heard of them). Thus, this approach may allow the nation’s environmental goals to be (quietly) met without causing a political uproar. It could allow policy makers to do the right thing without paying too high a political cost.
  • Questions/Issues

The DOE report does mention some examples of dispatch queue methods factoring in issues other than just the variable cost. It is fairly common for issues of grid reliability to be considered. Also, compliance with federal or state environmental requirements can have some impacts. Examples of such laws include limits on the hours of operation for certain polluting facilities, or state requirements that a “renewable” facility generate a certain amount of power over the year. The report also discusses the possibility of favoring more fuel-efficient gas plants over less efficient ones in the queue, even if using the less efficient plants at that moment would have cost less, in order to save natural gas. Thus, the report does discuss deviations from the pure cost model, to consider things like environmental impact and resource conservation.
  • I could not ascertain from the DOE report, however, what legal authorities govern the entities that make the plant dispatch decisions (i.e., the ISOs and RTOs), and what types of action would be required in order to change the dispatch methodology (e.g., whether legislation would be required). The DOE report was a study that was called for by the Energy Policy Act of 2005, which implies that its conclusions would be considered in future congressional legislation. I could not tell from reading the report if the lowest cost (only) method of dispatch is actually enshrined somewhere in state or federal law. If so, the changes I’m proposing would require legislation, of course.
  • The DOE report states that in some regions the local utility runs the dispatch queue itself. In the case of the larger grids run by the ISOs and RTOs (which cover most of the country), the report implies that those entities are heavily influenced, if not governed, by the Federal Energy Regulatory Commission (FERC), which is part of the executive branch of the federal government. In the case of utility-run dispatch queues, it seems that nothing short of new regulations (on pollution limits, or direct guidance on dispatch queue ordering) would result in a change in dispatch policy. Whereas reducing cost and maximizing grid reliability would be directly in the utility’s interest, favoring cleaner generation sources in the queue would not, unless it is driven by regulations. Thus, in this case, legislation would probably be necessary, although it’s conceivable that the EPA could act (like it’s about to on CO2).
  • In the case of the large grids run by ISOs and RTOs, it’s possible that such a change in dispatch methodology could be made by the federal executive branch, if indeed the FERC has the power to mandate such a change.
  • Effect on Nuclear

With respect to the impacts of including environmental costs in plant dispatch order determination, I’ve mainly discussed the effects on gas vs. coal. Indeed, a switch from coal to gas would be the main impact of such a policy change. As for nuclear, as well as renewables, the direct/immediate impact would be minimal. That is because both nuclear and renewable sources have high capital costs but very low variable costs. They also have very low environmental impacts; much lower than those of coal or gas. Thus, they will remain at the front of the dispatch queue, ahead of both coal and gas.
D'coda Dcoda

Report Assails Japan Response to Fukushima Daiichi Nuclear Accident [26Dec11] - 0 views

  • From inspectors’ abandoning of the Fukushima Daiichi nuclear power plant as it succumbed to disaster to a delay in disclosing radiation leaks, Japan’s response to the nuclear accident caused by the March tsunami fell tragically short, a government-appointed investigative panel said on Monday.
  • The failures, which the panel said worsened the extent of the disaster, were outlined in a 500-page interim report detailing Japan’s response to the calamitous events that unfolded at the Fukushima plant after the March 11 earthquake and tsunami knocked out all of the site’s power.
  • The panel attacked the use of the term “soteigai,” or “unforeseen,” that plant and government officials used both to describe the unprecedented scale of the disaster and to explain why they were unable to stop it. Running a nuclear power plant inherently required officials to foresee the unforeseen, said the panel’s chairman, Yotaro Hatamura, a professor emeritus in engineering at the University of Tokyo. “There was a lot of talk of soteigai, but that only bred perceptions among the public that officials were shirking their responsibilities,” Mr. Hatamura said.
  • ...6 more annotations...
  • Tokyo Electric had assumed that no wave would reach more than about 20 feet. The tsunami hit at more than twice that height.
  • Officials of Japan’s nuclear regulator present at the plant during the quake quickly left the site, and when ordered to return by the government, they proved of little help to workers racing to restore power and find water to cool temperatures at the plant, the report said.
  • The workers left at Fukushima Daiichi had not been trained to handle multiple failures, and lacked a clear manual to follow, the report said. A communications breakdown meant that workers at the plant had no clear sense of what was happening.
  • In particular, an erroneous assumption that an emergency cooling system was working led to hours of delay in finding alternative ways to draw cooling water to the plant, the report said. All the while, the system was not working, and the uranium fuel rods at the cores were starting to melt.
  • Most devastatingly, the government failed to make use of data on the radioactive plumes released from the plant to warn local towns and direct evacuations, the report said. The failure allowed entire communities to be exposed to harmful radiation, the report said. “Authorities failed to think of the disaster response from the perspective of victims,” Mr. Hatamura said.
  • But the interim report seems to leave ultimate responsibility for the disaster ambiguous. Even if workers had realized that the emergency cooling system was not working, they might not have been able to prevent the meltdowns. The panel limited itself to suggesting that a quicker response might have mitigated the core damage and lessened the release of radiation into the environment.
D'coda Dcoda

Gov't kept silent on worst-case scenario at height of nuclear crisis ‹ [26Jan12] - 0 views

  • The Japanese government’s worst-case scenario at the height of the nuclear crisis last year warned that tens of millions of people, including Tokyo residents, might need to leave their homes, according to a report obtained by The Associated Press. But fearing widespread panic, officials kept the report secret. The recent emergence of the 15-page internal document may add to complaints in Japan that the government withheld too much information about the nuclear accident.
  • It also casts doubt on whether the government was sufficiently prepared to cope with what could have been an evacuation of unprecedented scale. The report was submitted to then-Prime Minister Naoto Kan and his top advisers on March 25, two weeks after the earthquake and tsunami devastated the Fukushima Daiichi nuclear power plant, causing three reactors to melt down and generating hydrogen explosions that blew away protective structures. Workers ultimately were able to bring the reactors under control, but at the time, it was unclear whether emergency measures would succeed. Kan commissioned the report, compiled by the Japan Atomic Energy Commission, to examine what options the government had if those efforts failed.
  • It said that each contingency was possible at the time it was written, and could force all workers to flee the vicinity, meaning the situation at the plant would unfold on its own, unmitigated. Using matter-of-fact language, diagrams and charts, the report said that if meltdowns spiral out of control, radiation levels could soar. In that case, it said, evacuation orders should be issued for residents within, and possibly beyond, a 170-kilometer radius of the plant, and “voluntary” evacuations should be offered for everyone living within 250 kilometers and even beyond that range. That’s an area that would have included Tokyo and its suburbs, with a population of 35 million people, and other major cities such as Sendai, with a million people, and Fukushima city, with 290,000 people.
  • ...2 more annotations...
  • The report looked at several ways the crisis could escalate: explosions inside the reactors, complete meltdowns, and the structural failure of cooling pools used for spent nuclear fuel.
  • The report further warned that contaminated areas might not be safe for “several decades.”

“We cannot rule out further developments that may lead to an unpredictable situation at Fukushima Daiichi nuclear plant, where there has been an accident, and this report outlines a summary of that unpredictable situation,” says the document, written by Shunsuke Kondo, head of the commission, which oversees nuclear policy.

After Kan received the report, he and other Japanese officials publicly insisted that there was no need to prepare for wider-scale evacuations.
D'coda Dcoda

TEPCO Is Not Providing English Translation of Its Report to NISA on Emergency Cooling S... - 0 views

  • The Japanese government seems to be "instructing" TEPCO not to release certain information in English. TEPCO submitted the report to its regulatory agency, the Nuclear and Industrial Safety Agency (NISA), "on the measures to continue water injection into reactors of Units 1 to 3 at Fukushima Daiichi Nuclear Power Station" on August 3. It is in Japanese only, and it may or may not be translated into English. According to TEPCO:

"We have provided a Japanese press release version of the instruction document received from NISA. However, at this time we have reserved the right not to provide an English version due to potential misunderstandings that may arise from an inaccurate rendering of the original Japanese text. We may provide the English translation that NISA releases in our press releases. However, in principle we would advise you to visit the NISA website for timely and accurate information."

(From TEPCO's English press release on August 3, explaining why they are releasing the information only in Japanese.) The 34-page Japanese report is here.
  • The report talks about the fuel inside the Reactor Pressure Vessels; it talks about the reactors as if they were sound; and it states that zirconium will start to interact with water at a certain temperature (1,200 degrees Celsius).
  • Most likely, there is no fuel left inside the RPVs at Fukushima I Nuke Plant. Even if there is, it is not fuel any more but "corium" - fuel, control rods, instruments, whatever inside the RPV, melted together. TEPCO has already admitted that there are holes in the RPV, and holes in the Containment Vessels. There is no zirconium left because there is no cladding left.
  • ...3 more annotations...
  • Nowhere in the report does the company say anything about melted fuel, broken reactors, water in the basements, or extremely high radiation at certain locations in the plant. But the report goes on to describe the elaborate backup pump system and power system as if what they are dealing with is normal reactors (i.e. without cracks or holes at the bottom) with intact fuel rods inside the RPVs and control rods safely deployed in a clean nuclear power plant, and all they need to worry about is how they can continue the cooling; or as if the salt-encrusted molten mess of everything that was inside the RPV behaves just the same as normal fuel rods in a normal reactor.
  • Why was TEPCO asked by NISA to submit this report to begin with? So that the national government can begin the discussion with the local municipalities within the 20-kilometer radius evacuation zone for the return of the residents to their towns and villages. The discussion is to begin this month, and TEPCO's report will be used to reassure the residents that Fukushima I Nuke Plant is so stable now with the solid plans (to be approved by NISA, which no doubt will happen very soon) to cool the fuels in the reactors even in case of an emergency.
  • Remember the mayor of Naraha-machi, where Fukushima II Nuclear Power Plant is located? He wants TEPCO to restart the plant so that 5,000 jobs will return to the town. He also wanted to invite the government to build the final processing plant of spent nuclear fuels in his town. He would be the first one to highly approve of the report so that his town can continue to prosper with nuclear money.
D'coda Dcoda

Economic Aspects of Nuclear Fuel Reprocessing [12Jul05] - 0 views

  • On Tuesday, July 12, the Energy Subcommittee of the House Committee on Science will hold a hearing to examine whether it would be economical for the U.S. to reprocess spent nuclear fuel and what the potential cost implications are for the nuclear power industry and for the Federal Government. This hearing is a follow-up to the June 16 Energy Subcommittee hearing that examined the status of reprocessing technologies and the impact reprocessing would have on energy efficiency, nuclear waste management, and the potential for proliferation of weapons-grade nuclear materials.
  • Dr. Richard K. Lester is the Director of the Industrial Performance Center and a Professor of Nuclear Science and Engineering at the Massachusetts Institute of Technology. He co-authored a 2003 study entitled The Future of Nuclear Power. Dr. Donald W. Jones is Vice President of Marketing and Senior Economist at RCF Economic and Financial Consulting, Inc. in Chicago, Illinois. He co-directed a 2004 study entitled The Economic Future of Nuclear Power. Dr. Steve Fetter is the Dean of the School of Public Policy at the University of Maryland. He co-authored a 2005 paper entitled The Economics of Reprocessing vs. Direct Disposal of Spent Nuclear Fuel. Mr. Marvin Fertel is the Senior Vice President and Chief Nuclear Officer at the Nuclear Energy Institute.
  • 3. Overarching Questions

Under what conditions would reprocessing be economically competitive, compared to both nuclear power that does not include fuel reprocessing, and other sources of electric power? What major assumptions underlie these analyses?

What government subsidies might be necessary to introduce a more advanced nuclear fuel cycle (that includes reprocessing, recycling, and transmutation—"burning" the most radioactive waste products in an advanced reactor) in the U.S.?
  • ...13 more annotations...
  • 4. Brief Overview of Nuclear Fuel Reprocessing (from June 16 hearing charter)

Nuclear reactors generate about 20 percent of the electricity used in the U.S. No new nuclear plants have been ordered in the U.S. since 1973, but there is renewed interest in nuclear energy, both because it could reduce U.S. dependence on foreign oil and because it produces no greenhouse gas emissions.

One of the barriers to increased use of nuclear energy is concern about nuclear waste. Every nuclear power reactor produces approximately 20 tons of highly radioactive nuclear waste every year. Today, that waste is stored on-site at the nuclear reactors in water-filled cooling pools or, at some sites, after sufficient cooling, in dry casks above ground. About 50,000 metric tons of commercial spent fuel is being stored at 73 sites in 33 states. A recent report issued by the National Academy of Sciences concluded that this stored waste could be vulnerable to terrorist attacks.
  • Under the current plan for long-term disposal of nuclear waste, the waste from around the country would be moved to a permanent repository at Yucca Mountain in Nevada, which is now scheduled to open around 2012. The Yucca Mountain facility continues to be a subject of controversy. But even if it opened and functioned as planned, it would have only enough space to store the nuclear waste the U.S. is expected to generate by about 2010.

Consequently, there is growing interest in finding ways to reduce the quantity of nuclear waste. A number of other nations, most notably France and Japan, "reprocess" their nuclear waste. Reprocessing involves separating out the various components of nuclear waste so that a portion of the waste can be recycled and used again as nuclear fuel (instead of disposing of all of it). In addition to reducing the quantity of high-level nuclear waste, reprocessing makes it possible to use nuclear fuel more efficiently. With reprocessing, the same amount of nuclear fuel can generate more electricity because some components of it can be used as fuel more than once.
  • The greatest drawback of reprocessing is that current reprocessing technologies produce weapons-grade plutonium (which is one of the components of the spent fuel). Any activity that increases the availability of plutonium increases the risk of nuclear weapons proliferation.

Because of proliferation concerns, the U.S. decided in the 1970s not to engage in reprocessing. (The policy decision was reversed the following decade, but the U.S. still did not move toward reprocessing.) But the Department of Energy (DOE) has continued to fund research and development (R&D) on nuclear reprocessing technologies, including new technologies that their proponents claim would reduce the risk of proliferation from reprocessing.
  • The report accompanying H.R. 2419, the Energy and Water Development Appropriations Act for Fiscal Year 2006, which the House passed in May, directed DOE to focus research in its Advanced Fuel Cycle Initiative program on improving nuclear reprocessing technologies. The report went on to state: "The Department shall accelerate this research in order to make a specific technology recommendation, not later than the end of fiscal year 2007, to the President and Congress on a particular reprocessing technology that should be implemented in the United States. In addition, the Department shall prepare an integrated spent fuel recycling plan for implementation beginning in fiscal year 2007, including recommendation of an advanced reprocessing technology and a competitive process to select one or more sites to develop integrated spent fuel recycling facilities."
  • During floor debate on H.R. 2419, the House defeated an amendment that would have cut funding for research on reprocessing. In arguing for the amendment, its sponsor, Mr. Markey, explicitly raised the risks of weapons proliferation. Specifically, the amendment would have cut funding for reprocessing activities and interim storage programs by $15.5 million and shifted the funds to energy efficiency activities, effectively repudiating the report language. The amendment was defeated by a vote of 110–312.
  • But nuclear reprocessing remains controversial, even within the scientific community. In May 2005, the American Physical Society (APS) Panel on Public Affairs issued a report, Nuclear Power and Proliferation Resistance: Securing Benefits, Limiting Risk. APS, which is the leading organization of the Nation's physicists, is on record as strongly supporting nuclear power. But the APS report takes the opposite tack from the Appropriations report, stating, ''There is no urgent need for the U.S. to initiate reprocessing or to develop additional national repositories. DOE programs should be aligned accordingly: shift the Advanced Fuel Cycle Initiative R&D away from an objective of laying the basis for a near-term reprocessing decision; increase support for proliferation-resistance R&D and technical support for institutional measures for the entire fuel cycle.'' Technological as well as policy questions remain regarding reprocessing. It is not clear whether the new reprocessing technologies that DOE is funding will be developed sufficiently by 2007 to allow the U.S. to select a technology to pursue. There is also debate about the extent to which new technologies can truly reduce the risks of proliferation.
  •  It is also unclear how selecting a reprocessing technology might relate to other pending technology decisions regarding nuclear energy. For example, the U.S. is in the midst of developing new designs for nuclear reactors under DOE's Generation IV program. Some of the potential new reactors would produce types of nuclear waste that could not be reprocessed using some of the technologies now being developed with DOE funding.
  • 5. Brief Overview of Economics of Reprocessing
  • The economics of reprocessing are hard to predict with any certainty because there are few examples around the world on which economists might base a generalized model.  Some of the major factors influencing the economic competitiveness of reprocessing are: the availability and cost of uranium, costs associated with interim storage and long-term disposal in a geologic repository, reprocessing plant construction and operating costs, and costs associated with transmutation, the process by which certain parts of the spent fuel are actively reduced in toxicity to address long-term waste management.
  • Costs associated with reducing greenhouse gas emissions from fossil fuel-powered plants could help make nuclear power, including reprocessing, economically competitive with other sources of electricity in a free market.
  •  It is not clear who would pay for reprocessing in the U.S.
  • Three recent studies have examined the economics of nuclear power. In a study completed at the Massachusetts Institute of Technology in 2003, The Future of Nuclear Power, an interdisciplinary panel, including Professor Richard Lester, looked at all aspects of nuclear power from waste management to economics to public perception. In a study requested by the Department of Energy and conducted at the University of Chicago in 2004, The Economic Future of Nuclear Power, economist Dr. Donald Jones and his colleague compared costs of future nuclear power to other sources, and briefly looked at the incremental costs of an advanced fuel cycle. In a 2003 study conducted by a panel including Matthew Bunn (a witness at the June 16 hearing) and Professor Steve Fetter, The Economics of Reprocessing vs. Direct Disposal of Spent Nuclear Fuel, the authors took a detailed look at the costs associated with an advanced fuel cycle. All three studies seem more or less to agree on cost estimates: the incremental cost of nuclear electricity to the consumer, with reprocessing, could be modest—on the order of 1–2 mills/kWh (0.1–0.2 cents per kilowatt-hour); on the other hand, this increase represents an approximate doubling (at least) of the costs attributable to spent fuel management, compared to the current fuel cycle (no reprocessing). Where they strongly disagree is on how large an impact this incremental cost will have on the competitiveness of nuclear power. The University of Chicago authors conclude that the cost of reprocessing is negligible in the big picture, where capital costs of new plants dominate all economic analyses. The other two studies take a more skeptical view—because new nuclear power would already be facing tough competition in the current market, any additional cost would further hinder the nuclear power industry, or become an unacceptable and unnecessary financial burden on the government.
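The cost figures the three studies agree on are easy to sanity-check. A minimal sketch of the unit conversion; the 10,000 kWh/year household consumption is an illustrative assumption for scale, not a figure from the studies:

```python
# Convert the studies' reprocessing cost increment from mills/kWh to
# cents/kWh, and scale it to a hypothetical household's annual bill.
# 1 mill = one-thousandth of a dollar = 0.1 cent.

def mills_to_cents_per_kwh(mills: float) -> float:
    """Convert mills per kilowatt-hour to cents per kilowatt-hour."""
    return mills / 10

low_cents = mills_to_cents_per_kwh(1)   # 0.1 cents/kWh
high_cents = mills_to_cents_per_kwh(2)  # 0.2 cents/kWh

# Illustrative household: 10,000 kWh/year (an assumption, not from the studies).
annual_kwh = 10_000
cost_low = annual_kwh * 1 / 1000   # dollars/year at 1 mill/kWh
cost_high = annual_kwh * 2 / 1000  # dollars/year at 2 mills/kWh

print(f"{low_cents}-{high_cents} cents/kWh adds ${cost_low:.0f}-${cost_high:.0f} per year")
# -> 0.1-0.2 cents/kWh adds $10-$20 per year
```

At this scale the per-consumer increment really is modest, which is why the disagreement among the studies centers on industry competitiveness rather than retail prices.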
  • 6. Background
  •  
    Report from the Subcommittee on Energy, Committee on Science, U.S. House of Representatives. Didn't highlight the entire article; see the site for the rest.
D'coda Dcoda

TEPCO Now Says There Was No Hydrogen Explosion at Reactor 2 [01Oct11] - 1 views

  • From Yomiuri Shinbun (3:03AM JST 10/2/2011):
  • Details of an interim report by TEPCO's internal "Fukushima nuclear accident investigation committee" (headed by Vice President Masao Yamazaki) were revealed.
  • The committee reversed the company's position that there had been a hydrogen explosion in Reactor 2, and now concludes there was no such explosion. As to the tsunami that triggered the accident, the committee says "it was beyond expectations"; of the delay in initial response to the accident, the committee concludes "it couldn't be helped". Overall, the report looks full of self-justification. TEPCO plans to run the report by a verification committee made up of outside experts before publishing it.
  • ...3 more annotations...
  • At Fukushima I Nuclear Power Plant, the Reactor 1 reactor building blew up in a hydrogen explosion in the afternoon of March 12, followed by a hydrogen explosion of Reactor 3 in the morning of March 14. Further, in the early morning on March 15, there was an explosive sound, and the damage to the Reactor 4 reactor building was confirmed. Right after the explosive sound the pressure in the Suppression Chamber of Reactor 2 dropped sharply, which led TEPCO to conclude that there were near-simultaneous explosions in Reactors 2 and 4. The Japanese government reported the events as such in the report to IAEA in June. So then what does TEPCO now think happened in Reactor 2 in the early morning on March 15? Yomiuri doesn't say in the article text, but at the bottom of the illustration that accompanies the article it says: "There was no explosion, but a possibility of some kind of damage to the Containment Vessel." So, before TEPCO completely changes its story, here's what they say happened in Reactor 2 on March 15 (from the daily "Status of TEPCO's Facilities - past progress" report, page 6):
  • It says "abnormal sound was confirmed near the suppression chamber" at 6:14AM on March 15. Now, this is what TEPCO says about Reactor 4 on the same day, about the same time, from Page 16:
  • It says "an explosive sound was heard" at 6AM on March 15. The Reactor 4 explosion occurred before the Reactor 2 "explosion" which TEPCO now says never happened. The two sounds are 14 minutes apart, and TEPCO now claims they misheard the second one and there was no explosion in the Suppression Chamber of Reactor 2. (By the way, the fire spotted at 9:38AM on March 15 on Reactor 4 was never reported to the local fire department or the local government, as I reported on March 15.)
D'coda Dcoda

Huhne will use Fukushima report to revive nuclear programme [09Oct11] - 0 views

  • The government is expected this week to try to use a post-Fukushima green light from Britain’s chief nuclear safety inspector to inject momentum into its stuttering nuclear power and anti-climate-change programmes. The move will run into a hail of criticism from environmentalists who believe the latest inquiry into the nuclear industry has been rushed through and fear that ministers are backing off from their commitments to green issues. On Tuesday, Chris Huhne, the energy secretary, is scheduled to release the final report by Mike Weightman, chief inspector for nuclear installations, into what lessons should be learned from the Fukushima reactor disaster in Japan. The report is understood to contain only small amendments to an earlier interim report, which itself made only minor recommendations. End extract http://www.guardian.co.uk/business/2011/oct/09/chris-huhne-fukushima-report-nuclear-programme?newsfeed=true
D'coda Dcoda

BP gets Gulf oil drilling permit amid 28,000 unmonitored abandoned wells [25Oct11] - 0 views

  • Since BP’s catastrophic Macondo Blowout in the Gulf of Mexico last year, the Obama Administration has granted nearly 300 new drilling permits [1] and shirked plans to plug 3,600 of more than 28,000 abandoned wells, which pose significant threats to the severely damaged sea. Among those granted new permits for drilling in the Gulf, on Friday Obama granted BP permission to explore for oil in the Gulf, allowing it to bid on new leases that will be sold at auction in December. Reports Dow Jones: “The upcoming lease sale, scheduled for Dec. 14 in New Orleans, involves leases in the western Gulf of Mexico. The leases cover about 21 million acres, in water depths of up to 11,000 feet. It will be the first lease auction since the Deepwater Horizon spill.” [2]
  • Massachusetts Rep. Ed Markey objected to BP’s participation in the upcoming lease sale, pointing out: “Comprehensive safety legislation hasn’t passed Congress, and BP hasn’t paid the fines they owe for their spill, yet BP is being given back the keys to drill in the Gulf.” Environmental watchdog Oceana added its objection to the new permits, saying that none of the new rules implemented since April 2010 would have prevented the BP disaster. “Our analysis shows that while the new rules may increase safety to some degree, they likely would not have prevented the last major oil spill, and similarly do not adequately protect against future ones.” [3]
  • Detailing the failure of the Dept. of Interior’s safety management systems, Oceana summarizes: Regulation exemptions (“departures”) are often granted, including one that arguably led to the BP blowout; Economic incentives make violating rules lucrative because penalties are ridiculously small; Blowout preventers continue to have critical deficiencies; and Oversight and inspection levels are paltry relative to the scale of drilling operation. Nor have any drilling permits been denied [4] since the BP catastrophe on April 20, 2010, which still spews oil today [5].
  • ...10 more annotations...
  • 28,079 Abandoned Wells in Gulf of Mexico
In an explosive report at Sky Truth, John Amos reveals from government data that “there are currently 24,486 known permanently abandoned wells in the Gulf of Mexico, and 3,593 ‘temporarily’ abandoned wells, as of October 2011.” [6] TA wells are those temporarily sealed so that future drilling can be re-started. Both TA wells and “permanently abandoned” (PA) wells endure no inspections.
  • Not only cement, but seals, valves and gaskets can deteriorate over time. A 2000 report by C-FER Technologies to the Dept. of Interior identified several different points where well leaks can occur, as this image (p. 26) reveals. To date, no regulations prescribe a maximum time wells may remain inactive before being permanently abandoned. [13] “The most common failure mechanisms (corrosion, deterioration, and malfunction) cause mainly small leaks [up to 49 barrels, or 2,058 gallons]. Corrosion is historically known to cause 85% to 90% of small leaks.” Depending on various factors, C-FER concludes that “Shut-In” wells reach an environmental risk threshold in six months, TA wells in about 10-12 years, and PA wells in 25 years. Some of these abandoned wells are 63 years old.
  • Leaking abandoned wells pose a significant environmental and economic threat. A three-month EcoHearth investigation revealed that a minimum of 2.5 million abandoned wells in the US and 20-30 million worldwide receive no follow up inspections to ensure they are not leaking. Worse: “There is no known technology for securely sealing these tens of millions of abandoned wells. Many—likely hundreds of thousands—are already hemorrhaging oil, brine and greenhouse gases into the environment. Habitats are being fundamentally altered. Aquifers are being destroyed. Some of these abandoned wells are explosive, capable of building-leveling, toxin-spreading detonations. And thanks to primitive capping technologies, virtually all are leaking now—or will be.” [11] Sealed with cement, adds EcoHearth, “Each abandoned well is an environmental disaster waiting to happen. The triggers include accidents, earthquakes, natural erosion, re-pressurization (either spontaneous or precipitated by fracking) and, simply, time.”
  • Over a year ago, the Dept. of Interior promised to plug the “temporarily abandoned” (TA) wells, and dismantle another 650 production platforms no longer in use. [7] At an estimated decommissioning cost of $1-3 billion [8], none of this work has been started, though Feds have approved 912 permanent abandonment plans and 214 temporary abandonment plans submitted since its September 2010 rule. [9] Over 600 of those abandoned wells belong to BP, reported the Associated Press last year, adding that some of the permanently abandoned wells date back to the 1940s [10].  Amos advises that some of the “temporarily abandoned” wells date back to the 1950s. “Experts say abandoned wells can repressurize, much like a dormant volcano can awaken. And years of exposure to sea water and underground pressure can cause cementing and piping to corrode and weaken,” reports AP.
  • As far back as 1994, the Government Accountability Office warned that there was no effective strategy in place to inspect abandoned wells, nor were bonds sufficient to cover the cost of abandonment. Lease abandonment costs estimated at “$4.4 billion in current dollars … were covered by only $68 million in bonds.” [12] The GAO concluded that “leaks can occur… causing serious damage to the environment and marine life,” adding that “MMS has not encouraged the development of nonexplosive structure removal technologies that would eliminate or minimize environmental damage.”
  • The AP noted that none of the 1994 GAO recommendations have been implemented. Abandoned wells remain uninspected and pose a threat which the government continues to ignore.
Agency Reorganization
The Minerals Management Service (MMS) was renamed the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) last May after MMS drew heavy fire for malfeasance, including allowing exemptions to safety rules it granted to BP. An Office of Inspector General investigation revealed that MMS employees accepted gifts from the oil and gas industry, including sex, drugs and trips, and falsified inspection reports. [14] Not only was nothing done with the 1994 GAO recommendations to protect the environment from abandoned wells, but its 2003 reorganization recommendations [15] were likewise ignored. In a June 2011 report on agency reorganization in the aftermath of the Gulf oil spill, the GAO reports that “as of December 2010,” the DOI “had not implemented many recommendations we made to address numerous weaknesses and challenges.” [16] Reorganization proceeded. Effective October 1, 2011, the Dept. of the Interior split BOEMRE into three new federal agencies: the Office of Natural Resources Revenue to collect mineral leasing fees, the Bureau of Safety and Environmental Enforcement (BSEE) and the Bureau of Ocean Energy Management (BOEM) “to carry out the offshore energy management and safety and environmental oversight missions.” The DOI admits:
  • “The Deepwater Horizon blowout and resulting oil spill shed light on weaknesses in the federal offshore energy regulatory system, including the overly broad mandate and inherently conflicted missions of MMS which was charged with resource management, safety and environmental protection, and revenue collection.” [17] BOEM essentially manages the development of offshore drilling, while BSEE oversees environmental protection, with some eco-protection overlap between the two agencies. [18] Early this month, BSEE Director Michael R. Bromwich spoke at the Global Offshore Safety Summit Conference in Stavanger, Norway, sponsored by the International Regulators Forum. He announced a new position, Chief Environmental Officer of the BOEM:
  • “This person will be empowered, at the national level, to make decisions and final recommendations when leasing and environmental program heads cannot reach agreement. This individual will also be a major participant in setting the scientific agenda for the United States’ oceans.” [19] Bromwich failed to mention anything about the abandoned wells under his purview. Out of sight, out of mind.
Cost of the Macondo Blowout
  • On Monday, the GAO published its final report of a three-part series on the Gulf oil disaster. [20]  Focused on federal financial exposure to oil spill claims, the accountants nevertheless point out that, as of May 2011, BP paid $700 million toward those spill claims out of its $20 billion Trust established to cover that deadly accident. BP and Oxford Economics estimate the total cost for eco-cleanup and compensatory economic damages will run to the “tens of billions of dollars.” [21] On the taxpayer side, the GAO estimates the federal government’s costs will exceed the billion dollar incident cap set by the Oil Pollution Act of 1990 (as amended). As of May 2011, agency costs reached past $626 million. The Oil Spill Liability Trust Fund’s income is generated from an oil barrel tax that is set to expire in 2017, notes GAO.
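For a sense of scale, the dollar amounts quoted from the GAO report can be put side by side. A quick back-of-the-envelope sketch using only the figures cited above; no other data is assumed:

```python
# Put the GAO's quoted dollar figures side by side (all as of May 2011):
# BP's claim payouts against its $20 billion Trust, and federal agency
# costs against the Oil Pollution Act's $1 billion per-incident cap.

trust_total = 20_000_000_000   # BP Trust for spill claims, dollars
paid_out = 700_000_000         # claims paid from the Trust, May 2011

incident_cap = 1_000_000_000   # per-incident cap under the Oil Pollution Act
agency_costs = 626_000_000     # federal agency costs, May 2011

print(f"Trust drawn down so far: {paid_out / trust_total:.1%}")
print(f"Incident cap consumed:   {agency_costs / incident_cap:.0%}")
# -> Trust drawn down so far: 3.5%
# -> Incident cap consumed:   63%
```

The contrast is the point of the GAO series: BP had tapped only a few percent of its trust, while federal costs were already approaching the statutory cap.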
  • With Monday’s District Court decision in Louisiana, BP also faces punitive damages on “thousands of thousands of thousands of claims.” U.S. District Judge Carl Barbier denied BP’s appeal that might have killed several hundred thousand claims, among them claims that cleanup workers have still not been fully paid by BP. [22] Meanwhile, destroying the planet for profit continues unabated. It’s time to Occupy the Gulf of Mexico: No more oil drilling in our food source.
D'coda Dcoda

U.S. nuke regulators weaken safety rules [20Jun11] - 0 views

  • Federal regulators have been working closely with the nuclear power industry to keep the nation's aging reactors operating within safety standards by repeatedly weakening standards or simply failing to enforce them, an investigation by The Associated Press has found.Officials at the U.S. Nuclear Regulatory Commission regularly have decided original regulations were too strict, arguing that safety margins could be eased without peril, according to records and interviews.The result? Rising fears that these accommodations are undermining safety -- and inching the reactors closer to an accident that could harm the public and jeopardize nuclear power's future.
  • Examples abound. When valves leaked, more leakage was allowed -- up to 20 times the original limit. When cracking caused radioactive leaks in steam generator tubing, an easier test was devised so plants could meet standards.Failed cables. Busted seals. Broken nozzles, clogged screens, cracked concrete, dented containers, corroded metals and rusty underground pipes and thousands of other problems linked to aging were uncovered in AP's yearlong investigation. And many of them could escalate dangers during an accident.
  • Despite the problems, not a single official body in government or industry has studied the overall frequency and potential impact on safety of such breakdowns in recent years, even as the NRC has extended dozens of reactor licenses.Industry and government officials defend their actions and insist no chances are being taken. But the AP investigation found that with billions of dollars and 19 percent of America's electricity supply at stake, a cozy relationship prevails between industry and the NRC.Records show a recurring pattern: Reactor parts or systems fall out of compliance. Studies are conducted by industry and government, and all agree existing standards are "unnecessarily conservative."
  • ...14 more annotations...
  • Regulations are loosened, and reactors are back in compliance. "That's what they say for everything ...," said Demetrios Basdekas, a retired NRC engineer. "Every time you turn around, they say, 'We have all this built-in conservatism.' " The crisis at the decades-old Fukushima Dai-ichi nuclear facility in Japan has focused attention on nuclear safety and prompted the NRC to look at U.S. reactors. A report is due in July. But the factor of aging goes far beyond issues posed by Fukushima.
  • Commercial nuclear reactors in the United States were designed and licensed for 40 years. When the first were built in the 1960s and 1970s, it was expected that they would be replaced with improved models long before their licenses expired. That never happened. The 1979 accident at Three Mile Island, massive cost overruns, crushing debt and high interest rates halted new construction in the 1980s. Instead, 66 of the 104 operating units have been relicensed for 20 more years. Renewal applications are under review for 16 other reactors. As of today, 82 reactors are more than 25 years old. The AP found proof that aging reactors have been allowed to run less safely to prolong operations.
  • Last year, the NRC weakened the safety margin for acceptable radiation damage to reactor vessels -- for a second time. The standard is based on a reactor vessel's "reference temperature," which predicts when it will become dangerously brittle and vulnerable to failure. Through the years, many plants have violated or come close to violating the standard. As a result, the minimum standard was relaxed first by raising the reference temperature 50 percent, and then 78 percent above the original -- even though a broken vessel could spill radioactive contents. "We've seen the pattern," said nuclear safety scientist Dana Powers, who works for Sandia National Laboratories and also sits on an NRC advisory committee. "They're ... trying to get more and more out of these plants."
  • Sharpening the pencil
The AP study collected and analyzed government and industry documents -- some never before released -- of both reactor types: pressurized water units that keep radioactivity confined to the reactor building and the less common boiling water types like those at Fukushima, which send radioactive water away from the reactor to drive electricity-generating turbines. The Energy Northwest Columbia Generating Station north of Richland is a boiling water design that's a newer generation than the Fukushima plants. Tens of thousands of pages of studies, test results, inspection reports and policy statements filed during four decades were reviewed. Interviews were conducted with scores of managers, regulators, engineers, scientists, whistleblowers, activists and residents living near the reactors at 65 sites, mostly in the East and Midwest.
  • AP reporters toured some of the oldest reactors -- Oyster Creek, N.J., near the Atlantic coast 50 miles east of Philadelphia and two at Indian Point, 25 miles north of New York City on the Hudson River. Called "Oyster Creak" by some critics, this boiling water reactor began running in 1969 and is the country's oldest operating commercial nuclear power plant. Its license was extended in 2009 until 2029, though utility officials announced in December they will shut the reactor 10 years earlier rather than build state-ordered cooling towers. Applications to extend the lives of pressurized water units 2 and 3 at Indian Point, each more than 36 years old, are under NRC review. Unprompted, several nuclear engineers and former regulators used nearly identical terminology to describe how industry and government research has frequently justified loosening safety standards. They call it "sharpening the pencil" or "pencil engineering" -- fudging calculations and assumptions to keep aging plants in compliance.
  • Cracked tubing: The industry has long known of cracking in steel alloy tubing used in the steam generators of pressurized water reactors. Ruptures have been common in these tubes containing radioactive coolant; in 1993 alone, there were seven. As many as 18 reactors still run on old generators.Problems can arise even in a newer metal alloy, according to a report of a 2008 industry-government workshop.
  • Neil Wilmshurst, director of plant technology for the industry's Electric Power Research Institute, acknowledged the industry and NRC often collaborate on research that supports rule changes. But he maintained there's "no kind of misplaced alliance ... to get the right answer." Yet agency staff, plant operators and consultants paint a different picture:
* The AP reviewed 226 preliminary notifications -- alerts on emerging safety problems -- NRC has issued since 2005. Wear and tear in the form of clogged lines, cracked parts, leaky seals, rust and other deterioration contributed to at least 26 of the alerts. Other notifications lack detail, but aging was a probable factor in 113 more, or 62 percent in all. For example, the 39-year-old Palisades reactor in Michigan shut Jan. 22 when an electrical cable failed, a fuse blew and a valve stuck shut, expelling steam with low levels of radioactive tritium into the outside air. And a 1-inch crack in a valve weld aborted a restart in February at the LaSalle site west of Chicago.
  • * A 2008 NRC report blamed 70 percent of potentially serious safety problems on "degraded conditions" such as cracked nozzles, loose paint, electrical problems or offline cooling components.
* Confronted with worn parts, the industry has repeatedly requested -- and regulators often have allowed -- inspections and repairs to be delayed for months until scheduled refueling outages. Again and again, problems worsened before being fixed. Postponed inspections inside a steam generator at Indian Point allowed tubing to burst, leading to a radioactive release in 2000. Two years later, cracking grew so bad in nozzles on the reactor vessel at the Davis-Besse plant near Toledo, Ohio, that it came within two months of a possible breach, an NRC report said, which could release radiation. Yet inspections failed to catch the same problem on the replacement vessel head until more nozzles were found to be cracked last year.
  • Time crumbles things
Nuclear plants are fundamentally no more immune to aging than our cars or homes: Metals grow weak and rusty, concrete crumbles, paint peels, crud accumulates. Big components like 17-story-tall concrete containment buildings or 800-ton reactor vessels are all but impossible to replace. Smaller parts and systems can be swapped but still pose risks as a result of weak maintenance and lax regulation or hard-to-predict failures. Even mundane deterioration can carry harsh consequences. For example, peeling paint and debris can be swept toward pumps that circulate cooling water in a reactor accident. A properly functioning containment building is needed to create air pressure that helps clear those pumps. But a containment building could fail in a severe accident. Yet the NRC has allowed safety calculations that assume the buildings will hold.
  • In a 2009 letter, Mario V. Bonaca, then-chairman of the NRC's Advisory Committee on Reactor Safeguards, warned that this approach represents "a decrease in the safety margin" and makes a fuel-melting accident more likely. Many photos in NRC archives -- some released in response to AP requests under the federal Freedom of Information Act -- show rust accumulated in a thick crust or paint peeling in long sheets on untended equipment. Four areas stand out:
  • Brittle vessels: For years, operators have rearranged fuel rods to limit gradual radiation damage to the steel vessels protecting the core and keep them strong enough to meet safety standards. But even with last year's weakening of the safety margins, engineers and metal scientists say some plants may be forced to close over these concerns before their licenses run out -- unless, of course, new regulatory compromises are made.
  • Leaky valves: Operators have repeatedly violated leakage standards for valves designed to bottle up radioactive steam in an earthquake or other accident at boiling water reactors. Many plants have found they could not adhere to the general standard allowing main steam isolation valves to leak at a rate of no more than 11.5 cubic feet per hour. In 1999, the NRC decided to allow individual plants to seek amendments of up to 200 cubic feet per hour for all four steam valves combined. But plants have violated even those higher limits. For example, in 2007, Hatch Unit 2, in Baxley, Ga., reported combined leakage of 574 cubic feet per hour.
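To see how far that reported figure sits above both limits, a small sketch using only the numbers from the AP account. It reads the 11.5 cu ft/hr standard as applying to combined leakage, which matches the "up to 20 times the original limit" figure earlier in the piece; the helper function is illustrative:

```python
# Compare Hatch Unit 2's reported main-steam-isolation-valve leakage (2007)
# against the original general standard and the amended per-plant limit,
# using the AP investigation's figures (cubic feet per hour).

ORIGINAL_STANDARD = 11.5   # cu ft/hr, original general standard
AMENDED_LIMIT = 200.0      # cu ft/hr, all four valves combined, post-1999
HATCH_2_REPORTED = 574.0   # cu ft/hr, combined leakage reported in 2007

def times_over(reported: float, limit: float) -> float:
    """Ratio of a reported leak rate to a limit."""
    return reported / limit

print(f"vs amended limit:     {times_over(HATCH_2_REPORTED, AMENDED_LIMIT):.1f}x")
print(f"vs original standard: {times_over(HATCH_2_REPORTED, ORIGINAL_STANDARD):.0f}x")
# -> vs amended limit:     2.9x
# -> vs original standard: 50x
```

So the Hatch figure exceeded not just the original standard but the already-relaxed amended limit by nearly threefold.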
  • "Many utilities are doing that sort of thing," said engineer Richard T. Lahey Jr., who used to design nuclear safety systems for General Electric Co., which makes boiling water reactors. "I think we need nuclear power, but we can't compromise on safety. I think the vulnerability is on these older plants." Added Paul Blanch, an engineer who left the industry over safety issues, but later returned to work on solving them: "It's a philosophical position that (federal regulators) take that's driven by the industry and by the economics: What do we need to do to let those plants continue to operate?" Publicly, industry and government say that aging is well under control. "I see an effort on the part of this agency to always make sure that we're doing the right things for safety. I'm not sure that I see a pattern of staff simply doing things because there's an interest to reduce requirements -- that's certainly not the case," NRC chairman Gregory Jaczko said in an interview.
  • Corroded piping: Nuclear operators have failed to stop an epidemic of leaks in pipes and other underground equipment in damp settings. Nuclear sites have suffered more than 400 accidental radioactive leaks, the activist Union of Concerned Scientists reported in September. Plant operators have been drilling monitoring wells and patching buried piping and other equipment for several years to control an escalating outbreak. But there have been failures. Between 2000 and 2009, the annual number of leaks from underground piping shot up fivefold, according to an internal industry document.
D'coda Dcoda

Nuclear Energy Quarterly Deals Analysis - M&A and Investment Trends, Q2 2011 [25Aug11] - 0 views

  • A new market research report is available in the Reportlinker catalogue: Nuclear Energy Quarterly Deals Analysis - M&A and Investment Trends, Q2 2011 http://www.reportlinker.com/p0285100/Nuclear-Energy-Quarterly-Deals-Analysis---MA-and-Investment-Trends-Q2-2011.html#utm_source=prnewswire&utm_medium=pr&utm_campaign=Nuclear_energy
  • Summary
GlobalData's "Nuclear Energy Quarterly Deals Analysis - M&A and Investment Trends, Q2 2011" report is an essential source of data and trend analysis on Mergers and Acquisitions (M&As) and financings in the nuclear energy market. The report provides detailed information on M&As, equity and debt offerings, private equity and venture capital (PE/VC) and partnership transactions recorded in the nuclear energy industry in Q2 2011. The report provides detailed comparative data on the number of deals and their value in the last five quarters, categorized by deal types, segments and geographies. The report also provides information on the top advisory firms in the nuclear energy industry. Data presented in this report is derived from GlobalData's proprietary in-house Nuclear Energy eTrack deals database and primary and secondary research.
  • Scope:
    - Analyze market trends for the nuclear energy market in the global arena
    - Review of deal trends in uranium mining & processing, equipment and services, and power generation markets
    - Analysis of M&A, equity/debt offerings, private equity, venture financing and partnerships in the nuclear energy industry
    - Summary of nuclear energy deals globally in the last five quarters
    - Information on the top deals in the nuclear energy industry
    - Geographies covered include North America, Europe, Asia Pacific, South & Central America, and Middle East & Africa
    - League tables of financial advisors in M&A and equity/debt offerings, including key advisors such as Morgan Stanley, Credit Suisse, and Goldman Sachs
  • ...1 more annotation...
  • Reasons to buy:
    - Enhance your decision-making capability in a more rapid and time-sensitive manner
    - Find out the major deal-performing segments for investments in your industry
    - Evaluate the types of companies divesting/acquiring and ways to raise capital in the market
    - Do deals with an understanding of how competitors are financed, and the mergers and partnerships that have shaped the nuclear energy market
    - Identify major private equity/venture capital firms that are providing finance in the nuclear energy market
    - Identify growth segments and opportunities in each region within the industry
    - Look for key financial advisors where you are planning to raise capital from the market or for acquisitions within the industry
    - Identify the top deal makers in the nuclear energy market
D'coda Dcoda

Fukushima radiation alarms doctors [18Aug11] - 0 views

  • Scientists and doctors are calling for a new national policy in Japan that mandates the testing of food, soil, water, and the air for radioactivity still being emitted from Fukushima's heavily damaged Daiichi nuclear power plant. "How much radioactive materials have been released from the plant?" asked Dr Tatsuhiko Kodama, a professor at the Research Centre for Advanced Science and Technology and Director of the University of Tokyo's Radioisotope Centre, in a July 27 speech to the Committee of Health, Labour and Welfare at Japan's House of Representatives. "The government and TEPCO have not reported the total amount of the released radioactivity yet," said Kodama, who believes things are far worse than even the recent detection of extremely high radiation levels at the plant. There is widespread concern in Japan about a general lack of government monitoring for radiation, which has prompted people to begin their own independent monitoring efforts that are also finding disturbingly high levels of radiation. Kodama's centre, using 27 facilities to measure radiation across the country, has been closely monitoring the situation at Fukushima - and their findings are alarming. According to Dr Kodama, the total amount of radiation released over a period of more than five months from the ongoing Fukushima nuclear disaster is the equivalent of more than 29 "Hiroshima-type atomic bombs", and the amount of uranium released "is equivalent to 20" Hiroshima bombs.
  • Kodama, along with other scientists, is concerned about the ongoing crisis resulting from the Fukushima situation, as well as what he believes to be an inadequate government reaction, and believes the government needs to begin a large-scale response in order to begin decontaminating affected areas. Distrust of the Japanese government's response to the nuclear disaster is now common among people living in the affected prefectures, and people are concerned about their health. Recent readings taken at the plant are alarming. When readings of 10,000 millisieverts (10 sieverts) of radioactivity per hour were detected at the plant on August 2, Japan's science ministry said that level of dose is fatal to humans, and is enough radiation to kill a person within one to two weeks of exposure. 10,000 millisieverts (mSv) is the equivalent of approximately 100,000 chest x-rays.
  • It is an amount 250 per cent higher than levels recorded at the plant in March, after it was heavily damaged by the earthquake and ensuing tsunami. The operator of Japan's crippled Fukushima Daiichi nuclear power plant, Tokyo Electric Power Company (TEPCO), which took the reading, used equipment to measure radiation from a distance, and was unable to ascertain the exact level because the device's maximum reading is only 10,000 mSv. TEPCO also detected 1,000 millisieverts (mSv) per hour in debris outside the plant, as well as 4,000 mSv per hour inside one of the reactor buildings.
  • ...35 more annotations...
  • The Fukushima disaster has been rated a "level seven" on the International Nuclear and Radiological Event Scale (INES). This level, the highest, is the same as the Chernobyl nuclear disaster in 1986, and is defined by the scale as: "[A] major release of radioactive material with widespread health and environmental effects requiring implementation of planned and extended countermeasures." The Fukushima and Chernobyl disasters are the only nuclear accidents to have been rated level seven on the scale, which is intended to be logarithmic, similar to the scale used to describe the comparative magnitude of earthquakes. Each increasing level represents an accident approximately ten times more severe than the previous level.
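The logarithmic intent of the INES scale described above can be sketched with a rough back-of-the-envelope calculation. This is an illustration only: the factor-of-ten step is the approximation the annotation itself states, not a formal INES formula.

```python
def ines_severity_ratio(level_a: int, level_b: int, step: float = 10.0) -> float:
    """Approximate relative severity of two INES levels, assuming each
    level represents roughly a ten-fold increase in severity."""
    return step ** (level_a - level_b)

# A level-7 event (Fukushima, Chernobyl) versus a level-5 event is
# roughly two orders of magnitude more severe under this approximation.
print(ines_severity_ratio(7, 5))  # -> 100.0
```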
  • Doctors in Japan are already treating patients suffering health effects they attribute to radiation from the ongoing nuclear disaster."We have begun to see increased nosebleeds, stubborn cases of diarrhoea, and flu-like symptoms in children," Dr Yuko Yanagisawa, a physician at Funabashi Futawa Hospital in Chiba Prefecture, told Al Jazeera.
  • Dr Helen Caldicott, the founding president of Physicians for Social Responsibility, a group that was awarded the Nobel Peace Prize in 1985, is equally concerned about the health effects of Japan's nuclear disaster. "Radioactive elements get into the testicles and ovaries, and these cause genetic disease like diabetes, cystic fibrosis, and mental retardation," she told Al Jazeera. "There are 2,600 of these diseases that get into our genes and are passed from generation to generation, forever."
  • Al Jazeera's Aela Callan, reporting from Japan's Ibaraki prefecture, said of the recently detected high radiation readings: "It is now looking more likely that this area has been this radioactive since the earthquake and tsunami, but no one realised until now." Workers at Fukushima are only allowed to be exposed to 250 mSv of ionising radiation per year.
  • Radioactive cesium exceeding the government limit was detected in processed tea made in Tochigi City, about 160km from the troubled Fukushima Daiichi nuclear plant, according to the Tochigi Prefectural Government, which said the cesium was detected in tea processed from leaves harvested in the city in early July. The level is more than three times the provisional government limit.
  • Yanagisawa's hospital is located approximately 200km from Fukushima, so the health problems she is seeing, which she attributes to radiation exposure, leave her concerned by what she believes to be a grossly inadequate response from the government. From her perspective, the only thing the government has done is, on April 25, to raise the acceptable radiation exposure limit for children from 1 mSv/year to 20 mSv/year.
  • "This has caused controversy, from the medical point of view," Yanagisawa told Al Jazeera. "This is certainly an issue that involves both personal internal exposures as well as low-dose exposures." Junichi Sato, Greenpeace Japan Executive Director, said: "It is utterly outrageous to raise the exposure levels for children to twenty times the maximum limit for adults."
  • "The Japanese government cannot simply increase safety limits for the sake of political convenience or to give the impression of normality." Authoritative current estimates of the health effects of low-dose ionizing radiation are published in the Biological Effects of Ionizing Radiation VII (BEIR VII) report from the US National Academy of Sciences.
  • The report reflects the substantial weight of scientific evidence proving there is no exposure to ionizing radiation that is risk-free. BEIR VII estimates that each 1 mSv of radiation is associated with an increased risk of all forms of cancer other than leukemia of about 1-in-10,000; an increased risk of leukemia of about 1-in-100,000; and a 1-in-17,500 increased risk of cancer death.
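The BEIR VII figures quoted above rest on a linear, no-threshold dose-risk model: risk scales directly with dose. A minimal sketch of that arithmetic, using only the per-mSv risks stated in the annotation (the endpoint names are labels chosen here for illustration, not BEIR VII terminology):

```python
# Per-mSv excess risks as quoted from BEIR VII in the annotation above
RISK_PER_MSV = {
    "cancer_non_leukemia": 1 / 10_000,
    "leukemia": 1 / 100_000,
    "cancer_death": 1 / 17_500,
}

def excess_risk(dose_msv: float, endpoint: str) -> float:
    """Linear no-threshold extrapolation: risk scales directly with dose."""
    return dose_msv * RISK_PER_MSV[endpoint]

# At the raised 20 mSv/year limit for children discussed above, the
# implied excess non-leukemia cancer risk is about 0.002, i.e. 1 in 500.
print(excess_risk(20, "cancer_non_leukemia"))
```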
  • She attributes the symptoms to radiation exposure, and added: "We are encountering new situations we cannot explain with the body of knowledge we have relied upon up until now.""The situation at the Daiichi Nuclear facility in Fukushima has not yet been fully stabilised, and we can't yet see an end in sight," Yanagisawa said. "Because the nuclear material has not yet been encapsulated, radiation continues to stream into the environment."
  • So far, the only cases of acute radiation exposure have involved TEPCO workers at the stricken plant. Lower doses of radiation, particularly for children, are what many in the medical community are most concerned about, according to Dr Yanagisawa.
  • "Humans are not yet capable of accurately measuring the low dose exposure or internal exposure," she explained. "Arguing 'it is safe because it is not yet scientifically proven [to be unsafe]' would be wrong. The fact is that we are not yet collecting enough information to prove the situations scientifically. If that is the case, we can never say it is safe just by increasing the annual 1 mSv level twenty fold."
  • Her concern is that the new exposure standards by the Japanese government do not take into account differences between adults and children, since children's sensitivity to radiation exposure is several times higher than that of adults.
  • Al Jazeera contacted Prime Minister Naoto Kan's office for comment on the situation. Speaking on behalf of the Deputy Cabinet Secretary for Public Relations for the Prime Minister's office, Noriyuki Shikata said that the Japanese government "refers to the ICRP [International Commission on Radiological Protection] recommendation in 2007, which says the reference levels of radiological protection in emergency exposure situations is 20-100 mSv per year. The Government of Japan has set planned evacuation zones and specific spots recommended for evacuation where the radiation levels reach 20 mSv/year, in order to avoid excessive radiation exposure."
  • The prime minister's office explained that approximately 23bn yen ($300mn) is planned for decontamination efforts, and the government plans to have a decontamination policy "by around the end of August", with a secondary budget of about 97bn yen ($1.26bn) for health management and monitoring operations in the affected areas. When questioned about the issue of "acute radiation exposure", Shikata pointed to the Japanese government having received a report from TEPCO about six of their workers having been exposed to more than 250 mSv, but did not mention any reports of civilian exposures.
  • Prime Minister Kan's office told Al Jazeera that, for their ongoing response to the Fukushima crisis, "the government of Japan has conducted all the possible countermeasures such as introduction of automatic dose management by ID codes for all workers and 24 hour allocation of doctors. The government of Japan will continue to tackle the issue of further improving the health management including medium and long term measures". Shikata did not comment about Kodama's findings.
  • Nishio Masamichi, director of Japan's Hokkaido Cancer Centre and a radiation treatment specialist, published an article on July 27 titled: "The Problem of Radiation Exposure Countermeasures for the Fukushima Nuclear Accident: Concerns for the Present Situation". In the report, Masamichi said that such a dramatic increase in permitted radiation exposure was akin to "taking the lives of the people lightly". He believes that 20 mSv is too high, especially for children, who are far more susceptible to radiation.
  • Kodama is an expert in internal exposure to radiation, and is concerned that the government has not implemented a strong response geared towards measuring radioactivity in food. "Although three months have passed since the accident already, why have even such simple things not been done yet?" he said. "I get very angry and fly into a rage."
  • "Radiation has a high risk to embryos in pregnant women, juveniles, and highly proliferative cells of people of growing ages. Even for adults, highly proliferative cells, such as hairs, blood, and intestinal epithelium cells, are sensitive to radiation."
  • Early on in the disaster, Dr Makoto Kondo of the department of radiology of Keio University's School of Medicine warned of "a large difference in radiation effects on adults compared to children". Kondo explained that the chances of children developing cancer from radiation exposure were many times higher than for adults.
  • "Children's bodies are underdeveloped and easily affected by radiation, which could cause cancer or slow body development. It can also affect their brain development," he said. Yanagisawa believes that the Japanese government's evacuation standards, as well as its raising of the permissible exposure limit to 20 mSv, "can cause hazards to children's health," and that therefore "children are at a greater risk".
  • Kodama, who is also a doctor of internal medicine, has been working on decontamination of radioactive materials at radiation facilities in hospitals of the University of Tokyo for the past several decades. "We had rain in Tokyo on March 21 and radiation increased to 0.2 microsieverts/hour and, since then, the level has been continuously high," said Kodama, who added that his reporting of radiation findings to the government has not been met with an adequate reaction. "At that time, the chief cabinet secretary, Mr Edano, told the Japanese people that there would be no immediate harm to their health."
  • In early July, officials with the Japanese Nuclear Safety Commission announced that approximately 45 per cent of children in the Fukushima region had experienced thyroid exposure to radiation, according to a survey carried out in late March. The commission has not carried out any surveys since then.
  • "Now the Japanese government is underestimating the effects of low dosage and/or internal exposures and not raising the evacuation level even to the same level adopted in Chernobyl," Yanagisawa said. "People's lives are at stake, especially the lives of children, and it is obvious that the government is not placing top priority on the people's lives in its measures." Caldicott feels the lack of a stronger response to safeguard the health of people in areas where radiation is found is "reprehensible".
  • "Millions of people need to be evacuated from those high radiation zones, especially the children."
  • Dr Yanagisawa is concerned about what she calls "late onset disorders" from radiation exposure resulting from the Fukushima disaster, as well as increasing cases of infertility and miscarriages. "Incidence of cancer will undoubtedly increase," she said. "In the case of children, thyroid cancer and leukemia can start to appear after several years. In the case of adults, the incidence of various types of cancer will increase over the course of several decades." Yanagisawa said it is "without doubt" that cancer rates among the Fukushima nuclear workers will increase, as will cases of lethargy, atherosclerosis, and other chronic diseases among the general population in the affected areas.
  • Radioactive food and water
  • An August 1 press release from Japan's MHLW said no radioactive materials have been detected in the tap water of Fukushima prefecture, according to a survey conducted by the Japanese government's Nuclear Emergency Response Headquarters. The government defines no detection as "no results exceeding the 'Index values for infants (radioactive iodine)'," and says "in case the level of radioactive iodine in tap water exceeds 100 Bq/kg, to refrain from giving infants formula milk dissolved by tap water, having them intake tap water … "
  • Yet, on June 27, results were published from a study that found 15 residents of Fukushima prefecture had tested positive for radiation in their urine. Dr Nanao Kamada, professor emeritus of radiation biology at Hiroshima University, has been to Fukushima prefecture twice in order to take internal radiation exposure readings and facilitated the study.
  • "The risk of internal radiation is more dangerous than external radiation," Dr Kamada told Al Jazeera. "And internal radiation exposure does exist for Fukushima residents." According to the MHLW, distribution of several food products in Fukushima Prefecture remains restricted. This includes raw milk, vegetables including spinach, kakina, and all other leafy vegetables (including cabbage), shiitake mushrooms, bamboo shoots, and beef.
  • The distribution of tea leaves remains restricted in several prefectures, including all of Ibaraki and parts of Tochigi, Gunma, Chiba, and Kanagawa Prefectures. Iwate Prefecture suspended all beef exports because of caesium contamination on August 1, making it the fourth prefecture to do so.
  • Jyunichi Tokuyama, an expert with the Iwate Prefecture Agricultural and Fisheries Department, told Al Jazeera he did not know how to deal with the crisis. He was surprised because he did not expect radioactive hot spots in his prefecture, 300km from the Fukushima nuclear plant. "The biggest cause of this contamination is the rice straw being fed to the cows, which was highly radioactive," Tokuyama told Al Jazeera.
  • Kamada feels the Japanese government is acting too slowly in response to the Fukushima disaster, and that the government needs to check radiation exposure levels "in each town and village" in Fukushima prefecture. "They have to make a general map of radiation doses," he said. "Then they have to be concerned about human health levels, and radiation exposures to humans. They have to make the exposure dose map of Fukushima prefecture. Fukushima is not enough. Probably there are hot spots outside of Fukushima. So they also need to check ground exposure levels."
  • Radiation that continues to be released has global consequences. More than 11,000 tonnes of radioactive water have been released into the ocean from the stricken plant.
  • "Those radioactive elements bio-concentrate in the algae, then the crustaceans eat that, which are eaten by small then big fish," Caldicott said. "That's why big fish have high concentrations of radioactivity and humans are at the top of the food chain, so we get the most radiation, ultimately."
D'coda Dcoda

Tepco cost cut goal said well short of target [03Oct11] - 0 views

  • Tokyo Electric Power Co. should cut costs by around twice as much as it is aiming for over the next 10 years if it expects to compensate victims of the nuclear crisis at its Fukushima No. 1 nuclear plant, a government panel said Monday.
  • In its report, the third-party panel also urged the utility to review its price-setting regime because its findings suggest that household power bills may be unnecessarily high due to cost overestimates on Tepco's side. One estimate in the report states that compensation payments could reach around ¥4.54 trillion by March 2013, including about ¥3.64 trillion for around a year starting from March 11, the day when the megaquake and tsunami crippled the plant.
  • The panel, tasked with scrutinizing Tepco's financial standing, submitted the report to Prime Minister Yoshihiko Noda. It is part of the process the utility must complete to get state aid for the compensation payments. Calling the report a "starting point," Noda said the government would "strictly" check Tepco's rationalization efforts and look into the country's electricity pricing system. The outcome of the study, which started in June, showed that Tepco could cut ¥2.55 trillion in costs by fiscal 2020 by reducing personnel and other expenses, but Tepco's own plan shows costs would be cut by only ¥1.19 trillion. The panel, meanwhile, urged Tepco's managers to take responsibility, by dismissing executives or other means, if the utility intends to win financial aid from the state-backed Nuclear Damage Compensation Facilitation Corp.
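The panel's figures bear out the "around twice as much" characterization in the article's lead. A trivial check, with both figures in trillions of yen as quoted from the report:

```python
panel_target = 2.55  # ¥ trillion the panel says Tepco could cut by FY2020
tepco_plan = 1.19    # ¥ trillion Tepco's own plan would cut

# The panel's target is roughly twice Tepco's own goal.
ratio = panel_target / tepco_plan
print(round(ratio, 2))  # -> 2.14
```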
  • ...2 more annotations...
  • This entity is to collect funds by issuing special government bonds and collecting contributions from other utilities that run nuclear power plants in Japan. The content of the panel's report will be reflected in Tepco's special operating plan, which is to be compiled with the entity this month. "Including the streamlining suggestion from the report in the special operating plan is like a minimum requirement for Tepco," said Kazuhiko Shimokobe, the lawyer who headed the panel. The report also points out that restarting the reactors at the sprawling Kashiwazaki-Kariwa power plant in Niigata Prefecture, the world's largest nuclear power plant, will be crucial to Tepco's medium term plan. Simulations in the report link the timing of the restarts and rate hike to Tepco's cash flow.
  • According to one simulation, if Tepco doesn't raise prices or restart the reactors in the next 10 years, it will face a cash shortage of about ¥8.6 trillion in fiscal 2020. Another says that if Tepco restarts the reactors by the end of fiscal 2014 and raises prices 10 percent, it will cut the shortage to ¥790 billion in fiscal 2020.
  • First policy review: The government was convening the first meeting of a panel of experts Monday to review the nation's energy policy in light of the Fukushima nuclear crisis. The panel, set up under the trade ministry's energy advisory panel, is tasked with reviewing Japan's basic energy plan, revised last year, which calls for building more than 14 new reactors to boost national reliance on nuclear energy to 53 percent by 2030 from about 30 percent. Its talks will likely differ from past ones, as around 10 of its 25 members oppose nuclear power; a previous panel had few opponents because the ministry was promoting atomic power. The panel, headed by Nippon Steel Corp. Chairman Akio Mimura, plans to hold one or two meetings a month to compile a new energy plan by next summer.
D'coda Dcoda

Interaction Between Social Media and Nuclear Energy [17Jul11] - 0 views

  • As a blogger on nuclear energy for the past five years, I realize I'm writing on a niche subject that isn't going to pull in millions of readers. Unlike some entertainment blogs, a site on nuclear energy is never going to be able to link the words "reactor pressure vessel" with the antics of a Hollywood celebrity at a New York night club. So, what can be said about the use of social media and how it has evolved as a new communication tool in a mature industry?
  • EBR-1 chalkboard ~ the 1st known nuclear energy blog post 12/21/51 on the Arco desert of eastern Idaho
  • Evidence of acceptance of social media is widespread, with the most recent example being the launch of the Nuclear Information Center, a social media presence by Duke Energy (NYSE:DUK). Content written for the Nuclear Information Center by a team of the utility’s employees is clearly designed to reach out to the general public. This effort goes beyond the usual scope of a utility Web site, which includes things like how to pay your bill online, where to call when the lights go out, and so forth.
  • ...13 more annotations...
  • Most nuclear blogs have a "blog roll", which lists other publishers of information on the nuclear energy field. Areva has done this on its North American blog, handling the issue of avoiding any appearance of endorsement by noting that the list, with more than two dozen entries, is one of "blogs we read." Areva also has several years of experience reaching out to the nuclear blogger community with monthly conference calls. The blog of the Nuclear Energy Institute, NEI Nuclear Notes, lists a wide range of nuclear blogs, including this one as well as the blogs published by independent analysts.
  • Duke’s Web site is a completely modern effort set up like a blog, with new entries on a frequent basis. On the right column, the site has a list of other places to get nuclear energy information, including the American Nuclear Society (ANS), the Nuclear Energy Institute (NEI), and the Nuclear Regulatory Commission (NRC).
  • The Nuclear Information Center announces right at the top that “In this online space, you will find educational information on the nuclear industry and the nuclear stations operated by Duke Energy. We will feature insights into radiation, new nuclear, emergency planning and more . . . allowing readers to get an inside view of the industry.” That’s a big step for a nuclear utility. The reason is that like many publicly traded electric utilities, it generates electricity from several fuel sources, including coal, natural gas, solar, wind, and nuclear. Because these utilities have huge customer rate bases and supply chains, they are inherently conservative about the information they publish on their Web sites. Also, there are significant legal and financial reasons why a utility might or might not put information out there for public consumption. Press releases receive scrutiny from the general counsel and chief financial officer for very important reasons having to do with regulatory oversight and shareholder value.
  • Idaho National Laboratory, Areva, and recruiter CoolHandNuke.
  • Taken together, the four blogs that reported monthly page views represent 100,000 visits to online information pages on nuclear energy, an effective rate of well over 1 million page views per year. These are real numbers, and the data are just for a small sample of the more than two dozen blogs on nuclear energy that update at least once a week. Another interesting question is who reads North American blogs overseas. It turns out that the international readership is concentrated in a small group of countries, which include, in alphabetical order for the same sample of blogs: Australia, Canada, France, Germany, India, Japan, and the United Kingdom.
  • Who reads nuclear energy blogs? On the ANS Social Media listserv, I asked this question recently and got some interesting results for the month of May 2011. Here's a sample of the replies: Michele Kearny, at the Nuclear Wire, a news service, reports 18,812 page views for the month. Michele's blog is a fast-moving series of news links that keeps readers coming back for updates. Will Davis, at Atomic Power Review, who has been publishing high-quality, in-depth technical updates about Fukushima, reports 31,613 page views for the same month. Rod Adams, who recently updated the template at his blog Atomic Insights, reported his numbers in terms of absolute visitors, citing Google Analytics as reporting 10,583 unique visitors for May. Rod emphasizes commentary and analysis across a wide range of nuclear subjects. At my blog Idaho Samizdat, I can report 6,945 visitors and 24,938 page views for May 2011. The blog covers economic and political news about nuclear energy and nonproliferation issues. ANS Nuclear Cafe uses WordPress to track readers, reporting 24,476 page views for the same four-week period as the other blogs. During the height of the Fukushima crisis, on a single day, March 14, 2011, the blog attained over 55,000 page views as people poured onto the Internet in search of information about the situation in Japan.
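The "100,000 visits" and "well over 1 million page views per year" figures quoted earlier can be reproduced from the monthly numbers above. Note that Rod Adams's figure is unique visitors rather than page views, so it is left out of the page-view total in this sketch:

```python
# May 2011 page views reported by four nuclear energy blogs, per the text
monthly_page_views = {
    "The Nuclear Wire": 18_812,
    "Atomic Power Review": 31_613,
    "Idaho Samizdat": 24_938,
    "ANS Nuclear Cafe": 24_476,
}

monthly_total = sum(monthly_page_views.values())
annualized = monthly_total * 12

print(monthly_total)  # -> 99839, i.e. roughly 100,000 a month
print(annualized)     # -> 1198068, well over 1 million per year
```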
  • Some 5,000 people interact in a LinkedIn group moderated by nuclear industry consultant Ed Kee. It is called "Nuclear Power Next Generation" and is one of dozens of such groups related to nuclear energy on the professional networking site.
  • Nuclear energy is not as widely represented on Facebook as on LinkedIn; despite its enormous popularity, Facebook isn't conducive to the kinds of technical dialogs that populate other nuclear social media sites. While the Facebook format is attractive for lifestyle information such as dating and the promotion of entertainment, sports, and consumer packaged goods, it doesn't seem to work as well for business and engineering topics. It turns out Facebook is a good way to offer a "soft sell" for recruitment purposes, driving traffic to nuclear energy organizations' recruitment pages. It can answer the question of what it's like to work for an organization and showcase the attractive amenities of life in the employer's home town. Videos and photos can help deliver these messages.
  • On the other hand, Twitter, even with its limit of 140 characters, is enormously useful for the nuclear energy field. Twitter users who follow the output of nuclear bloggers number in the tens of thousands, and many nuclear energy organizations, including major utilities such as Entergy, have invested in a Twitter account to have a presence on the service. The American Nuclear Society "tweets" under @ans_org and posts daily updates on the situation at Fukushima.
  • Web sites maintained by NEI and the World Nuclear Association had to make fast upgrades to their computer servers to handle millions of inquiries from the media and the public on a global scale. Getting out the facts of the situation to respond to these inquiries was facilitated by this online presence at an unprecedented scale. Even so, newspapers often had anti-nuclear groups on speed dial early in the crisis, and their voices reached an unsettled public with messages of fear, uncertainty, and doubt. In response, ANS used technical experts on its social media listserv to inform media engagements, which reached millions of viewers on network television and in major newspapers like the New York Times and Washington Post.
  • This useful mix of free form communication on the listserv and excellent outreach by Clark Communications, working for ANS, made a difference in getting the facts about Fukushima to an understandably anxious public. Margaret Harding, a consulting nuclear engineer with deep experience with boiling water reactor fuels, was one of the people tapped by ANS to be a spokesperson for the society. She wrote to me in a personal e-mail that social media made a difference for her in many ways.
  • In summary, she said that it would have been impossible for her to fulfill this role without many hands helping her from various quarters at ANS. She pointed out that the ANS Social Media listserv group "provided invaluable background information ... that helped me keep up-to-date and ready for the question from the next reporter." In fact, she said, she might not have even started down this road if the listserv hadn't already proven itself as a source of information and expertise.
  • Another take on the news media's shift into anti-nuclear skepticism following Fukushima comes from Andrea Jennetta, publisher of Fuel Cycle Week. Writing in the March 17 issue, she said that this time the "bunker mentality" that has characterized the nuclear industry's communications in prior years gave way to something new: "But instead of rolling over, the nuclear community for once is mobilizing and fighting back. I am impressed at the efforts of various pronuclear activists, bloggers, advocates and professional organizations."
D'coda Dcoda

Gov't Report: EPA's ability to protect human health with RadNet was "potentially impair... - 0 views

  • An internal audit has confirmed observers’ concerns that many of the U.S. Environmental Protection Agency’s radiation monitors were out of service at the height of the 2011 Fukushima power plant meltdown in Japan [...] RadNet consists of 124 stations scattered throughout U.S. territories and 40 deployable air monitors that can be sent to take readings anywhere, according to the IG report. [...] At the time of the Fukushima crisis, “this critical infrastructure asset” was impaired because many monitors were broken, while others had not undergone filter changes in so long that they could not be used to accurately detect real-time radiation levels, the IG report says.
  • “On March 11, 2011, at the time of the Japan nuclear incident, 25 of the 124 installed RadNet monitors, or 20 percent, were out of service for an average of 130 days,” the report says. “In addition, six of the 12 RadNet monitors we sampled (50 percent) had gone over eight weeks without a filter change, and two of those for over 300 days,” the report adds, noting that EPA policy calls on operators to change the filters twice per week. Currently, “EPA remains behind schedule for installing” radiation monitors and has not resolved contracting issues identified as causing similar problems with the system in a 2009 audit, the report says. “Until EPA improves contractor oversight, the agency’s ability to use RadNet data to protect human health and the environment, and meet requirements established in the National Response Framework for Radiological Incidents, is potentially impaired.” [...]
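The IG report's percentages quoted above follow directly from the raw counts; a minimal sketch of that arithmetic:

```python
def out_of_service_pct(broken: int, total: int) -> float:
    """Share of monitors unavailable, as a percentage."""
    return 100 * broken / total

# 25 of 124 installed RadNet monitors down at the time of Fukushima
print(round(out_of_service_pct(25, 124)))  # -> 20

# 6 of the 12 sampled monitors overdue for a filter change
print(round(out_of_service_pct(6, 12)))    # -> 50
```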