
Economic Aspects of Nuclear Fuel Reprocessing [12Jul05]

  • On Tuesday, July 12, the Energy Subcommittee of the House Committee on Science will hold a hearing to examine whether it would be economical for the U.S. to reprocess spent nuclear fuel and what the potential cost implications are for the nuclear power industry and for the Federal Government. This hearing is a follow-up to the June 16 Energy Subcommittee hearing that examined the status of reprocessing technologies and the impact reprocessing would have on energy efficiency, nuclear waste management, and the potential for proliferation of weapons-grade nuclear materials.
  • Dr. Richard K. Lester is the Director of the Industrial Performance Center and a Professor of Nuclear Science and Engineering at the Massachusetts Institute of Technology. He co-authored a 2003 study entitled The Future of Nuclear Power. Dr. Donald W. Jones is Vice President of Marketing and Senior Economist at RCF Economic and Financial Consulting, Inc. in Chicago, Illinois. He co-directed a 2004 study entitled The Economic Future of Nuclear Power. Dr. Steve Fetter is the Dean of the School of Public Policy at the University of Maryland. He co-authored a 2005 paper entitled The Economics of Reprocessing vs. Direct Disposal of Spent Nuclear Fuel. Mr. Marvin Fertel is the Senior Vice President and Chief Nuclear Officer at the Nuclear Energy Institute.
  • 3. Overarching Questions: Under what conditions would reprocessing be economically competitive, compared both to nuclear power that does not include fuel reprocessing and to other sources of electric power? What major assumptions underlie these analyses? What government subsidies might be necessary to introduce a more advanced nuclear fuel cycle (one that includes reprocessing, recycling, and transmutation—''burning'' the most radioactive waste products in an advanced reactor) in the U.S.?
  • 4. Brief Overview of Nuclear Fuel Reprocessing (from June 16 hearing charter)  Nuclear reactors generate about 20 percent of the electricity used in the U.S. No new nuclear plants have been ordered in the U.S. since 1973, but there is renewed interest in nuclear energy both because it could reduce U.S. dependence on foreign oil and because it produces no greenhouse gas emissions.  One of the barriers to increased use of nuclear energy is concern about nuclear waste. Every nuclear power reactor produces approximately 20 tons of highly radioactive nuclear waste every year. Today, that waste is stored on-site at the nuclear reactors in water-filled cooling pools or, at some sites, after sufficient cooling, in dry casks above ground. About 50,000 metric tons of commercial spent fuel is being stored at 73 sites in 33 states. A recent report issued by the National Academy of Sciences concluded that this stored waste could be vulnerable to terrorist attacks.
  • Under the current plan for long-term disposal of nuclear waste, the waste from around the country would be moved to a permanent repository at Yucca Mountain in Nevada, which is now scheduled to open around 2012. The Yucca Mountain facility continues to be a subject of controversy. But even if it opened and functioned as planned, it would have only enough space to store the nuclear waste the U.S. is expected to generate by about 2010.  Consequently, there is growing interest in finding ways to reduce the quantity of nuclear waste. A number of other nations, most notably France and Japan, ''reprocess'' their nuclear waste. Reprocessing involves separating out the various components of nuclear waste so that a portion of the waste can be recycled and used again as nuclear fuel (instead of disposing of all of it). In addition to reducing the quantity of high-level nuclear waste, reprocessing makes it possible to use nuclear fuel more efficiently. With reprocessing, the same amount of nuclear fuel can generate more electricity because some components of it can be used as fuel more than once.
  • The greatest drawback of reprocessing is that current reprocessing technologies produce weapons-grade plutonium (which is one of the components of the spent fuel). Any activity that increases the availability of plutonium increases the risk of nuclear weapons proliferation.  Because of proliferation concerns, the U.S. decided in the 1970s not to engage in reprocessing. (The policy decision was reversed the following decade, but the U.S. still did not move toward reprocessing.) But the Department of Energy (DOE) has continued to fund research and development (R&D) on nuclear reprocessing technologies, including new technologies that their proponents claim would reduce the risk of proliferation from reprocessing.
  • The report accompanying H.R. 2419, the Energy and Water Development Appropriations Act for Fiscal Year 2006, which the House passed in May, directed DOE to focus research in its Advanced Fuel Cycle Initiative program on improving nuclear reprocessing technologies. The report went on to state, ''The Department shall accelerate this research in order to make a specific technology recommendation, not later than the end of fiscal year 2007, to the President and Congress on a particular reprocessing technology that should be implemented in the United States. In addition, the Department shall prepare an integrated spent fuel recycling plan for implementation beginning in fiscal year 2007, including recommendation of an advanced reprocessing technology and a competitive process to select one or more sites to develop integrated spent fuel recycling facilities.''
  • During floor debate on H.R. 2419, the House defeated an amendment that would have cut funding for research on reprocessing. In arguing for the amendment, its sponsor, Mr. Markey, explicitly raised the risks of weapons proliferation. Specifically, the amendment would have cut funding for reprocessing activities and interim storage programs by $15.5 million and shifted the funds to energy efficiency activities, effectively repudiating the report language. The amendment was defeated by a vote of 110–312.
  • But nuclear reprocessing remains controversial, even within the scientific community. In May 2005, the American Physical Society (APS) Panel on Public Affairs issued a report, Nuclear Power and Proliferation Resistance: Securing Benefits, Limiting Risk. APS, which is the leading organization of the Nation's physicists, is on record as strongly supporting nuclear power. But the APS report takes the opposite tack from the Appropriations report, stating, ''There is no urgent need for the U.S. to initiate reprocessing or to develop additional national repositories. DOE programs should be aligned accordingly: shift the Advanced Fuel Cycle Initiative R&D away from an objective of laying the basis for a near-term reprocessing decision; increase support for proliferation-resistance R&D and technical support for institutional measures for the entire fuel cycle.'' Technological as well as policy questions remain regarding reprocessing. It is not clear whether the new reprocessing technologies that DOE is funding will be developed sufficiently by 2007 to allow the U.S. to select a technology to pursue. There is also debate about the extent to which new technologies can truly reduce the risks of proliferation.
  •  It is also unclear how selecting a reprocessing technology might relate to other pending technology decisions regarding nuclear energy. For example, the U.S. is in the midst of developing new designs for nuclear reactors under DOE's Generation IV program. Some of the potential new reactors would produce types of nuclear waste that could not be reprocessed using some of the technologies now being developed with DOE funding.
  • 5. Brief Overview of Economics of Reprocessing
  • The economics of reprocessing are hard to predict with any certainty because there are few examples around the world on which economists might base a generalized model.  Some of the major factors influencing the economic competitiveness of reprocessing are: the availability and cost of uranium, costs associated with interim storage and long-term disposal in a geologic repository, reprocessing plant construction and operating costs, and costs associated with transmutation, the process by which certain parts of the spent fuel are actively reduced in toxicity to address long-term waste management.
  • Costs associated with reducing greenhouse gas emissions from fossil fuel-powered plants could help make nuclear power, including reprocessing, economically competitive with other sources of electricity in a free market.
  •  It is not clear who would pay for reprocessing in the U.S.
  • Three recent studies have examined the economics of nuclear power. In a study completed at the Massachusetts Institute of Technology in 2003, The Future of Nuclear Power, an interdisciplinary panel, including Professor Richard Lester, looked at all aspects of nuclear power from waste management to economics to public perception. In a study requested by the Department of Energy and conducted at the University of Chicago in 2004, The Economic Future of Nuclear Power, economist Dr. Donald Jones and his colleague compared costs of future nuclear power to other sources, and briefly looked at the incremental costs of an advanced fuel cycle. In a 2003 study conducted by a panel including Matthew Bunn (a witness at the June 16 hearing) and Professor Steve Fetter, The Economics of Reprocessing vs. Direct Disposal of Spent Nuclear Fuel, the authors took a detailed look at the costs associated with an advanced fuel cycle. All three studies seem more or less to agree on cost estimates: the incremental cost of nuclear electricity to the consumer, with reprocessing, could be modest—on the order of 1–2 mills/kWh (0.1–0.2 cents per kilowatt-hour); on the other hand, this increase represents an approximate doubling (at least) of the costs attributable to spent fuel management, compared to the current fuel cycle (no reprocessing). Where they strongly disagree is on how large an impact this incremental cost will have on the competitiveness of nuclear power. The University of Chicago authors conclude that the cost of reprocessing is negligible in the big picture, where capital costs of new plants dominate all economic analyses. The other two studies take a more skeptical view—because new nuclear power would already be facing tough competition in the current market, any additional cost would further hinder the nuclear power industry, or become an unacceptable and unnecessary financial burden on the government.
  • 6. Background
    Report from the Energy Subcommittee of the House Committee on Science. Didn't highlight the entire article; see the site for the rest.
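A rough illustration of what the 1-2 mills/kWh increment discussed in the studies above would mean on a household electric bill. The 10,000 kWh/yr consumption figure is an assumption chosen for illustration, not a number from the hearing charter.

# Rough illustration of the 1-2 mills/kWh reprocessing increment quoted above.
# The household consumption figure is an assumption, not from the hearing charter.

ANNUAL_USE_KWH = 10_000                      # assumed typical household consumption
INCREMENT_DOLLARS_PER_KWH = (0.001, 0.002)   # 1-2 mills = 0.1-0.2 cents per kWh

low, high = (rate * ANNUAL_USE_KWH for rate in INCREMENT_DOLLARS_PER_KWH)
print(f"added cost: ${low:.0f} to ${high:.0f} per year")  # $10 to $20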

Fast reactor advocates throw down gauntlet to MIT authors [24Jul11]

  • Near the end of 2010, the Massachusetts Institute of Technology released a summary of a report titled The Future of the Nuclear Fuel Cycle as part of its MIT Energy Initiative. The complete report was released a few months ago. The conclusions published in that report initiated a virtual firestorm of reaction among the members of the Integral Fast Reactor (IFR) Study group, who strongly disagreed with the authors.
  • The following quote from the “Study Context” provides a good summary of why the fast reactor advocates were so dismayed by the report.
  • For decades, the discussion about future nuclear fuel cycles has been dominated by the expectation that a closed fuel cycle based on plutonium startup of fast reactors would eventually be deployed. However, this expectation is rooted in an out-of-date understanding about uranium scarcity. Our reexamination of fuel cycles suggests that there are many more viable fuel cycle options and that the optimum choice among them faces great uncertainty—some economic, such as the cost of advanced reactors, some technical such as implications for waste management, and some societal, such as the scale of nuclear power deployment and the management of nuclear proliferation risks. Greater clarity should emerge over the next few decades, assuming that the needed research is carried out for technological alternatives and that the global response to climate change risk mitigation comes together. A key message from our work is that we can and should preserve our options for fuel cycle choices by continuing with the open fuel cycle, implementing a system for managed LWR spent fuel storage, developing a geological repository, and researching technology alternatives appropriate to a range of nuclear energy futures.
  • The group of fast reactor supporters includes some notable scientists and engineers whose list of professional accomplishments is at least as long as those of the people who produced the MIT report. In addition, it includes people like Charles Till and Yoon Chang who were intimately involved in the US’s multi-decade long fast reactor development and demonstration program that resulted in demonstrating a passively safe, sodium cooled reactor and an integral recycling system based on metallic fuel and pyroprocessing.
  • That effort, known as the Integral Fast Reactor, was not just based on an out-dated concept of uranium availability, but also on the keen recognition that the public wants a clear solution to “the nuclear waste issue” that does not look like a decision to “kick the can down the road.”
  • The Science Council for Global Initiatives produced a detailed critique of the MIT paper and published it on Barry Brook’s Brave New Climate blog at the end of May 2011. The discussion holds a great deal of interest for technical specialists, and it belies the falsehood, often asserted by people who oppose nuclear technology, that those interested in developing and deploying nuclear technology speak with a single, almost brainwashed voice.
  • In recent days, however, the controversy has become more interesting because the IFR discussion group has decided to issue a public debate challenge and to allow people like me to write about that challenge in an attempt to produce some response.
  • I think your team is dead wrong on your conclusion that we don’t need fast reactors/closed fuel cycle for decades. Your study fails to take into account the political landscape, the competitive landscape, the safety issue, and environmental issues with uranium mining. It is unacceptable to the public to not have a solution to the waste issue. Nuclear power has been around for over 50 years, and we STILL HAVE NO OPTION FOR THE WASTE today other than interim dry cask storage. There is no national repository. Without that, the laws in my state forbid construction of a new nuclear power plant.
  • Other countries are pursuing fast reactors; we are not. Russia has 30 years of commercial operating history with fast reactors. The US has zero. We invented the best Gen IV technology according to the study done by the Gen IV International Forum. So what did we do with it? After spending $5B on the project, and after proving it met all expectations, we CANCELLED it (although the Senate voted to fund it).
  • An average investment of $300M a year could re-start our fast reactor program with a goal of actually commercializing our best reactor design (the IFR, according to the GIF study).
  • At least we’d have a bird in the hand: a design that we know works, that largely solves the waste problem (fast reactor waste needs to be stored for only a few hundred years at most), and that doesn’t require electric power or any active systems to shut down safely.
  • Investing lots of money in a project and pulling the funding right before completion is a bad strategy for technology leadership.
  • MIT should be arguing for focusing and finishing what we started with the IFR. At least we’d have something that addresses safety, waste, and environmental issues. Uranium is cheap because we don’t have to pay for the environmental impact of uranium mining.

"Ecological Half Life" of Cesium-137 May Be 180 to 320 Years? [23Aug11] - 0 views

  • A Wired Magazine article dated December 15, 2009 cites a poster session presentation of the research of the Chernobyl exclusion zone at the American Geophysical Union conference in 2009, and says radioactive cesium may be remaining in the soil far longer than what the half life (30 years) suggests. To note: it was a poster session presentation, and I'm looking to see if it has been formally published in a scientific paper since then.
  • From Wired Magazine (12/15/2009): SAN FRANCISCO — Chernobyl, the worst nuclear accident in history, created an inadvertent laboratory to study the impacts of radiation — and more than twenty years later, the site still holds surprises.
  • Reinhabiting the large exclusion zone around the accident site may have to wait longer than expected. Radioactive cesium isn’t disappearing from the environment as quickly as predicted, according to new research presented here Monday at the meeting of the American Geophysical Union. Cesium 137’s half-life — the time it takes for half of a given amount of material to decay — is 30 years. In addition to that, cesium-137’s total ecological half-life — the time for half the cesium to disappear from the local environment through processes such as migration, weathering, and removal by organisms — is also typically 30 years or less, but the amount of cesium in soil near Chernobyl isn’t decreasing nearly that fast. And scientists don’t know why.
  • It stands to reason that at some point the Ukrainian government would like to be able to use that land again, but the scientists have calculated that what they call cesium’s “ecological half-life” — the time for half the cesium to disappear from the local environment — is between 180 and 320 years.
  • “Normally you’d say that every 30 years, it’s half as bad as it was. But it’s not,” said Tim Jannik, nuclear scientist at Savannah River National Laboratory and a collaborator on the work. “It’s going to be longer before they repopulate the area.”
  • In 1986, after the Chernobyl accident, a series of test sites was established along paths that scientists expected the fallout to take. Soil samples were taken at different depths to gauge how the radioactive isotopes of strontium, cesium and plutonium migrated in the ground. They’ve been taking these measurements for more than 20 years, providing a unique experiment in the long-term environmental repercussions of a near worst-case nuclear accident.
  • In some ways, Chernobyl is easier to understand than DOE sites like Hanford, which have been contaminated by long-term processes. With Chernobyl, said Boris Faybishenko, a nuclear remediation expert at Lawrence Berkeley National Laboratory, we have a definite date at which the contamination began and a series of measurements carried out from that time to today. “I have been involved in Chernobyl studies for many years and this particular study could be of great importance to many [Department of Energy] researchers,” said Faybishenko.
  • The results of this study came as a surprise. Scientists expected the ecological half-lives of radioactive isotopes to be shorter than their physical half-lives, as natural dispersion helped reduce the amount of material in any given soil sample (see the short numerical sketch at the end of this entry). For strontium, that idea has held up. But for cesium the opposite appears to be true. The physical properties of cesium haven’t changed, so scientists think there must be an environmental explanation. It could be that new cesium is blowing over the soil sites from closer to the Chernobyl site. Or perhaps cesium is migrating up through the soil from deeper in the ground. Jannik hopes more research will uncover the truth.
  • “There are a lot of unknowns that are probably causing this phenomenon,” he said. Beyond the societal impacts of the study, the work also emphasizes the uncertainties associated with radioactive contamination. Thankfully, Chernobyl-scale accidents have been rare, but that also means there is a paucity of places to study how radioactive contamination really behaves in the wild.
  • “The data from Chernobyl can be used for validating models,” said Faybishenko. “This is the most value that we can gain from it.” Update 12/28: The second paragraph of this story was updated after discussion with Tim Jannik to more accurately reflect the idea of ecological half-life.
  • Citation: “Long-Term Dynamics of Radionuclides Vertical Migration in Soils of the Chernobyl Nuclear Power Plant Exclusion Zone” by Yu.A. Ivanov, V.A. Kashparov, S.E. Levchuk, Yu.V. Khomutinin, M.D. Bondarkov, A.M. Maximenko, E.B. Farfan, G.T. Jannik, and J.C. Marra. AGU 2009 poster session.
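A short numerical sketch of why the 180 to 320 year figure was a surprise. It assumes, purely for illustration and not as the researchers' model, that radioactive decay and environmental removal act as independent first-order processes; under that assumption the observed decline should always be faster than the 30-year physical half-life.

# Rough arithmetic behind the expectation described in this entry.
# Assumption (illustrative): decay and environmental removal are independent
# first-order processes, so their rates add.
import math

T_PHYSICAL = 30.0  # years, Cs-137 radioactive half-life (from the article)

def effective_half_life(t_physical, t_removal):
    """Observed half-life when decay and removal both deplete the soil inventory."""
    rate = math.log(2) / t_physical + math.log(2) / t_removal
    return math.log(2) / rate

# If environmental removal alone also had a 30-year half-life, the decline in
# soil concentrations should be much faster than pure decay:
print(effective_half_life(T_PHYSICAL, 30.0))   # 15.0 years

# Even very slow removal (a 300-year half-life) still gives less than 30 years:
print(effective_half_life(T_PHYSICAL, 300.0))  # ~27.3 years

# No combination of decay plus removal yields the measured 180-320 years, which
# is consistent with the article's suggestion that cesium is being resupplied,
# for example blown in from elsewhere or migrating up from deeper soil.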

Fukushima's Contamination Produces Some Surprises at Sea [29Sep11]

  • Six months after the accident at Fukushima Daiichi, the news flow from the stricken nuclear power plant has slowed, but scientific studies of radioactive material in the ocean are just beginning to bear fruit. The word from the land is bad enough. As my colleague Hiroko Tabuchi reported on Saturday, Japanese officials have detected elevated radiation levels in rice near the crippled reactors. Worrying radiation levels had already been detected in beef, milk, spinach and tea leaves, leading to recalls and bans on shipments.
  • Off the coast, the early results indicate that very large amounts of radioactive materials were released, and may still be leaking, and that rather than being spread through the whole ocean, currents are keeping a lot of the material concentrated. Most of that contamination came from attempts to cool the reactors and spent fuel pools, which flushed material from the plant into the ocean, and from direct leaks from the damaged facilities.
  • Working with a team of scientists from other institutions, including the University of Tokyo and Columbia University, Mr. Buesseler’s Woods Hole group in June spent 15 days in the waters off northeast Japan, studying the levels and dispersion of radioactive substances there and the effect on marine life. The project, financed primarily by the Moore Foundation after governments declined to participate, continued to receive samples from Japanese cruises into July.
  • The leakage very likely isn’t over, either. The Tokyo Electric Power Company, the operator of the plant, said Sept. 20 that it believed that something on the order of 200 to 500 tons a day of groundwater might still be pouring into the damaged reactor and turbine buildings. Ken Buesseler, a scientist at the Woods Hole Oceanographic Institution, who in 1986 studied the effects of the Chernobyl disaster on the Black Sea, said the Fukushima disaster appeared to be by far the largest accidental release of radioactive material into the sea.
  • Chernobyl-induced radiation in the Black Sea peaked in 1986 at about 1,000 becquerels per cubic meter, he said in an interview at his office in Woods Hole, Mass. By contrast, the radiation level off the coast near the Fukushima Daiichi plant peaked at more than 100,000 becquerels per cubic meter in early April.
  • Japanese government and utility industry scientists estimated this month that 3,500 terabecquerels of cesium 137 was released directly into the sea from March 11, the date of the earthquake and tsunami, to late May. Another 10,000 terabecquerels of cesium 137 made it into the ocean after escaping from the plant as steam.
  • While Mr. Buesseler declined to provide details of the findings before analysis is complete and published, he said the broad results were sobering. “When we saw the numbers — hundreds of millions of becquerels — we knew this was the largest delivery of radiation into the ocean ever seen,’’ he said. ‘‘We still don’t know how much was released.’’ Mr. Buesseler took samples of about five gallons, filtered out the naturally occurring materials and the materials from nuclear weapon explosions, and measured what was left.
  • The scientists had expected to find ocean radiation levels falling off sharply after a few months, as radioactive substances were dispersed by the currents, because, he said, “The ocean’s solution to pollution is dilution.’’ The good news is that researchers found the entire region 20 to 400 miles offshore had radiation levels too low to be an immediate threat to humans. But there was also an unpleasant surprise. “Rather than leveling off toward zero, it remained elevated in late July,’’ he said, up to about 10,000 becquerels per cubic meter. ‘‘That suggests the release problem has not been solved yet.”
  • The working hypothesis is that contaminated sediments and groundwater near the coast are continuing to contaminate the seas, he said. The international team also collected plankton samples and small fish for study. Mr. Buesseler said there were grounds for concern about bioaccumulation of radioactive isotopes in the food chain, particularly in seaweed and some shellfish close to the plants. A fuller understanding of the effect on fish that are commercially harvested will probably take several years of data following several feeding cycles, he said.
  • ‘‘We also don’t know concentrations in sediments, so benthic biota may be getting higher doses and if consumed (shellfish), could be of concern,’’ he wrote later in an e-mail, referring to organisms that dwell on the sea floor. The study also found that the highest cesium values were not necessarily from the samples collected closest to Fukushima, he said, because eddies in the ocean currents keep the material from being diluted in some spots farther offshore. The overall results were consistent with those previously found by Japanese scientists, Mr. Buesseler said. He said more research was urgently needed to answer several questions, including why the level of contamination offshore near the plant was so high. “Japan is leading the studies, but more work is needed than any one country, or any one lab, can possibly carry out,” he said.

Study: Childhood cancer not linked to reactors [13Jul11]

  • A nationwide study involving more than 1.3 million children in Switzerland has concluded that there is no evidence of an increased risk of cancer for children born near nuclear power plants. The Federal Office of Public Health (FOPH) and the Swiss Cancer League requested that the Institute of Social and Preventive Medicine (ISPM) at the University of Bern perform a study of the relationship between childhood cancer and nuclear power plants in Switzerland. ISPM then teamed with the Swiss Childhood Cancer Registry and the Swiss Paediatric Oncology Group to conduct the Childhood Cancer and Nuclear Power Plants in Switzerland (CANUPIS) study between September 2008 and December 2010. The results have now been published in the International Journal of Epidemiology.
  • The researchers computed person-years at risk for over 1.3 million children aged 0-15 years born in Switzerland between 1985 and 2009, based on the Swiss censuses 1990 and 2000. They also identified cancer cases in those children from the Swiss Childhood Cancer Registry. The ISPM then compared the rate of leukaemias and cancers in children born less than five kilometres, 5-10 km, and 10-15 km from the nearest nuclear power plants with the risk in children born further away.
  • Researchers concluded that the risk in the zone within 5 km of a nuclear power plant was "similar" to the risk in the control group areas over 15 km away, with 8 cases compared to 6.8 expected cases. In the 5-10 km zone there were 12 cases compared to 20.3 expected cases. And in the 10-15 km zone there were 31 cases compared to 28.3 expected cases. "A statistically significant increase or reduction in the risk of childhood cancer was not observed in any of the analyses," said the ISPM. (A rough numerical reading of these counts appears at the end of this entry.)
  • The study concluded, "This nationwide cohort study, adjusting for confounders and using exact distances from residence at birth and diagnosis to the nearest nuclear power plants, found little evidence for an association between the risk of leukaemia or any childhood cancer and living near nuclear power plants."   There are five nuclear power plants in Switzerland (Beznau I and II, Mühleberg, Gösgen and Leibstadt). About 1% of the population lives within 5 km of a plant and 10% live within 15 km.
  • The radioactive emissions in the vicinity of Swiss nuclear power plants are regularly monitored and the data are published by the Division for Radiation Protection of the FOPH. "The exposure due to emissions from nuclear power plants in the vicinity of these plants is below 0.01 millisieverts per year," the University of Bern said. "This corresponds to less than 1/500 of the average total radiation residents in Switzerland are exposed to, mainly from radon gas, cosmic and terrestrial radiation and medical investigations and therapies."
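A rough back-of-the-envelope reading of the CANUPIS case counts quoted above. Treating each observed count as Poisson-distributed around its expected count is an illustrative assumption on my part, not the study's published method, which adjusted for confounders.

# Rough reading of the CANUPIS counts quoted above (observed vs expected cases).
# Assumption (illustrative, not the study's method): observed counts are Poisson
# with mean equal to the expected count.
import math

def poisson_tail(k, mu):
    """P(X >= k) for X ~ Poisson(mu)."""
    return 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))

zones = {"0-5 km": (8, 6.8), "5-10 km": (12, 20.3), "10-15 km": (31, 28.3)}

for zone, (observed, expected) in zones.items():
    print(f"{zone}: observed/expected = {observed / expected:.2f}")

# For the zone of primary concern (within 5 km), the chance of seeing 8 or more
# cases when 6.8 are expected is large, i.e. fully compatible with no excess risk:
print(poisson_tail(8, 6.8))  # ~0.37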

Scientists Radically Raise Estimates of Fukushima Fallout [25Oct11]

  • The disaster at the Fukushima Daiichi nuclear plant in March released far more radiation than the Japanese government has claimed. So concludes a study that combines radioactivity data from across the globe to estimate the scale and fate of emissions from the shattered plant. The study also suggests that, contrary to government claims, pools used to store spent nuclear fuel played a significant part in the release of the long-lived environmental contaminant caesium-137, which could have been prevented by prompt action. The analysis has been posted online for open peer review by the journal Atmospheric Chemistry and Physics.
  • Andreas Stohl, an atmospheric scientist with the Norwegian Institute for Air Research in Kjeller, who led the research, believes that the analysis is the most comprehensive effort yet to understand how much radiation was released from Fukushima Daiichi. "It's a very valuable contribution," says Lars-Erik De Geer, an atmospheric modeller with the Swedish Defense Research Agency in Stockholm, who was not involved with the study. The reconstruction relies on data from dozens of radiation monitoring stations in Japan and around the world. Many are part of a global network to watch for tests of nuclear weapons that is run by the Comprehensive Nuclear-Test-Ban Treaty Organization in Vienna. The scientists added data from independent stations in Canada, Japan and Europe, and then combined those with large European and American caches of global meteorological data.
  • Stohl cautions that the resulting model is far from perfect. Measurements were scarce in the immediate aftermath of the Fukushima accident, and some monitoring posts were too contaminated by radioactivity to provide reliable data. More importantly, exactly what happened inside the reactors — a crucial part of understanding what they emitted — remains a mystery that may never be solved. "If you look at the estimates for Chernobyl, you still have a large uncertainty 25 years later," says Stohl. Nevertheless, the study provides a sweeping view of the accident. "They really took a global view and used all the data available," says De Geer.
  • Challenging numbers: Japanese investigators had already developed a detailed timeline of events following the 11 March earthquake that precipitated the disaster. Hours after the quake rocked the six reactors at Fukushima Daiichi, the tsunami arrived, knocking out crucial diesel back-up generators designed to cool the reactors in an emergency. Within days, the three reactors operating at the time of the accident overheated and released hydrogen gas, leading to massive explosions. Radioactive fuel recently removed from a fourth reactor was being held in a storage pool at the time of the quake, and on 14 March the pool overheated, possibly sparking fires in the building over the next few days.
  • But accounting for the radiation that came from the plants has proved much harder than reconstructing this chain of events. The latest report from the Japanese government, published in June, says that the plant released 1.5 × 10^16 becquerels of caesium-137, an isotope with a 30-year half-life that is responsible for most of the long-term contamination from the plant. A far larger amount of xenon-133, 1.1 × 10^19 Bq, was released, according to official government estimates.
  • Stohl believes that the discrepancy between the team's results and those of the Japanese government can be partly explained by the larger data set used. Japanese estimates rely primarily on data from monitoring posts inside Japan, which never recorded the large quantities of radioactivity that blew out over the Pacific Ocean, and eventually reached North America and Europe. "Taking account of the radiation that has drifted out to the Pacific is essential for getting a real picture of the size and character of the accident," says Tomoya Yamauchi, a radiation physicist at Kobe University who has been measuring radioisotope contamination in soil around Fukushima. Stohl adds that he is sympathetic to the Japanese teams responsible for the official estimate. "They wanted to get something out quickly," he says. The differences between the two studies may seem large, notes Yukio Hayakawa, a volcanologist at Gunma University who has also modelled the accident, but uncertainties in the models mean that the estimates are actually quite similar.
  • The new study challenges those numbers. On the basis of its reconstructions, the team claims that the accident released around 1.7 × 10^19 Bq of xenon-133, greater than the estimated total radioactive release of 1.4 × 10^19 Bq from Chernobyl. The fact that three reactors exploded in the Fukushima accident accounts for the huge xenon tally, says De Geer. Xenon-133 does not pose serious health risks because it is not absorbed by the body or the environment. Caesium-137 fallout, however, is a much greater concern because it will linger in the environment for decades. The new model shows that Fukushima released 3.5 × 10^16 Bq of caesium-137, roughly twice the official government figure, and half the release from Chernobyl. The higher number is obviously worrying, says De Geer, although ongoing ground surveys are the only way to truly establish the public-health risk. (These release figures are collected in the short sketch at the end of this entry.)
  • The new analysis also claims that the spent fuel being stored in the unit 4 pool emitted copious quantities of caesium-137. Japanese officials have maintained that virtually no radioactivity leaked from the pool. Yet Stohl's model clearly shows that dousing the pool with water caused the plant's caesium-137 emissions to drop markedly (see 'Radiation crisis'). The finding implies that much of the fallout could have been prevented by flooding the pool earlier. The Japanese authorities continue to maintain that the spent fuel was not a significant source of contamination, because the pool itself did not seem to suffer major damage. "I think the release from unit 4 is not important," says Masamichi Chino, a scientist with the Japanese Atomic Energy Authority in Ibaraki, who helped to develop the Japanese official estimate. But De Geer says the new analysis implicating the fuel pool "looks convincing".
  • The latest analysis also presents evidence that xenon-133 began to vent from Fukushima Daiichi immediately after the quake, and before the tsunami swamped the area. This implies that even without the devastating flood, the earthquake alone was sufficient to cause damage at the plant.


    The Japanese government's report has already acknowledged that the shaking at Fukushima Daiichi exceeded the plant's design specifications. Anti-nuclear activists have long been concerned that the government has failed to adequately address geological hazards when licensing nuclear plants (see Nature 448, 392–393; 2007), and the whiff of xenon could prompt a major rethink of reactor safety assessments, says Yamauchi.

  • The model also shows that the accident could easily have had a much more devastating impact on the people of Tokyo. In the first days after the accident the wind was blowing out to sea, but on the afternoon of 14 March it turned back towards shore, bringing clouds of radioactive caesium-137 over a huge swathe of the country (see 'Radioisotope reconstruction'). Where precipitation fell, along the country's central mountain ranges and to the northwest of the plant, higher levels of radioactivity were later recorded in the soil; thankfully, the capital and other densely populated areas had dry weather. "There was a period when quite a high concentration went over Tokyo, but it didn't rain," says Stohl. "It could have been much worse." 
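The release figures quoted in this entry, collected in one place for comparison. The values themselves come from the article above; only the unit conversion and the ratio are computed here, and the Chernobyl caesium figure is inferred from the statement that Fukushima released about half as much.

# Release estimates quoted in this entry, converted to petabecquerels.
# Figures are from the text above; the Chernobyl Cs-137 entry is inferred from
# the "half the release from Chernobyl" statement, not quoted directly.
PBq = 1e15  # becquerels per petabecquerel

estimates_bq = {
    "Cs-137, Japanese government estimate (June)": 1.5e16,
    "Cs-137, Stohl et al. reconstruction": 3.5e16,
    "Cs-137, Chernobyl (inferred, twice the Stohl figure)": 2 * 3.5e16,
    "Xe-133, Japanese government estimate": 1.1e19,
    "Xe-133, Stohl et al. reconstruction": 1.7e19,
    "Total radioactive release from Chernobyl (for comparison)": 1.4e19,
}

for label, bq in estimates_bq.items():
    print(f"{label}: {bq / PBq:,.0f} PBq")

# Ratio highlighted in the article: the reconstructed Cs-137 release is roughly
# twice the official government figure.
print(3.5e16 / 1.5e16)  # ~2.3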

Impacts of the Fukushima Nuclear Power Plants on Marine Radioactivity - Environmental S...

  • The impacts on the ocean of releases of radionuclides from the Fukushima Dai-ichi nuclear power plants remain unclear. However, information has been made public regarding the concentrations of radioactive isotopes of iodine and cesium in ocean water near the discharge point. These data allow us to draw some basic conclusions about the relative levels of radionuclides released which can be compared to prior ocean studies and be used to address dose consequences as discussed by Garnier-Laplace et al. in this journal.(1) The data show peak ocean discharges in early April, one month after the earthquake and a factor of 1000 decrease in the month following. Interestingly, the concentrations through the end of July remain higher than expected implying continued releases from the reactors or other contaminated sources, such as groundwater or coastal sediments. By July, levels of 137Cs are still more than 10 000 times higher than levels measured in 2010 in the coastal waters off Japan. Although some radionuclides are significantly elevated, dose calculations suggest minimal impact on marine biota or humans due to direct exposure in surrounding ocean waters, though considerations for biological uptake and consumption of seafood are discussed and further study is warranted.
  • there was no large explosive release of core reactor material, so most of the isotopes reported to have spread thus far via atmospheric fallout are primarily the radioactive gases plus fission products such as cesium, which are volatilized at the high temperatures in the reactor core, or during explosions and fires. However, some nonvolatile activation products and fuel rod materials may have been released when the corrosive brines and acidic waters used to cool the reactors interacted with the ruptured fuel rods, carrying radioactive materials into the ground and ocean. The full magnitude of the release has not been well documented, nor is there data on many of the possible isotopes released, but we do have significant information on the concentration of several isotopes of Cs and I in the ocean near the release point which have been publicly available since shortly after the accident started.
  • We present a comparison of selected data made publicly available from a Japanese company and agencies and compare these to prior published radionuclide concentrations in the oceans. The primary sources included TEPCO (Tokyo Electric Power Company), which reported data in regular press releases(3) and are compiled here (Supporting Information Table S1). These TEPCO data were obtained by initially sampling 500 mL surface ocean water from shore and direct counting on high-purity germanium gamma detectors for 15 min at laboratories at the Fukushima Dai-ni NPPs. They initially reported results for 131I (t1/2 = 8.02 days), 134Cs (t1/2 = 2.065 years) and 137Cs (t1/2 = 30.07 years). Data from MEXT (Ministry of Education, Culture, Sports, Science and Technology—Japan) were also released on a public Web site(4) and are based on similar direct counting methods. In general MEXT data were obtained by sampling 2000 mL seawater and direct counting on high-purity germanium gamma detectors for 1 h in a 2 L Marinelli beaker at laboratories in the Japan Atomic Energy Agency. The detection limits of 137Cs measurements are about 20 000 Bq m–3 for TEPCO data and 10 000 Bq m–3 for MEXT data, respectively. These measurements were conducted based on a guideline described by MEXT.(5) Both sources are considered reliable given the common activity ratios and prior studies and expertise evident by several Japanese groups involved in making these measurements. The purpose of these early monitoring activities was to address immediate health concerns, and thus results were often reported relative to statutory limits adopted by Japanese authorities rather than in concentration units (reported as scaling factors above “normal”). Here we convert values from both sources to radionuclide activity units common to prior ocean studies of fallout in the ocean (Bq m–3) for ease of comparison to previously published data.
  • ...5 more annotations...
  • We focus on the most complete time-series records from the north and south discharge channels at the Dai-ichi NPPs, and two sites to the south that were not considered sources, namely the north discharge channels at the Dai-ni NPPs about 10 km to the south and Iwasawa beach, which is 16 km south of the Dai-ichi NPPs (Figure 1). The levels at the discharge point are exceedingly high, with a peak 137Cs concentration of 68 million Bq m–3 on April 6 (Figure 2). What is significant is not just the elevated concentrations, but the timing of the peak release, approximately one month after the earthquake. This delayed release is presumably due to the complicated pattern of discharge of seawater and fresh water used to cool the reactors and spent fuel rods, interactions with groundwater, and intentional and unintentional releases of mixed radioactive material from the reactor facility.
  • the concentrations of Cs in sediments and biota near the NPPs may be quite large, and will continue to remain so for at least 30–100 years due to the longer half-life of 137Cs which is still detected in marine and lake sediments from 1960s fallout sources.
  • If the source at Fukushima had stopped abruptly and ocean mixing processes continued at the same rates, one would have expected that the 137Cs activities would have decreased by an additional factor of 1000 from May to June, but that was not observed (a simple calculation of this expected decline appears at the end of this entry). The break in slope in early May implies that a steady, albeit lower, source of 137Cs continues to discharge to the oceans at least through the end of July at this site. With reports of highly contaminated cooling waters at the NPPs and complete melt through of at least one of the reactors, this is not surprising. As we have no reason to expect a change in mixing rates of the ocean which would also impact this dilution rate, this change in slope of 137Cs in early May is clear evidence that the Dai-ichi NPPs remain a significant source of contamination to the coastal waters off Japan. There are currently no data that allow us to distinguish between several possible sources of continued releases, but these most likely include some combination of direct releases from the reactors or storage tanks, or indirect releases from groundwater beneath the reactors or coastal sediments, both of which are likely contaminated from the period of maximum releases.
  • It is prudent to point out though what is meant by “significant” to both ocean waters and marine biota. With respect to prior concentrations in the waters off Japan, all of these values are elevated many orders of magnitude. 137Cs has been tracked quite extensively off Japan since the peak weapons testing fallout years in the early 1960s.(13) Levels in the region east of Japan have decreased from a few 10s of Bq m–3 in 1960 to 1.5 Bq m–3 on average in 2010 (Figure 2; second x-axis). The decrease in 137Cs over this 50 year record reflects both radioactive decay of 137Cs with a 30 year half-life and continued mixing in the global ocean of 137Cs to depth. These data are characteristic of other global water masses.(14) Typical ocean surface 137Cs activities range from <1 Bq m–3 in surface waters in the Southern Hemisphere, which are lower due to lower weapons testing inputs south of the equator, to >10–100 Bq m–3 in the Irish Sea, North Sea, Black Sea, and Baltic Seas, which are elevated due to local sources from the intentional discharges at the nuclear fuel reprocessing facilities at Sellafield in the UK and Cape de la Hague in France, as well as residual 137Cs from Chernobyl in the Baltic and Black Seas. Clearly then on this scale of significance, levels of 137Cs 30 km off Japan were some 3–4 orders of magnitude higher than existed prior to the NPP accidents at Fukushima.
  • Finally though, while the Dai-ichi NPP releases must be considered “significant” relative to prior sources off Japan, we should not assume that dose effects on humans or marine biota are necessarily harmful or even will be measurable. Garnier-Laplace et al.(1) report a dose reconstruction signal for the most impacted areas to wildlife on land and in the ocean. Like this study, they are relying on reported activities to calculate forest biota concentrations,
    From Woods Hole; note that calculations are based on reports from TEPCO & other Japanese agencies. Quite a bit more to read on the site.
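A simple calculation of the dilution argument made in this entry. It assumes, for illustration only, that ocean mixing keeps diluting the coastal signal by roughly a factor of 1000 per month, as observed between the April peak and May.

# Sketch of the dilution argument: if releases had stopped after the April 6
# peak and mixing kept diluting by ~1000x per month (the April-to-May drop),
# concentrations should have kept falling on the same trajectory.
PEAK_BQ_M3 = 68e6      # Bq/m3, peak 137Cs at the discharge point on April 6
MONTHLY_FACTOR = 1000  # observed drop over the month after the peak

def expected_without_new_source(months_after_peak):
    """Concentration if only dilution continued after the peak."""
    return PEAK_BQ_M3 / MONTHLY_FACTOR**months_after_peak

for month, label in enumerate(["April (peak)", "May", "June", "July"]):
    print(f"{label}: {expected_without_new_source(month):.3g} Bq/m3")
# April 6.8e+07, May 6.8e+04, June 68, July 0.068

# The no-new-source prediction falls below the ~1.5 Bq/m3 pre-accident coastal
# level within a few months, yet measured values in July were still more than
# 10,000 times the 2010 levels, which is the basis for concluding that the
# plant remained a continuing source.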

Electric cars may not be so green after all, says British study [10Jun11]

  • An electric car owner would have to drive at least 129,000 km before producing a net saving in CO2 (a rough reconstruction of this arithmetic appears at the end of this entry). Many electric cars will not travel that far in their lifetime because they typically have a range of less than 145 km on a single charge and are unsuitable for long trips. Even those driven 160,000 km would save only about a tonne of CO2 over their lifetimes.
  • The British study, which is the first analysis of the full lifetime emissions of electric cars covering manufacturing, driving and disposal, undermines the case for tackling climate change by the rapid introduction of electric cars.
  • The Committee on Climate Change, the UK government watchdog, has called for the number of electric cars on Britain's roads to increase from a few hundred now to 1.7 million by 2020.
  • The study was commissioned by the Low Carbon Vehicle Partnership, which is jointly funded by the British government and the car industry. It found that a mid-size electric car would produce 23.1 tonnes of CO2 over its lifetime, compared with 24 tonnes for a similar petrol car. Emissions from manufacturing electric cars are at least 50 per cent higher because batteries are made from materials such as lithium, copper and refined silicon, which require much energy to be processed.
  • Many electric cars are expected to need a replacement battery after a few years. Once the emissions from producing the second battery are added in, the total CO2 from producing an electric car rises to 12.6 tonnes, compared with 5.6 tonnes for a petrol car. Disposal also produces double the emissions because of the energy consumed in recovering and recycling metals in the battery. The study also took into account carbon emitted to generate the grid electricity consumed.
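A rough reconstruction of the break-even arithmetic using only the figures quoted in this entry: a break-even distance of about 129,000 km and a saving of roughly 0.9 tonnes (24.0 minus 23.1 tonnes) for a car driven 160,000 km. The per-km advantage and the implied production-emissions gap are my inferences from those numbers, not figures taken from the study.

# Back-of-the-envelope reconstruction of the break-even claim, using only the
# figures quoted above.  The per-km split is inferred, not from the study.
BREAK_EVEN_KM = 129_000
LIFETIME_KM = 160_000
LIFETIME_SAVING_G = (24.0 - 23.1) * 1e6  # grams of CO2 saved over the lifetime

# In a simple linear model, savings accrue only on kilometres beyond break-even:
per_km_advantage_g = LIFETIME_SAVING_G / (LIFETIME_KM - BREAK_EVEN_KM)
production_gap_g = per_km_advantage_g * BREAK_EVEN_KM

print(f"implied per-km advantage of the electric car: {per_km_advantage_g:.0f} g CO2/km")
print(f"implied extra production/disposal emissions:  {production_gap_g / 1e6:.1f} t CO2")
# Roughly 29 g/km and 3.7 tonnes: consistent with the article's point that
# manufacturing, especially the battery, dominates the comparison, and that the
# limited range (under 145 km) makes high lifetime mileage hard to reach.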

German Nuclear Decommissioning and Renewables Build-Out [23Oct11]

  • Germany will be redirecting its economy towards renewable energy because of the political decision to decommission its nuclear plants, triggered by the Fukushima event in Japan and subsequent public opposition to nuclear energy. Germany's decision would make achieving its 2020 CO2 emission reduction targets more difficult.   To achieve the CO2 emissions reduction targets and replace nuclear energy, renewable energy would need to scale up from 17% in 2010 to 57% of total electricity generation of 603 TWh in 2020, according to a study by The Breakthrough Institute. As electricity generation was 603 TWh in 2010, increased energy efficiency measures will be required to flat-line electricity production during the next 9 years.   Germany has 23 nuclear reactors (21.4 GW); 8 (8.2 GW) are permanently shut down and the remaining 15 (13.2 GW) will be shut down by 2022. Germany will be adding a net of 5 GW of coal plants, 5 GW of new CCGT plants and 1.4 GW of new biomass plants in future years. The CCGT plants will reduce the shortage of quick-ramping generation capacity for accommodating variable wind and solar energy on the grid.
  • Germany is planning a $14 billion build-out of transmission systems for onshore and future offshore wind energy in northern Germany and for augmented transmission with France for CO2-free hydro and nuclear energy imports to avoid any shortages.    Germany had fallen behind on transmission system construction in the north because of public opposition and is using the nuclear plant shutdown as leverage to reduce public opposition. Not only do people have to look at a multitude of 450-ft tall wind turbines, but also at thousands of 80 to 135 ft high steel structures and wires of the transmission facilities.   The $14 billion is just a minor down payment on the major grid reorganization required due to the decommissioning of the nuclear plants and the widely-dispersed build-outs of renewables. The existing grid is mostly based on large central plants.
  • This article includes the estimated capital costs of shutting down Germany's nuclear plants, reorganizing the grids of Germany and its neighbors, and building out renewables to replace the nuclear energy.    Germany's Renewable Energy Act (EEG) of 2000 guarantees investors above-market fees for solar power for 20 years from the point of installation. In 2010, German investment in renewables was about $41.2 billion, of which about $36.1 billion went into 7,400 MW of solar systems ($4,878/kW). In 2010, German incentives for all renewables were about $17.9 billion, of which about half was for solar systems.   The average subsidy in 2010 was about ($9 billion x 1 euro/1.4 $)/12 TWh = 53.6 eurocents/kWh; no wonder solar energy is so popular in Germany. These subsidies are rolled into electric rates as fees or taxes, and will ultimately make Germany less competitive in world markets.   http://thebreakthrough.org/blog//2011/06/analysis_germanys_plan_to_phas-print.html http://mobile.bloomberg.com/news/2011-05-31/merkel-faces-achilles-heel-in-grids-to-unplug-german-nuclear.html http://www.theecologist.org/News/news_analysis/829664/revealed_how_your_country_compares_on_renewable_investment.html http://en.wikipedia.org/wiki/Solar_power_in_Germany
  • SUMMARY OF ESTIMATED CAPITAL AND OTHER COSTS   The estimated capital costs and other costs for decommissioning the nuclear plants, restoring the sites, building out renewables, wind and solar energy balancing plants, and reorganizing electric grids over 9 years are summarized below. The capital cost and subsidy cost for the increased energy efficiency measures were not estimated, but will likely need to be well over $180 billion over 9 years, or $20 billion/yr, or $20 b/($3286 b in 2010) x 100% = 0.6% of GDP, or $250 per person per yr.
    Decommission nuclear plants, restore sites: 23 @ $1 billion/plant = $23 billion
    Wind turbines, offshore: 53,300 MW @ $4,000,000/MW = $213.2 billion
    Wind turbines, onshore: 27,900 MW @ $2,000,000/MW = $55.8 billion
    Wind feed-in tariff extra costs rolled into electric rates over 9 years: $200 billion
    Solar systems: 82,000 MW @ $4,500,000/MW = $369 billion
    Solar feed-in tariff extra costs rolled into electric rates over 9 years: $250 billion
    Wind and solar energy balancing plants: 25,000 MW of CCGTs @ $1,250,000/MW = $31.3 billion
    Reorganizing European electric grids tied to German grids: $150 billion
  • RENEWABLE ENERGY AND ENERGY EFFICIENCY TARGETS   In September 2010 the German government announced the following targets:
    Renewable electricity - 35% by 2020 and 80% by 2050
    Renewable energy - 18% by 2020, 30% by 2030, and 60% by 2050
    Energy efficiency - reducing the national electricity consumption 50% below 2008 levels by 2050
    http://en.wikipedia.org/wiki/Renewable_energy_in_Germany
    Germany has a target to reduce its nation-wide CO2 emissions from all sources by 40% below 1990 levels by 2020 and 80-85% below 1990 levels by 2050. That goal could be achieved if 100% of electricity is generated by renewables, according to Mr. Flasbarth. Germany is aiming to convince the rest of Europe to follow its lead.
  • A 2009 study by EUtech, engineering consultants, concluded Germany will not achieve its nation-wide CO2 emissions target; the actual reduction will be less than 30%. The head of Germany's Federal Environment Agency (UBA), Jochen Flasbarth, is calling for the government to improve CO2 reduction programs to achieve targets. http://www.spiegel.de/international/germany/0,1518,644677,00.html   GERMAN RENEWABLE ENERGY TO-DATE   Germany announced it had 17% of its electrical energy from renewables in 2010; it was 6.3% in 2000. The sources were 6.2% wind, 5.5% biomass, 3.2% hydro and 2.0% solar. Electricity consumption in 2010 was 603 TWh (production) - 60 TWh (assumed losses) = 543 TWh http://www.volker-quaschning.de/datserv/ren-Strom-D/index_e.php  
  • Wind: At the end of 2010, about 27,200 MW of onshore and offshore wind turbines was installed in Germany at a capital cost of about $50 billion. Wind energy produced was 37.5 TWh, or 6.2% of total production. The excess cost of the feed-in-tariff energy bought by utilities and rolled into electricity costs of rate payers was about $50 billion during the past 11 years.   Most wind turbines are in northern Germany. When wind speeds are higher, wind curtailment of 15 to 20 percent takes place because of insufficient transmission capacity and too few quick-ramping gas turbine plants. The onshore wind costs the German economy about 12 eurocent/kWh and the offshore wind about 24 eurocent/kWh. The owners of the wind turbines are compensated for lost production.   The alternative to curtailment is to “sell” the energy at European spot prices of about 5 eurocent/kWh to Norway and Sweden, which have significant hydro capacity for balancing the variable wind energy; Denmark has been doing it for about 20 years.   As Germany is very marginal for onshore wind energy (nation-wide onshore wind CF 0.167) and nearly all of the best onshore wind sites have been used up, or are off-limits due to noise/visual/environmental impacts, most of the additional wind energy will have to come from OFFSHORE facilities which produce wind energy at about 2 to 3 times the cost of onshore wind energy. http://theenergycollective.com/willem-post/61774/wind-energy-expensive
  • Biomass: At the end of 2010, about 5,200 MW of biomass was installed at a capital cost of about $18 billion. Biomass energy produced was 33.5 TWh, or 5.5% of production. Plans are to add 1,400 MW of biomass plants in future years which, when fully implemented, would produce about 8.6 TWh/yr.   Solar: At the end of 2010, about 17,320 MW of PV solar was installed in Germany at a capital cost of about $100 billion. PV solar energy produced was 12 TWh, or 2% of total production. The excess cost of the feed-in-tariff energy bought by utilities and rolled into the electricity costs of rate payers was about $80 billion during the past 11 years.   Most solar panels are in southern Germany (nation-wide solar CF 0.095). When skies are clear, the solar production peaks at about 7 to 10 GW. Because of insufficient transmission capacity and quick-ramping gas turbine capacity, and because curtailment is not possible, part of the solar energy, produced at a cost to the German economy of about 30 to 50 eurocent/kWh, is “sold” at European spot prices of about 5 eurocent/kWh to France, which has significant hydro capacity for balancing the variable solar energy. http://theenergycollective.com/willem-post/46142/impact-pv-solar-feed-tariffs-germany
  • Hydro: At the end of 2010, about 4,700 MW of hydro was installed. Hydro energy produced was 19.5 TWh, or 3.2% of production. Hydro growth has been stagnant during the past 20 years. See the website below.   It took about $150 billion of direct investment, plus about $130 billion in excess energy costs, during the past 11 years to achieve 8.2% of total production from solar and wind energy. Assuming hydro continues to have little growth, as was the case during the past 20 years (almost all hydro sites have been used up), nearly all of the renewables growth by 2020 will come from wind, with the remainder from solar and biomass. http://www.renewableenergyworld.com/rea/news/article/2011/03/new-record-for-german-renewable-energy-in-2010??cmpid=WNL-Wednesday-March30-2011   Wind and Solar Energy Depend on Gas: Wind and solar energy is variable and intermittent. This requires quick-ramping gas turbine plants to operate at part load, ramping up quickly when wind energy ebbs and ramping down quickly when it surges; this happens about 100 to 200 times a day, resulting in increased wear and tear. Such operation is very inefficient for gas turbines, causing them to use extra fuel/kWh and emit extra CO2/kWh, which mostly offsets the claimed fuel and CO2 reductions due to wind energy. http://theenergycollective.com/willem-post/64492/wind-energy-reduces-co2-emissions-few-percent
  • Wind energy is often sold to the public as making a nation energy independent, but Germany will be buying gas mostly from Russia, supplied via the newly constructed pipeline under the Baltic Sea from St. Petersburg to Germany, bypassing Poland.   GERMANY WITHOUT NUCLEAR ENERGY   A study performed by The Breakthrough Institute concluded that, to achieve the 40% CO2 emissions reduction target and decommission 21,400 MW of nuclear power plants by 2022, Germany’s electrical energy mix would have to change from 60% fossil, 23% nuclear and 17% renewables in 2010 to 43% fossil and 57% renewables by 2020. This will require a build-out of renewables, a reorganization of Europe’s electric grids (Europe’s concurrence will be needed) and an acceleration of energy efficiency measures.   According to The Breakthrough Institute, Germany would have to reduce its total electricity consumption by about 22% relative to current 2020 projections AND achieve its target of 35% of electricity generated from renewables by 2020. This would require increased energy efficiency measures to effect an average decrease in the electricity consumption/GDP ratio of 3.92% per year, significantly greater than the 1.47% per year decrease assumed in the IEA's BAU forecast, which is based on projected German GDP growth and current German efficiency policies.
  • The Breakthrough Institute projections are based on electricity consumption of 544 and 532 TWh in 2008 and 2020, respectively; the corresponding production is 604 TWh in 2008 and 592 TWh in 2020.   http://thebreakthrough.org/blog//2011/06/analysis_germanys_plan_to_phas-print.html http://www.iea.org/textbase/nppdf/free/2007/germany2007.pdf   Build-out of Wind Energy: If it is assumed that the current wind-to-solar energy ratio of 3 to 1 is maintained, that the wind energy build-out will be 80% offshore and 20% onshore, and that electricity production will be 592 TWh, then the estimated capital cost of the offshore wind turbines will be [{0.57 (all renewables) - 0.11 (assumed biomass + hydro)} x 592 TWh x 3/4] x 0.8 offshore/(8,760 hr/yr x average CF 0.35) = 0.0533 TW of offshore wind turbines @ $4 trillion/TW = $213 billion, and of the onshore wind turbines will be [{0.57 (all renewables) - 0.11 (assumed biomass + hydro)} x 592 TWh x 3/4] x 0.2 onshore/(8,760 hr/yr x average CF 0.167) = 0.0279 TW of onshore wind turbines @ $2 trillion/TW = $56 billion, for a total of about $270 billion. The feed-in tariff subsidy for 9 years, if maintained similar to existing subsidies to attract adequate capital, will be about $150 billion offshore + $50 billion onshore, for a total of $200 billion.
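As a check on the arithmetic above, the following sketch (an editorial illustration using only the assumptions stated in the text) recomputes the offshore and onshore wind capacities and capital costs; it reproduces roughly $213 billion offshore plus $56 billion onshore, about $270 billion in total.

    # Editorial sketch of the wind build-out arithmetic; all inputs are the article's assumptions.
    HOURS_PER_YEAR = 8760
    production_twh = 592          # projected 2020 production
    renewables_share = 0.57       # assumed renewables share of production
    biomass_hydro_share = 0.11    # assumed biomass + hydro share
    wind_fraction = 3 / 4         # wind:solar energy ratio of 3:1

    wind_twh = (renewables_share - biomass_hydro_share) * production_twh * wind_fraction
    offshore_tw = 0.8 * wind_twh / (HOURS_PER_YEAR * 0.35)    # ~0.0533 TW at CF 0.35
    onshore_tw = 0.2 * wind_twh / (HOURS_PER_YEAR * 0.167)    # ~0.0279 TW at CF 0.167

    offshore_cost_bn = offshore_tw * 4000   # $4 trillion/TW, expressed in $ billion
    onshore_cost_bn = onshore_tw * 2000     # $2 trillion/TW, expressed in $ billion

    print(f"Offshore: {offshore_tw * 1000:.1f} GW, ~${offshore_cost_bn:.0f} billion")
    print(f"Onshore:  {onshore_tw * 1000:.1f} GW, ~${onshore_cost_bn:.0f} billion")
    print(f"Total wind capital cost: ~${offshore_cost_bn + onshore_cost_bn:.0f} billion")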
  • Note: The onshore build-out would at least double Germany’s existing onshore wind turbine capacity, plus the required transmission systems; i.e., significant noise, environmental and visual impacts over large areas.   Recent studies, based on measured, real-time, 1/4-hour grid operations data sets of the Irish, Colorado and Texas grids, show wind energy does little to reduce CO2 emissions. Such data sets became available during the past 2 to 3 years. Prior studies, based on assumptions, estimates, modeling scenarios and statistics, significantly overstate the CO2 reductions.  http://theenergycollective.com/willem-post/64492/wind-energy-reduces-co2-emissions-few-percent   Build-out of PV Solar Energy: The estimated capital cost of the PV solar capacity will be [{0.57 (all renewables) - 0.11 (assumed biomass + hydro)} x 592 TWh x 1/4]/(8,760 hr/yr x average CF 0.095) = 0.082 TW @ $4.5 trillion/TW = $369 billion. The feed-in tariff subsidy, if maintained similar to existing subsidies to attract adequate capital, will be about $250 billion.   Reorganizing Electric Grids: For global-warming reasons, a self-balancing grid system is needed to minimize CO2 emissions from gas-fired CCGT balancing plants. One way to implement it is to enhance the interconnections of the national grids with European-wide HVDC overlay systems (which add owning and O&M costs, including transmission losses), together with European-wide selective curtailment of wind energy, European-wide demand management and pumped hydro storage capacity. These measures will reduce, but not eliminate, the need for balancing energy at greater wind energy penetrations during high-wind-speed weather conditions, such as frequently occur in Iberia (Spain/Portugal).
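The PV solar figure can be checked the same way; again, this is an editorial sketch built only from the assumptions quoted above.

    # Editorial sketch of the PV solar build-out arithmetic (article's assumptions).
    HOURS_PER_YEAR = 8760
    solar_twh = (0.57 - 0.11) * 592 * 0.25            # ~68 TWh/yr of PV solar
    solar_tw = solar_twh / (HOURS_PER_YEAR * 0.095)   # ~0.082 TW at CF 0.095
    solar_cost_bn = solar_tw * 4500                   # $4.5 trillion/TW, in $ billion

    print(f"PV solar: {solar_tw * 1000:.0f} GW, ~${solar_cost_bn:.0f} billion")
    # ~$368 billion; the article rounds to $369 billion.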
  • European-wide agreement is needed; the capital cost will be in excess of $150 billion, and the adverse impacts on quality of life (noise, visual, psychological), property values and the environment will be significant over large areas.    Other Capital Costs: The capacity of the quick-ramping CCGT balancing plants was estimated at 25,000 MW; their capital cost is about 25,000 MW x $1,250,000/MW = $31.3 billion. The capital cost of decommissioning the 23 nuclear plants and restoring their sites will be about $23 billion.   Increased Energy Efficiency: Increased energy efficiency would be more attractive than major build-outs of renewables because it provides the quickest and biggest "bang for the buck": it is invisible, it makes no noise, it has minimal environmental impact, it usually delivers at least 3 times the CO2 reduction, 3 times the jobs and 3 times the energy reduction per invested dollar, and it does all this without public resistance and controversy.   Rebound, i.e., people going back to old habits of wasting energy, is a notion fostered by the PR of proponents of conspicuous consumption, who make money on such consumption. People with little money love cars that get 35-40 mpg and love small electric and heating bills; the rebound occurs mostly among people who do not care about such bills.
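Adding up the capital and subsidy estimates quoted in this section gives a rough overall total. The roll-up below is an editorial summation of the article's own figures; it is not a number the article itself states.

    # Editorial roll-up of the capital and subsidy estimates quoted above, in $ billion.
    capital_bn = {
        "offshore + onshore wind": 213 + 56,
        "PV solar": 369,
        "European grid overlay (at least)": 150,
        "25,000 MW of CCGT balancing plants": 31.3,
        "nuclear decommissioning and site restoration": 23,
    }
    subsidy_bn = {
        "wind feed-in tariffs (9 yr)": 200,
        "solar feed-in tariffs": 250,
    }
    print(f"Capital:   ~${sum(capital_bn.values()):.0f} billion")   # ~$842 billion
    print(f"Subsidies: ~${sum(subsidy_bn.values()):.0f} billion")   # ~$450 billion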
  • A MORE RATIONAL APPROACH   Global warming is a given for many decades, because energy consumption growth in the fast-growing large economies of the non-OECD nations will far outpace that of the slow-growing OECD economies, no matter what the OECD nations do to reduce the CO2 emissions of their own economies.   It is best to PREPARE for the inevitable additional GW by requiring people to move away from flood-prone areas (unless these areas are effectively protected, as in the Netherlands), requiring new houses and other buildings to be constructed to a standard such as the Passivhaus standard* (such buildings stay cool in summer and warm in winter and use 80 to 90 percent less energy than standard buildings), requiring new cars that get at least 50 mpg, and rearranging the world's societies for minimal energy consumption; making them walking/bicycling-friendly would be a good start.   If a nation, such as the US, does not do this, the (owning + O&M) costs of its economy will become so excessive (rising resource prices, increased damage and disruptions from weather events) that its goods and services will become less competitive, and an increasing percentage of its population will not be able to afford a decent living standard.   For example: in the US, the median annual household income (inflation-adjusted) was $49,445, a decline of 7% since 2000. As the world’s population increases to about 10 billion by 2050, triage-style rationing of resources will become more prevalent. http://www.usatoday.com/news/nation/story/2011-09-13/census-household-income/50383882/1
  • * A 2-year-old addition to my house is built to near-Passivhaus standards; its heating system consists of a thermostatically-controlled 1 kW electric heater, set at 500 W, that cycles on/off on the coldest days for less than 100 hours/yr. The addition looks inside and out entirely like standard construction. http://theenergycollective.com/willem-post/46652/reducing-energy-use-houses
  Excellent, lengthy article, lots of data
Dan R.D.

New Study: Fukushima Released Twice as Much Radiation as Official Estimate Claimed | 80... - 0 views

  • The nuclear disaster at the  Fukushima Daiichi power plant this spring may have released twice as much radiation into the atmosphere as the Japanese government estimated, a new preliminary study says. While the government estimates relied mostly on data from monitoring stations in Japan, the European research team behind the new report looked at radioactivity data from stations scattered across the globe. This wider approach factored in the large amounts of radioactivity that were carried out over the Pacific Ocean, which the official tallies didn’t.
  • Overall, the team says, the disaster released about 36,000 terabecquerels of  cesium-137, a radioactive byproduct of nuclear fission, more than twice the 15,000 terabecquerels Japanese authorities estimated—and approximately 42% as much radioactivity as  Chernobyl.
  • ...1 more annotation...
  • The researchers also found that the release of cesium declined sharply when workers started spraying water into the pools holding spent fuel rods at the plant—suggesting that, contrary to the official account, the spent fuel rods had been emitting radiation, and spraying them earlier might have mitigated the fallout.
D'coda Dcoda

Major Study: Reactor No. 5 releases may explain why so much radioactive xenon... - 0 views

  • “Fortunately, due to the maintenance outage and the survival of one diesel generator, it seems that unit 5 reactor cores as well as spent fuel ponds have not suffered major fuel damage,” says the study. However, Reactor No. 5 is mentioned again several pages later: “Total a posteriori [experienced levels] 133Xe emissions are 16.7 EBq, one third more than the a priori value [predicted levels] of 12.6 EBq (which is equal to the estimated inventory) and 2.5 times the estimated Chernobyl source term of 6.5 EBq.”
  • If there was only 12.6 EBq of xenon-133 inventory that could be emitted from reactors 1-3 and spent fuel pool No. 4 — yet 16.7 EBq was experienced — where did the extra xenon come from, according to the study? “There is the possibility of additional releases from unit 5.” Another possibility is that recriticality has occurred in one of the reactor units. The study says the a priori emissions could have been overestimated, but discounts the notion that the initial 12.6 EBq figure so poorly underestimated the amount of xenon in Reactors 1-3 and SFP 4, “It is unlikely that the 133Xe inventories of the reactor units 1–3 were one third higher than estimated.”
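As a quick arithmetic check of the xenon-133 figures quoted above (an editorial calculation, not one the study presents):

    # Editorial check of the xenon-133 budget described in the annotations above.
    a_priori_ebq = 12.6      # estimated inventory of units 1-3 and spent fuel pool 4
    a_posteriori_ebq = 16.7  # emissions inferred from measurements
    chernobyl_ebq = 6.5      # estimated Chernobyl xenon-133 source term

    print(f"Unexplained excess: {a_posteriori_ebq - a_priori_ebq:.1f} EBq")      # ~4.1 EBq
    print(f"Excess over inventory: {a_posteriori_ebq / a_priori_ebq - 1:.0%}")   # ~33%, i.e. 'one third more'
    print(f"Ratio to Chernobyl: {a_posteriori_ebq / chernobyl_ebq:.1f}x")        # ~2.6x; the study rounds to 2.5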
  • ABSTRACT: ACPD – Xenon-133 and caesium-137 releases into the atmosphere from the Fukushima Dai-ichi nuclear power plant: determination of the source term, atmospheric dispersion, and deposition SOURCE: Discussion Paper See also: Report: Fukushima Reactors No. 5, 6 now in crisis — Cesium outside release points up 1,000% in recent days — Local says Hitachi engineers coming to help (VIDEO)
D'coda Dcoda

Nuclear radiation from Fukushima twice more than estimated: report [28Oct11] - 0 views

  • The Fukushima nuclear disaster released twice as much of a radioactive substance into the atmosphere as Japanese authorities estimated, reaching 40% of the total from Chernobyl, a preliminary report says. The estimate of much higher levels of radioactive cesium-137 comes from a worldwide network of sensors. Study author Andreas Stohl of the Norwegian Institute for Air Research says the Japanese government estimate came only from data in Japan, and that would have missed emissions blown out to sea. The study did not consider health implications of the radiation. Cesium-137 is dangerous because it can last for decades in the environment, releasing cancer-causing radiation.
  • The long-term effects of the nuclear accident are unclear because of the difficulty of measuring radiation amounts people received. In a telephone interview, Stohl said emission estimates are so imprecise that finding twice the amount of cesium isn’t considered a major difference. He said some previous estimates had been higher than his. The journal Atmospheric Chemistry and Physics posted the report online for comment, but the study has not yet completed a formal review by experts in the field or been accepted for publication.
  • Last summer, the Japanese government estimated that the March 11 Fukushima accident released 15,000 terabecquerels of cesium. Terabecquerels are a radiation measurement. The new report from Stohl and co-authors estimates about 36,000 terabecquerels through April 20. That’s about 42% of the estimated release from Chernobyl, the report says. An official at the Nuclear and Industrial Safety Agency, the Japanese government branch overseeing such findings, said the agency could not offer any comment on the study because it had not reviewed its contents. It also says about a fifth of the cesium fell on land in Japan, while most of the rest fell into the Pacific Ocean. Only about 2% of the fallout came down on land outside Japan, the report concluded.
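A short editorial sketch of the cesium-137 numbers in this report; the Chernobyl total and the land/ocean split are back-calculated from the stated percentages rather than given directly in the article.

    # Editorial back-calculation from the cesium-137 figures quoted above (TBq = terabecquerels).
    japan_official_tbq = 15_000    # Japanese government estimate
    stohl_estimate_tbq = 36_000    # Stohl et al. estimate through April 20
    chernobyl_tbq = stohl_estimate_tbq / 0.42          # implied Chernobyl total, ~86,000 TBq

    on_japan_land = 0.20 * stohl_estimate_tbq          # ~7,200 TBq fell on land in Japan
    outside_japan = 0.02 * stohl_estimate_tbq          # ~720 TBq fell on land outside Japan
    into_pacific = stohl_estimate_tbq - on_japan_land - outside_japan   # ~28,100 TBq into the Pacific

    print(f"Ratio to the official estimate: {stohl_estimate_tbq / japan_official_tbq:.1f}x")
    print(f"Implied Chernobyl cesium-137 release: ~{chernobyl_tbq:,.0f} TBq")
    print(f"Deposition (TBq): Japan land ~{on_japan_land:,.0f}, "
          f"land abroad ~{outside_japan:,.0f}, Pacific ~{into_pacific:,.0f}")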
  • ...2 more annotations...
  • Experts have no firm projections about how many cancers could result because they’re still trying to find out what doses people received. Some radiation from the accident has also been detected in Tokyo and in the United States, but experts say they expect no significant health consequences there. Still, concern about radiation is strong in Japan. Many parents of small children in Tokyo worry about the discovery of radiation hotspots even though government officials say they don’t pose a health risk. And former Prime Minister Naoto Kan has said the most contaminated areas inside the evacuation zone could be uninhabitable for decades.
  • Stohl also noted that his study found cesium-137 emissions dropped suddenly at the time workers started spraying water on the spent fuel pool from one of the reactors. That challenges previous thinking that the pool wasn’t emitting cesium, he said.
D'coda Dcoda

LED Poised To Light Up The World: Study [20Jan12] - 0 views

  • In the next decade, LED lights will capture more than half of the world’s demand for new lighting, the author of a study on green economic opportunities said last night at a University of Chicago forum.
  • Robert Weissbourd, the president of RW-Ventures, was studying economic development in the Chicago region when he and other authors  of “The Chicago Region’s Green Economic Opportunities” (pdf) realized the potential of LED lighting.
  • “It’s projected that the shift to LED lighting is going to be huge. It’s going to capture 60 percent of the market globally in the next ten years.”
  • ...1 more annotation...
  • That shift will be motivated not only by a global response to climate change, but especially by the economic benefits of LED lights. “They’re clearly a superior product,” Weissbourd said, “but not yet market accepted.”
D'coda Dcoda

U.S. Government Confirms Link Between Earthquakes and Hydraulic Fracturing at Oil Price - 0 views

  • On 5 November an earthquake measuring 5.6 rattled Oklahoma and was felt as far away as Illinois. Until two years ago Oklahoma typically had about 50 earthquakes a year, but in 2010, 1,047 quakes shook the state. Why? In Lincoln County, where most of this past weekend's seismic incidents were centered, there are 181 injection wells, according to Matt Skinner, an official from the Oklahoma Corporation Commission, the agency which oversees oil and gas production in the state. Cause and effect? The practice of injecting water into deep rock formations causes earthquakes, both the U.S. Army and the U.S. Geological Survey have concluded.
  • The U.S. natural gas industry pumps a mixture of water and assorted chemicals deep underground to shatter sediment layers containing natural gas, a process called hydraulic fracturing, known more informally as “fracking.” While environmental groups have primarily focused on fracking’s capacity to pollute underground water, a more ominous byproduct emerges from U.S. government studies – that forcing fluids under high pressure deep underground produces increased regional seismic activity. As the U.S. natural gas industry mounts an unprecedented and expensive advertising campaign to convince the public that such practices are environmentally benign, U.S. government agencies have determined otherwise. According to the U.S. Army’s Rocky Mountain Arsenal website, the RMA drilled a deep well for disposing of the site’s liquid waste after the U.S. Environmental Protection Agency “concluded that this procedure is effective and protective of the environment.”  According to the RMA, “The Rocky Mountain Arsenal deep injection well was constructed in 1961, and was drilled to a depth of 12,045 feet” and 165 million gallons of Basin F liquid waste, consisting of “very salty water that includes some metals, chlorides, wastewater and toxic organics” was injected into the well during 1962-1966.
  • Why was the process halted? “The Army discontinued use of the well in February 1966 because of the possibility that the fluid injection was triggering earthquakes in the area,” according to the RMA. In 1990, the “Earthquake Hazard Associated with Deep Well Injection--A Report to the U.S. Environmental Protection Agency” study of RMA events by Craig Nicholson and R.I. Wesson stated simply, “Injection had been discontinued at the site in the previous year once the link between the fluid injection and the earlier series of earthquakes was established.” Twenty-five years later, “possibility” and “established” changed in the Environmental Protection Agency’s July 2001, 87-page study, “Technical Program Overview: Underground Injection Control Regulations EPA 816-r-02-025,” which reported, “In 1967, the U.S. Army Corps of Engineers and the U.S. Geological Survey (USGS) determined that a deep, hazardous waste disposal well at the Rocky Mountain Arsenal was causing significant seismic events in the vicinity of Denver, Colorado.” There is a significant divergence between “possibility,” “established” and “was causing,” and the most recent report was a decade ago. Much hydraulic fracturing to liberate shale gas in the Marcellus shale has occurred since.
  • ...3 more annotations...
  • According to the USGS website, under the undated heading, “Can we cause earthquakes? Is there any way to prevent earthquakes?” the agency notes, “Earthquakes induced by human activity have been documented in a few locations in the United States, Japan, and Canada. The cause was injection of fluids into deep wells for waste disposal and secondary recovery of oil, and the use of reservoirs for water supplies. Most of these earthquakes were minor. The largest and most widely known resulted from fluid injection at the Rocky Mountain Arsenal near Denver, Colorado. In 1967, an earthquake of magnitude 5.5 followed a series of smaller earthquakes. Injection had been discontinued at the site in the previous year once the link between the fluid injection and the earlier series of earthquakes was established.” Note the phrase, “Once the link between the fluid injection and the earlier series of earthquakes was established.” So both the U.S. Army and the U.S. Geological Survey, over fifty years of research, confirm on a federal level that “fluid injection” introduces subterranean instability and is a contributory factor in inducing increased seismic activity. How about “causing significant seismic events”?
  • Fast forward to the present. Overseas, last month Britain’s Cuadrilla Resources announced that it has discovered huge underground deposits of natural gas in Lancashire, up to 200 trillion cubic feet of gas in all. On 2 November a report commissioned by Cuadrilla Resources acknowledged that hydraulic fracturing was responsible for two tremors which hit Lancashire and possibly as many as fifty separate earth tremors overall. The British Geological Survey also linked smaller quakes in the Blackpool area to fracking. The BGS’s Dr. Brian Baptie said, “It seems quite likely that they are related,” noting, “We had a couple of instruments close to the site and they show that both events occurred near the site and at a shallow depth.” But, back to Oklahoma. Austin Holland’s August 2011 report, “Examination of Possibly Induced Seismicity from Hydraulic Fracturing in the Eola Field, Garvin County, Oklahoma,” Oklahoma Geological Survey OF1-2011, studied 43 earthquakes that occurred on 18 January, ranging in magnitude from 1.0 to 2.8 Md (duration magnitude). While the report’s conclusions are understandably cautious, it does state, “Our analysis showed that shortly after hydraulic fracturing began small earthquakes started occurring, and more than 50 were identified, of which 43 were large enough to be located.”
  • Sensitized to the issue, the oil and natural gas industry has been quick to dismiss the charges and deluge the public with a plethora of television advertisements about how natural gas from shale deposits is not only America’s future but also provides jobs, and how energy companies are responsible custodians of the environment. It seems likely that Washington will eventually be forced to address the issue, as the U.S. Army and the USGS have noted a causal link between the forced injection of liquids underground and increased seismic activity. While the Oklahoma quake caused a good deal of property damage, had lives been lost, the policy would most certainly have come under increased scrutiny from the legal community. While polluting a local community’s water supply is a local tragedy barely heard inside the Beltway, an earthquake felt from Oklahoma to Illinois, Kansas, Arkansas, Tennessee and Texas is an issue that might yet shake voters out of their torpor, and national elections are slightly less than a year away.
D'coda Dcoda

IAA says 'Yes We Can' to power plants in orbit [15Nov11] - 0 views

  • Scientists from around the world have completed a study that says harvesting the sun's energy in space can turn out to be a cost effective way of delivering the world’s needs for power in as little as 30 years. As important, the report says that orbiting power plants capable of collecting energy from the sun and beaming it to earth are technically feasible within a decade or so based on technologies now in the laboratory.
  • These are findings in a report from the International Academy of Astronautics, headquartered in Paris. What their time references refer to are that the very technology needed to satisfy global energy requirements may be available in only 10 to 20 years, and the project can show cost-effectiveness in about 30 years. The IAA's three-year, ten-nation study, as the first broadly based international assessment of collecting solar energy in space, is considered significant. The study was conducted from 2008 to 2010 and was under peer review. John Mankins, the former head of concepts at NASA, led the study. The concept centers on placing one, then several, then many, solar-powered satellites in orbit over the equator. Each would be several miles wide. The satellites would collect sunlight up to 24 hours a day
  • The power would be converted to electricity in space, then sent to where it was needed on earth by a microwave-transmitting antenna or by lasers, and then fed into a power grid. Who would bear the cost of such an effort? The report recommends that both governments and the private sector should fund the research needed to further determine viability. A pilot project to demonstrate the technology could proceed using low-cost expendable launch vehicles being developed for other space markets, said Mankins, according to Reuters. A moderate-scale demonstration would cost tens of billions of dollars less than previously projected as a result of not needing costly, reusable launch vehicles early on.
D'coda Dcoda

U.S. nuke regulators weaken safety rules [20Jun11] - 0 views

  • Federal regulators have been working closely with the nuclear power industry to keep the nation's aging reactors operating within safety standards by repeatedly weakening standards or simply failing to enforce them, an investigation by The Associated Press has found. Officials at the U.S. Nuclear Regulatory Commission regularly have decided original regulations were too strict, arguing that safety margins could be eased without peril, according to records and interviews. The result? Rising fears that these accommodations are undermining safety -- and inching the reactors closer to an accident that could harm the public and jeopardize nuclear power's future.
  • Examples abound. When valves leaked, more leakage was allowed -- up to 20 times the original limit. When cracking caused radioactive leaks in steam generator tubing, an easier test was devised so plants could meet standards. Failed cables. Busted seals. Broken nozzles, clogged screens, cracked concrete, dented containers, corroded metals and rusty underground pipes and thousands of other problems linked to aging were uncovered in AP's yearlong investigation. And many of them could escalate dangers during an accident.
  • Despite the problems, not a single official body in government or industry has studied the overall frequency and potential impact on safety of such breakdowns in recent years, even as the NRC has extended dozens of reactor licenses. Industry and government officials defend their actions and insist no chances are being taken. But the AP investigation found that with billions of dollars and 19 percent of America's electricity supply at stake, a cozy relationship prevails between industry and the NRC. Records show a recurring pattern: Reactor parts or systems fall out of compliance. Studies are conducted by industry and government, and all agree existing standards are "unnecessarily conservative."
  • ...14 more annotations...
  • Regulations are loosened, and reactors are back in compliance. "That's what they say for everything ...," said Demetrios Basdekas, a retired NRC engineer. "Every time you turn around, they say, 'We have all this built-in conservatism.' " The crisis at the decades-old Fukushima Dai-ichi nuclear facility in Japan has focused attention on nuclear safety and prompted the NRC to look at U.S. reactors. A report is due in July. But the factor of aging goes far beyond issues posed by Fukushima.
  • Commercial nuclear reactors in the United States were designed and licensed for 40 years. When the first were built in the 1960s and 1970s, it was expected that they would be replaced with improved models long before their licenses expired. That never happened. The 1979 accident at Three Mile Island, massive cost overruns, crushing debt and high interest rates halted new construction in the 1980s. Instead, 66 of the 104 operating units have been relicensed for 20 more years. Renewal applications are under review for 16 other reactors. As of today, 82 reactors are more than 25 years old. The AP found proof that aging reactors have been allowed to run less safely to prolong operations.
  • Last year, the NRC weakened the safety margin for acceptable radiation damage to reactor vessels -- for a second time. The standard is based on a reactor vessel's "reference temperature," which predicts when it will become dangerously brittle and vulnerable to failure. Through the years, many plants have violated or come close to violating the standard. As a result, the minimum standard was relaxed first by raising the reference temperature 50 percent, and then 78 percent above the original -- even though a broken vessel could spill radioactive contents. "We've seen the pattern," said nuclear safety scientist Dana Powers, who works for Sandia National Laboratories and also sits on an NRC advisory committee. "They're ... trying to get more and more out of these plants."
  • Sharpening the pencil: The AP study collected and analyzed government and industry documents -- some never-before released -- of both reactor types: pressurized water units that keep radioactivity confined to the reactor building and the less common boiling water types like those at Fukushima, which send radioactive water away from the reactor to drive electricity-generating turbines. The Energy Northwest Columbia Generating Station north of Richland is a boiling water design that's a newer generation than the Fukushima plants. Tens of thousands of pages of studies, test results, inspection reports and policy statements filed during four decades were reviewed. Interviews were conducted with scores of managers, regulators, engineers, scientists, whistleblowers, activists and residents living near the reactors at 65 sites, mostly in the East and Midwest.
  • AP reporters toured some of the oldest reactors -- Oyster Creek, N.J., near the Atlantic coast 50 miles east of Philadelphia and two at Indian Point, 25 miles north of New York City on the Hudson River. Called "Oyster Creak" by some critics, this boiling water reactor began running in 1969 and is the country's oldest operating commercial nuclear power plant. Its license was extended in 2009 until 2029, though utility officials announced in December they will shut the reactor 10 years earlier rather than build state-ordered cooling towers. Applications to extend the lives of pressurized water units 2 and 3 at Indian Point, each more than 36 years old, are under NRC review. Unprompted, several nuclear engineers and former regulators used nearly identical terminology to describe how industry and government research has frequently justified loosening safety standards. They call it "sharpening the pencil" or "pencil engineering" -- fudging calculations and assumptions to keep aging plants in compliance.
  • "Many utilities are doing that sort of thing," said engineer Richard T. Lahey Jr., who used to design nuclear safety systems for General Electric Co., which makes boiling water reactors. "I think we need nuclear power, but we can't compromise on safety. I think the vulnerability is on these older plants." Added Paul Blanch, an engineer who left the industry over safety issues, but later returned to work on solving them: "It's a philosophical position that (federal regulators) take that's driven by the industry and by the economics: What do we need to do to let those plants continue to operate?" Publicly, industry and government say that aging is well under control. "I see an effort on the part of this agency to always make sure that we're doing the right things for safety. I'm not sure that I see a pattern of staff simply doing things because there's an interest to reduce requirements -- that's certainly not the case," NRC chairman Gregory Jaczko said in an interview.
  • Neil Wilmshurst, director of plant technology for the industry's Electric Power Research Institute, acknowledged the industry and NRC often collaborate on research that supports rule changes. But he maintained there's "no kind of misplaced alliance ... to get the right answer." Yet agency staff, plant operators and consultants paint a different picture: * The AP reviewed 226 preliminary notifications -- alerts on emerging safety problems -- NRC has issued since 2005. Wear and tear in the form of clogged lines, cracked parts, leaky seals, rust and other deterioration contributed to at least 26 of the alerts. Other notifications lack detail, but aging was a probable factor in 113 more, or 62 percent in all. For example, the 39-year-old Palisades reactor in Michigan shut Jan. 22 when an electrical cable failed, a fuse blew and a valve stuck shut, expelling steam with low levels of radioactive tritium into the outside air. And a 1-inch crack in a valve weld aborted a restart in February at the LaSalle site west of Chicago.
  • * A 2008 NRC report blamed 70 percent of potentially serious safety problems on "degraded conditions" such as cracked nozzles, loose paint, electrical problems or offline cooling components. * Confronted with worn parts, the industry has repeatedly requested -- and regulators often have allowed -- inspections and repairs to be delayed for months until scheduled refueling outages. Again and again, problems worsened before being fixed. Postponed inspections inside a steam generator at Indian Point allowed tubing to burst, leading to a radioactive release in 2000. Two years later, cracking grew so bad in nozzles on the reactor vessel at the Davis-Besse plant near Toledo, Ohio, that it came within two months of a possible breach, an NRC report said, which could release radiation. Yet inspections failed to catch the same problem on the replacement vessel head until more nozzles were found to be cracked last year.
  • Time crumbles things: Nuclear plants are fundamentally no more immune to aging than our cars or homes. Metals grow weak and rusty, concrete crumbles, paint peels, crud accumulates. Big components like 17-story-tall concrete containment buildings or 800-ton reactor vessels are all but impossible to replace. Smaller parts and systems can be swapped but still pose risks as a result of weak maintenance and lax regulation or hard-to-predict failures. Even mundane deterioration can carry harsh consequences. For example, peeling paint and debris can be swept toward pumps that circulate cooling water in a reactor accident. A properly functioning containment building is needed to create air pressure that helps clear those pumps. But a containment building could fail in a severe accident. Yet the NRC has allowed safety calculations that assume the buildings will hold.
  • In a 2009 letter, Mario V. Bonaca, then-chairman of the NRC's Advisory Committee on Reactor Safeguards, warned that this approach represents "a decrease in the safety margin" and makes a fuel-melting accident more likely. Many photos in NRC archives -- some released in response to AP requests under the federal Freedom of Information Act -- show rust accumulated in a thick crust or paint peeling in long sheets on untended equipment. Four areas stand out:
  • Brittle vessels: For years, operators have rearranged fuel rods to limit gradual radiation damage to the steel vessels protecting the core and keep them strong enough to meet safety standards. But even with last year's weakening of the safety margins, engineers and metal scientists say some plants may be forced to close over these concerns before their licenses run out -- unless, of course, new regulatory compromises are made.
  • Leaky valves: Operators have repeatedly violated leakage standards for valves designed to bottle up radioactive steam in an earthquake or other accident at boiling water reactors. Many plants have found they could not adhere to the general standard allowing main steam isolation valves to leak at a rate of no more than 11.5 cubic feet per hour. In 1999, the NRC decided to allow individual plants to seek amendments of up to 200 cubic feet per hour for all four steam valves combined. But plants have violated even those higher limits. For example, in 2007, Hatch Unit 2, in Baxley, Ga., reported combined leakage of 574 cubic feet per hour.
  • Cracked tubing: The industry has long known of cracking in steel alloy tubing used in the steam generators of pressurized water reactors. Ruptures have been common in these tubes containing radioactive coolant; in 1993 alone, there were seven. As many as 18 reactors still run on old generators. Problems can arise even in a newer metal alloy, according to a report of a 2008 industry-government workshop.
  • Corroded piping: Nuclear operators have failed to stop an epidemic of leaks in pipes and other underground equipment in damp settings. Nuclear sites have suffered more than 400 accidental radioactive leaks, the activist Union of Concerned Scientists reported in September. Plant operators have been drilling monitoring wells and patching buried piping and other equipment for several years to control an escalating outbreak. But there have been failures. Between 2000 and 2009, the annual number of leaks from underground piping shot up fivefold, according to an internal industry document.
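As a rough editorial comparison of the main-steam-isolation-valve leakage figures in the "Leaky valves" item above (this is not a calculation the AP makes):

    # Editorial comparison of the valve leakage limits quoted in the AP excerpt.
    original_per_valve_cfh = 11.5    # original leakage limit per main steam isolation valve
    amended_combined_cfh = 200       # 1999 amendment ceiling for all four valves combined
    hatch_2007_cfh = 574             # combined leakage reported at Hatch Unit 2 in 2007

    print(f"Amended combined limit vs original per-valve limit: "
          f"{amended_combined_cfh / original_per_valve_cfh:.0f}x")    # ~17x, i.e. 'up to 20 times'
    print(f"Hatch Unit 2 (2007) vs amended combined limit: "
          f"{hatch_2007_cfh / amended_combined_cfh:.1f}x")            # ~2.9x over the relaxed limit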
D'coda Dcoda

TVA's Environmental and Energy Future - Relies on Nuclear Power and Less on Coal [17Sep10] - 0 views

  • The Tennessee Valley Authority on Thursday issued a draft of its Integrated Resource Plan, a comprehensive study that will help guide efforts to meet regional electricity needs over the next 20 years. Titled "TVA's Environmental and Energy Future," the study analyzes potential combinations of economic and regulatory trends in the coming years and provides recommendations for addressing them. The plan's main purpose is to help TVA meet the region's future energy challenges in ways that maintain reliable power supplies, competitive prices, improved environmental performance and continued financial strength.
  • TVA's yearlong analysis included input from numerous stakeholders including state agencies, power distributors, environmental groups, universities and the general public. The study yielded several likely probabilities for TVA, including: Nuclear expansion will continue, with the potential to eventually overtake coal as the leading electricity source; TVA may idle a portion of its coal generation fleet, as coal units become older and less economical under tighter regulations; Energy efficiency and demand response, as well as renewable generation, will play an increasing role in future resource options; Natural gas capacity additions will be a viable resource option and a key source of generation flexibility for TVA; The intensity of TVA's carbon dioxide, nitrogen oxide, sulfur dioxide and mercury emissions will continue to decrease.
  • Using the study's methodology, TVA examined seven possible long-term scenarios for the next two decades, based on factors such as economic growth, inflation, fuel prices and the regulatory environment. They are: dramatic economic recovery; environmental focus becoming a greater national priority; prolonged economic malaise; introduction of game-changing energy-related technology; greater U.S. energy independence; carbon regulation creating an economic downturn; and the current approach/baseline.
  • ...4 more annotations...
  • The Integrated Resource Plan process also developed various possible strategies that TVA might use to meet the region's future power needs. Each strategy was analyzed to create 20-year power generation portfolios -- or combinations of electricity resources -- for TVA to consider. Each portfolio was rated using factors such as cost, risk and environmental impact
  • "TVA's Integrated Resource Plan process is a rigorous one that is supportive of TVA's renewed vision and will guide the corporation as it leads the region and the nation toward a cleaner and more secure energy future, relying more on nuclear power and energy efficiency and less on coal," said Van Wardlaw, TVA's executive vice president of Enterprise Relations, who is leading the Integrated Resource Plan effort
  • The TVA Board of Directors has adopted a renewed vision for the federal corporation to be one of the nation's leading providers of cleaner low-cost energy by 2020, increasing its use of nuclear power and energy efficiency and improving its environmental performance
  • TVA completed its previous Integrated Resource Plan, titled "Energy Vision 2020," in 1995. The new plan will update the earlier study, based upon changes in regulations and legislation, the marketplace for electric generating utilities and customer demand.
D'coda Dcoda

Researching Safer Nuclear Energy [09Aug11] - 0 views

  • On Tuesday, the Energy Department, handing out research grants in all kinds of energy fields that are low in carbon dioxide emissions, is announcing that it will give $39 million to university programs around the country to try to solve various nuclear problems.
  • The money will go to a variety of projects at 31 universities in 20 states. Several focus on nuclear waste.
  • Two researchers at Clemson University, for example, will get $1 million to study the behavior of particles of nuclear waste when buried in clay in metal canisters that have rusted. One open question, according to the researchers, is how a high temperature, which would be generated by the waste itself, affects the interactions. These are important to understanding how the waste would spread over time. The goal is to “reduce uncertainty” about the life expectancy of atomic particles.
  • ...5 more annotations...
  • With the cancellation of the Yucca Mountain nuclear waste repository in Nevada, many nuclear operators are loading older fuel into sealed metal casks filled with inert gas. The Massachusetts Institute of Technology will get a grant to study how such “dry casks” perform in salt environments.
  • “Storage casks will be stored mostly in coastal or lakeside regions where a salt air environment exists,’’ a summary of the grant says. Cracking related to corrosion could occur in 30 years or less, and the Nuclear Regulatory Commission is studying whether the casks can be used for 100 years as some hope.
  • Another important concern in the nuclear power field is the aging of reactors. Researchers at Pennsylvania State University will get $456,000 to plan a system that will use ultrasonic waves to look for cracks and other defects in hot metal parts. The idea is to find “microscale” defects that lead to big cracks.
  • Some of the work is aimed at helping to improve new reactors. For example, a researcher at the University of Houston, with collaborators at two other universities, will study a “base isolation system” that would protect reactors against earthquakes.
  • In an earthquake, the ground moves back and forth at a certain frequency, similar to the way a gong struck by a mallet vibrates at a given frequency. But plants could be built atop materials with “frequency band gaps,” that do not vibrate at the frequency that is characteristic of earthquakes, the Energy Department suggests.
D'coda Dcoda

The Environmental Case for Nuclear Energy - Korea [26Sep11] - 0 views

  • Six months after the Fukushima disaster, the repercussions of history’s second-largest nuclear meltdown are still being felt, not only in Japan but around the world. Predictably, people are rethinking the wisdom of relying on nuclear power. The German and Swiss governments have pledged to phase out the use of nuclear power, and Italy has shelved plans to build new reactors. Public debate on future nuclear energy use continues in the United Kingdom, Japan, Finland, and other countries.So far, it is unclear what the reaction of the Korean government will be. Certainly, the public backlash to nuclear energy that has occurred elsewhere in the world is also evident in Korea; according to one study, opposition to nuclear energy in Korea has tripled since the Fukushima disaster. However, there are countervailing considerations here as well, which have caused policy-makers to move cautiously. Korea’s economy is often seen as particularly reliant on the use of nuclear power due to its lack of fossil fuel resources, while Korean companies are some of the world’s most important builders (and exporters) of nuclear power stations.
  • There are three primary reasons why nuclear power is safer and greener than power generated using conventional fossil fuels. First ― and most importantly ― nuclear power does not directly result in the emission of greenhouse gases. Even when you take a life-cycle approach and factor in the greenhouse gas emissions from the construction of the plant, there is no contest. Fossil fuels ― whether coal, oil, or natural gas ― create far more global warming.
  • The negative effects of climate change will vastly outweigh the human and environmental consequences of even a thousand Fukushimas. This is not the place to survey all the dire warnings that have been coming out of the scientific community; suffice it to quote U.N. Secretary General Ban Ki-moon’s concise statement that climate change is the world’s “only one truly existential threat … the great moral imperative of our era.” A warming earth will not only lead to death and displacement in far-off locales, either. Typhoons are already hitting the peninsula with greater intensity due to the warming air, and a recent study warns that global warming will cause Korea to see greatly increased rates of contagious diseases such as cholera and bacillary dysentery.
  • ...5 more annotations...
  • As the world’s ninth largest emitter of greenhouse gases, it should be (and is) a major priority for Korea to reduce emissions, and realistically that can only be accomplished by increasing the use of nuclear power. As Barack Obama noted with regard to the United States’ energy consumption, “Nuclear energy remains our largest source of fuel that produces no carbon emissions. It’s that simple. (One plant) will cut carbon pollution by 16 million tons each year when compared to a similar coal plant. That’s like taking 3.5 million cars off the road.” Environmentalists have traditionally disdained nuclear power, but even green activists cannot argue with that logic, and increasing numbers of them ― Patrick Moore, James Lovelock, Stewart Brand and the late Bishop Hugh Montefiore being prominent examples ― have become supporters of the smart use of nuclear power.
  • Second, the immediate dangers to human health of conventional air pollution outweigh the dangers of nuclear radiation. In 2009, the Seoul Metropolitan Government measured an average PM10 (particulate) concentration in the city of 53.8 µg/m3, a figure that is roughly twice the level in other developed nations. According to the Gyeonggi Research Institute, PM10 pollution leads to 10,000 premature deaths per year in and around Seoul, while the Korea Economic Institute has estimated its social cost at 10 trillion won. While sulfur dioxide levels in the region have decreased significantly since the 1980s, the concentration of nitrogen dioxide in the air has not decreased, and ground-level ozone levels remain high. Unlike fossil fuels, nuclear power does not result in the release of any of these dangerous pollutants that fill the skies around Seoul, creating health hazards that are no less serious for often going unnoticed.
  • And third, the environmental and safety consequences of extracting and transporting fossil fuels are far greater than those involved with the production of nuclear power. Korea is one of the largest importers of Indonesian coal for use in power plants, for example. This coal is not always mined with a high level of environmental and safety protections, with a predictable result of air, water, and land pollution in one of Asia’s most biologically sensitive ecosystems. Coal mining is also one of the world’s more dangerous occupations, as evidenced by the many tragic disasters involving poorly managed Chinese mines. While natural gas is certainly a better option than coal, its distribution too can be problematic, whether by ship or through the recently proposed pipeline that would slice down through Siberia and North Korea to provide direct access to Russian gas.
  • What about truly green renewable energy, some might ask ― solar, wind, geothermal, hydroelectric, and tidal energy? Of course, Korea would be a safer and more sustainable place if these clean renewable resources were able to cover the country’s energy needs. However, the country is not particularly well suited for hydroelectric projects, while the other forms of renewable energy production are expensive, and are unfortunately likely to remain so for the foreseeable future. The fact is that most Koreans will not want to pay the significantly higher energy prices that would result from the widespread use of clean renewables, and in a democratic society, the government is unlikely to force them to do so. Thus, we are left with two realistic options: fossil fuels or nuclear. From an environmental perspective, it would truly be a disaster to abandon the latter.
  • By Andrew Wolman, an assistant professor at the Hankuk University of Foreign Studies Graduate School of International and Area Studies, where he teaches international law and human rights.
Dan R.D.

More Green Madness On the Plains [25Aug11] - 0 views

  • The proposed Keystone XL pipeline will carry oil from tar sands in Canada across the entire midwestern United States to Port Arthur, Texas. It could eventually transport 900,000 barrels of oil a day and, without government funding of any kind, has the potential to create 20,000 jobs starting early in 2012. The greens want President Obama to kill it, of course; the political blindness and wishful thinking that so frequently vitiate green policy proposals are fully on display.
  • I will only point to a study by the Canadian Association of Petroleum Producers: “Oil sands crude is six per cent more GHG intensive than the U.S. crude supply average on a wells-to-wheels basis.” Only 6 percent. Yes, that study comes from the oil industry; the green studies and the oil company studies are both suspect and need outside review.
  • The Washington Post wants to throw the greens under the bus on this one: “Tar sands crude is not appealing; it is low-grade, it is hard to extract, it is difficult to refine and it produces a lot of carbon emissions. But if it is to be burned anyway, there’s little reason for America to reject it, as long as Keystone XL can transport it across the plains safely.”