
Open Intelligence / Energy: Group items tagged "nature"


D'coda Dcoda

Did Fukushima kill the nuclear renaissance? No, that renaissance died right here at home... - 0 views

shared by D'coda Dcoda on 04 Nov 11
  • In the aftermath of the Fukushima Daiichi nuclear disaster in Japan, many wondered what the event’s impact would be on the nuclear renaissance in the United States. Those who follow the nuclear industry didn’t need eight months of hindsight to give an answer: what nuclear renaissance? The outlook for U.S. nuclear power has worsened considerably in the past five years. Where once there were plans for new reactors at more than 30 different sites, today there are only five, and even those planned reactors might disappear. Only one is actually under construction, and to credit the industry with breaking ground on a new reactor is overstating its prospects. However, none of this gloom is the result of Japan’s tsunami. On the eve of the Tohoku earthquake, U.S. nuclear power looked just as moribund as it is today. The cause of this decline is not renewed concerns about safety, or even that old red herring, waste disposal — instead, it is simple economics. Other technologies, particularly natural gas, offer much cheaper power than nuclear both today and in the foreseeable future.
  • In 2009, the MIT Future of Nuclear Power study released an update to its 2003 estimate of the costs of nuclear power. Estimating a capital cost of $4,000/kW and a fuel cost of $0.67/MMBtu, the study’s authors projected a cost of new nuclear power of 6.6 cents/kWh. Using the same modeling approach, the cost of electricity from a natural gas plant with capital costs of $850/kW and fuel costs of $5.16/MMBtu would be 4.4 cents/kWh. What’s worse, the estimate of 6.6 cents/kWh assumes that nuclear power is able to secure financing at the same interest rate as natural gas plants. In reality, credit markets assign a significant risk premium to nuclear power, bringing its total levelized cost of electricity to 8.4 cents/kWh, nearly twice the cost of natural gas power. Unless the capital costs of new nuclear power plants turn out to be significantly less than what experts expect, or natural gas prices rise considerably in the near future, there is little reason to believe that any new nuclear plants will be built without significant subsidies. This is not to say that nuclear power could not make a comeback within the next 10 to 20 years. But before nuclear can once again be considered a credible competitor to fossil fuels, four changes must happen.
  • The second problem facing nuclear power is its high borrowing costs. To some extent, this problem is a natural consequence of nuclear power plants taking a longer time to build than natural gas plants and having a much higher construction risk (the capital cost of natural gas plants is well-established relative to that of nuclear power). And likewise, to some extent, this problem might resolve itself over time, both as the completion of nuclear plants helps nail down the true capital cost of nuclear power, and as vendors add smaller, modular reactor designs to their list of offerings. But much of the reason behind the high interest rates on loans to nuclear construction is that the industry is scoring an own-goal. In the current relationship between utilities and reactor vendors, utilities are asked to absorb all of the costs of a vendor’s overruns — if a reactor ends up costing a couple billion dollars more than the vendor quotes, it’s the utility that is expected to make up the difference.
  • This is terrifying for a utility’s creditors. The largest utilities in the United States have market capitalizations in the area of $30 billion, while most hover closer to $5 billion. If a nuclear project should fail, the utility might go completely bankrupt, leaving nothing to those foolish enough to lend them money. Accordingly, nuclear projects face higher borrowing costs than other electric projects. It doesn’t have to be this way — if reactor vendors and construction companies helped share the project risks posed by nuclear plants, borrowing costs would be lower. It is also possible for the U.S. government to shoulder some of the risk — but after Solyndra, few legislators have an appetite for letting energy companies push their risks onto the taxpayer.
  • Next, the United States is going to have to adopt some form of carbon tax on electricity generation, or offer a comparable subsidy to the nuclear industry. An appropriately sized carbon tax of $20/ton CO2 would raise the cost of natural-gas-generated electricity by 0.7 cents/kWh, while having a negligible impact on nuclear power. (A back-of-the-envelope sketch of this cost arithmetic follows this list of annotations.)
  • And finally, the nuclear industry is just going to have to catch some luck and see natural gas prices rise. That’s a tall order, given the new resources being opened up by hydraulic fracturing and the slowed consumption of natural gas brought about by the recession. But it’s not entirely outside of the realm of possibility — the futures market for natural gas has been wrong before.
  • Nuclear power is down, but not out. With a proper R&D focus, good business practices, appropriate policy, and a little luck, the gulf that separates nuclear power from its competitors may yet be bridged.
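To make the cost comparison in the annotations above easier to follow, here is a back-of-the-envelope sketch of a simplified levelized-cost calculation. The capital costs ($4,000/kW and $850/kW), fuel costs ($0.67/MMBtu and $5.16/MMBtu), and the $20/ton CO2 tax come from the annotations; the fixed charge rates, heat rates, capacity factors, and O&M adders are illustrative guesses rather than values from the MIT study, so the sketch only lands in the neighborhood of the cited 6.6, 8.4, and 4.4 cents/kWh rather than reproducing them.

```python
# Back-of-the-envelope levelized cost sketch for the nuclear vs. natural gas
# comparison quoted above. Capital and fuel costs come from the annotations;
# the fixed charge rates, heat rates, capacity factors, and O&M adders are
# illustrative assumptions, not values taken from the MIT study.

def lcoe_cents_per_kwh(capital_per_kw, fixed_charge_rate, capacity_factor,
                       fuel_per_mmbtu, heat_rate_btu_per_kwh, om_cents):
    """Very simplified levelized cost of electricity, in cents/kWh."""
    kwh_per_kw_year = 8760 * capacity_factor
    capital_cents = capital_per_kw * fixed_charge_rate / kwh_per_kw_year * 100
    fuel_cents = fuel_per_mmbtu * heat_rate_btu_per_kwh / 1_000_000 * 100
    return capital_cents + fuel_cents + om_cents

# Nuclear at gas-like financing vs. with a credit-market risk premium.
nuclear_cheap_money = lcoe_cents_per_kwh(4000, 0.100, 0.90, 0.67, 10400, 0.8)
nuclear_risk_premium = lcoe_cents_per_kwh(4000, 0.135, 0.90, 0.67, 10400, 0.8)

# Combined-cycle natural gas plant.
gas = lcoe_cents_per_kwh(850, 0.10, 0.85, 5.16, 6800, 0.2)

# A $20/ton CO2 tax at roughly 0.35 tons CO2 per MWh of gas generation adds
# about 0.7 cents/kWh, matching the annotation; nuclear is barely affected.
gas_with_carbon_tax = gas + 20 * 0.35 / 10

print(f"nuclear, cheap financing: {nuclear_cheap_money:.1f} c/kWh")   # ~6.6
print(f"nuclear, risk premium:    {nuclear_risk_premium:.1f} c/kWh")  # ~8.3 (study: 8.4)
print(f"gas combined cycle:       {gas:.1f} c/kWh")                   # ~4.9 (study: 4.4)
print(f"gas + $20/ton CO2 tax:    {gas_with_carbon_tax:.1f} c/kWh")
```

The point of the sketch is only to show how strongly the result depends on the financing (fixed charge) rate and the fuel price, which is exactly the argument the annotations make.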
D'coda Dcoda

Fracking - energy revolution or skillfully marketed mirage? [27Jun11] - 0 views

  • The New York Times published an article on Sunday, June 26, 2011 titled Insiders Sound an Alarm Amid a Natural Gas Rush. The article quotes a number of emails from natural gas industry insiders, financial analysts that cover the gas industry and skeptical geologists to produce a number of questions about the long term viability of an increasing dependence on cheap natural gas from hydraulic fracturing. The message is that the gas industry has been engaging in hyperbole regarding its capacity to expand production at current prices to meet market demands.
  • the people quoted in the NY Times article do not agree that the technique magically produces low cost gas in unprecedented abundance.
  • “Our engineers here project these wells out to 20-30 years of production and in my mind that has yet to be proven as viable,” wrote a geologist at Chesapeake in a March 17 e-mail to a federal energy analyst. “In fact I’m quite skeptical of it myself when you see the % decline in the first year of production.”
  • “In these shale gas plays no well is really economic right now,” the geologist said in a previous e-mail to the same official on March 16. “They are all losing a little money or only making a little bit of money.”
  • Around the same time the geologist sent the e-mail, Mr. McClendon, Chesapeake’s chief executive, told investors, “It’s time to get bullish on natural gas.”
  • Aubrey McClendon, whose name is not terribly familiar to people outside of the energy industry, has an enormous financial interest in encouraging customers to become addicted to natural gas so that they will keep buying even if the price shoots up – like it did in the period from 2000-2008. During that time McClendon and his company rode a wave that resulted in growing a company from tiny to huge based on debt-financed investments in leases and drilling rigs designed to produce gas in the midcontinent region of the US. A high portion of the company’s wells were stimulated with hydraulic fracturing.
  • When the price of natural gas collapsed in 2008, mostly as a result of the contraction in demand caused by the financial crisis and resulting economic recession/depression, McClendon nearly lost control of his company. He had to sell “substantially all” of his shares at a dramatically lowered price in order to pay off creditors and meet margin calls.
  • No U.S. chief executive officer has bought more of his own company’s stock in recent years than McClendon, even as the shares reached all-time highs. His appetite for Chesapeake stock made him “a darling of Wall Street,” Tulsa money manager Jake Dollarhide said. But his purchases were made on margin, meaning he used borrowed money. As the value of the stock fell, McClendon was forced to raise cash to meet margin calls. Recent losses — Chesapeake shares have plummeted 60 percent in the past three weeks — left him unable to fulfill those requirements. Read more: http://newsok.com/market-slide-wipes-out-ceos-chesapeake-holdings/article/3310107
  • McClendon responded vigorously to the NY Times’s suggestion that the gas revolution was more mirage than miracle in a lengthy letter to Chesapeake Energy employees that was published on the company’s public Facebook page. (Note: The timing of this letter with regard to the NY Times article is telling. The article appeared in the Sunday edition of the Times on June 26, 2011. The letter to employees included a time stamp indicating that it was released at 8:37 pm on the same day while the Facebook page indicates that it was posted to the world by 11:27 pm. In other words – there is no rest for the weary in the Internet era.)
  • McClendon’s letter blamed the NY Times article on environmental activists that proclaim a desire to supply all of the US energy needs from wind and solar energy. It also issued a call to action for Chesapeake Energy employees:
  • We hope that every Chesapeake employee can be part of our public education outreach. At more than 11,000 strong, we are an army of “factivists” – people who have knowledge of the facts and the personal knowledge and ability to spread them. You can do this by talking to your families, friends and others in your spheres of influence (schools, churches, civic organizations, etc) about the kind of company you work for and the integrity of what we do every day for our shareholders, our communities, our states, our nation, our economy and our environment. You don’t have to be an expert to stand up and tell folks that Chesapeake is committed to doing what’s right – and that commitment is expressed every day by you and your colleagues across the company.
  • You can also get involved by joining Chesapeake Fed PAC, our political action committee. Our opponents are extremely well funded and organized. We need to make sure our voice is heard in Washington, DC and with elected officials who are making decisions that affect our industry, our company and our ability to operate in the many states in which shale gas and oil have been discovered.
  • After describing how Chesapeake has 125 active drilling rigs and how it has developed a “swat team” with more than 100 employees that works with environmental groups to produce legislation designed to slow the development of new coal fired power plants and to hasten the closure of existing coal plants, Tom Price said the following:
  • “It’s been said before, but the demand side of the equation is extremely important right now. I mean this really is a zero sum game. I think that there are a number of very progressive utilities out there that recognize the challenges that they are facing with regard to climate change, but the Transport Rule, Clean Air Act and various others.”
  • I remain convinced that there is a market battle going on between natural gas and nuclear energy.
D'coda Dcoda

U.S. Government Confirms Link Between Earthquakes and Hydraulic Fracturing at Oil Price - 0 views

  • On 5 November an earthquake measuring 5.6 rattled Oklahoma and was felt as far away as Illinois. Until two years ago Oklahoma typically had about 50 earthquakes a year, but in 2010, 1,047 quakes shook the state. Why? In Lincoln County, where most of this past weekend's seismic incidents were centered, there are 181 injection wells, according to Matt Skinner, an official from the Oklahoma Corporation Commission, the agency which oversees oil and gas production in the state. Cause and effect? The practice of injecting water into deep rock formations causes earthquakes, both the U.S. Army and the U.S. Geological Survey have concluded.
  • The U.S. natural gas industry pumps a mixture of water and assorted chemicals deep underground to shatter sediment layers containing natural gas, a process called hydraulic fracturing, known more informally as “fracking.” While environmental groups have primarily focused on fracking’s capacity to pollute underground water, a more ominous byproduct emerges from U.S. government studies – that forcing fluids under high pressure deep underground produces increased regional seismic activity. As the U.S. natural gas industry mounts an unprecedented and expensive advertising campaign to convince the public that such practices are environmentally benign, U.S. government agencies have determined otherwise. According to the U.S. Army’s Rocky Mountain Arsenal website, the RMA drilled a deep well for disposing of the site’s liquid waste after the U.S. Environmental Protection Agency “concluded that this procedure is effective and protective of the environment.”  According to the RMA, “The Rocky Mountain Arsenal deep injection well was constructed in 1961, and was drilled to a depth of 12,045 feet” and 165 million gallons of Basin F liquid waste, consisting of “very salty water that includes some metals, chlorides, wastewater and toxic organics” was injected into the well during 1962-1966.
  • Why was the process halted? The Army discontinued use of the well in February 1966 because of the possibility that the fluid injection was “triggering earthquakes in the area,” according to the RMA. In 1990, the “Earthquake Hazard Associated with Deep Well Injection--A Report to the U.S. Environmental Protection Agency” study of RMA events by Craig Nicholson and R.I. Wesson stated simply, “Injection had been discontinued at the site in the previous year once the link between the fluid injection and the earlier series of earthquakes was established.” Twenty-five years later, “possibility” and “established” changed in the Environmental Protection Agency’s 87-page July 2001 study, “Technical Program Overview: Underground Injection Control Regulations EPA 816-r-02-025,” which reported, “In 1967, the U.S. Army Corps of Engineers and the U.S. Geological Survey (USGS) determined that a deep, hazardous waste disposal well at the Rocky Mountain Arsenal was causing significant seismic events in the vicinity of Denver, Colorado.” There is a significant divergence between “possibility,” “established” and “was causing,” and the most recent report was a decade ago. Much hydraulic fracturing to liberate shale oil and gas in the Marcellus shale has occurred since.
  • According to the USGS website, under the undated heading, “Can we cause earthquakes? Is there any way to prevent earthquakes?” the agency notes, “Earthquakes induced by human activity have been documented in a few locations in the United States, Japan, and Canada. The cause was injection of fluids into deep wells for waste disposal and secondary recovery of oil, and the use of reservoirs for water supplies. Most of these earthquakes were minor. The largest and most widely known resulted from fluid injection at the Rocky Mountain Arsenal near Denver, Colorado. In 1967, an earthquake of magnitude 5.5 followed a series of smaller earthquakes. Injection had been discontinued at the site in the previous year once the link between the fluid injection and the earlier series of earthquakes was established.” Note the phrase, “Once the link between the fluid injection and the earlier series of earthquakes was established.” So both the U.S. Army and the U.S. Geological Survey, over fifty years of research, confirm at the federal level that “fluid injection” introduces subterranean instability and is a contributory factor in inducing increased seismic activity. How about “causing significant seismic events”?
  • Fast forward to the present. Overseas, last month Britain’s Cuadrilla Resources announced that it has discovered huge underground deposits of natural gas in Lancashire, up to 200 trillion cubic feet of gas in all. On 2 November a report commissioned by Cuadrilla Resources acknowledged that hydraulic fracturing was responsible for two tremors which hit Lancashire and possibly as many as fifty separate earth tremors overall. The British Geological Survey also linked smaller quakes in the Blackpool area to fracking. The BGS’s Dr. Brian Baptie said, “It seems quite likely that they are related,” noting, “We had a couple of instruments close to the site and they show that both events occurred near the site and at a shallow depth.” But, back to Oklahoma. Austin Holland’s August 2011 report, “Examination of Possibly Induced Seismicity from Hydraulic Fracturing in the Eola Field, Garvin County, Oklahoma” (Oklahoma Geological Survey OF1-2011), studied 43 earthquakes that occurred on 18 January, ranging in magnitude from 1.0 to 2.8 Md (duration magnitude). While the report’s conclusions are understandably cautious, it does state, “Our analysis showed that shortly after hydraulic fracturing began small earthquakes started occurring, and more than 50 were identified, of which 43 were large enough to be located.”
  • Sensitized to the issue, the oil and natural gas industry has been quick to dismiss the charges and deluge the public with a plethora of television advertisements about how natural gas from shale deposits is not only America’s future but provides jobs, and how energy companies are responsible custodians of the environment. It seems likely that Washington will eventually be forced to address the issue, as the U.S. Army and the USGS have noted a causal link between the forced injection of liquids underground and increased seismic activity. While the Oklahoma quake caused a good deal of property damage, had lives been lost, the policy would most certainly have come under increased scrutiny from the legal community. While polluting a local community’s water supply is a local tragedy barely heard inside the Beltway, an earthquake ranging from Oklahoma to Illinois, Kansas, Arkansas, Tennessee and Texas is an issue that might yet shake voters out of their torpor, and national elections are slightly less than a year away.
D'coda Dcoda

EPA Finds Compound Used in Fracking in Wyoming Aquifer [10Nov11] - 0 views

  • As the country awaits results from a nationwide safety study on the natural gas drilling process of fracking, a separate government investigation into contamination in a place where residents have long complained [1] that drilling fouled their water has turned up alarming levels of underground pollution. A pair of environmental monitoring wells drilled deep into an aquifer in Pavillion, Wyo., contain high levels of cancer-causing compounds and at least one chemical commonly used in hydraulic fracturing, according to new water test results [2] released yesterday by the Environmental Protection Agency.
  • The findings are consistent with water samples the EPA has collected from at least 42 homes in the area since 2008, when ProPublica began reporting [3] on foul water and health concerns in Pavillion and the agency started investigating reports of contamination there. Last year -- after warning residents not to drink [4] or cook with the water and to ventilate their homes when they showered -- the EPA drilled the monitoring wells to get a more precise picture of the extent of the contamination.
  • The Pavillion area has been drilled extensively for natural gas over the last two decades and is home to hundreds of gas wells. Residents have alleged for nearly a decade [1] that the drilling -- and hydraulic fracturing in particular -- has caused their water to turn black and smell like gasoline. Some residents say they suffer neurological impairment [5], loss of smell, and nerve pain they associate with exposure to pollutants. The gas industry -- led by the Canadian company EnCana, which owns the wells in Pavillion -- has denied that its activities are responsible for the contamination. EnCana has, however, supplied drinking water to residents.
  • The information released yesterday by the EPA was limited to raw sampling data: The agency did not interpret the findings or make any attempt to identify the source of the pollution. From the start of its investigation, the EPA has been careful to consider all possible causes of the contamination and to distance its inquiry from the controversy around hydraulic fracturing. Still, the chemical compounds the EPA detected are consistent with those produced from drilling processes, including one -- a solvent called 2-Butoxyethanol (2-BE) -- widely used in the process of hydraulic fracturing. The agency said it had not found contaminants such as nitrates and fertilizers that would have signaled that agricultural activities were to blame.
D'coda Dcoda

Smoking Gun - Jan Lundberg antinuclear activist & heir to petroleum wealth [18Jul11] - 0 views

  • A ‘smoking gun’ article is one that reveals a direct connection between a fossil fuel or alternative energy system promoter and a strongly antinuclear attitude. One of my guiding theories about energy is that a great deal of the discussion about safety, cost, and waste disposal is really a cover for a normal business activity of competing for market share.
  • This weekend, I came across a site called Culture Change that provides some strong support for my theory about the real source of strength for the antinuclear industry. According to the information at the bottom of the home page, Culture Change was founded by Sustainable Energy Institute (formerly Fossil Fuels Policy Action), a nonprofit organization. Jan Lundberg, who has led the organization and its predecessor organizations since 1988, grew up in a wealthy family with a father who was a popular and respected petroleum industry analyst.
  • As Oil Guru, Dan [Lundberg, my father] earned a regular Nightly Business Report commentary spot on the Public Broadcasting System television network in the early and mid-1980s. I helped edit or proof-read just about every one of those commentaries, and we delighted in the occasional opportunity to attack gasohol and ethanol for causing “agricultural strip mining” (as we did in the Lundberg Letter).
  • Before entering the non-profit world, he went into the family business of oil industry analysis and claims to have achieved a fair amount of financial success. As Lundberg tells the tale, he stopped “punching the corporate time clock” in 1988 to found Fossil Fuels Policy Action: I had just learned about peak oil. Upon my press conference announcing the formation of Fossil Fuels Policy Action, USA Today’s headline was “Lundberg Lines up with Nature.” My picture with the story looked like I was a corporate fascist, not an acid-tripping hippie. The USA Today story led to an invitation to review Beyond Oil: The Threat to Food and Fuel in the Coming Decades, for the quarterly Population and Environment journal. In learning for the first time about peak oil (although I had questioned long-term growth in petroleum supplies), I was awakened to the bigger picture as never before. Natural gas was no answer. And I already knew that the supply crisis to come — I had helped predict the 1970s oil shocks — was to be a liquid fuels crisis.
  • Lundberg tells an interesting story about his initial fundraising activities for his new non-profit group: Setting out to become a clearinghouse for energy data and policy, we had a tendency to go along with the buzzword “natural gas as a bridge fuel” — especially when my previous clients serving the petroleum industry until 1988 included natural gas utilities. They were and are represented by the American Gas Association, where I knew a few friendly executives. Upon starting a nonprofit group for the environment with an energy focus, I met with the AGA right away. I was anticipating one of the generous grants they were giving large environmental groups who were trumpeting the “natural gas is a bridge fuel” mantra.
  • I slept on it and decided that I would not participate in this corrupt conspiracy. Instead, I had fun writing one of Fossil Fuels Policy Action’s first newsletters about this “bridge” argument and the background story that the gas industry was really competing with fuel oil for heating. I brought up the AGA’s funding for enviros and said I was rejecting it. I was crazy, I admit, for I was starting a new career with almost no savings and no guarantees. So I was not surprised when my main contact at AGA called me up and snarled, “Jan, are you on acid?!”
  • Here is a quote from his July 10, 2011 post titled “Nuclear Roulette: new book puts a nail in coffin of nukes”: Culture Change went beyond studying the problem soon after its founding in 1988: action and advocacy must get to the root of the crises to assure a livable future. Also, information overload and a diet of bad news kills much activism. So it’s hard to find reading material to strongly recommend. But the new book Nuclear Roulette: The Case Against the “Nuclear Renaissance” is a must-have if one is fighting nukes today.
  • He goes on to say the following: The uneconomic nature of nuclear power, and the lack of energy gain compared to cheap oil, are two huge reasons for society to quit flirting with more nuclear power, never mind the catastrophic record and certainty of more to come. Somehow the evidence and true track record of dozens of accidents and perhaps 300,000 to nearly 1,000,000 deaths from just Chernobyl are brushed aside by corporate media and most governments. So, imaginative means of helping to end nuclear proliferation are crucial, the most careful and reasonable-sounding ones being included in summary form in Nuclear Roulette.
D'coda Dcoda

The Intermittency of Fossil Fuels & Nuclear [19Aug11] - 0 views

  • You’ve likely heard this argument before: “The wind doesn’t always blow and the sun doesn’t always shine, so we can’t rely on renewable energy.” However, a series of recent events undermine the false dichotomy that renewable energies are unreliable and that coal, nuclear and natural gas are reliable.
  • There are too many reasons to list in a single blogpost why depending on fossil and nuclear energies is dangerous, but one emerging trend is that coal, natural gas and even nuclear energy are not as reliable as they are touted to be. Take for instance the nuclear disaster still unfolding in Japan. On March 11, that country experienced a massive earthquake and the resulting tsunami knocked out several nuclear reactors on the coast. Three days later, an operator of a nearby wind farm in Japan restarted its turbines - turbines that were intentionally turned off  immediately after the earthquake. Several countries, including France and Germany, are now considering complete phase-outs of nuclear energy in favor of offshore wind energy in the aftermath of the Japanese disaster. Even China has suspended its nuclear reactor plans while more offshore wind farms are being planned off that country’s coast.
  • In another example much closer to home, here in the Southeast, some of TVA’s nuclear fleet is operating at lower levels due to extreme temperatures. When the water temperatures in the Tennessee River reach more than 90 degrees, the TVA Browns Ferry nuclear reactors cannot discharge the already-heated power plant water into the river. If water temperatures become too high in a natural body of water, like a river, the ecosystem can be damaged and fish kills may occur. This problem isn’t limited to nuclear power plants either.
  • Texas has been experiencing a terrible heat wave this summer - along with much of the rest of the country. According to the Dallas Morning News, this heat wave has caused more than 20 power plants to shut down, including coal and natural gas plants. On the other hand, Texan wind farms have been providing a steady, significant supply of electricity during the heat wave, in part because wind farms require no water to generate electricity. The American Wind Energy Association (AWEA) noted on their blog: “Wind plants are keeping the lights on and the air conditioners running for hundreds of thousands of homes in Texas.”
  • This near-threat of a blackout is not a one-time or seasonal ordeal for Texans. Earlier this year, when winter storms were hammering the Lone Star State, rolling blackouts occurred due to faltering fossil fuel plants. In February, 50 power plants failed and wind energy helped pick up the slack.
  • Although far from the steady winds of the Great Plains, Cape Wind Associates noted that if their offshore wind farm was already operational, the turbines would have been able to harness the power of the heat wave oppressing the Northeast, mostly at full capacity. Cape Wind, vying to be the nation’s first offshore wind farm, has a meteorological tower stationed off Nantucket Sound in Massachusetts. If Cape Wind had been built, it could have been using these oppressive heat waves to operate New England’s cooling air conditioners. These three examples would suggest that the reliability of fossil fuels and nuclear reactors has been overstated, as has the variability of wind.
  • So just how much electricity can wind energy realistically supply as a portion of the nation’s energy? A very thorough report completed by the U.S. Department of Energy in 2008 (completed during President George W. Bush’s tenure) presents one scenario where wind energy could provide 20% of the U.S.’s electrical power by 2030. To achieve this level, the U.S. Department of Energy estimates energy costs would increase only 50 cents per month per household. A more recent study, the Eastern Wind Integration and Transmission Study (EWITS), shows that wind could supply 30% of the Eastern Interconnect’s service area (all of the Eastern U.S. from Nebraska eastward) with the proper transmission upgrades. As wind farms become more spread out across the country, and are better connected to each other via transmission lines, the variability of wind energy further decreases. If the wind isn’t blowing in Nebraska, it may be blowing in North Carolina, or off the coast of Georgia and the electricity generated in any state can then be transported across the continent. A plan has been hatched in the European Union to acquire 50% of those member states’ electricity from wind energy by 2050 - mostly from offshore wind farms, spread around the continent and heavily connected with transmission lines.
  • With a significant amount of wind energy providing electricity in the U.S., what would happen if the wind ever stops blowing? Nothing really - the lights will stay on, refrigerators will keep running and air conditioners will keep working. As it so happens, wind energy has made the U.S. electrical supply more diversified and protects us against periodic shutdowns from those pesky, sometimes-unreliable fossil fuel power plants and nuclear reactors.
D'coda Dcoda

Lifetime Cumulative Limit of Internal Radiation from Food to Be 100 Millisieverts in Ja... - 0 views

  • External radiation is not counted in this number, as opposed to their draft plan in July which did include external radiation, and it is in addition to the natural radiation exposure (by which is meant pre-Fukushima natural). The experts on the Commission didn't rule on the radiation limit for children, leaving the decision to the Ministry of Health and Labor as if the top-school career bureaucrats in the Ministry would know better. Yomiuri and other MSMs are spinning it as "tightening" the existing provisional safety limits on food. From Yomiuri Shinbun (10/27/2011):
  • The Food Safety Commission under the Cabinet Office has been deliberating on the health effect of internal radiation exposure from the radioactive materials in food. On October 27, it submitted its recommendation to set the upper limit on lifetime cumulative radiation from food at 100 millisieverts.
  • On receiving the recommendation, the Ministry of Health and Labor will start setting the detailed guidelines for each food item. They are expected to be stricter than the provisional safety limits set right after the Fukushima I Nuclear Plant accident. The Radiation Commission under the Ministry of Education will review the guidelines to be set by the Ministry of Health and Labor, and the new safety limits will be formally decided.
  • According to the draft of the recommendation in July, the Food Safety Commission was aiming at setting "100 millisieverts lifetime limit" that would include the external radiation exposure from the nuclides in the air. However, based on the opinions from the general public, the Commission decided that the effect of external radiation exposure was small and focused only on internal radiation exposure from food.
  • If we suppose one's lifetime is 100 years, then 1 millisievert per year would be the maximum. The current provisional safety limit assumes the upper limit of 5 millisievert per year with radioactive cesium alone. So the new regulations will inevitably be stricter than the current provisional safety limits.
  • In addition, the Commission pointed out that children "are more susceptible to the effect of radiation", but it didn't cite any specific number for children. The Commission explained that it would be up to the Ministry of Health and Labor and other agencies to discuss whether the effect on children should be reflected in the new safety limits. Oh boy. So many holes in the article. First, I suspect it is a rude awakening for many Japanese to know that the current provisional safety limits for radioactive materials in food presuppose a very high internal radiation level already. The Yomiuri article correctly says 5 millisieverts per year from radioactive cesium alone. The provisional safety limit for radioactive iodine, though now it's almost irrelevant, is 2,000 becquerels/kg, and that presupposes 2 millisieverts per year internal radiation. From cesium and iodine alone, the provisional safety limits on food assume 7 millisieverts per year internal radiation.
  • (The reason why the radioactive iodine limit is set lower than that for radioactive cesium is because radioactive iodine all goes to the thyroid gland and accumulates in that organ.) I am surprised that Yomiuri even mentioned the 5 millisieverts per year limit from cesium exposure alone. I suspect it is the first time ever for the paper. Second, the article says the Commission decided to exclude external radiation from the "100 millisieverts" number because of public opinion. Which "public" opinion are they talking about? Mothers and fathers with children? I doubt it. If anything, the general public (at least those who don't believe radiation is good for them) would want to include external radiation so that the overall radiation limit is set, rather than just for food.
  • Third, and most importantly, if the proposed lifetime limit of 100 millisieverts is only for internal radiation from FOOD, then the overall internal radiation could be much higher. Why? Because, pre-Fukushima, the natural internal radiation from food in Japan was only 0.41 millisievert per year (mostly from K-40), or 28% of the total natural radiation exposure of 1.45 millisieverts per year (average). Of internal radiation exposure, inhaling radon accounts for 0.45 millisievert per year in Japan, as opposed to the world average of 1.2 millisieverts per year. Now, these so-called experts in the government commission are saying the internal radiation from food can be 1 millisievert per year (assuming a life of 100 years), in addition to the natural internal radiation from food (K-40), which is 0.41 millisievert per year. Then, you will have to add internal exposure from inhaling radioactive materials IN ADDITION TO radon, which is 0.45 millisievert per year. (A short arithmetic sketch adding up these figures follows this list of annotations.)
  • Winter in the Pacific Ocean side of east Japan is dry, particularly in Kanto. North wind kicks up dust, and radioactive materials in the dust will be kicked up. The Tokyo metropolitan government will be burning away the radioactive debris from Iwate Prefecture (Miyagi's to follow) into the wintry sky. So-called "decontamination" efforts all over east Japan will add more radioactive particles in the air for people to breathe in.
  • For your information, a comparison of natural radiation exposure levels (the world vs. Japan) is given in the Nuclear Safety Research Association Handbook on treating acute radiation injury (original in Japanese). Japan has (or had) markedly lower radon inhalation than the world average, and much lower external radiation from the ground and from cosmic rays. It makes it all up by overusing medical X-rays and CT scans, and even the Nuclear Safety Research Association, which issued that comparison, says Japan tends to use too many X-rays and scans and that medical professionals should make an effort not to overuse them.
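Because the annotations above juggle several per-year figures, here is a minimal arithmetic sketch that simply restates them in one place: the proposed 100 mSv lifetime cap spread over an assumed 100-year life, the 5 + 2 mSv/year implied by the provisional cesium and iodine limits, and the pre-Fukushima natural internal doses. All numbers are the ones quoted in the text; treating the components as simply additive is the blogger's own simplification, not new data.

```python
# Arithmetic sketch of the radiation-budget figures quoted above.
# All numbers come from the annotations; the 100-year lifetime and the
# simple addition of components are the text's own simplifications.

LIFETIME_CAP_FROM_FOOD_MSV = 100.0   # proposed lifetime cap, food only
ASSUMED_LIFETIME_YEARS = 100.0

# Proposed cap works out to about 1 mSv/year from food.
new_food_cap_per_year = LIFETIME_CAP_FROM_FOOD_MSV / ASSUMED_LIFETIME_YEARS

# Current provisional limits imply 5 mSv/yr (cesium) + 2 mSv/yr (iodine).
provisional_food_per_year = 5.0 + 2.0

# Pre-Fukushima natural exposure in Japan (annual averages, per the text).
natural_food_k40 = 0.41   # internal, from food, mostly K-40
natural_radon = 0.45      # internal, inhaled radon
natural_total = 1.45      # all natural sources combined

print(f"new cap, food only:              {new_food_cap_per_year:.2f} mSv/yr")
print(f"provisional limits (Cs + I):     {provisional_food_per_year:.2f} mSv/yr")
print(f"natural internal (food + radon): {natural_food_k40 + natural_radon:.2f} mSv/yr")
print(f"natural background, all sources: {natural_total:.2f} mSv/yr")

# The blog's point: the new 1 mSv/yr food allowance sits on top of the
# natural internal dose, so total internal exposure could still exceed
# the headline number even before counting inhaled fallout.
print(f"food allowance + natural internal: "
      f"{new_food_cap_per_year + natural_food_k40 + natural_radon:.2f} mSv/yr")
```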
D'coda Dcoda

The nuclear power plans that have survived Fukushima [28Sep11] - 0 views

  • SciDev.Net reporters from around the world tell us which countries are set on developing nuclear energy despite the Fukushima accident. The quest for energy independence, rising power needs and a desire for political weight all mean that few developing countries with nuclear ambitions have abandoned them in the light of the Fukushima accident. Jordan's planned nuclear plant is part of a strategy to deal with acute water and energy shortages.
  • The Jordan Atomic Energy Commission (JAEC) wants Jordan to get 60 per cent of its energy from nuclear by 2035. Currently, obtaining energy from neighbouring Arab countries costs Jordan about a fifth of its gross domestic product. The country is also one of the world's most water-poor nations. Jordan plans to desalinate sea water from the Gulf of Aqaba to the south, then pump it to population centres in Amman, Irbid, and Zarqa, using its nuclear-derived energy. After the Fukushima disaster, Jordan started re-evaluating safety procedures for its nuclear reactor, scheduled to begin construction in 2013. The country also considered more safety procedures for construction and in ongoing geological and environmental investigations.
  • "The government would not reverse its decision to build nuclear reactors in Jordan because of the Fukushima disaster," says Abdel-Halim Wreikat, vice chairman of the JAEC. "Our plant type is a third-generation pressurised water reactor, and it is safer than the Fukushima boiling water reactor." Wreikat argues that "the nuclear option for Jordan at the moment is better than renewable energy options such as solar and wind, as they are still of high cost." But some Jordanian researchers disagree. "The cost of electricity generated from solar plants comes down each year by about five per cent, while the cost of producing electricity from nuclear power is rising year after year," says Ahmed Al-Salaymeh, director of the Energy Centre at the University of Jordan. He called for more economic feasibility studies of the nuclear option.
  • And Ahmad Al-Malabeh, a professor in the Earth and Environmental Sciences department of Hashemite University, adds: "Jordan is rich not only in solar and wind resources, but also in oil shale rock, from which we can extract oil that can cover Jordan's energy needs in the coming years, starting between 2016 and 2017 ... this could give us more time to have more economically feasible renewable energy."
  • Finance, rather than Fukushima, may delay South Africa's nuclear plans, which were approved just five days after the Japanese disaster. South Africa remains resolute in its plans to build six new nuclear reactors by 2030. Katse Maphoto, the director of Nuclear Safety, Liabilities and Emergency Management at the Department of Energy, says that the government conducted a safety review of its two nuclear reactors in Cape Town, following the Fukushima event.
  • Vietnam's nuclear energy targets remain ambitious despite scientists' warning of a tsunami risk. Vietnam's plan to power 10 per cent of its electricity grid with nuclear energy within 20 years is the most ambitious nuclear energy plan in South-East Asia. The country's first nuclear plant, Ninh Thuan, is to be built with support from a state-owned Russian energy company and completed by 2020. Le Huy Minh, director of the Earthquake and Tsunami Warning Centre at Vietnam's Institute of Geophysics, has warned that Vietnam's coast would be affected by tsunamis in the adjacent South China Sea.
  • Larkin says nuclear energy is the only alternative to coal for generating adequate electricity. "What other alternative do we have? Renewables are barely going to do anything," he said. He argues that nuclear is capable of supplying 85 per cent of the base load, or constantly needed, power supply, while solar energy can only produce between 17 and 25 per cent. But, despite government confidence, Larkin says that a shortage of money may delay the country's nuclear plans.
  • "The government has said yes but hasn't said how it will pay for it. This is going to end up delaying by 15 years any plans to build a nuclear station."
  • The Ninh Thuan nuclear plant would sit 80 to 100 kilometres from a fault line on the Vietnamese coast, potentially exposing it to tsunamis, according to state media. But Vuong Huu Tan, president of the state-owned Vietnam Atomic Energy Commission, told state media in March that lessons from the Fukushima accident will help Vietnam develop safe technologies. And John Morris, an Australia-based energy consultant who has worked as a geologist in Vietnam, says the seismic risk for nuclear power plants in the country would not be "a major issue" as long as the plants were built properly. Japan's nuclear plants are "a lot more earthquake prone" than Vietnam's would be, he adds.
  • Undeterred by Fukushima, Nigeria is forging ahead with nuclear collaborations. There is no need to panic because of the Fukushima accident, says Shamsideen Elegba, chair of the Forum of Nuclear Regulatory Bodies in Africa. Nigeria has the necessary regulatory system to keep nuclear activities safe. "The Nigerian Nuclear Regulatory Authority [NNRA] has established itself as a credible organisation for regulatory oversight on all uses of ionising radiation, nuclear materials and radioactive sources," says Elegba who was, until recently, the NNRA's director general.
  • Vietnam is unlikely to experience much in the way of anti-nuclear protests, unlike neighbouring Indonesia and the Philippines, where civil society groups have had more influence, says Kevin Punzalan, an energy expert at De La Salle University in the Philippines. Warnings from the Vietnamese scientific community may force the country's ruling communist party to choose alternative locations for nuclear reactors, or to modify reactor designs, but probably will not cause extreme shifts in the one-party state's nuclear energy strategy, Punzalan tells SciDev.Net.
  • Will the Philippines' plans to rehabilitate a never-used nuclear power plant survive the Fukushima accident? The Philippines is under a 25-year moratorium on the use of nuclear energy which expires in 2022. The government says it remains open to harnessing nuclear energy as a long-term solution to growing electricity demand, and its Department of Science and Technology has been making public pronouncements in favour of pursuing nuclear energy since the Fukushima accident. Privately, however, DOST officials acknowledge that the accident has put back their job of winning the public over to nuclear by four or five years.
  • In the meantime, the government is trying to build capacity. The country lacks, for example, the technical expertise. Carmencita Bariso, assistant director of the Department of Energy's planning bureau, says that, despite the Fukushima accident, her organisation has continued with a study on the viability, safety and social acceptability of nuclear energy. Bariso says the study would include a proposal for "a way forward" for the Bataan Nuclear Power Plant, the first nuclear reactor in South East Asia at the time of its completion in 1985. The $2.3-billion Westinghouse light water reactor, about 60 miles north of the capital, Manila, was never used, though it has the potential to generate 621 megawatts of power. President Benigno Aquino III, whose mother, President Corazon Aquino, halted work on the facility in 1986 because of corruption and safety issues, has said it will never be used as a nuclear reactor but could be privatised and redeveloped as a conventional power plant.
  • But Mark Cojuangco, former lawmaker, authored a bill in 2008 seeking to start commercial nuclear operations at the Bataan reactor. His bill was not passed before Congress adjourned last year and he acknowledges that the Fukushima accident has made his struggle more difficult. "To go nuclear is still the right thing to do," he says. "But this requires a societal decision. We are going to spark public debates with a vengeance as soon as the reports from Fukushima are out." Amended bills seeking both to restart the reactor, and to close the issue by allowing either conversion or permanent closure, are pending in both the House and the Senate. Greenpeace, which campaigns against nuclear power, believes the Fukushima accident has dimmed the chances of commissioning the Bataan plant because of "increased awareness of what radioactivity can do to a place". Many parts of the country are prone to earthquakes and other natural disasters, which critics say makes it unsuitable both for the siting of nuclear power stations and the disposal of radioactive waste.
  • In Kenya, nuclear proponents argue for a geothermal-nuclear mix. In the same month as the Fukushima accident, inspectors from the International Atomic Energy Agency approved Kenya's application for its first nuclear power station (31 March), a 35,000 megawatt facility to be built at a cost of Sh950 billion (US$9.8 billion) on a 200-acre plot on the Athi Plains, about 50km from Nairobi.
  • The plant, with construction driven by Kenya's Nuclear Electricity Project Committee, should be commissioned in 2022. The government claims it could satisfy all of Kenya's energy needs until 2040. The demand for electricity is overwhelming in Kenya. Less than half of residents in the capital, Nairobi, have grid electricity, while the rural rate is two per cent. James Rege, Chairman of the Parliamentary Committee on Energy, Communication and Information, takes a broader view than the official government line, saying that geothermal energy from the Rift Valley project is the most promising option. It has a high production cost but remains the country's "best hope". Nuclear should be included as "backup". "We are viewing nuclear energy as an alternative source of power. The cost of fossil fuel keeps escalating and ordinary Kenyans can't afford it," Rege tells SciDev.Net.
  • Hydropower is limited by rivers running dry, he says. And switching the country's arable land to biofuel production would threaten food supplies. David Otwoma, secretary to the Energy Ministry's Nuclear Electricity Development Project, agrees that Kenya will not be able to industrialise without diversifying its energy mix to include more geothermal, nuclear and coal. Otwoma believes the expense of generating nuclear energy could one day be met through shared regional projects but, until then, Kenya has to move forward on its own. According to Rege, much as the nuclear energy alternative is promising, it is extremely important to take into consideration the Fukushima accident. "Data is available and it must be one step at a time without rushing things," he says. Otwoma says that, as a newcomer to nuclear power, Kenya can develop a good nuclear safety culture from the outset, "but to do this we need to be willing to learn all the lessons and embrace them, not forget them and assume that won't happen to us".
  • But the government adopted its Integrated Resource Plan (IRP) for 2010-2030 five days after the Fukushima accident. Elliot Mulane, communications manager for the South African Nuclear Energy Corporation (NECSA), a public company established under the 1999 Nuclear Energy Act that promotes nuclear research, said the timing of the decision indicated "the confidence that the government has in nuclear technologies". And Dipuo Peters, energy minister, reiterated the commitment in her budget announcement earlier this year (26 May), saying: "We are still convinced that nuclear power is a necessary part of our strategy that seeks to reduce our greenhouse gas emissions through a diversified portfolio, comprising some fossil-based, renewable and energy efficiency technologies". James Larkin, director of the Radiation and Health Physics Unit at the University of the Witwatersrand, believes South Africa is likely to go for the relatively cheap South Korean generation three reactor.
  • "It is not only that we say so: an international audit came here in 2006 to assess our procedure and processes and confirmed the same." Elegba is firmly of the view that blame for the Fukushima accident should be allocated to nature rather than human error. "Japan is one of the leaders not only in that industry, but in terms of regulatory oversight. They have a very rigorous system of licensing. We have to make a distinction between a natural event, or series of natural events and engineering infrastructure, regulatory infrastructure, and safety oversight." Erepamo Osaisai, Director General of the Nigeria Atomic Energy Commission (NAEC), has said there is "no going back" on Nigeria's nuclear energy project after Fukushima.
  • Nigeria is likely to recruit the Russian State Corporation for Atomic Energy, ROSATOM, to build its first proposed nuclear plant. A delegation visited Nigeria (26- 28 July) and a bilateral document is to be finalised before December. Nikolay Spassy, director general of the corporation, said during the visit: "The peaceful use of nuclear power is the bedrock of development, and achieving [Nigeria's] goal of being one of the twenty most developed countries by the year 2020 would depend heavily on developing nuclear power plants." ROSATOM points out that the International Atomic Energy Agency monitors and regulates power plant construction in previously non-nuclear countries. But Nnimmo Bassey, executive director of the Environmental Rights Action/Friends of the Earth Nigeria (ERA/FoEN), said "We cannot see the logic behind the government's support for a technology that former promoters in Europe, and other technologically advanced nations, are now applying brakes to. "What Nigeria needs now is investment in safe alternatives that will not harm the environment and the people. We cannot accept the nuclear option."
  • Thirsty for electricity, and desirous of political clout, Egypt is determined that neither Fukushima ― nor revolution ― will derail its nuclear plans. Egypt was the first country in the Middle East and North Africa to own a nuclear programme, launching a research reactor in 1961. In 2007 Egypt 'unfroze' a nuclear programme that had stalled in the aftermath of the Chernobyl disaster. After the Egyptian uprising in early 2011, and the Fukushima accident, the government postponed an international tender for the construction of its first plant.
  • Yassin Ibrahim, chairman of the Nuclear Power Plants Authority, told SciDev.Net: "We put additional procedures in place to avoid any states of emergency but, because of the uprising, the tender will be postponed until we have political stability after the presidential and parliamentary election at the end of 2011". Ibrahim denies the nuclear programme could be cancelled, saying: "The design specifications for the Egyptian nuclear plant take into account resistance to earthquakes and tsunamis, including those greater in magnitude than any that have happened in the region for the last four thousand years. "The reactor type is of the third generation of pressurised water reactors, which have not resulted in any adverse effects to the environment since they began operation in the early sixties."
  • Ibrahim El-Osery, a consultant in nuclear affairs and energy at the country's Nuclear Power Plants Authority, points out that Egypt's limited resources of oil and natural gas will run out in 20 years. "Then we will have to import electricity, and we can't rely on renewable energy as it is still not economic yet — Egypt in 2010 produced only two per cent of its needs through it." But there are other motives for going nuclear, says Nadia Sharara, professor of mineralogy at Assiut University. "Owning nuclear plants is a political decision in the first place, especially in our region. And any state that has acquired nuclear technology has political weight in the international community," she says. "Egypt has the potential to own this power as Egypt's Nuclear Materials Authority estimates there are 15,000 tons of untapped uranium in Egypt." And she points out it is about staying ahead with technology too. "If Egypt freezes its programme now because of the Fukushima nuclear disaster it will fall behind in many science research fields for at least the next 50 years," she warned.
D'coda Dcoda

The Dispatch Queue - An Alternative Means of Accounting for External Costs? [28Sep11] - 0 views

  • Without much going on recently that hasn’t been covered by other blog posts, I’d like to explore a topic not specifically tied to nuclear power or to activities currently going on in Washington, D.C. It involves an idea I have about a possible alternative means of having the electricity market account for the public health and environmental costs of various energy sources, and encouraging the development and use of cleaner sources (including nuclear) without requiring legislation. Given the failure of Congress to take action on global warming, as well as environmental issues in general, non-legislative approaches to accomplishing environmental goals may be necessary. The Problem
  • One may say that the best response would be to significantly tighten pollution regulations, perhaps to the point where no sources have significant external costs. There are problems with this approach, however, above and beyond the fact that the energy industry has (and will?) successfully blocked the legislation that would be required. Significant tightening of regulations raises issues such as how expensive compliance will be, and whether or not viable alternative (cleaner) sources would be available. The beauty of simply placing a cost (or tax) on pollution that reflects its costs to public health and the environment is that those issues need not be addressed. The market just decides between sources based on the true, overall cost of each, resulting in the minimum overall (economic + environmental) cost generation portfolio.
  • The above reasoning is what led to policies like cap-and-trade or a CO2 emissions tax being proposed as a solution for the global warming problem. This has not flown politically, however. Policies that attempt to have external costs included in the market cost of energy have been labeled a “tax increase.” This is particularly true given that the associated pollution taxes (or emissions credit costs) would have largely gone to the government.
  • One final idea, which does not involve money going to or from government, is simply requiring that cleaner sources provide a certain fraction of our overall power generation. The many state Renewable Portfolio Standards (that do not include nuclear) and the Clean Energy Standard being considered by Congress and the Obama administration (which does include nuclear) are examples of this policy. While better than nothing, such policies are not ideal in that they are crude, and don’t involve a quantitative incentive based on real external costs. An energy source is either defined as “clean,” or it is not. Note that the definition of “clean” would be decided politically, as opposed to objectively based on tangible external costs determined by scientific studies (nuclear’s exclusion from state Renewable Portfolio Standards policies being one outrageous example). Finally, there is the fact that any such policy would require legislation.
  • Well, if we can’t tax pollution, how about encouraging the use of clean sources by giving them subsidies? This has proved to be more popular so far, but this idea has also recently run into trouble, given the current situation with the budget deficit and national debt. Events like the Solyndra bankruptcy have put government clean energy subsidies even more on the defensive. Thus, it seems that neither policies involving money flowing to the government nor policies involving money flowing from the government are politically viable at this point.
  • All of the above raises the question of whether there is a policy available that encourages the use of cleaner energy sources, is revenue-neutral (i.e., does not involve money flowing to or from the government), does not involve the outright (political) selection of certain energy sources over others, and does not require legislation.
    Enter the Dispatch Queue
  • There must be enough power plants in a given region to meet the maximum load (or demand) expected to occur. In fact, total generation capacity must exceed maximum demand by a specified “reserve margin,” to cover the possibility of a plant going offline, among other contingencies. Because demand varies significantly with time, a significant fraction of the generation capacity remains offline some or most of the time. The dispatch queue is a means by which utilities, or independent regional grid operators, decide which power plants will operate in order to meet demand at any given instant. A good discussion of dispatch queues and how they operate can be found in this Department of Energy report.
  • The general goal of the methodology used to set the dispatch queue order is to minimize overall generation cost, while staying in compliance with all federal or state laws (environmental rules, etc.). This is done by placing the power plants with the lowest “variable” cost first in the queue. Plants with the highest “variable” cost are placed last. The “variable” cost of a plant represents how much more it costs to operate the plant than it costs to leave it idle (i.e., it includes the fuel cost and maintenance costs that arise from operation, but does not include the plant capital cost, personnel costs, or any fixed maintenance costs). Thus, one starts with the least expensive plants, and moves up (in cost) until generation meets demand. The remaining, more expensive plants are not fired up. This ensures that the lowest-operating-cost set of plants is used to meet demand at any given time
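    To make the merit-order logic above concrete, here is a minimal sketch in Python; the plant names, capacities, and variable costs are illustrative assumptions, not data from any actual dispatch queue.

```python
# Minimal merit-order dispatch sketch (illustrative numbers only).

def dispatch(plants, demand_mw):
    """Commit plants in order of ascending variable cost until demand is met."""
    queue = sorted(plants, key=lambda p: p["variable_cost"])  # $/MWh, lowest first
    schedule, remaining = [], demand_mw
    for plant in queue:
        if remaining <= 0:
            break  # demand met; the remaining, more expensive plants stay offline
        output_mw = min(plant["capacity_mw"], remaining)
        schedule.append((plant["name"], output_mw))
        remaining -= output_mw
    return schedule

plants = [
    {"name": "nuclear",    "capacity_mw": 1000, "variable_cost": 10},
    {"name": "coal",       "capacity_mw": 800,  "variable_cost": 25},
    {"name": "gas_cc",     "capacity_mw": 600,  "variable_cost": 30},
    {"name": "gas_peaker", "capacity_mw": 300,  "variable_cost": 80},
]

print(dispatch(plants, demand_mw=2000))
# [('nuclear', 1000), ('coal', 800), ('gas_cc', 200)] -- the peaker never runs
```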
  • As far as who makes the decisions is concerned, in many cases the local utility itself runs the dispatch for its own service territory. In most of the United States, however, there is a large regional grid (covering several utilities) that is operated by an Independent System Operator (ISO) or Regional Transmission Organization (RTO), and those organizations, which are independent of the utilities, set the dispatch queue for the region.
    The Idea
  • As discussed above, a plant’s place in the dispatch queue is based upon variable cost, with the lowest variable cost plants being first in the queue. As discussed in the DOE report, all the dispatch queues in the country base the dispatch order almost entirely on variable cost, with the only possible exceptions being issues related to maximizing grid reliability. What if the plant dispatch methodology were revised so that environmental costs were also considered? Ideally, the public health and environmental costs would be objectively and scientifically determined and cast in terms of an equivalent economic cost (as has been done in many scientific studies such as the ExternE study referenced earlier). The calculated external cost would be added to a plant’s variable cost, and its place in the dispatch queue would be adjusted accordingly. The net effect would be that dirtier plants would be run much less often, resulting in greatly reduced pollution.
  • This could have a huge impact in the United States, especially at the current time. Currently, natural gas prices are so low that the variable costs of combined-cycle natural gas plants are not much higher than those of coal plants, even without considering environmental impacts. Also, there is a large amount of natural gas generation capacity sitting idle.
  • More specifically, if dispatch queue ordering methods were revised to even place a small (economic) weight on environmental costs, there would be a large switch from coal to gas generation, with coal plants (especially the older, dirtier ones) moving to the back of the dispatch queue, and only running very rarely (at times of very high demand). The specific idea of putting gas plants ahead of coal plants in the dispatch queue is being discussed by others.
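    A small sketch of how an external-cost adder could reorder the queue; the variable-cost and external-cost figures below are illustrative assumptions (not ExternE values), chosen only to show the coal-to-gas flip described above.

```python
# Re-rank plants by (variable cost + assumed external cost); illustrative numbers only.

plants = {
    # name: (variable cost $/MWh, assumed external cost $/MWh)
    "nuclear": (10, 1),
    "coal":    (25, 30),
    "gas_cc":  (30, 10),
}

def queue_order(include_external):
    def cost(item):
        variable, external = item[1]
        return variable + (external if include_external else 0)
    return [name for name, _ in sorted(plants.items(), key=cost)]

print("variable cost only:", queue_order(False))  # ['nuclear', 'coal', 'gas_cc']
print("with externals:    ", queue_order(True))   # ['nuclear', 'gas_cc', 'coal']
```

    With these assumed adders, the combined-cycle gas plant moves ahead of coal while nuclear stays first in the queue.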
  • The beauty of this idea is that it does not involve any type of tax or government subsidy. It is revenue neutral. Also, depending on the specifics of how it’s implemented, it can be quantitative in nature, with environmental costs of various power plants being objectively weighed, as opposed to certain sources simply being chosen, by government/political fiat, over others. It also may not require legislation (see below). Finally, dispatch queues and their policies and methods are a rather arcane subject and are generally below the political radar (many folks haven’t even heard of them). Thus, this approach may allow the nation’s environmental goals to be (quietly) met without causing a political uproar. It could allow policy makers to do the right thing without paying too high a political cost.
  • Questions/Issues
    The DOE report does mention some examples of dispatch queue methods factoring in issues other than just the variable cost. It is fairly common for issues of grid reliability to be considered. Also, compliance with federal or state environmental requirements can have some impacts. Examples of such laws include limits on the hours of operation for certain polluting facilities, or state requirements that a “renewable” facility generate a certain amount of power over the year. The report also discusses the possibility of favoring more fuel efficient gas plants over less efficient ones in the queue, even if using the less efficient plants at that moment would have cost less, in order to save natural gas. Thus, the report does discuss deviations from the pure cost model, to consider things like environmental impact and resource conservation.
  • I could not ascertain from the DOE report, however, what legal authorities govern the entities that make the plant dispatch decisions (i.e., the ISOs and RTOs), and what types of action would be required in order to change the dispatch methodology (e.g., whether legislation would be required). The DOE report was a study that was called for by the Energy Policy Act of 2005, which implies that its conclusions would be considered in future congressional legislation. I could not tell from reading the report if the lowest cost (only) method of dispatch is actually enshrined somewhere in state or federal law. If so, the changes I’m proposing would require legislation, of course.
  • The DOE report states that in some regions the local utility runs the dispatch queue itself. In the case of the larger grids run by the ISOs and RTOs (which cover most of the country), the report implies that those entities are heavily influenced, if not governed, by the Federal Energy Regulatory Commission (FERC), which is part of the executive branch of the federal government. In the case of utility-run dispatch queues, it seems that nothing short of new regulations (on pollution limits, or direct guidance on dispatch queue ordering) would result in a change in dispatch policy. Whereas reducing cost and maximizing grid reliability would be directly in the utility’s interest, favoring cleaner generation sources in the queue would not, unless it is driven by regulations. Thus, in this case, legislation would probably be necessary, although it’s conceivable that the EPA could act (like it’s about to on CO2).
  • In the case of the large grids run by ISOs and RTOs, it’s possible that such a change in dispatch methodology could be made by the federal executive branch, if indeed the FERC has the power to mandate such a change
  • Effect on Nuclear
    With respect to the impacts of including environmental costs in plant dispatch order determination, I’ve mainly discussed the effects on gas vs. coal. Indeed, a switch from coal to gas would be the main impact of such a policy change. As for nuclear, as well as renewables, the direct/immediate impact would be minimal. That is because both nuclear and renewable sources have high capital costs but very low variable costs. They also have very low environmental impacts, much lower than those of coal or gas. Thus, they will remain at the front of the dispatch queue, ahead of both coal and gas.
D'coda Dcoda

Scientists Radically Raise Estimates of Fukushima Fallout [25Oct11] - 0 views

  • The disaster at the Fukushima Daiichi nuclear plant in March released far more radiation than the Japanese government has claimed. So concludes a study that combines radioactivity data from across the globe to estimate the scale and fate of emissions from the shattered plant. The study also suggests that, contrary to government claims, pools used to store spent nuclear fuel played a significant part in the release of the long-lived environmental contaminant caesium-137, which could have been prevented by prompt action. The analysis has been posted online for open peer review by the journal Atmospheric Chemistry and Physics.
  • Andreas Stohl, an atmospheric scientist with the Norwegian Institute for Air Research in Kjeller, who led the research, believes that the analysis is the most comprehensive effort yet to understand how much radiation was released from Fukushima Daiichi. "It's a very valuable contribution," says Lars-Erik De Geer, an atmospheric modeller with the Swedish Defense Research Agency in Stockholm, who was not involved with the study. The reconstruction relies on data from dozens of radiation monitoring stations in Japan and around the world. Many are part of a global network to watch for tests of nuclear weapons that is run by the Comprehensive Nuclear-Test-Ban Treaty Organization in Vienna. The scientists added data from independent stations in Canada, Japan and Europe, and then combined those with large European and American caches of global meteorological data.
  • Stohl cautions that the resulting model is far from perfect. Measurements were scarce in the immediate aftermath of the Fukushima accident, and some monitoring posts were too contaminated by radioactivity to provide reliable data. More importantly, exactly what happened inside the reactors — a crucial part of understanding what they emitted — remains a mystery that may never be solved. "If you look at the estimates for Chernobyl, you still have a large uncertainty 25 years later," says Stohl. Nevertheless, the study provides a sweeping view of the accident. "They really took a global view and used all the data available," says De Geer.
  • ...7 more annotations...
  • Challenging numbers
    Japanese investigators had already developed a detailed timeline of events following the 11 March earthquake that precipitated the disaster. Hours after the quake rocked the six reactors at Fukushima Daiichi, the tsunami arrived, knocking out crucial diesel back-up generators designed to cool the reactors in an emergency. Within days, the three reactors operating at the time of the accident overheated and released hydrogen gas, leading to massive explosions. Radioactive fuel recently removed from a fourth reactor was being held in a storage pool at the time of the quake, and on 14 March the pool overheated, possibly sparking fires in the building over the next few days.
  • But accounting for the radiation that came from the plants has proved much harder than reconstructing this chain of events. The latest report from the Japanese government, published in June, says that the plant released 1.5 × 10¹⁶ becquerels of caesium-137, an isotope with a 30-year half-life that is responsible for most of the long-term contamination from the plant. A far larger amount of xenon-133, 1.1 × 10¹⁹ Bq, was released, according to official government estimates.
  • Stohl believes that the discrepancy between the team's results and those of the Japanese government can be partly explained by the larger data set used. Japanese estimates rely primarily on data from monitoring posts inside Japan, which never recorded the large quantities of radioactivity that blew out over the Pacific Ocean, and eventually reached North America and Europe. "Taking account of the radiation that has drifted out to the Pacific is essential for getting a real picture of the size and character of the accident," says Tomoya Yamauchi, a radiation physicist at Kobe University who has been measuring radioisotope contamination in soil around Fukushima. Stohl adds that he is sympathetic to the Japanese teams responsible for the official estimate. "They wanted to get something out quickly," he says. The differences between the two studies may seem large, notes Yukio Hayakawa, a volcanologist at Gunma University who has also modelled the accident, but uncertainties in the models mean that the estimates are actually quite similar.
  • The new study challenges those numbers. On the basis of its reconstructions, the team claims that the accident released around 1.7 × 10¹⁹ Bq of xenon-133, greater than the estimated total radioactive release of 1.4 × 10¹⁹ Bq from Chernobyl. The fact that three reactors exploded in the Fukushima accident accounts for the huge xenon tally, says De Geer. Xenon-133 does not pose serious health risks because it is not absorbed by the body or the environment. Caesium-137 fallout, however, is a much greater concern because it will linger in the environment for decades. The new model shows that Fukushima released 3.5 × 10¹⁶ Bq of caesium-137, roughly twice the official government figure, and half the release from Chernobyl. The higher number is obviously worrying, says De Geer, although ongoing ground surveys are the only way to truly establish the public-health risk.
  • The new analysis also claims that the spent fuel being stored in the unit 4 pool emitted copious quantities of caesium-137. Japanese officials have maintained that virtually no radioactivity leaked from the pool. Yet Stohl's model clearly shows that dousing the pool with water caused the plant's caesium-137 emissions to drop markedly (see 'Radiation crisis'). The finding implies that much of the fallout could have been prevented by flooding the pool earlier. The Japanese authorities continue to maintain that the spent fuel was not a significant source of contamination, because the pool itself did not seem to suffer major damage. "I think the release from unit 4 is not important," says Masamichi Chino, a scientist with the Japanese Atomic Energy Authority in Ibaraki, who helped to develop the Japanese official estimate. But De Geer says the new analysis implicating the fuel pool "looks convincing".
  • The latest analysis also presents evidence that xenon-133 began to vent from Fukushima Daiichi immediately after the quake, and before the tsunami swamped the area. This implies that even without the devastating flood, the earthquake alone was sufficient to cause damage at the plant.

  • The Japanese government's report has already acknowledged that the shaking at Fukushima Daiichi exceeded the plant's design specifications. Anti-nuclear activists have long been concerned that the government has failed to adequately address geological hazards when licensing nuclear plants (see Nature 448, 392–393; 2007), and the whiff of xenon could prompt a major rethink of reactor safety assessments, says Yamauchi.

  • The model also shows that the accident could easily have had a much more devastating impact on the people of Tokyo. In the first days after the accident the wind was blowing out to sea, but on the afternoon of 14 March it turned back towards shore, bringing clouds of radioactive caesium-137 over a huge swathe of the country (see 'Radioisotope reconstruction'). Where precipitation fell, along the country's central mountain ranges and to the northwest of the plant, higher levels of radioactivity were later recorded in the soil; thankfully, the capital and other densely populated areas had dry weather. "There was a period when quite a high concentration went over Tokyo, but it didn't rain," says Stohl. "It could have been much worse." 
D'coda Dcoda

Radiation cleanup plan falls short [09Nov11] - 0 views

  • Radioactive fallout from the crippled Fukushima No. 1 nuclear plant has caused widespread fear, prompting the government in August to adopt basic targets for decontamination efforts in and around Fukushima Prefecture.
  • But the government's plan falls short and efforts should focus in particular on residential areas with more aggressive decontamination measures and goals, including reducing current radiation levels by 90 percent, two radiation experts said when interviewed by The Japan Times. "I really doubt their seriousness (about decontamination)," said radiation expert Tomoya Yamauchi, a professor at the Graduate School of Maritime Sciences at Kobe University.
  • Areas with radiation exposure readings representing more than 20 millisieverts per year have been declared no-go zones, and the government has shifted the focus of its decontamination plan to areas with annual cumulative radiation readings of between 1 and 20 millisieverts, with the goal of reducing the contamination by 50 to 60 percent over two years. Decontamination efforts by humans, however, are expected to only yield a reduction of 10 to 20 percent. Nature, including the impact of rain and wind and the natural decay of cesium-134, whose half-life is roughly two years, is assumed to do the rest, thus reaching the best-case scenario of cutting the contamination by 60 percent.
  • ...4 more annotations...
  • The experts said the government's goal of human effort achieving a 10 to 20 percent reduction is not ambitious enough. "A 10 percent reduction doesn't really mean anything. I mean, 40 percent of the radiation would be reduced just by natural causes, so I think the government is almost saying it is just going to wait for the radioactive materials to decrease naturally," said Shunichi Tanaka, former chairman of the Atomic Energy Society of Japan. The main radioactive materials that spewed from the Fukushima No. 1 plant are cesium-134 and -137, the second of which has a half-life of 30 years. Given the relatively short half-life of cesium-134, the total radiation will naturally be halved in four years and fall to one-third in six years, although the threat from the latter will remain for a longer time. The government is now trying to reduce contamination mainly by using high-power water hoses, known as pressure washers, on structures and removing surface soil and vegetation in limited areas.
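    The decay arithmetic behind those figures is simple exponential decay. The sketch below assumes cesium-134 initially contributes about 70 percent of the dose rate (it emits more gamma energy per decay than cesium-137); that split is an assumption for illustration, and the exact numbers depend on the local deposition mix.

```python
# Exponential decay of the combined cesium-134 / cesium-137 dose rate (illustrative split).

def remaining_fraction(years, half_life_years):
    return 0.5 ** (years / half_life_years)

def dose_rate_remaining(years, cs134_share=0.7):
    """Fraction of the initial cesium dose rate left after `years` years."""
    cs134 = cs134_share * remaining_fraction(years, 2.06)        # Cs-134: ~2-year half-life
    cs137 = (1 - cs134_share) * remaining_fraction(years, 30.1)  # Cs-137: ~30-year half-life
    return cs134 + cs137

for t in (2, 4, 6, 10):
    print(f"after {t:2d} years: {dose_rate_remaining(t):.0%} of the initial dose rate remains")
# With this assumed split, roughly half remains after four years and about a third after six.
```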
  • But radioactive cesium can find its way into minute cracks and crevices. It is hard to remove, for example, from roofs made of certain materials, or surfaces that are rusted or whose paint is peeling, Yamauchi said. He has monitored radiation in areas in the city of Fukushima and found that the levels were still quite high after the city performed cleanup operations. To lower the contamination to pre-March 11 levels, Yamauchi said drastic, and highly costly, efforts by the government are needed, including replacing roofs and removing the surface asphalt of roads. Tanaka meanwhile pointed out that the government has not even floated a plan for decontaminating the no-go zones where the radiation exceeds 20 millisieverts per year — areas where there isn't even a timetable for when evacuees will be able to return.
  • If the government doesn't speed up the decontamination work, it will be years before the evacuees may be able to return home, he said, adding that the government can't set a target date because it isn't sure how the cleanup effort will fare. The government's stance regarding the no-go zone is largely based on recommendations by the International Commission on Radiological Protection and other scientists that allow a maximum radiation exposure of between 20 and 100 millisieverts per year under emergency conditions. The ICRP theorizes that cumulative exposure of 100 millisieverts could increase the cancer mortality risk by about 0.5 percent, meaning about 50 out of 10,000 people exposed to that level could die of cancer caused by radiation.
  • "Municipalities need to communicate closely with residents (to solicit their involvement) . . . without the participation of the residents, they can't find space for the storage," Tanaka said.
D'coda Dcoda

Senator Lamar Alexander: "Nuclear Power Is the Most Reliable and Useful Source of Green... - 0 views

  • U.S. Senator Lamar Alexander (R-Tenn.), chairman of the Senate Republican Conference, delivered a speech this week at the International V.M. Goldschmidt Conference in Knoxville.  Alexander serves on the Senate Environment and Public Works Committee and is the chairman of the Tennessee Valley Authority Congressional Caucus.  His remarks as prepared follow:
  • In a speech in Oak Ridge in May of 2009, I called for America to build 100 new nuclear plants during the next twenty years.  Nuclear power produces 70 percent of our pollution-free, carbon-free electricity today.  It is the most useful and reliable source of green electricity today because of its tremendous energy density and the small amount of waste that it produces.  And because we are harnessing the heat and energy of the earth itself through the power of the atom, nuclear power is also natural.
  • ...10 more annotations...
  • Forty years ago, nuclear energy was actually regarded as something of a savior for our environmental dilemmas because it didn’t pollute.  And this was well before we were even thinking about global warming or climate change.  It also didn’t take up a great deal of space.  You didn’t have to drown all of Glen Canyon to produce 1,000 megawatts of electricity.  Four reactors would equal a row of wind turbines, each one three times as tall as Neyland Stadium skyboxes, strung along the entire length of the 2,178-mile Appalachian Trail.   One reactor would produce the same amount of electricity that can be produced by continuously foresting an area one-and-a-half times the size of the Great Smoky Mountains National Park in order to create biomass.  Producing electricity with a relatively small number of new reactors, many at the same sites where reactors are already located, would avoid the need to build thousands and thousands of miles of new transmission lines through scenic areas and suburban backyards. 
  • While nuclear lost its green credentials with environmentalists somewhere along the way, some are re-thinking nuclear energy because of our new environmental paradigm – global climate change.  Nuclear power produces 70 percent of our carbon-free electricity today.  President Obama has endorsed it, proposing an expansion of the loan guarantee program from $18 billion to $54 billion and making the first award to the Vogtle Plant in Georgia.  Nobel Prize-winning Secretary of Energy Steven Chu wrote recently in The Wall Street Journal about developing a generation of mini-reactors that I believe we can use to repower coal boilers, or more locally, to power the Department of Energy’s site over in Oak Ridge.  The president, his secretary of energy, and many environmentalists may be embracing nuclear because of the potential climate change benefits, but they are now also remembering the other positive benefits of nuclear power that made it an environmental savior some 40 years ago
  • The Nature Conservancy took note of nuclear power’s tremendous energy density last August when it put out a paper on “Energy Sprawl.”  The authors compared the amount of space you need to produce energy from different technologies – something no one had ever done before – and what they came up with was remarkable.  Nuclear turns out to be the gold standard.  You can produce a million megawatt-hours of electricity a year from a nuclear reactor sitting on one square mile.  That’s enough electricity to power 90,000 homes.  They even included uranium mining and the 230 square miles surrounding Yucca Mountain in this calculation and it still comes to only one square mile per million megawatt-hours.
  • And for all that, each turbine has the capacity to produce about one-and-a-half megawatts.  You need three thousand of these 50-story structures to equal the output of one nuclear reactor
  • When people say “we want to get our energy from wind,” they tend to think of a nice windmill or two on the horizon, waving gently – maybe I’ll put one in my back yard.   They don’t realize those nice, friendly windmills are now 50 stories high and have blades the length of football fields.  We see awful pictures today of birds killed by the Gulf oil spill.  But one wind farm in California killed 79 golden eagles in one year. The American Bird Conservancy says existing turbines can kill up to 275,000 birds a year.
  • Coal-fired electricity needs four square miles, because you have to consider all the land required for mining and extraction.  Solar thermal, where they use the big mirrors to heat a fluid, takes six square miles.  Natural gas takes eight square miles and petroleum takes 18 square miles – once again, including all the land needed for drilling and refining and storing and sending it through pipelines.  Solar photovoltaic cells that turn sunlight directly into electricity take 15 square miles and wind is even more dilute, taking 30 square miles to produce that same amount of electricity.
  • Wind power can be counted on to be there 10 to 15 percent of the time when you need it.  TVA can count on nuclear power 91 percent of the time, coal, 60 percent of the time and natural gas about 50 percent of the time.  This is why I believe it is a taxpayer rip-off for wind power to be subsidized per unit of electricity at a rate of 25 times the subsidy for all other forms of electricity combined.
  • The “problem of nuclear waste” has been overstated because people just don’t understand the scale or the risk.  All the high-level nuclear waste that has ever been produced in this country would fit on a football field to a height of ten feet.  That’s everything.  Compare that to the billion gallons of coal ash that slid out of the coal ash impoundment at the Kingston plant and into the Emory River a year and a half ago, just west of here.  Or try the industrial wastes that would be produced if we try to build thousands of square miles of solar collectors or 50-story windmills.  All technologies produce some kind of waste.  What’s unique about nuclear power is that there’s so little of it.
  • Now this waste is highly radioactive, there’s no doubt about that.  But once again, we have to keep things in perspective.  It’s perfectly acceptable to isolate radioactive waste through storage.  Three feet of water blocks all radiation.  So does a couple of inches of lead and stainless steel or a foot of concrete.  That’s why we use dry cask storage, where you can load five years’ worth of fuel rods into a single container and store them right on site.  The Nuclear Regulatory Commission and Energy Secretary Steven Chu both say we can store spent fuel on site for 60 or 80 years before we have to worry about a permanent repository like Yucca Mountain
  • Then there’s reprocessing.  Remember, we’re now the only major nuclear power nation in the world that is not reprocessing its fuel.  While we gave up reprocessing in the 1970s, the French have all their high-level waste from 30 years of producing 80 percent of their electricity stored beneath the floor of one room at their recycling center in La Hague.  That’s right; it all fits into one room.  And we don’t have to copy the French.  Just a few miles away at the Oak Ridge National Laboratory they’re working to develop advanced reprocessing technologies that go well beyond what the French are doing, to produce a waste that’s both smaller in volume and shorter in radioactive life.  Regardless of what technology we ultimately choose, the amount of material will be astonishingly small.  And it’s because of the amazing density of nuclear technology – something we can’t even approach with any other form of energy.
D'coda Dcoda

Battling for nuclear energy by exposing opposition motives [19Jul11] - 0 views

  • In the money-driven battle over our future energy supply choices, the people who fight nuclear energy have imagination on their side. They can, and often do, invent numerous scary tales about what might happen without the need to actually prove anything.
  • One of the most powerful weapons in their arsenal is the embedded fantasy that a nuclear reactor accident can lead to catastrophic consequences that cannot be accepted. This myth is doubly hard to dislodge because a large fraction of the nuclear energy professionals have been trained to believe it. When you want to train large numbers of slightly above average people to do their job with great care and attention to detail, it can be useful to exaggerate the potential consequences of a failure to perform. It is also a difficult myth to dislodge because the explanation of why it is impossible requires careful and often lengthy explanations of occasionally complex concepts.
  • The bottom lines of both Chernobyl and Fukushima tell me that the very worst that can realistically happen to nuclear fission reactors results in acceptable physical consequences when compared to the risk of insufficient power or the risk of using any other reliable source of power. The most negative consequences of both accidents resulted from the way that government leaders responded, both during the crisis stage and during the subsequent recoveries.
  • ...9 more annotations...
  • Instead of trying to explain the basis for those statements more fully, I’ll try to encourage people to consider the motives of people on various sides of the discussion. I also want to encourage nuclear energy supporters to look beyond the financial implications to the broader implications of a less reliable and dirtier electrical power system. When the focus is just on the finances, the opposition has an advantage – the potential gains from opposing nuclear energy often are concentrated in the hands of extremely interested parties while the costs are distributed widely enough to be less visible. That imbalance often leads to great passion in the opposition and too much apathy among the supporters. Over at Idaho Samizdat, Dan Yurman has written about the epic battle of political titans who are on opposing sides of the controversy regarding the relicensing and continued operation of the Indian Point Nuclear Power Station. Dan pointed out that there is a large sum of money at stake, but he put it in a way that does not sound too terrible to many people because it spreads out the pain.
  • In round numbers, if Indian Point is closed, wholesale electricity prices could rise by 12%.
  • A recent study quoted in a New York Times article put the initial additional cost of electricity without Indian Point at about $1.5 billion per year, which is a substantial sum of money if concentrated into the hands of a few thousand victors who tap the monthly bills of a few million people. Here is a comment that I added to Dan's post:
    Dan – thank you for pointing out that the battle is not really a partisan one determined by political party affiliation. By my analysis, the real issue is the desire of natural gas suppliers to sell more gas at ever higher prices driven by a shift in the balance between supply and demand.
  • They never quite explain what is going to happen as we get closer and closer to the day when even fracking will not squeeze any more hydrocarbons out of the drying sponge that is the readily accessible part of the earth's crust. The often touted “100-year” supply of natural gas in the US has a lot of optimistic assumptions built in. First of all, it is only rounded up to 100 years – 2170 trillion cubic feet at the end of 2010 divided by 23 trillion cubic feet per year leaves just 94 years.
  • Secondly, the 2170 number provided by the Potential Gas Committee report includes all proven, probable, possible and speculative resources, without any analysis of the cost of extraction or moving them to a market. Many of the basins counted have no current pipelines and many of the basins are not large enough for economic recovery of the investment to build the infrastructure without far higher prices. Finally, all bets are off with regard to longevity if we increase the rate of burning up the precious raw material.
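    The growth point can be quantified with the standard exponential-depletion formula: if consumption grows at rate r from a current rate C, a resource R lasts ln(1 + rR/C)/r years rather than R/C. A small sketch using the figures quoted above, with the growth rates as illustrative assumptions:

```python
import math

def lifetime_years(resource, current_use, growth_rate):
    """Years until a fixed resource is exhausted if annual use grows exponentially."""
    if growth_rate == 0:
        return resource / current_use
    return math.log(1 + growth_rate * resource / current_use) / growth_rate

R, C = 2170.0, 23.0  # Tcf and Tcf/year, the figures quoted in the comment above
for r in (0.00, 0.02, 0.05):  # growth rates are illustrative assumptions
    print(f"{r:4.0%} annual growth: about {lifetime_years(R, C, r):.0f} years")
# 0% -> ~94 years, 2% -> ~53 years, 5% -> ~35 years
```

    Even 2 percent annual growth in consumption cuts the quoted 94 years roughly in half.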
  • BTW – In case your readers are interested in the motives of a group like Riverkeepers, founded and led by Robert F. Kennedy, Jr., here is a link to a video clip of him explaining his support for natural gas.http://atomicinsights.com/2010/11/power-politics-rfk-jr-explains-how-pressure-from-activists-to-enforce-restrictions-on-coal-benefits-natural-gas.html
  • The organized opposition to the intelligent use of nuclear energy has often painted support for the technology as coming from faceless, money-hungry corporations. That caricature of the support purposely ignores the fact that there are large numbers of intelligent, well-educated, responsible, and caring people who know a great deal about the technology and believe that it is the best available solution for many intransigent problems. There are efforts underway today, like the Nuclear Literacy Project and Go Nuclear, that are focused on showcasing the admirable people who like nuclear energy and want it to grow rapidly to serve society’s never-ending thirst for reliable power at an affordable price with acceptable environmental impact.
  • The exaggerated, fanciful accident scenarios painted by the opposition are challenging to disprove.
  • I just read an excellent post on Yes Vermont Yankee about a coming decision that might help to illuminate the risk to society of continuing to let greedy antinuclear activists and their political friends dominate the discussion. According to Meredith’s post, Entergy must make a decision within just a week or so about whether or not to refuel Vermont Yankee in October. Since the sitting governor is dead set against the plant operating past its current license expiration in the summer of 2012, the $100 million expense of refueling would only result in about 6 months of operation instead of the usual 18 months. Meredith has a novel solution to the dilemma – conserve the fuel currently in the plant by immediately cutting the power output to 25%.
Dan R.D.

13-Year-Old Uses Fibonacci Sequence For Solar Power Breakthrough [19Aug11] - 0 views

  •  
    An anonymous reader tips news of 7th grader Aidan Dwyer, who used phyllotaxis - the way leaves are arranged on plant stems in nature - as inspiration to arrange an array of solar panels in a way that generates 20-50% more energy than a uniform, flat panel array. Aidan wrote, "I designed and built my own test model, copying the Fibonacci pattern of an oak tree. I studied my results with the compass tool and figured out the branch angles. The pattern was about 137 degrees and the Fibonacci sequence was 2/5. Then I built a model using this pattern from PVC tubing. In place of leaves, I used PV solar panels hooked up in series that produced up to 1/2 volt, so the peak output of the model was 5 volts. The entire design copied the pattern of an oak tree as closely as possible. ... The Fibonacci tree design performed better than the flat-panel model. The tree design made 20% more electricity and collected 2 1/2 more hours of sunlight during the day. But the most interesting results were in December, when the Sun was at its lowest point in the sky. The tree design made 50% more electricity, and the collection time of sunlight was up to 50% longer!" His work earned him a Young Naturalist Award from the American Museum of Natural History and a provisional patent on the design.
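    For readers curious what the 2/5 phyllotaxis layout looks like numerically, here is a minimal sketch that generates the azimuth of each panel around the central pole; a 2/5 pattern advances 2/5 of a turn (144 degrees) per panel, and the golden angle of about 137.5 degrees is the closely related limit of the Fibonacci fractions. Panel count, tilt, and the electrical details of the actual project are not modeled.

```python
# Azimuth angles of successive "leaves" (panels) on a phyllotaxis spiral.

def panel_azimuths(n_panels, divergence_deg=144.0):
    """Azimuth (compass) angle of each successive panel around the central pole."""
    return [round((i * divergence_deg) % 360.0, 1) for i in range(n_panels)]

print(panel_azimuths(10))          # 2/5 pattern (144 deg): repeats every 5 panels
print(panel_azimuths(10, 137.5))   # golden-angle pattern: positions never exactly repeat
```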
D'coda Dcoda

The Thorium Reactor, A Nuclear Energy Alternative [19Sep11] - 0 views

  • After Fukushima, heightened awareness of the dangers of nuclear energy ignited a series of reactions in society, mainly a generalized rejection of nuclear energy and a call to develop cleaner and safer sources of energy. When people think of alternatives, mainly two sources come to mind, solar and wind power, while any sort of nuclear power is condemned.  Nuclear power has been associated with Weapons of Mass Destruction, radiation sickness and disease.  However, this is not due to nuclear power itself but to the nuclear fuel used to generate it.
  • The above are just some of the most common byproducts (better known as nuclear waste) of a nuclear fuel cycle. All of these substances are extremely poisonous, causing a variety of diseases, cancers and genetic mutations in the victim.  The worst part is that most of them remain in the environment for decades or even thousands of years, so if accidentally released to the environment they become a problem that future generations have to deal with.  Therefore, in nuclear energy the problem is in the fuel, not in the engine. Let's start with the thorium reactor.  Thorium is a naturally occurring radioactive chemical element, found in abundance throughout the world.  It is estimated that every cubic meter of the earth's crust contains about 12 grams of this mineral, enough to cover one person's electricity consumption for 12 to 25 years.  Energy is produced from thorium in a process known as the thorium fuel cycle, where a nuclear fuel cycle is derived from the naturally abundant isotope of thorium.
  • In today's world the main fuel for nuclear power is a naturally occurring radioactive mineral, uranium.  This mineral is one of the densest metals in the periodic table, which allows it to sustain a chain reaction that can yield huge amounts of energy that can be exploited for an extended period of time.  Unfortunately, the nuclear fuel cycle of uranium produces extremely dangerous byproducts, commonly known as nuclear waste.  These are produced in liquid, solid and gaseous form as a wide variety of deadly substances, such as iodine-131, strontium-90, cesium-137, europium-155, krypton-85, cadmium-113, tin-121, samarium-151 and technetium-99.
  • ...2 more annotations...
  • Thorium can be used as fuel in a nuclear reactor, and it is a fertile material, which allows it to be used to produce nuclear fuel in a breeder reactor.  These are some of the benefits of thorium reactors compared to uranium: weapons-grade fissionable material is harder to retrieve safely and clandestinely from a thorium reactor; thorium produces 10 to 10,000 times less long-lived radioactive waste; thorium comes out of the ground as a 100% pure, usable isotope, which does not require enrichment, whereas natural uranium contains only 0.7% fissionable U-235; and thorium cannot sustain a nuclear chain reaction without priming, so fission stops by default. The following talk by Kirk Sorensen explains the Liquid-Fluoride Thorium Reactor, a next-generation nuclear reactor.
  • References Thorium – Wikipedia, the free encyclopedia http://bit.ly/qYwoAv Thorium fuel cycle – Wikipedia, the free encyclopedia http://bit.ly/piNoKb Molten salt reactor – Wikipedia, the free encyclopedia http://bit.ly/qlyAxe Thorium Costs http://bit.ly/oQRgXK Thorium – The Better Nuclear Fuel? http://bit.ly/r8xc92
D'coda Dcoda

If Indian Point Nuclear Closes, Plenty of Profits (for natural gas suppliers) [13Jul11] - 0 views

  •  
    (Diigo won't highlight this correctly, putting it in description!) "Matt Wald of the New York Times has finally figured out why there is such a strong push from well-connected political types to close the Indian Point Nuclear Power Station. Unfortunately, he and his editor have chosen to put that answer at the very bottom of his recent article titled If Indian Point Closes, Plenty of Challenges. When the demand for natural gas increases, the balance between supply and demand shifts in favor of the sellers, so price inexorably increases. Here is the closing paragraph of that article. It should raise alarm bells for anyone who is a power purchaser instead of a power seller. That description applies to the vast majority of us; part of the challenge is that it only costs each of us a little while concentrating the spoils in the hands of a few victors. Closing the Indian Point reactors would, however, hardly be gloom and doom for everyone. Any company that runs a generator in downstate New York ends up selling its output at a higher price, and would share in the $1.4 billion a year that Con Edison says its customers will pay if the nuclear plant closes."
D'coda Dcoda

Strange nuclear waste lint might be "biological in nature" [18Dec11] - 0 views

  • Savannah River Site scientists are working to identify a strange growth found on racks of spent nuclear fuel collected from foreign governments. The “white, stringlike” material was found among thousands of spent fuel assemblies submerged in deep pools within the site’s L Area, according to a report filed by the Defense Nuclear Facilities Safety Board, a federal oversight panel. “The growth, which resembles a spider web, has yet to be characterized, but may be biological in nature,” the report said. Savannah River National Laboratory collected a small sample in hopes of identifying the mystery lint – and determining whether it is alive.
  • L Area, with 3-foot-thick concrete walls, includes pools that range from 17 to 30 feet deep, where submerged racks are used to store an array of assemblies – some containing highly enriched uranium – from foreign and domestic research reactors. The material is kept there for national security reasons. The safety board’s report said the initial sample collected was too small to allow further characterization.
D'coda Dcoda

Energy Forecast: Fracking in China, Nuclear Uncertain, CO2 Up [09Nov11] - 0 views

  • This year’s World Energy Outlook report has been published by the International Energy Agency, and says wealthy and industrializing countries are stuck on policies that threaten to lock in “an insecure, inefficient and high-carbon energy system.” You can read worldwide coverage of the report here. Fiona Harvey of the Guardian has a piece on the report that focuses on the inexorable trajectories for carbon dioxide, driven by soaring energy demand in Asia. A variety of graphs and slides can be reviewed here.
  • According to the report, Russia will long remain the world’s leading producer of natural gas, but exploitation of shale deposits in the United States, and increasingly in China, will greatly boost production in those countries (which will be in second and third place for gas production in 2035).Last month, in an interview with James Kanter of The Times and International Herald Tribune, the new head of the energy agency, Maria van der Hoeven, discussed one point made in the report today — that concerns raised by the damage to the Fukushima Daiichi power plant could continue to dampen expansion of nuclear power and add to the challenge of avoiding a big accumulation of carbon dioxide, saying: “Such a reduction would certainly make it more difficult for the world to meet the goal of stabilizing the rise in temperature to 2 degrees Centigrade.”
  • Short-term pressures on oil markets are easing with the economic slowdown and the expected return of Libyan supply. But the average oil price remains high, approaching $120/barrel (in year-2010 dollars) in 2035. Reliance grows on a small number of producers: the increase in output from Middle East and North Africa (MENA) is over 90% of the required growth in world oil output to 2035. If, between 2011 and 2015, investment in the MENA region runs one-third lower than the $100 billion per year required, consumers could face a near-term rise in the oil price to $150/barrel.Oil demand rises from 87 million barrels per day (mb/d) in 2010 to 99 mb/d in 2035, with all the net growth coming from the transport sector in emerging economies. The passenger vehicle fleet doubles to almost 1.7 billion in 2035. Alternative technologies, such as hybrid and electric vehicles that use oil more efficiently or not at all, continue to advance but they take time to penetrate markets.
  • ...5 more annotations...
  • In the WEO’s central New Policies Scenario, which assumes that recent government commitments are implemented in a cautious manner, primary energy demand increases by one-third between 2010 and 2035, with 90% of the growth in non-OECD economies. China consolidates its position as the world’s largest energy consumer: it consumes nearly 70% more energy than the United States by 2035, even though, by then, per capita demand in China is still less than half the level in the United States. The share of fossil fuels in global primary energy consumption falls from around 81% today to 75% in 2035. Renewables increase from 13% of the mix today to 18% in 2035; the growth in renewables is underpinned by subsidies that rise from $64 billion in 2010 to $250 billion in 2035, support that in some cases cannot be taken for granted in this age of fiscal austerity. By contrast, subsidies for fossil fuels amounted to $409 billion in 2010.
  • Here’s the summary of the main points, released today by the agency: “Growth, prosperity and rising population will inevitably push up energy needs over the coming decades. But we cannot continue to rely on insecure and environmentally unsustainable uses of energy,” said IEA Executive Director Maria van der Hoeven. “Governments need to introduce stronger measures to drive investment in efficient and low-carbon technologies. The Fukushima nuclear accident, the turmoil in parts of the Middle East and North Africa and a sharp rebound in energy demand in 2010 which pushed CO2 emissions to a record high, highlight the urgency and the scale of the challenge.”
  • The use of coal – which met almost half of the increase in global energy demand over the last decade – rises 65% by 2035. Prospects for coal are especially sensitive to energy policies – notably in China, which today accounts for almost half of global demand. More efficient power plants and carbon capture and storage (CCS) technology could boost prospects for coal, but the latter still faces significant regulatory, policy and technical barriers that make its deployment uncertain.Fukushima Daiichi has raised questions about the future role of nuclear power. In the New Policies Scenario, nuclear output rises by over 70% by 2035, only slightly less than projected last year, as most countries with nuclear programmes have reaffirmed their commitment to them. But given the increased uncertainty, that could change. A special Low Nuclear Case examines what would happen if the anticipated contribution of nuclear to future energy supply were to be halved. While providing a boost to renewables, such a slowdown would increase import bills, heighten energy security concerns and make it harder and more expensive to combat climate change.
  • The future for natural gas is more certain: its share in the energy mix rises and gas use almost catches up with coal consumption, underscoring key findings from a recent WEO Special Report which examined whether the world is entering a “Golden Age of Gas”. One country set to benefit from increased demand for gas is Russia, which is the subject of a special in-depth study in WEO-2011. Key challenges for Russia are to finance a new generation of higher-cost oil and gas fields and to improve its energy efficiency. While Russia remains an important supplier to its traditional markets in Europe, a shift in its fossil fuel exports towards China and the Asia-Pacific gathers momentum. If Russia improved its energy efficiency to the levels of comparable OECD countries, it could reduce its primary energy use by almost one-third, an amount similar to the consumption of the United Kingdom. Potential savings of natural gas alone, at 180 bcm, are close to Russia’s net exports in 2010.
  • In the New Policies Scenario, cumulative CO2 emissions over the next 25 years amount to three-quarters of the total from the past 110 years, leading to a long-term average temperature rise of 3.5°C. China’s per-capita emissions match the OECD average in 2035. Were the new policies not implemented, we are on an even more dangerous track, to an increase of 6°C.“As each year passes without clear signals to drive investment in clean energy, the “lock-in” of high-carbon infrastructure is making it harder and more expensive to meet our energy security and climate goals,” said Fatih Birol, IEA Chief Economist. The WEO presents a 450 Scenario, which traces an energy path consistent with meeting the globally agreed goal of limiting the temperature rise to 2°C. Four-fifths of the total energy-related CO2 emissions permitted to 2035 in the 450 Scenario are already locked-in by existing capital stock, including power stations, buildings and factories. Without further action by 2017, the energy-related infrastructure then in place would generate all the CO2 emissions allowed in the 450 Scenario up to 2035. Delaying action is a false economy: for every $1 of investment in cleaner technology that is avoided in the power sector before 2020, an additional $4.30 would need to be spent after 2020 to compensate for the increased emissions.
D'coda Dcoda

Fracking Radiation Targeted By DOE, GE [03Aug11] - 0 views

  • The Department of Energy and General Electric will spend $2 million over the next two years to remove naturally occurring radioactive materials from the fracking fluids produced by America’s booming shale-gas industry. The New York State Department of Health has identified Radium-226 as a radionuclide of particular concern in the Marcellus Shale formation deep beneath the Appalachian Mountains. In hydraulic fracturing operations, drillers force water and a mixture of chemicals into wells to shatter the shale and free natural gas. The brine that returns to the surface has been found to contain up to 16,000 picoCuries per liter of radium-226 (pdf). The discharge limit in effluent for Radium 226 is 60 pCi/L, and the EPA’s drinking water standard is 5 pCi/L.
  • Uranium and Radon-222 have also been found in water returning to the surface from deep shale wells. In Pennsylvania, produced water has been discharged into streams and rivers from the state’s 71,000 wells after conventional wastewater treatment but without radiation testing, according to the Pittsburgh Post-Gazette and The New York Times, which drew attention to the radioactive contamination earlier this year after studying internal EPA documents: The documents reveal that the wastewater, which is sometimes hauled to sewage plants not designed to treat it and then discharged into rivers that supply drinking water, contains radioactivity at levels higher than previously known, and far higher than the level that federal regulators say is safe for these treatment plants to handle. via The New York Times
  • GE’s Global Research lab in Niskayuna, NY has proposed removing radioactive elements from produced waters and brine using a membrane distillation system similar to conventional reverse osmosis, but designed specifically to capture these radioactive materials. GE will spend $400,000 on the project and DOE will supply $1.6 million. The Energy Department announced the project Monday. The process will produce concentrated radioactive waste, which will be disposed of through conventional means, which usually means storage in sealed containers for deep geological disposal. The government is seeking to address environmental concerns without stemming a boom in cheap gas unleashed by hydraulic fracturing, or fracking, in shale formations.
D'coda Dcoda

How To Remove Radioactive Iodine-131 From Drinking Water [07Apr11] - 0 views

  • The Environmental Protection Agency recommends reverse osmosis water treatment to remove radioactive isotopes that emit beta-particle radiation. But iodine-131, a beta emitter, is typically present in water as a dissolved gas, and reverse osmosis is known to be ineffective at capturing gases. A combination of technologies, however, may remove most or all of the iodine-131 that finds its way into tap water, all available in consumer products for home water treatment.
  • When it found iodine-131 in drinking water samples from Boise, Idaho and Richland, Washington this weekend, the EPA declared: “An infant would have to drink almost 7,000 liters of this water to receive a radiation dose equivalent to a day’s worth of the natural background radiation exposure we experience continuously from natural sources of radioactivity in our environment.” But not everyone accepts the government’s reassurances. Notably, Physicians for Social Responsibility has insisted there is no safe level of exposure to radionuclides, regardless of the fact that we encounter them naturally:
  • “There is no safe level of radionuclide exposure, whether from food, water or other sources. Period,” said Jeff Patterson, DO, immediate past president of Physicians for Social Responsibility. “Exposure to radionuclides, such as iodine-131 and cesium-137, increases the incidence of cancer. For this reason, every effort must be taken to minimize the radionuclide content in food and water.” via Physicians for Social Responsibility, psr.org
    No matter where you stand on that debate, you might be someone who simply prefers not to ingest anything that escaped from a damaged nuclear reactor. If so, here’s what we know:
    Reverse Osmosis
    The EPA recommends reverse osmosis water treatment for most kinds of radioactive particles. Iodine-131 emits a small amount of gamma radiation but much larger amounts of beta radiation, and so is considered a beta emitter:
  • ...6 more annotations...
  • Reverse osmosis has been identified by EPA as a “best available technology” (BAT) and Small System Compliance Technology (SSCT) for uranium, radium, gross alpha, and beta particles and photon emitters. It can remove up to 99 percent of these radionuclides, as well as many other contaminants (e.g., arsenic, nitrate, and microbial contaminants). Reverse osmosis units can be automated and compact making them appropriate for small systems. via EPA, Radionuclides in Drinking Water
  • However, EPA designed its recommendations for the contaminants typically found in municipal water systems, so it doesn’t specify Iodine-131 by name. The same document goes on to say, “Reverse osmosis does not remove gaseous contaminants such as carbon dioxide and radon.” Iodine-131 escapes from damaged nuclear plants as a gas, and this is why it disperses so quickly through the atmosphere. It is captured as a gas in atmospheric water, falls to the earth in rain and enters the water supply.
  • Dissolved gases and materials that readily turn into gases also can easily pass through most reverse osmosis membranes,” according to the University of Nevada Cooperative Extension. For this reason, “many reverse osmosis units have an activated carbon unit to remove or reduce the concentration of most organic compounds.” Activated Carbon
  • That raises the next question: does activated carbon remove iodine-131? There is some evidence that it does. Scientists have used activated carbon to remove iodine-131 from the liquid fuel for nuclear solution reactors. And carbon air filtration is used by employees of Perkin Elmer, a leading environmental monitoring and health safety firm, when they work with iodine-131 in closed quarters. At least one university has adopted Perkin Elmer’s procedures. Activated carbon works by absorbing contaminants, and fixing them, as water passes through it. It has a disadvantage, however: it eventually reaches a load capacity and ceases to absorb new contaminants.
  • Ion Exchange
    The EPA also recommends ion exchange for removing radioactive compounds from drinking water. Ion exchange, the process used in water softeners, removes contaminants when water passes through resins that contain sodium ions. The sodium ions readily exchange with contaminants.
  • Ion exchange is particularly recommended for removing Cesium-137, which has been found in rain samples in the U.S., but not yet in drinking water here. Some resins have been specifically designed for capturing Cesium-137, and ion exchange was used to clean up legacy nuclear waste from an old reactor at the Department of Energy’s Savannah River Site (pdf).
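    When the treatment steps above are staged, their removal fractions combine multiplicatively on whatever passes through. The per-stage efficiencies in this sketch are assumptions for illustration only (published iodine-131 removal rates for home units vary), so treat the output as arithmetic, not a performance claim.

```python
# Combined removal across staged treatment: each stage removes a fraction of
# whatever the previous stage let through. Per-stage efficiencies are assumptions.

def combined_removal(stage_efficiencies):
    """Overall removal when each stage removes a fraction of what the last one passed."""
    passed = 1.0
    for efficiency in stage_efficiencies:
        passed *= (1.0 - efficiency)
    return 1.0 - passed

stages = {"reverse osmosis": 0.50, "activated carbon": 0.90, "ion exchange": 0.50}
print(f"combined removal: {combined_removal(stages.values()):.1%}")  # 97.5% under these assumptions
```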