
Dan R.D.

Japan considers new nuclear evacuation measures - Tokyo Times [25Oct11] - 0 views

  • A committee of the Nuclear Safety Commission (NSC) is considering extending the area around nuclear plants where the authorities should be prepared to offer shelter in case of emergency. Currently about 10 kilometers around the plants, the safety area may be increased to 30 kilometers. Seven months after the beginning of the nuclear crisis, the committee is reviewing the consequences of the March 11 quake and tsunami in an effort to learn valuable lessons. Japan faced strong criticism for its slow reaction in dealing with the current nuclear crisis.
  • Another measure the committee is now considering is to ask local authorities to be ready to provide iodine tablets to the population within a 50-km radius of plants, to help prevent thyroid cancers from radiation. A draft document with final recommendations will be finalized next month, but it could take years until the guidelines are fully revised, according to an official of the NSC. Around 80,000 people were forced to leave their homes within a 20-km radius of the Fukushima plant. Around 30,000 more left the recommended evacuation zone, between 20 and 30 kilometers around the plant.
D'coda Dcoda

Risk below 100 mSv is so low you cannot measure it [23Oct11] - 0 views

  • Risk below 100 mSv is so low you cannot measure it, by Rod Adams, October 15, 2011, in Health Effects, LNT, Nuclear Communications. One of my favorite jokes about the difference between scientists and engineers is the one in which a scientist and an engineer are both put into a room with a pot of gold on the other side. They are given the rules of the challenge – the gold will be given to the person who reaches it first. There is one caveat – each contestant is limited to moving only halfway to the goal with each turn. The scientist gives up and claims that the goal is unreachable because the distance to the gold will never be zero. The engineer walks across the room, picks up the pot of gold and says – “I may not be able to get there, but I can get close enough.” During the question and answer session following the presentations at the American Chamber of Commerce in Japan (ACCJ) meeting on food safety, Dr. Allison, a life-long scientist, proves that some scientists recognize that close is often good enough. As he says in answer to a lengthy question from the audience, the risk from a dose of 100 mSv each year may not be zero. However, the life-span survivor studies of the victims of Hiroshima and Nagasaki show that it is so close to zero that it is impossible to measure. That study included a population of approximately 100,000 people monitored carefully for more than 50 years. It is difficult to conceive of a larger or better-followed study group.
D'coda Dcoda

CPS must die [24Oct07] - 0 views

  • Collectively, Texas eats more energy than any other state, according to the U.S. Department of Energy. We’re fifth in the country when it comes to our per-capita energy intake — about 532 million British Thermal Units per year. A British Thermal Unit, or Btu, is like a little “bite” of energy. Imagine a wooden match burning and you’ve got a Btu on a stick. Of course, the consumption is with reason. Texas, home to a quarter of the U.S. domestic oil reserves, is also bulging with the second-highest population and a serious petrochemical industry. In recent years, we managed to turn ourselves into the country’s top producer of wind energy. Despite all the chest-thumping that goes on in these parts about those West Texas wind farms (hoist that foam finger!), we are still among the worst in how we use that energy. Though not technically “Southern,” Texans guzzle energy like true rednecks. Each of our homes uses, on average, about 14,400 kilowatt hours per year, according to the U.S. Energy Information Administration. It doesn’t all have to do with the A/C, either. Arizonans, generally agreed to be sharing the heat, typically use about 12,000 kWh a year; New Mexicans cruise in at an annual 7,200 kWh. Don’t even get me started on California’s mere 6,000 kWh/year figure.
  • Let’s break down that kilowatt-hour thing. A watt is the energy of one candle burning down. (You didn’t put those matches away, did you?) A kilowatt is a thousand burnin’ candles. And a kilowatt hour? I think you can take it from there. We’re wide about the middle in Bexar, too. The average CPS customer used 1,538 kilowatt hours this June when the state average was 1,149 kWh, according to ERCOT. Compare that with Austin residents’ 1,175 kWh and San Marcos residents’ 1,130 kWh, and you start to see something is wrong. So, we’re wasteful. So what? For one, we can’t afford to be. Maybe back when James Dean was lusting under a fountain of crude we had if not reason, an excuse. But in the 1990s Texas became a net importer of energy for the first time. It’s become a habit, putting us behind the curve when it comes to preparing for that tightening energy crush. We all know what happens when growing demand meets an increasingly scarce resource … costs go up. As the pressure drop hits San Anto, there are exactly two ways forward. One is to build another massively expensive power plant. The other is to transform the whole frickin’ city into a de-facto power plant, where energy is used as efficiently as possible and blackouts simply don’t occur.
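For readers who want to check the comparisons above, they reduce to a couple of unit conversions. A back-of-the-envelope sketch — the state and household figures are those quoted in the article; the Btu-to-kWh constant is the standard conversion factor:

```python
# Back-of-the-envelope check of the energy figures quoted above.
KWH_PER_MILLION_BTU = 293.07  # standard conversion: 1 million Btu ~ 293 kWh

# Texas per-capita energy intake, all sources (not just electricity):
per_capita_kwh = 532 * KWH_PER_MILLION_BTU  # 532 million Btu/year
print(f"Texas per-capita intake: ~{per_capita_kwh:,.0f} kWh/year equivalent")

# Average annual household electricity use, kWh (figures from the article):
homes = {"Texas": 14_400, "Arizona": 12_000, "New Mexico": 7_200, "California": 6_000}
for state, kwh in homes.items():
    print(f"{state}: {kwh:,} kWh/year ({kwh / homes['Texas']:.0%} of the Texas figure)")
```

The household gap is stark even before climate adjustments: the average California home uses well under half the electricity of the average Texas home.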
  • Consider: South Texas Project Plants 1&2, which send us almost 40 percent of our power, were supposed to cost $974 million. The final cost on that pair ended up at $5.5 billion. If the planned STP expansion follows the same inflationary trajectory, the price tag would wind up over $30 billion. Applications for the Matagorda County plants were first filed with the Atomic Energy Commission in 1974. Building began two years later. However, in 1983 there was still no plant, and Austin, a minority partner in the project, sued Houston Power & Lighting for mismanagement in an attempt to get out of the deal. (Though it tried to sell its share several years ago, the city of Austin remains a 16-percent partner, but has chosen not to commit to current expansion plans.)
  • ...17 more annotations...
  • CPS didn’t just pull nukes out of a hat when it went looking for energy options. CEO Milton Lee may be intellectually lazy, but he’s not stupid. Seeking to fulfill the cheap power mandate in San Antonio and beyond (CPS territory covers 1,566 square miles, reaching past Bexar County into Atascosa, Bandera, Comal, Guadalupe, Kendall, Medina, and Wilson counties), staff laid natural gas, coal, renewables and conservation, and nuclear side-by-side and proclaimed nukes triumphant. Coal is cheap upfront, but it’s helplessly foul; natural gas, approaching the price of whiskey, is out; and green solutions just aren’t ready, we’re told. The 42-member Nuclear Expansion Analysis Team, or NEAT, proclaimed “nuclear is the lowest overall risk considering possible costs and risks associated with it as compared to the alternatives.” Hear those crickets chirping?
  • NEAT members would hold more than a half-dozen closed-door meetings before the San Antonio City Council got a private briefing in September. When the CPS board assembled October 1 to vote the NRG partnership up or down, CPS executives had already joined the application pending with the U.S. Nuclear Regulatory Commission. A Supplemental Participation Agreement allowed NRG to move quickly in hopes of cashing in on federal incentives while giving San Antonio time to gather its thoughts. That proved not too difficult. Staff spoke of “overwhelming support” from the Citizen’s Advisory Board and easy relations with City staff. “So far, we haven’t seen any fatal flaws in our analysis,” said Mike Kotera, executive vice president of energy development for CPS. With boardmember and Mayor Phil Hardberger still in China inspecting things presumably Chinese, the vote was reset for October 29.
  • No one at the meeting asked about cost, though the board did request a month-by-month analysis of the fiasco that has been the South Texas Project 1&2 to be delivered at Monday’s meeting. When asked privately about cost, several CPS officers said they did not know what the plants would run, and the figure — if it were known — would not be public since it is the subject of contract negotiations. “We don’t know yet,” said Bob McCullough, director of CPS’s corporate communications. “We are not making the commitment to build the plant. We’re not sure at this point we really understand what it’s going to cost.” The $206 million outlay the board will consider on Monday is not to build the pair of 1,300-megawatt, Westinghouse Advanced Boiling Water Reactors. It is also not a contract to purchase power, McCullough said. It is merely to hold a place in line for that power.
  • “It’s likely that we would come on a recurring basis back to the board to keep them apprised of where we are and also the decision of whether or not we think it makes sense for us to go forward,” said Larry Blaylock, director of CPS’s Nuclear Oversight & Development. So, at what point will the total cost of the new plants become transparent to taxpayers? CPS doesn’t have that answer. “At this point, it looks like in order to meet our load growth, nuclear looks like our lowest-risk choice and we think it’s worth spending some money to make sure we hold that place in line,” said Mark Werner, director of Energy Market Operations.
  • Another $10 million request for “other new nuclear project opportunities” will also come to the board Monday. That request summons to mind a March meeting between CPS officials and Exelon Energy reps, followed by a Spurs playoff game. Chicago-based Exelon, currently being sued in Illinois for allegedly releasing millions of gallons of radioactive wastewater beneath an Illinois plant, has its own nuclear ambitions for Texas. South Texas Project The White House champions nuclear, and strong tax breaks and subsidies await those early applicants. Whether CPS qualifies for those millions remains to be seen. We can only hope.
  • CPS has opted for the Super Honkin’ Utility model. Not only that — quivering on the brink of what could be a substantial efficiency program, CPS took a leap into our unflattering past when it announced it hopes to double our nuclear “portfolio” by building two new nuke plants in Matagorda County. The utility joined New Jersey-based NRG Energy in a permit application that could fracture an almost 30-year moratorium on nuclear power plant creation in the U.S.
  • After Unit 1 came online in 1988, it had to be shut down after a water-pump shaft sheared off in May, showering debris “all over the place,” according to Nucleonics Week. The next month two breakers failed during a test of backup power, leading to an explosion that sheared off a steam-generator pump and shot the shaft into the station yard. After the second unit went online the next year, there was a series of fires and failures leading to a half-million-dollar federal fine in 1993 against Houston Power. Then the plant went offline for 14 months. Not the glorious launch the partnership had hoped for. Today, CPS officials still do not know how much STP has cost the city, though they insist overall it has been a boon worth billions. “It’s not a cut-and-dried analysis. We’re doing what we can to try to put that in terms that someone could share and that’s a chore,” said spokesman McCullough. CPS has appealed numerous Open Records requests by the Current to the state Attorney General. The utility argues that despite being owned by the City it is not required to reveal, for instance, how much it may cost to build a plant or even how much pollution a plant generates, since the electricity market is a competitive field.
  • How do we usher in this new utopia of decentralized power? First, we have to kill CPS and bury it — or the model it is run on, anyway. What we resurrect in its place must have sustainability as its cornerstone, meaning that the efficiency standards the City and the utility have been reaching for must be rapidly eclipsed. Not only are new plants not the solution, they actively misdirect needed dollars away from the answer. Whether we commit $500 million to build a new-fangled “clean-coal” power plant or choose to feed multiple billions into a nuclear quagmire, we’re eliminating the most plausible option we have: rapid decentralization.
  • A 2003 study at the Massachusetts Institute of Technology estimates the cost of nuclear power to exceed that of both coal and natural gas. A U.S. Energy Information Administration report last year found that will still be the case when and if new plants come online in the next decade. If ratepayers don’t pay going in with nuclear, they can bet on paying on the way out, when virtually the entire power plant must be disposed of as costly radioactive waste. The federal government’s inability to develop a repository for the tens of thousands of tons of nuclear waste means reactors across the country are storing spent fuel in onsite holding ponds. It is unclear if the waste’s lethality and tens of thousands of years of radioactivity were factored into NEAT’s glowing analysis.
  • The federal dump choice, Nevada’s Yucca Mountain, is expected to cost taxpayers more than $60 billion. If it opens, Yucca will be full by the time STP 3&4 are finished, requiring another federal dump and another trainload of greenbacks. Just the cost of Yucca’s fence would set you back. Add the price of replacing a chain-link fence around, let’s say, a 100-acre waste site. Now figure you’re gonna do that every 50 years for 10,000 years or more. Security guards cost extra. That is not to say that the city should skip back to the coal mine. Thankfully, we don’t need nukes or coal, according to the American Council for an Energy-Efficient Economy, a D.C.-based non-profit that champions energy efficiency. A collection of reports released this year argue that a combination of ramped-up efficiency programs, construction of numerous “combined heat and power” facilities, and installation of on-site renewable energy resources would allow the state to avoid building new power plants. Texas could save $73 billion in electric generation costs by spending $50 billion between now and 2023 on such programs, according to the research group. The group also claims the efficiency revolution would even be good for the economy, creating 38,300 jobs. If ACEEE is even mostly right, plans to start siphoning millions into a nuclear reservoir look none too inspired.
  • To jump tracks will take a major conversion experience inside CPS and City Hall, a turning from the traditional model of towering plants, reels of transmission line, and jillions of dependent consumers. CPS must “decentralize” itself, as cities as close as Austin and as far away as Seattle are doing. It’s not only economically responsible and environmentally sound, but it is the best way to protect our communities entering what is sure to be a harrowing century. Greening CPS: CPS is grudgingly going greener. In 2004, a team of consultants, including Wisconsin-based KEMA Inc., hired to review CPS operations pegged the utility as “a company in transition.” Executives interviewed didn’t understand efficiency as a business model. Even some managers tapped to implement conservation programs said such programs were about “appearing” concerned, according to KEMA’s findings.
  • While the review exposed some philosophical shortcomings, it also revealed for the first time how efficiency could transform San Antonio. It was technically possible, for instance, for CPS to cut electricity demand by 1,935 megawatts in 10 years through efficiency alone. While that would be accompanied with significant economic strain, a less-stressful scenario could still cut 1,220 megawatts in that period — eliminating 36 percent of 2014’s projected energy use. CPS’s current plans call for investing $96 million to achieve a 225-megawatt reduction by 2016. The utility plans to spend more than four times that much by 2012 upgrading pollution controls at the coal-fired J.T. Deely power plant.
  • In hopes of avoiding the construction of Spruce 2 (now being built, a marvel of cleanliness, we are assured), Citizen Oversight Committee members asked KEMA if it were possible to eliminate 500 megawatts from future demand through energy efficiency alone. KEMA reported back that, yes, indeed it was possible, but would represent an “extreme” operation and may have “unintended consequences.” Such an effort would require $620 million and include covering 90 percent of the cost of efficiency products for customers. But an interesting thing happens under such a model — the savings don’t end in 2012. They stretch on into the future. The 504 megawatts that never had to be generated in 2012 end up saving 62 new megawatts of generation in 2013 and another 53 megawatts in 2014. With a few tweaks on the efficiency model, not only can we avoid new plants, but a metaphorical flip of the switch can turn the entire city into one great big decentralized power generator.
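The compounding the article describes is simple to tally. A minimal sketch using only the year-by-year megawatt figures quoted above for the “extreme” KEMA scenario:

```python
# Cumulative avoided generation under the KEMA efficiency scenario,
# using the megawatt figures quoted in the article.
avoided_mw = {2012: 504, 2013: 62, 2014: 53}

running_total = 0
for year in sorted(avoided_mw):
    running_total += avoided_mw[year]
    print(f"by {year}: {running_total} MW of generation avoided")
```

The point of the tally is that efficiency savings stack: each year's never-built megawatts are avoided again in every subsequent year, unlike a one-time plant purchase.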
  • Even without good financial data, the Citizen’s Advisory Board has gone along with the plan for expansion. The board would be “pennywise and pound foolish” not to, since the city is already tied to STP 1&2, said at-large member Jeannie O’Sullivan. “Yes, in the past the board of CPS had been a little bit not as for taking on a [greater] percentage of nuclear power. I don’t know what their reasons were, I think probably they didn’t have a dialogue with a lot of different people,” O’Sullivan said.
  • For this, having a City-owned utility offers an amazing opportunity and gives us the flexibility to make most of the needed changes without state or federal backing. “Really, when you start looking, there is a lot more you can do at the local level,” said Neil Elliott of the ACEEE, “because you control building codes. You control zoning. You can control siting. You can make stuff happen at the local level that the state really doesn’t have that much control of.” One of the most empowering options for homeowners is homemade energy provided by a technology like solar. While CPS has expanded into the solar incentives field this year, making it only the second utility in the state to offer rebates on solar water heaters and rooftop panels, the incentives for those programs are limited. Likewise, the $400,000 CPS is investing at the Pearl Brewery in a joint solar “project” is nice as a white tiger at a truck stop, but what is truly needed is to heavily subsidize solar across the city to help kickstart a viable solar industry in the state. The tools of energy generation, as well as the efficient use of that energy, must be spread among the home and business owners.
  • Joel Serface, with bulb-polished pate and heavy gaze, refers to himself as a “product of the oil shock” who first discovered renewables at Texas Tech’s summer “geek camp.” The possibilities stayed with him through his days as a venture capitalist in Silicon Valley and eventually led him to Austin to head the nation’s first clean-energy incubation center. Serface made his pitch at a recent Solar San Antonio breakfast by contrasting Texas with those sun-worshipping Californians. Energy prices, he says, are “going up. They’re not going down again.” That fact makes alternative energies like solar, just starting to crack the 10-cent-per-kilowatt barrier, financially viable. “The question we have to solve as an economy is, ‘Do we want to be a leader in that, or do we want to allow other countries [to outpace us] and buy this back from them?’” he asked.
  • To remain an energy leader, Texas must rapidly exploit solar. Already, we are fourth down the list when it comes not only to solar generation, but also patents issued and federal research awards. Not surprisingly, California is kicking silicon dust in our face.
D'coda Dcoda

Expert: Radioactive materials reached Kanto via 2 routes [28Oct11] - 0 views

  • Radioactive materials from the damaged Fukushima No. 1 nuclear plant reached the Kanto region mainly via two routes, but they largely skirted the heavily populated areas of Tokyo and Kanagawa Prefecture, an expert said. Relatively high levels of radioactive cesium were detected in soil in northern Gunma and Tochigi prefectures and southern Ibaraki Prefecture after the Fukushima No. 1 nuclear power plant was damaged by the March 11 Great East Japan Earthquake and tsunami. But contamination was limited in Tokyo and Kanagawa Prefecture, where 22 million people live. Hiromi Yamazawa, a professor of environmental radiology at Nagoya University, said the first radioactive plume moved through Ibaraki Prefecture and turned northward to Gunma Prefecture between late March 14 and the afternoon of March 15.
  • Large amounts of radioactive materials were released during that period partly because the core of the No. 2 reactor at the Fukushima No. 1 plant was exposed. "The soil was likely contaminated after the plume fell to the ground with rain or snow," Yamazawa said, adding that western Saitama Prefecture and western Tokyo may have been also contaminated. Rain fell in Fukushima, Tochigi and Gunma prefectures from the night of March 15 to the early morning of March 16, according to the Meteorological Agency. The second plume moved off Ibaraki Prefecture and passed through Chiba Prefecture between the night of March 21 and the early morning of March 22, when rain fell in a wide area of the Kanto region, according to Yamazawa's estimates.
  • He said the plume may have created radiation hot spots in coastal and southern areas of Ibaraki Prefecture as well as around Kashiwa, Chiba Prefecture. Yamazawa said the plume continued to move southward, without approaching Tokyo or Kanagawa Prefecture, probably because winds flowed toward a low-pressure system south of the Boso Peninsula. "It rained slightly because the low-pressure system was not strong," said Takehiko Mikami, a professor of climatology at Teikyo University. "Contamination in central Tokyo might have been more serious if (the plume) had approached more inland areas." According to calculations by The Asahi Shimbun, about 13,000 square kilometers, or about 3 percent of Japan's land area, including about 8,000 square kilometers in Fukushima Prefecture, have annual exposure levels of 1 millisievert or more.
  • ...3 more annotations...
  • Gunma and Tochigi prefectures have a combined 3,800 square kilometers with an annual exposure of 1 millisievert or more. Among Tokyo's 23 wards, Katsushika Ward had the highest radiation level of 0.33 microsievert per hour, according to a science ministry map showing radioactive contamination for 12 prefectures. The ward government has been measuring radiation levels in seven locations once a week since late May. It plans to take measurements at about 500 public facilities, such as schools and parks, in response to residents' demands for detailed surveys.
  • The Gunma prefectural government has measured radiation levels in 149 locations since September and has identified six northern mountainous municipalities with an annual exposure of 1 millisievert or more. Earlier this month, the prefectural government asked 35 municipalities to decide whether radioactive materials will be removed. High radiation levels were detected in Minakami, Gunma Prefecture, known as a hot spring resort. Mayor Yoshimasa Kishi said the town could be mistaken as a risky place if it decides to have radioactive materials removed. The science ministry's map showed that 0.2 to 0.5 microsievert was detected in some locations in Niigata Prefecture. Niigata Governor Hirohiko Izumida said the figures were likely mistaken, noting that these locations have high natural radiation levels because of granite containing radioactive materials.
  • The prefectural government plans to conduct its own surveys of airborne radiation levels and soil contamination. Many municipalities are calling for financial support for removing radioactive materials. In Kashiwa and five other cities in northern Chiba Prefecture, radioactive materials need to be removed over an estimated 180 square kilometers of mainly residential areas. The Kashiwa city government is providing up to 200,000 yen ($2,620) to kindergartens and nursery schools for removal work. But some facilities have asked children's parents to help pay the costs because they cannot be covered by the municipal assistance.
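The article switches between hourly readings (microsieverts per hour) and annual thresholds (millisieverts per year). A naive conversion — one that ignores the shielding and indoor-occupancy corrections real surveys apply, so it overstates actual dose — shows why readings like Katsushika Ward's draw attention:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760

def naive_annual_msv(usv_per_hour: float) -> float:
    """Annual dose from a constant hourly reading, with no shielding or
    indoor-occupancy correction (real surveys apply both)."""
    return usv_per_hour * HOURS_PER_YEAR / 1000  # microsieverts -> millisieverts

print(f"0.33 uSv/h -> {naive_annual_msv(0.33):.1f} mSv/year")  # Katsushika Ward reading
print(f"0.20 uSv/h -> {naive_annual_msv(0.20):.1f} mSv/year")  # low end of Niigata range
```

Even the low end of the Niigata range exceeds the 1 mSv/year threshold on this naive basis, which is consistent with the governor's point that natural background from granite can account for such readings.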
Dan R.D.

Energy Demand Will Push Development of Nuclear Power - WSJ.com [24Oct11] - 0 views

  • It has been two years since Mohamed ElBaradei stepped down as head of the United Nations' nuclear watchdog, but the Nobel peace laureate still has nuclear technology very much on his mind.
  • But Mr. ElBaradei doesn't subscribe to the widely held view that Fukushima has killed off the nuclear industry for the foreseeable future. In fact, he argues countries exiting nuclear-power generation are the exception rather than the rule. "There will be, in the short term, a slowdown in some countries. But others like France, India or China [won't see] an impact on their [nuclear] programs," he says.
  • He also points to some nuclear newcomers, such as the United Arab Emirates and Turkey.
  • ...4 more annotations...
  • Further development of nuclear power is guaranteed by the exponential global growth in energy demand, he says, pointing to a study by the U.S. Energy Information Administration estimating global electricity-generation growth of 87% by 2035 as the world's population grows.
  • But while he argues the planet has to live with nuclear energy, he acknowledges that it carries risk. "Nuclear energy as with any technology has always a risk. You have to balance the costs and benefits," he says.
  • "People need to take safety much more seriously than in the past. I've suggested a number of things that need to be done: a mandatory peer review by experts on every facility, an overall review of all nuclear plants both civilian and military."
  • "People are hypersensitive to anything nuclear, to radioactivity. You don't know how it will impact you. The nuclear industry has to take that into account. They have to go out of their way to make sure that it is as safe as possible. We have to design nuclear-power reactors not just for the worst-case scenario but for the seemingly impossible," he says.
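The 87% growth projection Mr. ElBaradei cites implies a fairly modest compound annual rate. A one-line check — note the 2011 baseline year is an assumption of mine, taken from the article's date rather than stated in the EIA study:

```python
# Implied compound annual growth rate behind "87% growth by 2035".
baseline_year = 2011  # assumption: projection measured from the article's date
years = 2035 - baseline_year
cagr = 1.87 ** (1 / years) - 1
print(f"87% growth over {years} years is ~{cagr:.1%} per year")
```

A growth rate in the neighborhood of 2-3% per year compounds to near-doubling over a quarter century, which is the arithmetic behind his claim that demand guarantees further nuclear development.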
D'coda Dcoda

Scientists Radically Raise Estimates of Fukushima Fallout [25Oct11] - 0 views

  • The disaster at the Fukushima Daiichi nuclear plant in March released far more radiation than the Japanese government has claimed. So concludes a study that combines radioactivity data from across the globe to estimate the scale and fate of emissions from the shattered plant. The study also suggests that, contrary to government claims, pools used to store spent nuclear fuel played a significant part in the release of the long-lived environmental contaminant caesium-137, which could have been prevented by prompt action. The analysis has been posted online for open peer review by the journal Atmospheric Chemistry and Physics.
  • Andreas Stohl, an atmospheric scientist with the Norwegian Institute for Air Research in Kjeller, who led the research, believes that the analysis is the most comprehensive effort yet to understand how much radiation was released from Fukushima Daiichi. "It's a very valuable contribution," says Lars-Erik De Geer, an atmospheric modeller with the Swedish Defense Research Agency in Stockholm, who was not involved with the study. The reconstruction relies on data from dozens of radiation monitoring stations in Japan and around the world. Many are part of a global network to watch for tests of nuclear weapons that is run by the Comprehensive Nuclear-Test-Ban Treaty Organization in Vienna. The scientists added data from independent stations in Canada, Japan and Europe, and then combined those with large European and American caches of global meteorological data.
  • Stohl cautions that the resulting model is far from perfect. Measurements were scarce in the immediate aftermath of the Fukushima accident, and some monitoring posts were too contaminated by radioactivity to provide reliable data. More importantly, exactly what happened inside the reactors — a crucial part of understanding what they emitted — remains a mystery that may never be solved. "If you look at the estimates for Chernobyl, you still have a large uncertainty 25 years later," says Stohl. Nevertheless, the study provides a sweeping view of the accident. "They really took a global view and used all the data available," says De Geer.
  • ...7 more annotations...
  • Challenging numbers Japanese investigators had already developed a detailed timeline of events following the 11 March earthquake that precipitated the disaster. Hours after the quake rocked the six reactors at Fukushima Daiichi, the tsunami arrived, knocking out crucial diesel back-up generators designed to cool the reactors in an emergency. Within days, the three reactors operating at the time of the accident overheated and released hydrogen gas, leading to massive explosions. Radioactive fuel recently removed from a fourth reactor was being held in a storage pool at the time of the quake, and on 14 March the pool overheated, possibly sparking fires in the building over the next few days.
  • But accounting for the radiation that came from the plants has proved much harder than reconstructing this chain of events. The latest report from the Japanese government, published in June, says that the plant released 1.5 × 10¹⁶ becquerels of caesium-137, an isotope with a 30-year half-life that is responsible for most of the long-term contamination from the plant. A far larger amount of xenon-133, 1.1 × 10¹⁹ Bq, was released, according to official government estimates.
  • Stohl believes that the discrepancy between the team's results and those of the Japanese government can be partly explained by the larger data set used. Japanese estimates rely primarily on data from monitoring posts inside Japan, which never recorded the large quantities of radioactivity that blew out over the Pacific Ocean, and eventually reached North America and Europe. "Taking account of the radiation that has drifted out to the Pacific is essential for getting a real picture of the size and character of the accident," says Tomoya Yamauchi, a radiation physicist at Kobe University who has been measuring radioisotope contamination in soil around Fukushima. Stohl adds that he is sympathetic to the Japanese teams responsible for the official estimate. "They wanted to get something out quickly," he says. The differences between the two studies may seem large, notes Yukio Hayakawa, a volcanologist at Gunma University who has also modelled the accident, but uncertainties in the models mean that the estimates are actually quite similar.
  • The new study challenges those numbers. On the basis of its reconstructions, the team claims that the accident released around 1.7 × 10¹⁹ Bq of xenon-133, greater than the estimated total radioactive release of 1.4 × 10¹⁹ Bq from Chernobyl. The fact that three reactors exploded in the Fukushima accident accounts for the huge xenon tally, says De Geer. Xenon-133 does not pose serious health risks because it is not absorbed by the body or the environment. Caesium-137 fallout, however, is a much greater concern because it will linger in the environment for decades. The new model shows that Fukushima released 3.5 × 10¹⁶ Bq caesium-137, roughly twice the official government figure, and half the release from Chernobyl. The higher number is obviously worrying, says De Geer, although ongoing ground surveys are the only way to truly establish the public-health risk.
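The ratios behind these competing claims can be checked directly. A short sketch using only the becquerel figures quoted above:

```python
# Emission estimates quoted above, in becquerels (Bq).
official_cs137 = 1.5e16    # Japanese government estimate for caesium-137
stohl_cs137 = 3.5e16       # Stohl et al. reconstruction for caesium-137
fukushima_xe133 = 1.7e19   # Stohl et al. xenon-133 estimate
chernobyl_total = 1.4e19   # estimated total radioactive release from Chernobyl

print(f"caesium-137: study / official estimate = {stohl_cs137 / official_cs137:.1f}x")
print(f"xenon-133: Fukushima / Chernobyl total = {fukushima_xe133 / chernobyl_total:.2f}x")
```

The caesium ratio comes out nearer 2.3x than the "roughly twice" in the text, and the xenon figure alone exceeds Chernobyl's estimated total release, which is why the study's headline claim drew attention.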
  • The new analysis also claims that the spent fuel being stored in the unit 4 pool emitted copious quantities of caesium-137. Japanese officials have maintained that virtually no radioactivity leaked from the pool. Yet Stohl's model clearly shows that dousing the pool with water caused the plant's caesium-137 emissions to drop markedly (see 'Radiation crisis'). The finding implies that much of the fallout could have been prevented by flooding the pool earlier. The Japanese authorities continue to maintain that the spent fuel was not a significant source of contamination, because the pool itself did not seem to suffer major damage. "I think the release from unit 4 is not important," says Masamichi Chino, a scientist with the Japanese Atomic Energy Authority in Ibaraki, who helped to develop the Japanese official estimate. But De Geer says the new analysis implicating the fuel pool "looks convincing".
  • The latest analysis also presents evidence that xenon-133 began to vent from Fukushima Daiichi immediately after the quake, and before the tsunami swamped the area. This implies that even without the devastating flood, the earthquake alone was sufficient to cause damage at the plant.
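The half-life and release figures in the excerpts above can be sanity-checked with a short decay calculation. A minimal sketch follows; the release values are those quoted in the article, except the Chernobyl caesium total, which is only inferred from the statement that Fukushima's release was half of Chernobyl's.

```python
HALF_LIFE_CS137 = 30.17  # years (the article rounds this to 30)

def remaining_fraction(years, half_life=HALF_LIFE_CS137):
    """Fraction of the original caesium-137 activity left after `years` of decay."""
    return 0.5 ** (years / half_life)

# Caesium-137 release estimates quoted in the article, in becquerels (Bq):
fukushima_official = 1.5e16  # Japanese government estimate
fukushima_stohl = 3.5e16     # Stohl et al. reconstruction
chernobyl = 7.0e16           # inferred: article says Fukushima released half as much

print(remaining_fraction(HALF_LIFE_CS137))   # one half-life elapsed -> 0.5
print(fukushima_stohl / fukushima_official)  # ~2.33, "roughly twice" the official figure
print(fukushima_stohl / chernobyl)           # 0.5, "half the release from Chernobyl"
```

After one 30-year half-life, half the caesium-137 activity remains, which is why the article treats it as the dominant long-term contaminant while xenon-133 (half-life about five days) matters mainly for the total release tally.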

    The Japanese government's report has already acknowledged that the shaking at Fukushima Daiichi exceeded the plant's design specifications. Anti-nuclear activists have long been concerned that the government has failed to adequately address geological hazards when licensing nuclear plants (see Nature 448, 392–393; 2007), and the whiff of xenon could prompt a major rethink of reactor safety assessments, says Yamauchi.

  • The model also shows that the accident could easily have had a much more devastating impact on the people of Tokyo. In the first days after the accident the wind was blowing out to sea, but on the afternoon of 14 March it turned back towards shore, bringing clouds of radioactive caesium-137 over a huge swathe of the country (see 'Radioisotope reconstruction'). Where precipitation fell, along the country's central mountain ranges and to the northwest of the plant, higher levels of radioactivity were later recorded in the soil; thankfully, the capital and other densely populated areas had dry weather. "There was a period when quite a high concentration went over Tokyo, but it didn't rain," says Stohl. "It could have been much worse." 
D'coda Dcoda

Scanning the Earth Project - [26Oct11] - 0 views

shared by D'coda Dcoda on 26 Oct 11
  • The Scanning the Earth Project (an environmental-scanning project) provides environmental information, including radiation dose rates, and currently aggregates data collected together with Safecast. The project deploys fixed and mobile sensors to monitor the human living environment and builds a platform for sharing the sensor data using information technology. It is also developing data-visualization and spatial-interpolation techniques in order to provide comprehensive information across time and space. Concretely, it combines fixed-point monitoring stations with car-mounted dose-rate instruments, aiming to create a sustainable platform for radiation information. Sensed data are stored on servers via the Internet and made public through a Web API, while spatio-temporal analyses of the sensor data are visualized and published on a portal site.
  • This study covers the following major research areas. 1. Development of networked sensing devices: network-capable devices that measure radiation dose and weather information, together with a data dictionary for collecting varied ground observations, a mechanism for device authentication, and standardization of the communication protocols the devices use. 2. Development of the sensor network: technology for collecting the data measured by the sensors, including DTN-style protocols for gathering data from mobile sensors and standardized protocols for coordination between sensors and servers. 3. Development of spatial analysis: there is a limit to how densely fixed and mobile sensors can be deployed, so the project develops techniques that interpolate between measurement points based on the characteristics of each location, and is considering an API for providing that information widely. 4. Development of visualization techniques.
  • To take advantage of the sensing data, meaningful visualization is essential. Some sensing information is best visualized in zero dimensions, some in one dimension, some in two dimensions, and some in three. The project therefore devises visualization techniques suited to the characteristics of each kind of data. Contact: ste-info_at_sfc.wide.ad.jp
  • About the Scanning the Earth Project: The Scanning the Earth Project is a project to disseminate environmental information, starting with air radiation dose rates, in collaboration with SafeCast. This research project will use a sensor platform of both stationary and mobile sensors to monitor the air around human populations, then share that information via communication technologies. It will also develop data interpolation and visualization techniques to provide comprehensive information over time. Specifically, the project will employ both fixed and bicycle-mounted geiger counters to create a platform for continual radiation measurement. The collected information will be transmitted via the internet to servers and made public via a web API. Finally, the project aims to simultaneously analyze readings and create visualizations of the data to spread information on environmental conditions via a portal site. This project's major research aims are as follows: 1. The development of networked sensing devices. These networked devices will monitor radiation and meteorological conditions. We will make provisions for a data repository to gather varied atmospheric information and develop a framework for certifying scanning devices. We will also develop a standardized transmission protocol for these devices. 2. The development of sensor network technology. We will also develop a DTN protocol for gathering information from mobile sensors and a standard coordination protocol for servers.
  • 3. The development of air analysis technology. There is a limit to what can be done with stationary and moving sensors. To cover all areas, we will develop methods for interpolating data from existing readings. We intend to develop an API for sharing this information as well. 4. The development of visualization technology. In order for people to take advantage of the sensing data, easy-to-understand visualizations of those data are necessary. Some scanning data are best visualized with zero-dimensional displays, some with one-dimensional, some with two-dimensional, and some in three dimensions. This project aims to develop visualization methods for each of these circumstances. Contact: ste-info_at_sfc.wide.ad.jp
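The spatial-analysis aim described above, interpolating dose rates between sparse sensor locations, can be sketched with inverse-distance weighting. This is one common interpolation choice, not necessarily the method the project actually uses, and the sensor positions and readings below are hypothetical:

```python
import math

def idw(readings, x, y, power=2.0):
    """Inverse-distance-weighted estimate of the dose rate at (x, y),
    given (x, y, value) sensor readings: one simple way to interpolate
    between fixed and mobile measurement points."""
    num = den = 0.0
    for sx, sy, value in readings:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return value  # query point sits exactly on a sensor
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical dose-rate readings in microsieverts/hour at (x, y) positions:
sensors = [(0.0, 0.0, 0.12), (1.0, 0.0, 0.30), (0.0, 1.0, 0.18)]
print(round(idw(sensors, 0.5, 0.5), 3))  # -> 0.2 (all three sensors equidistant)
```

Nearby sensors dominate the estimate because weights fall off as 1/d^power; raising `power` makes the interpolated surface hug individual readings more tightly.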
  • The university is working with Safecast on deploying sensors to track radiation.