
D'coda Dcoda

Scanning the Earth Project [26Oct11]

shared by D'coda Dcoda on 26 Oct 11
  • The Scanning the Earth Project is an environmental-sensing effort that provides environmental information, beginning with radiation dose rates, and currently collects data together with SafeCast. The project deploys both fixed and mobile sensors in inhabited areas and is building a platform for sharing the resulting sensor data using information technology. It is also developing data-visualization and spatial-interpolation techniques to provide comprehensive information across time and space. Concretely, it combines fixed-point monitoring stations with vehicle-mounted dose-rate instruments, aiming to create a sustainable platform for radiation information. Sensor readings are stored on servers via the Internet and opened to the public through a Web API, while spatio-temporal analyses and visualizations of the data are made widely available on a portal site.
  • The project spans four main research areas. 1. Development of networked sensing devices: devices that measure radiation dose and weather information, together with a data dictionary for the varied ground-level information collected, a mechanism for device authentication, and standardization of the communication protocols the devices use. 2. Development of the sensor network: network technology for collecting the measured data, including a DTN protocol for gathering data from mobile sensors and a standardized protocol for coordination with the servers. 3. Development of spatial analysis: since there is a limit to how densely fixed and mobile sensors can be deployed, techniques are being developed to interpolate between measurement points based on the characteristics of each location, along with an API to share that information widely. 4. Development of visualization techniques.
  • Meaningful visualization is essential if people are to make use of the sensing data. Depending on its character, sensed information may best be visualized in zero, one, two, or three dimensions, and the project is devising visualization techniques suited to the characteristics of each. Contact: ste-info_at_sfc.wide.ad.jp
  • ...2 more annotations...
  • About the Scanning the Earth Project: Scanning the Earth is a project to disseminate environmental information, starting with air radioactive dose rates, in collaboration with SafeCast. This research project will use a sensor platform of both stationary and mobile sensors to monitor the air around human populations, then share that information via communication technologies. It will also develop data interpolation and visualization techniques to provide comprehensive information over time. Specifically, the project will employ both fixed and bicycle-mounted geiger counters to create a platform for continual radiation measurement. The collected information will be transmitted via the internet to servers and made public via a web API. Finally, the project aims to simultaneously analyze readings and create visualizations of the data to spread information on environmental conditions via a portal site. This project's major research aims are as follows: 1. The development of networked sensing devices: these networked devices will monitor radiation and meteorological conditions. We will make provisions for a data repository to gather varied atmospheric information and develop a framework for certifying scanning devices. We will also develop a standardized transmission protocol for these devices. 2. The development of sensor network technology: network technology to collect the data measured by the sensors, including a DTN protocol for gathering information from mobile sensors and a standard coordination protocol for servers.
  • 3. The development of air analysis technology There is a limit to what can be done with stationary and moving sensors. To cover all areas, we will develop methods for interpolating data from existing readings. We intend to develop an API for sharing this information as well. 4. The development of visualization technology In order for people to take advantage of the sensing data, easy-to-understand visualizations of those data are necessary. Some scanning data are best visualized with zero-dimensional displays, some with one-dimensional, some with two-dimensional, and some in three dimensions. This project aims to develop visualization methods for each of these circumstances. Contact: ste-info_at_sfc.wide.ad.jp
  •  
    A university project working with Safecast to deploy sensors that track radiation. 
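The excerpts above do not specify the project's interpolation algorithm; as a hedged illustration, inverse-distance weighting is one common way to fill in dose rates between fixed and mobile readings (all coordinates and values below are hypothetical):

```python
import math

def idw_estimate(stations, query, power=2.0):
    """Estimate a dose rate at `query` (lat, lon) by inverse-distance
    weighting of nearby sensor readings (lat, lon, dose)."""
    num = den = 0.0
    for (lat, lon, dose) in stations:
        d = math.hypot(lat - query[0], lon - query[1])
        if d < 1e-9:          # query point coincides with a station
            return dose
        w = d ** -power
        num += w * dose
        den += w
    return num / den

# Hypothetical readings (lat, lon, microsieverts/h) around a query point
readings = [(37.42, 141.03, 2.1), (37.45, 141.00, 1.4), (37.40, 140.98, 0.9)]
print(round(idw_estimate(readings, (37.43, 141.01)), 3))
```

The estimate is always bounded by the nearby readings, which makes IDW a conservative default when, as the project notes, sensor coverage of the space is inherently limited.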
D'coda Dcoda

Impacts of the Fukushima Nuclear Power Plants on Marine Radioactivity - Environmental S...

  • The impacts on the ocean of releases of radionuclides from the Fukushima Dai-ichi nuclear power plants remain unclear. However, information has been made public regarding the concentrations of radioactive isotopes of iodine and cesium in ocean water near the discharge point. These data allow us to draw some basic conclusions about the relative levels of radionuclides released which can be compared to prior ocean studies and be used to address dose consequences as discussed by Garnier-Laplace et al. in this journal.(1) The data show peak ocean discharges in early April, one month after the earthquake and a factor of 1000 decrease in the month following. Interestingly, the concentrations through the end of July remain higher than expected implying continued releases from the reactors or other contaminated sources, such as groundwater or coastal sediments. By July, levels of 137Cs are still more than 10 000 times higher than levels measured in 2010 in the coastal waters off Japan. Although some radionuclides are significantly elevated, dose calculations suggest minimal impact on marine biota or humans due to direct exposure in surrounding ocean waters, though considerations for biological uptake and consumption of seafood are discussed and further study is warranted.
  • there was no large explosive release of core reactor material, so most of the isotopes reported to have spread thus far via atmospheric fallout are primarily the radioactive gases plus fission products such as cesium, which are volatilized at the high temperatures in the reactor core, or during explosions and fires. However, some nonvolatile activation products and fuel rod materials may have been released when the corrosive brines and acidic waters used to cool the reactors interacted with the ruptured fuel rods, carrying radioactive materials into the ground and ocean. The full magnitude of the release has not been well documented, nor are there data on many of the possible isotopes released, but we do have significant information on the concentration of several isotopes of Cs and I in the ocean near the release point which have been publicly available since shortly after the accident started.
  • We present a comparison of selected data made publicly available from a Japanese company and agencies and compare these to prior published radionuclide concentrations in the oceans. The primary sources included TEPCO (Tokyo Electric Power Company), which reported data in regular press releases(3) and are compiled here (Supporting Information Table S1). These TEPCO data were obtained by initially sampling 500 mL of surface ocean water from shore and direct counting on high-purity germanium gamma detectors for 15 min at laboratories at the Fukushima Dai-ni NPPs. They initially reported results for 131I (t1/2 = 8.02 days), 134Cs (t1/2 = 2.065 years) and 137Cs (t1/2 = 30.07 years). Data from MEXT (Ministry of Education, Culture, Sports, Science and Technology—Japan) were also released on a public Web site(4) and are based on similar direct counting methods. In general, MEXT data were obtained by sampling 2000 mL of seawater and direct counting on high-purity germanium gamma detectors for 1 h in a 2 L Marinelli beaker at laboratories in the Japan Atomic Energy Agency. The detection limits of 137Cs measurements are about 20 000 Bq m–3 for TEPCO data and 10 000 Bq m–3 for MEXT data, respectively. These measurements were conducted based on a guideline described by MEXT.(5) Both sources are considered reliable given the common activity ratios, prior studies, and the expertise evident in the several Japanese groups involved in making these measurements. These early monitoring activities were prompted by concern for immediate health effects, and thus values were often reported relative to statutory limits adopted by Japanese authorities (as scaling factors above “normal”) rather than in concentration units. Here we convert values from both sources to radionuclide activity units common to prior ocean studies of fallout in the ocean (Bq m–3) for ease of comparison to previously published data.
  • ...5 more annotations...
  • We focus on the most complete time-series records from the north and south discharge channels at the Dai-ichi NPPs, and two sites to the south that were not considered sources, namely the north discharge channel at the Dai-ni NPPs about 10 km to the south and Iwasawa beach, which is 16 km south of the Dai-ichi NPPs (Figure 1). The levels at the discharge point are exceedingly high, with a peak 137Cs concentration of 68 million Bq m–3 on April 6 (Figure 2). What is significant is not just the elevated concentrations but also the timing of the peak release, approximately one month after the earthquake. This delayed release is presumably due to the complicated pattern of discharge of seawater and fresh water used to cool the reactors and spent fuel rods, interactions with groundwater, and intentional and unintentional releases of mixed radioactive material from the reactor facility.
  • the concentrations of Cs in sediments and biota near the NPPs may be quite large, and will continue to remain so for at least 30–100 years due to the longer half-life of 137Cs which is still detected in marine and lake sediments from 1960s fallout sources.
  • If the source at Fukushima had stopped abruptly and ocean mixing processes continued at the same rates, one would have expected that the 137Cs activities would have decreased by an additional factor of 1000 from May to June, but that was not observed. The break in slope in early May implies that a steady, albeit lower, source of 137Cs continues to discharge to the oceans at least through the end of July at this site. With reports of highly contaminated cooling waters at the NPPs and complete melt-through of at least one of the reactors, this is not surprising. As we have no reason to expect a change in mixing rates of the ocean, which would also impact this dilution rate, this change in slope of 137Cs in early May is clear evidence that the Dai-ichi NPPs remain a significant source of contamination to the coastal waters off Japan. There are currently no data that allow us to distinguish between several possible sources of continued releases, but these most likely include some combination of direct releases from the reactors or storage tanks, or indirect releases from groundwater beneath the reactors or coastal sediments, both of which are likely contaminated from the period of maximum releases.
  • It is prudent to point out though what is meant by “significant” to both ocean waters and marine biota. With respect to prior concentrations in the waters off Japan, all of these values are elevated many orders of magnitude. 137Cs has been tracked quite extensively off Japan since the peak weapons testing fallout years in the early 1960s.(13) Levels in the region east of Japan have decreased from a few 10s of Bq m–3 in 1960 to 1.5 Bq m–3 on average in 2010 (Figure 2; second x-axis). The decrease in 137Cs over this 50 year record reflects both radioactive decay of 137Cs with a 30 year half-life and continued mixing in the global ocean of 137Cs to depth. These data are characteristic of other global water masses.(14) Typical ocean surface 137Cs activities range from <1 Bq m–3 in surface waters in the Southern Hemisphere, which are lower due to lower weapons testing inputs south of the equator, to >10–100 Bq m–3 in the Irish Sea, North Sea, Black Sea, and Baltic Sea, which are elevated due to local sources from the intentional discharges at the nuclear fuel reprocessing facilities at Sellafield in the UK and Cap de la Hague in France, as well as residual 137Cs from Chernobyl in the Baltic and Black Seas. Clearly then on this scale of significance, levels of 137Cs 30 km off Japan were some 3–4 orders of magnitude higher than existed prior to the NPP accidents at Fukushima.
  • Finally though, while the Dai-ichi NPP releases must be considered “significant” relative to prior sources off Japan, we should not assume that dose effects on humans or marine biota are necessarily harmful or even will be measurable. Garnier-Laplace et al.(1) report a dose reconstruction signal for the most impacted areas to wildlife on land and in the ocean. Like this study, they are relying on reported activities to calculate forest biota concentrations…
  •  
    From Woods Hole; note that calculations are based on reports from TEPCO & other Japanese agencies. Quite a bit more to read on the site.
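Two bits of arithmetic implicit in the excerpts above can be made concrete: converting values reported as multiples of a statutory limit into Bq m–3, as the authors do, and checking how much of the 50-year decline in 137Cs off Japan radioactive decay alone explains. A minimal sketch; the 40 Bq/L limit is purely illustrative, not a value from the paper:

```python
def to_bq_per_m3(times_limit, limit_bq_per_l):
    """Convert a value reported as 'N times the statutory limit'
    (limit in Bq/L) into Bq per cubic metre (1 m^3 = 1000 L)."""
    return times_limit * limit_bq_per_l * 1000.0

HALF_LIFE_CS137 = 30.07  # years, as quoted in the excerpt

def decayed(activity_bq_m3, years):
    """Activity remaining after radioactive decay alone."""
    return activity_bq_m3 * 0.5 ** (years / HALF_LIFE_CS137)

# Hypothetical sample reported at 1000x an illustrative 40 Bq/L limit:
print(to_bq_per_m3(1000, 40))        # 40000000.0 Bq/m^3

# 1960-2010 decline off Japan: ~30 Bq/m^3 down to ~1.5 Bq/m^3 observed.
# Decay alone over 50 years would leave:
print(round(decayed(30.0, 50), 2))
```

Decay alone accounts for only about a factor of three of the roughly 20-fold decline, consistent with the excerpt's point that continued mixing of 137Cs to depth drives most of the decrease.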
D'coda Dcoda

Safecast: Global sensor network collects and shares radiation data via CC0 - Creative C...

  • One week after the nuclear disaster at the Fukushima Daiichi plant in March, the Safecast project was born to respond to the information needs of Japanese citizens regarding radiation levels in their environment. Safecast, then known as RDTN.org, started a campaign on Kickstarter “to provide an aggregate feed of nuclear radiation data from governmental, non-governmental and citizen-scientist sources.” All radiation data collected via the project would be dedicated to the public domain using CC0, “available to everyone, including scientists and nuclear experts who can provide context for lay people.” Since then, more than 1.25 million data points have been collected and shared; Safecast has been featured on PBS Newshour; and the project aims to expand its scope to mapping the rest of the world.
  • “Safecast supports the idea that more data – freely available data – is better. Our goal is not to single out any individual source of data as untrustworthy, but rather to contribute to the existing measurement data and make it more robust. Multiple sources of data are always better and more accurate when aggregated. While Japan and radiation is the primary focus of the moment, this work has made us aware of a need for more environmental data on a global level and the longterm work that Safecast engages in will address these needs. Safecast is based in the US but is currently focused on outreach efforts in Japan. Our team includes contributors from around the world.”
  • To learn more, visit http://safecast.org.
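Safecast exposes its CC0 readings through a public web API. The sketch below only builds a query URL for measurements near a point; the endpoint path and parameter names are assumptions based on Safecast's public API, not details from this article:

```python
from urllib.parse import urlencode

API_BASE = "https://api.safecast.org/measurements.json"  # assumed endpoint

def measurements_url(lat, lon, distance_m=1000, page=1):
    """Build a query URL for Safecast measurements near a point.
    Parameter names are assumptions based on the public API."""
    params = {"latitude": lat, "longitude": lon,
              "distance": distance_m, "page": page}
    return API_BASE + "?" + urlencode(params)

print(measurements_url(35.68, 139.76))
```

Fetching that URL with any HTTP client would return a page of measurement records; because the data are CC0, no key or attribution is legally required.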
D'coda Dcoda

4 Ways the Department of Energy Is Tapping Tech for a Greener Future [03Aug11]

  • This week, the U.S. Department of Energy (DOE) re-launched its website, Energy.gov, to provide tools to help individuals and businesses better understand how to save energy and money. You can type your zip code into the site and get hyper-local information about your city, county and state, including information on tax credits, rebates and energy saving tips.
  • The site presents DOE data visually using the open source MapBox suite of tools, and localized data and maps can be shared or embedded on any website or blog. Other data sets the DOE is mapping include alternative fuel locations and per capita energy usage. Anyone can now see how their state’s energy usage compares with others across the country. In addition to making the data more palatable for the public, the DOE is offering open data sets for others to use.
  • “Our goal is simple — to improve the delivery of public services online. We’re using government data to go local in a way that’s never been possible before. We’re connecting the work of the Energy Department with what’s happening in your backyard,” says Cammie Croft, senior advisor and director of new media and citizen engagement at the DOE. “We’re making Energy.gov relevant and accessible to consumers and small businesses in their communities.”
  • ...16 more annotations...
  • How else is the Energy Department working to bring better information about energy, renewable energies and energy technology to the public? Here are a few examples.
  • 1. Your MPG
  • The “Your MPG” feature on the site lets you upload data about your own vehicle’s fuel usage to your “cyber” garage and get a better picture of how your vehicle is doing in terms of energy consumption. The system also aggregates the personal car data from all of the site’s users anonymously so people can share their fuel economy estimates. “You can track your car’s fuel economy over time to see if your efforts to increase MPG are working,” says David Greene, research staffer at Oak Ridge National Lab. “Then you can compare your fuel data with others and see how you are doing relative to those who own the same vehicle.”
  • In the works for the site is a predictive tool you can use when you are in the market for a new or used vehicle to more accurately predict the kind of mileage any given car will give you, based on your particular driving style and conditions. The system, says Greene, reduces the +/- 7 mpg margin of error of standard EPA ratings by about 50% to give you a more accurate estimate of what your MPG will be.
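The tank-to-tank arithmetic behind a feature like “Your MPG” is simple to sketch. The log below is invented for illustration; the real site's data model is not described in the article:

```python
def mpg_history(fillups):
    """Compute fuel economy between successive fill-ups.
    Each entry is (odometer_miles, gallons_added_at_this_fillup)."""
    out = []
    for (prev_odo, _), (odo, gal) in zip(fillups, fillups[1:]):
        out.append((odo - prev_odo) / gal)
    return out

# Hypothetical log: odometer reading and gallons added at each stop
log = [(10000, 0.0), (10300, 10.0), (10630, 11.0)]
print([round(m, 1) for m in mpg_history(log)])   # [30.0, 30.0]
```

Aggregating such per-tank figures anonymously across users is what lets the site compare your economy against owners of the same vehicle.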
  • 2. America’s Next Top Energy Innovator
  • In response to the White House’s Startup America program supporting innovation and entrepreneurship, the Energy Department launched its own version — America’s Next Top Energy Innovator Challenge. The technology transfer program gives startups the chance to license Energy Department technologies developed at the 17 national laboratories across the country at an affordable price. Entrepreneurs can identify Energy Department technologies through the Energy Innovation Portal, where more than 15,000 patents and patent applications are listed along with more than 450 market summaries describing some of the technologies in layman’s terms.
  • Once a company selects the technology of interest to it, it fills out a short template to apply for an option — a precursor to an actual license of the patent — for $1,000. A company can license up to three patents on one technology from a single lab per transaction, and patent fees are deferred for two years. The program also connects entrepreneurs to venture capitalists as mentors.
  • 3. Products: Smarter Windows
  • DOE funding, along with private investments, supports a number of companies including the Michigan-based company Pleotint. Pleotint developed a specialized glass film that uses energy generated by the sun to limit the amount of heat and light going into a building or a home. The technology is called Sunlight Responsive Thermochromic (SRT™), and it involves a chemical reaction triggered by direct sunlight that lightens or darkens the window’s tint. Windows made from this glass technology are designed to change based on specific preset temperatures.
  • Another DOE-funded company, Sage ElectroChromics, created SageGlass®, electronically controlled windows that use small electric charges to switch between clear and tinted windows in response to environmental heat and light conditions. And Soladigm has an electronic tinted glass product that is currently undergoing durability testing.
  • 4. Solar Decathlon
  • Since 2002, the U.S. Department of Energy’s Solar Decathlon has challenged collegiate students to develop solar-powered, highly efficient houses. Student teams build modular houses on campus, dismantle them and then reassemble the structures on the National Mall. The competition has taken place biennially since 2005. Open to the public and free of charge, the next event will take place at the National Mall’s West Potomac Park in Washington, D.C. from September 23 to October 2, 2011. There are 19 teams competing this year.
  • Teams spend nearly two years planning and constructing their houses, incorporating innovative technology to compete in 10 contests. Each contest is worth 100 points to the winner in the areas of Architecture, Market Appeal, Engineering, Communications, Affordability, Comfort Zone, Hot Water, Appliances, Home Entertainment and Energy Balance. The team with the most points at the end of the competition wins.
  • Since its inception, the Solar Decathlon has seen the majority of the 15,000 participants move on to jobs related to clean energy and sustainability. The DOE’s digital strategy for the Solar Decathlon includes the use of QR codes to provide a mobile interactive experience for visitors to the event in Washington, D.C., as well as Foursquare checkin locations for the event and for each participating house. Many of the teams are already blogging leading up to the event and there are virtual tours and computer animated video walkthroughs to share the Solar Decathlon experience with a global audience. There will be TweetChats using the hashtag #SD2011 and other activities on Twitter, Facebook, Flickr and YouTube.
  • The Future
  • In terms of renewable energies, the DOE tries to stay on the cutting edge. Some of its forward-thinking projects include the Bioenergy Knowledge Discovery Framework (KDF), containing an interactive database toolkit for access to data relevant to anyone engaged with the biofuel, bioenergy and bioproduct industries. Another is an interactive database that maps the energy available from tidal streams in the United States. The database, developed by the Georgia Institute of Technology in cooperation with the Energy Department, is available online. The tidal database gives researchers a closer look at the potential of tidal energy, which is a “predictable” clean energy resource. As tides ebb and flow, turbines capture the tidal current as mechanical energy, which is then converted into electricity. There are already a number of marine and hydrokinetic energy projects under development listed on the site.
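The “energy available from tidal streams” that the database maps comes down to the kinetic power in moving water. A back-of-the-envelope sketch; the rotor area and efficiency are illustrative assumptions, not values from the DOE database:

```python
def tidal_power_watts(area_m2, speed_m_s, efficiency=0.35, rho=1025.0):
    """Power extracted by a turbine from a tidal stream:
    P = efficiency * 0.5 * rho * A * v^3 (rho = seawater density, kg/m^3)."""
    return efficiency * 0.5 * rho * area_m2 * speed_m_s ** 3

# A hypothetical 100 m^2 rotor in a 2 m/s tidal current:
print(round(tidal_power_watts(100, 2.0)))   # 143500 W
```

The cubic dependence on current speed is why mapping stream velocities site by site, as the Georgia Tech database does, matters so much: doubling the speed yields eight times the power.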
D'coda Dcoda

What are officials hiding about Fukushima? | Vancouver, Canada [20Oct11]

  • After the Chernobyl nuclear disaster in 1986, Soviet officials were vilified for hiding the impacts from the public. But when Japan’s Fukushima nuclear accident took place last March, public officials in Japan and Canada alike jumped straight into Chernobyl-style damage-control mode, dismissing any worries about impacts. Now evidence has emerged that the radiation in Canada was worse than Canadian officials ever let on. A Health Canada monitoring station in Calgary detected radioactive material in rainwater that exceeded Canadian guidelines during the month of March, according to Health Canada data obtained by the Georgia Straight.
  • Canadian government officials didn’t disclose the high radiation readings to the public. Instead, they repeatedly insisted that fallout drifting to Canada was negligible and posed no health concerns. In fact, the data shows rainwater in Calgary last March had an average of 8.18 becquerels per litre of radioactive iodine, easily exceeding the Canadian guideline of six becquerels per litre for drinking water. “It’s above the recommended level [for drinking water],” Eric Pellerin, chief of Health Canada’s radiation-surveillance division, admitted in a phone interview from Ottawa. “At any time you sample it, it should not exceed the guideline.”
  • Radioactive-iodine levels also spiked in March in Vancouver (which saw an average of 0.69 becquerels per litre), Winnipeg (which saw 0.64 becquerels per litre) and Ottawa (which saw 1.67 becquerels per litre), the data shows. These levels didn’t exceed the Canadian guidelines, but the level discovered in Ottawa did surpass the more stringent ceiling for drinking water used by the U.S. Environmental Protection Agency, which is 54 times less than the six becquerels per litre of iodine-131 (a radioactive isotope) allowed in this country.
  • ...3 more annotations...
  • Health Canada provided the data only after repeated requests from the Straight. It isn’t posted on Health Canada’s web page devoted to the impacts of Fukushima. Instead, Health Canada maintains on that page that the radioactive fallout from Fukushima was “smaller than the normal day to day fluctuations from background radiation” and “did not pose any health risk to Canadians”. Pellerin said he doesn’t know why Health Canada didn’t make the data public. “I can’t answer that. The communication aspect could be improved,” he said.
  • In a statement emailed to the Straight along with the data, Health Canada played down the radiation in the Calgary rainwater: “Since rainwater is typically not a primary source of drinking water, and the concentration measured was very low (8 Bq/L), this measurement is not considered a health risk.” Health Canada’s rainwater data reveals deficiencies in how Ottawa monitors radiation in terms of public safety. Even at the height of the Fukushima crisis, rainwater in Canada was tested for radiation only at the end of each month, after a network of monitoring stations sent samples to Ottawa. As a result, the spikes in radiation last March were only discovered in early April, after rainwater samples were sent to Ottawa for testing. It’s also impossible to know how high radiation got on specific days in March because each day’s rainwater was added to the previous samples for that month.
  • In contrast, the EPA tested rainwater for radiation every day and reported the data daily on its website. Health Canada’s data on rainwater is also puzzling for another reason. It sharply contrasts with the data collected by SFU associate professor of chemistry Krzysztof Starosta. He found iodine-131 levels in rainwater in Burnaby spiked to 13 becquerels per litre in the days after Fukushima. That’s many times higher than the levels detected in Vancouver by Health Canada.
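The guideline comparisons in the article reduce to simple threshold checks. The sketch below uses only the figures quoted above; the EPA ceiling is derived from the stated “54 times” ratio rather than quoted directly:

```python
CANADA_I131_BQ_L = 6.0                    # Canadian drinking-water guideline
EPA_I131_BQ_L = CANADA_I131_BQ_L / 54.0   # ~0.11 Bq/L, per the stated ratio

def exceeds(reading_bq_l, limit_bq_l):
    """True if a rainwater reading is above a drinking-water limit."""
    return reading_bq_l > limit_bq_l

march_averages = {"Calgary": 8.18, "Vancouver": 0.69,
                  "Winnipeg": 0.64, "Ottawa": 1.67}
for city, bq_l in march_averages.items():
    print(city, "Canada:", exceeds(bq_l, CANADA_I131_BQ_L),
          "EPA:", exceeds(bq_l, EPA_I131_BQ_L))
```

By these numbers only Calgary exceeds the Canadian guideline, while all four monthly averages sit above the stricter derived EPA ceiling, which puts the article's focus on Ottawa in context.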
D'coda Dcoda

Safecast Talk by Joi Ito at MIT - Formation of Citizen Radiation Detection Network [13O...

shared by D'coda Dcoda on 26 Oct 11
  •  
    Discusses how they created their ad hoc radiation detection network using sensors. There were not enough geiger counters, so they mobilized volunteers and created maps (in some cases showing less radiation inside the exclusion zone than outside it), producing 600,000 data points that anyone can use, the largest set of radiation data points in the world. The Media Lab is doing their data analysis; data needs to be collected before data science can be done. They also designed a new device for radiation measurement and created open hardware, sharing the hardware design. 
D'coda Dcoda

Scientists Radically Raise Estimates of Fukushima Fallout [25Oct11]

  • The disaster at the Fukushima Daiichi nuclear plant in March released far more radiation than the Japanese government has claimed. So concludes a study(1) that combines radioactivity data from across the globe to estimate the scale and fate of emissions from the shattered plant. The study also suggests that, contrary to government claims, pools used to store spent nuclear fuel played a significant part in the release of the long-lived environmental contaminant caesium-137, which could have been prevented by prompt action. The analysis has been posted online for open peer review by the journal Atmospheric Chemistry and Physics.
  • Andreas Stohl, an atmospheric scientist with the Norwegian Institute for Air Research in Kjeller, who led the research, believes that the analysis is the most comprehensive effort yet to understand how much radiation was released from Fukushima Daiichi. "It's a very valuable contribution," says Lars-Erik De Geer, an atmospheric modeller with the Swedish Defense Research Agency in Stockholm, who was not involved with the study. The reconstruction relies on data from dozens of radiation monitoring stations in Japan and around the world. Many are part of a global network to watch for tests of nuclear weapons that is run by the Comprehensive Nuclear-Test-Ban Treaty Organization in Vienna. The scientists added data from independent stations in Canada, Japan and Europe, and then combined those with large European and American caches of global meteorological data.
  • Stohl cautions that the resulting model is far from perfect. Measurements were scarce in the immediate aftermath of the Fukushima accident, and some monitoring posts were too contaminated by radioactivity to provide reliable data. More importantly, exactly what happened inside the reactors — a crucial part of understanding what they emitted — remains a mystery that may never be solved. "If you look at the estimates for Chernobyl, you still have a large uncertainty 25 years later," says Stohl. Nevertheless, the study provides a sweeping view of the accident. "They really took a global view and used all the data available," says De Geer.
  • ...7 more annotations...
  • Challenging numbers: Japanese investigators had already developed a detailed timeline of events following the 11 March earthquake that precipitated the disaster. Hours after the quake rocked the six reactors at Fukushima Daiichi, the tsunami arrived, knocking out crucial diesel back-up generators designed to cool the reactors in an emergency. Within days, the three reactors operating at the time of the accident overheated and released hydrogen gas, leading to massive explosions. Radioactive fuel recently removed from a fourth reactor was being held in a storage pool at the time of the quake, and on 14 March the pool overheated, possibly sparking fires in the building over the next few days.
  • But accounting for the radiation that came from the plants has proved much harder than reconstructing this chain of events. The latest report from the Japanese government, published in June, says that the plant released 1.5 × 10¹⁶ becquerels of caesium-137, an isotope with a 30-year half-life that is responsible for most of the long-term contamination from the plant.(2) A far larger amount of xenon-133, 1.1 × 10¹⁹ Bq, was released, according to official government estimates.
  • The new study challenges those numbers. On the basis of its reconstructions, the team claims that the accident released around 1.7 × 10¹⁹ Bq of xenon-133, greater than the estimated total radioactive release of 1.4 × 10¹⁹ Bq from Chernobyl. The fact that three reactors exploded in the Fukushima accident accounts for the huge xenon tally, says De Geer. Xenon-133 does not pose serious health risks because it is not absorbed by the body or the environment. Caesium-137 fallout, however, is a much greater concern because it will linger in the environment for decades. The new model shows that Fukushima released 3.5 × 10¹⁶ Bq of caesium-137, roughly twice the official government figure, and half the release from Chernobyl. The higher number is obviously worrying, says De Geer, although ongoing ground surveys are the only way to truly establish the public-health risk.
  • Stohl believes that the discrepancy between the team's results and those of the Japanese government can be partly explained by the larger data set used. Japanese estimates rely primarily on data from monitoring posts inside Japan,(3) which never recorded the large quantities of radioactivity that blew out over the Pacific Ocean, and eventually reached North America and Europe. "Taking account of the radiation that has drifted out to the Pacific is essential for getting a real picture of the size and character of the accident," says Tomoya Yamauchi, a radiation physicist at Kobe University who has been measuring radioisotope contamination in soil around Fukushima. Stohl adds that he is sympathetic to the Japanese teams responsible for the official estimate. "They wanted to get something out quickly," he says. The differences between the two studies may seem large, notes Yukio Hayakawa, a volcanologist at Gunma University who has also modelled the accident, but uncertainties in the models mean that the estimates are actually quite similar.
  • The new analysis also claims that the spent fuel being stored in the unit 4 pool emitted copious quantities of caesium-137. Japanese officials have maintained that virtually no radioactivity leaked from the pool. Yet Stohl's model clearly shows that dousing the pool with water caused the plant's caesium-137 emissions to drop markedly (see 'Radiation crisis'). The finding implies that much of the fallout could have been prevented by flooding the pool earlier. The Japanese authorities continue to maintain that the spent fuel was not a significant source of contamination, because the pool itself did not seem to suffer major damage. "I think the release from unit 4 is not important," says Masamichi Chino, a scientist with the Japanese Atomic Energy Authority in Ibaraki, who helped to develop the Japanese official estimate. But De Geer says the new analysis implicating the fuel pool "looks convincing".
  • The latest analysis also presents evidence that xenon-133 began to vent from Fukushima Daiichi immediately after the quake, and before the tsunami swamped the area. This implies that even without the devastating flood, the earthquake alone was sufficient to cause damage at the plant.
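The release figures quoted in the annotations above can be cross-checked with a few lines of arithmetic. This is a reader's sketch using the values as stated in the article, not part of either study:

```python
# Release estimates quoted in the article, in becquerels (Bq).
official_cs137 = 1.5e16   # Japanese government estimate for caesium-137
stohl_cs137 = 3.5e16      # Stohl team's reconstruction for caesium-137
stohl_xe133 = 1.7e19      # Stohl team's reconstruction for xenon-133
chernobyl_total = 1.4e19  # estimated total radioactive release from Chernobyl

# "Roughly twice the official government figure":
print(f"Cs-137 ratio, new vs official: {stohl_cs137 / official_cs137:.1f}")

# "Greater than the estimated total radioactive release ... from Chernobyl":
print(f"Xe-133 vs Chernobyl total: {stohl_xe133 / chernobyl_total:.2f}")
```

The ratios (about 2.3 and 1.2) are consistent with the article's qualitative claims.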


    The Japanese government's report has already acknowledged that the shaking at Fukushima Daiichi exceeded the plant's design specifications. Anti-nuclear activists have long been concerned that the government has failed to adequately address geological hazards when licensing nuclear plants (see Nature 448, 392–393; 2007), and the whiff of xenon could prompt a major rethink of reactor safety assessments, says Yamauchi.

  • The model also shows that the accident could easily have had a much more devastating impact on the people of Tokyo. In the first days after the accident the wind was blowing out to sea, but on the afternoon of 14 March it turned back towards shore, bringing clouds of radioactive caesium-137 over a huge swathe of the country (see 'Radioisotope reconstruction'). Where precipitation fell, along the country's central mountain ranges and to the northwest of the plant, higher levels of radioactivity were later recorded in the soil; thankfully, the capital and other densely populated areas had dry weather. "There was a period when quite a high concentration went over Tokyo, but it didn't rain," says Stohl. "It could have been much worse." 
D'coda Dcoda

The Radiation Database User Guide - Worldwide HAARP, VLF, Radar, & [24Apr12] - 0 views

  •  
    The Radiation Database KML is now called "ClimateViewer 3D": http://climateviewer.com/ The Radiation Database began as a Keyhole Markup Language (KML) project, geolocating Weather Modification projects and devices that may be able to alter the weather. The project quickly grew, expanding to cover many areas of interest and concern. Exploring the RadDB will not only expand your knowledge of our planet, but broaden your awareness of the current state of electromagnetics. This database contains data overlays, images, and links regarding nuclear test, power, and storage sites; radio-frequency antennas (like HAARP); and radar and laser locations all around the globe, as well as climate and real-time data. While focusing on the military-industrial complex, the database covers locations and data ranging from Star Wars to Climate Gate. It's what you get when you mix George Jetson with Google Earth.
D'coda Dcoda

Harm from Fukushima Radiation: A Matter Of Perspective [09Jul11] - 0 views

  • A leading biophysicist has cast a critical light on the government’s reassurances that Americans were never at risk from Fukushima fallout, saying “we really don’t know for sure.”
  • When radioactive fallout from Japan’s nuclear disaster began appearing in the United States this spring, the Obama Administration’s open-data policy obligated the government to inform the public, in some detail, what was landing here.
  • Covering the story, I watched the government pursue what appeared to be two strategies to minimize public alarm:
  • ...14 more annotations...
  • It framed the data with reassurances like this oft-repeated sentence from the EPA: “The level detected is far below a level of public health concern.” The question, of course, is whose concern.
  • The EPA seemed to be timing its data releases to avoid media coverage. It released its most alarming data set late on a Friday—data that showed radioactive fallout in the drinking water of more than a dozen U.S. cities.
  • Friday and Saturday data releases were most frequent when radiation levels were highest. And despite the ravages newspapers have suffered from internet competition, newspaper editors still have not learned to assign reporters to watch the government on weekends. As a result, bloggers broke the fallout news, while newspapers relegated themselves to local followups, most of which did little more than quote public health officials who were pursuing strategy #1.
  • For example, when radioactive cesium-137 was found in milk in Hilo, Hawaii, Lynn Nakasone, administrator of the Health Department's Environmental Health Services Division, told the Honolulu Star-Advertiser: "There's no question the milk is safe."
  • Nakasone had little alternative but to say that. She wasn’t about to dump thousands of gallons of milk that represented the livelihood of local dairymen, and she wasn’t authorized to dump the milk as long as the radiation detected remained below FDA’s Derived Intervention Level, a metric I’ll discuss more below.
  • That kind of statement failed to reassure the public in part because of the issue of informed consent—Americans never consented to swallowing any radiation from Fukushima—and in part because the statement is obviously false.
  • There is a question whether the milk was safe.
  • Medical experts agree that any increased exposure to radiation increases the risk of cancer, and so no increase in radiation is unquestionably safe.
  • Whether you choose to see the Fukushima fallout as safe depends on the perspective you adopt, as David J. Brenner, a professor of radiation biophysics and the director of the Center for Radiological Research at Columbia University Medical Center, elucidated recently in The Bulletin of The Atomic Scientists:
  • Should this worry us? We know that the extra individual cancer risks from this long-term exposure will be very small indeed. Most of us have about a 40 percent chance of getting cancer at some point in our lives, and the radiation dose from the extra radioactive cesium in the food supply will not significantly increase our individual cancer risks.
  • But there’s another way we can and should think about the risk: not from the perspective of individuals, but from the perspective of the entire population. A tiny extra risk to a few people is one thing. But here we have a potential tiny extra risk to millions or even billions of people. Think of buying a lottery ticket — just like the millions of other people who buy a ticket, your chances of winning are minuscule. Yet among these millions of lottery players, a few people will certainly win; we just can’t predict who they will be. Likewise, will there be some extra cancers among the very large numbers of people exposed to extremely small radiation risks? It’s likely, but we really don’t know for sure.
  • the EPA’s standard for radionuclides in drinking water is so much more conservative than the FDA’s standard for radionuclides in food. The two agencies anticipate different endurances of exposure—long-term in the EPA’s view, short-term in FDA’s. But faced with the commercial implications of its actions, FDA tolerates a higher level of mortality than EPA does.
  • FDA has a technical quibble with that last sentence. FDA spokesman Siobhan Delancey says: Risk coefficients (one in a million, two in ten thousand) are statistically based population estimates of risk. As such they cannot be used to predict individual risk and there is likely to be variation around those numbers. Thus we cannot say precisely that “one in a million people will die of cancer from drinking water at the EPA MCL” or that “two in ten thousand people will die of cancer from consuming food at the level of an FDA DIL.” These are estimates only and apply to populations as a whole.
  • The government, while assuring us of safety, comforts itself in the abstraction of the population-wide view, but from Dr. Brenner’s perspective, the population-wide view is a lottery and someone’s number may come up. Let that person decide whether we should be alarmed.
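Brenner's lottery analogy, population-wide risk versus individual risk, can be made concrete with a one-line calculation. Both numbers below are hypothetical illustrations chosen by this editor, not figures from the article:

```python
# Hypothetical values: a one-in-a-million extra lifetime cancer risk
# spread across 100 million exposed people.
individual_extra_risk = 1e-6
exposed_population = 100_000_000

# Expected number of extra cases across the whole population.
expected_extra_cases = individual_extra_risk * exposed_population
print(f"Expected extra cases: {expected_extra_cases:.0f}")  # prints 100
```

Each individual's risk is negligible, yet the population-level expectation is nonzero, which is exactly Brenner's point: someone's number may come up, and we cannot say whose.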
D'coda Dcoda

Australian National Radiation Dose Register (ANRDR) for Uranium Mining and Milling Workers - 0 views

  • The Australian Government is committed to strengthening occupational health and safety requirements for individuals working at uranium mining and milling sites. The Australian Government is committed to strengthening occupational health and safety requirements for individuals working at uranium mining and milling sites.
  • The Australian National Radiation Dose Register (ANRDR) was established in 2010 to collect, store, manage and disseminate records of radiation doses received by workers in the course of their employment in a centralised database. The ANRDR has been open to receive dose records from operators since 1 July 2010. The ANRDR was officially launched in June 2011 following full development of the Register, including a system for workers to be able to request their individual dose history record.
  • The ANRDR is maintained and managed by the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA).
  • ...3 more annotations...
  • What data are we collecting? The ANRDR records radiation dose information as well as some personal information so that we are able to link the dose information with the correct worker. There are several different types of radiation, and different ways that radiation can interact with a worker. This dose register will record information on the doses received from these different radiation types and the pathways through which they interact with the worker. The personal information collected includes the worker’s name, date of birth, gender, employee number, place of employment, employee work classification, and the period of time employed at a particular location. This information is collected in order to ensure that appropriate doses are matched to the correct worker. Please refer to the ANRDR Privacy Statement for further information on the collection, storage and use of personal information.
  • How will the data be used? The data will be used to track a worker’s lifetime radiation dose history within the uranium mining and milling industry in Australia. A worker can request a dose history report from ARPANSA which will show the cumulative dose the worker has received during the course of their employment in the uranium mining and milling industry in Australia, and while the worker has been registered on the ANRDR. The data will be used to create annual statistics showing industry sector trends and comparisons. It will also be used to assess radiological doses within worker categories to help establish recommended dose constraints for certain work practices.
  • Currently, the ANRDR only records data on workers in the uranium mining and milling industry.
D'coda Dcoda

Newly Released TEPCO Data Proves Fairewinds Assertions of Significant Fuel Pool Failure... - 0 views

shared by D'coda Dcoda on 28 Aug 11 - No Cached
  •  
    Video - New TEPCO data measured on August 19 & 20 shows severe damage to the spent fuel in Fukushima Daiichi Units 1, 2, and 3. The adjacent TEPCO table posted on the front page shows incredibly high levels of Cesium 137 and Cesium 134 in all three spent fuel pools of Units 1, 2, & 3. This TEPCO data clearly contradicts and refutes the July assertion by the NRC that the Fukushima Daiichi spent fuel pools were not damaged in this tragic accident. Cryptome has a new high-resolution photo, also uploaded, that shows the extensive damage to the Unit 3 spent fuel pool and the reactor building.
D'coda Dcoda

One week delay in revealing whether quake exceeded North Anna's design basis - Seismic ... - 0 views

  • At North Anna nuclear plant, reassurances but no final data on quake impact, Washington Post by Brian Vastag, September 2, 2011:
  • [...] Yet nearly two weeks after the quake, Dominion officials were unable to say whether the quake shook the facility more than it was designed to handle. “I don’t have those numbers,” Daniel Stoddard, Dominion’s senior vice president for nuclear operations, said repeatedly. It will be another week before final analysis of the “shake plates,” which recorded ground motion at the site, is finished, he said, although a Dominion spokesman had promised that analysis by Friday. In the control room, a 1970s-era seismic detector failed to record data for a critical eight seconds when primary power went down, slowing the company’s analysis. The company has added a battery backup to the unit to prevent a recurrence. [...]
D'coda Dcoda

Effect of contaminated soil on food chain sparks fears [10Sep11] - 0 views

  • Six months after the nuclear meltdowns in Fukushima Prefecture, the public's awareness of the threat posed by radiation is entering a new phase: the realization that the biggest danger now and in the future is from contaminated soil.
  • The iodine-131 ejected into the sky by the Fukushima No. 1 power station disaster was quickly detected in vegetables and tap water — even as far away as Tokyo, 220 km south of the plant. But contamination levels are now so low they are virtually undetectable, thanks to the short half-life of iodine-131 — eight days — and stepped up filtering by water companies.
  • But cesium is proving to be a tougher foe. The element's various isotopes have half-lives ranging from two to 30 years, generating concern about the food chain in Fukushima Prefecture, a predominantly agricultural region, as the elements wash fallout into the ground. The root of the problem is, well — roots. Cesium-134 and cesium-137 are viewed as potential health threats because vegetables can absorb the isotopes from the soil they're planted in.
  • ...9 more annotations...
  • "Until early spring, produce was contaminated (on the surface with radioactive materials) that the No. 1 plant discharged into the atmosphere. But now, the major route of contamination is through plant roots," said Kunikazu Noguchi, a radiation protection expert at Nihon University. Whether absorption by plant roots can affect human health remains to be seen. Experts are warning that the region's soil and agricultural products will require close monitoring for many years.
  • At the moment, sampling data collected by the various prefectural governments indicate that no vegetables, except for those grown in Fukushima Prefecture, have been found to contain more than the government's provisional limit of 500 becquerels per kilogram since June. Likewise, as of Sept. 7, samples of pork, chicken, milk and fruit had also tested within the provisional radiation limit, apart from Fukushima products and tea from Chiba, Kanagawa, Gunma, Tochigi, Saitama and Ibaraki prefectures.
  • In fact, the amount of radioactive materials in most of the food sampled has been steadily declining over the past few months, except for produce from Fukushima. "The results of Fukushima's sampling tests show the amount of radioactive material contained in vegetables has dropped sharply in recent months, including those grown in areas with high radiation levels," Noguchi said. "People shouldn't worry about it much (for the time being)," he said. "But mushrooms and other vegetables grown in contaminated forests are likely to contain high levels of radioactive materials."
  • "This year, it's very important to conduct thorough surveys. The contamination will continue for a long time, so data collection is essential," Muramatsu said. "We need to be prepared for the following years by recording data this year and studying the rate at which cesium in the soil is absorbed by each kind of produce," Muramatsu said. In the meantime, the radioactivity itself will continue to weaken over the years. Cesium-134 has a half-life of 2 years and cesium-137 a half-life of 30 years, meaning the radiation each emits will drop by half over those respective periods.
  • "Data from the Chernobyl disaster show that radioactive cesium in soil tends to become fixed more strongly to clay minerals as time passes. So agricultural contamination will lessen next year," he said. Muramatsu urged that special caution should be taken over products grown in soil rich in organic matter, such as in forested areas. "If the soil is rich in organic matter, it makes (cesium) more easily transferable to plants. . . . Forest soil is rich in organic matter, so people should be careful," he said.
  • Now that soil in a wide area of eastern Japan has been contaminated with cesium, experts are calling for close monitoring of soil and produce. The education ministry conducted soil surveys in June and July at 2,200 locations within 100 km of the crippled plant. At 34 locations in six municipalities in Fukushima Prefecture, including Minamisoma, Namie and Iitate, the data said cesium levels had exceeded 1.48 million becquerels per sq. meter — the same level that was used to define the exclusion zone around Chernobyl in 1986. Yasuyuki Muramatsu, a radiochemistry professor at Gakushuin University, said that agricultural contamination will likely peak this year because cesium binds more strongly with minerals in soil as time passes, making it more difficult to be absorbed by plant roots.
  • The ratio of cesium-134 to cesium-137 in the Fukushima accident is estimated as 1-to-1, while the ratio during the 1986 Chernobyl disaster was 1-to-2. This indicates the radiation in Fukushima will weaken at a faster rate than at Chernobyl. Between April and early August, the farm ministry tested soil at some 580 locations in six prefectures, including Fukushima, Tochigi and Gunma, to get a better picture of the full extent of contamination.
  • According to the results, 40 locations in Fukushima Prefecture had a concentration exceeding 5,000 becquerels per kilogram — the government's maximum limit for growing rice. Many municipalities within 30 km of the Fukushima No. 1 plant were banned from planting rice based on similar tests conducted in April. In addition, the ministry has asked 17 prefectures in eastern Japan to conduct two-phase radiation tests on harvested rice.
  • So far, none of the tests performed on unmilled rice — including from Fukushima — exceeded the government's limit of 500 becquerels per kilogram. Masanori Nonaka, an agriculture professor at Niigata University who specializes in soil science, said rice grown in contaminated areas is likely to be tainted, but to what extent is anyone's guess. White rice, however, may prove to be safe, Nonaka said. Because most of the radioactive material will adhere to the bran — the part of the husk left behind after hulling — about 60 percent of the cesium can be removed just by polishing it, he explained. Other foods, such as marine produce, won't be as easy to handle, experts say. After the Chernobyl accident, for example, the radioactive contamination of fish peaked between 6 to 12 months after the disaster. The Fisheries Agency, meanwhile, has asked nine prefectures on the Pacific coast to increase their sampling rates to prevent contaminated fish from landing in supermarkets.
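The decay arithmetic in the annotations above (half-lives of roughly 2 and 30 years, and the cited 1-to-1 Fukushima versus 1-to-2 Chernobyl Cs-134/Cs-137 ratios) can be sketched with a simple exponential-decay model. This is a reader's sketch that ignores environmental transport and fixation; the precise half-life values are standard literature figures, not from the article:

```python
import math

def remaining_activity(frac_cs134, years, t_half_cs134=2.06, t_half_cs137=30.17):
    """Fraction of initial combined cesium activity left after `years`,
    for a mixture starting with `frac_cs134` of its activity in Cs-134
    and the remainder in Cs-137 (pure exponential decay)."""
    decay = lambda t_half: 0.5 ** (years / t_half)
    return frac_cs134 * decay(t_half_cs134) + (1 - frac_cs134) * decay(t_half_cs137)

# Fukushima-like 1:1 mix versus Chernobyl-like 1:2 mix, ten years on:
print(f"1:1 Cs-134/Cs-137 after 10 y: {remaining_activity(0.5, 10):.2f}")
print(f"1:2 Cs-134/Cs-137 after 10 y: {remaining_activity(1/3, 10):.2f}")
```

The 1:1 mixture retains less of its initial activity after a decade (about 0.41 versus 0.54), which is the sense in which the Fukushima fallout "will weaken at a faster rate than at Chernobyl."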
D'coda Dcoda

The History of MIT's Blatant Suppression of Cold Fusion - 0 views

  • Due to the fact that commercially-ready cold fusion technologies like Andrea Rossi's E-Cat (Energy Catalyzer) exist and can produce kilowatts of power, I'm not too interested in previous systems from years ago that could only produce a couple watts of power (or less). However, I am very interested in the events that took place immediately after the birth of Cold Fusion in 1989, when Pons and Fleischmann announced the existence of their technology to the world. Although cold fusion systems at the time were not ready for the market place, they proved the effect was real -- a fact the establishment could not allow the public to accept.
  • Immediately after the announcement was made, the "mainstream" scientific community went on the attack. The late Eugene Mallove was in the middle of it, being employed at MIT in the news office -- before resigning in protest of the institution's misconduct. In a featured article for Infinite Energy Magazine, Mallove detailed exactly what took place that led to his resignation, and the depth of hatred that many professors at MIT had for Pons and Fleischmann's work. The article titled, "MIT and Cold Fusion: A Special Report" also looks at how the replication performed by the institution's Plasma Fusion Center actually did produce positive results, how data from the experiment was altered by unknown individuals at least twice, and how the hot fusion scientists in charge of such tests were far too biased to conduct proper research.
  • If you think the suppression Pons and Fleischmann faced was bad, you don't have a clue until you have read this article. 
  • ...4 more annotations...
  • To start with, those in charge of the replication attempt were members of the MIT Plasma Fusion Center. Their work with hot fusion Tokamak brought the university many millions of dollars in funding from the government, and maintained their job security. If cold fusion were to be accepted as a real phenomenon, it could have made hot fusion research appear to be near worthless. 
  • members of his department (including some scientists from others) took every opportunity they could to attack Pons and Fleischmann. For example, consider how..
  • A funeral party or "Wake for Cold Fusion" was held by the Plasma Fusion Center, before their replication test of Pons and Fleischmann's setup was even complete. They held another such party afterwards. Mugs belittling cold fusion were given out by Ron Parker, the head of the MIT hot fusion research group, who was supposed to be doing serious research to determine if cold fusion was a reality or not. The mugs read, "The Utah University: Department of Fusion Confusion" and had mocking instructions for cold fusion on the back. Ron Parker would use the test results to discredit cold fusion, and at a celebration of the death of cold fusion, after being shown evidence in support of cold fusion, he told Eugene Mallove that the data from the MIT replication was "worthless." How examination of the data from MIT's replication showed obvious evidence of tampering. In fact, the corrected data showed excess heat. Yet it was still used to discredit cold fusion research for many years.
  • How the former President of MIT, Charles Vest, refused to order an investigation into how the Plasma Fusion Center handled the replication, and their obviously unscientific behavior -- such as partying for the death of something instead of doing unbiased research. Even worse, years later he signed onto a Department of Energy report stating that cold fusion did not deserve funding for research, yet hot fusion deserved millions of additional dollars and was a "bargain." Conflicts of interest were ignored from the very start. For example, those who had the strongest need for cold fusion to be proven not to work (hot fusion scientists), were tasked with the replication of the effect. It would be like giving a cigarette company the order to conduct a study on the reality of lung cancer, or the lumber industry the job of determining the usefulness of industrial hemp. What the hot fusion scientists were going to say was obvious! How some scientists were so closed minded they stated that if cold fusion was real, Pons and Fleischmann should be dead from radiation poisoning. In addition, some scientists went so far as to personally attack them. In one case, a scientist stated that even if a thousand tests showed excess heat, that the results would not vindicate Pons and Fleischmann.
  •  
    Much more to be found in the article
D'coda Dcoda

NHK: Japan gov't gave SPEEDI radiation forecasts to US military, NOT own citizens [16Ja... - 0 views

  • Title: SPEEDI Information Provided to United States Military But Not Japanese Citizens. Source: Enformable. Date: Jan. 16, 2012. Representatives of the Japanese Ministry of Education were called as witnesses by the Accident Investigation Board in charge of discovering the true nature of the Fukushima nuclear disaster. During questioning it was determined that the predictive data from SPEEDI, which forecasts the spread of radioactive material after a nuclear disaster, had been provided to the U.S. military.
  • According to NHK, the data had been compiled immediately after the accident and provided to the US through the Foreign Ministry. The Ministry of Education decided that the published data did not accurately predict the actual situation and might have led to unnecessary confusion if released to the public. For these reasons the data was not published immediately, and evacuees were abandoned to make decisions with the government only telling part of the story.
D'coda Dcoda

US Radiation Monitoring May Have Been Handed Off To Nuclear Industry [04Nov11] Lobbyist... - 0 views

  • Lucas W Hixson may have uncovered a major abuse of the public trust by the NRC. In late March 2011 the NRC issued a directive that allowed the nuclear industry lobbyist group NEI to supply radiation monitoring data to the NRC, which would then forward it to the EPA. On March 24th the NRC discussed handing over radiation monitoring to nuclear industry lobbyists; on April 14th RadNet was shut down and went back to routine monitoring schedules. This meant no ongoing food, water and air filter testing. Only the radiation level monitors were left operating. The EPA claimed that declining levels were the reason for shutting down the expanded monitoring, but places like Idaho did not have the decreases seen at other sites.
  • The NRC directive put commercial nuclear power plant owners in charge of voluntarily providing the public with radiation monitoring data but it would be run through their nuclear industry lobbyists before it would then be provided to the NRC. Raw data was not provided directly to the NRC. Considering the massive US nuclear industry offensive to flood the media with propaganda downplaying the Fukushima nuclear disaster, they are hardly a reliable source to tell the public what the radiation levels are.
  • RadNet itself had many problems: stations didn't work, and some were not calibrated before the disaster. Even more disturbing is that the EPA does not even handle its own radiation monitoring network. That important function falls to a former Bush administration appointee running a business out of a rundown storefront in New Mexico. Under a $238,000 no-bid contract, Environmental Dimensions supposedly manages, maintains and operates RadNet, the only tool the public has to see if we are being subjected to nuclear fallout. The blogger who broke this story states that Environmental Dimensions has tripled its revenue in recent years. The company cites a different address as its mailing address, which shows up as a tiny house in Albuquerque. EDI was also part of a $12 million contract in 2010, along with a couple of other contractors, to provide environmental and remediation services to the US Corps of Engineers. EDI claims to have been in business since 1990, but its owner, Ms. Bradshaw, worked for the DoD in 2006.
  • ...1 more annotation...
  • What little system the public has for radiation notification through the EPA has been shuffled off to a no bid contract with spurious origins and the system experienced widespread problems when it was needed most. That system was mostly turned off just over a month after the disaster. The NRC, the agency tasked with protecting the public from nuclear disasters decided to hand everything over to the nuclear industry’s lobbyists.
D'coda Dcoda

TEPCO revises timetable to cold shutdown for seventh time in eight months [20Nov11] - 0 views

  • The government and Tokyo Electric Power Co. have revised the timetable for the seventh time in the eight months since the crisis began. Data suggests the reactors and radioactive material are under control, and the power plant will achieve a cold shutdown once required conditions are confirmed.
  • The situation at the nuclear plant does not meet this definition. Is it appropriate for the government and TEPCO to call the current status nearly a cold shutdown?
  • The status of the molten nuclear fuel is unclear. It is not known how the fuel, believed to have partially melted through the pressure vessels of the reactors and into the containment vessels, has dispersed and how much lies in water. On Nov. 2, TEPCO said a small-scale recriticality incident, in which nuclear fuel achieves a fission chain reaction, may have taken place at the No. 2 reactor of the power plant. TEPCO should have been able to coolly handle the detection of xenon, but it failed to do so as it had not properly prepared the necessary data.
  • ...1 more annotation...
  • There are many other unsolved issues, including how to cope with contaminated water said to be accumulating at a rate of 200 to 500 tons a day in underground areas of the reactor buildings. The government and TEPCO must thoroughly solve these issues without being bound by their timetable. Source: Yomiuri Online
D'coda Dcoda

Phase-Out Hurdle: Germany Could Restart Nuclear Plant to Plug Energy Gap [21Jul11] - 0 views

  • Phase-Out Hurdle: Germany Could Restart Nuclear Plant to Plug Energy Gap (07/13/2011). Germany's energy agency is warning that one of the German reactors mothballed in the wake of Fukushima may have to be restarted to make up for possible power shortages this winter and next. Berlin is also using money earmarked for energy efficiency to subsidize coal-fired power plants. Nuclear energy, as has become abundantly clear this year, has no future in Germany. For once the government, the parliament and the public all agree: atomic reactors in the country will be history a decade from now. Before that can happen, however, the country has to find alternate power sources. In fact, amid concerns that supply shortages this winter could result in temporary blackouts, Germany's Federal Network Agency on Tuesday indicated that one of the seven reactors shut down in the immediate wake of the Fukushima nuclear disaster in Japan could be restarted this winter to fill the gap. "The numbers that we currently have indicate that one of these nuclear energy plants will be needed," said agency head Matthias Kurth on Tuesday in Berlin. He said that ongoing analysis has indicated that fossil fuel-powered plants would not prove to be adequate as a backup.
D'coda Dcoda

Data released about plutonium found in soil outside Fukushima plant [03Oct11] - 0 views

  • Extract: Plutonium, a highly toxic radioactive substance, found in soil at locations several dozen kilometers away from the stricken Fukushima No. 1 Nuclear Power Plant had maximum concentrations equivalent to 11 and 31 percent of the levels within the premises of the plant, plant operator Tokyo Electric Power Co. said. On Friday the government released data showing varieties of plutonium were detected at six locations in Fukushima Prefecture, as far away as Iitate village, around 45 kilometers northwest of the Fukushima complex. The government data showed the maximum combined concentration of plutonium-239 and plutonium-240 was 15 becquerels per square meter, measured in the city of Minamisoma. End Extract http://mdn.mainichi.jp/mdnnews/news/20111002p2g00m0dm020000c.html
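As a rough plausibility check on the figures in the extract, one can back out what on-site concentration the percentages would imply. This is a hedged sketch only: it assumes the 15 Bq/m² Minamisoma reading is the sample being compared at the 11 and 31 percent fractions, which the excerpt does not state explicitly.

```python
# Back-of-envelope check (assumption: the 15 Bq/m^2 off-site Pu-239+240
# reading is the one quoted at 11% and 31% of on-site levels; the excerpt
# does not say which sample maps to which fraction).
offsite_bq_m2 = 15.0  # Pu-239 + Pu-240 combined, Minamisoma

implied_onsite = {f: offsite_bq_m2 / f for f in (0.11, 0.31)}
for fraction, onsite in implied_onsite.items():
    print(f"at {fraction:.0%} of on-site level, implied on-site: {onsite:.0f} Bq/m^2")
```

In other words, if the off-site maximum really is 11 to 31 percent of on-site levels, the implied on-site deposition would be on the order of tens to low hundreds of Bq/m².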
D'coda Dcoda

[MAXIMUM ALERT] Neptunium 239 Potentially Detected In Saint Louis 9/14/11 Radioactive R... - 0 views

  • [MAXIMUM ALERT] Neptunium 239 Potentially Detected In Saint Louis 9/14/11 Radioactive Rainfall. Updates and video will follow shortly. The source has a calculated average 2.4-day half-life. That half-life matches neptunium-239. Np-239 decays into plutonium-239. The source would probably be americium-243 created in the MOX-fueled reactor at Fukushima Unit 3.
  • Updated to add: IF WE ARE LUCKY, the source will not be americium-243 but rather uranium-239 (in Fukushima); given the 2.4-day half-life of Np-239, it is possible that the source came directly across the jet stream as Np-239. The result would be higher levels of Np-239 and plutonium-239 the further west one went from Saint Louis.
  • UPDATE 9/17/11: The video below records raw data taken from the 1.33 mR/hr radioactive rainfall which fell in Saint Louis, MO on 9/14/11. This data was taken after shorter-half-life contamination had mostly burned off. The data shown is from one-hour total count readings taken of the radioactive source and of local background. The raw data from the later part of the video has yet to be fully analyzed.
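The half-life identification described above can be sketched numerically: fit an exponential decay to a series of total-count readings and compare the fitted half-life against known isotopes (Np-239's published half-life is about 2.36 days). The counts below are illustrative placeholders, not the video's raw data.

```python
import math

# Hypothetical one-hour total counts taken once per day over several days,
# chosen to roughly halve every ~2.4 days (illustrative values only).
days = [0, 1, 2, 3, 4]
counts = [12000, 9000, 6750, 5060, 3800]

# Log-linear least-squares fit: ln(N) = ln(N0) - lambda * t
n = len(days)
mean_t = sum(days) / n
mean_ln = sum(math.log(c) for c in counts) / n
slope = (sum((t - mean_t) * (math.log(c) - mean_ln) for t, c in zip(days, counts))
         / sum((t - mean_t) ** 2 for t in days))
decay_const = -slope                   # per day
half_life = math.log(2) / decay_const  # days

print(f"fitted half-life: {half_life:.2f} days")  # ~2.4 days, consistent with Np-239
```

In practice the background count would be subtracted from each reading before fitting, and a mixture of isotopes would show up as curvature in the log-linear plot rather than a single straight line.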