
LeopoldS

Helix Nebula - Helix Nebula Vision - 0 views

  •  
    The partnership brings together leading IT providers and three of Europe's leading research centres, CERN, EMBL and ESA, in order to provide computing capacity and services that elastically meet big science's growing demand for computing power.

    Helix Nebula provides an unprecedented opportunity for the global cloud services industry to work closely on the Large Hadron Collider through the large-scale, international ATLAS experiment, as well as with the molecular biology and Earth-observation communities. The three flagship use cases will be used to validate the approach and to enable a cost-benefit analysis. Helix Nebula will lead these communities through a two year pilot-phase, during which procurement processes and governance issues for the public/private partnership will be addressed.

    This game-changing strategy will boost scientific innovation and bring new discoveries through novel services and products. At the same time, Helix Nebula will ensure valuable scientific data is protected by a secure data layer that is interoperable across all member states. In addition, the pan-European partnership fits in with the Digital Agenda of the European Commission and its strategy for cloud computing on the continent. It will ensure that services comply with Europe's stringent privacy and security regulations and satisfy the many requirements of policy makers, standards bodies, scientific and research communities, industrial suppliers and SMEs.

    Initially based on the needs of European big-science, Helix Nebula ultimately paves the way for a Cloud Computing platform that offers a unique resource to governments, businesses and citizens.
  •  
    "Helix Nebula will lead these communities through a two year pilot-phase, during which procurement processes and governance issues for the public/private partnership will be addressed." And here I was thinking cloud computing was old news 3 years ago :)
johannessimon81

Bacteria grow electric wire in their natural environment - 1 views

  •  
    Bacterial wires explain enigmatic electric currents in the seabed: each of these 'cable bacteria' contains a bundle of insulated wires that conduct an electric current from one end to the other. Electricity and seawater are usually a bad mix.
  •  
    WOW!!!! I don't want to even imagine what we do to these with the trawling fishing boats that sweep through sea beds with large masses .... "Our experiments showed that the electric connections in the seabed must be solid structures built by bacteria," says PhD student Christian Pfeffer, Aarhus University. He could interrupt the electric currents by pulling a thin wire horizontally through the seafloor, just as when an excavator cuts our electric cables. In microscopes, scientists found a hitherto unknown type of long, multi-cellular bacteria that was always present when scientists measured the electric currents. "The incredible idea that these bacteria should be electric cables really fell into place when, inside the bacteria, we saw wire-like strings enclosed by a membrane," says Nils Risgaard-Petersen, Aarhus University. Kilometers of living cables: the bacterium is one hundred times thinner than a hair and the whole bacterium functions as an electric cable with a number of insulated wires within it, quite similar to the electric cables we know from our daily lives. "Such unique insulated biological wires seem simple but with incredible complexity at nanoscale," says PhD student Jie Song, Aarhus University, who used nanotools to map the electrical properties of the cable bacteria. In an undisturbed seabed, tens of thousands of kilometers of cable bacteria live under a single square meter of seabed. The ability to conduct an electric current gives cable bacteria such large benefits that they capture much of the energy from decomposition processes in the seabed. Unlike all other known forms of life, cable bacteria maintain an efficient combustion down in the oxygen-free part of the seabed. It only requires that one end of the individual reaches the oxygen which the seawater provides to the top millimeters of the seabed. The combustion is a transfer of the electrons of the food to oxygen, which the bacterial inner wires manage over centimeter-long distances. However, s
santecarloni

First flat lens focuses light without distortion - physicsworld.com - 0 views

  •  
    Physicists in the US have made the first ultrathin flat lens. Thanks to its flatness, the device eliminates optical aberrations that occur in conventional lenses with spherical surfaces. As a result, the focusing power of the lens also approaches the ultimate physical limit set by the laws of diffraction.
  •  
    Really nice indeed! The new flat ultrathin lens is different in that it is a nanostructured "metasurface" made of optically thin beam-shaping elements called optical antennas, which are separated by distances shorter than the wavelength of the light they are designed to focus. These antennas are wavelength-scale metallic elements that introduce a slight phase delay in a light ray that scatters off them. The metasurface can be tuned for specific wavelengths of light by simply changing the size, angle and spacing between the nanoantennas. "The antenna is nothing more than a resonator that stores light and then releases it after a short time delay," Capasso says. "This delay changes the direction of the light in the same way that a thick glass lens would." The lens surface is patterned with antennas of different shapes and sizes that are oriented in different directions. This causes the phase delays to be radially distributed around the lens so that light rays are increasingly refracted further away from the centre, something that has the effect of focusing the incident light to a precise point.
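    For reference, the textbook phase profile such a flat lens must imprint (a standard relation for a lens of focal length f at wavelength λ, not quoted from the article): each antenna at radial position r has to add the phase, modulo 2π,

    \varphi(r) = \frac{2\pi}{\lambda}\left(\sqrt{r^2 + f^2} - f\right)

    so that rays scattered from every point of the flat surface arrive at the focal point with the same total phase; choosing antenna size, shape and orientation to approximate this profile is what "radially distributed phase delays" means in practice.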
pandomilla

Not a scratch - 7 views

shared by pandomilla on 12 Apr 12
LeopoldS liked it
  •  
    I hate scorpions, but this could be a nice subject for a future Ariadna study! This north African desert scorpion doesn't dig burrows to protect itself from the sand-laden wind (as other scorpions do). When the sand whips by at speeds that would strip paint away from steel, the scorpion is able to scurry off without apparent damage.
  •  
    Nice research, though they have done almost all the work that we could do in an Ariadna, didn't they? "To check, they took further photographs. In particular, they used a laser scanning system to make a three-dimensional map of the armour and then plugged the result into a computer program that blasted the virtual armour with virtual sand grains at various angles of attack. This process revealed that the granules were disturbing the air flow near the skeleton's surface in ways that appeared to be reducing the erosion rate. Their model suggested that if scorpion exoskeletons were smooth, they would experience almost twice the erosion rate that they actually do. Having tried things out in a computer, the team then tried them for real. They placed samples of steel in a wind tunnel and fired grains of sand at them using compressed air. One piece of steel was smooth, but the others had grooves of different heights, widths and separations, inspired by scorpion exoskeleton, etched onto their surfaces. Each sample was exposed to the lab-generated sandstorm for five minutes and then weighed to find out how badly it had been eroded. The upshot was that the pattern most resembling scorpion armour - with grooves that were 2mm apart, 5mm wide and 4mm high - proved best able to withstand the assault. Though not as good as the computer model suggested real scorpion geometry is, such grooving nevertheless cut erosion by a fifth, compared with a smooth steel surface. The lesson for aircraft makers, Dr Han suggests, is that a little surface irregularity might help to prolong the active lives of planes and helicopters, as well as those of scorpions."
  •  
    What bugs me (pardon the pun) is that the dimensions of the pattern they used were scaled up by many orders of magnitude, while the "grains of sand" with which the surface was bombarded apparently were not... Not being a specialist in the field, I would nevertheless expect that the size of the surface pattern *in relation to* the size of the particles used for bombarding would be crucial.
LeopoldS

An optical lattice clock with accuracy and stability at the 10-18 level : Nature : Natu... - 0 views

  •  
    Progress in atomic, optical and quantum science [1, 2] has led to rapid improvements in atomic clocks. At the same time, atomic clock research has helped to advance the frontiers of science, affecting both fundamental and applied research. The ability to control quantum states of individual atoms and photons is central to quantum information science and precision measurement, and optical clocks based on single ions have achieved the lowest systematic uncertainty of any frequency standard [3-5]. Although many-atom lattice clocks have shown advantages in measurement precision over trapped-ion clocks [6, 7], their accuracy has remained 16 times worse [8-10]. Here we demonstrate a many-atom system that achieves an accuracy of 6.4 × 10−18, which is not only better than a single-ion-based clock, but also reduces the required measurement time by two orders of magnitude. By systematically evaluating all known sources of uncertainty, including in situ monitoring of the blackbody radiation environment, we improve the accuracy of optical lattice clocks by a factor of 22. This single clock has simultaneously achieved the best known performance in the key characteristics necessary for consideration as a primary standard: stability and accuracy. More stable and accurate atomic clocks will benefit a wide range of fields, such as the realization and distribution of SI units [11], the search for time variation of fundamental constants [12], clock-based geodesy [13] and other precision tests of the fundamental laws of nature. This work also connects to the development of quantum sensors and many-body quantum state engineering [14] (such as spin squeezing) to advance measurement precision beyond the standard quantum limit.
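    The "two orders of magnitude less measurement time" follows directly from how white-frequency-noise instability averages down; a back-of-envelope sketch (standard Allan-deviation scaling, not taken from the paper):

    \sigma_y(\tau) \approx \frac{\sigma_1}{\sqrt{\tau/1\,\mathrm{s}}}
    \quad\Rightarrow\quad
    \tau_\mathrm{required} \propto \left(\frac{\sigma_1}{\sigma_\mathrm{target}}\right)^2

    so a clock whose one-second instability σ1 is ten times smaller reaches the same target uncertainty in roughly 10² = 100 times less averaging time.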
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space a low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, for what purposes are computers mainly used on-board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of some power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies on the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates...
  •  
    Satellites use old CPUs also because, with the trend towards higher power consumption, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation also), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all... and at what precision / power levels.
  •  
    The interest of this could be a high fault tolerance for some math operations, ... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings, considering a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?) Another issue is the radiation tolerance of the technology ... if the PCMOS are more tolerant to radiation they could be space-qualified more easily.....
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
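    To make the precision discussion in this thread concrete, here is a minimal toy sketch (plain Python, my own illustration; it has nothing to do with the actual PCMOS hardware, and the 1e-6 per-operation error is an assumed figure taken from the relative-error level mentioned above). It injects a random relative error into every multiply and add of a long accumulation and compares the result with the exact one:

    import random

    REL_ERR = 1e-6  # assumed per-operation relative error of a "probabilistic" adder/multiplier

    def noisy(x):
        # each arithmetic result is off by a small random relative error
        return x * (1.0 + random.uniform(-REL_ERR, REL_ERR))

    def dot(a, b, exact=True):
        # dot product, optionally computed with noisy arithmetic
        acc = 0.0
        for x, y in zip(a, b):
            p = x * y if exact else noisy(x * y)
            acc = acc + p if exact else noisy(acc + p)
        return acc

    random.seed(0)
    a = [random.random() for _ in range(100000)]
    b = [random.random() for _ in range(100000)]
    ref = dot(a, b, exact=True)
    err = abs(dot(a, b, exact=False) - ref) / ref
    print(f"relative error after ~2e5 noisy operations: {err:.2e}")

    Whether the error accumulated this way is tolerable compared with, say, sensor noise in a navigation filter is exactly the kind of question a first study would have to answer application by application.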
Christos Ampatzis

Academic publishers make Murdoch look like a socialist - 4 views

  •  
    Who are the most ruthless capitalists in the western world? Whose monopolistic practices make Walmart look like a corner shop and Rupert Murdoch a socialist? You won't guess the answer in a month of Sundays. While there are plenty of candidates, my vote goes not to the banks, the oil companies or the health insurers, but - wait for it - to academic publishers.
  •  
    fully agree ... "But an analysis by Deutsche Bank reaches different conclusions. "We believe the publisher adds relatively little value to the publishing process … if the process really were as complex, costly and value-added as the publishers protest that it is, 40% margins wouldn't be available." Far from assisting the dissemination of research, the big publishers impede it, as their long turnaround times can delay the release of findings by a year or more." very nice also: "Government bodies, with a few exceptions, have failed to confront them. The National Institutes of Health in the US oblige anyone taking their grants to put their papers in an open-access archive. But Research Councils UK, whose statement on public access is a masterpiece of meaningless waffle, relies on "the assumption that publishers will maintain the spirit of their current policies". You bet they will. In the short term, governments should refer the academic publishers to their competition watchdogs, and insist that all papers arising from publicly funded research are placed in a free public database. In the longer term, they should work with researchers to cut out the middleman altogether, creating - along the lines proposed by Björn Brembs of Berlin's Freie Universität - a single global archive of academic literature and data. Peer-review would be overseen by an independent body. It could be funded by the library budgets which are currently being diverted into the hands of privateers. The knowledge monopoly is as unwarranted and anachronistic as the corn laws. Let's throw off these parasitic overlords and liberate the research that belongs to us."
  •  
    It is a really great article and the first time I read something in this direction. FULLY AGREE as well. Problem is, I don't have much encouraging news to report from the Brussels region...
LeopoldS

Plant sciences: Plants drink mineral water : Nature : Nature Publishing Group - 1 views

  •  
    Here we go: we might not need liquid water on Mars after all to get some nice flowering plants there! ... and terraform? :-) Thirsty plants can extract water from the crystalline structure of gypsum, a rock-forming mineral found in soil on Earth and Mars.

    Some plants grow on gypsum outcrops and remain active even during dry summer months, despite having shallow roots that cannot reach the water table. Sara Palacio of the Pyrenean Institute of Ecology in Jaca, Spain, and her colleagues compared the isotopic composition of sap from one such plant, called Helianthemum squamatum (pictured), with gypsum crystallization water and water found free in the soil. The team found that up to 90% of the plant's summer water supply came from gypsum.

    The study has implications for the search for life in extreme environments on this planet and others.

    Nature Commun 5, 4660 (2014)
  •  
    Very interesting indeed. Attention has to be paid to the form of calcium sulfate that is found on Mars. If it is hydrated (gypsum, Ca(SO4)·2(H2O)) it works, but if it is dehydrated there is no water for the roots to take in. The Curiosity rover tries to find out, but there is uncertainty in recognising the presence of hydrogen in the mineral. Quoting from the paper: "(...) 3.2 Hydration state of calcium sulfates Calcium sulfates occur as a non-hydrated phase (anhydrite, CaSO4) or as one of two hydrated phases (bassanite, CaSO4.1/2H2O, which can contain a somewhat variable water content, and gypsum, CaSO4.2H2O). ChemCam identifies the presence of hydrogen at 656 nm, as already found in soils and dust [Meslin et al., 2013] and within fluvial conglomerates [Williams et al., 2013]. However, the quantification of H is strongly affected by matrix effects [Schröder et al., 2013], i.e. effects including major or even minor element chemistry, optical and mechanical properties, that can result in variations of emission lines unrelated to actual quantitative variations of the element in question in the sample. Due to these effects, discriminating between bassanite and gypsum is difficult. (...)"
LeopoldS

Warp Drive More Possible Than Thought, Scientists Say | Space.com - 1 views

  •  
    Sante, Andreas, Luzi, Pacome ... we need you: "But recently White calculated what would happen if the shape of the ring encircling the spacecraft was adjusted into more of a rounded donut, as opposed to a flat ring. He found in that case, the warp drive could be powered by a mass about the size of a spacecraft like the Voyager 1 probe NASA launched in 1977.

    Furthermore, if the intensity of the space warps can be oscillated over time, the energy required is reduced even more, White found.

    "The findings I presented today change it from impractical to plausible and worth further investigation," White told SPACE.com. "The additional energy reduction realized by oscillating the bubble intensity is an interesting conjecture that we will enjoy looking at in the lab.""
  •  
    To me, this looks a little bit like the claim "infinity minus one is a little bit less than infinity"...
  •  
    Luzi I miss you ...
Beniamino Abis

The Wisdom of (Little) Crowds - 1 views

  •  
    What is the best (wisest) size for a group of individuals? Couzin and Kao put together a series of mathematical models that included correlation and several cues. In one model, for example, a group of animals had to choose between two options - think of two places to find food. But the cues for each choice were not equally reliable, nor were they equally correlated. The scientists found that in these models, a group was more likely to choose the superior option than an individual. Common experience makes us expect that the bigger the group gets, the wiser it becomes. But they found something very different. Small groups did better than individuals. But bigger groups did not do better than small groups. In fact, they did worse. A group of 5 to 20 individuals made better decisions than an infinitely large crowd. The problem with big groups is this: a faction of the group will follow correlated cues - in other words, the cues that look the same to many individuals. If a correlated cue is misleading, it may cause the whole faction to cast the wrong vote. Couzin and Kao found that this faction can drown out the diversity of information coming from the uncorrelated cue. And this problem only gets worse as the group gets bigger.
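    A toy simulation of the mechanism described above (my own illustrative sketch with parameters I picked, not Kao and Couzin's actual model): each voter follows one shared, sometimes-misleading cue with some probability, otherwise an independent private cue, and the group decides by majority.

    import random

    def group_accuracy(n, trials=20000, p_shared=0.65, p_private=0.60, q_shared=0.5):
        # Each voter follows the shared (correlated) cue with probability q_shared,
        # otherwise an independent private cue; the group then votes by majority.
        wins = 0.0
        for _ in range(trials):
            shared_is_correct = random.random() < p_shared   # one draw for the whole faction
            correct_votes = 0
            for _ in range(n):
                if random.random() < q_shared:
                    correct_votes += shared_is_correct
                else:
                    correct_votes += random.random() < p_private
            if 2 * correct_votes > n:
                wins += 1
            elif 2 * correct_votes == n:
                wins += 0.5                                  # break ties at random
        return wins / trials

    random.seed(1)
    for n in (1, 5, 15, 51, 201):
        print(n, round(group_accuracy(n), 3))

    With these made-up numbers an individual is right roughly 62% of the time, a group of 5-15 roughly 67%, and a very large crowd converges back to the 65% reliability of the shared cue, because the correlated faction dominates the vote whenever its cue is wrong.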
  •  
    Couzin's research was the starting point that co-inspired PaGMO from the very beginning. We invited him (and he came) to give a plenary at a formation-flying conference here at ESTEC. You can see PaGMO as a collective problem-solving simulation. In that respect, we have already learned that the size of the group and its internal structure (topology) matter and cannot be too large or too random. One of the projects the ACT is running (and for which it is currently seeking new ideas/actors) is briefly described here (http://esa.github.io/pygmo/examples/example2.html) and attempts to answer the question "How is collective decision making influenced by the information flow through the group?" by looking at complex simulations of large 'archipelagos'.
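    For readers who want to try this, a minimal sketch of such an archipelago using the current pygmo 2 Python API (the example linked above uses the older PyGMO interface, so names and keywords - in particular the topology argument t - may differ between versions):

    import pygmo as pg

    prob = pg.problem(pg.rosenbrock(dim=10))           # toy problem standing in for a real one
    algo = pg.algorithm(pg.de(gen=100))                # differential evolution on each island
    archi = pg.archipelago(n=8,                        # "group size": number of islands
                           t=pg.topology(pg.ring()),   # "internal structure": who exchanges migrants with whom
                           algo=algo, prob=prob, pop_size=20)
    archi.evolve(10)                                   # rounds of evolution interleaved with migration
    archi.wait_check()
    print(min(f[0] for f in archi.get_champions_f()))  # best objective value found across the group

    Varying n and the topology (ring, fully connected, etc.) is the computational analogue of varying group size and information flow in the collective-decision experiments.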
LeopoldS

Ministry of Science and Technology of the People's Republic of China - 0 views

  •  
    University Alliance for Low Carbon Energy: three universities, including Tsinghua University, the University of Cambridge and the Massachusetts Institute of Technology, have set up an alliance on November 15, 2009 to advocate low-carbon energy and climate change adaptation. The alliance will mainly work on six major areas: clean coal technology and CCS, homebuilding energy efficiency, industrial energy efficiency and sustainable transport, biomass energy and other renewable energy, advanced nuclear energy, intelligent power grids, and energy policies/planning. A steering panel made up of senior experts from the three universities (two from each) will be established to review, evaluate, and endorse the goals, projects, fund-raising activities, and collaborations under the alliance. With its headquarters on the campus of Tsinghua University and branch offices at the other two universities, the alliance will be chaired by a scientist selected from Tsinghua University. According to a briefing, the alliance will need a budget of USD 3-5 million, mainly from donations from government, industry, and all walks of life. In this context, the R&D findings derived from the alliance will find their applications in improving people's lives.
LeopoldS

House Approves Flat 2011 Budget for Most Science Agencies - ScienceInsider - 0 views

  •  
    "Some segments of the research community would get their preferences under the House spending bill. For example, it matches the president's request for a 1.5% increase for NASA, to $19 billion, including a 12% increase, to $5 billion, for the space science program. Legislators had already worked out a deal with the White House on the future of the manned space program, and they included funding for an additional shuttle flight in 2011. They even added $35 million to the $20 million increase that the president requested for NASA's education programs, boosting them by a whopping 30% to $180 million. "
Luís F. Simões

Boeing probes international market for human spacecraft - 1 views

  • The aerospace powerhouse is designing and testing systems for its CST-100 space capsule, a craft the company says could begin flying astronauts to low Earth orbit by 2015. It will launch on existing rockets to lessen development risk and costs.
  • "The spacecraft that we're designing is rocket-agnostic. It would be possible to sell this like a commercial airplane to countries who perhaps have a launch vehicle who would like to launch it in their own country."
  •  
    ...and hitting the news on the same day: A Rocket Built from U.S. and European Parts. "A new rocket that would combine parts from NASA's canceled Ares I rocket as well as the Ariane 5, a well-proven European satellite launcher, could provide a low-cost option for taking crew and cargo to the space station. The rocket proposal was announced this week by ATK, an aerospace and defense company that manufactures the solid rocket motors for NASA's space shuttles, and Astrium, the European company that makes the Ariane 5. They say the rocket, called Liberty, would be ready for flight by 2015." "Other commercial companies, including Boeing and Orbital Sciences Corporation, are looking to use low-end versions of the Atlas V to carry the capsules they are building. Liberty could carry any capsule at a cost less than that of the Atlas V, according to ATK." Look! Competition! :)
Luzi Bergamin

Prof. Markrams Hirnmaschine (Startseite, NZZ Online) - 2 views

  •  
    A critical view on Prof. Markram's Blue Brain project (in German).
  • ...4 more comments...
  •  
    A critical view on Prof. Markram's Blue Brain project (in German).
  •  
    so critical that the comment needed to be posted twice :-) ?
  •  
    Yes, I know; I still don't know how to deal with this f.... Diigo Toolbar! Shame on me!!!
  •  
    Would be nice to read something about the modelling, but it appears that there is nothing published in detail. According to the article, the main approach is to model each(!) neuron, taking into account the spatial structure of the neurons' positions. Once that is achieved they expect intelligent behaviour. And they need a (type of) supercomputer which does not exist yet.
  •  
    As far as I know it's sort of like "Let's construct an enormous dynamical system and see what happens"... i.e. a waste of taxpayers' money... Able to heal Alzheimer's... Yeah... Actually I was at the conference the author mentions (FET 2011) and I saw the presentations of all 6 flagship proposals. Following that I had a discussion with one of my colleagues about the existence of limits to the amount of bullshit politicians are willing to buy from scientists. Will there be a point at which politicians, despite their total ignorance, realise that scientists simply don't deliver anything they promise? How long will we (scientists) be stuck in the vicious circle of having to promise more than our predecessors in order to get money? Will we face a situation where we'll be forced to revert to promises which are realistic? To be honest none of the 6 presentations convinced me of their scientific merit (apart from the one on graphene, where I have absolutely no expertise to tell). Apparently a huge amount of money is about to be wasted.
  •  
    It's not just "Let's construct an enormous dynamical system and see what happens", it's worse! Also the simulation of the cosmological evolution is/was a little bit of this type, still the results are very interesting and useful. Why? Neither the whole cosmos nor the human brain at the level of single neurons can be modelled on a computer, that would last aeons on a "yet-to-be-invented-extra-super-computer". Thus one has to make assumptions and simplifications. In cosmology we have working theories of gravitation, thermodynamics, electrodynamics etc. at hand; starting from these theories we can make reasonable assumptions and (more or less) justified simplifications. The result is valuable since it provides insight into a complex system under given, explicit and understood assumptions. Nothing similar seems to exist in neuroscience. There is no theory of the human brain and apparently nobody has the slightest idea which simplifications can be made without harm. Of course, Mr. Markram remains completely unaffected of ''details'' like this. Finally, Marek, money is not wasted, we ''build networks of excellence'' and ''select the brightest of the brightest'' to make them study and work at our ''elite institutions'' :-). I lively remember the stage of one of these "bestofthebest" from Ivy League at the ACT...
Dario Izzo

Critique of 'Debunking the climate hiatus', by Rajaratnam, Romano, Tsiang, and Diffenba... - 8 views

  •  
    Hilarious critique of a quite important paper from Stanford trying to push the agenda of global warming .... "You might therefore be surprised that, as I will discuss below, this paper is completely wrong. Nothing in it is correct. It fails in every imaginable respect."
  • ...4 more comments...
  •  
    To quote Francisco: "If at first you don't succeed, use another statistical test." A wiser man shall never walk the earth.
  •  
    why is this just put on a blog and not published properly?
  •  
    If you read the comments it's because the guy doesn't want to put in the effort. Also because I suspect the politics behind climate science favor only a particular kind of result.
  •  
    Just a footnote here: the climate-warming side is not driven by an agenda of presenting the world with evil. If one looks at big journals with high outreach, it is not uncommon to find articles presenting climate warming as something not bringing the doom that extremists are promoting with marketing strategies. Here is a recent article in Science: http://www.ncbi.nlm.nih.gov/pubmed/26612836 Science's role is to look at the phenomenon and report what is observed. And here is one saying that the acidification of the ocean due to the increase of CO2 (an observed phenomenon) is not advancing destructively for coccolithophores (a key type of plankton that builds its shell out of carbonates), as we were expecting, but rather fertilises them! Good news in principle! The more hardened sceptics with high "doubting-inertia" could argue that 'it could be because CO2 is not rising in the first place', but one must not forget that while one can doubt the global increase in T with statistical analyses, because it is a complex variable, one cannot doubt the CO2 increase compared to preindustrial levels. In either case: case 1: agenda for 'the world is warming' => - put random big energy company here - sells renewable energies; case 2: agenda for 'the world is fine' => - put random big energy company here - sells oil as usual. The fact that in both cases someone is going to win profits does not correlate (still no adequate statistical test found for it?) with the fact that the science needs to be more and more scrutinised. The blog of the statistics professor at Univ. Toronto looks like an interesting approach (I have not understood all the details), and the paper above is by JPL authors, among others.
Luzi Bergamin

First circuit breaker for high voltage direct current - 2 views

  •  
    Doesn't really sound sexy, but this is of utmost importance for next generation grids for renewable energy.
  •  
    I agree on the significance indeed - a small boost also for my favourite Desertec project ... Though their language is a bit too "grandiose": "ABB has successfully designed and developed a hybrid DC breaker after years of research, functional testing and simulation in the R&D laboratories. This breaker is a breakthrough that solves a technical challenge that has been unresolved for over a hundred years and was perhaps one of the main influencers in the 'war of currents' outcome. The 'hybrid' breaker combines mechanical and power electronics switching that enables it to interrupt power flows equivalent to the output of a nuclear power station within 5 milliseconds - that's as fast as a single flap of a honey bee's wing - and more than 30 times faster than the reaction time of an Olympic 100-meter medalist to the starter's gun! But it's not just about speed. The challenge was to do it 'ultra-fast' with minimal operational losses and this has been achieved by combining advanced ultrafast mechanical actuators with our in-house semiconductor IGBT valve technologies or power electronics (watch video: Hybrid HVDC Breaker - How does it work). In terms of significance, this breaker is a 'game changer'. It removes a significant stumbling block in the development of HVDC transmission grids, where planning can start now. These grids will enable interconnection and load balancing between HVDC power superhighways integrating renewables and transporting bulk power across long distances with minimal losses. DC grids will enable sharing of resources like lines and converter stations, providing reliability and redundancy in a power network in an economically viable manner with minimal losses. ABB's new hybrid HVDC breaker, in simple terms, will enable the transmission system to maintain power flow even if there is a fault on one of the lines. This is a major achievement for the global R&D team in ABB who have worked for years on the challeng
Dario Izzo

Climate scientists told to 'cover up' the fact that the Earth's temperature hasn't rise... - 5 views

  •  
    This is becoming a mess :)
  • ...2 more comments...
  •  
    I would avoid reading climate science from political journals if you want a less selective / dramatic picture :-). Here is a good start: http://www.realclimate.org/ And an article on why climate understanding should be approached hierarchically (which is not the way it is done in the IPCC), a view with insight from 8 years ago: http://www.princeton.edu/aos/people/graduate_students/hill/files/held2005.pdf
  •  
    True, but funding is allocated to climate modelling 'science' on the basis of political decisions, not solid and boring scientific truisms such as 'all models are wrong'. The reason so many people got trained in this area in the past years is that resources were allocated to climate science on the basis of the dramatic picture depicted by some scientists when it was indeed convenient for them to be dramatic.
  •  
    I see your point, and I agree that funding was also promoted through the energy players and their political influence. A coincident parallel interest, which is irrelevant to the fact that the question remains vital: how do we affect climate and how does it respond? It is a huge, complex system to analyse, which responds on various time scales that can obscure the trend. What if we made a conceptual parallel with the L'Aquila case: is the scientific method guilty, or the interpretation of uncertainty in terms of societal mobilization? Should we leave the humanitarian aspect outside any scientific activity?
  •  
    I do not think there is anyone arguing that the question is not interesting and complex. The debate, instead, addresses the predictive value of the models produced so far. Are they good enough to be used outside of the scientific process aimed at improving them? Or should one wait for "the scientific method" to bring forth substantial improvements to the current understanding and only then start using its results? One can take both standpoints, but some recent developments will bring many towards the second approach.
LeopoldS

Google Says the FBI Is Secretly Spying on Some of Its Customers | Threat Level | Wired.com - 3 views

  •  
    not a surprise though still bad to read ....
  •  
    On a side note, it's hilarious to read an article on something repeatedly referred to as being secret...
  •  
    The article describes it quite clearly though: "The terrorists apparently would win if Google told you the exact number of times the Federal Bureau of Investigation invoked a secret process to extract data about the media giant's customers. That's why it is unlawful for any record-keeper to disclose it has received a so-called National Security Letter. But under a deal brokered with the President Barack Obama administration, Google on Tuesday published a "range" of times it received National Security Letters demanding it divulge account information to the authorities without warrants. It was the first time a company has ever released data chronicling the volume of National Security Letter requests. National Security Letters allow the government to get detailed information on Americans' finances and communications without oversight from a judge. The FBI has issued hundreds of thousands of NSLs and has even been reprimanded for abusing them. The NSLs are written demands from the FBI that compel internet service providers, credit companies, financial institutions and businesses like Google to hand over confidential records about their customers, such as subscriber information, phone numbers and e-mail addresses, websites visited and more, as long as the FBI says the information is "relevant" to an investigation." and ""You'll notice that we're reporting numerical ranges rather than exact numbers. This is to address concerns raised by the FBI, Justice Department and other agencies that releasing exact numbers might reveal information about investigations. We plan to update these figures annually," Richard Salgado, a Google legal director, wrote in a blog post. Salgado was not available for comment. What makes the government's position questionable is that it is required by Congress to disclose the number of times the bureau issues National Security Letters. In 2011, the year with the latest available figures, the FBI issued 16,511 National Sec
Thijs Versloot

Light brought to a complete stop - 3 views

  •  
    "When a control laser is fired at the crystal, a complex quantum-level reaction turns it the opaque crystal transparent. A second light source is beamed into the crystal before the control laser is shut off, returning the crystal to its opaque state. This leaves the light trapped inside the crystal, and the opacity of the crystal keeps the light trapped inside from bouncing around, effectively bringing light to a full stop." is the simple explanation, but I am not sure how this is actually possible with the current laws of physics
  •  
    There are two ways to make slow light: material slow light and structural slow light, where you either change the material or the structural properties of your system. Here they used electromagnetically induced transparency (EIT) to make material slow light, by inducing transparency inside an otherwise opaque material. As you change the absorption properties of a material you also change its dispersion properties, via the so-called Kramers-Kronig relations. A rapid positive change in the dispersion properties of a material gives rise to slow light. To effectively stop light they switched off the control beam, bringing back the opaque state. Another control beam is then used to retrieve the probe pulse that was 'frozen' inside the medium. Light can be halted for a time set by the population lifetime of the energy level (~100 s). They used an evolutionary algorithm to find an optimal pulse-preparation sequence and got close to the maximum possible storage duration of 100 s. Interesting paper!
  •  
    So it is not real storage then in a sense, as you are stimulating an excitation population which retains the phase information of your original pulse? Still it is amazing that they could store this for up to 100 s and retrieve it with a probe pulse, but the light itself has never really been halted.
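    For reference, the textbook relation behind the slow-light part of this discussion (standard dispersion result, not taken from the paper): the group velocity of a pulse is

    v_g = \frac{c}{n_g}, \qquad n_g = n(\omega) + \omega\,\frac{dn}{d\omega}

    so the steep, positive dispersion dn/dω inside the narrow EIT transparency window makes the group index n_g huge and v_g tiny; switching the control beam off while the pulse is inside maps it onto a long-lived atomic coherence, which is why the storage time is bounded by the ~100 s lifetime quoted above rather than by any propagation effect.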
johannessimon81

Cosmological model without accelerated expansion proposed - 1 views

  •  
    Redshift in this model is partially produced by a change in the masses of elementary particles (and atoms)
  • ...3 more comments...
  •  
    It seems to solve the problem of infinite energy density at the singularity in any case. I would love to see a way of experimentally verifying this, although most people seem to believe it is wrong. I did read the following quote though, by Dirac to Pauli: "we all agree your idea is crazy, but the real question is: is it crazy enough to be correct?"
  •  
    As far as I can see, this is not untestable per se, rather an explanation of the redshift that is equivalent to accelerating expansion. It is not that the theory is untestable; it is rather just another way of looking at it. Kind of like how it's sometimes convenient to consider light a particle, and sometimes a wave. In the same way it could sometimes be convenient to view the universe as static with increasing mass instead.
  •  
    Well, the premise "matter getting heavier" may be open to falsification in some way or another. Currently there is no absolute method to determine mass, so it might even be plausible that this is actually the case. I don't think it is related, but there is a known problem with the 1 kg standards (1 official and 6 copies) whose masses seem to deviate from one another.
  •  
    It should not be impossible to verify a change in mass(es) over time. For example, the electron cyclotron frequency scales as ~e/m while the hydrogen emission frequencies scale as ~m*e^4. Using multiple relationships like these, which can be measured easily and accurately, an increase in the mass of fundamental particles should - in principle - be detectable (even if the mass of the Earth increases at the same time, changing the relativistic reference frame).
  •  
    The watt balance and a definition using Planck's constant seem to do the trick and are currently being discussed. Would the electron charge not be problematic, as it is related to coulombs, which depend on amperes, which are defined via newtons, and hence depend back on the mass again?
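    To spell out the cyclotron-versus-hydrogen argument from a couple of comments above (my own back-of-envelope, treating h, ε0 and the magnetic-field reference as fixed; strictly speaking only dimensionless combinations such as the fine-structure constant or mass ratios are unambiguous):

    \omega_c = \frac{e B}{m_e}, \qquad
    \nu_\mathrm{H} \propto R_\infty c = \frac{m_e e^4}{8 \varepsilon_0^2 h^3}

    so for small, slow drifts,

    \frac{\delta \omega_c}{\omega_c} = \frac{\delta e}{e} - \frac{\delta m_e}{m_e} + \frac{\delta B}{B},
    \qquad
    \frac{\delta \nu_\mathrm{H}}{\nu_\mathrm{H}} = \frac{\delta m_e}{m_e} + 4\,\frac{\delta e}{e}

    Monitoring both frequencies against the same clock over time gives two equations for the two unknowns δe/e and δm/m, which is why a drift in mass could in principle be disentangled from a drift in charge.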