Advanced Concepts Team: Group items matching "atoms" in title, tags, annotations or url

Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  • Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial! Maybe we should look back into this?
  • ...6 more comments...
  • Q1: For the time being, for what purposes are computers mainly used on-board?
  • for navigation, control, data handling and so on ... why?
  • Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or +/- 10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like: the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates...
  • Satellites use old CPUs also because, with the trend of going for higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation also), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at a level of (say) 10^-6. All in all you are right: a first study should assess what applications this would be useful for at all ... and at what precision / power levels.
  • The interest of this could be a high fault tolerance for some math operations, ... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings considering a usual 10-15 kW spacecraft.
  • What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?). Another issue is the radiation tolerance of the technology ... if the PCMOS are more tolerant to radiation they could get space-qualified more easily.
  • I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also include the "hardened" phase.
  • I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
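To make the precision / power question in the thread above a bit more concrete, here is a minimal, purely illustrative Python sketch (not a model of the actual PCMOS chips from the article, and all numbers are invented): it injects a random relative error into every arithmetic result and checks how the error propagates through a toy trajectory integration, which is the kind of sensitivity a first study would have to quantify.

```python
import random

def noisy(x, eps):
    """Return x perturbed by a random relative error of magnitude up to eps,
    crudely mimicking a probabilistic / reduced-precision arithmetic unit."""
    return x * (1.0 + random.uniform(-eps, eps))

def integrate_trajectory(eps, steps=10000, dt=0.1):
    """Integrate constant-acceleration 1-D motion using the noisy arithmetic."""
    pos, vel, acc = 0.0, 7000.0, -0.5  # m, m/s, m/s^2 (arbitrary test values)
    for _ in range(steps):
        vel = noisy(vel + acc * dt, eps)
        pos = noisy(pos + vel * dt, eps)
    return pos

exact = integrate_trajectory(0.0)
for eps in (1e-3, 1e-6, 1e-9):
    rel_err = abs(integrate_trajectory(eps) - exact) / abs(exact)
    print(f"per-operation error {eps:.0e} -> final position relative error {rel_err:.1e}")
```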
pacome delva

Landmarks: Teleportation is not Science Fiction - 0 views

  • If a practical quantum computer is ever built, he says, it will have to make use of different kinds of quantum bits, perhaps atoms for storage and photons for transmission. So teleportation would be a natural way to connect these components, he says.
LeopoldS

In the Next Industrial Revolution, Atoms Are the New Bits | Magazine - 1 views

  • Nice article - should be a handy tool to make your own small cubesat - who will be first?
pacome delva

Atomic clock is smallest on the market - 2 views

  • Soon, caesium in your watch!
  • very nice indeed ... how much more accurate are our Galileo clocks?
  • This small clock is around 10^-12 @ 1 day in stability (lose 1 second after 300,000 years) and 50 ns in accuracy. For comparison, Galileo and GPS clocks are around 10^-14 @ 1 day in stability and 1 ns in accuracy. And ACES/PHARAO will be around 3*10^-16 @ 1 day in stability and 0.3 ps accuracy.
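As a rough back-of-the-envelope comparison of what those stability figures mean in accumulated time error, here is a small sketch that treats each quoted figure as if it were a constant fractional frequency offset held for one day. That is a simplification (an Allan deviation at 1 day is not a constant bias), so read the outputs as orders of magnitude only.

```python
# delta_t = y * tau: time error accumulated over tau at constant fractional offset y
DAY = 86400.0  # seconds

clocks = {
    "chip-scale clock (1e-12 @ 1 day)": 1e-12,
    "Galileo / GPS clock (1e-14 @ 1 day)": 1e-14,
    "ACES/PHARAO (3e-16 @ 1 day)": 3e-16,
}

for name, y in clocks.items():
    print(f"{name}: ~{y * DAY * 1e9:.3g} ns of time error per day")
```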
Francesco Biscani

Asteroid blast reveals holes in Earth's defences - space - 26 October 2009 - New Scientist - 2 views

  • On 8 October an asteroid detonated high in the atmosphere above South Sulawesi, Indonesia, releasing about as much energy as 50,000 tons of TNT, according to a NASA estimate released on Friday. That's about three times more powerful than the atomic bomb that levelled Hiroshima, making it one of the largest asteroid explosions ever observed.
nikolas smyrlakis

BBC NEWS | Science & Environment | Bridging the gap to quantum world - 0 views

  • A "spooky" quantum effect is seen in a mechanical system for the first time.
ESA ACT

Dynamics of phononic dissipation at the atomic scale: Dependence on internal degrees of freedom - 0 views

  • The dynamics of dissipation of local vibrations into the surrounding substrate is a key issue in friction between sliding surfaces as well as in boundary lubrication.
pacome delva

Physics - Free falling - 2 views

  • In a Rapid Communication appearing in Physical Review A, Pengfei Zhang and colleagues at Shanxi University, China, describe experiments where they tracked an atom’s path with a spatial resolution of 100 nanometers and in a measurement time of 10 microseconds.
Thijs Versloot

A Groundbreaking Idea About Why Life Exists - 1 views

  • Jeremy England, a 31-year-old assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life. The simulation results made me think of Jojo's attempts to make a self-assembling space structure. Seems he may have been on the right track, just not thinking big enough.
  • :-P Thanks Thijs... I do not agree with the premise of the article that a possible correlation between energy dissipation in living systems and their fitness means that one is the cause of the other - it may just be that both go hand-in-hand because of the nature of the world that we live in. Maybe there is such a drive for pre-biotic systems (like crystals and amino acids), but once life as we know it exists (i.e., heredity + mutation) it is hard to see the need for an amendment of Darwin's principles. The following just misses the essence of Darwin: "If England's approach stands up to more testing, it could further liberate biologists from seeking a Darwinian explanation for every adaptation and allow them to think more generally in terms of dissipation-driven organization. They might find, for example, that "the reason that an organism shows characteristic X rather than Y may not be because X is more fit than Y, but because physical constraints make it easier for X to evolve than for Y to evolve." Darwin's principle in its simplest expression just says that if a genome is more effective at reproducing it is more likely to dominate the next generation. The beauty of it is that there is NO need for a steering mechanism (like maximizing energy dissipation): any random set of mutations will still lead to an increase of reproductive effectiveness. BTW: what does "better at dissipating energy" even mean? If I run around all the time I will have more babies? Most species that prove to be very successful end up being very good at conserving energy: trees, turtles, worms. Even complexity of an organism is not a recipe for evolutionary success: jellyfish have been successful for hundreds of millions of years while polar bears seem to be on the way out.
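A toy sketch of the "no steering mechanism needed" point above (the model and all parameters are invented, purely for illustration): parents are sampled in proportion to their reproductive effectiveness and offspring are mutated at random, and mean effectiveness typically rises without any dissipation criterion or other target built in; with uniform sampling it merely drifts.

```python
import random

def evolve(select=True, pop_size=500, generations=200, mutation_rate=0.01):
    """Minimal replicator model: each individual carries a 'reproductive
    effectiveness' value. Parents are sampled either proportionally to that
    value (selection on) or uniformly (selection off); offspring inherit the
    parent's value with a small random multiplicative mutation. No steering
    mechanism of any kind (energy dissipation or otherwise) is built in."""
    fitness = [1.0] * pop_size
    for _ in range(generations):
        weights = fitness if select else None
        parents = random.choices(fitness, weights=weights, k=pop_size)
        fitness = [f * (1.0 + random.gauss(0.0, mutation_rate)) for f in parents]
    return sum(fitness) / pop_size

print("mean reproductive effectiveness, selection on :", round(evolve(select=True), 3))
print("mean reproductive effectiveness, selection off:", round(evolve(select=False), 3))
```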
Athanasia Nikolaou

Silk protein and chloroplasts for the synthetic leaf - 2 views

  • Royal College of Art's Innovation Design Engineering course in collaboration with the Tufts University silk lab. Not as good as it sounds, as it does not fully mimic the photosynthesis equation (spare C, H atoms).
  • Interesting stuff, and I guess it does not need to fully mimic photosynthesis in the end. As long as oxygen can be produced from CO2 and water, that would be great enough. Though the carbon has to be deposited somewhere (in some form) and I wonder how one could extract this efficiently. Maybe it can even serve some purpose (as the sugars do for the plant).
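For reference, the overall photosynthesis reaction the two comments are discussing (standard textbook stoichiometry); the "spare C, H atoms" remark makes sense if the device releases O2 without fixing the carbon and hydrogen into sugars the way a real leaf does:

$$ 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \xrightarrow{\text{light}} \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} $$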
jcunha

Kilogram conflict resolved at last - 3 views

  • Apparently it's time for the retirement of Le Grand K, if all goes well by the middle of next year...
  • "One method (...) involves counting the atoms in two silicon-28 spheres that each weigh the same as the reference kilogram." Sounds like a lengthy task, but someone must keep those physics PhD students busy, I guess...
Dario Izzo

Critique of 'Debunking the climate hiatus', by Rajaratnam, Romano, Tsiang, and Diffenbaugh | Radford Neal's blog - 8 views

  • Hilarious critique of a quite important paper from Stanford trying to push the agenda of global warming ... "You might therefore be surprised that, as I will discuss below, this paper is completely wrong. Nothing in it is correct. It fails in every imaginable respect."
  • ...4 more comments...
  • To quote Francisco: "If at first you don't succeed, use another statistical test." A wiser man shall never walk the earth.
  • why is this just put on a blog and not published properly?
  • If you read the comments, it's because the guy doesn't want to put in the effort. Also because I suspect the politics behind climate science favor only a particular kind of result.
  • Also a good overview: https://www.nasa.gov/sites/default/files/atoms/files/noaa_nasa_global_analysis_2015.pdf
  • Just a footnote here: the climate warming claim is not driven by an agenda of presenting the world with evil. If one looks at big journals with high outreach, it is not uncommon to find articles presenting climate warming as something not bringing the doom that extremists are promoting with marketing strategies. Here is a recent article in Science: http://www.ncbi.nlm.nih.gov/pubmed/26612836 Science's role is to look at the phenomenon and report what is observed. And here is one saying that the acidification of the ocean due to the increase of CO2 (an observed phenomenon) is not advancing destructively for coccolithophores (a key type of plankton that builds its shell out of carbonates), as we were expecting, but rather fertilises them! Good news in principle! The more sceptical, with high "doubting inertia", could argue that 'it could be because CO2 is not rising in the first place', but one must not forget that while one can doubt the global increase in T with statistical analyses, because it is a complex variable, one cannot doubt the CO2 increase compared to preindustrial levels. In either case: case 1, agenda for 'the world is warming' => (put random big energy company here) sells renewable energies; case 2, agenda for 'the world is fine' => (put random big energy company here) sells oil as usual. The fact that in both cases someone is going to profit does not correlate (still no adequate statistical test found for it?) with the fact that the science needs to be more and more scrutinised. The blog of the statistics professor at Univ. Toronto looks like an interesting approach (I have not understood all the details), and the paper above is from JPL authors, among others.
Paul N

New derivation of pi links quantum physics and pure math - 5 views

  • In 1655 the English mathematician John Wallis published a book in which he derived a formula for pi as the product of an infinite series of ratios. Now researchers from the University of Rochester, in a surprise discovery, have found the same formula in quantum mechanical calculations of the energy levels of a hydrogen atom.
  • This is insanity, Max. Or maybe it's genius.
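For the curious, the 1655 Wallis product mentioned in the excerpt is easy to check numerically; the product itself is textbook material, and what the article reports as new is its appearance in the quantum mechanical calculation for the hydrogen atom. A quick sketch:

```python
from math import pi

def wallis_pi(n_terms):
    """Wallis (1655): pi/2 = product over n >= 1 of (2n / (2n - 1)) * (2n / (2n + 1))."""
    prod = 1.0
    for n in range(1, n_terms + 1):
        prod *= (2.0 * n) / (2.0 * n - 1.0) * (2.0 * n) / (2.0 * n + 1.0)
    return 2.0 * prod

for n in (10, 1000, 100000):
    print(f"{n:>6} terms: {wallis_pi(n):.6f}   (pi = {pi:.6f})")
```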
Thijs Versloot

Quantum entanglement at ambient conditions in a macroscopic solid-state spin ensemble - 1 views

  • Quoted from one of the authors in a separate interview: "We know that the spin states of atomic nuclei associated with semiconductor defects have excellent quantum properties at room temperature," said Awschalom, Liew Family Professor in Molecular Engineering and a senior scientist at Argonne National Laboratory. "They are coherent, long-lived and controllable with photonics and electronics. Given these quantum 'pieces,' creating entangled quantum states seemed like an attainable goal." Bringing the quantum world to the macroscopic scale could see some interesting applications in sensors, or generally entanglement-enhanced applications.
  • They were previously working on the same concept in N-V centers in diamond (as a semiconductor). Here the advantage is that SiC could in principle be integrated with Si or Ge. Anyway, it's all about controlling coherence. In the next 10 years some breakthroughs are expected in the field of semiconductor spintronics, but quantum computing in this way still lies on the horizon.