
Advanced Concepts Team: Group items tagged "standards"


LeopoldS

An optical lattice clock with accuracy and stability at the 10⁻¹⁸ level : Nature : Natu... - 0 views

  •  
    Progress in atomic, optical and quantum science [1, 2] has led to rapid improvements in atomic clocks. At the same time, atomic clock research has helped to advance the frontiers of science, affecting both fundamental and applied research. The ability to control quantum states of individual atoms and photons is central to quantum information science and precision measurement, and optical clocks based on single ions have achieved the lowest systematic uncertainty of any frequency standard [3, 4, 5]. Although many-atom lattice clocks have shown advantages in measurement precision over trapped-ion clocks [6, 7], their accuracy has remained 16 times worse [8, 9, 10]. Here we demonstrate a many-atom system that achieves an accuracy of 6.4 × 10⁻¹⁸, which is not only better than a single-ion-based clock, but also reduces the required measurement time by two orders of magnitude. By systematically evaluating all known sources of uncertainty, including in situ monitoring of the blackbody radiation environment, we improve the accuracy of optical lattice clocks by a factor of 22. This single clock has simultaneously achieved the best known performance in the key characteristics necessary for consideration as a primary standard: stability and accuracy. More stable and accurate atomic clocks will benefit a wide range of fields, such as the realization and distribution of SI units [11], the search for time variation of fundamental constants [12], clock-based geodesy [13] and other precision tests of the fundamental laws of nature. This work also connects to the development of quantum sensors and many-body quantum state engineering [14] (such as spin squeezing) to advance measurement precision beyond the standard quantum limit.
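As a rough orientation for the stability and measurement-time claims above (a textbook quantum-projection-noise scaling, not taken from the paper): for N uncorrelated atoms at transition frequency ν₀, interrogated for a time T with cycle time T_c and total averaging time τ, the fractional instability is roughly

```latex
\sigma_y(\tau) \;\approx\; \frac{1}{2\pi\,\nu_0\,T}\,\sqrt{\frac{T_c}{N\,\tau}} .
```

The 1/√N factor is what lets a lattice clock interrogating thousands of atoms reach a given statistical uncertainty far faster than a single trapped ion, consistent with the two-orders-of-magnitude reduction in measurement time quoted in the abstract; spin squeezing aims to push below even this standard quantum limit.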
Francesco Biscani

STLport: An Interview with A. Stepanov - 2 views

  • Generic programming is a programming method that is based in finding the most abstract representations of efficient algorithms.
  • I spent several months programming in Java.
  • for the first time in my life programming in a new language did not bring me new insights
  • it has no intellectual value whatsoever
  • Java is clearly an example of a money oriented programming (MOP).
  •  
    One of the authors of the STL (C++'s Standard Template Library) explains generic programming and slams Java.
  •  
    "Java is clearly an example of a money oriented programming (MOP)." Exactly. And for the industry it's the money that matters. Whatever mathematicians think about it.
  •  
    It is actually a good thing that it is "MOP" (even though I do not agree with this term): that is what makes it interoperable, light and easy to learn. There is no point in writing fancy code if it does not bring anything to the end user, but only lets geeks discuss incomprehensible things in forums. Anyway, I am pretty sure we can find a Java guy slamming C++ ;)
  •  
    Personally, I never understood what the point of Java is, given that: 1) I do not know of any developer (maybe Marek?) who uses it for intellectual pleasure/curiosity/fun whatever, given the possibility of choice - this to me speaks more loudly about the objective qualities of the language than any industrial-corporate marketing bullshit (for the record, I argue that Python is more interoperable, lighter and easier to learn than Java - which is why, e.g., Google is using it heavily); 2) I have used software developed in Java maybe a total of 5 times on any computer/laptop I have owned over 15 years. I cannot name one single Java project that I find necessary or even useful; for my usage of computers, Java could disappear overnight without me even noticing. Then of course one can argue as much as one wants about the "industry choosing Java", to which I would counter-argue with examples of industry doing stupid things and making absurd choices. But I suppose it would be a kind of pointless discussion, so I'll just stop here :)
  •  
    "At Google, python is one of the 3 "official languages" alongside with C++ and Java". Java runs everywhere (the byte code itself) that is I think the only reason it became famous. Python, I guess, is more heavy if it were to run on your web browser! I think every language has its pros and cons, but I agree Java is not the answer to everything... Java is used in MATLAB, some web applications, mobile phones apps, ... I would be a bit in trouble if it were to disappear today :(
  •  
    I personally do not believe in interoperability :)
  •  
    Well, I bet you'd notice an overnight disappearance of Java, because half of the internet would vanish... J2EE technologies are just omnipresent there... I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank online services are written in Java. Certainly not in PHP+MySQL :) Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you to develop robust, reliable, error-resistant, modular, well-integrated etc. software. And the costs? Well, using Java technologies you can set up enterprise-quality web application servers and get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... For many years now, the central issue in software development has not been implementing algorithms, it's been building applications. And that's where Java outperforms many other technologies. The final remark, because I may be mistakenly taken for an apostle of Java or something... I love the idea of generic programming, C++ is my favourite programming language (and I used to read Stroustrup before sleep), in my leisure time I write programs in Python... But if I were to start a software development company, then, apart from some very niche applications like computer games, it most probably would use Java as its main technology.
  •  
    "I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank online services are written in Java. Certainly not in PHP+MySQL :)" Doing in C++ would be awesomely crazy, I agree :) But as I see it there are lots of huge websites that operate on PHP, see for instance Facebook. For the banks and the enterprise market, as a general rule I tend to take with a grain of salt whatever spin comes out from them; in the end behind every corporate IT decision there is a little smurf just trying to survive and have the back covered :) As they used to say in the old times, "No one ever got fired for buying IBM". "Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you develop robustly, reliably, error-prone, modular, well integrated etc... software. And the costs? Well, using java technologies you can set-up enterprise-quality web application servers, get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... Since many years now, the central issue in software development is not implementing algorithms, it's building applications. And that's where Java outperforms many other technologies." Apart from the IDE considerations (on which I cannot comment, since I'm not a IDE user myself), I do not see how Java beats the competition in this regard (again, Python and the huge software ecosystem surrounding it). My impression is that Java's success is mostly due to Sun pushing it like there is no tomorrow and bundling it with their hardware business.
  •  
    OK, I think there is a bit of everything, right and wrong, but you have to acknowledge that Python is not always the simplest. For info, Facebook uses Java (if you upload a picture, for instance), and PHP is very limited. So definitely, in companies, engineers like you and me select the language; it is not a marketing or political thing. And in the case of Facebook, they came to the conclusion that PHP and Java don't do everything on their own but complement each other. As you say, Python has many things around it, but it might be too much for simple applications. Otherwise, I would seriously be interested in a study of how to implement a Python-like system on board spacecraft and what the advantages are over mixing C, Ada and Java.
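A minimal sketch of the generic programming idea from the Stepanov quote above (my own illustration, not code from the interview or the STL): a single templated algorithm, written at the most abstract level, works unchanged across containers, element types and operations while still compiling down to an efficient loop.

```cpp
#include <iostream>
#include <list>
#include <vector>

// Generic "reduce" over any iterator range with any binary operation.
template <typename InputIt, typename T, typename BinaryOp>
T reduce_range(InputIt first, InputIt last, T init, BinaryOp op) {
    for (; first != last; ++first) {
        init = op(init, *first);
    }
    return init;
}

int main() {
    std::vector<int> v{1, 2, 3, 4};
    std::list<double> l{0.5, 1.5, 2.0};

    // Same algorithm, different containers, element types and operations.
    int sum = reduce_range(v.begin(), v.end(), 0,
                           [](int a, int b) { return a + b; });
    double product = reduce_range(l.begin(), l.end(), 1.0,
                                  [](double a, double b) { return a * b; });

    std::cout << sum << ' ' << product << '\n';  // prints: 10 1.5
}
```

This is essentially what std::accumulate in the STL already provides; the point is that the abstraction costs nothing at runtime for concrete types.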
Dario Izzo

Comparing the C++ Standard and Boost - 3 views

  •  
    any thoughts?
  •  
    I don't understand what kind of point the author is trying to make. Boost is not an STL implementation (and you can't actually use Boost if you don't have an underlying STL). So again, what's the conclusion?
LeopoldS

Helix Nebula - Helix Nebula Vision - 0 views

  •  
    The partnership brings together leading IT providers and three of Europe's leading research centres, CERN, EMBL and ESA in order to provide computing capacity and services that elastically meet big science's growing demand for computing power.

    Helix Nebula provides an unprecedented opportunity for the global cloud services industry to work closely on the Large Hadron Collider through the large-scale, international ATLAS experiment, as well as with the molecular biology and Earth observation communities. The three flagship use cases will be used to validate the approach and to enable a cost-benefit analysis. Helix Nebula will lead these communities through a two year pilot-phase, during which procurement processes and governance issues for the public/private partnership will be addressed.

    This game-changing strategy will boost scientific innovation and bring new discoveries through novel services and products. At the same time, Helix Nebula will ensure valuable scientific data is protected by a secure data layer that is interoperable across all member states. In addition, the pan-European partnership fits in with the Digital Agenda of the European Commission and its strategy for cloud computing on the continent. It will ensure that services comply with Europe's stringent privacy and security regulations and satisfy the many requirements of policy makers, standards bodies, scientific and research communities, industrial suppliers and SMEs.

    Initially based on the needs of European big-science, Helix Nebula ultimately paves the way for a Cloud Computing platform that offers a unique resource to governments, businesses and citizens.
  •  
    "Helix Nebula will lead these communities through a two year pilot-phase, during which procurement processes and governance issues for the public/private partnership will be addressed." And here I was thinking cloud computing was old news 3 years ago :)
jaihobah

Debate Intensifies Over Dark Disk Theory | Quanta Magazine - 0 views

  •  
    An alternative idea to the standard WIMP halo picture for the DM distribution in the galaxy
LeopoldS

CMS search for the Standard Model Higgs Boson in LHC data from 2010 and 2011 | CMS Expe... - 0 views

  •  
    news from the search for the Higgs ...
santecarloni

Has 'new physics' been found at CERN? - physicsworld.com - 1 views

  •  
    Physicists working on the LHCb experiment at the CERN particle-physics lab have released the best evidence yet for direct charge-parity (CP) violation in charm mesons... While more data must be analysed to confirm the result, the work could point to new physics beyond the Standard Model and help physicists understand why there is more matter than antimatter in the universe.
  •  
    lot of new physics this year ...
jmlloren

Fujitsu Cracks Next-Gen Cryptography Standard - Slashdot - 0 views

  •  
    Challenge for PyGMO
Alexander Wittig

Picture This: NVIDIA GPUs Sort Through Tens of Millions of Flickr Photos - 2 views

  •  
    Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View." Magical deep learning! Buzzword attack!
  •  
    and here comes my standard question: how can we use this for space? fast detection of natural disasters onboard?
  •  
    Even on the ground. You could, for example, teach it what nuclear reactors, missiles or other weapons you don't want around look like in satellite pictures and automatically scan the world for them (basically replacing intelligence analysts).
  •  
    In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613 In this publication they did it manually from a GeoEye 1 b/w image, which sounds quite tedious. Maybe one can train one of those image recognition algorithms to do it automatically. Or maybe it's a bit easier to count larger things, like elephants (also a thing).
  •  
    At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year there was a big trend of CUDA GPU vs FPGA for hardware-accelerated image processing. Most of it orbited around discussing which was faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing, together with the radio telescope institute in the Netherlands, about the solution they were working on (GPU CUDA). I gathered that NVIDIA GPUs suit best applications that do not rely on specialized hardware, having the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in rad-hard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' programming languages. I don't know what the level of rad hardness of NVIDIA's GPUs is... Therefore FPGAs are indeed the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based systems (radio astronomy or other types of telescopes). I think that for a specific purpose such as the one you mentioned, this FPGA vs GPU question should be assessed first before going further.
  •  
    You're forgetting power usage. GPUs need 1000 hamster wheels' worth of power while FPGAs can run on a potato. Since space applications are highly power-limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it. Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
  •  
    The discussion of how to make FPGA hardware acceleration solutions easier to use for the 'layman' is starting btw http://reconfigurablecomputing4themasses.net/.
pacome delva

Plan for 'nuclear clock' unveiled - 0 views

  • First there were atomic clocks that beat at microwave frequencies. Then along came optical clocks that provide higher frequency standards. Now, physicists in the US have unveiled plans to build the first “nuclear clock” that runs at still higher frequencies. And because it is based on a solid material, the team claims that such a frequency standard could be far less complicated than gas-based atomic and optical clocks – while delivering the same or better accuracy.
Francesco Biscani

Slashdot Developers Story | GCC Moving To Use C++ Instead of C - 1 views

  •  
    "there is a call for a volunteer to develop the C++ coding standards" Go for it! :-)
  •  
    Of course, the golden PaGMO coding standard! :)
Juxi Leitner

Open-source hardware standards formally issued | Geek Gestalt - CNET News - 1 views

  •  
    useful for space?
pacome delva

The Coolest Antiprotons - 2 views

  • Researchers cooled a cloud of about 4,000 antiprotons down to 9 kelvin using a standard approach for cooling atoms that has never been used with charged particles or ions. The technique could provide a new way to create and trap antihydrogen, which could help researchers probe a basic symmetry of nature.
  • hydrogen and antihydrogen should share many basic traits, like mass, magnetic moment, and emission spectrum. If antihydrogen and hydrogen have even slightly different spectra, it indicates some new physics principles beyond the standard model, a very big deal.
  •  
    antihydrogen propulsion...?
  •  
    how to efficiently direct it?
  •  
    didn't Roger write an assessment of antimatter propulsion when he was in the ACT?
  •  
    Yeah, the problem is the amount of antimatter you can get and, more specifically, how to trap it. I found that you would need around one gram to go to the outer Solar System. So we are far from that, but finding an efficient way to trap it, with an electromagnetic trap rather than solid walls, is a first step!
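For scale, a back-of-envelope number (my own arithmetic, not from the thread): annihilating one gram of antimatter with one gram of ordinary matter releases

```latex
E = m c^2 = (2\times 10^{-3}\,\mathrm{kg})\,(3.0\times 10^{8}\,\mathrm{m/s})^2 \approx 1.8\times 10^{14}\,\mathrm{J} \approx 43\ \mathrm{kt\ of\ TNT},
```

which is why gram-scale quantities come up for outer-Solar-System missions, and why production and trapping, currently many orders of magnitude below that, are the real bottleneck.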
pacome delva

Physics - Atoms in a lattice keep time - 0 views

  • If your wristwatch was as accurate as today’s atomic clocks, it would not gain or lose a second in 80 million years.
  • The NIST group traps and cools neutral ¹⁷¹Yb atoms and loads them into a one-dimensional lattice, so that about 30,000 atoms fill several hundred lattice sites.
  • Lemke et al. compare their optical lattice clock with the current standard atomic fountain clock and find that the accuracy of the Yb lattice clock potentially challenges the current standard.
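A quick check of the wristwatch comparison above (my own arithmetic, not from the article): 80 million years is about 2.5 × 10¹⁵ seconds, so gaining or losing one second over that span corresponds to a fractional frequency accuracy of roughly

```latex
\frac{1\,\mathrm{s}}{80\times 10^{6}\,\mathrm{yr}\times 3.15\times 10^{7}\,\mathrm{s/yr}} \;\approx\; 4\times 10^{-16},
```

which sets the 10⁻¹⁶ scale at which the fountain and lattice clocks in the article are being compared.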
Francesco Biscani

[Phoronix] NVIDIA Readies Its OpenCL Linux Driver - 0 views

  •  
    Dario: this was the CUDA standardization thing I talked to you about.
LeopoldS

Science Inside | Lytro - 4 views

  •  
    the standard question: can we use it for space?
Marcus Maertens

AI at Google: our principles - 4 views

  •  
    Google is taking a position here, but can they live up to their own standards?
  •  
    " Avoid creating or reinforcing unfair bias." Thats the very definition of the AI used today. If you learn from a dataset, you are biased to that data set. No escape from it.
LeopoldS

new 24-qubit quantum computer - 0 views

  •  
    new 24-qubit quantum computer fitting in standard racks ...
johannessimon81

Sounds during sleep can boost memory - 1 views

  •  
    For all of us who want to become smart without hard work :-D
  •  
    Omelette du fromage?