Love tip 3.. that's why I am at the ACT of course :)
3. Surround yourself with pros
Surround yourself with people who are self-assured and live life without compromising their core values. These people will rub off on you quickly.
finally..
The world is already full of people who obey the status quo. But the people who don't give a fuck are the ones that change the world.
As a side note, this is also being used as a method to create a hypothetical substance called 'metallic hydrogen'. At such high pressures, hydrogen itself should become superconducting at room temperature and would thus be of tremendous interest...
Australian scientists have cleared one of the final hurdles for designing and building a quantum computer. The team of engineers from the University of New South Wales has successfully built a core component needed for the computer to operate and the work is published today in the journal Nature.
"After decades spent thinking, arguing, hoping, and in the words of Turyshev, "making a career off of it," these scientists' interest in the Pioneer anomaly has, understandably, accumulated psychological baggage; in the case of many of them, a cloud of emotional investment has formed around the core of objective scientific inquiry. And clouds obscure things."
Francesco pointed out this research a year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application that can be done sufficiently well by such computers, or a completely new application that does not exist yet, for instance because of power consumption constraints...
Q2 would then be: for which of these purposes is strict determinism of the results not crucial?
As the answer to this may not be obvious, a potential study could address this very issue. For instance, one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or deviates +/-10 km to the left/right?
...and so on for the other systems.
Another thing is understanding what exactly this probabilistic computing is and what can be achieved with it (e.g. the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip, or at least a sub-circuit, or still only logic gates...
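As a rough intuition only (not the actual PCMOS circuit behaviour, which none of us knows at this point): one can model a probabilistic arithmetic unit as exact arithmetic perturbed by a bounded random relative error, and then check empirically that every result falls within a defined precision band. The error bound of 1e-3 here is an assumption purely for illustration:

```python
import random

def noisy_mul(a, b, rel_err=1e-3):
    """Hypothetical model of a probabilistic multiplier: the
    correct product perturbed by a bounded relative error."""
    exact = a * b
    return exact * (1.0 + random.uniform(-rel_err, rel_err))

random.seed(0)
results = [noisy_mul(3.0, 7.0) for _ in range(1000)]

# Every sample is wrong, but all of them fall inside the stated band:
assert all(abs(r - 21.0) / 21.0 <= 1e-3 for r in results)
```

If the real hardware gives guarantees of this "bounded relative error" kind, then the question from the earlier posts (which on-board algorithms tolerate it?) becomes a well-posed study.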
Satellites use old CPUs also because, with the trend towards higher power draw, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all, and at what precision / power levels.
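Quick back-of-the-envelope on the absolute-vs-relative point (the orbital radius is my own assumption, just to set the scale):

```python
# A relative tolerance of 1e-6 on a LEO position magnitude is metres,
# not kilometres. Assumed orbital radius ~7000 km (typical LEO scale):
orbit_radius_m = 7_000_000.0
rel_tol = 1e-6

abs_err_m = orbit_radius_m * rel_tol
print(abs_err_m)  # about 7 m -- three orders of magnitude tighter than +/-10 km
```

So whether the +/-10 km or the 10^-6 figure is the right one completely changes which degraded-precision hardware would be acceptable.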
The interest of this could be a high fault tolerance for some math operations... which would simplify the job of coders!
I don't think this is a good idea regarding power consumption for CPU (strictly speaking).
The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft
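To put numbers on that (the clock frequency is my assumption; the 5 mW/MHz and 10-15 kW figures are the ones quoted above):

```python
# LEON-style CPU at 5 mW/MHz, assumed running at 100 MHz:
cpu_power_w = 5e-3 * 100          # -> 0.5 W
spacecraft_power_w = 10_000.0     # lower end of the quoted 10-15 kW bus

fraction = cpu_power_w / spacecraft_power_w
print(fraction)  # on the order of 5e-5, i.e. ~0.005% of the power budget
```

Even a 10x CPU power saving would be lost in the noise of the spacecraft's overall power budget, which supports the point that qualification, not power, is the driver.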
What about speed then? Seven times faster could allow real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?)
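A crude way to see the speed/velocity link (the 100 Hz guidance-loop rate is my assumed figure, only for illustration): if the binding constraint is the distance the spacecraft covers between two guidance updates, then a 7x faster loop at 7x the closing velocity keeps that distance unchanged:

```python
def travel_per_update(velocity_m_s, update_rate_hz):
    """Distance covered between two consecutive guidance-loop updates."""
    return velocity_m_s / update_rate_hz

baseline = travel_per_update(10_000, 100)   # 10 km/s at an assumed 100 Hz loop
faster   = travel_per_update(70_000, 700)   # 7x velocity, 7x loop rate

print(baseline, faster)  # both 100.0 m between updates
```

Of course this only holds if the CPU is the bottleneck; as noted below, sensors like the IMU cap the usable update rate too.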
Another issue is the radiation tolerance of the technology ... if the PCMOS are more tolerant to radiation they could get more easily space qualified.....
I don't remember what the speed factor is, but I guess this might do it! Although I remember, when using an IMU, that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so it is not just the CPU that must be re-thought.
When I say qualification I also imply the "hardened" phase.
I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this.
For one thing, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified.
Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in.
Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
Ugh... no operator overloading, no efficient generic programming and no lambda expressions... Only time will tell, but I don't understand who the intended audience is: I think that Python guys won't care about the (supposedly) increased performance (and you can interface C/C++ with Python easily), and that C++ programmers (I mean the hardcore serious C++ Boost-like programmers, not the Java-like whiners :P) won't have their beloved templates pried from their cold dead hands with ease.
yeah, though I think operator overloading in particular is not going to be a main problem. As with the JS library, though, it is quite thinkable that lots of users will switch to it or use it (or be pushed to use it...) just because it is done by Google
Having Google backing it will certainly help, even though they are presenting it as a "system level" (i.e., hard-core) language, and in that domain it is much more difficult to bullshit your way to a position of relevance.
Look at Java: Sun pushed it like hell and it is certainly widely used in many contexts (corporate, web and embedded markets mostly), yet it completely failed to win the hearts of "open-source" developers (or, more generally, of those developers who are not forced to use it by virtue of some management-driven decision).
An unusual signal picked up by a European space observatory could be the first direct detection of dark matter particles, astronomers say. The findings are tentative and could take several years to check, but if confirmed they would represent a dramatic advance in scientists' understanding of the universe.
"A new Angry Birds-style game is set to help launch a new understanding of quantum science. Some find the concepts of quantum science confusing or unintuitive. Einstein even called quantum effects "spooky." To help people better understand some of the core concepts of quantum science, the Institute for Quantum Computing (IQC) at the University of Waterloo is launching a game - the Quantum Cats"
Looking forward to seeing the ACT win the Quantum Cats competition :)
"The Neuromorphic Computing Platform allows neuroscientists and engineers to perform experiments with configurable neuromorphic computing systems. The platform provides two complementary, large-scale neuromorphic systems built in custom hardware at locations in Heidelberg, Germany (the "BrainScaleS" system, also known as the "physical model" or PM system) and Manchester, United Kingdom (the "SpiNNaker" system, also known as the "many core" or MC system)."