Nigel Leck, a software developer by day, was tired of arguing with anti-science crackpots on Twitter. So, like any good programmer, he wrote a script to do it for him.
"He says he made the purchase partly because he wants to be able to spend more time in the virtual world. Before, he was averaging 10 to 20 hours per week. He wants to be able to spend about 40 to 60 hours a week now, basically making running the virtual asteroid a full-time job. (He'll also be cutting back on the time he spends developing software in real life.)"
From what I remember from when I visited the developer/producer company HQ, he wouldn't have to pay any taxes. If he runs a virtual business he might have to pay them a license fee, and if you want to start a virtual bank you need to buy a banking license. The money side is quite regulated in this environment, which is probably why property prices can be quite high.
Last time I checked, the "state" was still losing money. But their main income is the sale of resources - mostly new land, though I believe at some point they wanted to sell their initial planet too.
By tracking the particle's motion using a video camera and then using image-analysis software to identify when the particle had rotated against the field, the researchers were able to raise the metaphorical barrier behind it by inverting the field's phase. In this way they could gradually raise the potential of the particle even though they had not imparted any energy to it directly.
"Nobody thinks of using bits to boil water," he says, "but that would in principle be possible at nanometre scales." And he speculates that molecular processes occurring in nature might already be converting information to energy in some way. "The message is that processes taking place on the nanoscale are completely different from those we are familiar with, and that information is part of that picture."
Francesco pointed out this research a year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look into it again?
Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application that such computers can handle sufficiently well, or a completely new application that doesn't exist yet, for instance because of power-consumption constraints...
Q2 would then be: for which of these purposes is strict determinism of the results not crucial?
As the answer to this may not be obvious, a potential study could address this very issue. For instance, one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or one that is +/- 10 km off to either side?
...and so on for the other systems.
Another thing is understanding what exactly this probabilistic computing is, and what can be achieved with it (e.g. the result is probabilistic but falls within a defined precision range), etc. Did they build a complete chip, or at least a sub-circuit, or still only logic gates...
Satellites also use old CPUs because, with the trend toward higher power, modern CPUs are not very convenient from a system-design point of view (TBC)... as a consequence, the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of, say, 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all... and at what precision/power levels. A toy model of what that error bound could look like is sketched below.
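To make the two comments above a bit more concrete, here is a toy model of the probabilistic-computing idea as I understand it (my own sketch, not the actual PCMOS circuit; word size, number of noisy bits and flip probability are all invented for illustration): let the low-order bits of an adder flip with some probability, keep the high-order bits reliable, and look at how the relative error behaves.

```python
import random

# Toy model (my own, not the actual PCMOS circuit): an adder whose
# low-order bits flip with some probability while the high-order bits
# stay reliable. The idea is that the *relative* error stays bounded,
# while power could (hypothetically) be saved on the flaky bits.

WORD = 32          # word size of the adder
NOISY_BITS = 8     # low-order bits allowed to be unreliable
P_FLIP = 0.05      # per-bit flip probability on the noisy bits

def noisy_add(a: int, b: int) -> int:
    result = (a + b) & ((1 << WORD) - 1)
    for bit in range(NOISY_BITS):
        if random.random() < P_FLIP:
            result ^= 1 << bit
    return result

a, b = 123_456_789, 987_654_321
errors = [abs(noisy_add(a, b) - (a + b)) / (a + b) for _ in range(10_000)]
# Worst case: all 8 noisy bits wrong, i.e. an absolute error below 2^8,
# giving a relative error around 2e-7 here - within the 1e-6 ballpark
# mentioned above.
print("worst relative error:", max(errors))
```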
The appeal of this could be high fault tolerance for some math operations... which would have the effect of simplifying the coders' job!
I don't think this is a good idea as far as CPU power consumption (strictly speaking) goes.
The reason we use old chips is just a matter of qualification for space, not power. For instance, a LEON SPARC (used e.g. on some ESA platforms) consumes something like 5 mW/MHz (so around 0.5 W at 100 MHz); that is definitely not where an engineer will look for power savings on a typical 10-15 kW spacecraft.
What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the terminal-guidance velocity for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?)
Another issue is the radiation tolerance of the technology ... if PCMOS is more tolerant to radiation, it could be space-qualified more easily.
I don't remember what the speed factor is, but I guess this might do it! Although I remember, when using an IMU, that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a somewhat faster rate), so it is not just the CPU that would have to be rethought.
When I say qualification I also include the "hardened" phase.
I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this.
For one thing, it is not clear to me what embracing this technology would mean from an engineering point of view: does it need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified.
Also, is it realistic to build an entire self-sufficient chip with this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU (a quick way to emulate such a unit for testing is sketched after this comment), but then again you have more implementation overhead creeping in.
Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power consumption, while at the same time offering performance-boosting features such as multiple cores and vectorization. Don't such efforts have more potential, if only because of economic/industrial inertia?
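As promised above, here is a quick-and-dirty way one could emulate a degraded-precision floating-point unit in ordinary code, just to test whether a given algorithm still meets its accuracy budget (an assumption of mine for prototyping, not how a real low-accuracy FPU would behave; the 16-bit mantissa and the dot-product example are arbitrary choices):

```python
import struct

# Emulate a degraded-precision FPU by squeezing every double through a
# mantissa truncation, then check whether the algorithm under test still
# meets its accuracy budget.

KEEP_BITS = 16  # mantissa bits kept, out of the 52 in an IEEE double

def degrade(x: float) -> float:
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    mask = ~((1 << (52 - KEEP_BITS)) - 1) & 0xFFFFFFFFFFFFFFFF
    return struct.unpack("<d", struct.pack("<Q", bits & mask))[0]

def degraded_dot(u, v):
    # dot product with every intermediate result degraded
    acc = 0.0
    for a, b in zip(u, v):
        acc = degrade(acc + degrade(a * b))
    return acc

u = [0.1 * i for i in range(1, 101)]
v = [0.3 * i for i in range(1, 101)]
exact = sum(a * b for a, b in zip(u, v))
print("relative error:", abs(degraded_dot(u, v) - exact) / exact)
```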
"What you do for a living is not be creative, what you do is ship," says bestselling author Seth Godin, arguing that we must quiet our fearful "lizard brains" to avoid sabotaging projects just before we finally finish them.
... or, to me, the importance of setting deadlines, objectives and plans so as not to sabotage your own creative work!
ad "quieting the lizard brain" a friend of mine used to say: "if in doubt, do it!"
I had to think of that when he talks about the lizard brain making us scared ...
scary guy ..... his 'shipping' philosophy and his 'everybody is creative' line are close to Marx's description of alienation ... I'm more with Stroustrup's point of view: "The idea of software development as an assembly line manned by semi-skilled interchangeable workers is fundamentally flawed and wasteful."
once again one of these initiatives that grew bottom-up out of a concrete situation and would never have been possible with a top-down approach .... fantastic!
and as Dario said: we are apparently where NASA still has to go with this :-)
indeed ... you are right .... interesting project btw - they started in 1999, in 2005 became the first NASA project on SourceForge, and have won several awards ....
then there is this entry on why they did not participate last year:
"05/01/09: Skipping this years Google Summer-of-Code - many of you have asked why we are not participating in this years Summer of Code. The answer is that both John and Peter were too busy with other assignments to set this up in time. We will be back in 2010. At least we were able to compensate with a limited number of NASA internships to continue some of last years projects."
.... but I could not find them in this year's selected list - any clue?
They participate under the name "The Java Pathfinder Team" (http://babelfish.arc.nasa.gov/trac/jpf/wiki/events/soc2010). It is actually a very useful project for both education and industry (Airbus created a consortium on model-checking software, and there is a lot of research on it).
As far as I know, TAS had plans to use Java on board spacecraft two years ago. Not sure the industry is really sensitive to Jobs' opinions ;) particularly if there is no better alternative!
Oddly, most overclocking products and software seem to be aimed at Windows, as far as I can tell. I'd guess that's because people usually only bother to do it for gaming?
Well, true - making data public is one thing, and getting others to work through them for you is another... I love the new term "citizen science" though (and the explanation on the Wikipedia page of why they had to invent a new one, "crowdsourcing" being soooooo - politically - wrong).
D-Wave develops computing systems that leverage the physics of quantum mechanics in order to address problems that are hard for traditional methods to solve in a cost-effective amount of time. Examples of such problems include software verification and validation, financial risk analysis, affinity mapping and sentiment analysis, object recognition in images, medical imaging classification, compressed sensing and bioinformatics.
According to the company's Wikipedia page, the computer costs $10 million. Can we then declare that Quantum Computing has officially arrived?!
quotes from elsewhere on the site: "first commercial quantum computing system on the market"; "our current superconducting 128-qubit processor chip is housed inside a cryogenics system within a 10 square meter shielded room"
Link to the company's scientific publications. Interestingly, this company seems to have been running a BOINC project, AQUA@home, to "predict the performance of superconducting adiabatic quantum computers on a variety of hard problems arising in fields ranging from materials science to machine learning. AQUA@home uses Internet-connected computers to help design and analyze quantum computing algorithms, using Quantum Monte Carlo techniques". List of papers coming out of it.
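For intuition, the problems listed above are typically encoded as minimizing an Ising energy over binary spins, which is the kind of input this class of machine is built for. Below is a purely classical simulated-annealing toy on a random instance (nothing quantum here, and not D-Wave's API; the instance size, couplings and cooling schedule are all invented), just to show the shape of the problem:

```python
import math
import random

# Minimize an Ising energy E(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i
# over spins s_i in {-1,+1}, via classical simulated annealing.

N = 30
random.seed(1)
J = {(i, j): random.uniform(-1, 1) for i in range(N) for j in range(i + 1, N)}
h = [random.uniform(-1, 1) for i in range(N)]

def energy(s):
    e = sum(J[i, j] * s[i] * s[j] for (i, j) in J)
    return e + sum(h[i] * s[i] for i in range(N))

s = [random.choice((-1, 1)) for _ in range(N)]
e = energy(s)                 # recomputed in full each step: fine for a toy
T = 2.0
while T > 0.01:
    i = random.randrange(N)
    s[i] = -s[i]              # propose a single spin flip
    e_new = energy(s)
    if e_new <= e or random.random() < math.exp((e - e_new) / T):
        e = e_new             # accept the flip
    else:
        s[i] = -s[i]          # reject: undo the flip
    T *= 0.999                # cool down
print("final energy:", e)
```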
I especially like " The program will also create a "developer's kit" of open hardware and software specifications to make it easier for new components to integrate into such fractionated systems."
Joris: wanna take the lead on having a closer look at this? I'd definitely like to be part of it and am happy to contribute; possibly also Juxi? - is a first assessment by Christmas realistic?
I think it is a very interesting approach.
If you google "darpa F6", you should see that a lot seems to be ongoing. So, should we do something about it before the conclusions of the DARPA study are out?
Wait-and-see is never a good approach in these cases .... the first step has to be to understand what they are up to, then to develop our own ideas, approaches and alternatives on it, and then to see what we in the team can specifically do about it.
The first artificial creature to receive the genomic personality is Rity, a dog-like software character that lives in a virtual 3D world on a PC.
In Rity, internal states such as motivation, homeostasis and emotion change according to the incoming perception.
The internal control architecture processes incoming sensor information, calculates each value of internal states as its response, and sends the calculated values to the behavior selection module to generate a proper behavior.
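A minimal sketch of that loop in Python (my paraphrase of the architecture described above; the state names, percepts and update rules are invented for illustration, not the paper's actual model): perception updates internal-state values, and the behavior-selection module maps those values to an action.

```python
import random

# perception -> internal-state update -> behavior selection

def update_states(states, percept):
    # toy internal dynamics: percepts nudge motivation/emotion values,
    # while homeostatic variables drift on their own
    states["curiosity"] += percept.get("novelty", 0.0)
    states["fatigue"] += 0.2
    states["happiness"] += percept.get("petting", 0.0) - 0.05

def select_behavior(states):
    # pick the behavior serving the most pressing internal state
    if states["fatigue"] > 1.0:
        states["fatigue"] = 0.0
        return "sleep"
    return "explore" if states["curiosity"] > states["happiness"] else "wag_tail"

states = {"curiosity": 0.0, "fatigue": 0.0, "happiness": 0.0}
for step in range(10):
    percept = {"novelty": random.random() * 0.3, "petting": random.random() * 0.4}
    update_states(states, percept)
    print(step, select_behavior(states))
```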