Yeah... 2011 called; it wants its news back.
However, there was some quite interesting news about KSP recently... Perhaps it's been a small failure on the ACT's part not to spot this opportunity? Considering we write space mission games ourselves...
This guy actually makes very detailed video tutorials on how to master orbital dynamics in Kerbal.
I think the level of detail (and sometimes realism) is quite impressive: https://www.youtube.com/channel/UCxzC4EngIsMrPmbm6Nxvb-A
@Marek: true, old news. But "opportunity"? For what? The games we write are always games with a scientific purpose (not training, not educational).
Kerbal Space Program is cool, but it is a game, just like Microsoft Flight Simulator (though less accurate). Having an ESA mission simulated in it is also cool, but is it something we should or could do? Even more, is it what we want to do? My personal opinion: no, no, and no.
> The games we write are always games with a scientific purpose (not training, not educational)
I'd say investigating how to attract the crowd may be an important part of the "science of crowdsourcing".
So, an obvious example would be comparing how many participants the original ACT space mission game attracted versus a variant implemented in Kerbal, and why. Easily done and easily publishable, I think.
But that's just an obvious example I can give on the spot. I think there is more potential than that, so would not dismiss the idea so definitively.
But then, correct me if I'm wrong, social sciences are still not represented in the ACT... Perhaps an idea to revive during the upcoming retreat? ;-)
Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View."
Magical deep learning! Buzzword attack!
Even on the ground. You could, for example, teach it what nuclear reactors, missiles, or other weapons you don't want look like in satellite pictures, and automatically scan the world for them (basically replacing intelligence analysts).
In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing:
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613
In this publication they did it manually from a GeoEye-1 b/w image, which sounds quite tedious. Maybe one could train one of those image recognition algorithms to do it automatically.
Or maybe it's a bit easier to count larger things, like elephants (also a thing).
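Just to make the idea concrete, here is a toy sketch (entirely my own construction, not from the paper) of the most naive possible baseline: threshold a b/w tile and count connected bright blobs. A real pipeline would need a trained classifier to tell seals from rocks, but this shows the shape of the problem:

```python
import numpy as np
from collections import deque

def count_blobs(img, threshold=0.5, min_size=2):
    """Count connected bright blobs (4-connectivity) above a threshold.
    A crude stand-in for 'seals on the ice' in a b/w satellite tile."""
    mask = img > threshold
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not visited[i, j]:
                # flood-fill this component and measure its size
                size, queue = 0, deque([(i, j)])
                visited[i, j] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if size >= min_size:  # ignore single-pixel noise
                    count += 1
    return count

# synthetic 'ice floe' tile with three bright blobs
tile = np.zeros((20, 20))
tile[2:4, 2:4] = 1.0      # blob 1
tile[10:13, 5:7] = 1.0    # blob 2
tile[15:17, 15:18] = 1.0  # blob 3
print(count_blobs(tile))  # 3
```

Counting elephants would presumably just mean bigger blobs and a coarser resolution.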
At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year, there was a big trend of CUDA GPUs vs FPGAs for hardware-accelerated image processing. Most of it revolved around discussing which was faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing together with the radio telescope institute in the Netherlands about the solution they were working on (GPU/CUDA).
I gathered that NVIDIA GPUs suit best those applications that do not depend on custom hardware, with the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in rad-hard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' hardware description languages. I don't know what level of radiation hardness NVIDIA's GPUs have...
Therefore FPGAs are indeed the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based applications (radio astronomy and other types of telescopes).
I think that for a specific purpose like the one you mentioned, this FPGA vs GPU question should be assessed before going further.
You're forgetting power usage. GPUs need 1000 hamster wheels' worth of power while FPGAs can run on a potato. Since space applications are highly power-limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it.
Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
A team of engineers from NASA's Kennedy Space Center in Florida and some of the agency's other field centers are looking into this and other novel launch systems based on cutting-edge technologies.
The launch system would require some advancements of existing technologies, but it wouldn't need any brand-new technologies to work.
Scramjet vehicles could be used as a basis for a commercial launch program.
Industry wants to rely on tried-and-true tools and techniques, but is also addicted to dreams of "silver bullets," "transformative breakthroughs," "killer apps," and so forth. This leads to immense conservatism in the choice of basic tools (such as programming languages and operating systems) and a desire for monocultures (to minimize training and deployment costs). The idea of software development as an assembly line manned by semi-skilled interchangeable workers is fundamentally flawed and wasteful.
"For many, 'programming' has become a strange combination of unprincipled hacking and invoking other people's libraries (with only the vaguest idea of what's going on). The notions of 'maintenance' and 'code quality' are typically forgotten or poorly understood." ... seen so many of those students :(
And regarding "My suggestion is to define a structure of CS education based on a core plus specializations and application areas": I'm not saying the Austrian university system is good, but e.g. the CS degrees in Vienna are done like this. There is a core which is the same for everybody for 4-5 semesters, then you specialize in e.g. software engineering or computational management and so forth, and then after 2 more semesters you specialize again into one of, I think, 7 or 8 master's degrees...
It does not make it easy for industry to hire people, as I have noticed; they sometimes really have no clue what the difference is between Software Engineering and Computational Intelligence, at least in HR :/
well - part of it we already do ... e.g. : (did you see the picture in the report?)
"Tolerate a little crowding. It took a little creativity to suddenly find a dozen new workspaces in our two-room office. Fortunately, we've found that a room can always fit one more person - and by induction, you can fit as many as you need. (All those years we spent proving math theorems came in handy after all.)"
All the satellite-related systems (except for the rocket to launch it) are DIY projects, designed so that regular people may also have the chance of developing and eventually launching their own.
I was saying that mainly because of some flaws: the piggyback installation, no dedicated stage, the limited control, ...
It is so far quite funny, but once he can fill all the gaps, it should be an excellent initiative - although careful about the debris if everyone has their own ;p
His quote: "when art becomes practical, we call it technology; when technology becomes useless, we call it art" ... this is probably the latter ...
A nice review on the wonders of hierarchical Bayesian models. It cites a paper on probabilistic programming languages that might be relevant given our recent discussions.
At Hippo's farewell lunch there was a discussion on how kids are able to learn something as complex as language from a limited number of observations, while machine learning algorithms, no matter how many millions of instances you throw at them, don't learn beyond some point. If that subject interests you, you might like this paper.
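For a concrete flavour of the Bayesian side of that argument, here is a toy version (my own construction, in the spirit of the classic "number game" style of Bayesian concept learning, not from the paper itself) showing how the size principle lets just three examples pick out a sharp concept:

```python
# Hypotheses are candidate concepts over the numbers 1..100.
hypotheses = {
    "even": [n for n in range(1, 101) if n % 2 == 0],
    "powers_of_two": [1, 2, 4, 8, 16, 32, 64],
    "multiples_of_ten": list(range(10, 101, 10)),
}

def posterior(data, hypotheses):
    """Posterior over concepts under strong sampling: likelihood is
    (1/|h|)^n if all observations fall inside h, else 0 (uniform prior)."""
    scores = {}
    for name, h in hypotheses.items():
        if all(x in h for x in data):
            scores[name] = (1.0 / len(h)) ** len(data)
        else:
            scores[name] = 0.0
    z = sum(scores.values())
    return {k: v / z for k, v in scores.items()}

# Three examples already make the smallest consistent concept dominate:
post = posterior([2, 4, 8], hypotheses)
print(post)
```

Both "even" and "powers_of_two" are consistent with the data, but the smaller hypothesis wins overwhelmingly after only three observations; that preference for compact consistent concepts is one candidate explanation for fast human learning.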
Had the opportunity to listen to JBT and TLG during a summer school... if they're half as good in writing as they are in speaking, it should be a decent read...
An interesting blog maintained by the people from D-Wave, who developed the first commercial quantum computer. The blog presents a Python implementation for programming the D-Wave machine, with some examples.
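For those who haven't seen the problem format: the D-Wave machine minimizes QUBOs (quadratic unconstrained binary optimization problems). A tiny hand-rolled sketch of what "programming" such a machine amounts to - this does not use D-Wave's actual API, it just solves the same kind of objective by brute force:

```python
import itertools

# A QUBO is given by coefficients Q[i, j] on products of binary variables.
# This toy instance rewards turning x0 or x1 on, but penalizes both at once.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def qubo_energy(x, Q):
    """Energy of a binary assignment x under QUBO coefficients Q."""
    return sum(coef * x[i] * x[j] for (i, j), coef in Q.items())

# Brute-force the 2^n assignments; the annealer does this minimization
# physically instead of by enumeration.
best = min(itertools.product([0, 1], repeat=2), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))
```

On the real machine you hand over the Q coefficients and get low-energy samples back; the whole art is encoding your problem into that quadratic form.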
The termination of NASA's space shuttle program marks the end of a nearly 54-year rivalry between the USSR and the United States to achieve superiority in space exploration.
Sources confirmed that in commemoration of the capitalist defeat, extra bread and corn rations had been approved in all major cities, and factory workers were given time off their nine-hour work shifts to join in the festivities.
The problem with this flurry of activity around multicore architectures today is that it is really unclear which one will be the winner in the long run. Never underestimate the power of inertia, especially in the software industry (after all, people are still programming in COBOL and Fortran today).
For instance, NVIDIA gives you the Teslas with thousands of cores, but then you have to rewrite extensive parts of your code to take advantage of them. Is this an investment worth undertaking? Difficult to say; it certainly would be if the whole software world moved in that direction (which is not happening - yet?).
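To give a flavour of the kind of rewrite involved: going from an explicit serial loop to a data-parallel, whole-array formulation - sketched here with NumPy vectorization as a stand-in for a CUDA kernel (the real thing would of course be CUDA C, but the restructuring of the code is the same idea):

```python
import numpy as np

# Serial style: one element at a time, as in typical legacy code.
def saxpy_serial(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

# Data-parallel style: one whole-array expression, the shape a GPU wants,
# where each element can be computed independently by a separate core.
def saxpy_parallel(a, x, y):
    return a * x + y

x = np.arange(5, dtype=float)
y = np.ones(5)
print(saxpy_parallel(2.0, x, y))  # [1. 3. 5. 7. 9.]
```

Trivial here, but doing this to a large codebase full of loop-carried dependencies is exactly the expensive part.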
But then you have other approaches coming out, such as the Cell processor by IBM (the one in the PS3), which has really impressive floating-point performance and, of course, a completely different programming model.
The nice thing about this Tilera processor seems to be that it is a general-purpose processor, which may not require extensive re-engineering of existing code (but I'm really hypothesizing, since technical details are not very abundant on their website).
Moreover, PaGMO's computation model is geared more towards systems with distributed memory than shared memory (i.e. multi-core). In the latter, at a certain point memory access becomes the bottleneck.
From the minds behind programming Q&A site StackOverflow comes Super User, the smart, simple way to get answers to geeky questions about computer hardware and software. Last fall, ...
They want so much to reduce the price of the program that they have forgotten that the ultimate goal is Mars... 'cos this would not be a technology demonstration anymore... am I wrong?