
Advanced Concepts Team / Group items tagged "language"


Thijs Versloot

Wolfram Language - 11 views

That looks pretty awesome indeed. Some of those functions would be very helpful right now :)

knowledge model everything

Thijs Versloot

Communicate through the plasma sheath during re-entry - 1 views

  •  
    In order to overcome the communication blackout problem suffered by hypersonic vehicles, a matching approach has been proposed for the first time in this paper. It utilizes a double-positive (DPS) material layer surrounding a hypersonic vehicle antenna to match with the plasma sheath enclosing the vehicle. Or in simpler terms: the antenna layer acts as a capacitor which, in combination with the plasma sheath (an inductor), forms an electrical circuit that becomes transparent to long-wavelength radiation (the communication signal). The reason is that fluctuations are balanced by the twin system, preventing absorption/reflection of the incoming radiation. Elegant solution, but it will only work for long-wavelength communication, plus I am not sure whether the antenna needs active control (as the plasma sheath conditions change during the re-entry phase).
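    A minimal lumped-element sketch of the matching idea described above (here \(L\) and \(C\) are generic stand-ins for the inductive plasma sheath and the capacitive DPS layer, not values from the paper): treating the pair as a series resonant circuit, its impedance is

    \[ Z(\omega) = j\omega L + \frac{1}{j\omega C}, \qquad Z(\omega_0) = 0 \quad \text{at} \quad \omega_0 = \frac{1}{\sqrt{LC}}. \]

    Near \(\omega_0\) the combined layer presents essentially no impedance mismatch, so a signal at that frequency passes through instead of being reflected or absorbed. Since the sheath properties (and hence the effective \(L\)) drift during re-entry, holding the match would seem to require either a broadband design or active tuning of the DPS layer.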
johannessimon81

IBM Speech Recognition, 1986 - 0 views

  •  
    Interesting historical perspective. Progress since the late '80s really seems to be fairly slow. Do we need to wait for the singularity until speech recognition works without flaws?
  • ...1 more comment...
  •  
    funny - tried just yesterday the one built in on Mavericks: sending one email took at least three times as long as typing it. And now my speech: "PowerPoint Funny, trade trust yesterday they're built in speech recognition in Mavericks sending one e-mail to at least three times a talk as long as typing it. Well this was actually quite okay and relatively fast cheers nice evening"
  •  
    "I thought I would give it a try on my android sexy seems to work pretty well and I'm speaking more less at normal speed" Actually I was speaking as fast as I could because it was for the google search input - if you make a pause it will think you finished your input and start the query. Also you might notice that Android thinks it is "android sexy" - this was meant to be "on my Android. THIS seems to work...". Still it is not too bad - maybe in a year or two they have it working. Of course it might also be that I just use the word "sexy" randomly... :-\
  •  
    The problem is that we don't yet understand how speech in humans actually works. As long as we merely build inference or statistical language models we'll never get perfect speech recognition. A lot of recognition in humans has a predictive/expectational basis that stems from our understanding of higher-level concepts and context awareness. Sadly I suspect that as long as machines remain unembodied in their perceptual abilities, their ability to properly recognize sounds/speech, objects and other features will never reach perfection.
Alexander Wittig

Picture This: NVIDIA GPUs Sort Through Tens of Millions of Flickr Photos - 2 views

  •  
    Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View." Magical deep learning! Buzzword attack!
  • ...4 more comments...
  •  
    And here comes my standard question: how can we use this for space? Fast detection of natural disasters onboard?
  •  
    Even on the ground. You could for example teach it what nuclear reactors, missiles or other weapons you don't want look like on satellite pictures and automatically scan the world for them (basically replacing intelligence analysts).
  •  
    In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613 In this publication they did it manually from a GeoEye 1 b/w image, which sounds quite tedious. Maybe one can train one of those image recognition algorithms to do it automatically (a rough baseline sketch is given at the end of this thread). Or maybe it's a bit easier to count larger things, like elephants (also a thing).
  •  
    At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year there was a big trend of CUDA GPU vs FPGA for hardware-accelerated image processing. Most of it orbited around discussing who was faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing together with the radio telescope institute in the Netherlands about the solution they were working on (GPU CUDA). I gathered that NVIDIA GPUs suit best applications that are not tied to specific hardware, with the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in radhard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' programming languages. I don't know what the level of rad hardness of NVIDIA's GPUs is... Therefore FPGAs are indeed the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based systems (radio astronomy or other types of telescopes). I think that for a specific purpose like the one you mentioned, this FPGA vs GPU trade-off should be assessed first before going further.
  •  
    You're forgetting power usage. GPUs need 1000 hamster wheels worth of power while FPGAs can run on a potato. Since space applications are highly power limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it. Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
  •  
    The discussion of how to make FPGA hardware acceleration solutions easier to use for the 'layman' is starting btw http://reconfigurablecomputing4themasses.net/.
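    As a concrete starting point for the seal-counting idea above, here is a deliberately simple baseline sketch (not the method from the linked paper): threshold a grayscale tile and count connected bright blobs, with a synthetic image standing in for real GeoEye data. A learned detector would replace the threshold step, but the counting logic stays the same.

```python
# Toy baseline for "count seals in a b/w satellite image": threshold + connected
# components. The image below is synthetic; real GeoEye tiles would be loaded instead.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

# Synthetic stand-in: dark ice background with 25 bright 4x4 "seals".
image = rng.normal(loc=0.2, scale=0.05, size=(512, 512))
for _ in range(25):
    y, x = rng.integers(10, 500, size=2)
    image[y:y + 4, x:x + 4] += 0.6

# 1. Keep only pixels well above the background level.
mask = image > image.mean() + 4 * image.std()

# 2. Label connected components and drop single-pixel noise speckles.
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
count = int(np.sum(sizes >= 6))

print(f"candidate seals found: {count}")  # ~25 unless blobs overlap
```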
Francesco Biscani

DM's Esoteric Programming Languages - Intelligent Design Sort - 1 views

  •  
    Cool algorithm! We should implement it in PaGMO.
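    For reference, the joke "algorithm" from the linked page boils down to returning the input untouched; a minimal Python rendering (the PaGMO port is left as an exercise):

```python
def intelligent_design_sort(items):
    # The probability of the list being in this exact order by chance is 1/n!,
    # which is absurdly small, so it must have been arranged deliberately by an
    # intelligent Sorter -- who are we to rearrange it? O(1) and "optimal".
    return items

print(intelligent_design_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # unchanged, by design
```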
Francesco Biscani

What Should We Teach New Software Developers? Why? | January 2010 | Communications of t... - 3 views

shared by Francesco Biscani on 15 Jan 10
Dario Izzo liked it
  • Industry wants to rely on tried-and-true tools and techniques, but is also addicted to dreams of "silver bullets," "transformative breakthroughs," "killer apps," and so forth.
  • This leads to immense conservatism in the choice of basic tools (such as programming languages and operating systems) and a desire for monocultures (to minimize training and deployment costs).
  • The idea of software development as an assembly line manned by semi-skilled interchangeable workers is fundamentally flawed and wasteful.
  •  
    Nice opinion piece by Bjarne Stroustrup, the creator of C++. Substitute "industry" with "science" and many considerations still apply :)
  •  
    "for many, "programming" has become a strange combination of unprincipled hacking and invoking other people's libraries (with only the vaguest idea of what's going on). The notions of "maintenance" and "code quality" are typically forgotten or poorly understood. " ... seen so many of those students :( and ad "My suggestion is to define a structure of CS education based on a core plus specializations and application areas", I am not saying the austrian university system is good, but e.g. the CS degrees in Vienna are done like this, there is a core which is the same for everybody 4-5 semester, and then you specialise in e.g. software engineering or computational mgmt and so forth, and then after 2 semester you specialize again into one of I think 7 or 8 master degrees ... It does not make it easy for industry to hire people, as I have noticed, they sometimes really have no clue what the difference between Software Engineering is compared to Computational Intelligence, at least in HR :/
Francesco Biscani

A Brief, Incomplete, and Mostly Wrong History of Programming Languages - 2 views

  •  
    Funny :) But the guy does not mention Logo...
nikolas smyrlakis

EUROPA - Press Releases - Investing in the future: Commission calls for ad... - 0 views

  •  
    an additional investment of €50 billion in energy technology research will be needed over the next 10 years. This means almost tripling the annual investment in the European Union, from €3 to €8 billion
ESA ACT

Notepad++, an excellent source code editor and Notepad replacement, which supports seve... - 0 views

  •  
    Maybe, but Windows only :-( ...
ESA ACT

Language: Disputed definitions : Nature News - 0 views

  •  
    This is an article on the problems of defining a scientific term - or rather, on how people can argue when trying to agree on a definition. More funny than important.
LeopoldS

Prepare and transmit electronic text - American Institute of Physics - 2 views

  •  
    New RevTeX version available... What do they mean by this? How do they use XML, and LaTeX to XML? Would this also be an option for Acta Futura? "While we appreciate the benefits to authors of preparing manuscripts in TeX, especially for math-intensive manuscripts, it is neither a cost-effective composition tool (for the volume of pages AIP currently produces) nor is it a format that can be used effectively for online publishing."
  •  
    Dunno really, they may have some in-house process that converts LaTeX to XML for some reason. Probably they are using some subset of SGML, the standard generalized markup language from which both HTML and XML derive. I don't think it is really relevant for Acta Futura, and the rest of the world seems to get along with TeX just fine...
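    Just to make the kind of conversion being speculated about concrete (AIP's actual pipeline is not described here), a toy sketch that maps a trivial subset of LaTeX markup to XML-ish tags:

```python
# Toy LaTeX -> XML conversion for two commands only; a real publishing pipeline
# would use a proper TeX parser, this is just to illustrate the idea.
import re

def latex_to_xml(tex: str) -> str:
    xml = re.sub(r"\\section\{(.+?)\}", r"<title>\1</title>", tex)
    xml = re.sub(r"\\emph\{(.+?)\}", r"<em>\1</em>", xml)
    return "<article>" + xml + "</article>"

print(latex_to_xml(r"\section{Results} We \emph{tripled} the throughput."))
# <article><title>Results</title> We <em>tripled</em> the throughput.</article>
```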
jcunha

Space data representation - 1 views

  •  
    A common data hub that allows the representation and comparison of data from numerous space missions. "The IMPEx portal offers tools for the visualization and analysis of datasets from different space missions. Furthermore, several computational model databases are feeding into the environment." As they say, with its massive 3D-visualization capabilities it offers the possibility of displaying spacecraft trajectories, planetary ephemerides as well as scientific representations of observational and simulation datasets.
anonymous

Scientists Are Turning Their Backs on Algorithms Inspired By Nature - 5 views

  •  
    "Over the past couple of decades, the research literature has filled up with endless new nature-based metaphors for algorithms. You can find algorithms based on the behaviour of cuckoos, bees, bats, cats, wolves, galaxy formation and black holes. (...) All researchers have been doing is wasting time on developing new approaches that are probably little better than existing ones. And the language of each metaphor then invades the literature, distracting people from using the already sufficiently expressive terminology of mathematics and, above all, working together to find the best way forward." The golden era of fireworks-like algorithm is about to end
  •  
    Lies, lies, all lies. They will never go away. Papers need to be published.
Dario Izzo

Norris Numbers - 5 views

  •  
    A nice programming guide explaining thoughts I often try to pass on
  •  
    The fact that this article is written in two languages, one English and the other Russian...? I get the impression that Russians are among the first to hit such barriers, before the rest of us do - from scaling up their programming to bumping into chaos theory, etc. Imho.
jaihobah

Microsoft makes play for next wave of computing with quantum computing toolkit - 1 views

  •  
    At its Ignite conference today, Microsoft announced its moves to embrace the next big thing in computing: quantum computing. Later this year, Microsoft will release a new quantum computing programming language, with full Visual Studio integration, along with a quantum computing simulator. With these, developers will be able to both develop and debug quantum programs implementing quantum algorithms.
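    For a feel of what such a quantum simulator actually computes (this is plain numpy, not Microsoft's toolkit), here is a minimal state-vector sketch that prepares a Bell state and samples measurement outcomes:

```python
# Minimal state-vector simulation of a 2-qubit circuit (H on qubit 0, then CNOT),
# producing a Bell state. Plain numpy, not the Microsoft toolkit from the article.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # control = qubit 0, target = qubit 1

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = CNOT @ (np.kron(H, I) @ state)         # apply H to qubit 0, then CNOT

probs = np.abs(state) ** 2                     # Born rule: outcome probabilities
samples = np.random.choice(4, size=1000, p=probs)
counts = {f"{k:02b}": int(np.sum(samples == k)) for k in range(4)}
print(counts)                                  # roughly half "00", half "11"
```

    A real toolkit layers the programming language, compiler and debugger on top of exactly this kind of linear algebra, just for far more qubits and gates.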