
Advanced Concepts Team / Group items tagged IBM


johannessimon81

IBM: stop motion video made with individual atoms - 1 views

  •  
    Amazing! :-D Makes you forget how hard it is to detect individual atoms at all.
  •  
    While amazing indeed, it makes me wonder how much longer we will have to wait until all this nanotechnology stuff delivers something actually useful (say, super-efficient/super-small transistors in my cell phone, camera, computer, etc.). So far it seems to excel mostly at marketing...
Luís F. Simões

Alice and Bob in Cipherspace » American Scientist - 1 views

  • A new form of encryption allows you to compute with data you cannot read
  • The technique that makes this magic trick possible is called fully homomorphic encryption, or FHE. It’s not exactly a new idea, but for many years it was viewed as a fantasy that would never come true. That changed in 2009, with a breakthrough discovery by Craig Gentry, who was then a graduate student at Stanford University. (He is now at IBM Research.) Since then, further refinements and more new ideas have been coming at a rapid pace.
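  •  
    For intuition, here is a minimal sketch in Python, not of FHE itself but of the much older Paillier cryptosystem, which is homomorphic for addition only: anyone holding just the ciphertexts can add the underlying plaintexts. Gentry's breakthrough was extending this kind of trick to arbitrary computation. Toy parameters, deliberately insecure, Python 3.9+ assumed.

```python
# A toy Paillier cryptosystem (additively homomorphic, NOT fully homomorphic):
# tiny primes, no padding, insecure -- purely to illustrate computing on
# encrypted data. Requires Python 3.9+ (math.lcm, modular inverse via pow).
import math
import random

def keygen(p=10007, q=10009):
    """Generate a (public, private) key pair from two small primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # simple form valid because we use g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    """Encrypt integer m (0 <= m < n) with fresh randomness r."""
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)    # should be coprime to n; almost surely is
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

def add_encrypted(pub, c1, c2):
    """Multiplying ciphertexts adds the hidden plaintexts (mod n)."""
    (n,) = pub
    return c1 * c2 % (n * n)

pub, priv = keygen()
c = add_encrypted(pub, encrypt(pub, 20), encrypt(pub, 22))
print(decrypt(priv, c))           # -> 42, computed without decrypting inputs
```

    The key property is in add_encrypted: the party doing the arithmetic never sees 20, 22, or 42.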
Lionel Jacques

IBM researchers make 12-atom magnetic memory bit - 0 views

  •  
    Researchers have successfully stored a single data bit in only 12 atoms.
Thijs Versloot

Cognitive computing - 2 views

  •  
    Has this not been underway for quite some time now? Not sure if this 'new era' is coming any day soon. Thoughts?
  •  
    If they want to give the computers "senses" they should also go ahead and give them a body slightly taller than humans ...and guns. So once they reach a critical level of consciousness they can really go to town... http://0-media-cdn.foolz.us/ffuuka/board/tg/image/1385/54/1385549501025.jpg
  •  
    Neural networks!!! However, indeed, "senses" will not get us toward human-like computing without bodies that physically interact with the world. That's where most of these things are going wrong. Perception and cognition are for action. Without action coming from the machine side, all these ideas simply fail.
LeopoldS

Solar Flower high concentration PV - 3 views

  •  
    interesting collaboration ... unfortunately, no prices are available yet
Alexander Wittig

Picture This: NVIDIA GPUs Sort Through Tens of Millions of Flickr Photos - 2 views

  •  
    Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View." Magical deep learning! Buzzword attack!
  • ...4 more comments...
  •  
    and here comes my standard question: how can we use this for space? fast detection of natural disasters onboard?
  •  
    Even on the ground. You could, for example, teach it what nuclear reactors, missiles, or other weapons you don't want look like on satellite pictures and automatically scan the world for them (basically replacing intelligence analysts).
  •  
    In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613 In this publication they did it manually from a GeoEye-1 b/w image, which sounds quite tedious. Maybe one could train one of those image recognition algorithms to do it automatically (a sketch of what that might look like follows after this thread). Or maybe it's a bit easier to count larger things, like elephants (also a thing).
  •  
    At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year, there was a big trend of CUDA GPU vs FPGA for hardware-accelerated image processing. Most of it orbited around discussing who was faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing together with the radio telescope institute in the Netherlands about the solution they were working on (GPU/CUDA). I gathered that NVIDIA GPUs best suit applications that do not rely on custom hardware, having the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in rad-hard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' programming languages. I don't know what the level of rad hardness in NVIDIA's GPUs is... FPGAs are therefore the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based systems (radio astronomy and other types of telescopes). I think that for a specific purpose like the one you mentioned, this FPGA vs GPU question should be assessed before going further.
  •  
    You're forgetting power usage. GPUs need 1000 hamster wheels' worth of power while FPGAs can run on a potato. Since space applications are highly power-limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it. Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
  •  
    The discussion of how to make FPGA hardware-acceleration solutions easier to use for the 'layman' is starting, btw: http://reconfigurablecomputing4themasses.net/
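  •  
    As promised above, here is a minimal sketch of what the image-recognition approach to seal counting could look like. Everything in it is hypothetical: PyTorch as the framework, 64x64 patches, and random tensors standing in for labelled satellite-image crops; the training loop is omitted.

```python
# Hypothetical sketch: a tiny CNN that classifies 64x64 grayscale patches as
# "seal" / "no seal"; counting means sliding this over tiles of a satellite
# image. Random tensors stand in for real labelled crops.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 2),  # logits for [no seal, seal]
        )

    def forward(self, x):
        return self.net(x)

model = PatchClassifier()
patches = torch.randn(8, 1, 64, 64)            # 8 fake grayscale patches
predictions = model(patches).argmax(dim=1)     # 0 = no seal, 1 = seal
print(f"{int((predictions == 1).sum())} of 8 patches classified as 'seal'")
```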
Joris _

Why Computers Can't Mimic The Brain - Forbes.com - 3 views

  • engineers seem to have a diminished ability to understand biology
  • Remember them the next time you read a story claiming some brain-like accomplishment of a computer. The only really human thing these programs are doing is attracting attention to themselves
Francesco Biscani

STLport: An Interview with A. Stepanov - 2 views

  • Generic programming is a programming method that is based in finding the most abstract representations of efficient algorithms.
  • I spent several months programming in Java.
  • for the first time in my life programming in a new language did not bring me new insights
  • ...2 more annotations...
  • it has no intellectual value whatsoever
  • Java is clearly an example of a money oriented programming (MOP).
  •  
    One of the authors of the STL (C++'s Standard Template Library) explains generic programming and slams Java. (A small sketch of generic programming follows after this thread.)
  • ...6 more comments...
  •  
    "Java is clearly an example of a money oriented programming (MOP)." Exactly. And for the industry it's the money that matters. Whatever mathematicians think about it.
  •  
    It is actually a good thing that it is "MOP" (even though I do not agree with this term): that is what makes it interoperable, light, and easy to learn. There is no point in writing fancy code if it does not bring anything to the end-user but only gives geeks incomprehensible things to discuss in forums. Anyway, I am pretty sure we can find a Java guy slamming C++ ;)
  •  
    Personally, I never understood what the point of Java is, given that: 1) I do not know of any developer (maybe Marek?) who uses it for intellectual pleasure/curiosity/fun, whatever, given the possibility of choice - this to me speaks more loudly about the objective qualities of the language than any industrial-corporate marketing bullshit (for the record, I argue that Python is more interoperable, lighter, and easier to learn than Java - which is why, e.g., Google is using it heavily); 2) I have used software developed in Java maybe a total of 5 times on any computer/laptop I have owned over 15 years. I cannot name one single Java project that I find necessary or even useful; for my usage of computers, Java could disappear overnight without my even noticing. Then of course one can argue as much as one wants about the "industry choosing Java", to which I would counter-argue with examples of industry doing stupid things and making absurd choices. But I suppose it would be a kind of pointless discussion, so I'll just stop here :)
  •  
    "At Google, python is one of the 3 "official languages" alongside with C++ and Java". Java runs everywhere (the byte code itself) that is I think the only reason it became famous. Python, I guess, is more heavy if it were to run on your web browser! I think every language has its pros and cons, but I agree Java is not the answer to everything... Java is used in MATLAB, some web applications, mobile phones apps, ... I would be a bit in trouble if it were to disappear today :(
  •  
    I personally do not believe in interoperability :)
  •  
    Well, I bet you'd notice an overnight disappearance of Java, because half of the internet would vanish... J2EE technologies are just omnipresent there... I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank's online services are written in Java. Certainly not in PHP+MySQL :) Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you to develop robust, reliable, modular, well-integrated, etc. software. And the costs? Well, using Java technologies you can set up enterprise-quality web application servers and get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... For many years now, the central issue in software development has not been implementing algorithms, it's building applications. And that's where Java outperforms many other technologies. The final remark, because I may be mistakenly taken for an apostle of Java or something... I love the idea of generic programming, C++ is my favourite programming language (and I used to read Stroustrup before sleep), and in my leisure time I write programs in Python... But if I were to start a software development company, then, apart from some very niche applications like computer games, it would most probably use Java as its main technology.
  •  
    "I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank online services are written in Java. Certainly not in PHP+MySQL :)" Doing in C++ would be awesomely crazy, I agree :) But as I see it there are lots of huge websites that operate on PHP, see for instance Facebook. For the banks and the enterprise market, as a general rule I tend to take with a grain of salt whatever spin comes out from them; in the end behind every corporate IT decision there is a little smurf just trying to survive and have the back covered :) As they used to say in the old times, "No one ever got fired for buying IBM". "Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you develop robustly, reliably, error-prone, modular, well integrated etc... software. And the costs? Well, using java technologies you can set-up enterprise-quality web application servers, get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... Since many years now, the central issue in software development is not implementing algorithms, it's building applications. And that's where Java outperforms many other technologies." Apart from the IDE considerations (on which I cannot comment, since I'm not a IDE user myself), I do not see how Java beats the competition in this regard (again, Python and the huge software ecosystem surrounding it). My impression is that Java's success is mostly due to Sun pushing it like there is no tomorrow and bundling it with their hardware business.
  •  
    OK, I think there is a bit of everything, wrong and right, but you have to acknowledge that Python is not always the simplest. For info, Facebook uses Java (if you upload a picture, for instance), and PHP is very limited. So definitely, in a company, engineers like you and me select the language; it is not a marketing or political thing. And in the case of fb, they came to the conclusion that PHP and Java don't each do everything, but complement each other. As you say, Python has many things around it, but it might be too much for simple applications. Otherwise, I would seriously be interested in a study of how to implement a Python-like system on board spacecraft and what the advantages would be over mixing C, Ada, and Java.
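  •  
    To ground Stepanov's definition quoted above ("finding the most abstract representations of efficient algorithms"), here is a minimal sketch of the idea in Python: a single-pass minimum written against the weakest assumption that keeps it efficient, namely that elements support '<'.

```python
# A minimal sketch of generic programming: one algorithm written against the
# most abstract requirement that keeps it efficient -- a single-pass minimum
# that only assumes elements support '<'.
from typing import Iterable, TypeVar

T = TypeVar("T")  # any type supporting '<'

def minimum(items: Iterable[T]) -> T:
    """Return the smallest element; O(n), one comparison per element."""
    it = iter(items)
    best = next(it)               # raises StopIteration on empty input
    for x in it:
        if x < best:
            best = x
    return best

print(minimum([3, 1, 2]))             # ints
print(minimum(["pear", "apple"]))     # strings
print(minimum([(2, "b"), (1, "a")]))  # tuples, compared lexicographically
```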
Francesco Biscani

Eco - "Writings: IBM vs. Mac" - 5 views

  • The fact is that the world is divided between users of the Macintosh computer and users of MS-DOS compatible computers. I am firmly of the opinion that the Macintosh is Catholic and that DOS is Protestant.
  •  
    An amusing opinion piece by Italian writer Umberto Eco.
  •  
    read this for the first time 10 years ago - but always nice to re-read indeed ...
LeopoldS

Tilera Corporation - 2 views

  •  
    who wants 100 cores ... future of PAGMO?
  • ...2 more comments...
  •  
    Well, nVidia provides 10,000 "cores" in a single rack on their Teslas...
  •  
    remember that you were already recommending its purchase some time ago ... are there still strong reasons to do so?
  •  
    The problem with this flurry of activity today regarding multicore architectures is that it is really unclear which one will be the winner in the long run. Never underestimate the power of inertia, especially in the software industry (after all, people are still programming in COBOL and Fortran today). For instance, NVIDIA gives you the Teslas with 10,000 cores, but then you have to rewrite extensive parts of your code in order to take advantage of this. Is this an investment worth undertaking? Difficult to say; it would certainly be if the whole software world moved in that direction (which is not happening - yet?). But then you have other approaches coming out, such as the Cell processor by IBM (the one in the PS3), which has really impressive floating-point performance and, of course, a completely different programming model. The nice thing about this Tilera processor seems to be that it is a general-purpose processor, which may not require extensive re-engineering of existing code (but I'm really hypothesizing, since the technical details are not very abundant on their website).
  •  
    Moreover, PaGMO's computation model leans more towards systems with distributed memory than shared memory (i.e., multi-core). With the latter, at a certain point memory access becomes the bottleneck.
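  •  
    To make the shared- vs distributed-memory point concrete, here is a minimal sketch of PaGMO-style island-model parallelism, with a toy random search standing in for a real optimization algorithm. Each island is an independent process; the only communication is small champion messages, which is why this model scales on distributed-memory systems where shared-memory multicores would hit the memory-access bottleneck.

```python
# A minimal sketch of island-model parallelism with plain Python
# multiprocessing: each "island" is a separate process running an independent
# toy random search, so the only data exchanged are tiny champion messages --
# no shared-memory traffic to become a bottleneck.
from multiprocessing import Pool
import random

def fitness(x):
    return sum(xi * xi for xi in x)   # toy objective (sphere function)

def evolve_island(seed):
    """Independent random search; returns the island's champion and fitness."""
    rng = random.Random(seed)
    best = min(([rng.uniform(-5, 5) for _ in range(10)] for _ in range(10000)),
               key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    with Pool(4) as pool:             # 4 islands, e.g. one per core
        champions = pool.map(evolve_island, range(4))
    _, value = min(champions, key=lambda c: c[1])
    print(f"best fitness across islands: {value:.6f}")
```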
LeopoldS

University Industry Series: Nestle and IBM - National Coun... - 0 views

  •  
    just registered for this web seminar on open innovation .... could be fun ...
ESA ACT

IBM puts its talents to green use - 29 Oct 2007 - BusinessGreen - 0 views

  •  
    interview with the head of "Big Green Innovations" - nothing spectacularly new though ...
ESA ACT

IBM Lotus Symphony - 1.2 Mac Beta - 0 views

  •  
    has anybody tried this out already? It contains "lotus" and thus already has a handicap in its name, but well ... why not check it out
nikolas smyrlakis

PARC (Palo Alto Research Center) - 0 views

  •  
    An interesting research centre in California! Focus areas: Business Services; Electronic Materials, Devices, & Systems; Information & Communication Technologies; Biomedical Systems; Cleantech
  • ...1 more comment...
  •  
    and some very ACT-like interesting internships / ideas they have: Automatic summarization of related documents http://www.parc.com/job/43/automatic-summarization-of-related-documents.html (remember Kev's idea?); Bayesian diagnosis http://www.parc.com/job/34/bayesian-diagnosis---summer.html; Autonomous robotics (UAVs, UGVs) http://www.parc.com/job/36/autonomous-robotics---summer.html
  •  
    XEROX PARC was definitely heavily involved in computer development: e.g. the mouse, the GUI, Ethernet, and OO programming all came out of PARC - and all that without focusing on computers, but on printers...
  •  
    aaah, it's the XEROX centre, didn't know. Yep, they made the mouse and then handed it over nicely to Apple after IBM thought it was useless
Marion Nachon

Frontier Development Lab (FDL): AI technologies to space science - 3 views

Applications might be of interest to some: https://frontierdevelopmentlab.org/blog/2019/3/1/application-deadline-extended-cftt4 ...

technology AI space science

started by Marion Nachon on 08 Apr 19 no follow-up yet
jaihobah

Quantum Artificial Life in an IBM Quantum Computer - 6 views

  •  
    I tried reading the abstract and my eyes glazed over at the buzzword density. Is this hot doo doo or a meaningful result?
  •  
    wow - quantum, artificial life, biomimetic, quantum supremacy .... quantum machine learning, and quantum artificial intelligence and, wait for it ...... quantum complexity. All in one abstract - is this the new champion?