Advanced Concepts Team - Group items tagged: phone

johannessimon81

IBM: stop motion video made with individual atoms - 1 views

  •  
    Amazing! :-D Makes you forget how hard it is to detect individual atoms at all.
  •  
    While amazing indeed, it makes me wonder how much longer we will have to wait until all this nanotechnology stuff delivers something actually useful (say, super-efficient/super-small transistors in my cell phone, camera, computer, etc.). So far it seems to excel mostly at marketing...
anonymous

Home - Toronto Deep Learning - 2 views

  •  
    Online implementation of a deep-learning-based image classifier. Try taking a picture with your phone and uploading it there. Pretty impressive results. EDIT: Okay, it works best with well-exposed simple objects (pen, mug).
Ma Ru

Memory and the Cybermind - 0 views

  •  
    or "your phone vs. your wife"...
Luís F. Simões

HP Dreams of Internet Powered by Phone Chips (And Cow Chips) | Wired.com - 0 views

  • For Hewlett Packard Fellow Chandrakant Patel, there’s a “symbiotic relationship between IT and manure.”
  • Patel is an original thinker. He’s part of a group at HP Labs that has made energy an obsession. Four months ago, Patel buttonholed former Federal Reserve Chairman Alan Greenspan at the Aspen Ideas Festival to sell him on the idea that the joule should be the world’s global currency.
  • Data centers produce a lot of heat, but to energy connoisseurs it’s not really high quality heat. It can’t boil water or power a turbine. But one thing it can do is warm up poop. And that’s how you produce methane gas. And that’s what powers Patel’s data center. See? A symbiotic relationship.
  • ...1 more annotation...
  • Financial house Cantor Fitzgerald is interested in Project Moonshot because it thinks HP’s servers may have just what it takes to help the company’s traders understand long-term market trends. Director of High-Frequency Trading Niall Dalton says that while the company’s flagship trading platform still needs the quick number-crunching power that comes with the powerhog chips, these low-power Project Moonshot systems could be great for analyzing lots and lots of data — taking market data from the past three years, for example, and running a simulation.
  •  
    of relevance to this discussion: Koomey's Law, a Moore's Law equivalent for computing's energetic efficiency http://www.economist.com/node/21531350 http://hardware.slashdot.org/story/11/09/13/2148202/whither-moores-law-introducing-koomeys-law
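
As a side note on the Koomey's Law reference above, here is a minimal back-of-envelope sketch of what such an exponential efficiency trend implies. The ~1.6-year doubling period is an assumption (the commonly cited figure), not something stated in the linked articles.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Assumed doubling period for computations-per-joule (commonly cited
    // as roughly 1.6 years for Koomey's Law; treat this as an assumption).
    const double doubling_period_years = 1.6;

    for (int years = 0; years <= 10; years += 2) {
        // Relative gain in energy efficiency after `years` years.
        const double gain = std::pow(2.0, years / doubling_period_years);
        std::printf("after %2d years: ~%.0fx more computations per joule\n", years, gain);
    }
    return 0;
}
```
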
johannessimon81

Iridium to introduce WiFi hotspots for global satellite internet - 2 views

  •  
    In remote / under-developed regions this might actually be a strong alternative for building internet connectivity - somewhat like the exploding market for cell phones in Africa due to the lack of land lines.
johannessimon81

Facebook is buying WhatsApp for ~ $ 19e9 - 1 views

  •  
    That is about € 14e9 - enough to pay more than a million YGTs for half a year. Could we use maybe just half a million YGTs for half a year to build a similar platform and keep the remaining € 7e9 for ourselves? Keep in mind that WhatsApp only has 45 employees (according to AllThingsD: http://goo.gl/NtJcSj ). So we would have an advantage > 10000:1. On the other hand, does this mean that every employee at WhatsApp now gets enough money to survive comfortably for ~5000 years, or will the inevitable social inequality strike and most people get next to nothing while a few get money to live comfortably for ~1000000 years? Also: does Facebook think about these numbers before they pay them? Or is it just a case of "That looks tasty - let's have it"? Also (2): as far as I can see, all these internet companies (Google, Facebook, Yahoo, WhatsApp, Twitter...) seem to make most of their income from advertising. For all these companies together that must be a lot of advertising money (turns out that in 2013 the world spent about $ 500 billion on advertising: http://goo.gl/vYog15 ). For that money you could of course have 20 million YGTs roaming the Earth and advertising stuff door-to-door... ... ...
  • ...1 more comment...
  •  
    Jo, that's just brilliant... 500 billion USD total on advertising, that sounds absolutely ridiculous.. I always wondered whether this giant advertisement scheme is just one big 'ponzi'-like scheme waiting to crash down on us one day when they realize cat-picture-twittering, fb-ing, whatsapping consumers just ain't worth it..
  •  
    The whole valuation of those internet companies is a bit scary. Things like the Facebook and Twitter IPO numbers seem just ridiculous.
  •  
    Facebook is not really buying into a potentially good business deal so much as it is buying out risky competition. Popular trends need to be killed fast before they get too far off the ground. Also, the amount of personal data that WhatsApp is amassing is staggering. I have never seen an app requesting so many phone permissions in my life.
Guido de Croon

Will robots be smarter than humans by 2029? - 2 views

  •  
    Nice discussion about the singularity. Made me think of drinking coffee with Luis... It raises some issues such as the necessity of embodiment, etc.
  • ...9 more comments...
  •  
    "Kurzweilians"... LOL. Still not sold on embodiment, btw.
  •  
    The biggest problem with embodiment is that, since the passive walkers (with which it all started), it hasn't delivered anything really interesting...
  •  
    The problem with embodiment is that it's done wrong. Embodiment needs to be treated like big data. More sensors, more data, more processing. Just putting a computer in a robot with a camera and microphone is not embodiment.
  •  
    I like how he attacks Moore's Law. It always looks a bit naive to me if people start to (ab)use it to make their point. No strong opinion about embodiment.
  •  
    @Paul: How would embodiment be done RIGHT?
  •  
    Embodiment has some obvious advantages. For example, in the vision domain many hard problems become easy when you have a body with which you can take actions (like looking at an object you don't immediately recognize from a different angle) - a point already made by researchers such as Aloimonos and Ballard in the late '80s / early '90s. However, embodiment goes further than gathering information and "mental" recognition. In this respect, the evolutionary robotics work by, for example, Beer is interesting, where an agent discriminates between diamonds and circles by avoiding one and catching the other, without there being a clear "moment" in which the recognition takes place. "Recognition" is a behavioral property there, for which embodiment is obviously important. With embodiment the effort for recognizing an object behaviorally can be divided between the brain and the body, resulting in less computation for the brain. Also the article "Behavioural Categorisation: Behaviour makes up for bad vision" is interesting in this respect. In the field of embodied cognitive science, some say that recognition is constituted by the activation of sensorimotor correlations. I wonder to what extent this is true, and whether it holds from extremely simple creatures up to more advanced ones, but it is an interesting idea nonetheless. This being said, if "embodiment" implies having a physical body, then I would argue that it is not a necessary requirement for intelligence. "Situatedness", being able to take (virtual or real) "actions" that influence the "inputs", may be.
  •  
    @Paul While I completely agree about the "embodiment done wrong" (or at least "not exactly correct") part, what you say goes exactly against one of the major claims connected with the notion of embodiment (google for "representational bottleneck"). The fact is, your brain does *not* have the resources to deal with big data. The idea therefore is that it is the body that helps to deal with what to a computer scientist appears like "big data". Understanding how this happens is key. Whether it is a problem of scale or of actually understanding what happens should be quite conclusively shown by the outcomes of the Blue Brain project.
  •  
    Wouldn't one expect that to produce consciousness (even in a lower form) an approach resembling that of nature would be essential? All animals grow from a very simple initial state (just a few cells) and have only a very limited number of sensors AND processing units. This would allow for a fairly simple way to create simple neural networks and to start up stable neural excitation patterns. Over time, as the complexity of the body (sensors, processors, actuators) increases, the system should be able to adapt in a continuous manner and increase its degree of self-awareness and consciousness. On the other hand, building a simulated brain that resembles (parts of) the human one in its final state seems to me like taking a person who has just died and trying to restart the brain by means of electric shocks.
  •  
    Actually, on a neuronal level all information gets processed. Not all of it makes it into "conscious" processing or attention. Whatever makes it into conscious processing is a highly reduced representation of the data you get. However, the rest doesn't get lost. Basic, minimally processed data forms the basis of proprioception and reflexes. Every step you take is a macro command your brain issues to the intricate sensory-motor system that puts your legs in motion by actuating every muscle and correcting every deviation from the desired trajectory using the complicated system of nerve endings and motor commands. Reflexes are built over the years, as those massive amounts of data slowly get integrated into the nervous system and the incipient parts of the brain. But without all those sensors scattered throughout the body, all the little inputs in massive amounts that slowly get filtered through, you would not be able to experience your body and experience the world. Every concept that you conjure up from your mind is a sort of loose association of your sensorimotor input. How can a robot understand the concept of a strawberry if all it can perceive of it is its shape and color and maybe the sound that it makes as it gets squished? How can you understand the "abstract" notion of strawberry without the incredibly sensitive tactile feel, without the act of ripping off the stem, without the motor action of taking it to our mouths, without its texture and taste? When we as humans summon the strawberry thought, all of these concepts and ideas converge (distributed throughout the neurons in our minds) to form one abstract concept out of all of those many, many correlations. A robot with no touch, no taste, no delicate articulate motions, no "serious" way to interact with and perceive its environment, no massive flow of information from which to choose and reduce, will never attain human-level intelligence. That's point 1. Point 2 is that mere pattern recogn
  •  
    All information *that gets processed* gets processed, but now we have arrived at a tautology. The whole problem is that ultimately nobody knows what gets processed (not to mention how). In fact, the absolute statement that "all information" gets processed is very easy to dismiss, because the characteristics of our sensors are such that a lot of information is filtered out already at the input level (e.g. the eyes). I'm not saying it's not a valid and even interesting assumption, but it's still just an assumption, and the next step is to explore scientifically where it leads you. And until you show its superiority experimentally, it's as good as all the other alternative assumptions you could make. I only wanted to point out that "more processing" is not exactly compatible with some of the fundamental assumptions of embodiment. I recommend Wilson (2002) as a crash course.
  •  
    These deal with different things in human intelligence. One is the depth of the intelligence (how much of the bigger picture you can see, how abstract the concepts and ideas you form can be), another is the breadth of the intelligence (how well you can actually generalize, how encompassing those concepts are, and at what level of detail you perceive all the information you have), and another is the relevance of the information (this is where embodiment comes in: what you do serves a purpose, is tied into the environment and ultimately linked to survival). As far as I see it, these form the pillars of human intelligence, and of the intelligence of biological beings. They pull against each other, mainly due to physical constraints (such as, for example, energy usage and training time). "More processing" is not exactly compatible with some aspects of embodiment, but it is important for human-level intelligence. Embodiment is necessary for establishing an environmental context of actions, a constraint space if you will; failure of human minds (e.g. schizophrenia) is ultimately a failure of perceived embodiment. What we do know is that we perform a lot of compression and a lot of integration on a lot of data in an environmental coupling. Imo, take any of these parts out and you cannot attain human+ intelligence. Vary the quantities and you'll obtain different manifestations of intelligence, from cockroach to cat to google to random quake bot. Increase them all beyond human levels and you're on your way towards the singularity.
Dario Izzo

Modular Mobile Phones (big G + Motorola behind this): Same idea applicable to satellite... - 1 views

  •  
    We thought about a modular SAT concept in the past (looking at the second issue of Acta Futura, it's already there) .... but never made concrete steps ....
Luzi Bergamin

"ride sharing 2.0" with car2gether - 2 views

  •  
    Pretty close to what we suggested for "system of systems." It was an excellent exercise in bullshitting, but of course ESA is not innovative enough. Now it's too late, Daimler did it...
  •  
    not sure they will end up needing a lot of "space" though ...
  •  
    There is no "space" yet, at least I didn't find any. But that's because they didn't consider specific hardware, but normal mobile phones. The step to include "space" is obvious, though, and I'm sure they thought about this as well...
LeopoldS

Track changes with LaTeX - 3 views

  •  
    Did any of you LaTeX gurus already try this out?
  • ...1 more comment...
  •  
    It's installed on my computer, but I never really used it. I think it's fine, but for my purposes latexdiff is mostly enough.
  •  
    I assume that you use latexdiff from the command line ... still have to find a nice script with which to integrate it into the TexShop GUI for Karène ...
  •  
    A command line is an interface as well. I was able to explain via phone to a (computer-wise) average, undereducated Mac user how to install and run latexdiff. Thus I think Karène can use it too...
Francesco Biscani

Official Google Blog: Announcing Google's Focused Research Awards - 0 views

  • Today, we're announcing the first-ever round of Google Focused Research Awards — funding research in areas of study that are of key interest to Google as well as the research community. These awards, totaling $5.7 million, cover four areas: machine learning, the use of mobile phones as data collection devices for public health and environment monitoring, energy efficiency in computing, and privacy.
  •  
    Might be of some interest to Christos?
Francesco Biscani

STLport: An Interview with A. Stepanov - 2 views

  • Generic programming is a programming method that is based in finding the most abstract representations of efficient algorithms.
  • I spent several months programming in Java.
  • for the first time in my life programming in a new language did not bring me new insights
  • ...2 more annotations...
  • it has no intellectual value whatsoever
  • Java is clearly an example of a money oriented programming (MOP).
  •  
    One of the authors of the STL (C++'s Standard Template Library) explains generic programming and slams Java. (A minimal code sketch of the generic-programming idea follows at the end of this thread.)
  • ...6 more comments...
  •  
    "Java is clearly an example of a money oriented programming (MOP)." Exactly. And for the industry it's the money that matters. Whatever mathematicians think about it.
  •  
    It is actually a good thing that it is "MOP" (even though I do not agree with this term): that is what makes it interoperable, light and easy to learn. There is no point in writing fancy code if it does not bring anything to the end user and only gives geeks incomprehensible things to discuss in forums. Anyway, I am pretty sure we can find a Java guy slamming C++ ;)
  •  
    Personally, I never understood what the point of Java is, given that: 1) I do not know of any developer (maybe Marek?) that uses it for intellectual pleasure/curiosity/fun whatever, given the possibility of choice - this to me speaks more loudly about the objective qualities of the language than any industrial-corporate marketing bullshit (for the record, I argue that Python is more interoperable, lighter and easier to learn than Java - which is why, e.g., Google is using it heavily); 2) I have used software developed in Java maybe a total of 5 times on any computer/laptop I owned over 15 years. I cannot name one single Java project that I find necessary or even useful; for my usage of computers, Java could disappear overnight without my even noticing. Then of course one can argue as much as one wants about the "industry choosing Java", to which I would counterargue with examples of industry doing stupid things and making absurd choices. But I suppose it would be a kind of pointless discussion, so I'll just stop here :)
  •  
    "At Google, python is one of the 3 "official languages" alongside with C++ and Java". Java runs everywhere (the byte code itself) that is I think the only reason it became famous. Python, I guess, is more heavy if it were to run on your web browser! I think every language has its pros and cons, but I agree Java is not the answer to everything... Java is used in MATLAB, some web applications, mobile phones apps, ... I would be a bit in trouble if it were to disappear today :(
  •  
    I personally do not believe in interoperability :)
  •  
    Well, I bet you'd notice an overnight disappearance of Java, because half of the internet would vanish... J2EE technologies are just omnipresent there... I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank online services are written in Java. Certainly not in PHP+MySQL :) Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you to develop robust, reliable, less error-prone, modular, well integrated etc... software. And the costs? Well, using Java technologies you can set up enterprise-quality web application servers and get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... For many years now, the central issue in software development has not been implementing algorithms, it's building applications. And that's where Java outperforms many other technologies. The final remark, because I may be mistakenly taken for an apostle of Java or something... I love the idea of generic programming, C++ is my favourite programming language (and I used to read Stroustrup before sleep), at leisure time I write programs in Python... But if I were to start a software development company, then, apart from some very niche applications like computer games, it most probably would use Java as its main technology.
  •  
    "I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank online services are written in Java. Certainly not in PHP+MySQL :)" Doing in C++ would be awesomely crazy, I agree :) But as I see it there are lots of huge websites that operate on PHP, see for instance Facebook. For the banks and the enterprise market, as a general rule I tend to take with a grain of salt whatever spin comes out from them; in the end behind every corporate IT decision there is a little smurf just trying to survive and have the back covered :) As they used to say in the old times, "No one ever got fired for buying IBM". "Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you develop robustly, reliably, error-prone, modular, well integrated etc... software. And the costs? Well, using java technologies you can set-up enterprise-quality web application servers, get a fully featured development environment (which is better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... Since many years now, the central issue in software development is not implementing algorithms, it's building applications. And that's where Java outperforms many other technologies." Apart from the IDE considerations (on which I cannot comment, since I'm not a IDE user myself), I do not see how Java beats the competition in this regard (again, Python and the huge software ecosystem surrounding it). My impression is that Java's success is mostly due to Sun pushing it like there is no tomorrow and bundling it with their hardware business.
  •  
    OK, I think there is a bit of everything, wrong and right, but you have to acknowledge that Python is not always the simplest. For info, Facebook uses Java (if you upload pictures, for instance), and PHP is very limited. So definitely, in companies, engineers like you and me select the language; it is not a marketing or political thing. And in the case of fb, they came to the conclusion that PHP and Java don't do everything on their own but complement each other. As you say, Python has many things around it, but it might be too much for simple applications. Otherwise, I would seriously be interested in a study of how to implement a Python-like system on board spacecraft and what the advantages are over mixing C, Ada and Java.
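
As an aside to the Stepanov quote above ("finding the most abstract representations of efficient algorithms"), here is a minimal, illustrative sketch of the idea - not taken from the interview: one algorithm written against the weakest requirements that still allow an efficient implementation, usable unchanged for any conforming type. `my_min_element` is of course just a toy stand-in for the STL's own `std::min_element`.

```cpp
#include <iostream>
#include <string>
#include <vector>

// One algorithm, written against the weakest requirements that still
// permit an efficient implementation (forward iteration + operator<).
// A toy reimplementation of std::min_element, shown only to illustrate
// the quoted definition of generic programming.
template <typename Iterator>
Iterator my_min_element(Iterator first, Iterator last) {
    if (first == last) return last;   // empty range: nothing to point at
    Iterator smallest = first;
    for (++first; first != last; ++first)
        if (*first < *smallest) smallest = first;
    return smallest;
}

int main() {
    int numbers[] = {7, 3, 9, 1, 4};
    std::vector<std::string> words = {"pear", "apple", "quince"};

    // The same source text works for a raw int array...
    std::cout << *my_min_element(numbers, numbers + 5) << "\n";        // prints 1
    // ...and for a container of strings, with no loss of efficiency.
    std::cout << *my_min_element(words.begin(), words.end()) << "\n";  // prints apple
    return 0;
}
```

The point of the quoted definition is exactly this: the abstraction costs nothing at run time, because the template is instantiated per type rather than dispatched through a common interface.
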
Luís F. Simões

Seminar: You and Your Research, Dr. Richard W. Hamming (March 7, 1986) - 10 views

  • This talk centered on Hamming's observations and research on the question "Why do so few scientists make significant contributions and so many are forgotten in the long run?" From his more than forty years of experience, thirty of which were at Bell Laboratories, he has made a number of direct observations, asked very pointed questions of scientists about what, how, and why they did things, studied the lives of great scientists and great contributions, and has done introspection and studied theories of creativity. The talk is about what he has learned in terms of the properties of the individual scientists, their abilities, traits, working habits, attitudes, and philosophy.
  •  
    Here's the link related to one of the lunch time discussions. I recommend it to every single one of you. I promise it will be worth your time. If you're lazy, you have a summary here (good stuff also in the references, have a look at them):      Erren TC, Cullen P, Erren M, Bourne PE (2007) Ten Simple Rules for Doing Your Best Research, According to Hamming. PLoS Comput Biol 3(10): e213.
  • ...3 more comments...
  •  
    I'm also pretty sure that the ones who are remembered are not the ones who tried to be... so why all these rules !? I think it's bullshit...
  •  
    The seminar is not a manual on how to achieve fame, but rather an analysis on how others were able to perform very significant work. The two things are in some cases related, but the seminar's focus is on the second.
  •  
    Then read a good book on the life of Copernicus; it's the anti-manual of Hamming... he breaks all the rules!
  •  
    Honestly, I think that some of these rules actually do make sense... but I am always curious to get a good book recommendation (which book on Copernicus would you recommend?). Btw Pacome: we are in Paris... in case you have some time...
  •  
    I warmly recommend this book, a bit old but fascinating: The Sleepwalkers by Arthur Koestler. It shows that progress in science is not straight and does not obey any rules... It is not as rational as most people seem to believe today. http://www.amazon.com/Sleepwalkers-History-Changing-Universe-Compass/dp/0140192468/ref=sr_1_1?ie=UTF8&qid=1294835558&sr=8-1 Otherwise yes, I have some time! My phone number: 0699428926. We live around Denfert-Rochereau and Montparnasse. We could go for a beer this evening?
LeopoldS

Europe tackles huge fraud : Nature News - 5 views

  •  
    They used the names of scientists and research centres without these actually knowing about their involvement, it seems... I am wondering what they actually reported back in terms of results? Randomly generated papers? Christos?
  • ...2 more comments...
  •  
    surprised? of course not! schadenfreude? yes, a lot!
  •  
    Probably some bored project officer "accepted" the deliverables as reasonable? What worries me is the last paragraph by the Committee on Industry and Research (Space is in there..., all RTD is there...). Are we going to simplify procedures or tighten them more??? Because there is a lot of talk about simplification in FP8, which is not well received by Parliament/Council and co...
  •  
    Hopefully I'm wrong, but I'm very pessimistic. I guess they will impose even more control, ask for even more detailed description of the results that will be delivered and concentrate even more on project funding instead of funding open research.
  •  
    Maybe this is what happens when there is so much paper involved... a simple phone call to one of the research scientists and the fraud is unveiled :) Or maybe the "bored project officer" has a brand new Mercedes...
Luzi Bergamin

DIY Spy Drone Sniffs Wi-Fi, Intercepts Phone Calls | Threat Level | Wired.com - 2 views

  •  
    Forget about cube-sats, this is the low cost version of your personal spying device...
Juxi Leitner

Real-Life Cyborg Astrobiologists to Search for Signs of Life on Future Mars Missions - 0 views

  • The EuroGeo team developed a wearable-computer platform for testing computer-vision exploration algorithms in real time at geological or astrobiological field sites, focusing on the concept of "uncommon mapping" in order to identify contrasting areas in an image of a planetary surface. Recently, the system was made more ergonomic and easier to use by porting it onto a phone-cam platform connected to a remote server.
  • a second computer-vision exploration algorithm using a neural network in order to remember aspects of previous images and to perform novelty detection
  •  
    well a bit misleading title...
annaheffernan

New apps allow smartphone users to join the hunt for ultrahigh-energy cosmic rays - 0 views

  •  
    Two apps - the Distributed Electronic Cosmic-ray Observatory (DECO) and Cosmic Rays Found in Smartphones (CRAYFIS) - transform smartphones into miniature cosmic-ray detectors. They use the CMOS chips inside phones' onboard cameras to detect the secondary particles produced when cosmic rays - energetic, charged subatomic particles arriving from beyond the solar system - collide with air molecules in the Earth's atmosphere.
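
The detection principle described above lends itself to a very small sketch: with the lens covered, a camera frame is essentially dark, so a pixel far above the noise floor is a candidate particle hit. This is only an illustration of the idea, not the actual DECO/CRAYFIS processing pipeline; the threshold value and the tiny made-up frame are assumptions.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Count pixels that stand out against an otherwise dark frame.
// Toy version of the stated principle; real apps also reject hot pixels,
// cluster hits, etc.
int count_candidate_hits(const std::vector<std::uint8_t>& dark_frame,
                         std::uint8_t threshold) {
    int hits = 0;
    for (std::uint8_t value : dark_frame)
        if (value > threshold) ++hits;
    return hits;
}

int main() {
    std::vector<std::uint8_t> frame(64, 3);  // fake 8x8 dark frame: low sensor noise
    frame[27] = 200;                         // one bright pixel, as a particle hit might leave
    std::printf("candidate hits: %d\n", count_candidate_hits(frame, 30));
    return 0;
}
```
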
Ma Ru

Shop which knows your name - 6 views

  •  
    I'm sure Leo will love it. Yet another argument not to have a Facebook account or a smartphone.
  • ...1 more comment...
  •  
    absolutely ... so you also ditched yours already?
  •  
    Ditched? I never had either! But then on the other hand a recent Dilbert summarised me pretty well...
  •  
    So you also don't have a mobile phone? I thought I knew only one person my age who does not have one yet... congratulations