
Advanced Concepts Team / Group items tagged: trends


Dario Izzo

IPCC models getting mushy | Financial Post - 2 views

  •  
    why am I not surprised .....
  •  
    http://www.academia.edu/4210419/Can_climate_models_explain_the_recent_stagnation_in_global_warming A view from well-respected scientists on how to proceed from here, which was rejected by Nature. In any case, a long way to go...
  •  
    Unfortunately it's too early to cheer and burn more coal... There is also a nice podcast associated with this paper from Nature:

    Recent global-warming hiatus tied to equatorial Pacific surface cooling. Yu Kosaka & Shang-Ping Xie, Nature 501, 403-407 (19 September 2013), doi:10.1038/nature12534. Received 18 June 2013; accepted 08 August 2013; published online 28 August 2013.

    Abstract: Despite the continued increase in atmospheric greenhouse gas concentrations, the annual-mean global temperature has not risen in the twenty-first century, challenging the prevailing view that anthropogenic forcing causes climate warming. Various mechanisms have been proposed for this hiatus in global warming, but their relative importance has not been quantified, hampering observational estimates of climate sensitivity. Here we show that accounting for recent cooling in the eastern equatorial Pacific reconciles climate simulations and observations. We present a novel method of uncovering mechanisms for global temperature change by prescribing, in addition to radiative forcing, the observed history of sea surface temperature over the central to eastern tropical Pacific in a climate model. Although the surface temperature prescription is limited to only 8.2% of the global surface, our model reproduces the annual-mean global temperature remarkably well, with correlation coefficient r = 0.97 for 1970-2012 (which includes the current hiatus and a period of accelerated global warming). Moreover, our simulation captures major seasonal and regional characteristics of the hiatus, including the intensified Walker circulation, the winter cooling in northwestern North America and the prolonged drought in the southern USA. Our results show that the current hiatus is part of natural climate variability, tied specifically to a La-Niña-like decadal cooling. Although similar decadal hiatus events may occur in the future, the multi-decadal warming trend is very likely to continue with greenhouse gas increase.
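    For reference, the r = 0.97 quoted in the abstract is simply the Pearson correlation between the observed and simulated annual-mean global temperature series. A minimal sketch of how such a number is computed (the arrays below are hypothetical placeholders, not the paper's data):

```python
# Pearson correlation between observed and simulated annual-mean global temperatures.
# Placeholder arrays stand in for the actual 1970-2012 data sets used in the paper.
import numpy as np

years = np.arange(1970, 2013)
t_observed = np.random.normal(0.0, 0.2, size=years.size)                  # hypothetical observations
t_simulated = t_observed + np.random.normal(0.0, 0.05, size=years.size)   # hypothetical model output

r = np.corrcoef(t_observed, t_simulated)[0, 1]
print(f"correlation coefficient r = {r:.2f}")
```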
johannessimon81

Computational Imaging: The Next Mobile Battlefield - 2 views

  •  
    Wired article giving an opinion on future trends in mobile computing (e.g. SLAM, 3D vision, ...)
Dario Izzo

Climate scientists told to 'cover up' the fact that the Earth's temperature hasn't rise... - 5 views

  •  
    This is becoming a mess :)
  • ...2 more comments...
  •  
    I would avoid reading climate science in political journals, for a less selective / dramatic picture :-) . Here is a good start: http://www.realclimate.org/ And an article on why climate understanding should be approached hierarchically (which is not the way it is done in the IPCC), an insightful view from 8 years ago: http://www.princeton.edu/aos/people/graduate_students/hill/files/held2005.pdf
  •  
    True, but funding is allocated to climate modelling 'science' on the basis of political decisions, not solid and boring scientific truisms such as 'all models are wrong'. The reason so many people were trained in this area in recent years is that resources were allocated to climate science on the basis of the dramatic picture depicted by some scientists, when it was indeed convenient for them to be dramatic.
  •  
    I see your point, and I agree that funding was also promoted by the energy players and their political influence. That is a coincident parallel interest, irrelevant to the fact that the question remains vital: how do we affect the climate, and how does it respond? It is a huge, complex system to analyse, responding on various time scales that can obscure the trend. What if we drew a conceptual parallel with the L'Aquila case: is the scientific method guilty, or the interpretation of uncertainty in terms of societal mobilization? Should we leave the humanitarian aspect outside any scientific activity?
  •  
    I do not think there is anyone arguing that the question is not interesting and complex. The debate, instead, addresses the predictive value of the models produced so far. Are they good enough to be used outside of the scientific process aimed at improving them? Or should one wait for "the scientific method" to bring forth substantial improvements to the current understanding, and only then start using its results? One can take both standpoints, but some recent developments will bring many towards the second approach.
Alexander Wittig

Picture This: NVIDIA GPUs Sort Through Tens of Millions of Flickr Photos - 2 views

  •  
    Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View." Magical deep learning! Buzzword attack!
  • ...4 more comments...
  •  
    and here comes my standard question: how can we use this for space? fast detection of natural disasters onboard?
  •  
    Even on the ground. You could, for example, teach it what nuclear reactors, missiles or other weapons you don't want look like in satellite pictures and automatically scan the world for them (basically replacing intelligence analysts).
  •  
    In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613 In this publication they did it manually from a GeoEye-1 b/w image, which sounds quite tedious. Maybe one could train one of those image recognition algorithms to do it automatically. Or maybe it's a bit easier to count larger things, like elephants (also a thing).
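    To give a rough idea of what "train one of those image recognition algorithms" could look like in practice, here is a minimal, hypothetical sketch: a tiny CNN that classifies 32x32 patches as seal / no seal, plus a sliding-window counter over a grayscale scene. The patch size, detection threshold and data are all assumptions; a real project would also need a labelled patch set and a training loop (cross-entropy over the two classes).

```python
# Hypothetical sketch: patch classifier + sliding-window counting over a satellite scene.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 2),          # assumes 32x32 single-channel input patches
        )

    def forward(self, x):
        return self.net(x)

def count_seals(model, scene, patch=32, stride=16, threshold=0.9):
    """Slide a window over a 2D grayscale scene and count confident 'seal' patches."""
    model.eval()
    count = 0
    with torch.no_grad():
        for i in range(0, scene.shape[0] - patch + 1, stride):
            for j in range(0, scene.shape[1] - patch + 1, stride):
                window = scene[i:i + patch, j:j + patch].unsqueeze(0).unsqueeze(0)
                prob_seal = torch.softmax(model(window), dim=1)[0, 1]
                count += int(prob_seal > threshold)
    return count

# Example with a random placeholder scene (a trained model would be loaded instead).
scene = torch.rand(512, 512)
print(count_seals(PatchClassifier(), scene))
```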
  •  
    At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year, there was a big trend of CUDA GPU vs FPGA for hardware-accelerated image processing. Most of it orbited around discussing which is faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing together with the radio telescope institute in the Netherlands about the solution they were working on (GPU/CUDA).

    I gathered that NVIDIA GPUs best suit applications that do not somehow rely on the hardware, with the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in rad-hard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' programming languages. I don't know what the level of rad hardness of NVIDIA's GPUs is... Therefore FPGAs are indeed the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based systems (radio astronomy or other types of telescopes). I think that for a specific purpose like the one you mentioned, this FPGA vs GPU question should be assessed first before going further.
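    To illustrate the "accessible to a scientist" point about GPU programming: with Python and Numba one can write a simple CUDA image-processing kernel without any hardware description language. A minimal, hypothetical sketch (a plain thresholding kernel, not anything presented at the conference; it assumes an NVIDIA GPU with the CUDA toolkit installed):

```python
# Minimal sketch of GPU image processing from Python via Numba's CUDA support.
import numpy as np
from numba import cuda

@cuda.jit
def threshold_kernel(img, out, level):
    i, j = cuda.grid(2)                          # global thread indices
    if i < img.shape[0] and j < img.shape[1]:
        out[i, j] = 1.0 if img[i, j] > level else 0.0

img = np.random.rand(2048, 2048).astype(np.float32)    # placeholder image
out = np.zeros_like(img)

threads = (16, 16)
blocks = ((img.shape[0] + 15) // 16, (img.shape[1] + 15) // 16)
threshold_kernel[blocks, threads](img, out, 0.5)        # Numba copies the arrays to/from the device
print(out.sum(), "pixels above threshold")
```

    An equivalent FPGA implementation would instead involve describing the datapath in VHDL/Verilog (or an HLS tool) and synthesizing it for the target device, which is where the "specific knowledge" mentioned above comes in.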
  •  
    You're forgetting power usage. GPUs need 1000 hamster wheels' worth of power while FPGAs can run on a potato. Since space applications are highly power-limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it. Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
  •  
    The discussion of how to make FPGA hardware-acceleration solutions easier to use for the 'layman' is starting, btw: http://reconfigurablecomputing4themasses.net/
LeopoldS

Hycopter Drone Flies for 4 Hours via Hydrogen Power - Robotics Trends - 2 views

  •  
    finally a consumer application for fuel cells?
LeopoldS

Technology Review: TR10 - 3 views

  •  
    what would be in your best 10 list?
nikolas smyrlakis

INFLUENCERS, How Trends & Creativity Become Contagious. - 1 views

  •  
    haven't seen the documentary yet, great site though (made for iPad?)
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it... but in space, low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, for what purposes are computers mainly used on board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance, one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or +/- 10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (e.g. the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates...
  •  
    Satellites use old CPUs also because, with the trend towards higher power consumption, modern CPUs are not very convenient from a system design point of view (TBC)... As a consequence, the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation too), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess for what applications this would be useful at all... and at what precision / power levels.
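    To make the "probabilistic result within a tolerated relative error" idea concrete, here is a toy sketch. It is purely illustrative (it is not how PCMOS hardware actually works): it emulates unreliable arithmetic by occasionally flipping a low-order mantissa bit of each intermediate result, then compares the outcome of a dot product against the exact value and the ~10^-6 tolerance mentioned above.

```python
# Toy sketch: emulate "unreliable" arithmetic by randomly flipping low-order mantissa
# bits of intermediate results, then measure the relative error of a dot product.
# Purely illustrative of the precision/reliability trade-off, not a PCMOS model.
import numpy as np

rng = np.random.default_rng(0)

def noisy(x, n_bits=20, flip_prob=0.1):
    """With probability flip_prob, flip one of the lowest n_bits mantissa bits of x."""
    bits = np.float64(x).view(np.uint64)
    if rng.random() < flip_prob:
        bits = bits ^ (np.uint64(1) << rng.integers(0, n_bits, dtype=np.uint64))
    return bits.view(np.float64)

# Exact vs "probabilistic" dot product of two placeholder state vectors.
a = rng.normal(size=1000)
b = rng.normal(size=1000)

exact = float(np.dot(a, b))
approx = 0.0
for ai, bi in zip(a, b):
    approx = noisy(approx + noisy(ai * bi))

rel_err = abs(approx - exact) / abs(exact)
print(f"relative error of the unreliable dot product: {rel_err:.2e}")
```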
  •  
    The interest of this could be a high fault tolerance for some math operations... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding CPU power consumption (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance, a LEON SPARC (e.g. used on some ESA platforms) consumes something like 5 mW/MHz (so roughly 0.5 W at 100 MHz), which is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s... would a higher velocity be possible with faster processors?). Another issue is the radiation tolerance of the technology... if PCMOS is more tolerant to radiation, it could get space-qualified more easily...
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly higher rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
Nicholas Lan

A counter-distributed computing story - 4 views

  •  
    article itself isn't terribly interesting but first piece i've seen on such a trend
Juxi Leitner

Academics and Research: Virginia Tech Students Build Humanoid | Robotics Trends - 0 views

  •  
    CHARLI is the first untethered, autonomous, full-sized, walking, humanoid robot with four moving limbs and a head, built in the United States. His two long legs and arms can move and gesture thanks to a combination of pulleys, springs, carbon fiber rods, and actuators. CHARLI soon will be able to talk as well.
Ma Ru

Scientists who rock - 3 views

  •  
    I'll just leave this here... Found them through a recent publication of LeDoux in Trends in Cognitive Sciences
  •  
    Ah, and if there are no strong objections, from now on every brainstorming session shall commence with this: http://thebeautifulbrain.com/2010/10/ledoux-amygdaloids-brainstorm/2/
pacome delva

Physics - Nanospheres on a silver plate - 0 views

  • As a result of its high symmetry and conjugated bond structure, the electronic properties of C60 are very unusual, and there is a massive research effort toward integrating it into molecular scale electronic devices [4].
  • In this context, it is important to understand how the molecule forms bonds with a metal substrate, such as silver, which is commonly used as an electrode material.
  • The general trend in all of these cases shows that even molecules with relatively weak individual (atom-to-atom) surface bonds can induce substantial substrate reconstructions in order to create favorable adsorption sites [8]. Such “nanopatterning” of substrates is essential to the stability of ordered structures of these molecules and can critically influence their electronic structure, which is an important aspect in the design of molecular electronic devices.
LeopoldS

Germany Imagines Suburbs Without Cars - NYTimes.com - 0 views

  •  
    how many of the ACT members actually have a car? maybe we are trendsetters :-)
ESA ACT

Google Trends: Neue Konkurrenz für Alexa (Google Trends: new competition for Alexa) - 0 views

  •  
    could this also be useful for our website stats? - sorry for the German
ESA ACT

Japan Trend Shop - 0 views

  •  
    need ideas for xmas?
ESA ACT

Global Trends 2025: A Transformed World (33MB) - 0 views

  •  
    Interesting not for the US security point of view, but for some forecasting on technologies and politics of the future
Juxi Leitner

Mendeley, the-Last.fm-of-research, could be world's largest online research paper datab... - 4 views

  •  
    smells like ariadnet for ariadna papers and researchers
  • ...1 more comment...
  •  
    Indeed, seems like what we dream of for ariadnet... However, it would have been good to allow the creation of groups. I will try it next week. The possibility to "Explore research trends and statistics" will please Leopold ;)
  •  
    I am on Mendeley now and I like it so far! You can check my page at http://www.mendeley.com/profiles/pacome-delva/
  •  
    I have also been on Mendeley for some time - I think Tobias showed it to me. Nice, but I did not actually really use it yet...
Ma Ru

The human Turing machine: a neural framework for mental programs - 2 views

  •  
    From the alternative computing series...
Juxi Leitner

Convolutional Neural Networks for Visual Recognition - 3 views

  •  
    pretty impressive stuff!
  • ...3 more comments...
  •  
    Amazing how some guys from some other university also did pretty much the same thing (although they didn't use the bidirectional stuff) and published it just last month. Just goes to show you can dump pretty much anything into an RNN and train it for long enough and it'll produce magic. http://arxiv.org/pdf/1410.1090v1.pdf
  •  
    Seems like quite the trend. And the fact that Google still tries to use LSTMs is even more surprising.
  •  
    LSTMs: that was also the first thing in the paper that caught my attention! :) I hadn't seen them in the wild in years... My oversight most likely. The paper seems to be getting ~100 citations a year. Someone's using them.
  •  
    There are a few papers on them. Though you have to be lucky to get them to work. The backprop is horrendous.
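    For what it's worth, modern frameworks take the horrendous backprop off your hands entirely: gradients through time come from autograd. A minimal, hypothetical sketch of a sequence classifier built around an LSTM in PyTorch (toy data, no relation to the papers linked above):

```python
# Minimal LSTM sequence classifier; backpropagation through time is handled by autograd.
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, vocab_size=50, embed_dim=32, hidden_dim=64, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len) integer ids
        x = self.embed(tokens)
        _, (h_n, _) = self.lstm(x)              # h_n: (1, batch, hidden_dim)
        return self.out(h_n[-1])                # logits: (batch, num_classes)

model = SequenceClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

tokens = torch.randint(0, 50, (8, 20))          # toy batch: 8 sequences of length 20
labels = torch.randint(0, 5, (8,))
for step in range(100):
    optimizer.zero_grad()
    loss = criterion(model(tokens), labels)
    loss.backward()                             # backprop through time, no manual gradients
    optimizer.step()
```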