Advanced Concepts Team / Group items tagged: running

LeopoldS

iWrite - 3 views

  •  
another reason to buy an iPad .... :-)
  • ...3 more comments...
  •  
    How is this any different from the slew of existing WYSISYG LaTeX-based editors? E.g., http://www.texmacs.org/
  •  
    wow!!!!!!!!!!!! you can write equations!!!!!!!!!!
  •  
    "What you need: - A LaTeX installation" I guess you can buy one in Apple Store... no?
  •  
my understanding is that it runs on your other computer, which would do all the compiling ....
  •  
    texmacs ... and how do you install this on the iPad?
LeopoldS

French National Police Force saves €2 million a year with Ubuntu | Canonical - 0 views

  •  
Be careful, the article is written by the company that did the migration to Ubuntu. Here is a comment by a policeman from the IT side (originally in French, translated below). In brief, he says that the migration was not a problem for most people, except for some issues with Access. But it did cost money! And the savings were not the main argument. "Personally affected by this news (which isn't really news), I can assure you that Canonical's message is above all a commercial one... The choice of Ubuntu is due to its dominance and to the fact that it is based on Debian, which is considered very stable. The distribution is easier to maintain than most of those that were tested. '4500 postes' means '4500 gendarmerie units', i.e. the brigades you know... As for OpenOffice, the transition went fairly smoothly, except for the Access applications, which had some trouble moving over to the Base module... Most of them were rebuilt as php/mysql applications or centralized applications... Today the gendarmes, who, let me remind you, are not IT specialists but live for you (in the strictest sense, I assure you), use firefox/thunderbird and openoffice as thick clients, the rest being applications on the intranet or 'invisible' to the user. The switch to Ubuntu does not hinder their work at all, since the trio mentioned above is already known and mastered by many of my colleagues. I am not supposed to speak in place of my superiors, but personally I find the choice of Ubuntu an intelligent one, because it is a distribution that is very easy to pick up and genuinely easy to maintain for the IT specialists of which I am one... One should not forget that a more elitist distribution would have been mastered by fewer people, and maintenance would therefore have been more costly... So today we 'master' this part of our infrastructure, and the trans
  •  
    Lotus Notes doesn't run on Linux anyway...
Joris _

Japan probe overshoots Venus, heads toward sun - 0 views

  • A Japanese probe to Venus failed to reach orbit Wednesday and was captured by the sun's gravitational pull
  • Akatsuki's engines did not fire long enough to attain the proper orbiting position
  • may be able to try again when it passes by Venus six years from now.
  •  
    The usefulness of having a robust trajectory :) ... They have to wait 6 more years for another date with Venus ...
  • ...2 more comments...
  •  
I agree in general, but just from gut feeling: is there really an optimised trajectory that would be able to avoid this kind of scenario when the main thrusters don't perform properly? Wouldn't you in any case end up in a sun-orbiting trajectory and have to come back after years?
  •  
    "optimised trajectory" of course not, robust definitely! It was the subject of my paper presented at the AAS (the one in San Diego) "Designing robust interplanetary trajectories subject to one temporary engine failure". The problem here is that they do not have enough fuel for a correction maneuver that would allow to come back to Venus earlier, and break for a VOI. A robust scenario could have alloted the best amount of fuel and time to be able to recover from almost all possible unplanned events. In the paper, I introduce some confidence regions such that I get the robust control for p% chance of mission success in case m% chance of problem with the propulsion system.
  •  
You should run your method on this scenario and see if you could get a trajectory with a shorter return time using the same spacecraft.... it would be a big selling point for a new trajectory design approach
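A minimal Monte Carlo sketch of the robustness idea discussed above (this is not the method from the paper; the failure model, the retry logic and all numbers are made-up assumptions for illustration): estimate the probability of mission success for a given propellant margin when each burn can fail and be retried as long as delta-v remains.

    import random

    def mission_success_prob(dv_needed, dv_budget, p_burn_failure,
                             dv_per_attempt, n_trials=100_000):
        """Estimate P(success): each attempt delivers dv_per_attempt unless
        the engine fails (probability p_burn_failure); a failed attempt is
        assumed to waste its propellant, and retries continue while the
        delta-v budget lasts."""
        successes = 0
        for _ in range(n_trials):
            delivered, spent = 0.0, 0.0
            while delivered < dv_needed and spent + dv_per_attempt <= dv_budget:
                spent += dv_per_attempt
                if random.random() > p_burn_failure:  # burn performed nominally
                    delivered += dv_per_attempt
            successes += delivered >= dv_needed
        return successes / n_trials

    # Made-up numbers: 1.2 km/s insertion, 1.8 km/s budget, 5% failure
    # probability per burn, split into 0.3 km/s segments.
    print(mission_success_prob(1.2, 1.8, 0.05, 0.3))

Sweeping dv_budget in such a model is one way to read off how much margin buys a given p% chance of success.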
Francesco Biscani

Hot And Heavy Matter Runs A 4 Trillion Degree Fever - Science News - 1 views

  •  
    Saturday night fever.
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
Q1: For the time being, what are computers mainly used for on board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer may not be obvious, a potential study could address this very issue. For instance one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or +/- 10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like the result being probabilistic but falling within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates...
  •  
Satellites use old CPUs also because, with the trend towards higher power consumption, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all... and at what precision / power levels (a toy precision experiment is sketched after this thread).
  •  
The interest of this could be a high fault tolerance for some math operations, which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding CPU power consumption (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some ESA platforms) consumes something like 5 mW/MHz (i.e. 0.5 W at 100 MHz), so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of a terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?) Another issue is the radiation tolerance of the technology ... if the PCMOS chips are more tolerant to radiation they could be space qualified more easily.....
  •  
I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also include the "hardening" phase.
  •  
I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does it need an entirely new software/hardware stack? If so, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip with this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially by ARM boast really low power consumption levels while offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
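To make the precision question above concrete, here is a toy experiment (my own construction, not from the article): model each "probabilistic" arithmetic operation as the exact result corrupted by a bounded relative error, and watch the error accumulate in a naive dead-reckoning integration. The 1e-6 per-operation error and all trajectory numbers are assumptions for illustration.

    import random

    def noisy(x, rel_err):
        """Model a probabilistic arithmetic result: the correct value
        corrupted by a uniformly distributed relative error."""
        return x * (1.0 + random.uniform(-rel_err, rel_err))

    def dead_reckoning(n_steps, dt, rel_err):
        # Integrate a constant-velocity trajectory with noisy multiply-adds.
        x, v = 0.0, 7000.0          # m and m/s; made-up orbital-ish speed
        for _ in range(n_steps):
            x = noisy(x + noisy(v * dt, rel_err), rel_err)
        return x

    exact = dead_reckoning(86400, 1.0, 0.0)    # one day in 1 s steps
    approx = dead_reckoning(86400, 1.0, 1e-6)
    print(abs(approx - exact), "m drift after one day")

With a 1e-6 relative error per operation, the drift after a day of 1 Hz updates typically comes out at the tens-of-kilometres level, which is exactly the kind of number a study would have to compare against mission requirements.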
LeopoldS

Track changes with LaTeX - 3 views

  •  
did any of you LaTeX gurus already try this out?
  • ...1 more comment...
  •  
It's installed on my computer, but I never really used it. I think it's fine, but for my purposes latexdiff is mostly enough.
  •  
I assume that you use latexdiff from the command line ... I still have to find a nice script with which to integrate it into the TexShop GUI for Karène (something along the lines of the sketch after this thread) ...
  •  
A command line is an interface as well. I was able to explain over the phone to a (computer-wise) average, undereducated Mac user how to install and run latexdiff. Thus I think Karène can use it too...
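A minimal sketch of such a glue script, assuming latexdiff and pdflatex are on the PATH (running latexdiff on two .tex files and capturing its stdout is the tool's standard usage; the file names and the wrapper itself are illustrative assumptions):

    import subprocess, sys

    def latexdiff_pdf(old_tex, new_tex, out_tex="diff.tex"):
        """Run latexdiff on two revisions and typeset the marked-up result."""
        markup = subprocess.run(["latexdiff", old_tex, new_tex],
                                capture_output=True, text=True,
                                check=True).stdout
        with open(out_tex, "w") as f:
            f.write(markup)
        subprocess.run(["pdflatex", "-interaction=nonstopmode", out_tex],
                       check=True)

    if __name__ == "__main__":
        # e.g.: python diff.py old.tex new.tex
        latexdiff_pdf(sys.argv[1], sys.argv[2])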
Christos Ampatzis

Insectlike 'microids' might walk, run, work in colonies - 2 views

  •  
    If a microid were stepped on, it would probably just get up and walk away...
Joris _

SPACE.com -- NASA to Boost Speed of Deep Space Communications - 1 views

  • a few megabits per second might someday get as much as 600 megabits per second, if not more. That could enable far more scientific payoff per mission in the long run.
  • new communication innovations such as disruption tolerant networking
  • one of the biggest communication revolutions will come from laser-driven optical communication
    • Joris _
       
      they kind-of stole my idea ;)
Luís F. Simões

Seminar: You and Your Research, Dr. Richard W. Hamming (March 7, 1986) - 10 views

  • This talk centered on Hamming's observations and research on the question "Why do so few scientists make significant contributions and so many are forgotten in the long run?" From his more than forty years of experience, thirty of which were at Bell Laboratories, he has made a number of direct observations, asked very pointed questions of scientists about what, how, and why they did things, studied the lives of great scientists and great contributions, and has done introspection and studied theories of creativity. The talk is about what he has learned in terms of the properties of the individual scientists, their abilities, traits, working habits, attitudes, and philosophy.
  •  
    Here's the link related to one of the lunch time discussions. I recommend it to every single one of you. I promise it will be worth your time. If you're lazy, you have a summary here (good stuff also in the references, have a look at them):      Erren TC, Cullen P, Erren M, Bourne PE (2007) Ten Simple Rules for Doing Your Best Research, According to Hamming. PLoS Comput Biol 3(10): e213.
  • ...3 more comments...
  •  
I'm also pretty sure that the ones who are remembered are not the ones who tried to be... so why all these rules!? I think it's bullshit...
  •  
The seminar is not a manual on how to achieve fame, but rather an analysis of how others were able to perform very significant work. The two things are in some cases related, but the seminar's focus is on the second.
  •  
Then read a good book on the life of Copernicus; it's the anti-manual of Hamming... he broke all the rules!
  •  
honestly, I think that some of these rules actually do make sense ... but I am always curious to get a good book recommendation (which book about Copernicus would you recommend?) btw Pacome: we are in Paris ... in case you have some time ...
  •  
I warmly recommend this book, a bit old but fascinating: The Sleepwalkers by Arthur Koestler. It shows that progress in science is not straight and does not obey any rules... It is not as rational as most people seem to believe today. http://www.amazon.com/Sleepwalkers-History-Changing-Universe-Compass/dp/0140192468/ref=sr_1_1?ie=UTF8&qid=1294835558&sr=8-1 Otherwise yes, I have some time! My phone number: 0699428926. We live around Denfert-Rochereau and Montparnasse. We could go for a beer this evening?
Joris _

SPACE.com -- Venus Probe's Problems May Cause Japan to Scale Back - 0 views

  • We have to be more conservative to plan our next planetary mission, so it will never fail in any aspect."
  • the probe's initial failure will have a big impact on how JAXA plans future planetary missions
  • hew to more conservative ideas in the near future
  •  
    what a shame! ambition and innovation have not been fairly rewarded ...
  • ...1 more comment...
  •  
    Did you try to run your algorithm on their problem as Dario suggested? I'm very curious!
  •  
I didn't have time yet. But formulating the failure with an MTBF or a FIT rate, you can easily imagine a more robust solution. Instead of one single burn, you would make several smaller burns; it will take more time and require more fuel, though (a toy model of this trade-off is sketched after this thread). Another "robust" approach is to consider weak stability boundary capture. Again it takes time, but the chances of failure are lessened.
  •  
    would be a pity indeed!
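A toy model of the burn-splitting idea above (my own assumptions, not from the thread): treat engine failures as a Poisson process characterised by an MTBF, so a burn of duration t completes with probability exp(-t/MTBF), and assume a failed segment is transient and can be retried at a later opportunity. Splitting one long burn into several short retriable segments then raises the overall success probability, at the cost of time and extra propellant.

    import math

    def p_single_burn(t_burn, mtbf):
        # Poisson failure model: the burn succeeds iff no failure
        # occurs during it.
        return math.exp(-t_burn / mtbf)

    def p_split_burns(t_burn, mtbf, n_segments, retries_per_segment):
        # Each segment of length t_burn/n_segments may be retried a few
        # times; assumes failures are transient and independent.
        p_seg = math.exp(-(t_burn / n_segments) / mtbf)
        p_seg_ok = 1 - (1 - p_seg) ** (1 + retries_per_segment)
        return p_seg_ok ** n_segments

    t, mtbf = 12.0, 60.0                     # minutes; made-up values
    print(p_single_burn(t, mtbf))            # ~0.82
    print(p_split_burns(t, mtbf, 4, 2))      # ~0.9995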
Luís F. Simões

Polynomial Time Code For 3-SAT Released, P==NP - Slashdot - 0 views

  • "Vladimir Romanov has released what he claims is a polynomial-time algorithm for solving 3-SAT. Because 3-SAT is NP-complete, this would imply that P==NP. While there's still good reason to be skeptical that this is, in fact, true, he's made source code available and appears decidedly more serious than most of the people attempting to prove that P==NP or P!=NP. Even though this is probably wrong, just based on the sheer number of prior failures, it seems more likely to lead to new discoveries than most. Note that there are already algorithms to solve 3-SAT, including one that runs in time (4/3)^n and succeeds with high probability. Incidentally, this wouldn't necessarily imply that encryption is worthless: it may still be too slow to be practical."
  •  
    here we go again...
  •  
    slashdot: "Russian computer scientist Vladimir Romanov has conceded that his previously published solution to the '3 SAT' problem of boolean algebra does not work."
Luís F. Simões

Shell energy scenarios to 2050 - 6 views

  •  
    just in case you were feeling happy and optimistic
  • ...7 more comments...
  •  
    An energy scenario published by an oil company? Allow me to be sceptical...
  •  
Indeed, Shell has been an energy company, not just an oil company, for some time now ... The two scenarios are, in their approach, dependent on the economic and political situation, which is right now impossible to forecast. The reference to Kyoto is surprising, almost outdated! But overall I find it rather optimistic at some stages, and the timeline (p37-39) is probably unlikely given recent events.
  •  
the report was published in 2008, which explains the reference to Kyoto, as the follow-up to it was much more uncertain at that point. The Blueprints scenario is indeed optimistic, but also quite unlikely I'd say. I don't see humanity suddenly becoming so wise and coordinated. Sadly, I see something closer to the Scramble scenario as much more likely to occur.
  •  
not an oil company??? please have a look at the percentage of their revenues coming from oil and gas, and then compare this with all their other energy activities together, and you will see very quickly that it is only window dressing ... they are an oil and gas company ... and nothing more
  •  
not JUST oil. From a description: "Shell is a global group of energy and petrochemical companies." Of course the revenues coming from oil are the biggest; the investment turnover on other energy sources is small for now. Knowing that most of their revenue comes from a finite resource, they invest elsewhere to guarantee their future. They have invested >$1b in renewable energy, including biofuels. They had the largest wind power business among the so-called "oil" companies. Oil only defines what they do "best". As a comparison: some time ago Apple was selling only computers, and now they sell phones. But I would not say Apple is just a phone company.
  •  
window dressing only ... e.g. for 2008:
- Net cash from operating activities (pre-tax): $70 billion
- Net income: $26 billion
- Revenues: $88 billion
Their investments in and revenues from renewables don't even show up in their annual financial reports, probably because they fall under the heading of "marketing", which is already $1.7 billion ... This is what they report on their investments:
"Capital investment, portfolio actions and business development. Capital investment in 2009 was $24 billion. This represents a 26% decrease from 2008, which included over $8 billion in acquisitions, primarily relating to Duvernay Oil Corp. Capital investment included exploration expenditure of $4.5 billion (2008: $11.0 billion).
- In Abu Dhabi, Shell signed an agreement with Abu Dhabi National Oil Company to extend the GASCO joint venture for a further 20 years.
- In Australia, Shell and its partners took the final investment decision (FID) for the Gorgon LNG project (Shell share 25%). Gorgon will supply global gas markets to at least 2050, with a capacity of 15 million tonnes (100% basis) of LNG per year and a major carbon capture and storage scheme. Shell has announced a front-end engineering and design study for a floating LNG (FLNG) project, with the potential to deploy these facilities at the Prelude offshore gas discovery in Australia (Shell share 100%).
- In Australia, Shell confirmed that it has accepted Woodside Petroleum Ltd.'s entitlement offer of new shares at a total cost of $0.8 billion, maintaining its 34.27% share in the company; $0.4 billion was paid in 2009 with the remainder paid in 2010.
- In Bolivia and Brazil, Shell sold its share in a gas pipeline and in a thermoelectric power plant and its related assets for a total of around $100 million.
- In Canada, the Government of Alberta and the national government jointly announced their intent to contribute $0.8 billion of funding towards the Quest carbon capture and sequestration project. Quest, which is at the f
  •  
thanks for the info :) They still have their 50% share in the wind farm in the Noordzee (you can see it from ESTEC on a clear day). Look for Shell International Renewables and the other subsidiaries and joint ventures. I guess the report is about the oil branch. http://sustainabilityreport.shell.com/2009/servicepages/downloads/files/all_shell_sr09.pdf http://www.noordzeewind.nl/
  •  
no - it's about Shell globally - all of Shell ... these participations are just peanuts. Please read the intro of the CEO in the pdf you linked to: he does not even mention renewables! Their entire sustainability strategy is about oil and gas - just making it (look) nicer and more environmentally friendly.
  •  
Fair enough; for me even peanuts are worthy, and I am not able to judge. Not all big-profit companies like Shell are evil :( Look in the pdf at what is in the upstream and downstream segments you mentioned above. Non-Shell sources, for examples and more objectivity: http://www.nuon.com/company/Innovative-projects/noordzeewind.jsp http://www.e-energymarket.com/news/single-news/article/ferrari-tops-bahrain-gp-using-shell-biofuel.html thanks.
LeopoldS

Tilera Corporation - 2 views

  •  
who wants 100 cores ... the future of PaGMO?
  • ...2 more comments...
  •  
Well, nVidia provides 10,000 "cores" in a single rack with their Teslas...
  •  
    remember that you were recommending its purchase already some time ago ... still strong reasons to do so?
  •  
The problem with this flurry of activity today regarding multicore architectures is that it is really unclear which one will be the winner in the long run. Never underestimate the power of inertia, especially in the software industry (after all, people are still programming in COBOL and Fortran today). For instance, NVIDIA gives you the Teslas with 10,000 cores, but then you have to rewrite extensive parts of your code in order to take advantage of this. Is this an investment worth undertaking? Difficult to say; it would certainly be if the whole software world moved in that direction (which is not happening - yet?). But then you have other approaches coming out, such as the Cell processor by IBM (the one in the PS3), which has really impressive floating-point performance and, of course, a completely different programming model. The nice thing about this Tilera processor seems to be that it is a general-purpose processor, which may not require extensive re-engineering of existing code (but I'm really hypothesizing, since the technical details are not very abundant on their website).
  •  
Moreover, the PaGMO computation model leans more towards systems with distributed memory than shared memory (i.e. multi-core). In the latter, at a certain point memory access becomes the bottleneck. (A toy island-model sketch follows this thread.)
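A toy sketch of the distributed-memory, island-model style of parallelism used in PaGMO (this is not PaGMO's API; the sphere-function problem, the mutation scheme and the migration policy are all illustrative assumptions): each island evolves its own population in a separate process, and islands only exchange a few "migrants" between epochs, so no memory is shared.

    import random
    from multiprocessing import Pool

    def evolve_island(args):
        """One island: a few generations of mutation-only search on a toy
        sphere-function minimisation; returns the island champion."""
        pop, generations = args
        for _ in range(generations):
            parent = min(pop, key=lambda x: sum(v * v for v in x))
            child = [v + random.gauss(0, 0.1) for v in parent]
            worst = max(range(len(pop)),
                        key=lambda i: sum(v * v for v in pop[i]))
            pop[worst] = child                 # replace the worst individual
        return min(pop, key=lambda x: sum(v * v for v in x))

    if __name__ == "__main__":
        islands = [([[random.uniform(-5, 5) for _ in range(3)]
                     for _ in range(10)], 200) for _ in range(4)]
        with Pool(4) as pool:
            for _ in range(5):                 # migration epochs
                champions = pool.map(evolve_island, islands)
                best = min(champions, key=lambda x: sum(v * v for v in x))
                for pop, _ in islands:         # migrate the best into each island
                    pop[0] = list(best)
        print(best)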
LeopoldS

Google Code Blog: Introducing Closure Tools - 1 views

  •  
new open source tool from Google .... Francesco: of any interest to us?
  •  
I don't think so; it is just a code optimizer for JavaScript. Unless there are big JavaScript (web 2.0) applications running somewhere, it is not of much interest for us. Other Google Labs systems, e.g. FriendConnect, could be useful for Ariadnet; maybe also the visualization and social graph APIs.
LeopoldS

Knowledge, networks and nations | Royal Society - 4 views

  •  
    nice graphs ... and nice stats
  • ...1 more comment...
  •  
    the graphs are Motion Charts. They were made famous by Hans Rosling's TED talks (http://www.ted.com/speakers/hans_rosling.html). Google eventually bought his software, and made part of it freely available: http://code.google.com/apis/visualization/documentation/gallery/motionchart.html. That's what they are using there.
  •  
thanks - I had already wondered several times what had happened to this technique, which he used in the talk we watched several times when it was first uploaded ... good that they have made it available! are they easy to use?
  •  
the easiest way to use them is: Google Docs > open/create a spreadsheet > Insert > Gadget > Charts > Motion Chart!! :) There is a tutorial here describing all the steps to get it running.
jmlloren

Scientists discover how to turn light into matter after 80-year quest - 5 views

  •  
Theorized 80 years ago, Breit-Wheeler pair production turns two photons into an electron-positron pair (via a virtual electron). It is a relatively simple Feynman diagram, but the problem is/was how to produce a high-energy photon-photon collider in practice... The collider experiment that the scientists have proposed involves two key steps. First, the scientists would use an extremely powerful high-intensity laser to speed up electrons to just below the speed of light. They would then fire these electrons into a slab of gold to create a beam of photons a billion times more energetic than visible light. The next stage of the experiment involves a tiny gold can called a hohlraum (German for 'empty room'). Scientists would fire a high-energy laser at the inner surface of this gold can, to create a thermal radiation field, generating light similar to the light emitted by stars. They would then direct the photon beam from the first stage of the experiment through the centre of the can, causing the photons from the two sources to collide and form electrons and positrons. It would then be possible to detect the formation of the electrons and positrons when they exited the can. Now this is a good experiment... :)
  • ...6 more comments...
  •  
    The solution of thrusting in space.
  •  
    Thrusting in space is solved already. Maybe you wanted to say something different?
  •  
Thrusting until your fuel runs out is solved; this way one could produce mass directly from, among other sources, solar/stellar energy. What I like about this experiment is that we already have the technology to do it; many parts have been designed for inertial confinement fusion.
  •  
I am quite certain that it would be more efficient to use the photons directly for thrust instead of converting them into matter. Also, I am a bit puzzled by the asymmetric layout for photon creation. Typically, colliders use two beams of particles with equal but opposite momenta. Because the total momentum of the two colliding particles is zero, the reaction products are produced more efficiently, as a minimum of collision energy is wasted on accelerating the products. I guess in this case the thermal radiation in the cavity is chosen instead of an opposing gamma-ray beam to increase the photon density and the number of collisions (even if the efficiency decreases because of the asymmetry). However, a danger of using a high-temperature cavity might be that thermionic emission creates lots of free electrons within the cavity. This could reduce the positron yield through recombination and would allow the highly energetic photons to lose energy through Compton scattering instead of Breit-Wheeler pair production.
  •  
Well, the main benefit of e-p pair creation might be that one can subsequently accelerate these to higher energies again. I think the photon-photon cross-section is extremely low, such that direct beam-beam interactions are basically not happening (below 1/20... so basically 0 according to quantum probability :P). In this way, the central line of the hohlraum actually has a very high photon density and, if timed correctly, maximizes the reaction yield such that it could be measured.
  •  
I agree about the reason for the hohlraum - but I also keep my reservations about the drawbacks. About pair production as fuel: I'm pretty sure that your energy would be used more smartly by using photons (not necessarily high-energy photons) for thrust directly, instead of putting tons of energy into creating a rest mass and then accelerating it. If you look at E² = (pc)² + (m0 c)², then putting energy into the mass term will always reduce your maximum value of p.
  •  
True, but isn't it E² = (pc)² + (m0 c²)², such that for photons E ∝ pc and for mass E ∝ m c²? I agree it will take a lot of energy, but let's assume that won't be the problem. The question therefore is whether the mass flow of the photon rocket (fuel consumed to create photons, e.g. by fission/fusion) is higher or lower than the mass flow for e-p creation. You are probably right that the low e-p cross-section will favour direct use of photons to create low thrust for long periods of time, but with significant power available the I_SP might be higher for e-p pair creation.
  •  
In essence the equation tells you that for photons with zero rest mass m0, all the energy is converted into momentum of the particles. If you want to accelerate e-p pairs, then you first spend part of the energy on creating them (~511 keV each) and you can only use the remaining energy to accelerate them. In this case the equation gives you a lower particle momentum, which leads to lower thrust (even when assuming 100% acceleration efficiency); the short derivation after this thread makes this explicit. I_SP is a tricky concept in this case because there are different definitions which clash in the relativistic context (due to the concept of mass flow). R. Tinder arrives at an I_SP = c (the speed of light) for a photon rocket (using the relativistic mass of the photons), which is the maximum possible relativistic I_SP: http://goo.gl/Zz5gyC .
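A short worked version of the momentum argument above (standard relativistic kinematics; the algebra is mine, not from the thread): for a fixed energy budget E, massless exhaust always carries the most momentum.

    E^2 = (pc)^2 + (m_0 c^2)^2

    \text{photons } (m_0 = 0):\qquad p_\gamma = \frac{E}{c}

    \text{e}^-\text{e}^+ \text{ pair } (2 m_e c^2 \approx 1.022\ \mathrm{MeV}):\qquad
    p_\pm = \frac{\sqrt{E^2 - (2 m_e c^2)^2}}{c} \;<\; \frac{E}{c}

So the pair's momentum is strictly below the photons' E/c, approaching it only when E >> 2 m_e c², which is the sense in which energy spent on rest mass reduces the achievable p.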
anonymous

OpenBCI - 5 views

  •  
    "The OpenBCI Board is a versatile and affordable analog-to-digital converter that can be used to sample electrical brain activity (EEG), muscle activity (EMG), heart rate (EKG), and more" Perhaps some work or ideas on brainwave analysis would be interesting ? (User interfaces, mood classifier, detection of various alertness levels )
  • ...1 more comment...
  •  
let's get one? And then link it to the Oculus Rift to control it with my brain... I want to think about running on Mars and then be doing it :)
  •  
It's not worth it for $400... The chips are seriously nothing special, and you can get a lot better for a lot cheaper. I would just get the electrodes and link them to an RPi or an Odroid or something.
  •  
True, but the selling feature here is that they take care of that stuff and sell it for $400. Let's say the hardware is $100; then an RF-grade person here has to do the coding, interfacing and testing within roughly 20 hours (300 / 16 EUR per hour) to break even, and even then the interface is much nicer in their case.