
Home/ Advanced Concepts Team/ Group items tagged Modern



Stephen Hawking: 'There are no black holes' : Nature News & Comment - 1 views

  •  
    Event Horizon - a modern myth?
  •  
    GR is valid on large scales and is, therefore, a simplification of the unknown GUT. As such, the mathematical solutions obtained in GR are, strictly speaking, valid only within GR. Certainly, the solution called a black hole is an extremely heavy object and at the same time extremely small - a point without geometrical extension. The latter is heavily in conflict with the validity range of the underlying theory and, hence, makes lots of people (including experts, unlike me) question the concept of black holes, despite the fact that something has been "observed" which fits this concept. Regarding the movie: Event Horizon might be a myth, but it emphasizes what Sante said in one of his presentations: don't use a black hole for travelling, take the wormhole instead. The constructor of the Event Horizon created a black hole without considering that the damn thing has no exit... where did he think the Event Horizon would end?
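The "extremely heavy and extremely small" point above is the singularity; the event horizon itself sits at a finite radius, r_s = 2GM/c^2. A quick back-of-the-envelope sketch (standard constants, rounded; the 10-solar-mass example is just illustrative):

```python
# Schwarzschild radius r_s = 2GM/c^2: the horizon radius for a given mass.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius in metres for a given mass."""
    return 2 * G * mass_kg / c**2

# A 10-solar-mass stellar black hole: the horizon sits at roughly 30 km,
# while the singularity inside is the point-like object the comment refers to.
print(f"{schwarzschild_radius(10 * M_sun) / 1e3:.1f} km")
```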

Insect flight dynamics: Stability and control - 2 views

  •  
    A recently published review on insect flight appeared in Review of Modern Physics. It might be of interest to the biomimetics unit.

InfoQ: A Crash Course in Modern Hardware - 3 views

  •  
    for Francesco ;) though I guess he knows it all already, so for the others who want to know too
  •  
    Cool, lots of useful info in there. Though, never having programmed in Java before, I wonder if one can go that low-level in Java?
  •  
    oh I don't think so but it is interesting for the JVM I guess

Sean Gourley on the mathematics of war | Video on TED.com - 0 views

  •  
    By pulling raw data from the news and plotting it onto a graph, Sean Gourley and his team have come up with a stunning conclusion about the nature of modern war -- and perhaps a model for resolving conflicts. - really interesting
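The conclusion Gourley reports is that attack sizes in modern conflicts follow a power law with an exponent near 2.5. A minimal sketch of how such an exponent is estimated: synthetic data plus the standard maximum-likelihood estimator (Clauset-Shalizi-Newman). The sample generator and numbers are illustrative, not his team's actual pipeline:

```python
import math
import random

def sample_power_law(alpha, xmin, n, seed=0):
    """Draw n samples from a continuous power law p(x) ~ x^-alpha, x >= xmin,
    via inverse-transform sampling."""
    rng = random.Random(seed)
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def fit_alpha(xs, xmin):
    """Maximum-likelihood estimate of the power-law exponent."""
    return 1 + len(xs) / sum(math.log(x / xmin) for x in xs)

# Synthetic "attack severity" data with the ~2.5 exponent reported for modern wars
data = sample_power_law(2.5, 1.0, 50_000)
print(round(fit_alpha(data, 1.0), 2))
```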

Critical phenomena in microgravity: Past, present, and future - 0 views

  •  
    This review provides an overview of the progress in using the low-gravity environment of space to explore critical phenomena and test modern theoretical predictions.

Vacuum tubes are back - in nano form - 0 views

  •  
    Although vacuum tubes were the basic components of early electronic devices, by the 1970s they were almost entirely replaced by semiconductor transistors. They are now back in nano form as "nanoscale vacuum channel transistors" that combine the best of vacuum tubes and modern semiconductors in a single device. This old technology with a new twist could be useful for space applications due to a broader operational temperature range and better radiation resilience - the authors are with NASA Ames.

Weather patterns on Exoplanet detected - 1 views

  •  
    So it took us 70% of the time Earth has been in the habitable zone to develop - would this be normal, or could it be much faster? In other words, would all forms of life that started on a planet which originated at a 'similar' point in time to ours be equally far developed?
  •  
    That is actually quite tricky to estimate, really. If for no other reason than the fact that all of the mass extinctions over Earth's history basically reset the evolutionary clock. Assuming two Earths identical in every way, except that one did not have the dinosaur wipe-out impact, that would've given the non-impact Earth 60 million years to evolve a potential intelligent dinosaur super-race.
  •  
    The opposite might be true - or might not be ;-). Since the rate of evolution usually increases after major extinction events, the chance of producing 'intelligent' organisms is higher if these events happen quite frequently. Usually the period of rapid evolution lasts only a few million years - so Earth is going quite slow. Certainly extinction events don't reset the evolutionary clock - if they had never happened, Earth's gene pool would probably be quite primitive. By the way: dinosaurs were a quite diverse group, and large dinosaurs might well have had cognitive abilities close to those of whales or primates - the difference from us might be that we have hands to manipulate our environment and vocal cords to communicate in very diverse ways. Modern dinosaur descendants, i.e. birds, include some very intelligent species - especially with respect to their body size and weight.

Nature podcast - Music and the making of science - 2 views

  •  
    "Is music simply a pleasant accompaniment to thought, or a driving force behind it? This show examines music's influence on the development of modern science and the foundations of acoustics." See also http://www.nature.com/news/strike-a-chord-1.17127

Animal brains connected up to make mind-melded computer - 2 views

  •  
    Parallel processing in computing --- Brainet. The team sent electrical pulses to all four rats and rewarded them when they synchronised their brain activity. After 10 training sessions, the rats were able to do this 61 per cent of the time. This synchronous brain activity can be put to work as a computer to perform tasks like information storage and pattern recognition, says Nicolelis. "We send a message to the brains, the brains incorporate that message, and we can retrieve the message later," he says. Dividing the computing of a task between multiple brains is similar to sharing computations between multiple processors in modern computers. "If you could collaboratively solve common problems [using a brainet], it would be a way to leverage the skills of different individuals for a common goal."

Research Blog: Inceptionism: Going Deeper into Neural Networks - 0 views

  •  
    Deep neural networks "dreaming" psychedelic images
  •  
    Although that's not technically correct. The networks don't actually generate the images; rather, the features that get triggered in the network get amplified through some heuristic. Still fun though.
  •  
    Now in real time: http://www.twitch.tv/317070
  •  
    Yes, true for the later images, but for the first images they start with random noise and a 'natural image' prior, no? But I guess calling it "hallucinating" might have been more accurate ;)
  •  
    Funny how representation errors in NNs suddenly become art. God.... neo-post-modernism.

helium discussion - 0 views

  •  
    This link has a nice concise explanation of helium escaping the atmosphere: http://www.astronomynotes.com/solarsys/s3.htm (this link describes the mechanics of particles escaping the atmosphere, including escape velocity and thermal and non-thermal processes). http://www.springerlink.com/content/k094u75188h64516/fulltext.pdf (and if you are really interested, this paper discusses helium in the atmosphere - production and loss - in much more detail).
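The core of the thermal-escape argument in the first link is comparing a gas's mean thermal speed with the planet's escape velocity. A rough sketch with standard constants and an assumed exospheric temperature of ~1000 K (the 1/6 threshold is the usual rule of thumb for escape over geologic time):

```python
import math

G, k_B = 6.674e-11, 1.381e-23     # SI units
M_earth, R_earth = 5.972e24, 6.371e6
m_He = 4.003 * 1.661e-27          # helium atom mass, kg
T_exo = 1000.0                    # rough exospheric temperature, K

v_esc = math.sqrt(2 * G * M_earth / R_earth)          # escape velocity, m/s
v_th = math.sqrt(8 * k_B * T_exo / (math.pi * m_He))  # mean thermal speed, m/s

# Rule of thumb: a gas leaks away over geologic time once its mean thermal
# speed exceeds roughly 1/6 of the escape velocity (the fast Maxwell tail
# does the rest).
print(f"v_esc = {v_esc/1e3:.1f} km/s, v_th(He) = {v_th/1e3:.1f} km/s")
print("helium escapes:", v_th > v_esc / 6)
```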

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  •  
    Q1: For the time being, for what purposes are computers mainly used on-board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance, one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or deviates +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like: the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates...
  •  
    Satellites use old CPUs also because, with the trend of going for higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence, the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at a level of (say) 10^-6. All in all you are right: a first study should assess for what applications this would be useful at all... and at what precision / power levels.
  •  
    The interest of this could be a high fault tolerance for some math operations, which would simplify the job of coders! I don't think this is a good idea regarding CPU power consumption (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance, a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s... would a higher velocity be possible with faster processors?) Another issue is the radiation tolerance of the technology... if PCMOS is more tolerant to radiation, it could be space-qualified more easily.
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does it need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip with this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
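To make the precision discussion above concrete, here is a toy model of probabilistic arithmetic: an adder whose cheap low-order bits are unreliable, in the spirit of (but in no way an implementation of) the PCMOS idea. The bit widths and flip probability are made-up illustration values:

```python
import random

def noisy_add(a, b, n_bits=32, p_flip=0.2, low_bits=8, rng=random):
    """Exact adder whose low-order result bits each flip with probability
    p_flip -- a toy stand-in for a probabilistic circuit that trades the
    correctness of cheap bits for energy."""
    s = (a + b) % (1 << n_bits)
    for i in range(low_bits):
        if rng.random() < p_flip:
            s ^= 1 << i
    return s

rng = random.Random(42)
trials = [(rng.randrange(1 << 29, 1 << 30), rng.randrange(1 << 29, 1 << 30))
          for _ in range(10_000)]
errs = [abs(noisy_add(a, b, rng=rng) - (a + b)) / (a + b) for a, b in trials]
# The relative error stays tiny because only the cheap low-order bits are
# unreliable -- possibly tolerable for, say, coarse navigation.
print(f"max relative error: {max(errs):.1e}")
```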

Pi Computation Record - 4 views

  •  
    For Dario: the PI computation record was established on a single desktop computer using a cache optimized algorithm. Previous record was obtained by a cluster of hundreds of computers. The cache optimized algorithm was 20 times faster.
  •  
    Teeeeheeeeheeee... assembler programmers greet Java/Python/Etc. programmers :)
  •  
    And he seems to have done everything in his free time!!! I like the first FAQ.... "why did you do it?"
  •  
    did you read any of the books he recommends? He suggests: Modern Computer Arithmetic by Richard Brent and Paul Zimmermann, version 0.4, November 2009 (full text available online), and The Art of Computer Programming, volume 2: Seminumerical Algorithms by Donald E. Knuth, Addison-Wesley, third edition, 1998.
  •  
    btw: we will very soon have the very same processor in the new iMac .... what record are you going to beat with it?
  •  
    Zimmermann is the same guy behind the MPFR multiprecision floating-point library, if I recall correctly: http://www.mpfr.org/credit.html I've not read the book... Multiprecision arithmetic is a huge topic though, at least from the scientific and number-theory point of view, if not for its applications to engineering problems. "The Art of Computer Programming" is probably the closest thing to a bible for computer scientists :)
  •  
    "btw: we will very soon have the very same processor in the new iMac .... what record are you going to beat with it?" Fastest Linux install on an iMac :)
  •  
    "Fastest Linux install on an iMac :)" that is going to be a tough one but a worthy aim! ""The Art of Computer Programming" is probably the closest thing to a bible for computer scientists :)" yep! Programming is art ;)
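For the curious: the record computations are based on the Chudnovsky series, where each term contributes about 14 digits. A minimal, deliberately un-optimised Decimal version (nothing like the cache-tuned binary-splitting code that set the record, which is where the 20x factor mentioned above comes from):

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Direct Chudnovsky summation: pi = 426880*sqrt(10005) / sum(M_k*L_k/X_k).
    Each term adds roughly 14 correct digits."""
    getcontext().prec = digits + 10
    C = 426880 * Decimal(10005).sqrt()
    M, L, X = 1, 13591409, 1
    S = Decimal(13591409)
    for k in range(1, digits // 14 + 2):
        # Standard integer recurrences for the series terms
        M = M * (12 * k - 10) * (12 * k - 6) * (12 * k - 2) // (k ** 3)
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
    return +(C / S)

print(str(chudnovsky_pi(50))[:12])  # -> 3.1415926535
```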

Nuclear winter revisited with a modern climate model and current nuclear arsenals: Stil... - 0 views

  •  
    good to know ...

The challenges of Big Data analysis @NSR_Family - 2 views

  •  
    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This paper gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures.
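Of the challenges listed, "spurious correlation" is easy to demonstrate: with many more noise features than samples, some feature always correlates strongly with the outcome purely by chance. A small illustration (the sample size and dimension here are arbitrary):

```python
import math
import random

def corr(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

rng = random.Random(0)
n, p = 50, 2000                      # small sample, high dimension
y = [rng.gauss(0, 1) for _ in range(n)]
# Every feature is pure noise, independent of y -- yet the best of the
# 2000 still shows a sizeable correlation with it.
best = max(abs(corr([rng.gauss(0, 1) for _ in range(n)], y)) for _ in range(p))
print(f"max |corr| among {p} pure-noise features: {best:.2f}")
```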

Is increased light exposure from screens and phones bad for your health? @Wired - 1 views

  •  
    As Stevens says in the new article, researchers now know that increased nighttime light exposure tracks with increased rates of breast cancer, obesity and depression. Correlation isn't causation, of course, and it's easy to imagine all the ways researchers might mistake those findings. The easy availability of electric lighting almost certainly tracks with various disease-causing factors: bad diets, sedentary lifestyles, exposure to the array of chemicals that come along with modernity. Very difficult to prove causation, I would think, but there are known relationships between hormone levels and light.
  •  
    There is actually a Windows program called f.lux that changes the colour temperature of your screen to match natural light cycles. When the sun sets, it switches to a "warmer", more reddish tint to promote sleepiness. The typical bright blue / neon white setting of most PC screens is quite "awakening" and keeps your brain running for longer, which impacts your sleeping patterns, with all the consequences of that. Amazingly, this f.lux thing does have an effect. That being said, I wouldn't be too quick to blame it all on PC/artificial lighting time. Sedentary lifestyles, etc. can very well place one in a position of long-term PC/phone usage, so it's quite hard to draw a causal link.
  •  
    nice - it also exists for Mac btw: https://justgetflux.com/news/pages/mac/
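The sunset-triggered tint change described above can be sketched as a simple schedule from hour of day to screen colour temperature. The Kelvin values and transition times here are illustrative guesses, not f.lux's actual defaults:

```python
def screen_temperature(hour, day_k=6500, night_k=3400, sunrise=7, sunset=19):
    """Hypothetical f.lux-style schedule: full daylight colour temperature
    between sunrise and sunset, warm at night, 1-hour linear transitions."""
    if sunrise <= hour < sunset - 1:
        return day_k
    if sunset - 1 <= hour < sunset:
        frac = hour - (sunset - 1)          # fade towards warm over one hour
        return day_k + (night_k - day_k) * frac
    if sunrise - 1 <= hour < sunrise:
        frac = hour - (sunrise - 1)         # fade back to daylight
        return night_k + (day_k - night_k) * frac
    return night_k

assert screen_temperature(12) == 6500       # midday: bright, bluish
assert screen_temperature(23) == 3400       # night: warm, reddish
```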

The Nanodevice Aiming to Replace the Field Effect Transistor - 2 views

  •  
    very nice! "For a start, the wires operate well as switches that by some measures compare well to field effect transistors. For example they allow a million times more current to flow when they are on compared with off when operating at a voltage of about 1.5 V. "[A light effect transistor] can replicate the basic switching function of the modern field effect transistor with competitive (and potentially improved) characteristics," say Marmon and co. But the wires also have entirely new capabilities. The device works as an optical amplifier and can also perform basic logic operations by using two or more laser beams rather than one. That's something a single field effect transistor cannot do."
  •  
    The good thing about using CdSe NWs (used here) is that they show a photon-to-current efficiency window around the visible wavelengths, therefore any visible light can in principle be used in this application to switch the transistor on/off. I don't agree with the motto "Nanowires are also simpler than field effect transistors and so they're potentially cheaper and easier to make." Yes, they are simple, yet for applications, fabricating devices with them consistently is very challenging (the research effort being not cheap at all...) and calls for improvements and breakthroughs in the fabrication process.
  •  
    any idea how they shine the light selectively onto such small surfaces?
  •  
    "Illumination sources consisted of halogen light, 532.016, 441.6, and 325 nm lasers ported through a Horiba LabRAM HR800 confocal Raman system with an internal 632.8 nm laser. Due to limited probe spacing for electrical measurements, all illumination sources were focused through a 50x long working distance (LWD) objective lens (N.A. = 0.50), except 325 nm, which went through a 10x MPLAN objective lens (N.A. = 0.25)." Laser spot size calculated from optical diffraction formula 1.22*lambda/NA
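Plugging the quoted numbers into that diffraction formula gives the spot sizes involved (the wavelength/NA pairings follow the quote; the broadband halogen source is left out):

```python
# Diffraction-limited spot diameter d = 1.22 * lambda / NA, the formula
# cited in the comment above for estimating the illuminated area.
sources = {           # wavelength (m) -> numerical aperture of objective
    532.016e-9: 0.50,   # through the 50x LWD objective (N.A. = 0.50)
    441.6e-9: 0.50,
    325e-9: 0.25,       # UV line through the 10x MPLAN objective (N.A. = 0.25)
}
for lam, na in sources.items():
    d = 1.22 * lam / na
    print(f"{lam * 1e9:7.1f} nm, NA={na:.2f}: spot ~ {d * 1e6:.2f} um")
```

So the focused spots are on the order of a micron: much larger than a nanowire's diameter, but comparable to its length, which is what makes selective illumination feasible.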

Calling Bullshit - 2 views

  •  
    A college course at University of Washington on "Calling Bullshit". We should invite them to give a lunch lecture at ESA... Our aim in this course is to teach you how to think critically about the data and models that constitute evidence in the social and natural sciences. While bullshit may reach its apogee in the political domain, this is not a course on political bullshit. Instead, we will focus on bullshit that comes clad in the trappings of scholarly discourse. Our learning objectives are straightforward. After taking the course, you should be able to:
    * Remain vigilant for bullshit contaminating your information diet.
    * Recognize said bullshit whenever and wherever you encounter it.
    * Figure out for yourself precisely why a particular bit of bullshit is bullshit.
    * Provide a statistician or fellow scientist with a technical explanation of why a claim is bullshit.
    * Provide your crystals-and-homeopathy aunt or casually racist uncle with an accessible and persuasive explanation of why a claim is bullshit.
    We will be astonished if these skills do not turn out to be among the most useful and most broadly applicable of those that you acquire during the course of your college education.
  •  
    love it: "Politicians are unconstrained by facts. Science is conducted by press release. Higher education rewards bullshit over analytic thought. Startup culture elevates bullshit to high art. Advertisers wink conspiratorially and invite us to join them in seeing through all the bullshit - and take advantage of our lowered guard to bombard us with bullshit of the second order. The majority of administrative activity, whether in private business or the public sphere, seems to be little more than a sophisticated exercise in the combinatorial reassembly of bullshit. We're sick of it. It's time to do something, and as educators, one constructive thing we know how to do is to teach people. So, the aim of this course is to help students navigate the bullshit-rich modern environment by identifying bullshit, seeing through it, and combating it with effective analysis and argument."