Advanced Concepts Team / Group items tagged: system

Thijs Versloot

Relativistic rocket: Dream and reality - 3 views

  •  
    An exhaustive overview of all possible advanced rocket concepts, e.g.: "As an example, consider a photon rocket with a launch mass of, say, 1000 t moving with a constant acceleration a = 0.1 g = 0.98 m/s^2. The flux of photons with E_γ = 0.5 MeV needed to produce this acceleration is ~10^27/s, which corresponds to an efflux power of 10^14 W and a rate of annihilation events N'_a ~ 5×10^26 s^-1 [47]. This annihilation rate in the ambiplasma corresponds to a current of ~10^8 A and a linear density N ~ 2×10^18 m^-1, thus any hope for a non-relativistic relative velocity of electrons and positrons in the ambiplasma is groundless." And even if it would work, one of the major issues is going to be heat dispersal: "For example, if the temperature of the radiator is chosen as T = 1500 K, the emitting area should be not less than 1000 m^2 for Pb = 1 GW, not less than 1 km^2 for Pb = 1 TW, and ~100 km^2 for Pb = 100 TW, assuming ε = 0.5 and δ = 0.2. A lower temperature would require an even larger radiator area to keep the outer temperature of the engine section stable for a given thermal power of the reactor."
  • ...2 more comments...
  •  
    We were also discussing a while ago a propulsion system using the relativistic fragments from nuclear fission. That would also produce an extremely high Isp (>100,000 s) with fairly high thrust. It never really got any traction, though.
  •  
    I absolutely do not see the point in a photon rocket. Certainly, the high-energy-releasing nuclear processes (annihilation, fusion, ...) should rather be used to heat up some fluid to the plasma state and accelerate it via a magnetic nozzle. This would surely work as a door-opener to our solar system... and, by the way, minimize the heat-disposal problem if regenerative cooling is used.
  •  
    The problem is not achieving a high energy density; that we can already do with nuclear fission. The question, however, is how to confine or harness this power with relatively high efficiency, low waste heat, and at a not too crazy specific mass. I see magnetic confinement as a possibility, yet still decades away, and also an all-or-nothing method, as we cannot easily scale this up from a test experiment to a full-scale system. It might be possible to extract power from such a plasma, but definitely well below breakeven, so an additional power supply is needed. The fission fragments circumvent these issues by a more brute-force approach, wasting a lot of energy for sure, but in the end probably providing more Isp and thrust.
  •  
    Sure. However, the annihilation-based photon rocket concept combines almost all relevant drawbacks if we speak about solar-system scales, making itself obsolete... it is just an academic test case.
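The figures quoted in the first comment can be sanity-checked from photon momentum (p = E/c) and the Stefan-Boltzmann law; a minimal sketch, assuming (as in the quote) that a fraction δ = 0.2 of the beam power ends up as waste heat radiated at emissivity ε = 0.5:

```python
SIGMA = 5.670e-8              # Stefan-Boltzmann constant, W/(m^2 K^4)
C     = 2.998e8               # speed of light, m/s
E_PH  = 0.5e6 * 1.602e-19     # 0.5 MeV photon energy, J

# Photon flux needed to accelerate a 1000 t rocket at 0.1 g:
# each photon carries momentum E/c, so thrust F = flux * E_PH / C.
mass   = 1.0e6                # launch mass, kg
accel  = 0.98                 # 0.1 g, m/s^2
thrust = mass * accel         # N
flux   = thrust * C / E_PH    # photons per second
power  = flux * E_PH          # beam (efflux) power, W

def radiator_area(p_beam, eps=0.5, delta=0.2, temp=1500.0):
    """Radiator area (m^2) needed to reject a fraction delta of the
    beam power p_beam at temperature temp with emissivity eps."""
    return delta * p_beam / (eps * SIGMA * temp**4)
```

Evaluating this gives ~4×10^27 photons/s, ~3×10^14 W of beam power, and ~1400 m^2 of radiator for Pb = 1 GW, i.e. the same orders of magnitude as in the quoted passage.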
jcunha

The physics of life - 2 views

  •  
    Research on active-matter systems is a growing field in biology. It consists in applying theoretical statistical physics to living systems, such as colonies of molecules, to deduce their macroscopic properties. The aim and hope is to understand how cells divide, take shape and move in these systems. It being a field at the crossing of physics and biology, one can read: "The pot of gold is at the interface but you have to push both fields to their limits."
  •  
    Maybe we should discuss this active matter one of these days? "These are the hallmarks of systems that physicists call active matter, which have become a major subject of research in the past few years. Examples abound in the natural world - among them the leaderless but coherent flocking of birds and the flowing, structure-forming cytoskeletons of cells. They are increasingly being made in the laboratory: investigators have synthesized active matter using both biological building blocks such as microtubules, and synthetic components including micrometre-scale, light-sensitive plastic 'swimmers' that form structures when someone turns on a lamp. Production of peer-reviewed papers with 'active matter' in the title or abstract has increased from less than 10 per year a decade ago to almost 70 last year, and several international workshops have been held on the topic in the past year."
Luzi Bergamin

First circuit breaker for high voltage direct current - 2 views

  •  
    Doesn't really sound sexy, but this is of utmost importance for next generation grids for renewable energy.
  •  
    I agree on the significance indeed - a small boost also for my favourite Desertec project ... Though their language is a bit too "grandiose": "ABB has successfully designed and developed a hybrid DC breaker after years of research, functional testing and simulation in the R&D laboratories. This breaker is a breakthrough that solves a technical challenge that has been unresolved for over a hundred years and was perhaps one of the main influencers in the 'war of currents' outcome. The 'hybrid' breaker combines mechanical and power-electronics switching that enables it to interrupt power flows equivalent to the output of a nuclear power station within 5 milliseconds - that's as fast as a single flap of a honey bee's wing - and more than 30 times faster than the reaction time of an Olympic 100-meter medalist to the starter's gun! But it's not just about speed. The challenge was to do it 'ultra-fast' with minimal operational losses, and this has been achieved by combining advanced ultrafast mechanical actuators with our in-house semiconductor IGBT valve technologies or power electronics (watch video: Hybrid HVDC Breaker - How does it work). In terms of significance, this breaker is a 'game changer'. It removes a significant stumbling block in the development of HVDC transmission grids, where planning can start now. These grids will enable interconnection and load balancing between HVDC power superhighways integrating renewables and transporting bulk power across long distances with minimal losses. DC grids will enable sharing of resources like lines and converter stations, providing reliability and redundancy in a power network in an economically viable manner with minimal losses. ABB's new Hybrid HVDC breaker, in simple terms, will enable the transmission system to maintain power flow even if there is a fault on one of the lines. This is a major achievement for the global R&D team in ABB who have worked for years on the challenge."
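For a rough feel of the quoted performance (the 1 GW "output of a nuclear power station" is my illustrative assumption, not ABB's number), the energy that flows through the breaker during the 5 ms interruption window is simply power times time:

```python
power_w   = 1.0e9              # assumed ~1 GW nuclear-plant output (illustration)
t_break_s = 5.0e-3             # quoted interruption time, 5 ms
energy_j  = power_w * t_break_s  # energy let through before the circuit opens
# On these assumptions, about 5 MJ must be diverted/absorbed by the breaker.
```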
LeopoldS

Scientists test novel power system for space travel (w/ video) - 1 views

  •  
    Less impressive than the headline suggests, since they actually just tested their conversion system at suboptimal conditions on an existing reactor setup - but still: done within six months and with less than 1 M€ ...
Isabelle Dicaire

ESO - eso1241 - Planet Found in Nearest Star System to Earth - 0 views

shared by Isabelle Dicaire on 19 Oct 12
  •  
    An Earth-sized planet has been found orbiting Alpha Centauri B, in the closest star system to us!
johannessimon81

Nasa-funded study: industrial civilisation headed for 'irreversible collapse'? - 4 views

  •  
    Sounds relevant. Does ESA need to have a position on this question?
  •  
    This was on Slashdot just now, with a link to the paper. It's quite an interesting study actually. "The scenarios most closely reflecting the reality of our world today are found in the third group of experiments (see section 5.3), where we introduced economic stratification. Under such conditions, we find that collapse is difficult to avoid."
  •  
    Interesting, but is it new? In general, I would say that history has shown us that it is inevitable that civilisations get replaced by new concepts (much has been published about this; read e.g. Fog of War by Jona Lendering on the struggles between civilisations in ancient history, which had remarkably similar issues as today, yet on a different scale of course). "While some members of society might raise the alarm that the system is moving towards an impending collapse and therefore advocate structural changes to society in order to avoid it, Elites and their supporters, who opposed making these changes, could point to the long sustainable trajectory 'so far' in support of doing nothing." I guess this is bang on: the ones that can change the system are not benefited by doing so, hence enrichment, depletion and short-term gain remain and might even accelerate to compensate for the loss in the rest of the system.
Thijs Versloot

Alien star invaded the Solar System - 2 views

  •  
    An alien star passed through our Solar System just 70,000 years ago, astronomers have discovered. No other star is known to have approached this close to us. An international team of researchers says it came five times closer than our current nearest neighbour, Proxima Centauri, passing straight through the Oort Cloud region. This must have left some sort of mark, maybe? Then again, it is a binary system of a red and a brown dwarf (8% and 6% solar masses), so maybe not too significant an impact on trajectories in the Oort cloud?
  •  
    I read this earlier and thought it might be another one of those alien-conspiracy stories. Freaky stuff.
  •  
    what about taking a ride on one of these? - especially if they come with some companion planets? when is the next shuttle coming?
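The "five times closer than Proxima Centauri" claim is easy to put into numbers (the Proxima distance and the AU-per-light-year conversion are standard values; the factor of five is from the article):

```python
PROXIMA_LY = 4.246            # distance to Proxima Centauri, light-years
AU_PER_LY  = 63241.0          # astronomical units per light-year

approach_ly = PROXIMA_LY / 5.0          # "five times closer than Proxima"
approach_au = approach_ly * AU_PER_LY   # convert to AU
# ~0.85 ly, i.e. roughly 54,000 AU: inside the outer Oort cloud (often
# taken to extend to ~100,000 AU), so a graze through its outskirts
# rather than a deep plunge - consistent with a modest perturbation.
```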
jcunha

Nature: Spawning rings of exceptional points out of Dirac cones - 3 views

  •  
    Dirac cones, a band structure of two cones touching each other, are the key to understanding graphene's exceptional properties. They also appear in the theory of photonic waveguides and of atoms in optical lattices. Here, studying a Dirac cone in an open system (a system perturbed by external agents) shows that the cone deforms, meaning a change in the fundamental properties of the system. The change is such that strange phenomena like unidirectional transmission or reflection, or lasers with truly single-mode operation, can be achieved. Proved experimentally in photonic crystals. A new way towards extremely pure lasers?
Juxi Leitner

Asteroid Response System in Place (Complete With U.S. Military Eye Patch) - 1 views

  • However, the US air force, which funded the development of the telescope, requires that software automatically black out a swathe of pixels to hide the trajectories of passing satellites.
LeopoldS

A Galactic Origin for HE 0437-5439, The Hypervelocity Star Near the Large Magellanic Cloud - 1 views

  •  
    "we conclude that HE 0437-5439 was most likely a compact binary ejected by the Milky Way's central black hole" reminds me a bit of Francesco's proposal to get rid of Mercury for the stability of our solar system ... what was the proposal: "get rid of the sucker?"
Joris _

SPACE.com -- Railway to the Sky? NASA Ponders New Launch System - 3 views

  • A team of engineers from NASA's Kennedy Space Center in Florida and some of the agency's other field centers are looking into this and other novel launch systems based on cutting-edge technologies.
  • The launch system would require some advancements of existing technologies, but it wouldn't need any brand-new technologies to work
  • Scramjet vehicles could be used as a basis for a commercial launch program
  • ...1 more annotation...
  • It's not very often you get to work on a major technology revolution
  •  
    I wonder if they are also working with that SCRAMSPACE initiative in Australia that was presented at ESTEC a while back...
  •  
    what about a space elevator!!! quite an old concept (1895), see this link on wiki http://en.wikipedia.org/wiki/Space_elevator
Joris _

American Institute of Aeronautics and Astronautics - Space and the Biological Economy - 0 views

  • the U.S. space program has a robust life science program that is diligently working to innovate new approaches, research and technologies in the fields of biotechnology and bio-nanotechnology science, which are providing new solutions for old problems – including food security, medical needs and energy needs
  • more money be allocated to develop environmentally sound and energy efficient engine programs for commercial and private aviation
  • waste water program
  • ...3 more annotations...
  • we lack fundamental knowledge about the entire effect of the photosynthesis system on food growth, and that space-based research could provide vital clues to scientists on how to streamline the process to spur more efficient food growth
  • From the start of the space age until 2010 only around 500 people have journeyed into space, but with the advent of private space travel in the next 24 months another 500 people are expected to go into space
  • Wagner identified prize systems that award monetary prizes to companies or individuals as an effective way to spur innovation and creativity, and urged the Congressional staffers present to consider creating more prize systems to stimulate needed innovation
  •  
    a bunch of ideas, initiatives, and good points about upcoming changes in space ...
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, for what purposes are computers mainly used on board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which such computers can handle sufficiently well, or a completely new application which is not there yet, for instance because of power-consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer may not be obvious, a potential study could address this very issue. For instance, one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies on the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like the result being probabilistic but falling within a defined range of precision), etc. Did they build a complete chip, or at least a sub-circuit, or still only logic gates...
  •  
    Satellites use old CPUs also because, with the trend towards higher power, modern CPUs are not very convenient from a system-design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation also), but I guess we are not talking about 10 km as an absolute value, rather a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess what applications this would be useful for at all... and at what precision/power levels.
  •  
    The interest of this could be a high fault tolerance for some math operations, ... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?) Another issue is the radiation tolerance of the technology ... if the PCMOS are more tolerant to radiation, they could be space-qualified more easily.
  •  
    I don't remember what is the speed factor, but I guess this might do it! Although, I remember when using an IMU that you cannot have the data above a given rate (e.g. 20Hz even though the ADC samples the sensor at a little faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if only because of economic/industrial inertia?
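To make the precision-for-power trade-off in this thread concrete, here is an illustrative sketch (not the actual PCMOS circuit; the per-bit flip probability and word size are made-up parameters) of why only error-tolerant workloads can use such hardware:

```python
import random

def noisy_add(a, b, p_flip=1e-3, bits=32):
    """Toy probabilistic adder: each bit of the correct sum independently
    flips with probability p_flip (an assumed, simplified error model)."""
    s = (a + b) & ((1 << bits) - 1)
    for i in range(bits):
        if random.random() < p_flip:
            s ^= 1 << i    # a flip of bit i changes the result by 2**i
    return s

# Because a flip of bit i perturbs the result by 2**i, the expected error
# is dominated by the high-order bits: tolerable for lossy signal
# processing or filtering, unacceptable for address arithmetic or
# control flow - which is why identifying the right on-board application
# matters before the hardware question can even be posed.
```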
Luzi Bergamin

Prof. Markrams Hirnmaschine (Startseite, NZZ Online) - 2 views

  •  
    A critical view on Prof. Markram's Blue Brain project (in German).
  • ...4 more comments...
  •  
    A critical view on Prof. Markram's Blue Brain project (in German).
  •  
    so critical that the comment needed to be posted twice :-) ?
  •  
    Yes, I know; I still don't know how to deal with this f.... Diigo Toolbar! Shame on me!!!
  •  
    Would be nice to read something about the modelling, but it appears that nothing has been published in detail. Following the article, the main approach is to model each(!) neuron, taking into account the spatial structure of the neurons' positions. Once this is achieved, they expect intelligent behaviour. And they need a (type of) supercomputer which does not exist yet.
  •  
    As far as I know it's sort of like "Let's construct an enormous dynamical system and see what happens"... i.e. a waste of taxpayers' money... Able to heal Alzheimer's... Yeah... Actually I was at the conference the author mentions (FET 2011) and I saw the presentations of all 6 flagship proposals. Following that I had a discussion with one of my colleagues about the existence of limits to the amount of bullshit politicians are willing to buy from scientists. Will there be a point at which politicians, despite their total ignorance, realise that scientists simply don't deliver anything they promise? How long will we (scientists) be stuck in the vicious circle of having to promise more than our predecessors in order to get money? Will we face a situation where we'll be forced to revert to promises which are realistic? To be honest, none of the 6 presentations convinced me of their scientific merit (apart from the one on graphene, where I have absolutely no expertise to tell). Apparently a huge amount of money is about to be wasted.
  •  
    It's not just "Let's construct an enormous dynamical system and see what happens", it's worse! The simulation of cosmological evolution is/was also a little bit of this type, and still the results are very interesting and useful. Why? Neither the whole cosmos nor the human brain at the level of single neurons can be modelled on a computer; that would last aeons on a "yet-to-be-invented-extra-super-computer". Thus one has to make assumptions and simplifications. In cosmology we have working theories of gravitation, thermodynamics, electrodynamics etc. at hand; starting from these theories we can make reasonable assumptions and (more or less) justified simplifications. The result is valuable since it provides insight into a complex system under given, explicit and understood assumptions. Nothing similar seems to exist in neuroscience. There is no theory of the human brain, and apparently nobody has the slightest idea which simplifications can be made without harm. Of course, Mr. Markram remains completely unaffected by ''details'' like this. Finally, Marek, money is not wasted; we ''build networks of excellence'' and ''select the brightest of the brightest'' to make them study and work at our ''elite institutions'' :-). I vividly remember the internship of one of these "bestofthebest" from the Ivy League at the ACT...
Marion Nachon

NASA Announces Design for New Deep Space Exploration System - 1 views

  •  
    The Space Launch System will be NASA's first exploration-class vehicle since the Saturn V took American astronauts to the moon over 40 years ago. With its superior lift capability, the SLS will allow us to explore cis-lunar space, near-Earth asteroids, Mars and its moons and beyond.
andreiaries

YouTube - Mission 3 computer animation - 0 views

  •  
    ARCA is the Romanian Google Lunar X Prize competitor.
  • ...1 more comment...
  •  
    They'll probably launch the concept this month. It doesn't look very realistic, but I like the stage separation.
  •  
    I like the 4 stage system. But how did they solve the plume issue ?
  •  
    The plume issue is not that difficult. I think they used something similar on Apollo LES. The problem is stabilizing the entire system, which is extremely difficult. The entire system will most likely plummet down after the solar balloon phase (which is the only phase they tested before). At least they are not using government money :).
nikolas smyrlakis

The S&P 500 as a Planetary System | FlowingData - 4 views

  •  
    really great visualization
Joris _

DARPA funds high-power satellite system demonstration - 0 views

  • Its goal is to perform ground demonstrations of a 20kW generation system that is scalable to output up to 80kW
  •  
    interesting indeed! also for SPS!
Friederike Sontag

AMS Policy Statement on Geoengineering the Climate System - 0 views

  • Therefore, the American Meteorological Society recommends:
  • Enhanced research on the scientific and technological potential for geoengineering the climate system, including research on intended and unintended environmental responses.
  • ...2 more annotations...
  • Coordinated study of historical, ethical, legal, and social implications of geoengineering that integrates international, interdisciplinary, and intergenerational issues and perspectives and includes lessons from past efforts to modify weather and climate.
  • Development and analysis of policy options to promote transparency and international cooperation in exploring geoengineering options, along with restrictions on reckless efforts to manipulate the climate system.
  •  
    policy statement regarding research on geoengineering in the US (in force from July 2009 to July 2012)
  •  
    looking forward to your recommendations how we can get into it quickly :-)
LeopoldS

NHESS - Home - 0 views

  •  
    especially interesting also for the two new girls, Nina and Friederike, joining us in a few weeks ...