Advanced Concepts Team / Group items tagged: acceleration


jmlloren

Scientists discover how to turn light into matter after 80-year quest - 5 views

  •  
    Theorized 80 years ago, Breit-Wheeler pair production turns two photons into an electron-positron pair (via a virtual electron). It is a relatively simple Feynman diagram, but the problem is/was how to build a high-energy photon-photon collider in practice... The collider experiment that the scientists have proposed involves two key steps. First, the scientists would use an extremely powerful high-intensity laser to speed up electrons to just below the speed of light. They would then fire these electrons into a slab of gold to create a beam of photons a billion times more energetic than visible light. The next stage of the experiment involves a tiny gold can called a hohlraum (German for 'empty room'). Scientists would fire a high-energy laser at the inner surface of this gold can to create a thermal radiation field, generating light similar to the light emitted by stars. They would then direct the photon beam from the first stage of the experiment through the centre of the can, causing the photons from the two sources to collide and form electrons and positrons. It would then be possible to detect the formation of the electrons and positrons when they exited the can. Now this is a good experiment... :)
  •  
    The solution for thrusting in space.
  •  
    Thrusting in space is solved already. Maybe you wanted to say something different?
  •  
    Thrusting until your fuel runs out is solved; this way one could produce propellant mass directly from, among other sources, solar/stellar energy. What I like about this experiment is that we already have the technology to do it: many parts have been designed for inertial confinement fusion.
  •  
    I am quite certain that it would be more efficient to use the photons directly for thrust instead of converting them into matter. Also, I am a bit puzzled by the asymmetric layout for photon creation. Typically, colliders use two beams of particles with equal but opposite momentum. Because the total momentum of two such colliding particles is zero, the reaction products are produced more efficiently, as a minimum of collision energy is wasted on accelerating the products. I guess in this case the thermal radiation in the cavity is chosen instead of an opposing gamma-ray beam to increase the photon density and the number of collisions (even if the efficiency decreases because of the asymmetry). However, a danger of using a high-temperature cavity might be that thermionic emission creates lots of free electrons within the cavity. This could reduce the positron yield through recombination and would allow the high-energy photons to lose energy through Compton scattering instead of Breit-Wheeler pair production.
  •  
    Well, the main benefit of e-p pair creation might be that one can subsequently accelerate these to higher energies again. I think the photon-photon cross-section is extremely low, such that direct beam-beam interactions basically do not happen (below 1/20... so basically 0 according to quantum probability :P). This way, the central line of the hohlraum actually has a very high photon density and, if timed correctly, maximizes the reaction yield such that it could be measured.
  •  
    I agree about the reason for the hohlraum - but I also keep my reservations about the drawbacks. About the pair production as fuel: I'm pretty sure that your energy would be used more smartly by using photons (not necessarily high-energy photons) for thrust directly instead of putting tons of energy into creating a rest mass and then accelerating that. If you look at E² = (p c)² + (m0 c)² then putting energy into the mass term will always reduce your maximum value of p.
  •  
    True, but isn't it E² = (pc)² + (m0 c²)², such that for photons E ∝ pc and for mass E ∝ m c²? I agree it will take a lot of energy, but let's assume that won't be the problem, at least. The question therefore is whether the mass flow of the photon rocket (fuel consumed to create photons, e.g. fission/fusion) is higher or lower than the mass flow for e-p creation. You are probably right that the low e-p cross-section will favour direct use of photons to create low thrust for long periods of time, but with significant power available the ISP might be higher for e-p pair creation.
  •  
    In essence the equation tells you that for photons with zero rest mass m0 all the energy is converted into momentum of the particles. If you want to accelerate e-p pairs then you first spend part of the energy on creating them (~511 keV each) and you can only use the remaining energy to accelerate them. In this case the equation gives you a lower particle momentum, which leads to lower thrust (even when assuming 100% acceleration efficiency). ISP is a tricky concept in this case because there are different definitions which clash in the relativistic context (due to the concept of mass flow). R. Tinder gets to an I_SP = c (speed of light) for a photon rocket (using the relativistic mass of the photons), which is the maximum possible relativistic I_SP: http://goo.gl/Zz5gyC .
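A quick numerical illustration of the momentum argument above, as a minimal sketch (the energies below are arbitrary example values, not taken from the discussion): for the same total energy, a particle with rest mass always carries less momentum, and hence gives less thrust per unit power, than a photon.

```python
# Compare momentum per unit energy for a photon vs. an electron/positron
# that first had to be created (illustrative numbers only).
from math import sqrt

m0c2 = 0.511  # electron rest energy [MeV]

for E in (0.6, 1.0, 5.0, 50.0):            # total particle energy [MeV]
    pc_photon = E                           # photon: E = pc
    pc_electron = sqrt(E**2 - m0c2**2)      # electron: E^2 = (pc)^2 + (m0 c^2)^2
    print(f"E = {E:5.1f} MeV: pc(electron)/pc(photon) = {pc_electron / pc_photon:.3f}")

# The rest-mass term always costs momentum; only at energies much larger than
# 0.511 MeV does an accelerated electron approach the photon's momentum per unit energy.
```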
Alexander Wittig

Picture This: NVIDIA GPUs Sort Through Tens of Millions of Flickr Photos - 2 views

  •  
    Strange and exotic cityscapes. Desolate wilderness areas. Dogs that look like wookies. Flickr, one of the world's largest photo sharing services, sees it all. And, now, Flickr's image recognition technology can categorize more than 11 billion photos like these. And it does it automatically. It's called "Magic View." Magical deep learning! Buzzword attack!
  •  
    and here comes my standard question: how can we use this for space? fast detection of natural disasters onboard?
  •  
    Even on the ground. You could, for example, teach it what nuclear reactors, missiles, or other weapons you don't want look like in satellite pictures and automatically scan the world for them (basically replacing intelligence analysts).
  •  
    In fact, I think this could make a nice ACT project: counting seals from satellite imagery is an actual (and quite recent) thing: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0092613 In this publication they did it manually from a GeoEye-1 b/w image, which sounds quite tedious. Maybe one can train one of those image recognition algorithms to do it automatically (a rough training sketch follows at the end of this thread). Or maybe it's a bit easier to count larger things, like elephants (also a thing).
  •  
    At the HiPEAC (High Performance and Embedded Architecture and Compilation) conference I attended at the beginning of this year there was a big trend of CUDA GPU vs FPGA for hardware-accelerated image processing. Most of it orbited around discussing who was faster and cheaper, with people from NVIDIA on one side and people from Xilinx and Intel on the other. I remember talking with an IBM scientist working on hardware-accelerated data processing together with the radio telescope institute in the Netherlands about the solution they were working on (GPU CUDA). I gathered that NVIDIA GPUs suit best applications that do not rely on custom hardware, with the advantage of being programmable in an 'easy' way accessible to a scientist. FPGAs are highly reliable components with the advantage of being available in rad-hard versions, but they require specific knowledge of physical circuit design and tailored 'harsh' programming languages. I don't know what the level of rad hardness of NVIDIA's GPUs is... Therefore FPGAs are indeed the standard choice for image processing in space missions (a talk with the microelectronics department guys could expand on this), whereas GPUs are currently used in some ground-based systems (radio astronomy or other types of telescopes). I think that for a specific purpose such as the one you mentioned, this FPGA vs GPU question should be assessed first before going further.
  •  
    You're forgetting power usage. GPUs need 1000 hamster wheels worth of power while FPGAs can run on a potato. Since space applications are highly power limited, putting any kind of GPU monster in orbit or on a rover is a failed idea from the start. Also, in FPGAs, if a gate burns out from radiation you can just reprogram around it. Looking for seals offline in high-res images is indeed definitely a GPU task... for now.
  •  
    The discussion of how to make FPGA hardware acceleration solutions easier to use for the 'layman' is starting btw http://reconfigurablecomputing4themasses.net/.
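Regarding the seal-counting idea a few comments up: a minimal sketch of how such an automatic counter could be trained as a patch classifier. The directory layout ("tiles/" with "seal/" and "background/" crops), image size and tiny network are illustrative assumptions, not taken from the cited study, which counted the seals manually.

```python
# Hypothetical sketch: train a seal-vs-background patch classifier on small
# crops cut from satellite imagery, then count seals by scanning a full scene.
import tensorflow as tf

IMG = (64, 64)
train = tf.keras.utils.image_dataset_from_directory(
    "tiles", image_size=IMG, batch_size=32,
    validation_split=0.2, subset="training", seed=0)
val = tf.keras.utils.image_dataset_from_directory(
    "tiles", image_size=IMG, batch_size=32,
    validation_split=0.2, subset="validation", seed=0)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # seal vs. background
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train, validation_data=val, epochs=10)

# Counting then amounts to sliding this classifier over a full GeoEye-style
# scene and summing detections (a real pipeline would also merge overlaps).
```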
Thijs Versloot

Relativistic rocket: Dream and reality - 3 views

  •  
    An exhaustive overview of all possible advanced rocket concepts, e.g.: "As an example, consider a photon rocket with its launching mass, say, 1000 ton moving with a constant acceleration a = 0.1 g = 0.98 m/s². The flux of photons with E_γ = 0.5 MeV needed to produce this acceleration is ~10^27/s, which corresponds to an efflux power of 10^14 W and a rate of annihilation events N'_a ~ 5×10^26 s^-1 [47]. This annihilation rate in the ambiplasma corresponds to a current of ~10^8 A and a linear density N ~ 2×10^18 m^-1, thus any hope for non-relativistic relative velocity of electrons and positrons in ambiplasma is groundless." (These headline numbers are sanity-checked in the short sketch at the end of this thread.) And also, even if it would work, one of the major issues is going to be heat dispersal: "For example, if the temperature of the radiator is chosen as T = 1500 K, the emitting area should be not less than 1000 m² for P_b = 1 GW, not less than 1 km² for P_b = 1 TW, and ~100 km² for P_b = 100 TW, assuming ε = 0.5 and δ = 0.2. A lower temperature would require an even larger radiator area to maintain the outer temperature of the engine section stable for a given thermal power of the reactor."
  •  
    We were also discussing a while ago a propulsion system using the relativistic fragments from nuclear fission. That would also produce an extremely high ISP (>100000) with a fairly high thrust. Never really got any traction though.
  •  
    I absolutely do not see the point of a photon rocket. Certainly, the high-energy-releasing nuclear processes (annihilation, fusion, ...) should rather be used to heat up some fluid to the plasma state and accelerate it via a magnetic nozzle. This would surely work as a door-opener to our solar system... and, by the way, minimize the heat disposal problem if regenerative cooling is used.
  •  
    The problem is not achieving a high energy density - we can already do that with nuclear fission. The question is how to confine or harness this power with relatively high efficiency, low waste heat and a not-too-crazy specific mass. I see magnetic confinement as a possibility, yet it is still decades away and also an all-or-nothing method, as we cannot easily scale it up from a test experiment to a full-scale system. It might be possible to extract power from such a plasma, but definitely well below breakeven, so an additional power supply is needed. The fission fragments circumvent these issues with a more brute-force approach, thereby wasting a lot of energy for sure but in the end probably providing more ISP and thrust.
  •  
    Sure. However, the annihilation-based photon rocket concept combines almost all relevant drawbacks if we speak about solar-system scales, making itself obsolete... it is just an academic test case.
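For reference, the headline numbers quoted from the paper at the top of this thread can be reproduced with a few lines of arithmetic (a rough order-of-magnitude check, assuming an ideal photon rocket with thrust F = P/c):

```python
# Order-of-magnitude check of the quoted photon-rocket numbers.
m = 1.0e6            # launch mass: 1000 t [kg]
a = 0.98             # constant acceleration: 0.1 g [m/s^2]
c = 2.998e8          # speed of light [m/s]
eV = 1.602e-19       # joules per electronvolt

F = m * a                        # required thrust [N]
P = F * c                        # efflux power of an ideal photon rocket, P = F c [W]
E_gamma = 0.5e6 * eV             # photon energy, 0.5 MeV [J]
flux = P / E_gamma               # photons emitted per second

print(f"thrust ~ {F:.1e} N")
print(f"power  ~ {P:.1e} W")     # ~3e14 W, matching the quoted ~1e14 W scale
print(f"flux   ~ {flux:.1e} /s") # ~4e27 /s, matching the quoted ~1e27 /s scale
```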
santecarloni

Light bends itself round corners - physicsworld.com - 1 views

  •  
    The Florida team generated a specially shaped laser beam that could self-accelerate, or bend, sideways.
  •  
    very nice!!! read this e.g. "In addition to this self-bending, the beam's intensity pattern also has a couple of other intriguing characteristics. One is that it is non-diffracting, which means that the width of each intensity region does not appreciably increase as the beam travels forwards. This is unlike a normal beam - even a tightly collimated laser beam - which spreads as it propagates. The other unusual property is that of self-healing. This means that if part of the beam is blocked by opaque objects, then any disruptions to the beam's intensity pattern could gradually recover as the beam travels forward."
LeopoldS

PLOS ONE: Galactic Cosmic Radiation Leads to Cognitive Impairment and Increas... - 1 views

  •  
    Galactic Cosmic Radiation consisting of high-energy, high-charged (HZE) particles poses a significant threat to future astronauts in deep space. Aside from cancer, concerns have been raised about late degenerative risks, including effects on the brain. In this study we examined the effects of 56Fe particle irradiation in an APP/PS1 mouse model of Alzheimer's disease (AD). We demonstrated 6 months after exposure to 10 and 100 cGy 56Fe radiation at 1 GeV/u, that APP/PS1 mice show decreased cognitive abilities measured by contextual fear conditioning and novel object recognition tests. Furthermore, in male mice we saw acceleration of Aβ plaque pathology using Congo red and 6E10 staining, which was further confirmed by ELISA measures of Aβ isoforms. Increases were not due to higher levels of amyloid precursor protein (APP) or increased cleavage as measured by levels of the β C-terminal fragment of APP. Additionally, we saw no change in microglial activation levels judging by CD68 and Iba-1 immunoreactivities in and around Aβ plaques or insulin degrading enzyme, which has been shown to degrade Aβ. However, immunohistochemical analysis of ICAM-1 showed evidence of endothelial activation after 100 cGy irradiation in male mice, suggesting possible alterations in Aβ trafficking through the blood brain barrier as a possible cause of plaque increase. Overall, our results show for the first time that HZE particle radiation can increase Aβ plaque pathology in an APP/PS1 mouse model of AD.
jaihobah

Antimatter Starship Scheme Coming to Kickstarter - 1 views

  •  
    "Hbar Technologies plans a Kickstarter effort to raise US $200,000 for the next phase design of an antimatter-propelled spaceship. The two scientists behind this design effort are a veteran Fermilab particle accelerator scientist and a former Los Alamos National Laboratory physicist and founding director of the U.S. Center for Space Nuclear Research. They originally developed it for NASA at the turn of the millennium."
jcunha

Accelerated search for materials with targeted properties by adaptive design - 0 views

  •  
    There has been much recent interest in accelerating materials discovery. High-throughput calculations and combinatorial experiments have been the approaches of choice to narrow the search space. The emphasis has largely been on feature or descriptor selection or the use of regression tools, such as least squares, to predict properties. The regression studies have been hampered by small data sets, large model or prediction uncertainties and extrapolation to a vast unexplored chemical space with little or no experimental feedback to validate the predictions. Thus, they are prone to be suboptimal. Here an adaptive design approach is used that provides a robust, guided basis for the selection of the next material for experimental measurements by using uncertainties and maximizing the 'expected improvement' from the best-so-far material in an iterative loop with feedback from experiments. It balances the goal of searching materials likely to have the best property (exploitation) with the need to explore parts of the search space with fewer sampling points and greater uncertainty.
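A minimal sketch of the 'expected improvement' selection step described in the abstract, assuming a surrogate model has already produced a mean and an uncertainty for each unmeasured candidate (the helper function and the numbers below are illustrative, not from the paper):

```python
# Expected-improvement selection for the next experiment (maximisation case).
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_so_far):
    """EI for maximising a property, given surrogate mean/std per candidate."""
    sigma = np.maximum(sigma, 1e-12)           # avoid division by zero
    z = (mu - best_so_far) / sigma
    return (mu - best_so_far) * norm.cdf(z) + sigma * norm.pdf(z)

# Surrogate predictions for 5 unmeasured candidates (illustrative numbers).
mu = np.array([1.2, 1.5, 1.1, 1.4, 1.3])          # predicted property value
sigma = np.array([0.05, 0.30, 0.02, 0.10, 0.40])  # prediction uncertainty
best = 1.45                                       # best measured material so far

ei = expected_improvement(mu, sigma, best)
print("next experiment -> candidate", int(np.argmax(ei)))
# High-uncertainty candidates can win even with a lower mean (exploration),
# while confident high-mean candidates drive exploitation.
```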
johannessimon81

Cosmological model without accelerated expansion proposed - 1 views

  •  
    Redshift in this model is partially produced by a change in the masses of elementary particles (and atoms)
  •  
    It seems to solve the problem of infinite energy density at the singularity in any case. I would love to see a way of experimentally verifying this, although most people seem to believe it is wrong. It reminds me of the quote (usually attributed to Bohr, speaking to Pauli): "we all agree your idea is crazy, but the real question is whether it is crazy enough to be correct."
  •  
    As far as I can see, this is not untestable per se; rather, it is an explanation of the redshift that is equivalent to accelerating expansion. It is not that the theory is untestable, it is just another way of looking at it. Kind of like how it is sometimes convenient to consider light a particle and sometimes a wave. In the same way it could sometimes be convenient to view the universe as static with increasing mass instead.
  •  
    Well, the premise "matter getting heavier" may be open to falsification in some way or another. Currently there is no absolute method to determine mass, so it might even be plausible that this is actually the case. I don't think it is related, but there is a problem with the 1 kg standards (1 official and 6 copies) where the masses seem to deviate.
  •  
    It should not be impossible to verify a change in mass(es) over time. For example, the electron cyclotron frequency scales as ~e/m while the hydrogen emission frequencies scale as ~m*e^4 (the relevant relations are written out after this thread). Using multiple relationships like that, which can be measured easily and accurately, an increase in the mass of fundamental particles should - in principle - be detectable (even if the mass of the Earth increases at the same time, changing the relativistic reference frame).
  •  
    The Watt balance and a definition using Planck's constant seem to do the trick and are currently being discussed. Would the electron charge not be problematic, as it is related to the coulomb, which depends on the ampere, which is defined via newtons and hence depends back on the mass again?
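For reference, the scalings mentioned two comments up, written out (standard textbook formulas; the idea of tracking their ratio over time is the commenters' suggestion):

```latex
% Electron cyclotron frequency in a magnetic field B, and hydrogen transition
% frequencies (Rydberg formula), depend on m_e and e with different powers:
\omega_c = \frac{eB}{m_e}, \qquad
\omega_{n \to n'} = \frac{m_e e^4}{2\,(4\pi\varepsilon_0)^2 \hbar^3}
                    \left(\frac{1}{n'^2} - \frac{1}{n^2}\right).
% Their ratio scales as \omega_{n \to n'}/\omega_c \propto m_e^2 e^3 / B, so a slow
% drift of m_e would show up differently in this ratio than a drift of e or B.
```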
Beniamino Abis

Laser-Plasma Particle Accelerator - 1 views

  •  
    Can we have one?
  •  
    They say they create 2 GeV electrons in a very small setup. However the laser they use is more than 10 meters in length. Still a really nice result.
  •  
    Compared to a 27 km circumference this is a major achievement, and it is indeed already foreseen that future colliders will include this technology as beam fillers and pre-accelerators at some point. The technique is quite elegant and a lot more energy efficient. Nevertheless, there are also thoughts that future particle colliders might actually go towards space and study collisions originating from extremely energetic cosmic particles.
Nicholas Lan

Betting on Green - 5 views

  •  
    breakthroughs vs. accelerated deployment in climate change mitigation technologies.
  •  
    Interesting guy indeed... "Forget today's green technologies like electric cars, wind turbines, solar cells and smart grids, in other words. None meets what Mr Khosla calls the "Chindia price" - the price at which people in China and India will buy them without a subsidy. "Everything's a toy until it reaches that point," he says." I also like this one, since it's a bit like ACT topic selection: ""I am only interested in technologies that have a 90% chance of failure but, if they do succeed, would change the infrastructure of society in some radical way," he says." Should we propose SPS to him? :-)
  •  
    one more: ""I never compute returns. If you start forecasting cash flows, you lose innovation, you lose instinct. You average yourself down to mediocrity." "I've had many more failures than successes in my life," admits Mr Khosla. "My willingness to fail gives me the ability to succeed."
  •  
    indeed. puts me in mind of the often reinvented private ACT idea. actually there's a bunch of interesting looking articles on his website. http://www.khoslaventures.com/khosla/papers.html . No sps in the solar one as far as i can tell :) found this bit intriguing too in that, albeit presumably out of context, it doesn't make sense ""The solution to our energy problems is almost the exact opposite of what Khosla says," declares Joseph Romm, who is the editor of Climate Progress, an influential climate blog, and a senior fellow at the Centre for American Progress Action Fund, a think-tank. "Technology breakthroughs are unlikely to be the answer. Accelerated deployment of existing technologies will get you down the cost curve much more rapidly than a breakthrough."" found this seemingly not very well considered piece (to be fair a blog post) by the guy http://climateprogress.org/2010/07/02/is-anyone-more-incoherent-than-vinod-khosla/ . maybe he's written some more convincing stuff in this vein somewhere.
  •  
    "Mr Khosla (...) is investing over $1 billion of his clients' money in black swans" Well, with his own money his approach might be a little different :-)
ESA ACT

PLoS ONE: The Fastest Flights in Nature: High-Speed Spore Discharge Mechanisms among Fungi - 0 views

  •  
    Tobias have a look at this! 180 000 g acceleration!! LS
ESA ACT

MRS Special Issue Harnessing Materials for Energy - 0 views

  •  
    "Harnessing Materials for Energy," focuses on the most important materials research challenges that need to be addressed to move toward secure, affordable, and environmentally sustainable energy to meet the world's accelerating energy needs. The issue fol
annaheffernan

Particle accelerator barely bigger than a grain of sand - 3 views

  •  
    Just getting a particle up to near the speed of light isn't good enough for today's physics. To properly unravel the fundamentals of the universe, particles have to be smashed together with enormous force. And two Stanford researchers have just devised a laser-based method that imparts ten times the power of traditional methods at a fraction of the cost.
jaihobah

A $2 Billion Chip to Accelerate Artificial Intelligence - 1 views

  •  
    Got $129,000?
Marcus Maertens

Artificial intelligence helps accelerate progress toward efficient fusion reactions | P... - 3 views

  •  
    There we go: Deep Learning predicts disruptions in plasmas. The paper related to this article is here: https://arxiv.org/abs/1802.02242
Marcus Maertens

Ultrahigh Acceleration Neutral Particle Beam-Driven Sails - 1 views

  •  
    An alternative to photon-beam driven sails?
santecarloni

How to Measure Quantum Foam With a Tabletop Experiment | MIT Technology Review - 1 views

  •  
    I think there are a few difficulties with assuming that the macroscopic block of glass will accelerate instantaneously. Also, if I prevent the block from moving, would it become perfectly reflective, as no momentum can be transferred to it? One could say that the momentum is then transferred to the larger system that holds the glass. But surely I could make that system (or even the block of glass) so heavy that it would not move more than a Planck length during the passage of the photon - especially if the glass is very thin or the light is very red.
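A back-of-envelope version of that objection, as a minimal sketch (the block mass, glass thickness and wavelength are illustrative assumptions):

```python
# How far does a heavy glass block move while a single red photon crosses it?
h = 6.626e-34         # Planck constant [J s]
c = 2.998e8           # speed of light [m/s]
l_planck = 1.616e-35  # Planck length [m]

lam = 700e-9          # red photon wavelength [m]
p_photon = h / lam    # photon momentum [kg m/s]

M = 1.0               # mass of the glass block plus mount [kg]
L = 1e-3              # glass thickness [m]
n = 1.5               # refractive index
t_transit = n * L / c           # time the photon spends inside the glass [s]

v = p_photon / M                # block velocity if the full photon momentum
                                # were transferred (an upper bound)
d = v * t_transit               # displacement during the photon's passage

print(f"displacement ~ {d:.1e} m  vs  Planck length {l_planck:.1e} m")
# With M = 1 kg and 1 mm of glass this comes out far below the Planck length,
# which is the heart of the objection above.
```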
Joris _

NASA International Space Station Longeron Marathon Challenge - 1 views

shared by Joris _ on 18 Jan 13
  •  
    nice - did not know about it. GTOC on steroids and with loads of cash. Concerning this specific challenge, and especially the last condition: doesn't this hint at a flawed design? "In addition to maximizing the total power output there are some constraints on the possible movements: Each SARJ and BGA is limited to a maximum angular velocity and to a maximum angular acceleration. Each SAW must produce at least some minimum average power over the orbit (which is different for each SAW). The sequence of positions must be cyclic, so it can be repeated on the next orbit. The maximum amount of BGA rotation is not limited, but exceeding a threshold will result in a score penalty. Some structural members of the SAW mast (called Longerons) have restrictions on how they can be shadowed."
  •  
    The longerons will expand and contract with exposure to the sun (I think whatever material they are made of). Because you have 4 longerons in a mast, you just need to be careful that the mast is well balanced and that the 4 longerons support each other; basically, you need an even number of shadowed longerons, possibly 0 too. I would call this an operational constraint.
Marcus Maertens

Everything You Wanted to Know about Space Tourism but Were Afraid to Ask | Space Safety... - 3 views

  •  
    "chances are that if 700 passengers are flown annually, up to 10 of them might not survive the flight in the first years of the operations." most remarkable also the question who is to blame if a dead and burned space tourist corps comes crashing down from the sky into your car.
  •  
    How sure is the information that a human body would not completely burn / ablate during atmospheric re-entry? I am not aware of any material ground tests in a plasma wind tunnel confirming that human tissue would survive re-entry from LEO.
  •  
    Since a steak would not even be cooked by dropping it from very high altitudes (http://what-if.xkcd.com/28/) I doubt that a space tourist's body would disintegrate during atmospheric re-entry.
  •  
    Funny link; however, some things are not clear enough:
    1. The ablation rate is unknown.
    2. What are the entry conditions? The link suggests that the steak is just dropped (no initial velocity).
    3. What about the ballistic coefficient?
    4. What would the entry body orientation be? It would be quite a non-steady-state configuration, I guess, with heavy accelerations.
    5. How would vacuum exposure impact the water in the body/steak and what would be the consequence for ablation behaviour?
    6. Does surface chemistry play a role (not ablation, but catalysis)?
    My conclusion: the example with the steak is a funny and not-so-bad exercise, but not more.
  •  
    This calls for some serious simulations with the Petkow code, it seems to me...
  •  
    I still would need some serious input data...