
Home/ Advanced Concepts Team/ Group items tagged radiation


Thijs Versloot

Communicate through the plasma sheath during re-entry - 1 views

  •  
    In order to overcome the communication blackout problem suffered by hypersonic vehicles, a matching approach has been proposed for the first time in this paper. It utilizes a double-positive (DPS) material layer surrounding a hypersonic vehicle antenna to match with the plasma sheath enclosing the vehicle. Or, in simpler terms, the antenna acts as a capacitor and the plasma sheath as an inductor; together they form an electrical circuit that becomes transparent to long-wavelength radiation (the communication signal). The reason is that fluctuations are balanced by the twin system, preventing absorption/reflection of the incoming radiation. An elegant solution, but it will only work for long-wavelength communication, plus I am not sure whether the antenna needs active control (as the plasma sheath conditions change during the re-entry phase).
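A quick reminder of why the blackout happens in the first place (generic plasma physics, not from the paper; the sheath electron densities below are illustrative assumptions): a bare antenna is cut off for any carrier below the sheath's electron plasma frequency.

```python
import math

def plasma_frequency_hz(n_e_m3):
    """Electron plasma frequency f_p = sqrt(n e^2 / (eps0 m_e)) / (2 pi)."""
    e = 1.602e-19       # elementary charge [C]
    eps0 = 8.854e-12    # vacuum permittivity [F/m]
    m_e = 9.109e-31     # electron mass [kg]
    return math.sqrt(n_e_m3 * e**2 / (eps0 * m_e)) / (2.0 * math.pi)

# Illustrative sheath electron densities during re-entry (assumed values, not from the paper)
for n_e in (1e17, 1e18, 1e19):
    print(f"n_e = {n_e:.0e} m^-3  ->  f_p = {plasma_frequency_hz(n_e)/1e9:.1f} GHz")
# The cutoff comes out at a few to tens of GHz, which is why normal GHz telemetry is blocked;
# the DPS/sheath matching idea tries to open a window below f_p instead of brute-forcing
# the carrier frequency above it.
```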
Thijs Versloot

Electromagnetism generated by symmetry breaking in dielectrics - 0 views

  •  
    Using dielectric materials as efficient EM radiators and receivers could scale these antennas down to the chip level, reducing both weight and power consumption, bringing the infamous internet-of-things one step closer. But could we also transmit power this way? "In dielectric aerials, the medium has high permittivity, meaning that the velocity of the radio wave decreases as it enters the medium," said Dr Dhiraj Sinha, the paper's lead author. "What hasn't been known is how the dielectric medium results in emission of electromagnetic waves. This mystery has puzzled scientists and engineers for more than 60 years." The researchers determined that this phenomenon is due to symmetry breaking of the electric field associated with the electron acceleration. They found that by subjecting the piezoelectric thin films to an asymmetric excitation, the symmetry of the system is similarly broken, resulting in a corresponding symmetry breaking of the electric field and the generation of electromagnetic radiation.
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  •  
    Q1: For the time being, for what purposes are computers mainly used on-board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies on the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates?
  •  
    Satellites use old CPUs also because, with the trend towards higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at the level of (say) 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all... and at what precision / power levels.
  •  
    The interest of this could be a high fault tolerance for some math operations, ... which would in effect simplify the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?). Another issue is the radiation tolerance of the technology ... if the PCMOS chips are more tolerant to radiation they could get space qualified more easily.
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz even though the ADC samples the sensor at a somewhat faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also include the "hardening" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
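To make the precision question in this thread concrete, here is a minimal sketch: it assumes a per-operation relative error level and a toy 1-D propagation problem (both made up for illustration, nothing to do with the actual PCMOS hardware) and shows how imprecise arithmetic accumulates into a position error.

```python
import random

NOISE = 1e-6  # assumed per-operation relative error of the "probabilistic" hardware (made-up value)

def fuzz(x):
    """Emulate an imprecise arithmetic result by adding a small relative error."""
    return x * (1.0 + random.uniform(-NOISE, NOISE))

def propagate(noisy, steps=10_000, dt=1.0):
    """Toy 1-D propagation: constant velocity plus a small constant acceleration."""
    op = fuzz if noisy else (lambda x: x)
    pos, vel, acc = 0.0, 7500.0, 1e-3   # m, m/s, m/s^2 (illustrative orbital-ish numbers)
    for _ in range(steps):
        vel = op(vel + op(acc * dt))
        pos = op(pos + op(vel * dt))
    return pos

exact = propagate(noisy=False)
approx = propagate(noisy=True)
err = abs(approx - exact)
print(f"exact {exact:.0f} m, noisy {approx:.0f} m, error {err:.0f} m ({err/exact:.1e} relative)")
# With 1e-6 per-operation error the accumulated position error over 10^4 steps typically ends up
# at the km level on a ~7.5e7 m path - the kind of accuracy/power trade-off discussed above.
```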
pacome delva

Special relativity passes key test - 2 views

  • Granot and colleagues studied the radiation from a gamma-ray burst – associated with a highly energetic explosion in a distant galaxy – that was spotted by NASA's Fermi Gamma-ray Space Telescope on 10 May this year. They analysed the radiation at different wavelengths to see whether there were any signs that photons with different energies arrived at Fermi's detectors at different times.
  • According to Granot, these results "strongly disfavour" quantum-gravity theories in which the speed of light varies linearly with photon energy, which might include some variations of string theory or loop quantum gravity. "I would not use the term 'rule out'," he says, "as most models do not have exact predictions for the energy scale associated with this violation of Lorentz invariance. However, our observational requirement that such an energy scale would be well above the Planck energy makes such models unnatural."
  •  
    Essentially they made an experiment that does not prove or disprove anything -big deal-... what is the scientific value of "strongly disfavour"??? I also like the sentence "most models do not have exact predictions for the energy scale associated with this violation of Lorentz invariance" ... but if this is true, WHAT IS THE POINT OF THE EXPERIMENT!!!! God, physics is in trouble ....
  •  
    Hum, null-result experiments are not useless!!! There is always the hope of finding "something wrong", which would lead to a great discovery. On the state of theoretical physics (the "no exact predictions" quote), I totally agree that physics is in trouble... That's what happens when physicists don't care anymore about experiments...! All you can do now is draw "nice" graphs with upper bounds on some parameters of an all-tunable weird theory!
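For reference, the "upper bound" being drawn here is on the quantum-gravity energy scale E_QG of a linear photon dispersion; a minimal sketch of the relation (source-frame only, ignoring cosmological-expansion corrections) is below. The Fermi non-detection of an energy-dependent lag is what pushes E_QG above the Planck energy for the linear case.

```latex
% Minimal sketch, source-frame only (no cosmological-expansion corrections):
% a linear Lorentz-violating photon dispersion and the arrival-time lag it predicts.
\[
  v(E) \simeq c\left(1 - \xi\,\frac{E}{E_{\mathrm{QG}}}\right), \qquad \xi = \pm 1 ,
\]
\[
  \Delta t \simeq \xi\,\frac{E_{\mathrm{h}} - E_{\mathrm{l}}}{E_{\mathrm{QG}}}\,\frac{D}{c}
  \quad\Longrightarrow\quad
  E_{\mathrm{QG}} \gtrsim \left(E_{\mathrm{h}} - E_{\mathrm{l}}\right)\frac{D}{c\,\Delta t_{\max}}
  \ \text{ if no lag larger than } \Delta t_{\max} \text{ is observed.}
\]
```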
santecarloni

Sharpening the Nanofocus: Berkeley Lab Researchers Use Nanoantenna to Enhance... - 0 views

  •  
    Any use for the smell project? "We have demonstrated resonant antenna-enhanced single-particle hydrogen sensing in the visible region and presented a fabrication approach to the positioning of a single palladium nanoparticle in the nanofocus of a gold nanoantenna,"
LeopoldS

Three-Dimensional Invisibility Cloak at Optical Wavelengths - 4 views

  •  
    more transformation optics ... 
  •  
    I still believe that it is worth checking the thermal, mechanical and chemical properties of the developed metamaterials. For hyperbolic re-entries, radiation is still the dominating heat-load source and a dominating bandwidth may be identified. A resistive metamaterial could be placed on the nose cap of the entry body in order to reduce the local radiative heat load.
ESA ACT

Does Fungi Feast on Radiation?: Scientific American - 0 views

  •  
    Feed the astronauts with fungi à la Chernobyl
ESA ACT

Ancient bacteria show evidence of DNA repair -- Johnson et al. 104 (36): 14401 -- Proce... - 0 views

  •  
    DNA repair as an answer to the radiation in space. If we cannot protect the astronauts, we could maybe repair them...
Thijs Versloot

Lasers May Solve the Black Hole Information Paradox - 0 views

  •  
    "In an effort to help solve the black hole information paradox that has immersed theoretical physics in an ocean of soul searching for the past two years, two researchers have thrown their hats into the ring with a novel solution: Lasers. Technically, we're not talking about the little flashy devices you use to keep your cat entertained, we're talking about the underlying physics that produces laser light and applying it to information that falls into a black hole. According to the researchers, who published a paper earlier this month to the journal Classical and Quantum Gravity (abstract), the secret to sidestepping the black hole information paradox (and, by extension, the 'firewall' hypothesis that was recently argued against by Stephen Hawking) lies in stimulated emission of radiation (the underlying physics that generates laser light) at the event horizon that is distinct from Hawking radiation, but preserves information as matter falls into a black hole."
jmlloren

Scientists discover how to turn light into matter after 80-year quest - 5 views

  •  
    Theorized 80 years ago, Breit-Wheeler pair production has two photons produce an electron-positron pair (via a virtual electron). It is a relatively simple Feynman diagram, but the problem is/was how to build a high-energy photon-photon collider in practice... The collider experiment that the scientists have proposed involves two key steps. First, the scientists would use an extremely powerful high-intensity laser to speed up electrons to just below the speed of light. They would then fire these electrons into a slab of gold to create a beam of photons a billion times more energetic than visible light. The next stage of the experiment involves a tiny gold can called a hohlraum (German for 'empty room'). Scientists would fire a high-energy laser at the inner surface of this gold can, to create a thermal radiation field, generating light similar to the light emitted by stars. They would then direct the photon beam from the first stage of the experiment through the centre of the can, causing the photons from the two sources to collide and form electrons and positrons. It would then be possible to detect the formation of the electrons and positrons when they exited the can. Now this is a good experiment... :)
  •  
    The solution of thrusting in space.
  •  
    Thrusting in space is solved already. Maybe you wanted to say something different?
  •  
    Thrusting until your fuel runs out is solved; in this way one could produce mass directly from, among other sources, solar/stellar energy. What I like about this experiment is that we already have the technology to do it; many parts have been designed for inertial confinement fusion.
  •  
    I am quite certain that it would be more efficient to use the photons directly for thrust instead of converting them into matter. Also, I am a bit puzzled by the asymmetric layout for photon creation. Typically, colliders use two beams of particles with equal but opposite momentum. Because the total momentum of the two colliding particles is zero, the reaction products are produced more efficiently, as a minimum of collision energy is wasted on accelerating the products. I guess in this case the thermal radiation in the cavity is chosen instead of an opposing gamma-ray beam to increase the photon density and hence the number of collisions (even if the efficiency decreases because of the asymmetry). However, a danger of using a high-temperature cavity might be that thermionic emission creates lots of free electrons within the cavity. This could reduce the positron yield through recombination and would allow the high-energy photons to lose energy through Compton scattering instead of Breit-Wheeler pair production.
  •  
    Well, the main benefit of e-p pair creation might be that one can subsequently accelerate these to higher energies again. I think the photon-photon cross-section is extremely low, such that direct beam-beam interactions are basically not happening (below 1/20... so basically 0 according to quantum probability :P); this way, the central line of the hohlraum actually has a very high photon density and, if timed correctly, maximizes the reaction yield such that it could be measured.
  •  
    I agree about the reason for the hohlraum - but I also keep my reservations about the drawbacks. About pair production as fuel: I am pretty sure that your energy would be used more smartly by using photons (not necessarily high-energy photons) for thrust directly, instead of putting tons of energy into creating a rest mass and then accelerating that. If you look at E² = (pc)² + (m0 c)², then putting energy into the mass term will always reduce your maximum value of p.
  •  
    True, but isn't it E² = (pc)² + (m0c²)², such that for photons E ∝ pc and for mass E ∝ mc²? I agree it will take a lot of energy, but let's assume that won't be the problem, at least. The question therefore is whether the mass flow of the photon rocket (fuel consumed to create photons, e.g. by fission/fusion) is higher or lower than the mass flow for e-p creation. You are probably right that the low e-p cross-section will favour direct use of photons to create low thrust for long periods of time, but with significant power available the ISP might be higher for e-p pair creation.
  •  
    In essence the equation tells you that for photons with zero rest mass m0, all the energy is converted into momentum of the particles. If you want to accelerate e-p pairs then you first spend part of the energy on creating them (~511 keV each) and you can only use the remaining energy to accelerate them. In this case the equation gives you a lower particle momentum, which leads to lower thrust (even when assuming 100% acceleration efficiency). ISP is a tricky concept in this case because there are different definitions which clash in the relativistic context (due to the concept of mass flow). R. Tinder gets to an I_SP = c (the speed of light) for a photon rocket (using the relativistic mass of the photons), which is the maximum possible relativistic I_SP: http://goo.gl/Zz5gyC .
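Two quick numbers behind this thread, sketched with illustrative values (the ~300 eV hohlraum photon energy is an assumed figure, not taken from the paper): the head-on Breit-Wheeler threshold, which roughly reproduces the "a billion times more energetic than visible light" requirement, and the best-case momentum an e-p pair can carry compared to photons of the same total energy.

```python
import math

MEC2_EV = 0.511e6  # electron rest energy [eV]

# Head-on Breit-Wheeler threshold: E1 * E2 * (1 - cos(theta)) >= 2 (m_e c^2)^2, i.e. E1*E2 >= (m_e c^2)^2
E_thermal = 300.0                        # assumed hohlraum X-ray photon energy [eV] (illustrative)
E_beam_min = MEC2_EV**2 / E_thermal      # minimum gamma-beam photon energy [eV]
print(f"gamma-beam threshold ~ {E_beam_min/1e9:.2f} GeV "
      f"(~{E_beam_min/2.3:.1e} times a 2.3 eV visible photon)")

# Momentum per unit energy: photons vs the best case (collinear) e-/e+ pair of the same total energy
def momenta_ev_per_c(total_energy_ev):
    p_photon = total_energy_ev                                         # p = E/c for massless photons
    p_pair = math.sqrt(max(total_energy_ev**2 - (2 * MEC2_EV)**2, 0))  # sqrt(E^2 - (2 m_e c^2)^2)/c
    return p_photon, p_pair

for E in (1.1e6, 2e6, 10e6, 1e9):  # illustrative total energies [eV]
    p_ph, p_pr = momenta_ev_per_c(E)
    print(f"E = {E:.1e} eV: photon p = {p_ph:.2e} eV/c, pair p <= {p_pr:.2e} eV/c")
# The pair never out-carries the photons in momentum for the same energy budget, which is the
# point made above about preferring direct photon thrust over creating rest mass first.
```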
Thijs Versloot

Relativistic rocket: Dream and reality - 3 views

  •  
    An exhaustive overview of all possible advanced rocket concepts, e.g.: "As an example, consider a photon rocket with a launching mass of, say, 1000 ton moving with a constant acceleration a = 0.1 g = 0.98 m/s^2. The flux of photons with E_γ = 0.5 MeV needed to produce this acceleration is ~10^27/s, which corresponds to an efflux power of 10^14 W and a rate of annihilation events ~5×10^26 s^-1 [47]. This annihilation rate in ambiplasma corresponds to a current of ~10^8 A and a linear density N ~ 2×10^18 m^-1, thus any hope for non-relativistic relative velocity of electrons and positrons in ambiplasma is groundless." And also, even if it would work, one of the major issues is going to be heat dispersal: "For example, if the temperature of the radiator is chosen as T = 1500 K, the emitting area should be not less than 1000 m^2 for Pb = 1 GW, not less than 1 km^2 for Pb = 1 TW, and ~100 km^2 for Pb = 100 TW, assuming ε = 0.5 and δ = 0.2. A lower temperature would require an even larger radiator area to maintain the outer temperature of the engine section stable for a given thermal power of the reactor." (These numbers are cross-checked in the sketch at the end of this thread.)
  •  
    We were also discussing a while ago a propulsion system using the relativistic fragments from nuclear fission. That would also produce an extremely high ISP (>100000) with a fairly high thrust. Never really got any traction though.
  •  
    I absolutely do not see the point of a photon rocket. Certainly, the high-energy-releasing nuclear processes (annihilation, fusion, ...) should rather be used to heat up some fluid to the plasma state and accelerate it via a magnetic nozzle. This would surely work as a door-opener to our solar system... and by the way it would minimize the heat disposal problem if regenerative cooling is used.
  •  
    The problem is not achieving a high energy density, which we can already do with nuclear fission; the question is how to confine or harness this power with relatively high efficiency, low waste heat and a not-too-crazy specific mass. I see magnetic confinement as a possibility, yet still decades away, and also an all-or-nothing method as we cannot easily scale it up from a test experiment to a full-scale system. It might be possible to extract power from such a plasma, but definitely well below breakeven, so an additional power supply is needed. The fission fragments circumvent these issues with a more brute-force approach, thereby wasting a lot of energy for sure but probably providing more ISP and thrust in the end.
  •  
    Sure. However, the annihilation-based photon rocket concept unifies almost all relevant drawbacks if we speak about solar-system scales, making itself obsolete... it is just an academic test case.
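To put the quoted figures in perspective, here is a rough cross-check of the photon-rocket thrust, photon flux and radiator area. The formulas are textbook estimates and the reading of δ = 0.2 as the waste-heat fraction is an assumption, so treat this as an order-of-magnitude sketch rather than the paper's calculation.

```python
# Cross-check of the photon-rocket numbers quoted above (simple estimates, not the paper's model).
c = 2.998e8          # speed of light [m/s]
sigma = 5.67e-8      # Stefan-Boltzmann constant [W m^-2 K^-4]
eV = 1.602e-19       # [J]

m0, a = 1.0e6, 0.98            # launch mass [kg] and acceleration [m/s^2] (0.1 g), as quoted
F = m0 * a                     # required thrust [N]
P_beam = F * c                 # a photon exhaust gives F = P/c, so P = F*c
flux = P_beam / (0.5e6 * eV)   # number of 0.5 MeV photons per second
print(f"thrust {F:.1e} N -> beam power {P_beam:.1e} W, photon flux {flux:.1e} /s")
# -> ~3e14 W and ~4e27 /s, matching the quoted ~1e14 W and ~1e27 /s in order of magnitude.

# Radiator area A = delta*P / (eps * sigma * T^4), reading delta = 0.2 as the fraction of
# beam power rejected as waste heat (an interpretation of the quote, not spelled out there).
T, eps, delta = 1500.0, 0.5, 0.2
for P_b in (1e9, 1e12, 1e14):                      # 1 GW, 1 TW, 100 TW beam power
    A = delta * P_b / (eps * sigma * T**4)
    print(f"P_b = {P_b:.0e} W -> radiator area ~ {A:.1e} m^2")
# -> ~1.4e3 m^2, ~1.4 km^2 and ~140 km^2, consistent with the quoted 1000 m^2 / 1 km^2 / 100 km^2.
```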
johannessimon81

Fission reactor + stirling engine tested by NASA - 1 views

  •  
    NASA has tested a prototype of a new design for a small uranium reactor as a power source for deep-space exploration. In principle this should pose a smaller radiation danger during launch and provide more energy per unit mass compared to RTGs.
Thijs Versloot

Minimagnetospheres - towards magnetic deflector shields - 1 views

  •  
    The study showed that even weak magnetic fields can deflect energetic particles, thanks to charge separation and the formation of strong electric fields.
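A rough scale argument for why the electric field does the work (the solar-wind speed and boundary field below are illustrative values, not taken from the study): in a small magnetic bubble the ions are barely magnetised while the electrons are, so charge separation is unavoidable.

```python
def larmor_radius_m(mass_kg, speed_ms, b_tesla, charge_c=1.602e-19):
    """Gyroradius r_L = m v / (|q| B)."""
    return mass_kg * speed_ms / (abs(charge_c) * b_tesla)

v_sw = 4.0e5     # solar-wind bulk speed [m/s] (typical value, assumed)
B = 1.0e-4       # field near the boundary of a small artificial magnetosphere [T] (assumed)

r_p = larmor_radius_m(1.673e-27, v_sw, B)   # proton
r_e = larmor_radius_m(9.109e-31, v_sw, B)   # electron
print(f"proton gyroradius ~ {r_p:.0f} m, electron gyroradius ~ {r_e*100:.1f} cm")
# For a metre-scale magnetic bubble the protons are effectively unmagnetised (tens of metres)
# while the electrons are tightly bound (centimetres): the resulting charge separation sets up
# the electric field that actually deflects the energetic ions - the effect described above.
```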
Daniel Hennes

CubeSat Ambipolar Thruster - 2 views

  •  
    "Cast your name into deep space in style!"
  •  
    Interesting approach, but with 99.9% probability they will fail miserably (at least in terms of their time schedule), simply because the technology is untested. I haven't read the refs (which, by the way, miss important works by E. Ahedo et al. on magnetic-nozzle acceleration by ambipolar effects), but: 1. using water means that you produce oxygen radicals which will erode the chamber walls (the ionisation efficiency is not 100% and experimental tests haven't been performed yet); 2. electronic excitation (and radiation), rotational excitation, vibrational excitation and dissociation are all processes which consume energy and reduce the ionisation efficiency drastically; 3. it is a miniaturised helicon thruster, and the theoretical analysis probably does not consider near-field effects, while far-field models are probably not applicable due to the size of the thruster. I expect some surprises during thruster testing. In any case - good luck!
  •  
    Apparently, there is only one qualification constraint regarding CubeSat propulsion, and it is related to volatile propellants. Since they use water as propellant and are also the owners of the CubeSat, it is actually up to them how they qualify their thruster. That also makes it possible to qualify the thruster within 18 months - since they define what "qualification" means.
fichbio

[1610.08323] Evidence for vacuum birefringence from the first optical polarimetry measu... - 3 views

shared by fichbio on 02 Dec 16
  •  
    Abstract: The "Magnificent Seven" (M7) are a group of radio-quiet Isolated Neutron Stars (INSs) discovered in the soft X-rays through their purely thermal surface emission. Owing to the large inferred magnetic fields ($B\approx 10^{13}$ G), radiation from these sources is expected to be substantially polarised, independently on the mechanism actually responsible for the thermal emission.
Alexander Wittig

Why a Chip That's Bad at Math Can Help Computers Tackle Harder Problems - 1 views

  •  
    DARPA funded the development of a new computer chip that's hardwired to make simple mistakes but can help computers understand the world. Your math teacher lied to you. Sometimes getting your sums wrong is a good thing. So says Joseph Bates, cofounder and CEO of Singular Computing, a company whose computer chips are hardwired to be incapable of performing mathematical calculations correctly.
  •  
    The whole concept boils down to approximate computing, it seems to me. In a presentation I attended once, I raised the question of whether the same kind of philosophy could be used as a radiation-hardness design approach; the short conclusion being that it will surely depend on the intended functionality.
jcunha

Vacuum tubes are back - in nano form - 0 views

  •  
    Although vacuum tubes were the basic components of early electronic devices, by the 1970s they were almost entirely replaced by semiconductor transistors. They are now back in nano form as "nanoscale vacuum channel transistors" that combine the best of vacuum tubes and modern semiconductors in a single device. This old technology with a new twist could be useful for space applications due to a broader operational temperature range and better radiation resilience - the authors are with NASA Ames.
Lionel Jacques

Exotic explanation for Pioneer anomaly ruled out - 1 views

  •  
    "Given that for both craft electricity is supplied by a radioisotope thermoelectric generator (RTGs) powered by the heat given off by the radioactive decay of plutonium - an energy source that decays exponentially with time - Turyshev and others suggested that the extra acceleration could be caused by thermal radiation being emitted from the craft in a preferred direction. "
santecarloni

Space Station Spin-Off Could Protect Mars-Bound Astronauts From Radiation - Technology ... - 0 views

  •  
    Superconducting technology developed for the International Space Station could protect humans on the way to the asteroids or Mars. But will it be worth the cost?
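For scale, a rough estimate (illustrative proton energy and bending radius, not taken from the article) of the field needed to sweep galactic-cosmic-ray protons aside within a few metres:

```python
import math

# Magnetic rigidity of a ~1 GeV (kinetic) galactic cosmic-ray proton (illustrative energy choice)
m_p_c2 = 938.3e6            # proton rest energy [eV]
T_kin = 1.0e9               # kinetic energy [eV]
pc = math.sqrt((T_kin + m_p_c2)**2 - m_p_c2**2)   # momentum * c [eV]

c = 2.998e8
q = 1.602e-19
r_bend = 5.0                # desired bending radius, roughly a habitat scale [m] (assumed)
p_si = pc * q / c           # momentum in SI units [kg m/s]
B = p_si / (q * r_bend)     # field needed for gyroradius r = p / (qB)
print(f"p ~ {pc/1e9:.2f} GeV/c -> B ~ {B:.1f} T to turn the proton within ~{r_bend:.0f} m")
# Tesla-level fields sustained over metres are impractical for resistive coils on a spacecraft
# power budget, which is why the superconducting magnet technology mentioned above is attractive
# compared to resistive coils or massive passive shielding.
```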
LeopoldS

Transformation of concentrated sunlight into laser radiation on small parabolic concent... - 0 views

  •  
    Duncan and Lionel, please check this one