
Advanced Concepts Team / Group items tagged: matter


santecarloni

Has 'new physics' been found at CERN? - physicsworld.com - 1 views

  •  
    Physicists working on the LHCb experiment at the CERN particle-physics lab have released the best evidence yet for direct charge-parity (CP) violation in charm mesons....While more data must be analysed to confirm the result, the work could point to new physics beyond the Standard Model and help physicists understand why there is more matter than antimatter in the universe.
  •  
    A lot of new physics this year ...
dejanpetkow

[1202.5708] The Alcubierre Warp Drive: On the Matter of Matter - 1 views

  •  
    News about the warp drive based on the original Alcubierre metric but with a modified shape function. The focus of the research was on the interaction between the warp bubble and cosmic particles. Result: people on board need shielding. People at the journey's destination might get roasted (by gamma rays, if you want to know).
andreiaries

Upping the Anti: CERN Physicists Trap Antimatter Atoms for the First Time: Scientific A... - 0 views

  •  
    Not really the first time, but they seem to be much closer to being able to study them. Apparently, they had 38 atoms trapped for milliseconds. Now it's time to prove it behaves just like matter.
santecarloni

X Particle Explains Dark Matter and Antimatter at the Same Time | Wired Science | Wired... - 1 views

  •  
    Interesting... however, to me it looks like they have only shifted the problem... instead of looking for WIMPs, now we look for X and Y and Phi...
Tobias Seidl

6 Ways to Spot False Gurus - 2 views

  •  
    "things are simple if you don't have to check facts" --> the examples are numerous, e.g. mind and matter, cradle-to-cradle
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, for what purposes are computers mainly used on-board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not there yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer to this may not be obvious, a potential study could address this very issue. For instance one can consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies on the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like: the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates?
  •  
    Satellites use old CPUs also because, with the trend of going for higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation also), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at a level of (say) 10^-6. All in all you are right: a first study should assess for which applications this would be useful at all... and at what precision / power levels.
  •  
    The interest of this could be high fault tolerance for some math operations ... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings, considering a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?) Another issue is the radiation tolerance of the technology ... if the PCMOS chips are more tolerant to radiation they could be space qualified more easily.....
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does this technology need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power-consumption levels, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
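
    To make the precision question above a bit more concrete, here is a minimal, purely illustrative Python sketch (an editorial addition, not tied to any actual PCMOS hardware): it emulates a "probabilistic" adder by randomly flipping a few low-order mantissa bits of each result, and measures the relative error this introduces into a plain dot product.

        import random
        import struct

        def noisy_add(a, b, flip_prob=0.01, low_bits=8):
            """Add two doubles, then randomly flip some of the lowest
            mantissa bits to emulate an unreliable (probabilistic) adder."""
            result = a + b
            bits = struct.unpack("<Q", struct.pack("<d", result))[0]
            for i in range(low_bits):
                if random.random() < flip_prob:
                    bits ^= 1 << i
            return struct.unpack("<d", struct.pack("<Q", bits))[0]

        def dot(xs, ys, add):
            """Dot product accumulated with a user-supplied adder."""
            acc = 0.0
            for x, y in zip(xs, ys):
                acc = add(acc, x * y)
            return acc

        random.seed(0)
        xs = [random.uniform(-1.0, 1.0) for _ in range(1000)]
        ys = [random.uniform(-1.0, 1.0) for _ in range(1000)]

        exact = dot(xs, ys, lambda a, b: a + b)
        noisy = dot(xs, ys, noisy_add)
        print("relative error:", abs(noisy - exact) / abs(exact))

    As long as the unreliability stays in the low-order mantissa bits, the relative error remains far below the ~10^-6 level mentioned above; the open question is how quickly that degrades once higher-order bits (or the exponent) become unreliable, and whether the claimed power savings survive the extra margin needed.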
Luís F. Simões

How to Grow a Mind: Statistics, Structure, and Abstraction - 4 views

  •  
    A nice review on the wonders of Hierarchical Bayesian models. It cites a paper on probabilistic programming languages that might be relevant given our recent discussions. At Hippo's farewell lunch there was a discussion on how kids are able to learn something as complex as language from a limited amount of observations, while Machine Learning algorithms, no matter how many millions of instances you throw at them, don't learn beyond some point. If that subject interests you, you might like this paper.
  •  
    Had an opportunity to listen to JBT and TLG during one summer school.. if they're half as good in writing as they are in speaking, should be a decent read...
LeopoldS

Unbound or distant planetary mass population detected by gravitational microlensing : N... - 1 views

  •  
    dark matter :-) ?? could there be loads of them?
santecarloni

Sharpening the Nanofocus: Berkeley Lab Researchers Use Nanoantenna to Enhance... - 0 views

  •  
    Any use for the smell project? "We have demonstrated resonant antenna-enhanced single-particle hydrogen sensing in the visible region and presented a fabrication approach to the positioning of a single palladium nanoparticle in the nanofocus of a gold nanoantenna,"
nikolas smyrlakis

BBC NEWS | Health | Juggling increases brain power - 1 views

  •  
    Oxford University scientists find that a complex skill such as juggling causes changes in the white matter of the brain. - Let's start juggling for the ideastorm!
  •  
    cool. I can do some lessons with three and four balls... tomorrow after lunch !
pacome delva

Phantom menace to dark matter theory - space - 08 July 2009 - New Scientist - 0 views

  • If MOND exists, it will appear as if there is an anomalous, "phantom" mass in that region, exerting a gravitational force on the bodies in our solar system.
  • According to Milgrom, this force should cause the orbits of the planets to precess - that is, their elliptical orbits around the sun should slowly change their orientation, over time tracing out a pattern like the petals of a flower.
ESA ACT

New Statesman - Scientist of the soft stuff - 1 views

  •  
    for those interested in soft matter, which may be a future topic of the ACT ....
ESA ACT

Claytronics - Carnegie Mellon University - 0 views

  •  
    Programmable matter - another nice title for an ACT project...
Ma Ru

Dark Matter or Black Hole Propulsion? - 1 views

  •  
    Anyone out there still doing propulsion stuff? Two more papers just waiting to get busted... http://arxiv.org/abs/0908.1429v1 http://arxiv.org/abs/0908.1803
  • ...5 more comments...
  •  
    What an awful bunch of complete nonsense!!! But I don't think anybody wants to hear MY opinion on this...
  •  
    wow, is this serious at all...!?
  •  
    Are you joking?? The BH drive proposes a BH with a lifetime of about a year, just 10^7 tons, peanuts!! Then you have to produce it, better not on Earth, so you do this in space, with a laser that produces an equivalent of 10^9 tons highly focussed, even more peanuts!! Reasonable losses in the production process (probably 99.999%) are not yet taken into account. Engineering problems... :-) The DM drive is even better, they want to collect DM and compress it in a propulsion chamber. Very easy to collect and compress a gas of particles that traverse the Earth without any interaction. Perhaps if the walls of the chamber are made of artificial BHs?? Who knows??
  •  
    WRONG!!! we are all just WAITING for your opinion on this ....!!!
  •  
    Well, yes, my remark was ironic... I'm surprised they did a magazine piece on these concepts...! But the press is always waiting for the sensational. They do not even wait for the work to be peer-reviewed now to make an article on it! This is one of the bad sides of arXiv in my opinion. It's like a journalist who makes an article with a copy-paste from Wikipedia! Anyway, this is of course complete bullsh..., and I would have laughed if I had read this in a sci-fi book... but in a "serious" article I'm crying... For the DM I do not agree with your remark, Luzi. It's not dark energy they want to use. The DM is baryonic; it's dark just because it's cold, so we don't see it by the usual means. If you believe in the standard model of cosmology, then the DM should be somewhere around the galaxies. But it's of course not uniformly distributed, so a DM engine would work (if at all...) only in the periphery of galaxies. It's already impossible to get there...
  •  
    One reply to Pacome, though the discussion already exceeds by far the relevance of the topic. Baryonic DM is strictly limited by cosmology, if one believes in these models, of course. Anyway, even though most DM is cold, we are constantly bombarded by some DM particles that come together with cosmic radiation, the solar wind, etc. If DM easily interacted with normal matter, we would have found it long ago. In the paper they consider DM as neutralinos, which are neither baryonic nor strongly or electromagnetically interacting.
  •  
    Well then I agree. How the fu.. do they want to collect them!!!
jmlloren

Scientists discover how to turn light into matter after 80-year quest - 5 views

  •  
    Theorized 80 years ago was Breit-Wheeler pair production, in which two photons result in an electron-positron pair (via a virtual electron). It is a relatively simple Feynman diagram, but the problem is/was how to produce in practice a high-energy photon-photon collider... The collider experiment that the scientists have proposed involves two key steps. First, the scientists would use an extremely powerful high-intensity laser to speed up electrons to just below the speed of light. They would then fire these electrons into a slab of gold to create a beam of photons a billion times more energetic than visible light. The next stage of the experiment involves a tiny gold can called a hohlraum (German for 'empty room'). Scientists would fire a high-energy laser at the inner surface of this gold can to create a thermal radiation field, generating light similar to the light emitted by stars. They would then direct the photon beam from the first stage of the experiment through the centre of the can, causing the photons from the two sources to collide and form electrons and positrons. It would then be possible to detect the formation of the electrons and positrons when they exited the can. Now this is a good experiment... :)
  • ...6 more comments...
  •  
    The solution of thrusting in space.
  •  
    Thrusting in space is solved already. Maybe you wanted to say something different?
  •  
    Thrusting until your fuel runs out is solved; this way one could produce mass directly from, among other sources, solar/stellar energy. What I like about this experiment is that we already have the technology to do it; many parts have been designed for inertial confinement fusion.
  •  
    I am quite certain that it would be more efficient to use the photons directly for thrust instead of converting them into matter. Also, I am a bit puzzled by the asymmetric layout for photon creation. Typically, colliders use two beams of particles with equal but opposite momentum. Because the total momentum of the two colliding particles is zero, the reaction products are produced more efficiently, as a minimum of collision energy is wasted on accelerating the products. I guess in this case the thermal radiation in the cavity is chosen instead of an opposing gamma-ray beam to increase the photon density and increase the number of collisions (even if the efficiency decreases because of the asymmetry). However, a danger of using a high-temperature cavity might be that a lot of thermionic emission creates lots of free electrons within the cavity. This could reduce the positron yield through recombination and would allow the high-energy photons to lose energy through Compton scattering instead of Breit-Wheeler pair production.
  •  
    Well, the main benefit of e-p pair creation might be that one can subsequently accelerate these to higher energies again. I think the photon-photon cross-section is extremely low, such that direct beam-beam interactions are basically not happening (below 1/20... so basically 0 according to quantum probability :P); this way, the central line of the hohlraum actually has a very high photon density and, if timed correctly, maximizes the reaction yield such that it could be measured.
  •  
    I agree about the reason for the hohlraum - but I also keep my reservations about the drawbacks. About pair production as fuel: I'm pretty sure that your energy would be used more smartly by using photons (not necessarily high-energy photons) for thrust directly, instead of putting tons of energy into creating a rest mass and then accelerating that. If you look at E^2 = (pc)^2 + (m0 c^2)^2, then putting energy into the mass term will always reduce your maximum value of p.
  •  
    True: for photons E = pc, while for a massive particle part of the energy sits in the m0 c^2 rest term. I agree it will take a lot of energy, but this assumes that that won't be the problem, at least. The question therefore is whether the mass flow of the photon rocket (fuel consumed to create photons, e.g. fission/fusion) is higher or lower than the mass flow for e-p creation. You are probably right that the low e-p cross-section will favour direct use of photons to create low thrust for long periods of time, but with significant power available the ISP might be higher for e-p pair creation.
  •  
    In essence the equation tells you that for photons, with zero rest mass m0, all the energy will be converted into momentum of the particles. If you want to accelerate e-p pairs, then you first spend part of the energy on creating them (~511 keV each) and you can only use the remaining energy to accelerate them. In this case the equation gives you a lower particle momentum, which leads to lower thrust (even when assuming 100% acceleration efficiency). I_SP is a tricky concept in this case because there are different definitions which clash in the relativistic context (due to the concept of mass flow). R. Tinder gets to an I_SP = c (the speed of light) for a photon rocket (using the relativistic mass of the photons), which is the maximum possible relativistic I_SP: http://goo.gl/Zz5gyC .
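
    As a back-of-the-envelope illustration of the momentum argument above (an editorial sketch, not taken from the linked reference): for a fixed energy budget E per "shot", photons deliver all of it as momentum, whereas an electron-positron pair first pays the rest-energy cost before anything is left over for acceleration,

        p_\gamma = \frac{E}{c}, \qquad
        p_{e^+e^-} = \frac{\sqrt{E^2 - (2 m_e c^2)^2}}{c} < \frac{E}{c},
        \qquad 2 m_e c^2 \approx 1.022\ \mathrm{MeV}.

    So per joule invested, the photon exhaust always carries more momentum, which is exactly the point made in the comment above.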
LeopoldS

Decreasing human body temperature in the United States since the industrial revolution ... - 1 views

shared by LeopoldS on 11 Jan 20
  •  
    Nice paper, and linked to so many other factors... curious: "The question of whether mean body temperature is changing over time is not merely a matter of idle curiosity. Human body temperature is a crude surrogate for basal metabolic rate which, in turn, has been linked to both longevity (higher metabolic rate, shorter life span) and body size (lower metabolism, greater body mass). We speculated that the differences observed in temperature between the 19th century and today are real and that the change over time provides important physiologic clues to alterations in human health and longevity since the Industrial Revolution."
johannessimon81

Single cell slime mold uses external memory for navigation - 3 views

  •  
    Yes, I agree, slime belongs to soft matter.