
Advanced Concepts Team: group items matching "stack" in title, tags, annotations or URL

LeopoldS

Graphite + water = the future of energy storage - Monash University - 6 views

  •  
    any idea how this works - who wants to have a closer look at it?
  • ...3 more comments...
  •  
    Water is used for keeping the graphene stacks separate. Without water or some other separation method the graphene stacks would just stick together and graphene would lose its nice properties (like a huge surface area). So water has nothing to do with the energy itself; it is just the material that keeps the graphene stacks at a distance. The result is a gel. The energy still needs to be stored in the gel.
  •  
    and the different graphene layers act as anodes and cathodes??
  •  
    Layer orientation in a gel is random. In addition, cathodes and anodes are about charge separation. The graphene layers are (as far as I understand) supposed to provide huge surfaces to which something, maybe a charge, can be attached. So do we need ions and electrons? Probably not. Probably just electrons, which can travel easily through the gel. I guess the whole gel (and all layers inside) would be negatively charged, making the gel blob a fluid cathode. But again, it's just a guess.
  •  
    Wouldn't it be worth having a closer look?
  •  
    it's still not clear to me how to get electricity in and out of this thing?
Dario Izzo

Stacked Approximated Regression Machine: A Simple Deep Learning Approach - 5 views

  •  
    from one of the reddit threads discussing this: "bit fishy, crazy if real". "Incredible claims: - train using only about 10% of ImageNet-12, i.e. around 120k images (i.e. they use 6k images per arm) - get the same or better accuracy as the equivalent VGG net - training is not via backprop but a much simpler PCA + sparsity regime (see section 4.1); shouldn't take more than 10 hours, probably just on CPU" (see the sketch after this thread for the rough idea)
  •  
    clicking the link now shows the manuscript was withdrawn :))
  •  
    This "one-shot learning" paper by Googe Deepmind also claims to be able to learn from very few training data. Thought it might be interesting for you guys: https://arxiv.org/pdf/1605.06065v1.pdf
johannessimon81

Plastic Clothing Inspired By Kitchen Wrap Releases Body's Infrared Radiation To Cool The Skin - 0 views

  •  
    Any unintended consequences..? https://i.stack.imgur.com/A0UZn.jpg
LeopoldS

Fastest Ship in the Universe: How Sci-Fi Ships Stack Up - 2 views

  •  
    for the geeks ... and Anna&Jai
  •  
    Gotta love that improbability drive
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, for what purposes are computers mainly used on-board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which doesn't exist yet, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer may not be obvious, a potential study could address this very issue. For instance one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is and what can be achieved using it (like: the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip, or at least a sub-circuit, or still only logic gates?
  •  
    Satellites use old CPUs also because, with the trend towards higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation included), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at a level of (say) 10^-6 (see the back-of-envelope sketch after this thread). All in all you are right: a first study should assess which applications this would be useful for at all... and at what precision / power levels.
  •  
    The interesting part could be a high fault tolerance for some math operations, which would have the effect of simplifying the coders' job! I don't think this is a good idea as far as CPU power consumption is concerned (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some ESA platforms) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings on a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow some real-time navigation at higher speed (e.g. the velocity of a terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?). Another issue is the radiation tolerance of the technology ... if the PCMOS chips are more tolerant to radiation they could be space-qualified more easily.
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz, even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also mean the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does it need an entirely new software/hardware stack? If that were the case, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in the Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power consumption, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if only because of economic/industrial inertia?
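    Since the thread keeps returning to how much imprecision navigation could tolerate, here is a back-of-envelope sketch (all assumptions ours, not from the article): model "probabilistic" arithmetic as a small random relative error on every stored result and watch it propagate through a naive orbit integration.

        import numpy as np

        REL_ERR = 1e-6  # assumed per-operation relative error of a PCMOS-like unit
        rng = np.random.default_rng(42)

        def noisy(x):
            # each stored result is off by a small random relative amount
            return x * (1.0 + REL_ERR * rng.standard_normal(np.shape(x)))

        mu = 3.986e14                             # Earth's mu, m^3/s^2
        r = np.array([7.0e6, 0.0])                # ~600 km altitude, m
        v = np.array([0.0, np.sqrt(mu / 7.0e6)])  # circular velocity, m/s
        r_ref, v_ref = r.copy(), v.copy()
        dt = 1.0                                  # s

        for _ in range(5900):                     # roughly one orbit
            a = -mu * r / np.linalg.norm(r)**3
            v = noisy(v + a * dt)                 # noisy Euler step
            r = noisy(r + v * dt)
            a_ref = -mu * r_ref / np.linalg.norm(r_ref)**3
            v_ref = v_ref + a_ref * dt            # exact reference step
            r_ref = r_ref + v_ref * dt

        print("position error after one orbit: %.0f m" % np.linalg.norm(r - r_ref))

    In this crude model the drift after one orbit comes out far below the +/-10 km floated above; whether real PCMOS errors behave anything like this is exactly what such a study would have to establish.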
LeopoldS

The iPad in Your Hand: As Fast as a Supercomputer of Yore - NYTimes.com - 3 views

  •  
    one for the IT freaks ... - what will it be in 10 years then?
  •  
    PaGMO on a stack of iPads? Sounds good...
Thijs Versloot

Engineering three-dimensional hybrid supercapacitors for high-performance integrated energy storage - 3 views

  •  
    Stacking laser-printed supercapacitors (no clean room required, btw) has led to about 1100 F/g and thus about 20-40 Wh/L (see the rough numbers check below). For supercapacitors that's pretty damn good. For reference, Li-ion recently reached 650 Wh/L. The gap is closing, although for supercaps of this type the theoretical maximum is 1400 F/g.
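    A quick numbers check on how specific capacitance turns into energy density via E = 1/2 * C * V^2; the cell voltage, electrode density and device-level overhead factors below are our assumptions, not from the paper.

        # Rough back-of-envelope; all device-level factors are assumed.
        C   = 1100.0  # F/g, material-level specific capacitance (reported)
        V   = 1.0     # V, assumed aqueous cell voltage
        rho = 1.0     # kg/L, assumed electrode density

        E_material = 0.5 * C * V**2 / 3600.0  # Wh per gram of electrode
        # Symmetric two-electrode cell: capacitance per total electrode
        # mass drops ~4x; packaging assumed to cost another ~50%:
        E_device = E_material / 4.0 * 0.5

        print("material level: %.0f Wh/kg" % (E_material * 1000))
        print("device level: ~%.0f Wh/L at %.0f kg/L" % (E_device * 1000 * rho, rho))

    With those assumed factors the reported 1100 F/g lands right around the quoted 20-40 Wh/L, which is a reassuring sanity check on the numbers.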
marliesarnhof

Attention PGP Users: New Vulnerabilities Require You To Take Action Now - 2 views

  •  
    no cutting-edge space-related science, but important anyway
  •  
    The EFF communiqué is actually quite inaccurate, which is disappointing from the EFF, though partly it is due to the communication from the researchers who "discovered" the attack. PGP itself is not broken; rather, some implementations in some email clients are (notably Enigmail, though it was patched several months ago). See https://protonmail.com/blog/pgp-vulnerability-efail/ On the other hand, if you are very keen on security, there is an XSS attack reported on Signal, so… https://thehackernews.com/2018/05/signal-messenger-code-injection.html The *good* recommendation here is actually to keep your software stack up to date (surprising, no?) and keep encrypting your emails.