Advanced Concepts Team: Group items matching "chip" in title, tags, annotations or URL

santecarloni

Intel Reveals Neuromorphic Chip Design  - Technology Review - 1 views

  •  
    Intel's goal is to build chips that work more like the human brain. Now its engineers think they know how
johannessimon81

IBM Neuromorphic chip hits DARPA milestone and has been used to implement deep learning - 2 views

  •  
    "IBM delivered on the DARPA SyNAPSE project with a one million neuron brain-inspired processor. The chip consumes merely 70 milliwatts, and is capable of 46 billion synaptic operations per second, per watt-literally a synaptic supercomputer in your palm." --- No memristors..., yet.: https://www.technologyreview.com/s/537211/a-better-way-to-build-brain-inspired-chips/
Alexander Wittig

Why a Chip That's Bad at Math Can Help Computers Tackle Harder Problems - 1 views

  •  
    DARPA funded the development of a new computer chip that's hardwired to make simple mistakes but can help computers understand the world. Your math teacher lied to you. Sometimes getting your sums wrong is a good thing. So says Joseph Bates, cofounder and CEO of Singular Computing, a company whose computer chips are hardwired to be incapable of performing mathematical calculations correctly.
  •  
    The whole concept boils down to approximate computing, it seems to me. In a presentation I attended, I once asked whether the same kind of philosophy could be used as a radiation-hardness design approach; the short conclusion was that it will surely depend on the intended functionality.
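A minimal sketch of the approximate-computing idea discussed above (my own illustration, not from the article): deliberately dropping low-order mantissa bits in every addition is a software stand-in for what an inexact adder would do in silicon, and the accumulated error stays small relative to the result.

```python
import random
import struct

def truncate(x, drop_bits=12):
    """Zero the lowest `drop_bits` bits of a double's mantissa,
    emulating a cheap adder that only computes approximately."""
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    bits &= ~((1 << drop_bits) - 1)
    return struct.unpack("<d", struct.pack("<Q", bits))[0]

random.seed(0)
data = [random.random() for _ in range(100_000)]

exact = sum(data)
approx = 0.0
for v in data:
    approx = truncate(approx + v)   # every addition is slightly "wrong"

print(f"exact  = {exact:.6f}")
print(f"approx = {approx:.6f}")
print(f"relative error = {abs(approx - exact) / exact:.2e}")
```

Here the error stays tiny; an inexact hardware adder would push the same trade-off much further in exchange for energy savings.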
jcunha

Introducing A Brain-inspired Computer [IBM TrueNorth] - 0 views

  •  
    Built in silicon (Samsung's 28 nm process), its capacity is one million neurons and 256 million synapses. With 5.4 billion transistors it is the largest IBM chip in these terms. All this said, it consumes less than 100 mW!! "These systems can efficiently process high-dimensional, noisy sensory data in real time, while consuming orders of magnitude less power than conventional computer architectures." IBM is working with iniLabs to integrate the DVS retinal camera with these chips, i.e. real-time neuromorphic image processing. In what seems to be a very successful project, hugely funded by DARPA, "Our sights are now set high on the ambitious goal of integrating 4,096 chips in a single rack with 4 billion neurons and 1 trillion synapses while consuming ~4kW of power."
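A quick back-of-the-envelope check of the scaling goal quoted above (my own arithmetic; the ~70 mW per-chip figure is the one quoted in the SyNAPSE post earlier): 4,096 chips do add up to roughly 4 billion neurons and about a trillion synapses, while the chips themselves draw only a few hundred watts, so the quoted ~4 kW rack figure presumably includes memory, I/O and cooling overhead.

```python
# Back-of-the-envelope check of the quoted 4,096-chip rack figures.
# Assumption: ~70 mW per chip, the figure quoted for TrueNorth above.
chips = 4096
neurons_per_chip = 1_000_000
synapses_per_chip = 256_000_000
power_per_chip_w = 0.07

print(f"neurons : {chips * neurons_per_chip:.2e}")   # ~4.1e9, i.e. ~4 billion
print(f"synapses: {chips * synapses_per_chip:.2e}")  # ~1.0e12, i.e. ~1 trillion
print(f"chip power alone: {chips * power_per_chip_w:.0f} W (rack quoted at ~4 kW)")
```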
Dario Izzo

Probabilistic Logic Allows Computer Chip to Run Faster - 3 views

  •  
    Francesco pointed out this research one year ago; we dropped it as no one was really considering it ... but in space low CPU power consumption is crucial!! Maybe we should look back into this?
  • ...6 more comments...
  •  
    Q1: For the time being, what are computers mainly used for on board?
  •  
    for navigation, control, data handling and so on .... why?
  •  
    Well, because the point is to identify an application in which such computers would do the job... That could be either an existing application which can be done sufficiently well by such computers, or a completely new application which is not already there, for instance because of power consumption constraints... Q2 would then be: for which of these purposes is strict determinism of the results not crucial? As the answer may not be obvious, a potential study could address this very issue. For instance one could consider on-board navigation systems with limited accuracy... I may be talking bullshit now, but perhaps in some applications it doesn't matter whether a satellite flies the exact route or +/-10 km to the left/right? ...and so on for the other systems. Another thing is understanding what exactly this probabilistic computing is, and what can be achieved using it (like the result is probabilistic but falls within a defined range of precision), etc. Did they build a complete chip or at least a sub-circuit, or still only logic gates?
  •  
    Satellites use old CPUs also because, with the trend towards higher power, modern CPUs are not very convenient from a system design point of view (TBC)... as a consequence the constraints put on on-board algorithms can be demanding. I agree with you that double precision might just not be necessary for a number of applications (navigation also), but I guess we are not talking about 10 km as an absolute value, rather about a relative error that can be tolerated at a level of (say) 10^-6 (see the numeric sketch after this thread). All in all you are right, a first study should assess what applications this would be useful for at all... and at what precision / power levels.
  •  
    The interest of this could be a high fault tolerance for some math operations, ... which would have the effect of simplifying the job of coders! I don't think this is a good idea regarding power consumption for the CPU (strictly speaking). The reason we use old chips is just a matter of qualification for space, not power. For instance a LEON SPARC (e.g. used on some platforms for ESA) consumes something like 5 mW/MHz, so it is definitely not where an engineer will look for power savings considering a usual 10-15 kW spacecraft.
  •  
    What about speed then? Seven times faster could allow real-time navigation at higher speed (e.g. the velocity of terminal guidance for an asteroid impactor is limited to 10 km/s ... would a higher velocity be possible with faster processors?). Another issue is the radiation tolerance of the technology ... if the PCMOS chips are more tolerant to radiation they could get space qualified more easily.
  •  
    I don't remember what the speed factor is, but I guess this might do it! Although, I remember when using an IMU that you cannot get the data above a given rate (e.g. 20 Hz even though the ADC samples the sensor at a slightly faster rate), so somehow it is not just the CPU that must be re-thought. When I say qualification I also imply the "hardened" phase.
  •  
    I don't know if the (promised) one-order-of-magnitude improvements in power efficiency and performance are enough to justify looking into this. For one, it is not clear to me what embracing this technology would mean from an engineering point of view: does it need an entirely new software/hardware stack? If so, in my opinion any potential benefit would be nullified. Also, is it realistic to build an entire self-sufficient chip on this technology? While the precision of floating-point computations may be degraded and still be useful, how does all this play with integer arithmetic? Keep in mind that, e.g., in Linux kernel code floating-point calculations are not even allowed/available... It is probably possible to integrate an "accelerated" low-accuracy floating-point unit together with a traditional CPU, but then again you have more implementation overhead creeping in. Finally, recent processors by Intel (e.g., the Atom) and especially ARM boast really low power consumption, while at the same time offering performance-boosting features such as multi-core and vectorization capabilities. Don't such efforts have more potential, if anything because of economic/industrial inertia?
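To put a number on the precision question raised in this thread (a crude sketch of my own, not from the article): propagating even a simple two-body orbit in single instead of double precision shows how rounding error accumulates over time, which is the kind of quantity one would need before deciding whether "inexact" arithmetic is tolerable for on-board navigation.

```python
import numpy as np

def propagate(dtype, steps=20_000):
    """Crude two-body propagation (normalised units, mu = 1) with explicit
    Euler, only to expose the effect of the floating-point type used."""
    mu = dtype(1.0)
    dt = dtype(1e-3)
    r = np.array([1.0, 0.0], dtype=dtype)   # circular orbit of radius 1
    v = np.array([0.0, 1.0], dtype=dtype)
    for _ in range(steps):
        a = -mu * r / np.linalg.norm(r) ** 3
        v = v + a * dt
        r = r + v * dt
    return r

r64 = propagate(np.float64)
r32 = propagate(np.float32)
print("final position difference (float32 vs float64):",
      np.linalg.norm(r64 - r32.astype(np.float64)))
```

Whether such a divergence matters obviously depends on the application, which is exactly the kind of study suggested above.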
jcunha

Alibaba is making its own neural network chip - 3 views

  •  
    The race for AI chips intensifies.
  •  
    https://www.eetasia.com/news/article/18041006-ai-chip-market-competition-intensifies
Luís F. Simões

HP Dreams of Internet Powered by Phone Chips (And Cow Chips) | Wired.com - 0 views

  • For Hewlett Packard Fellow Chandrakant Patel, there’s a “symbiotic relationship between IT and manure.”
  • Patel is an original thinker. He’s part of a group at HP Labs that has made energy an obsession. Four months ago, Patel buttonholed former Federal Reserve Chairman Alan Greenspan at the Aspen Ideas Festival to sell him on the idea that the joule should be the world’s global currency.
  • Data centers produce a lot of heat, but to energy connoisseurs it’s not really high quality heat. It can’t boil water or power a turbine. But one thing it can do is warm up poop. And that’s how you produce methane gas. And that’s what powers Patel’s data center. See? A symbiotic relationship.
  • ...1 more annotation...
  • Financial house Cantor Fitzgerald is interested in Project Moonshot because it thinks HP’s servers may have just what it takes to help the company’s traders understand long-term market trends. Director of High-Frequency Trading Niall Dalton says that while the company’s flagship trading platform still needs the quick number-crunching power that comes with the powerhog chips, these low-power Project Moonshot systems could be great for analyzing lots and lots of data — taking market data from the past three years, for example, and running a simulation.
  •  
    of relevance to this discussion: Koomey's Law, a Moore's Law equivalent for computing's energy efficiency http://www.economist.com/node/21531350 http://hardware.slashdot.org/story/11/09/13/2148202/whither-moores-law-introducing-koomeys-law
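Koomey's Law mentioned above is usually stated as computations per joule doubling roughly every 1.6 years; a one-liner makes the implied scaling explicit (the doubling period is the commonly cited figure, not something taken from the linked articles).

```python
# Koomey's Law: computations per joule roughly double every ~1.6 years
# (commonly cited figure; an assumption for this illustration).
def koomey_gain(years, doubling_period=1.6):
    """Factor by which energy efficiency improves after `years`."""
    return 2 ** (years / doubling_period)

print(f"efficiency gain over a decade: ~{koomey_gain(10):.0f}x")
```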
Wiktor Piotrowski

Revolutionary Silicon Chip Brings Us Closer To Light-Speed Computer Technology | IFLScience - 1 views

  •  
    They created an optical beam splitter 50 times smaller than before, which is certainly impressive in itself. The question remains: how much does this actually bring us closer to optical chips? 1%? 20%? 50%?
Ma Ru

Neural Network simulation chip from IBM - 1 views

  •  
    There you go, the latest-and-greatest chip is there. Now the only remaining tiny detail - program it.
  •  
    Let's buy it first and we'll figure the rest out later :P
johannessimon81

Creating Indestructible Self-Healing Chips - 0 views

  •  
    Chips are able to compensate for very large damage and continue working at high performance
LeopoldS

Computing experts unveil superefficient 'inexact' chip - 2 views

  •  
    Directly related to our Ariadna study; Guido, have a look please.
Luke O'Connor

Scientists at MIT replicate brain activity with chip - 2 views

  •  
    A new chip that simulates the behaviour of a synapse. 1 down, a few hundred trillion to go...
Thijs Versloot

Electromagnetism generated by symmetry breaking in dielectrics - 0 views

  •  
    Using dielectric materials as efficient EM radiators and receivers can scale these antennas down to the chip level, reducing both weight and power consumption. The infamous internet-of-things is one step closer. But could we also transmit power this way?? "In dielectric aerials, the medium has high permittivity, meaning that the velocity of the radio wave decreases as it enters the medium," said Dr Dhiraj Sinha, the paper's lead author. "What hasn't been known is how the dielectric medium results in emission of electromagnetic waves. This mystery has puzzled scientists and engineers for more than 60 years." The researchers determined that this phenomenon is due to symmetry breaking of the electric field associated with the electron acceleration. They found that by subjecting the piezoelectric thin films to an asymmetric excitation, the symmetry of the system is similarly broken, resulting in a corresponding symmetry breaking of the electric field, and the generation of electromagnetic radiation.
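The quoted remark about permittivity is essentially a scaling statement; a tiny numeric illustration (my own numbers, not from the paper) shows why a high-permittivity dielectric lets a resonant radiator shrink by a factor of sqrt(eps_r).

```python
# Wave velocity in a dielectric is v = c / sqrt(eps_r), so a resonant
# element for the same frequency can be ~sqrt(eps_r) times shorter.
# Frequency and permittivity below are illustrative assumptions only.
c = 3.0e8        # speed of light in vacuum, m/s
f = 2.45e9       # example frequency (2.45 GHz ISM band)
eps_r = 40.0     # plausible permittivity for a ceramic dielectric

half_wave_vacuum = c / f / 2
half_wave_dielectric = c / (f * eps_r ** 0.5) / 2
print(f"half-wave element in vacuum    : {half_wave_vacuum * 1e3:.1f} mm")
print(f"half-wave element in dielectric: {half_wave_dielectric * 1e3:.1f} mm")
```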
Juxi Leitner

robots.net - Neuron Interface Chips Advancing - 1 views

  • this advancement could ultimately lead to the use of biological neurons in the central or sub-processing units of computers and automated machinery.
Joris _

Hewlett-Packard Unveils Real-World Memristor, Chip of the Future | Popular Science - 3 views

  • they allow the same device to serve as the processor and the memory
  • a memristor system would work far faster, and with far less energy, than a traditional computer.
  • Second, memristors can be much smaller than transistors.
  • ...1 more annotation...
  • Lastly, unlike transistors, which only work linearly, memristors can form three-dimensional networks
  •  
    wow, looks pretty cool! I wonder what is "a ridiculous amount of memory on a chip"...? giga, tera, peta, exa, ... ?
  •  
    Looks cool indeed, but as with all those technology "breakthroughs" my enthusiasm will be limited until this actually makes it to my laptop...
  •  
    what a phrase: "this advance could increase the power and memory of computers to nearly unimaginable proportions within only a couple of years." ... sure ....
Luís F. Simões

Lockheed Martin buys first D-Wave quantum computing system - 1 views

  • D-Wave develops computing systems that leverage the physics of quantum mechanics in order to address problems that are hard for traditional methods to solve in a cost-effective amount of time. Examples of such problems include software verification and validation, financial risk analysis, affinity mapping and sentiment analysis, object recognition in images, medical imaging classification, compressed sensing and bioinformatics.
  •  
    According to the company's Wikipedia page, the computer costs $10 million. Can we then declare that Quantum Computing has officially arrived?! Quotes from elsewhere on the site: "first commercial quantum computing system on the market"; "our current superconducting 128-qubit processor chip is housed inside a cryogenics system within a 10 square meter shielded room". Link to the company's scientific publications. Interestingly, this company seems to have been running a BOINC project, AQUA@home, to "predict the performance of superconducting adiabatic quantum computers on a variety of hard problems arising in fields ranging from materials science to machine learning. AQUA@home uses Internet-connected computers to help design and analyze quantum computing algorithms, using Quantum Monte Carlo techniques". List of papers coming out of it.
Tobias Seidl

DNA May Help Build Next Generation of Chips | Gadget Lab | Wired.com - 0 views

  •  
    Researchers at IBM have made a significant breakthrough in their quest to combine DNA strands with conventional lithographic techniques to create tiny circuit boards.
Thijs Versloot

Real-Time Recognition and Profiling of Home Appliances through a Single Electricity Sensor - 3 views

  •  
    A personal interest of mine that I want to explore a bit more in the future. I just bought a ZigBee electricity monitor and I am wondering whether from the mains signal one could (reliably) detect the oven turning on, lights, etc. It probably requires neural network training. The idea would be to make a simple device which basically saves you money by telling you how much electricity you are wasting (a toy sketch of the detection idea follows after this thread). Then again, it's probably already been done by Google...
  • ...3 more comments...
  •  
    nice project!
  •  
    For those interested, this is what/where I ordered.. http://openenergymonitor.org/emon/
  •  
    Update two: the RF chip is faulty and tonight I have to solder a new chip into place... That's open-source hardware for you!
  •  
    haha, yep, that's it... but we can do better than that right! :)
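A toy version of the appliance-detection idea from this thread, as referred to above (my own sketch, not from the paper): the simplest non-intrusive load monitoring scheme looks for step changes in the aggregate mains power and matches their size against known appliance signatures; anything more robust would need learned models.

```python
# Toy non-intrusive load monitoring: detect on/off events as step changes
# in the aggregate mains power and match them to known appliance signatures.
# Thresholds and signatures below are made up for illustration.
signatures_w = {"kettle": 2000, "oven": 1500, "fridge compressor": 120}

def detect_events(power_w, min_step=80, tolerance=0.2):
    events = []
    for i in range(1, len(power_w)):
        step = power_w[i] - power_w[i - 1]
        if abs(step) < min_step:
            continue  # ignore noise and small fluctuations
        # match the step size against the closest known signature
        name, sig = min(signatures_w.items(),
                        key=lambda kv: abs(abs(step) - kv[1]))
        if abs(abs(step) - sig) <= tolerance * sig:
            events.append((i, name, "on" if step > 0 else "off"))
    return events

mains = [250, 250, 2260, 2270, 2255, 260, 255, 1780, 1760, 270]  # watts
for t, name, state in detect_events(mains):
    print(f"t={t}: {name} switched {state}")
```

Real appliances overlap and drift, which is why something like the neural-network training mentioned above would be needed in practice.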
jcunha

Silicon chip with integrated laser: Light from a nanowire - 2 views

  •  
    A nanolaser, a thousand times thinner than a human hair.