Intel announces the “self-learning” chip designed to be just like your brain


Intel is reviving an old but unproven type of computer processor, with the aim of improving the hardware available for artificial intelligence (AI). Intel calls them neuromorphic chips, and it claims they're far more efficient than the processors we have today, using up to 1,000 times less energy thanks to a number of structural differences that we've explained below.

Neuromorphic chips are designed to model the human brain (cool, huh?), but so far they've produced better results when simulated on standard CPUs than in real-world use. So we can't replicate human brains yet, but Intel is still trying with a new neuromorphic chip built specifically for R&D. Intel has named the chip “Loihi”, and it uses “spiking neurons” in place of traditional logic gates. These spiking neurons differ from the usual AI systems in that they weigh the signals they receive instead of processing binary information. They differ from traditional CPUs too: rather than running regulated calculations against a central clock, they fire only when required.
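To get a feel for the idea, here's a minimal sketch of a "leaky integrate-and-fire" spiking neuron, the textbook model behind chips like Loihi. It's purely illustrative: the threshold, leak factor, and weights below are made-up values, not anything from Intel's design. Note how the neuron accumulates weighted input over time and only fires when a threshold is crossed, instead of doing a fixed calculation on every clock tick.

```python
def lif_neuron(inputs, weights, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron.

    inputs  -- a list of time steps, each a tuple of input signals
    weights -- one weight per input signal (illustrative values)
    Returns a list of 0/1 spikes, one per time step.
    """
    potential = 0.0
    spikes = []
    for frame in inputs:
        # Integrate the weighted inputs; the membrane potential
        # "leaks" a little each step so old activity fades away.
        potential = potential * leak + sum(w * x for w, x in zip(weights, frame))
        if potential >= threshold:
            spikes.append(1)      # fire only when the threshold is crossed
            potential = 0.0       # reset after firing
        else:
            spikes.append(0)      # otherwise stay quiet -- no work done
    return spikes

# Weak inputs produce no spikes; a strong burst triggers one.
print(lif_neuron([(0.1, 0.1), (0.1, 0.1), (0.9, 0.9)], weights=(0.6, 0.6)))
# → [0, 0, 1]
```

The energy argument follows from the last branch: when nothing interesting is arriving, the neuron does essentially nothing, whereas a conventional processor burns power on every clock cycle regardless.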


Source: Forbes

For all these reasons, neuromorphic chips are much more efficient than traditional processors, which makes them attractive for running AI on everyday devices like smartphones. The way they've been designed means they could, in principle, handle all modern AI workloads, including self-driving cars.

Even with all these advantages, neuromorphic chips haven't yet been proven to work outside the lab. They've performed well in academic research so far, but they can't yet be considered a consumer product, so it will be a while before they're officially available to consumers.

Steve Furber, who leads a major project in the neuromorphic field, said: “It's true that neuromorphic systems exist, and you can get one and use one. But all of them have fairly small user bases, in universities or industrial research groups. All require fairly specialized knowledge. And there is currently no compelling demonstration of a high-volume application where neuromorphic outperforms the alternative.”

The IT industry is buzzing with AI innovation, with digital assistants such as Alexa and Siri becoming popular with consumers. Intel's not the only one focusing on the neuromorphic field: plenty of scientists are pursuing neuromorphic research, and Nvidia wants in on the market too. Fingers crossed for a spare brain when you're tired of using your own? It'll be interesting to see how neuromorphic chips take off, especially in the consumer market, and whether they can really replicate the human brain, or whether it's just another attempt that fails.

To keep up with the latest on neuromorphic chips and tech news check out

