Could brain-like computers revolutionize the competitive landscape?

The amount of electricity consumed by computing is growing at an alarming rate.

According to a report from the International Energy Agency (IEA), electricity usage by data centres, AI, and cryptocurrency could double from 2022 levels by 2026.

Nvidia, whose chips power almost all of today's AI systems, is among the companies working actively to create new chips that run on considerably less power. But there is another potential solution: building computers with a fundamentally different architecture that does not consume such a great amount of electricity.

Some firms are pursuing exactly that, inspired by an organ that performs more operations, faster, on a fraction of the power of a conventional computer: the human brain. In neuromorphic computing, electronic devices imitate neurons and synapses and are interconnected in a way that resembles the brain's electrical circuitry.

This is not a new concept; researchers have been working on it since the early 1980s. But the rising energy demands of the AI revolution are increasing the pressure to bring the nascent technology into practical use. To date, these systems have been mostly research tools, but proponents argue they could deliver substantial gains in energy efficiency.

Among those eyeing the commercial opportunity are big names in the hardware industry, such as Intel and IBM, as well as a number of smaller firms. “The opportunity is there waiting for the company that can figure this out,” says Dan Hutcheson, an analyst at TechInsights. “The opportunity is such that this could be the Nvidia killer.”

Last May, SpiNNcloud Systems, a neuromorphic supercomputer maker spun out of the Dresden University of Technology, announced it would sell the machines for the first time and began taking pre-orders.

“We have reached the commercialisation of neuromorphic supercomputers ahead of other companies,” says co-CEO Héctor González.

Prof Tony Kenyon, professor of nanoelectronic and nanophotonic materials at University College London, who works in the field, calls it a significant development.

“Although a killer app for neuromorphic computing has not emerged yet… there will be numerous domains that see improvements of tens to hundreds of times in energy efficiency and performance, and for those applications we can expect large-scale adoption of neuromorphic computing platforms,” he adds.

Neuromorphic computing covers a spectrum of approaches, from loosely borrowing the brain's design principles to attempting a faithful replica of its structure; a full simulation of a human brain remains, for now, a dream.

Still, some basic design characteristics distinguish it from conventional computing.

First, unlike ordinary computers, neuromorphic computers do not separate the processor and the memory into two distinct modules. Instead, both tasks are performed together in the same place, even when a system spans two or more chips.

This approach saves power, Prof Kenyon explains, because there is no need to shuttle data back and forth between memory and processor.

Another typical characteristic is that computation is event-driven.

This differs from ordinary systems, in which every component is constantly powered and connected to the rest of the circuit; a neuromorphic computer's components operate far more conservatively.

Its microscale imitations of neurons and synapses fire up only when they have something to process, just as neurons and synapses in the brain do.

Processing only when there is work to do saves energy, because the system does not have to run at full capacity all the time, as most computing systems must.
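The event-driven behaviour described above can be illustrated with a minimal sketch of a leaky integrate-and-fire neuron, the basic unit that neuromorphic chips typically implement in hardware. This is a toy software model, not any vendor's API; the `threshold`, `leak`, and `weight` values are assumptions chosen purely for illustration.

```python
def simulate_lif(input_spikes, threshold=1.0, leak=0.9, weight=0.4):
    """Return the time steps at which the neuron fires.

    The neuron does meaningful work only when an input spike
    arrives; between events its membrane potential simply decays.
    """
    potential = 0.0
    output_spikes = []
    for t, spike in enumerate(input_spikes):
        potential *= leak              # passive decay each step
        if spike:                      # event-driven: update only on input
            potential += weight
        if potential >= threshold:     # fire and reset, like a real neuron
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

# Dense input early, silence later: the neuron fires while it is
# being stimulated and sits idle once the inputs stop.
print(simulate_lif([1, 1, 1, 1, 0, 0, 0, 0, 1, 0]))  # prints [2]
```

In a conventional processor this loop would still burn power on every idle step; neuromorphic hardware aims to spend energy only on the steps where a spike actually arrives.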

Also, while modern computers are digital, representing data in binary as 1s and 0s, neuromorphic computing can be analogue.

Analogue computing, which works with continuous rather than discrete signals, was once a significant method of computing in its own right, and it can be particularly useful for processing data that comes directly from the outside world.
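The digital-versus-analogue distinction can be made concrete with a toy sketch: a digital system must quantise a continuous sensor reading into one of a fixed number of discrete levels, whereas analogue hardware carries the continuous value directly. The 3-bit resolution and the sample reading below are illustrative assumptions, not properties of any real chip.

```python
def quantize(value, bits=3, full_scale=1.0):
    """Snap a continuous reading to the nearest of 2**bits - 1 steps,
    as a digital converter would."""
    levels = 2 ** bits - 1
    step = full_scale / levels
    return round(value / step) * step

sensor_reading = 0.672            # continuous "analogue" value from the world
digital_value = quantize(sensor_reading)
print(digital_value)              # nearest representable level (5/7 here)
```

The gap between `sensor_reading` and `digital_value` is quantisation error, which an analogue signal path avoids by never discretising the value at all.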

Commercially, neuromorphic computing is expected to find roles in two broad areas.

The first, which SpiNNcloud plans to serve as mentioned above, is as a considerably more efficient, higher-performance platform for AI applications. That includes the image and video analysis, speech recognition, and language processing behind large language models, such as those powering chatbots like ChatGPT.

The second is “edge computing”, in which data is processed in real time on connected devices operating under power constraints, rather than being sent to cloud servers. Self-driving cars, robots, mobile phones, and wearable technology are among the domains where the approach could be employed successfully.

However, technical challenges remain. One of the biggest obstacles to the progress of neuromorphic computing is the software that accompanies the chips: the hardware still has to be programmed to do useful work, and that is no trivial task, since it may require a paradigm entirely different from the one used by conventional computers.

In Mr Hutcheson's view, the potential of such devices is vast; the hard question is how to realise it. He reckons it will take at least ten to twenty years before the real potential of neuromorphic computing is seen.

There are also cost considerations. Commercially, most chipmakers stick with silicon, which the industry knows how to work with; developing a new generation of chips that departs from it is costly, according to Prof Kenyon.

Mike Davies says Intel is making “rapid progress” with its neuromorphic computer.

Intel's current prototype neuromorphic chip is called Loihi 2.

In April, the company announced that it had combined 1,152 of these chips to create Hala Point, a neuromorphic research system containing 1.15 billion artificial neurons and 128 billion artificial synapses. While its neuron capacity is roughly that of an owl's brain, Intel says Hala Point is the largest such system currently in existence.

Today, Hala Point remains a research project within Intel, used by academic researchers. “What it is demonstrating is that, yes, there are some real use cases for AI,” says Mike Davies, director of Intel's Neuromorphic Computing Lab.

Hala Point, which is roughly the size of a microwave oven, is already “commercially relevant”, Mr Davies says, adding: “We are progressing very fast on the software side.”

IBM's newest brain-inspired prototype chip is called NorthPole.

First unveiled a year ago, it builds on the company's earlier prototype in the same line, TrueNorth. Tests show that NorthPole consumes half the energy and occupies half the space of comparable chips, yet outperforms any chip now on the market, says Dharmendra Modha, who leads IBM's brain-inspired computing work. He adds that his team is now working to prove that the chips can be combined into a larger, more complex system.

“The pathway to market will be a story to come,” says Dr Modha. He stresses that NorthPole is one of the organisation's major innovations and notes that, unlike some earlier architectures, it was co-designed with its software, meaning its full potential can be exploited from the outset.

Other, smaller companies in the neuromorphic technology sector include BrainChip, SynSense, and Innatera.

IBM claims its NorthPole chip is more energy-efficient and faster than any other chip.

SpiNNcloud's supercomputers commercialise neuromorphic computing developed at TU Dresden and the University of Manchester as part of the EU's Human Brain Project.

Those efforts produced two research-purpose neuromorphic supercomputers: SpiNNaker1, in operation at the University of Manchester since 2018 with over one billion neurons, and the second-generation SpiNNaker2, currently being brought up at TU Dresden, which can emulate five billion neurons.

According to Mr González, the commercial systems SpiNNcloud is offering can go even higher, with a capacity of at least 10 billion neurons.

Looking further ahead, Prof Kenyon expects to see hybrid platforms that combine conventional, neuromorphic, and quantum computing, the latter being another emerging form of novel computing.
