
Neuromorphic Computing: A Power Revolution


Modern computing's demand for electricity is growing at an alarming rate. According to a recent report from the International Energy Agency (IEA), electricity consumption by data centres, artificial intelligence (AI) and cryptocurrency could roughly double from 2022 levels by 2026.

The combined consumption of those three sectors in 2026 would be roughly equivalent to Japan's annual electricity needs. Companies such as Nvidia, whose graphics processing units (GPUs) power most AI applications, are designing hardware that uses less energy.

Could computers instead be built on a radically different, more energy-efficient architecture? Some companies believe so, and they are drawing on the structure and workings of the brain, an organ that accomplishes far more than a conventional computer while consuming only a fraction of the power.

In neuromorphic computing, electronic components mimic neurons and synapses, and their interconnections are modelled on the brain's electrical network. The idea is not new; researchers have been developing the approach since the 1980s.

However, the energy demands of the AI revolution are increasing the pressure to get the emerging technology into the real world. Although the platforms and systems in use today serve mainly as research tools, proponents claim they could deliver major gains in energy efficiency.

Hardware giants such as IBM and Intel are among those with commercial ambitions, alongside a handful of smaller companies.

TechInsights analyst Dan Hutcheson says, “The opportunity is there waiting for the company that can figure this out. And it has the potential to be a killer app for Nvidia.”

SpiNNcloud Systems, a spin-off from Dresden University of Technology, announced in May 2024 that it was taking pre-orders for its first neuromorphic supercomputers.

Its co-chief executive, Hector Gonzalez, said, “We have reached the commercialisation of neuromorphic supercomputers ahead of other firms.”

Tony Kenyon, a researcher in the field and professor of nanoelectronic and nanophotonic materials at University College London, calls it a remarkable development.

“Even if there isn’t yet a game-changing app, neuromorphic computing will improve performance and energy efficiency in many areas, so as the technology advances, I’m sure we’ll start to see widespread use,” he said.

Neuromorphic computing encompasses a variety of approaches, ranging from designs that are loosely brain-inspired to near-complete replication of the human brain (something we are far from achieving). However, a few fundamental design characteristics set it apart from conventional computing.

First, unlike conventional computers, neuromorphic computers have no separate memory and processing units. Instead, both tasks are carried out together on the same chip.
Professor Tony Kenyon points out that removing the need to shuttle data back and forth between the two speeds up processing and saves energy.
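
To see why that matters, here is a back-of-the-envelope sketch in Python. The per-operation energy figures are assumed, order-of-magnitude values chosen purely for illustration, not measurements of any particular chip, but they capture the point that fetching operands from distant memory typically costs far more energy than the arithmetic itself.

```python
# Illustrative energy budget for one multiply-accumulate (MAC) whose operands
# must be fetched from off-chip memory, versus one whose operands already sit
# next to the compute unit. All picojoule figures are assumed, order-of-magnitude
# values for illustration only.

ENERGY_MAC_PJ = 1.0              # assumed cost of the arithmetic itself
ENERGY_OFFCHIP_FETCH_PJ = 500.0  # assumed cost of reading one operand from external memory
ENERGY_LOCAL_READ_PJ = 2.0       # assumed cost of reading one operand from co-located memory

def mac_energy(fetch_cost_pj, operands=2):
    """Total energy for one MAC: fetch each operand, then do the arithmetic."""
    return operands * fetch_cost_pj + ENERGY_MAC_PJ

conventional = mac_energy(ENERGY_OFFCHIP_FETCH_PJ)
co_located = mac_energy(ENERGY_LOCAL_READ_PJ)
print(f"conventional: {conventional:.0f} pJ, co-located: {co_located:.0f} pJ, "
      f"ratio ~{conventional / co_located:.0f}x")
```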

Event-driven computing is another common approach. Whereas every part of a conventional computer is always on and ready to communicate with every other part, activation in a neuromorphic system can be much sparser. Just as the neurons and synapses in our brains fire only when there is a reason to, their artificial counterparts fire only briefly, when they have something to communicate.
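
To make that concrete, below is a minimal, purely illustrative Python sketch of a leaky integrate-and-fire neuron, a simplified model of the kind that neuromorphic chips implement in hardware. The threshold, leak, weight, and input spike times are all invented for the example.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the neuron only does work
# when an input spike arrives, and only emits a spike when its membrane
# potential crosses a threshold -- the essence of event-driven computing.

THRESHOLD = 1.0   # firing threshold (arbitrary units, chosen for illustration)
LEAK = 0.9        # per-timestep decay of the membrane potential (assumed)
WEIGHT = 0.4      # strength of the incoming synapse (assumed)

def run_lif(input_spikes, steps):
    """Simulate one neuron; return the timesteps at which it fires."""
    potential = 0.0
    output_spikes = []
    for t in range(steps):
        potential *= LEAK                 # passive leak every step
        if t in input_spikes:             # event: integrate only when an input arrives
            potential += WEIGHT
        if potential >= THRESHOLD:        # threshold crossed: emit a spike and reset
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

if __name__ == "__main__":
    # Sparse input: the neuron is effectively idle most of the time.
    print(run_lif(input_spikes={2, 3, 4, 10, 11, 12, 13}, steps=20))
```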

Working only when there is something to process is another way to save power. Furthermore, neuromorphic computing can be analogue, whereas modern computers are digital, representing data with 1s and 0s. The analogue approach, which was historically significant in the field, uses continuous signals and can be useful for analysing data coming in from the outside world. For practical reasons, however, most commercially driven neuromorphic efforts are digital.

There are two main types of intended commercial use:

The first, on which SpiNNcloud focuses, is providing a more powerful and energy-efficient platform for AI applications such as speech recognition, image and video analysis, and the large language models that drive chatbots like ChatGPT.

The second is “edge computing”, in which data is processed in real time on power-constrained connected devices rather than in the cloud. Wearable technologies, mobile phones, robots, and autonomous cars could all benefit.

Charting the difficulties

One of the primary obstacles to the progress of neuromorphic computing has long been the software development required to make the devices function. Having the hardware is not enough: it also has to be programmed, which may mean creating completely new programming languages from scratch, very different from those used for conventional computers.

“The potential for these devices is immense. The difficulty is how to make them function,” says Dan Hutcheson, who believes it will take at least a decade, if not two, before the real advantages of neuromorphic computing become apparent.

Professor Tony Kenyon notes that developing radically new chips is costly, whether they use silicon, as the commercially focused projects do, or other materials.

Meanwhile, Mike Davies, director of Intel's neuromorphic computing lab, says the company is making “rapid progress” with its neuromorphic computing work.

Intel's current prototype neuromorphic chip is called Loihi 2. In April 2024 the company said it had assembled 1,152 of them into Hala Point, a large neuromorphic research system containing over 1.15 billion artificial neurons and 128 billion artificial synapses.

Intel says the system is the largest of its kind in the world to date, with a neuron count roughly comparable to that of an owl's brain, though it remains a research project.

Mike Davies said that Hala Point “is proving that there’s some real feasibility here for applications to leverage AI.”

Despite being about the size of a microwave oven, Hala Point is “commercially relevant,” he says, and software development is progressing quickly.

IBM's most recent brain-inspired prototype chip is called NorthPole.

An evolution of its earlier TrueNorth prototype chip, NorthPole is faster, more space-efficient, and more energy-efficient than any chip currently on the market, according to tests cited by Dharmendra Modha, chief scientist of the company's brain-inspired computing division.

He says his team is now working to demonstrate how the chips can be combined into a larger system.

“The narrative to come will be the path to market,” he adds, highlighting that a key innovation of NorthPole is its co-design with the software, which allows the architecture's full capabilities to be exploited from the start.

Smaller neuromorphic companies include Innatera, BrainChip, and SynSense.

Key examples

SpiNNcloud's supercomputer commercialises neuromorphic computing technology developed by researchers at the University of Manchester and TU Dresden as part of the EU's Human Brain Project.

That work produced two neuromorphic supercomputers for research. The SpiNNaker1 machine, located at the University of Manchester and able to model over a billion neurons, has been operational since 2018.

TU Dresden's second-generation SpiNNaker2 machine, capable of emulating at least five billion neurons, is currently being configured.

According to Hector Gonzalez, the systems SpiNNcloud sells commercially go further still, reaching at least 10 billion neurons.

Professor Tony Kenyon believes the future will see a variety of computing platforms operating side by side: conventional, neuromorphic, and quantum, another revolutionary form of computing that is also on the horizon.

As neuromorphic computing becomes more widely available and affordable, its adoption across sectors raises ethical and privacy concerns, from how personal stakeholder data is gathered, stored, and processed to the accuracy of the resulting analysis and the detection of bias. Safeguards such as strong privacy regulations and encryption, combined with clear guidelines on how neuromorphic computing is used, can help address these issues.

Neuromorphic computing, valued at an estimated $8 billion, has the potential to revolutionise various IT disciplines due to its ability to replicate the brain’s information processing and learning capacities. It offers significant advantages, such as rapid processing, energy efficiency, and superior pattern recognition capabilities.

For instance, autonomous vehicles can benefit from neuromorphic hardware to make quick, energy-efficient decisions, reducing collisions and emissions. Similarly, drones using neuromorphic computing could navigate complex environments autonomously, conserving energy by only activating in response to environmental changes.

This technology also promises advancements in edge AI, enabling real-time data processing and extended battery life for devices, and improving automation and fraud detection through its ability to quickly identify complex patterns and anomalies.

Despite these benefits, neuromorphic computing faces considerable challenges. The lack of standardised benchmarks and architectures makes performance evaluation and application sharing difficult. Hardware development is complex, requiring simulations of the brain’s intricate structures, while most software remains designed for conventional von Neumann architectures, limiting neuromorphic applications.

Accessibility is another issue, as neuromorphic systems are currently confined to specialists in well-funded research facilities, demanding deep interdisciplinary knowledge that is scarce among professionals. Furthermore, converting machine learning algorithms to spiking neural networks often reduces precision and accuracy, complicating their integration into existing systems.
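
As a rough illustration of why that conversion costs precision, the sketch below rate-codes a continuous activation into a finite spike train and then decodes it again. This is a simplified version of one common conversion idea, not any specific toolchain; the window length and example values are assumptions made for this illustration.

```python
# Illustrative only: rate-coding a continuous activation as a spike count.
# With a finite time window, the recovered value is quantised, which is one
# source of the precision loss seen when converting ANNs to spiking networks.

def to_spike_train(activation, window=10):
    """Encode an activation in [0, 1] as a binary spike train of length `window`."""
    spikes = []
    accumulator = 0.0
    for _ in range(window):
        accumulator += activation
        if accumulator >= 1.0:       # emit a spike each time the accumulator overflows
            spikes.append(1)
            accumulator -= 1.0
        else:
            spikes.append(0)
    return spikes

def from_spike_train(spikes):
    """Decode the value back as the observed firing rate."""
    return sum(spikes) / len(spikes)

if __name__ == "__main__":
    for value in (0.37, 0.62, 0.91):
        decoded = from_spike_train(to_spike_train(value))
        print(f"original {value:.2f} -> decoded {decoded:.2f}")
```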

Neuromorphic computing mimics brain processes using artificial neurons and synapses to solve problems and make decisions efficiently. This brain-inspired technology is still in its early stages, with practical uses mostly limited to research by academic institutions, government bodies, and major tech companies. However, its potential is immense, particularly in areas requiring high efficiency and speed like edge computing, autonomous vehicles, and cognitive computing.

Neuromorphic designs, often based on the neocortex, achieve high efficiency through spiking neural networks, which replicate the brain’s method of transmitting information quickly and effectively.
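
Below is a toy Python sketch of a single spiking layer, not any particular chip's design, showing the efficiency argument in miniature: inputs that stay silent contribute no work at all. The weights and threshold are invented for the example.

```python
# A toy spiking layer: only inputs that actually spike contribute any work,
# so sparse activity means little computation -- the efficiency argument
# behind spiking neural networks, in miniature.

THRESHOLD = 1.0  # assumed firing threshold for every output neuron

def spiking_layer(input_spikes, weights):
    """input_spikes: list of 0/1 spikes; weights[j][i]: weight from input i to output j."""
    outputs = []
    for neuron_weights in weights:
        # Sum only over inputs that spiked; silent inputs cost nothing.
        potential = sum(w for w, s in zip(neuron_weights, input_spikes) if s)
        outputs.append(1 if potential >= THRESHOLD else 0)
    return outputs

if __name__ == "__main__":
    weights = [
        [0.6, 0.5, 0.1, 0.0],
        [0.2, 0.1, 0.9, 0.3],
    ]
    print(spiking_layer([1, 1, 0, 0], weights))  # -> [1, 0]
```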

According to one market forecast, the global neuromorphic computing market, valued at USD 4.2 billion in 2022, is projected to reach USD 29.2 billion by 2032, a compound annual growth rate (CAGR) of roughly 22% from 2023 to 2032.
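
As a quick sanity check, the growth rate implied by those two figures can be computed directly (assuming ten years of compounding between 2022 and 2032, with the market values as quoted above):

```python
# Back-of-the-envelope check of the implied compound annual growth rate (CAGR)
# from USD 4.2bn in 2022 to USD 29.2bn in 2032.

start, end, years = 4.2, 29.2, 10
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 21-22%, consistent with the quoted figure
```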

This rapid expansion reflects growing recognition of neuromorphic computing's potential to revolutionise various industries. Major companies leading the market include Intel, IBM, BrainChip, Qualcomm, NVIDIA, Hewlett-Packard, Samsung, Accenture, Cadence Design Systems, and Knowm.

These industry giants are at the forefront of developing and implementing neuromorphic technologies, driving advancements that promise to transform computing paradigms and applications.

The distinction between neuromorphic and traditional computing lies in their architectures. Traditional von Neumann computers use binary processing and sequential operations with separate memory storage and data processing units, often encountering efficiency bottlenecks.

In contrast, neuromorphic computers process multiple pieces of information simultaneously with tightly integrated memory and processors, offering more computational options and accelerating data-intensive tasks. This parallel processing capability positions neuromorphic computing as a promising alternative to overcome the limitations of conventional architectures, potentially driving advancements in AI and computing power.
