Trends in Neuromorphic Computing CIOs Should Know


Neuromorphic computing refers to computing elements that emulate the way the human brain and nervous system function. Proponents believe the approach will take artificial intelligence to new heights while reducing the energy requirements of computing platforms.

“Unlike traditional computing, which incorporates separate memory and processors, neuromorphic systems rely on parallel networks of artificial neurons and synapses, similar to biological neural networks,” observes Nigel Gibbons, director and senior advisor at consulting firm NCC Group, in an online interview.

Potential Applications 

The current neuromorphic computing application landscape is largely research-based, says Doug Saylors, a partner and cybersecurity co-lead with technology research and advisory firm ISG. “It’s being used in multiple areas for pattern and anomaly detection, including cybersecurity, healthcare, edge AI, and defense applications,” he explains via email. 

Potential applications will generally fall into the same areas as artificial intelligence or robotics, says Derek Gobin, a researcher in the AI division of Carnegie Mellon University’s Software Engineering Institute. “The ideal is you could apply neuromorphic intelligence systems anywhere you would need or want a human brain,” he notes in an online interview. 


“Most current research is focused on edge-computing applications in places where traditional AI systems would be difficult to deploy,” Gobin observes. “Many neuromorphic techniques also intrinsically incorporate temporal aspects, similar to how the human brain operates in continuous time, as opposed to the discrete input-output cycles that artificial neural networks utilize.” He believes that this attribute could eventually lead to the development of time-series-focused applications, such as audio processing and computer vision-based control systems.
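
To make the temporal, event-driven behavior Gobin describes concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest spiking models used in neuromorphic research. It is illustrative only: the function name and parameter values are assumptions, not taken from any platform mentioned in this article.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02,
                 v_rest=0.0, v_threshold=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron (illustrative sketch).

    Unlike a conventional artificial neuron, which maps inputs to outputs
    in a single discrete step, this model evolves a membrane potential
    through time and emits a spike (an event) only when that potential
    crosses a threshold.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest while
        # accumulating the input current over each time step dt.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_threshold:
            spike_times.append(step * dt)  # record the spike event
            v = v_reset                    # reset after firing
    return spike_times

# A constant drive current produces a regular train of spike events.
current = np.full(1000, 1.5)  # one second of input at 1 ms resolution
print(simulate_lif(current))
```

The output is a sparse list of spike times rather than a dense activation vector; that event-driven sparsity is what neuromorphic hardware exploits to reduce energy use, since computation happens only when spikes occur.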

Current Development 

As with quantum computing research, there are multiple approaches to both neuromorphic hardware and algorithm development, Saylors says. The best-known platforms, he states, are BrainScaleS and SpiNNaker. Other players include GrAI Matter Labs and BrainChip.

Neuromorphic strategies are a very active area of research, Gobin says. “There are a lot of exciting findings happening every day, and you can see them starting to take shape in various public and commercial projects.” He reports that both Intel and IBM are developing neuromorphic hardware for deploying neural models with extreme efficiency. “There are also quite a few startups and government proposals looking at bringing neuromorphic capabilities to the forefront, particularly for extreme environments, such as space, and places where current machine learning techniques have fallen short of expectations, such as autonomous driving.” 


Next Steps 

Over the short term, neuromorphic computing will likely be focused on adding AI capabilities to specialty edge devices in healthcare and defense applications, Saylors says. “AI-enabled chips for sensory use cases are a leading research area for brain/spinal trauma, remote sensors, and AI-enabled platforms in aerospace and defense,” he notes.

An important next step for neuromorphic computing will be maturing a technology that has already proven successful in academic settings, particularly when it comes to scaling, Gobin says. “As we’re beginning to see a plateau in performance from GPUs, there’s interest in neuromorphic hardware that can better run artificial intelligence models — some companies have already begun developing and prototyping chips for this purpose.” 

Event-based camera technology is another emerging use case, showing promise as a practical and effective medium for satellite and other computer vision applications, Gobin says. “However, we have yet to see any of these technologies get wide-scale deployment,” he observes. “While research is still very active with exciting developments, the next step for the neuromorphic community is really proving that this tech can live up to the hype and be a real competitor to the traditional hardware and generative AI models that are currently dominating the market.”


Looking Ahead 

Given the technology’s cost and complexity, coupled with the lack of skilled resources, it’s likely to take another seven to 10 years before widespread usage of complex neuromorphic computing occurs, Saylors says. “However, recent research in combining neuromorphic computing with GenAI and emerging quantum computing capabilities could accelerate this by a year or two in biomedical and defense applications.” 

Mainstream adoption hinges on hardware maturity, cost reduction, and robust software, Gibbons says. “We may see initial regular usage within the next five to 10 years in specialized low-power applications,” he predicts. “Some of this will be dictated by the maturation of quantum computing.” Gibbons believes that neuromorphic computing’s next phase will focus on scaling integrated chips, refining spiking neural network algorithms, and commercializing low-power systems for applications in robotics, edge AI, and real-time decision-making.

Gibbons notes that neuromorphic computing may soon play an important role in advancing cybersecurity. The technology promises to offer improved anomaly detection and secure authentication, thanks to event-driven intelligence, he explains. Yet novel hardware vulnerabilities, unknown exploit vectors, and data confidentiality remain critical concerns that may hamper widespread adoption. 


