IBM has a new integrated neuromorphic chip and programming architecture that will make it easier to develop massively parallel, brain-like systems

IBM has a new computer architecture, named TrueNorth, which could lead to a new generation of machines that function more like biological brains. This is DARPA SyNAPSE Phase 3.

At its heart is a breakthrough software ecosystem designed for programming silicon chips whose architecture is inspired by the function, low power, and compact volume of the brain. The technology could enable a new generation of intelligent sensor networks that mimic the brain’s abilities for perception, action, and cognition.

IBM’s new programming model breaks the mold of sequential operation underlying today’s von Neumann architectures and computers. It is instead tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures.

“Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm,” said Dr. Dharmendra S. Modha, Principal Investigator and Senior Manager, IBM Research. “We are working to create a FORTRAN for synaptic computing chips. While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.”

IBM demonstrated that it is possible to synthesize a rich diversity of computations and behaviors. By hierarchically composing neurons or blocks of neurons into larger networks, we can begin to construct a large class of cognitive algorithms and applications. Looking to the future, by further composing cognitive algorithms and applications, we plan to build versatile, robust, general-purpose cognitive systems that can interact with multi-modal, sub-symbolic sensors and actuators in real time while being portable and scalable. In an instrumented planet inundated with real-time sensor data, our aspiration is to build cognitive systems that are based on learning instead of programming.

Cognitive Computing Building Block: A Versatile and Efficient Digital Neuron Model for Neurosynaptic Cores

Marching along the DARPA SyNAPSE roadmap, IBM unveils a trilogy of innovations towards the TrueNorth cognitive computing system inspired by the brain’s function and efficiency. Judiciously balancing the dual objectives of functional capability and implementation/operational cost, we develop a simple, digital, reconfigurable, versatile spiking neuron model that supports one-to-one equivalence between hardware and simulation and is implementable using only 1272 ASIC gates.

Starting with the classic leaky integrate-and-fire neuron, we add:
(a) configurable and reproducible stochasticity to the input, the state, and the output;
(b) four leak modes that bias the internal state dynamics;
(c) deterministic and stochastic thresholds; and
(d) six reset modes for rich finite-state behavior.
The model supports a wide variety of computational functions and neural codes. We capture 50+ neuron behaviors in a library for hierarchical composition of complex computations and behaviors. Although designed with cognitive algorithms and applications in mind, serendipitously, the neuron model can qualitatively replicate the 20 biologically-relevant behaviors of a dynamical neuron model.
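The exact neuron equations are specified in IBM's paper; the sketch below is only an illustrative Python stand-in for an augmented leaky integrate-and-fire neuron with a configurable leak, a seeded (hence reproducible) stochastic threshold, and two of the reset modes described above. All parameter names and values are our own, not IBM's.

```python
import random

class LIFNeuron:
    """Toy leaky integrate-and-fire neuron (illustrative, not IBM's spec).

    Adds a constant leak, an optional stochastic threshold, and two reset
    behaviors: reset-to-zero and subtract-threshold.
    """

    def __init__(self, threshold=100, leak=-1, threshold_jitter=0,
                 reset_mode="zero", seed=42):
        self.v = 0                      # membrane potential (integer state)
        self.threshold = threshold
        self.leak = leak                # applied every tick; biases dynamics
        self.threshold_jitter = threshold_jitter
        self.reset_mode = reset_mode
        self.rng = random.Random(seed)  # seeded => reproducible stochasticity

    def tick(self, synaptic_input):
        """Integrate one timestep of input; return True if the neuron spikes."""
        self.v += synaptic_input + self.leak
        # Draw an effective threshold; jitter=0 keeps it deterministic.
        jitter = (self.rng.randint(0, self.threshold_jitter)
                  if self.threshold_jitter else 0)
        eff_threshold = self.threshold + jitter
        if self.v >= eff_threshold:
            if self.reset_mode == "zero":
                self.v = 0               # reset-to-zero mode
            else:
                self.v -= eff_threshold  # subtract-threshold mode
            return True
        return False

# A constant input of 10 with leak -1 nets +9 per tick, so with a
# threshold of 100 the neuron fires every 12 ticks.
neuron = LIFNeuron(threshold=100, leak=-1)
spikes = [t for t in range(50) if neuron.tick(10)]
```

Swapping the leak, jitter, or reset mode changes the firing pattern, which is the sense in which one small parameterized model can cover a library of distinct behaviors.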

Cognitive Computing Programming Paradigm: A Corelet Language for Composing Networks of Neurosynaptic Cores

The sequential programming paradigm of the von Neumann architecture is wholly unsuited for TrueNorth. Therefore, as our main contribution, we develop a new programming paradigm that permits construction of complex cognitive algorithms and applications while being efficient for TrueNorth and effective for programmer productivity. The programming paradigm consists of (a) an abstraction for a TrueNorth program, named Corelet, for representing a network of neurosynaptic cores that encapsulates all details except external inputs and outputs; (b) an object-oriented Corelet Language for creating, composing, and decomposing corelets; (c) a Corelet Library that acts as an ever-growing repository of reusable corelets from which programmers compose new corelets; and (d) an end-to-end Corelet Laboratory that is a programming environment which integrates with the TrueNorth architectural simulator, Compass, to support all aspects of the programming cycle from design, through development, debugging, and up to deployment. The new paradigm seamlessly scales from a handful of synapses and neurons to networks of neurosynaptic cores of progressively increasing size and complexity. The utility of the new programming paradigm is underscored by the fact that we have designed and implemented more than 100 algorithms as corelets for TrueNorth in a very short time span.
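The encapsulation-and-composition idea behind corelets can be sketched in a few lines of Python. This is not the Corelet Language itself, just a minimal stand-in showing the key property: a corelet hides its internal network and exposes only input and output pins, and composing two corelets yields a new corelet with the same property. All class and pin names here are hypothetical.

```python
class Corelet:
    """Illustrative stand-in for a corelet: a network blueprint that hides
    its internal cores and exposes only named input and output pins."""

    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = list(inputs)    # externally visible input pins
        self.outputs = list(outputs)  # externally visible output pins
        self._cores = []              # internal wiring, hidden from users

    @staticmethod
    def compose(name, first, second, wiring):
        """Build a larger corelet by connecting outputs of `first` to inputs
        of `second` per `wiring` {out_pin: in_pin}. Unwired inputs of
        `second` join the composite's inputs; `second`'s outputs become
        the composite's outputs."""
        leftover = [p for p in second.inputs if p not in wiring.values()]
        composite = Corelet(name, first.inputs + leftover, second.outputs)
        composite._cores = [first, second]  # internals stay encapsulated
        return composite

# Hypothetical example: wire an edge detector into a classifier.
edge = Corelet("edge_detector", inputs=["pixels"], outputs=["edges"])
classify = Corelet("classifier", inputs=["edges", "bias"], outputs=["label"])
pipeline = Corelet.compose("vision", edge, classify, {"edges": "edges"})
# The composite exposes only ["pixels", "bias"] in and ["label"] out.
```

A programmer using `pipeline` sees only its external pins, never the cores inside `edge` or `classify`, which is what lets composition scale from a handful of neurons to large networks.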

Cognitive Computing Systems: Algorithms and Applications for Networks of Neurosynaptic Cores

Visually stimulating: TrueNorth can be used to simulate the processing of a retina. This image shows the firing of virtual neurons in such a system.

IBM has developed a set of abstractions, algorithms, and applications that are natively efficient for TrueNorth.

1) IBM developed reusable abstractions that span neural codes (such as binary, rate, population, and time-to-spike), long-range connectivity, and short-range connectivity.
2) IBM implemented ten algorithms that include convolution networks, spectral content estimators, liquid state machines, restricted Boltzmann machines, hidden Markov models, looming detection, temporal pattern matching, and various classifiers.
3) IBM demonstrated seven applications that include speaker recognition, music composer recognition, digit recognition, sequence prediction, collision avoidance, optical flow, and eye detection.
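To make two of the neural codes named above concrete, here is a hedged Python sketch of rate coding (a value becomes a spike count within a time window) and time-to-spike coding (a value becomes the latency of a single spike, with larger values spiking earlier). The window length and scaling are arbitrary choices for illustration, not TrueNorth's.

```python
def rate_encode(value, window=10, max_value=10):
    """Rate code: represent `value` as the number of spikes in a fixed
    window. Returns a binary spike train of length `window`."""
    n_spikes = round(window * value / max_value)
    return [1 if t < n_spikes else 0 for t in range(window)]

def time_to_spike_encode(value, window=10, max_value=10):
    """Time-to-spike code: larger values spike earlier. Returns a spike
    train containing a single spike whose latency encodes `value`."""
    latency = window - 1 - round((window - 1) * value / max_value)
    return [1 if t == latency else 0 for t in range(window)]

train_rate = rate_encode(7)           # 7 spikes spread over 10 slots
train_ttfs = time_to_spike_encode(7)  # one spike, fairly early
```

The trade-off the two codes illustrate: rate coding is robust but needs a whole window per value, while time-to-spike coding delivers a value in a single spike at the cost of precise timing.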

The results showcase the parallelism, versatility, rich connectivity, spatio-temporality, and multi-modality of the TrueNorth architecture, as well as the compositionality of the corelet programming paradigm and the flexibility of the underlying neuron model.

To advance and enable this new ecosystem, IBM researchers developed the following breakthroughs that support all aspects of the programming cycle from design through development, debugging, and deployment:

– Simulator: A multi-threaded, massively parallel and highly scalable functional software simulator of a cognitive computing architecture comprising a network of neurosynaptic cores.

– Neuron Model: A simple, digital, highly parameterized spiking neuron model that forms a fundamental information processing unit of brain-like computation and supports a wide range of deterministic and stochastic neural computations, codes, and behaviors. A network of such neurons can sense, remember, and act upon a variety of spatio-temporal, multi-modal environmental stimuli.

– Programming Model: A high-level description of a “program” that is based on composable, reusable building blocks called “corelets.” Each corelet represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function. Inner workings of a corelet are hidden so that only its external inputs and outputs are exposed to other programmers, who can concentrate on what the corelet does rather than how it does it. Corelets can be combined to produce new corelets that are larger, more complex, or have added functionality.

– Library: A cognitive system store containing designs and implementations of consistent, parameterized, large-scale algorithms and applications that link massively parallel, multi-modal, spatio-temporal sensors and actuators together in real-time. In less than a year, the IBM researchers have designed and stored over 150 corelets in the program library.

– Laboratory: A novel teaching curriculum that spans the architecture, neuron specification, chip simulator, programming language, application library and prototype design models. It also includes an end-to-end software environment that can be used to create corelets, access the library, experiment with a variety of programs on the simulator, connect the simulator inputs/outputs to sensors/actuators, build systems, and visualize/debug the results.

Smarter Sensors

IBM’s long-term goal is to build a chip system with ten billion neurons and a hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.

Modha’s team has developed software that runs on a conventional supercomputer but simulates the functioning of a massive network of neurosynaptic cores—with 100 trillion virtual synapses and 2 billion neurosynaptic cores.

Systems built from these chips could bring the real-time capture and analysis of various types of data closer to the point of collection. They would not only gather symbolic data, which is fixed text or digital information, but also gather sub-symbolic data, which is sensory based and whose values change continuously. This raw data reflects activity in the world of every kind ranging from commerce, social, logistics, location, movement, and environmental conditions.

Take the human eyes, for example. They sift through over a terabyte of data per day. Low-power, lightweight eyeglasses that emulate the visual cortex, designed to help the visually impaired, could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data.

These sensors would gather and interpret large-scale volumes of data to signal how many individuals are ahead of the user, distance to an upcoming curb, number of vehicles in a given intersection, height of a ceiling or length of a crosswalk. Like a guide dog, sub-symbolic data perceived by the glasses would allow them to plot the safest pathway through a room or outdoor setting and help the user navigate the environment via embedded speakers or ear buds. This same technology — at increasing levels of scale — can form sensory-based data input capabilities and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras, and robots.
