IBM Corelet Language Allows Developers to Build Apps that Mimic Human Brain

All the major computing architectures in use today trace their lineage back to the pioneering work of mathematician John von Neumann. Developed from the 1930s through the 1950s, von Neumann's theories provided the foundation on which the ENIAC, the first general-purpose computer, was based. Virtually every system developed since has relied on the same basic principles to deliver ever-faster computational performance.

While the von Neumann model provides the foundation for everything we think we know about computing today, IBM is making the case that it is only half the story. Over the last several years, IBM has been working on a new computing model that more closely mimics the cognitive functions of the human brain.

This past week IBM unveiled a Corelet language for building a new class of cognitive applications. Based on 1,272 ASICs that can fit inside a shoebox-sized system, the programming model presents developers with the equivalent of neurosynaptic cores, built on an architecture that creates one-to-one relationships between memory and communication processors.

According to Dr. Dharmendra Modha, principal investigator for SyNAPSE and a senior manager with IBM Research, the new Corelet language allows developers to create cognitive applications at a higher level of abstraction, using a TrueNorth substrate developed by IBM that invokes the neurosynaptic SyNAPSE cores. In that way, developers can create cognitive applications that much more closely resemble how the human brain processes sensory input.

Modha says the goal is not so much to replace systems based on the von Neumann architecture, which were designed primarily to crunch numbers, as to complement them with a “right brain” approach that rounds out the analytics functions normally associated with “left brain” activities.

Uses of these new cognitive applications would include everything from buoys in the ocean that perceive changes in atmospheric conditions in real time to new types of glasses that help the visually impaired identify objects. What all these applications would have in common, says Modha, is that the data could be processed at the exact point where it is collected, rather than being hauled across networks to be centrally processed by systems based on an architecture originally designed to crunch numbers.

The implications of systems based on the TrueNorth architecture therefore go well beyond the development of new applications; the architecture creates a foundation for scaling those applications beyond anything possible today because, by definition, TrueNorth is an event-driven, highly parallel distributed architecture.

The programming environment for building these applications consists of a multi-threaded software simulator that allows developers to build applications without having the actual hardware present; a model for connecting SyNAPSE processors to create cognitive applications; the Corelet programming language and a library of pre-programmed functions known as corelets; and a laboratory curriculum that teaches developers how to build cognitive applications.

Modha says learning how to build cognitive applications with Corelet isn’t any more difficult than learning any existing programming language; it just takes time to learn to think differently about building those applications, treating SyNAPSE processors as if they were a series of Lego blocks.
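The Lego-block idea described above is essentially hierarchical composition: each corelet hides its internal neurosynaptic cores and exposes only named inputs and outputs, and wiring two corelets together yields another corelet. The following Python sketch illustrates only that compositional concept; it is not actual Corelet syntax, and the class and pipeline names are hypothetical.

```python
class Corelet:
    """Illustrative stand-in for a corelet: a black box of
    neurosynaptic cores exposing only named inputs and outputs."""

    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = list(inputs)
        self.outputs = list(outputs)

    def compose(self, other, name):
        # Wire this corelet's outputs to the other's inputs.
        # The result is itself a corelet, so composition nests
        # arbitrarily, like snapping Lego blocks together.
        if self.outputs != other.inputs:
            raise ValueError("interfaces must match to compose")
        return Corelet(name, self.inputs, other.outputs)


# Hypothetical example: a toy vision pipeline built from two blocks.
edge = Corelet("edge_detect", inputs=["pixels"], outputs=["edges"])
shape = Corelet("shape_classify", inputs=["edges"], outputs=["labels"])
vision = edge.compose(shape, name="vision_pipeline")
print(vision.inputs, vision.outputs)  # ['pixels'] ['labels']
```

The point of the sketch is that a developer reasons only about interfaces between blocks, never about the individual cores inside them.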

Leveraging a total of $53 million in funding that came primarily from the Defense Advanced Research Projects Agency (DARPA), IBM has been working with Cornell University and iniLabs Ltd. to build a chip system with ten billion neurons and a hundred trillion synapses that consumes only one kilowatt of power and occupies less than two liters of volume.

As compelling as these “right brain” applications are likely to be, things will get really interesting when they are integrated with other forms of cognitive computing based on traditional von Neumann architectures, such as the IBM Watson platform. Only then will the two hemispheres that make up the typical brain have been joined to create something that, for better or worse, is likely to be truly greater than the sum of its parts.

Michael Vizard