A group of researchers at Sandia National Laboratories has developed a tool that can cross-train standard convolutional neural networks (CNNs) into a spiking neural model that can be used on neuromorphic processors. The researchers claim the conversion will enable deep learning applications to take advantage of the much better energy efficiency of neuromorphic hardware, which is designed to mimic the way biological neurons work.
The tool, known as Whetstone, works by adjusting the behavior of artificial neurons during the training phase so that each one activates only when it reaches an appropriate threshold. As a result, neuron activation becomes a binary choice – either the neuron spikes or it doesn’t. In doing so, Whetstone converts an artificial neural network into a spiking neural network. The tool accomplishes this through an incremental “sharpening process” (hence the name Whetstone) applied through each network layer until the activations become discrete.
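To see the idea, consider a bounded activation that is gradually steepened toward a hard threshold as training proceeds. The sketch below is a minimal NumPy illustration of that sharpening concept, not the actual Whetstone code; the function name, the 0.5 threshold, and the linear annealing schedule are all illustrative assumptions.

```python
import numpy as np

def sharpened_brelu(x, sharpness):
    """Bounded ReLU that interpolates toward a step function.

    sharpness = 0.0 gives a standard bounded ReLU on [0, 1];
    sharpness = 1.0 gives a hard threshold (spike / no spike).
    The 0.5 threshold is an illustrative assumption.
    """
    if sharpness >= 1.0:
        # Fully "sharpened": activation is a binary spike decision.
        return (x >= 0.5).astype(x.dtype)
    width = 1.0 - sharpness  # linear region shrinks as sharpness grows
    return np.clip((x - 0.5) / width + 0.5, 0.0, 1.0)

# Hypothetical annealing loop: sharpness is ramped from 0 to 1 over
# training, so the network adapts to increasingly discrete activations.
x = np.linspace(-0.5, 1.5, 5)
for step, sharpness in enumerate(np.linspace(0.0, 1.0, num=5)):
    print(f"step {step}, sharpness {sharpness:.2f}:",
          sharpened_brelu(x, sharpness))
```

At sharpness 0.0 the network trains like an ordinary network with a bounded ReLU; by the time sharpness reaches 1.0, every activation is a 0-or-1 spike, which is what makes the trained network mappable onto spiking neuromorphic hardware.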
According to Whetstone researcher Brad Aimone, this discrete activation greatly reduces communication costs between layers, and thus energy consumption, with only a minimal loss of accuracy. “We continue to be impressed that without dramatically changing what the networks look like, we can get very close to a standard neural net [in accuracy],” he says. “We’re usually within a percent or so on performance.”