Electronics that mimic the treelike branches neurons use to communicate with one another could lead to artificial intelligence that no longer requires the megawatts of power available in the cloud. AI could then run on the watts that can be drawn from a smartphone battery, a new study suggests.
As the brain-imitating AI systems known as neural networks grow in size and power, they are becoming more expensive and energy-hungry. For instance, to train its state-of-the-art neural network GPT-3, OpenAI spent US $4.6 million to run 9,200 GPUs for two weeks. Generating the energy that GPT-3 consumed during training released as much carbon as 1,300 cars would have spewed from their tailpipes over the same time, says study author Kwabena Boahen, a neuromorphic engineer at Stanford University, in California.
Now Boahen proposes a way for AI systems to boost the amount of information conveyed in each signal they transmit. This could reduce both the energy and space they currently demand, he says.