Sep 20, 2024

Brains Could Help Solve a Fundamental Problem in Computer Engineering

Posted by in categories: biotech/medical, finance, mobile phones, quantum physics, robotics/AI

In recent years, the limitations of conventional computer chips have become far more pressing. Deep neural networks have radically expanded the limits of artificial intelligence, but they have also created a monstrous demand for computational resources, and those resources present an enormous financial and environmental burden. Training GPT-3, a text predictor so accurate that it easily tricks people into thinking its words were written by a human, cost $4.6 million and emitted a sobering volume of carbon dioxide—as much as 1,300 cars, according to Boahen.

With the free time afforded by the pandemic, Boahen, who is a faculty affiliate at the Wu Tsai Neurosciences Institute at Stanford and the Stanford Institute for Human-Centered AI (HAI), applied himself single-mindedly to this problem. “Every 10 years, I realize some blind spot that I have or some dogma that I’ve accepted,” he says. “I call it ‘raising my consciousness.’”

This time around, raising his consciousness meant looking toward dendrites, the spindly protrusions that neurons use to detect signals, for a completely new way of thinking about computer chips. And, as he writes in Nature, he thinks he has figured out how to make chips so efficient that the enormous GPT-3 language prediction neural network could one day run on a cell phone. Just as Feynman envisioned quantum computers outperforming traditional computers, an advantage now known as “quantum supremacy,” Boahen wants to work toward a “neural supremacy.”
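To give a rough sense of the gap Boahen is pointing at, here is a back-of-envelope sketch (not from the article) using GPT-3's publicly reported parameter count; the 16-bit weight precision and the phone's memory size are illustrative assumptions, not figures from the source.

```python
# Rough estimate of how far a present-day phone is from holding GPT-3's weights.
# Assumptions (not from the article): ~175 billion parameters, 16-bit weights,
# and 8 GB of RAM for a typical flagship phone.

PARAMS = 175e9           # GPT-3 parameter count (publicly reported)
BYTES_PER_PARAM = 2      # 16-bit weights, a common inference precision

weight_bytes = PARAMS * BYTES_PER_PARAM
print(f"Weights alone: {weight_bytes / 1e9:.0f} GB")   # ~350 GB

PHONE_RAM_GB = 8         # assumed phone memory, for illustration only
print(f"Roughly {weight_bytes / (PHONE_RAM_GB * 1e9):.0f}x a phone's RAM")
```

Even before counting the energy spent moving those weights around, the model's parameters alone would exceed a phone's memory by more than an order of magnitude, which is why Boahen argues that a fundamentally more efficient chip design, rather than incremental tuning, is required.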
