Our brains have approximately 86 billion neurons.

Combined, they make up one of the most advanced organic neural networks known to exist.


How an AI brain with only one neuron could surpass humans

Take GPT-3 as an example: it has 175 billion parameters, 100 times more than its predecessor GPT-2.

Typically, a web of connections requires more than one node.

Per the team's research paper:

We have designed a method for complete folding-in-time of a multilayer feed-forward DNN.

This Fit-DNN approach requires only a single neuron with feedback-modulated delay loops.

Via a temporal sequentialization of the nonlinear operations, an arbitrarily deep or wide DNN can be realized.
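The quoted idea can be illustrated with a rough sketch. The snippet below is not the team's delay-dynamical implementation; it only shows the mathematical equivalence being claimed: one nonlinear node, applied sequentially in time, can reproduce a multilayer feed-forward pass, with the delay-loop modulations playing the role of the layer weights. The function name and weight layout are illustrative assumptions.

```python
import numpy as np

def folded_forward(x, layer_weights, activation=np.tanh):
    """Sketch of 'folding in time': a multilayer feed-forward pass
    computed by one nonlinear node applied step by step.

    layer_weights: list of 2-D arrays, one per layer; in Fit-DNN
    these couplings would be realized as feedback-modulated delays
    rather than a stored weight matrix.
    """
    state = np.asarray(x, dtype=float)
    for W in layer_weights:          # each layer = one pass through the loop
        pre = W @ state              # delayed, modulated signals summed
        # the single node processes each 'virtual neuron' one at a time
        state = np.array([activation(p) for p in pre])
    return state

# Example: a 2-input network folded into sequential updates
out = folded_forward([1.0, -1.0], [np.eye(2), np.ones((1, 2))])
```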

The result, typically, is that more neurons produce more parameters, and more parameters produce finer results.

Models are scaling up rapidly, and "rapidly" is putting it mildly.

What does this mean for AI?

According to the researchers, this could counter the rising energy costs of training powerful networks.

In initial testing, the researchers used the new system to perform computer vision functions.
