Learning from Our Brains, Accelerating Moore's Law
Updated: Nov 16, 2020
With artificial intelligence applications and algorithms growing at an exponential rate, we're about to hit a real, physical limit on how fast we can run our models. Nearly five decades ago, "Moore's Law," more an observation than a physical law, predicted that the capacity of electronic devices would double every two years. That prediction has held ever since: our chips keep getting faster and faster.
More precisely, Intel cofounder Gordon Moore explained that "The number of transistors incorporated in a chip will approximately double every 24 months." Transistors, the tiny on-off switches that run our machines, have steadily gotten smaller and consumed less electricity. Mother nature, however, has different plans. Today we're seeing transistors as small as 10 nanometers, roughly 10,000 times thinner than a human hair, and going smaller than this causes some funny quantum business.
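The doubling Moore described is simple exponential arithmetic. A short sketch makes the scale concrete; the starting count and time span below are illustrative round numbers, not historical data for any real chip.

```python
def transistors(start_count, years, doubling_period_years=2.0):
    """Project a transistor count forward, doubling every 24 months."""
    return start_count * 2 ** (years / doubling_period_years)

# A hypothetical chip with 1 million transistors, two decades later:
print(transistors(1_000_000, 20))  # ten doublings -> 1024x -> 1,024,000,000.0
```

Ten doublings multiply the count by 2^10 = 1024, which is why a few decades of this trend produce billions of transistors per chip.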
Our Quantum Problems
Electrons flowing through such thin transistors simply tunnel their way out, exploiting a quantum property by which a wavefunction can propagate through a potential barrier. And because electrons behave as both particles and waves, it becomes impossible to confine them in ever-smaller transistors.
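The standard rough estimate for this effect is the WKB-style transmission through a rectangular barrier, T ~ exp(-2 * kappa * d), where kappa depends on the barrier height and d is its width. The barrier height and widths below are illustrative values, not measurements of any real transistor, but they show why thinner barriers leak exponentially more.

```python
import math

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7015e-31   # electron mass, kg
EV = 1.602_176_634e-19     # one electronvolt in joules

def tunneling_probability(barrier_ev, width_nm):
    """Approximate transmission through a barrier of given height (eV) and width (nm)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Halving the barrier width raises the tunneling probability by orders of magnitude:
print(tunneling_probability(1.0, 2.0))  # thicker barrier: tiny leakage
print(tunneling_probability(1.0, 1.0))  # thinner barrier: far more leakage
```

The exponential in the width is the whole story: each nanometer shaved off a gate oxide or channel multiplies the leakage, which is why shrinking below roughly 10 nm runs into this wall so abruptly.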
Neuromorphic Computing: Mimicking the Brain
The key distinction between digital and analog computing is that a digital computer has only two states, on or off, while an analog computer may occupy many possible states. A good analogy is the difference between a fan that can only be switched on and off and a fan with varying levels of speed.
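The fan analogy can be sketched as two toy devices, one that snaps every input to on or off and one that keeps the in-between values. The class and method names are illustrative only.

```python
class DigitalFan:
    """Only two states: off (0) or on (1)."""
    def set_speed(self, level):
        return 1 if level >= 0.5 else 0  # everything snaps to on or off

class AnalogFan:
    """Any speed in the continuous range [0, 1]."""
    def set_speed(self, level):
        return min(max(level, 0.0), 1.0)  # clamp, but keep intermediate values

digital, analog = DigitalFan(), AnalogFan()
print(digital.set_speed(0.3))  # -> 0: the nuance is lost
print(analog.set_speed(0.3))   # -> 0.3: the nuance is kept
```

The digital device must round every input to one of its two states, while the analog one can represent the whole continuum, which is the property neuromorphic hardware tries to exploit.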
Neuromorphic computing is a relatively new branch which focuses on replicating the working of our brain in our hardware. At Penn State, a team of engineers is attempting to pioneer this type of computing that mimics the efficiency of the brain’s neural networks by employing our brain’s analog nature.
“We have powerful computers, no doubt about that, the problem is you have to store the memory in one place and do the computing somewhere else,” Das said. This describes the double-edged sword that today's computers run on: storing memory in one place and computing in another was a breakthrough technology in its time, but it now restrains processing power.
“We are creating artificial neural networks, which seek to emulate the energy and area efficiencies of the brain,” explained Thomas Schranghamer, a doctoral student in the Das group and first author of a paper recently published in Nature Communications. “The brain is so compact it can fit on top of your shoulders, whereas a modern supercomputer takes up a space the size of two or three tennis courts.”
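The basic unit of such a network can be sketched as a single artificial neuron whose synaptic weights take continuous, analog-style values rather than just 0 or 1. This is a minimal generic sketch, not the Das group's actual implementation; the inputs and weights are made up.

```python
import math

def neuron(inputs, weights, bias=0.0):
    """Weighted sum of inputs passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # smooth output anywhere in (0, 1)

# Continuous weights, like analog synapse strengths:
out = neuron([0.2, 0.9, 0.5], [0.7, -0.3, 0.4])
print(out)
```

Because both the weights and the output live on a continuum, hardware that stores a weight as an analog quantity can compute this multiply-accumulate in place, rather than shuttling bits between separate memory and processor.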
Though quantum computing for intensive computations is all the hype currently, there is a pressing need for systems that can revolutionize the industry at a more scalable, PC-like level. Neuromorphic computing may just be the answer everyone is looking for.