Modern life runs on computers. We shop on Amazon, store our collective knowledge on Wikipedia, and share our social lives on Facebook. With advances in machine learning, computers can be trained to do everything from recognizing faces to deciphering human language to predicting disease risk from an individual’s genome. If Google has its way, computers will soon drive our cars.
Underlying this transformation is the seemingly unstoppable growth in computational power that we call Moore’s Law: the observation that the number of transistors on a chip doubles roughly every two years. That exponential progress has made applications that would have been unthinkable 10 years ago almost mundane today. But the endless upward trajectory is getting harder and harder to sustain. The International Technology Roadmap for Semiconductors, organized by the world’s leading semiconductor manufacturers, estimates that Moore’s Law can hold up for at most 10 to 20 more years. Eventually, the physical and economic challenges of packing transistors ever more densely onto a chip will become insurmountable. In anticipation of that day, researchers are exploring radically new ways of performing computation that might enable the next technological revolution.
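To get a feel for what exponential growth means here, a back-of-the-envelope calculation (assuming the common two-year doubling period, which is a simplification) shows how quickly the gains compound:

```python
# Back-of-the-envelope Moore's Law arithmetic. The two-year doubling
# period is the commonly quoted figure, not an exact physical law.

def moores_law_factor(years, doubling_period=2.0):
    """Density improvement factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

print(round(moores_law_factor(10)))   # 32: chips ~32x denser in a decade
print(round(moores_law_factor(20)))   # 1024: ~1000x in two decades
```

A thousandfold improvement in twenty years is why yesterday's supercomputer fits in today's pocket, and why losing that trajectory would be so consequential.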
One approach takes its inspiration from the brain. The few pounds of flesh that we each carry in our skulls can perform a dazzling array of tasks on a minuscule power budget. Applications traditionally out of computers’ reach might become feasible if computer architecture can learn from nature. In August, researchers at IBM unveiled a neuromorphic computer called TrueNorth, which mimics the organization of neurons in the brain. A traditional computer is composed of separate modules that store information (the memory) and process it (the CPU). In IBM’s neuromorphic design, by contrast, storage and computation happen together at each of a million artificial “neurons,” connected by 256 million synapses, which communicate with one another to accomplish a goal. IBM showed that TrueNorth could recognize people, bikes, and vehicles in real-time video using tens of thousands of times less power than a conventional system would need to run the same program. There is still a long way to go: TrueNorth cannot learn the way people do. Instead, it must be programmed in advance, a process that falls fundamentally short of humans’ adaptive approach to the world. Still, neuromorphic computers offer the promise of making the fuzzy tasks that humans excel at more accessible to machines.
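The architectural idea can be illustrated with a toy sketch (this is not IBM’s actual chip or API, just the principle): each neuron keeps its own state and synapse weights locally, and computation proceeds entirely by neurons passing spikes to one another rather than by reading from a shared memory.

```python
# Toy spiking-neuron sketch of the neuromorphic idea: memory and
# computation live together in each neuron, and neurons interact only
# by sending spikes. Not IBM's design, just an illustration.

class SpikingNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0       # local memory: membrane potential
        self.threshold = threshold
        self.leak = leak
        self.synapses = []         # local memory: (target, weight) pairs

    def connect(self, target, weight):
        self.synapses.append((target, weight))

    def receive(self, charge):
        self.potential += charge

    def step(self):
        """Fire if over threshold, otherwise leak. Returns True on a spike."""
        if self.potential >= self.threshold:
            self.potential = 0.0
            for target, weight in self.synapses:
                target.receive(weight)   # communication, not shared memory
            return True
        self.potential *= self.leak
        return False

# A two-neuron chain: steadily driving n1 eventually makes n2 fire too.
n1, n2 = SpikingNeuron(), SpikingNeuron()
n1.connect(n2, weight=0.6)

history = []
for t in range(10):
    n1.receive(0.5)                      # constant input current to n1
    history.append((n1.step(), n2.step()))
```

Note that nothing here resembles a fetch-execute loop over a central memory; the “program” is the pattern of connections and weights, which is exactly why such hardware can be so power-efficient for pattern-recognition workloads.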
Another strand of research reaches beyond traditional physics. Quantum computers exploit the weirdness of particles at microscopic scales to break the limits of classical machines. Since Peter Shor’s 1994 factoring algorithm, it has been known that quantum computers could solve certain problems far faster than today’s machines. For example, a quantum computer could quickly find the prime factors of large numbers, which would break encryption schemes widely used to secure interactions like online financial transactions. However, there are enormous engineering challenges to building practical quantum computers. One of the biggest is that the delicate quantum state is easily disturbed by contact with the outside world, requiring sophisticated methods to isolate the computer and correct whatever errors are introduced. The beguiling promise of previously unattainable computational power has attracted many efforts to solve these and other problems. Google recently hired one of the world’s top experts in quantum computing to join its Quantum A.I. Lab. Perhaps the key to the next computer revolution will soon emerge from Silicon Valley.
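Why does factoring matter for encryption? A toy example (with deliberately tiny, made-up numbers) shows the classical bottleneck: recovering the two secret primes behind an RSA-style public modulus by trial division takes on the order of the square root of the modulus in divisions, which becomes hopeless for the hundreds-of-digits primes used in practice. Shor’s quantum algorithm would do the same job in polynomial time.

```python
# Toy illustration of why factoring underpins RSA-style security.
# The primes here are tiny for demonstration; real keys use primes
# hundreds of digits long, far beyond classical trial division.

def trial_factor(n):
    """Classical factoring by trial division: ~sqrt(n) divisions."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime

p, q = 101, 113          # the secret primes
N = p * q                # the public modulus: 11413
print(trial_factor(N))   # prints (101, 113): trivial at this size
```

At this size the secret falls out instantly, but doubling the number of digits in N roughly squares the work for trial division, and even the best known classical algorithms remain super-polynomial. A large quantum computer running Shor’s algorithm would erase that gap, which is exactly what makes the engineering race so consequential.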
In the face of fundamental limits on what today’s chips can do, the future of computing may lie in technologies like these, which rethink the very architecture computers are built on. Tomorrow’s interconnected world could run on an artificial brain, or on the carefully orchestrated interactions of microscopic particles. Current research is reimagining what a computer is in order to make today’s extraordinary the new ordinary.