Why quantum and neuromorphic processing systems are paving the way for the next generation of artificial intelligence, potentially reshaping business for decades to come.
As enterprises embed data technologies into business processes, the potential cost savings from reductions in inefficiencies, downtime, and human error could be measured in the trillions.
However, analyzing these vast amounts of data remains a tall order for today’s computer processors. Consider that the digital universe in 2020 is approximately one yottabyte in size, or one trillion terabytes—and growing each day. But imagine a computer that could analyze 200 million pages in less than 3 seconds, synthesizing enormous amounts of data—and then drawing conclusions from that data. Taken further, imagine a computer that has sight, or could actually understand smell.
“The future of computing will not—and cannot—be based on ever-increasing processing power,” says Shawn Kim, Head of the Asia Technology team for Morgan Stanley Research. “Instead it will rely on understanding and drawing inferences from massive collections of data." In other words, computers that can use data to learn, adapt, evolve and “think,” much like the human brain.
Two new computing paradigms aim to make that possible within a decade. Quantum computing, which just a few years ago sounded equal parts hype and promise, is moving into the cloud, capable of solving problems that would take the world's fastest supercomputers years to tackle. Another paradigm, neuromorphic computing, is also emerging as a powerful complement to classical computer design—known as von Neumann architecture—and promises to help machines learn and think.
“Breakthroughs in physics and the biological sciences are the new tools driving artificial intelligence, the Internet of Things, robotics, and autonomy," says Kim. “It is critically important that investors and companies begin to understand these developments because they could shape business models for decades to come."
Dr. John Kelly of IBM estimates that a whopping 90% of the world's data is dark, meaning humans and computers can’t yet use it in a meaningful way. Meanwhile, the Internet of Things will only add more to that cache, as countless new everyday devices go online.
As data accumulates, the incentive grows to mine it for insights that may fuel a broad range of AI applications, from voice assistants to robotics to driverless cars. Doing so effectively will take a new class of computing known as cognitive computing.
“Cognitive computing makes systems smart enough to think and respond without preprogrammed sets of instructions," says Kim. “They are different from programmable systems in their ability to reason, form hypotheses and learn from the data they process."
Kim sees two classes of cognitive systems emerging. The first performs operations without human intervention, such as autonomous vehicles, personal assistants and drones. The second augments human capabilities by collaborating with us to solve tough problems, like medical diagnoses.
Hopes for cognitive systems hinge largely on the emergence of neuromorphic computing, which mimics characteristics of the human brain to drive massive improvements in efficiency, processing and reasoning, relative to traditional computer architectures.
Emulating the brain is no small task: the brain rivals the processing power of even the largest supercomputer yet consumes just 15 watts. (For comparison, modern supercomputers can weigh 150 tons and gorge on 10 million watts.) Nevertheless, researchers have made significant progress creating neuromorphic chips that store and retrieve large amounts of information simultaneously—a key feature of biological intelligence and a major departure from the sequential operations of von Neumann architecture.
Neuromorphic chips could power a range of AI applications because of their capacity to sense, learn, infer, and make real-time decisions without explicit instructions in code or millions of prior examples to learn from.
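The event-driven style behind these chips can be illustrated with a leaky integrate-and-fire neuron, the basic unit many neuromorphic designs implement in silicon. The sketch below is illustrative only—it is not drawn from any particular chip's API, and the function name and parameter values are assumptions chosen for clarity:

```python
def simulate_lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: the membrane potential leaks each step,
    integrates incoming current, and the neuron emits a spike (an event)
    when the potential crosses a threshold."""
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = leak * potential + current  # leak, then integrate
        if potential >= threshold:
            spike_times.append(t)   # the neuron fires
            potential = 0.0         # and resets
    return spike_times

# Under a steady drive the neuron fires periodically; information lives
# in the timing of spikes rather than in values fetched over a memory bus.
print(simulate_lif_neuron([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

Because computation happens only when spikes arrive, a chip built this way can sit nearly idle between events—one reason such designs target the brain's extreme power efficiency.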
“To date, a popular approach to achieving AI had been through deep-learning algorithms that permit software to train itself to perform speech and image recognition," says Kim. “This is where neuromorphic systems can disrupt existing solutions."
Market-intelligence consultancy Grand View Research sees the neuromorphic computing market more than doubling, from $2.9 billion in 2020 to $6.5 billion by 2024. Morgan Stanley Research expects image processing to account for the largest share of applications by revenue and sees neuromorphic chips as key to the success of AI endeavors such as autonomous driving.
Quantum computing has arrived—just don't expect to find a quantum computer near you anytime soon. These machines live in the cloud, where an increasing number of full-scale systems are up and running. Top quantum providers are broaching the 50-qubit barrier, and a budding ecosystem of niche entrants is vying for a slice of the next decade's $10 billion pie.
Experts tell Morgan Stanley that large-scale commercial rollout may only be years away, as businesses build use cases and algorithms around them. To that end, quantum-computing market leaders are focused on educating potential customers about potential applications and building an ecosystem of developers for quantum computers.
“Developer use of quantum computing, in theory, could allow developers to quickly solve problems involving business models, testing the properties of chemical compounds, breaking encryption, or detecting fraud," says Kim, noting that getting there isn't exactly easy, since coding for quantum computers requires learning an entirely new math and logic framework.
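The unfamiliar framework Kim describes starts with amplitudes rather than bits. As a rough illustration—a hand-rolled sketch, not any vendor's quantum SDK—a single qubit can be simulated as a pair of amplitudes, with gates acting as 2x2 unitary matrices:

```python
from math import sqrt

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-amplitude qubit state."""
    (a, b), (c, d) = gate
    alpha, beta = state
    return (a * alpha + b * beta, c * alpha + d * beta)

# Hadamard gate: rotates |0> into an equal superposition of |0> and |1>.
HADAMARD = ((1 / sqrt(2), 1 / sqrt(2)),
            (1 / sqrt(2), -1 / sqrt(2)))

state = (1.0, 0.0)                   # |0>: behaves like a classical bit
state = apply_gate(HADAMARD, state)  # now a superposition
probabilities = [abs(amp) ** 2 for amp in state]  # Born rule
print(probabilities)  # roughly [0.5, 0.5]: equal odds of measuring 0 or 1
```

Real machines entangle many qubits at once, so the state space grows exponentially and classical intuition breaks down—hence the steep learning curve for quantum programming.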
To be sure, quantum computing is no panacea. Qubits, the basic units of quantum computing, remain unstable and error-prone, so systems must learn to self-correct. And quantum computers operate at temperatures colder than deep space, so commercial businesses won't have the option of deploying them on their own premises.
Morgan Stanley sees three categories of investments: companies building universal quantum computers, companies with task-specific quantum abilities, and companies offering simulation platforms for quantum computers.
Other, more-distant frontiers of computing also show promise in the lab. Optical computing, for example, uses discrete photons to transmit data through glass cables to achieve orders-of-magnitude performance gains over electron- and copper-based systems.
As these new computing paradigms make their way into commercial use, international competition to own the next era of computing is heating up. At stake are critical national security advantages—quantum computers can crack encryption—and national prestige.
The U.S. is leading in most areas, but China is catching up. "China filed almost twice as many patents as the U.S. in 2017 for quantum technologies like communications and cryptology devices," says Kim. "However, the U.S. leads the world in patents related to the most prized segment of the field—quantum computing."
For more Morgan Stanley Research on the future of computing, ask your Morgan Stanley representative or Financial Advisor for the full report, “The New Processing Paradigms" (Jul 19, 2020).