IBM's New Quantum Computing Move Marks A New Era For Cloud Computing

Two years ago, I called the cloud the most disruptive technology ever, because it made the world’s most advanced technologies available to just about anyone with an internet connection. Previously, only large enterprises that could afford to maintain expensive IT staffs had access to cutting edge capabilities.

Still, while the cloud has proved to be highly disruptive, it offered few capabilities that didn’t exist before. Sure, it made those capabilities cheaper, more efficient and more accessible, but the truth is that, outside of big data applications like Hadoop and Spark, it didn’t allow us to do much that we couldn’t do before.

IBM's announcement today that it will make quantum computing available on its IBM Cloud platform will help change that. For the first time, anyone who wants to will be able to benefit from a technology that virtually no one had access to before. That, in itself, is big news. But it also opens the door to something much bigger—a truly new era for cloud computing.

The New Quantum Era Of Computing

Take a look at a quantum computer and it’s hard to know what you’re looking at. It’s more of a room than a box, with racks of equipment that send microwave pulses and a cryogen-free refrigerator that cools the mechanism within a few thousandths of a degree of absolute zero. Requiring trained physicists to run one, it’s not something you’d see at a typical corporate office.

Yet to really understand how a quantum computer works, you first need to understand how a regular one functions. Switches, called transistors (formerly vacuum tubes), generate ones and zeros. These switches, in turn, are arranged in Boolean logic gates that represent statements such as AND, OR and NOT. That, essentially, is the grammar through which traditional computers understand the world.
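
That Boolean "grammar" is easy to see in a few lines of code. The sketch below (purely illustrative, not how any real chip is laid out) composes AND, OR and NOT gates into a half adder, the basic building block of binary arithmetic:

```python
# Illustrative sketch: the AND, OR and NOT gates described above,
# composed into a half adder that sums two one-bit numbers.

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

def XOR(a, b):
    # Exclusive-or built only from AND, OR and NOT:
    # true when a or b is set, but not both.
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two one-bit numbers; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit 0, carry bit 1.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Everything a conventional processor does, from spreadsheets to video playback, ultimately reduces to enormous numbers of such gate operations.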

Quantum computers operate on very different principles: superposition and entanglement. Superposition means that instead of being limited to two states, one or zero, a quantum bit can exist as one, zero, or both at once. Entanglement, or what Einstein called “spooky action at a distance,” links those states in a manner that even today scientists don’t fully understand.
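
A toy simulation can at least show what that correlation looks like. The sketch below (ordinary classical code, not real quantum hardware) mimics repeatedly measuring an entangled Bell pair, whose only possible outcomes are both-zero or both-one, so the two bits always agree:

```python
# Illustrative sketch of entangled measurement outcomes.
# The Bell state (|00> + |11>)/sqrt(2) has equal amplitude on |00>
# and |11> and zero amplitude on |01> and |10>, so measuring it
# yields 00 or 11 with probability 1/2 each -- the bits always match.

import random

def measure_bell_pair():
    """Simulate one measurement of the Bell pair; return (bit_a, bit_b)."""
    outcome = random.choice(["00", "11"])
    return outcome[0], outcome[1]

results = [measure_bell_pair() for _ in range(1000)]
print(all(a == b for a, b in results))  # the two bits never disagree
```

What this classical mimicry cannot capture is the physics: in a real entangled pair the correlation holds no matter how far apart the particles are, which is precisely what struck Einstein as "spooky."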

It is this linking through entanglement that fundamentally changes the mathematics of computation. A register of 300 regular bits can hold only one of its possible values at a time, but 300 quantum bits, or qubits, can exist in a superposition of 2³⁰⁰ states or, as Dario Gil, Vice President of Science and Solutions at IBM, put it to me, “more states than the number of atoms in the known universe.”
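
The counting is easy to make concrete. In the standard state-vector picture, an n-qubit register is described by 2ⁿ amplitudes, one per basis state, and the exponent is what makes 300 qubits so staggering. A minimal sketch:

```python
# Illustrative sketch of the state counting above: an n-qubit register
# is described by 2**n amplitudes, while n classical bits hold just
# one of those values at a time.

import math

def uniform_superposition(n):
    """Amplitudes for an equal superposition over all 2**n basis states."""
    dim = 2 ** n
    amplitude = 1 / math.sqrt(dim)
    return [amplitude] * dim

psi = uniform_superposition(10)
print(len(psi))  # 10 qubits already need 1024 amplitudes

# Measurement probabilities (squared amplitudes) must sum to 1.
total = sum(a * a for a in psi)
print(abs(total - 1.0) < 1e-9)

# For 300 qubits the vector would need 2**300 entries -- a number
# with over 90 digits, more than the atoms in the known universe.
print(2 ** 300)
```

This is also why classical machines cannot simply simulate large quantum computers: the memory needed for the state vector doubles with every added qubit.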

The End Of Moore’s Law And Bottleneck Problem

Clearly, quantum computing offers a vast improvement over the capabilities of today’s computers. Until now, though, that hasn’t been a major priority. For the past 50 years, computing power has been doubling roughly every 18 months, a phenomenon widely known as Moore’s Law, so our machines were advancing as fast as our ability to dream up new applications for them.

Now, however, Moore’s Law is slowing and will likely end around the year 2020. The problem is twofold. First, there are only so many transistors you can cram onto a silicon chip. Second, there is a bottleneck between the central processing chip and other functions, such as memory. So today’s ultrafast chips often spend much of their time waiting for instructions.
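
A bit of back-of-the-envelope arithmetic shows why that pace was so forgiving. Doubling every 18 months over 50 years works out to about 33 doublings, roughly a ten-billion-fold increase:

```python
# Back-of-the-envelope sketch of the Moore's Law pace described above:
# computing power doubling every 18 months (1.5 years) for 50 years.

years = 50
doubling_period = 1.5            # 18 months, in years
doublings = years / doubling_period

growth = 2 ** doublings
print(round(doublings))          # about 33 doublings
print(f"{growth:.1e}")           # roughly a ten-billion-fold increase
```

With that kind of free improvement arriving on schedule, there was little pressure to rethink the underlying computing paradigm; it is only as the doublings run out that alternatives become urgent.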

Scientists, at IBM and elsewhere, have been scrambling to come up with new schemes. One, called silicon photonics, aims to speed up communication between chips. Another strategy, called 3D stacking, seeks to reduce the distance between chips. Still another, called neuromorphic computing, eliminates the bottleneck with chips designed like a human brain.

None of these are yet commercially available, but quantum computing and neuromorphic computing create additional challenges because they represent completely new computing paradigms. The truth is that no one really knows how to develop applications for them, because no one has really used them in a practical setting. That’s what IBM seeks to change.

Accelerating New Frontiers

IBM has been working towards quantum computing for a very long time. It was, in fact, at an IBM lab that quantum teleportation was first proposed in 1993. It has also been working on neuromorphic chips since it first won funding from DARPA in 2008. Yet with its century-long history, the company also knows all too well how difficult it is to commercialize a new technology.

In recent years, it has made important breakthroughs in quantum computing, such as the ability to arrange qubits in a lattice and to detect errors, both critical for creating a practical quantum computer that can be used for commercial purposes. Still, as Peter Drucker put it, the only true purpose of a business is to create a customer.

That’s why the company recently also announced a Research Frontiers Institute to accelerate innovation by allowing members of the institute, such as Samsung, JSR and Honda, to work with advanced technologies such as quantum computing. This will be absolutely essential to developing commercial applications in the years to come.

Today, we can only suspect what quantum computers can do. Theoretically, the technology should excel at massive simulations, like those used in pharmaceutical research and agent-based models. It is also ideally suited to applications with a heavy computational component, such as cryptography, but until people start actually using it, we can’t really know.

The only true way to test a technology is to put it to work. That is the process that today's announcement put into motion.

A New Era For Cloud Computing

Nobody can predict the future with any degree of accuracy, but there are some things we can say with a high degree of certainty. We know, for instance, that Moore’s Law will end around 2020 and that new paradigms will have to take over. We can also be sure that computing will be vastly different ten years from now.

Today, we work within a universal computing paradigm. My laptop computer runs basic office applications, such as a word processor and a spreadsheet, allows me to surf the internet, watch videos, run analysis and shop for a Mother’s Day gift for my wife. Scale my computer up and it becomes a server, scale it down and it’s a mobile phone.

Yet in the future, we will most likely choose different architectures for different tasks. A marketer may want to access a quantum computer to run an agent-based simulation to test a market of 100 million consumers, then switch to a neuromorphic system to analyze the results and write up the report in a more conventional architecture. The final plan may be sent with unbreakable quantum encryption to avoid prying eyes.

It is unlikely that all these things will fit on a desk, or even in a single office. It is also unlikely that typical users will be aware of—or even care about—the constant paradigm switching. Yet it is likely to become essential to their work. So much so, in fact, that anybody who cannot access those capabilities will be at a competitive disadvantage.

That’s why IBM’s announcement today ushers in a new era for cloud computing. Before today, we used the cloud to access greater capability at lower cost. Now, for the first time, anybody can access capabilities that virtually no one—even large firms and sovereign nations—could before. That really is something truly new and different.
