What are qubits in quantum computing – and why should you care?

The power of the qubit is something it really takes algebra to explain. For those who want the simple version... read on.
14 April 2023

Quantum computing – coming soon, revolutionizing everything.

Quantum computing is an emerging technology that’s likely to revolutionize what we can do, and even what we can imagine doing, with computers. And the keys to the power of quantum computing are qubits.

But what exactly are qubits, and how do they power the quantum computing revolution?

In classical computers, the units of encoding are bits – the binary 0s and 1s with which we’re familiar. A collection of 8 bits can represent any single number between 0 and 255.
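The 8-bit idea can be sketched in a few lines of Python – a toy illustration of binary encoding, not tied to any particular hardware:

```python
# Eight bits, most significant first: each is 0 or 1.
bits = [1, 1, 0, 0, 1, 0, 1, 0]

# Reading them as a binary number gives exactly one value in the range 0-255.
value = 0
for b in bits:
    value = value * 2 + b

print(value)  # 202
```

The key point is that the eight bits, at any instant, spell out one and only one of those 256 possible numbers.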

That in itself is pretty impressive, and it’s been the way we’ve measured computer encoding throughout the modern era. Generation after generation, we’ve grown up understanding that this was how things were done, even as speeds increased, processing power doubled and redoubled, and chips grew faster and more effective.

The cloud changed the way we thought about computing, particularly in storage space and speed, but even there, there has been no fundamental change in the way that data is encoded.

Enter quantum scientists.

Strap in – the next bit is necessarily complicated, because it brings quantum physics to the party of data encoding.

Quantum scientists study the world of the infinitesimally small particles of matter, and the forces that operate on them. The thing to understand about that is that in the world of very small objects, forces often work in very different ways than they do in the macro-universe of comparatively “large” objects – the things we can see, feel, and touch in what we (bless our naivety) think of as the “real world.”

In the world of quantum physics, things get unexpectedly freaky. Objects at that scale behave in strange ways, and two of those ways are key to understanding qubits in quantum computing.

Way 1: quantum superposition.

Quantum superposition is the kind of thing that makes no sense in the macro-universe. It occurs when a quantum element – say, the spin of an electron or the polarization of a photon – can be in two quantum states simultaneously.

In quantum physics, for instance, an electron’s spin can be both “up” and “down” at the same time. In the macro-universe, of course, that would normally be absurd – a fact pointed out by quantum physicist Erwin Schrödinger when he invented the thought experiment known as Schrödinger’s Cat. Schrödinger’s Cat puts forward the idea that you put a cat in a sealed box with a flask of poison, a source of radioactivity, and a Geiger counter. If a single atom in the radioactive source decays, the Geiger counter triggers, the flask of poison shatters, and the cat dies. If there’s no decay, no flask shatters, and the cat lives to claw your face off when you finally release it.

Until you open the box, the cat is theoretically both alive and dead simultaneously.

So far, so fun, so reportable to the ASPCA. But what does any of that have to do with qubits in quantum computing – right?

Qubits – short for quantum bits – are the quantum counterparts of classical bits. Unless you want a lecture on orthogonal basis states, let’s just say that qubits in quantum computing act like electrons in quantum physics, and can hold a blend of multiple values at the same time.
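If you do want a tiny taste of the algebra, a qubit’s state can be sketched as two amplitudes – one for 0, one for 1 – whose squared magnitudes are measurement probabilities. Here is a minimal pure-Python sketch (a simulation only; all the variable names are ours, and real qubits live in hardware, not lists):

```python
import math
import random

# A qubit state is a pair of amplitudes (amp0, amp1)
# satisfying |amp0|^2 + |amp1|^2 == 1.
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)   # an equal superposition of 0 and 1

p0 = abs(amp0) ** 2       # probability of measuring 0
p1 = abs(amp1) ** 2       # probability of measuring 1
assert abs(p0 + p1 - 1) < 1e-9

# Measuring collapses the superposition: you read out 0 or 1, never both.
outcome = 0 if random.random() < p0 else 1
print(outcome)
```

Until that final measurement, the qubit genuinely carries both possibilities at once – that is the “multiple values at the same time” the paragraph above is describing.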

Take a moment with that; we’re about to hit you with the second way in which qubits harness the principles of quantum physics.

Way 2: quantum entanglement.

Quantum entanglement is a phenomenon in quantum physics where groups of particles are generated or interact in such a way that the state of each particle can only be described with reference to the others – measure one, and you instantly learn something about its partners.

Add the two phenomena together in a qubit (which is ultimately a storage medium representing a two-level quantum state – seriously, don’t get us started on orthogonal basis states, you’ll never sleep again), and what you have is a unit of storage that is faster than a quantum bullet.

For instance, remember that with the 0s and 1s of traditional binary-based computers, 8 bits could get you any one number between 0 and 255?

With 8 qubits, you can hold a superposition of every number between 0 and 255 at the same time.

That doesn’t mean you can simply read all 256 answers out at once – measuring the qubits collapses the superposition to a single value. The power comes from quantum algorithms that manipulate all those possibilities in one pass, steering the probabilities so that the value you measure at the end is the answer you wanted.
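The 8-qubit picture can be sketched as a list of 256 amplitudes – once more a toy pure-Python simulation with names of our own choosing, not real hardware:

```python
import math

NUM_QUBITS = 8
N = 2 ** NUM_QUBITS  # 256 basis states: the numbers 0..255

# An equal superposition assigns the same amplitude to every number at once.
amps = [1 / math.sqrt(N)] * N

# Every value 0-255 is "present" simultaneously...
probabilities = [abs(a) ** 2 for a in amps]
assert abs(sum(probabilities) - 1) < 1e-9

# ...but a single measurement returns just ONE of them,
# each with probability 1/256.
print(len(amps), probabilities[0])  # 256 0.00390625
```

Note the trade-off this makes concrete: describing 8 qubits takes 256 numbers (and 300 qubits would take more numbers than there are atoms in the observable universe), yet one measurement still yields only one 8-bit answer.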

Scale that effect up to the hundreds or thousands of qubits future quantum machines are expected to have, and for the right kinds of problems what you have is an insanely fast, insanely powerful machine, the like of which we’ve never seen before.

The unlocked technologies.

That’s going to be important, because just as we’re about to enter the age of quantum computing, powered by qubits, we have other transformational technologies coming to fruition that happen to need insanely fast, insanely powerful machines to make the most of them.

Everybody’s heard of AI (Artificial Intelligence), and machine learning is the integral technology that powers the algorithms on which it depends. Those technologies are already making staggering differences to the world – medical breakthroughs, the digital transformation of the business world, enhanced imaging for everything from self-driving cars to long-range telescopes, and much, much more.

They’re managing that with standard, bit-based computing technology. Imagine those technologies on a never-ending shot of ultra-espresso, and you’re not even halfway to understanding how transformational the power of quantum computing will be to AI and machine learning capabilities.

There’s a potential dark side to the power of quantum computing – it will be able to crack much of the cryptography on which our cybersecurity is built in the blink of an electronic eye. But there are already efforts in play to establish standards of post-quantum cryptography that will render it safe to use, and free its users to maximize the potential of the qubits that will drive quantum computing forever forward – until the next… quantum leap dares to overtake it.