ELI5: Quantum computing explained in 350 words
When we get down to the very small scale – the world of subatomic particles – we find some wacky physics at work. You might think it’s impossible to be in two places at once. Down in the quantum realm, however, a piece of matter can in fact occupy two states simultaneously.
And this bizarre physics makes quantum computers – and vast processing powers – possible. Read on for quantum computing explained as simply and concisely as possible.
Zeroes and ones
Ordinary computers use something called “bits” to process information and perform calculations. These bits – like everything else here in the everyday realm – can’t be in two places and two different states at once.
Instead, like on-and-off switches, classical computer bits can take the value of either 0 or 1. So, if you have, say, a pair of bits, those bits can store only one of four possible combinations at any given time. (Either 00, 01, 10, or 11.)
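To make the "one of four combinations" point concrete, here is a minimal sketch (in Python, chosen just for illustration) that lists every value a pair of classical bits could hold – and picks exactly one, because that's all classical bits can do:

```python
from itertools import product

# Every combination a pair of classical bits could take.
combos = ["".join(bits) for bits in product("01", repeat=2)]
print(combos)  # ['00', '01', '10', '11']

# But at any given moment, a real pair of bits holds just ONE of these.
current_state = combos[0]
print(current_state)  # '00'
```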
From a practical perspective, this means that complex calculations – ones that require all possible configurations to be considered – are going to take your ordinary computer a while to process.
Quantum computers don’t work within the same confines. Rather than bits, they use something called “qubits”. (Quantum bits.) And for these qubits, it’s possible to exist in the state of both 0 and 1 at the same time.
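One common way to picture a single qubit (a sketch under simplifying assumptions, not how real hardware is programmed) is as a pair of amplitudes, one for 0 and one for 1. A so-called Hadamard gate turns a definite 0 into an equal mix of both:

```python
import math

# A qubit's state: a pair of amplitudes for |0> and |1>.
# The chance of measuring each value is the amplitude squared.
zero = (1.0, 0.0)  # definitely 0, just like a classical bit

def hadamard(state):
    """Apply a Hadamard gate: puts a definite state into equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

superposed = hadamard(zero)
probs = [round(a * a, 2) for a in superposed]
print(probs)  # [0.5, 0.5] -- equal chance of measuring 0 or 1
```

Until it is measured, the qubit really is in both states at once; measuring it forces one of the two outcomes.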
Remember how a pair of bits can only store one of four combinations at once? A pair of qubits can store all four at once. And each qubit you add doubles the number of combinations the system can hold.
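That doubling adds up fast. A quick back-of-the-envelope calculation (illustrative only) shows how many combinations n qubits can hold in superposition at once:

```python
# n qubits can hold 2**n combinations in superposition simultaneously.
for n in [1, 2, 10, 54]:
    print(n, "qubits ->", 2 ** n, "combinations")
# 54 qubits already exceeds 18 quadrillion combinations.
```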
The result is that quantum computers can perform calculations that consider many configurations at the same time. For a sense of the speed, take Google’s 54-qubit Sycamore processor. In 200 seconds, it performed a calculation that Google estimated would have taken the world’s most powerful supercomputer 10,000 years.
Quantum computing explained
This is a short, simple overview of an enormously complex topic. But, for quantum computing explained in a nutshell: ordinary bits can only be 0 or 1, while qubits can be both at once. That lets quantum computers weigh many configurations simultaneously – and solve certain problems far faster than any ordinary computer could.
More tech ELI5s
- Transfer learning
- Artificial neural networks
- Generative adversarial networks