Building the Future We Deserve – A Cyber Success Story
Consider a conventional computer. It uses a small (64-bit) processor architecture and excels at solving linear problems. Many past and present problems are linear, and 64-bit architectures have been sufficient to solve them (a 64-bit register can hold any of 2^64, over 18 quintillion [or 1.8×10^19], different values). However, far more complex problems, such as those that arise in chemistry and physics, cannot be tackled with a linear approach because of the massive numbers and variables that must be considered to reach a solution. Conventional computing and linear problem-solving approaches are quickly overwhelmed by this complexity.
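The register arithmetic above is easy to check directly; a quick sketch in Python:

```python
# Number of distinct values a 64-bit register can hold.
values = 2 ** 64
print(values)            # 18446744073709551616
print(f"{values:.1e}")   # roughly 1.8e+19, i.e., over 18 quintillion
```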
Enter a quantum processor, whose bits are atoms or subatomic particles. Because of the nature of quantum mechanics, those bits can represent 0, 1, or any superposition of the two, with no single definite value until they are measured. If you connect those bits with entanglement into a circuit, for example a 73-quantum-bit (qubit) circuit, the state space is now 2 to the 73rd power (2^73). That works out to roughly a zettabyte of data, on the scale of what the world stores in a year. Imagine a computer that can process that much information in a single operation.
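The state-space arithmetic can be checked the same way. A minimal sketch, treating each of the 2^73 basis states as one classical bit for the back-of-the-envelope comparison:

```python
# Size of a 73-qubit state space, as back-of-the-envelope arithmetic.
basis_states = 2 ** 73               # distinct basis states of 73 qubits
bits = basis_states                  # one classical bit per basis state
zettabytes = bits / 8 / 1e21         # bits -> bytes -> zettabytes
print(f"{basis_states:.2e} states, about {zettabytes:.2f} zettabytes")
# about 9.44e+21 states, about 1.18 zettabytes
```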
This computational capability is amazing for operations such as molecular science, neural networks, and weather simulation. As another point of reference, you have roughly 86 billion neurons in your brain, linked by on the order of 100 trillion synapses. Think about interrogating the whole state of a complex neural network like your brain in a single operation. This may become possible in the future using quantum computers. It is fascinating, and it will open the door to huge breakthroughs in technology, science, and our understanding of nature.
This fantastic computational power is a double-edged sword, however. The problem is that much of our current public-key encryption (think the entire internet) rests on a single hard problem: factoring a large number into its prime factors. Quantum computers running Shor's algorithm are well suited to exactly that task, rendering much of our current cryptographic capability useless. Also, the cryptography on nearly all electronic devices, whether a watch, phone, computer, or satellite, is based on the same factoring problem or its close relatives. So far, factoring a large semiprime on a conventional computer remains extremely difficult. But quantum computers pose a threat because they could do it quickly.
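A toy sketch can make the factoring point concrete. The semiprime below is an assumption chosen for illustration; a real RSA modulus is 2048 bits or more, and the brute-force search shown here becomes computationally infeasible at that size, which is exactly what the security rests on:

```python
# Toy illustration: factoring a tiny 'public modulus' by trial division.
# At real RSA key sizes (2048+ bits) this search is hopeless on classical
# hardware, while Shor's algorithm on a large quantum computer would not be.
def factor_semiprime(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, found by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n has no nontrivial factors")

n = 101 * 4003        # toy modulus built from two known small primes
p, q = factor_semiprime(n)
print(p, q)           # 101 4003
```

Doubling the size of a number roughly squares the work for naive trial division, which is why key lengths are chosen to put classical factoring far out of reach.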