The Hutch Report has a fascinating 44-page PDF on Quantum Computing.
If perfected, quantum computers would render existing methods of encryption useless. Your bank account password and the passwords to your cryptocurrencies would be easily hackable.
The ability to break the RSA encryption system will render almost all current channels of communication insecure.
This is a national security threat.
The benefits are also huge: quantum computers will be superior at hurricane detection and airplane design, and at searching DNA for markers to help find cures for diseases such as autism, Alzheimer’s, Huntington’s, and Parkinson’s.
Classical Computers
Classical computers store information as strings of 0s and 1s. A single binary digit is a “bit,” and a group of eight bits is a “byte.” A bit is always either a one or a zero.
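As a quick illustration of bits and bytes (the value 202 here is an arbitrary example), Python can show an ordinary integer as its eight-bit pattern and convert it back:

```python
# A bit is a single binary digit; a byte groups eight bits.
value = 202                      # an ordinary integer
bits = format(value, "08b")      # its eight-bit (one-byte) representation
print(bits)                      # -> 11001010
print(int(bits, 2))              # back to 202
```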
Excerpts from the Hutch report now follow. I have condensed its 44 pages into a hopefully understandable synopsis of the promise and problems of quantum computing.
Quantum Background
Quantum computing does not use bits; it uses qubits, which can be one, zero, or both zero and one at the same time. This state, or capability of being both at once, is called superposition. Where it gets even more complex is that qubits also exhibit a property called entanglement. Entanglement is an extraordinary behaviour in quantum physics in which particles, like qubits, share the same state simultaneously even when separated by large distances.
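Superposition and entanglement can be sketched numerically. This is a minimal illustration using plain Python lists of amplitudes, not a real quantum simulator; the names (`plus`, `bell`) are my own labels. A qubit's state is a list of two amplitudes, one for each outcome, and squared amplitudes give measurement probabilities:

```python
import math

# A qubit's state: two amplitudes, one for outcome 0 and one for outcome 1.
zero = [1.0, 0.0]                        # definitely 0
one = [0.0, 1.0]                         # definitely 1
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # equal superposition of 0 and 1

# Squared amplitudes are measurement probabilities (~0.5 each here).
print(plus[0] ** 2, plus[1] ** 2)

# An entangled two-qubit "Bell" state: amplitudes for 00, 01, 10, 11.
# Only 00 and 11 are possible, so measuring one qubit instantly
# fixes what the other qubit will show.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
print(sum(a * a for a in bell))          # total probability, ~1.0
```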
By comparison, a classical computer using bits of zero and one can represent 2^n states, where n is the number of bits, but can only store one state at a time. In the case of two bits, this is 2*2, which is four states: 00, 01, 10, 11.
A normal computer would require four operations to examine each state. Two qubits could store all four states at one time. When the number of states is low, there is not a major processing difference. As the number of possible state combinations increases, the difference in processing time between a quantum computer using qubits and a classical computer using classical bits increases exponentially. The following chart depicts this well, showing that 20 qubits can simultaneously represent over 1 million permutations of classical bits.
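The arithmetic behind that comparison is easy to check directly. The snippet below enumerates the four two-bit states and shows how 2^n grows, reaching over one million at 20 bits/qubits:

```python
from itertools import product

# 2 bits -> the four states: 00, 01, 10, 11
states = ["".join(s) for s in product("01", repeat=2)]
print(states)      # -> ['00', '01', '10', '11']

# n bits can represent 2**n distinct states, but a classical register
# holds only one of them at a time; n qubits can hold all 2**n at once.
for n in (2, 10, 20):
    print(n, "bits ->", 2 ** n, "states")

print(2 ** 20)     # -> 1048576, i.e. over one million
```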