Quantum computing began in the early 1980s, when physicist Paul Benioff proposed the first quantum mechanical model of a Turing machine. Richard Feynman and Yuri Manin then suggested that a quantum computer could simulate physical systems that a classical computer could not. David Deutsch, a theoretical physicist at Oxford University, is widely regarded as the father of quantum computing.

A quantum computer is a computing device that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Rather than working only with definite 1s and 0s, it computes with the probabilities of an object's state before it is measured, which gives it the potential to process exponentially more states than a classical computer of the same size. The unique properties of qubits make it possible to design algorithms that run significantly faster on a quantum computer than on a classical one, with applications across science and business.

Today's computers work by manipulating bits that exist in one of two states: a 0 or a 1. In other words, classical computers carry out logical operations using the definite position of a physical state; a single binary state such as on or off, up or down, 1 or 0 is called a bit. Quantum computers are not limited to two states: they encode information as quantum bits, or qubits, which can exist in superposition. Qubits are realized physically by ions, photons, atoms, or electrons, together with the control devices that let them act as computer memory and a processor. Because a quantum computer can hold many of these states simultaneously, it has the potential, for certain problems, to vastly outperform today's most powerful supercomputers.
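The idea of superposition can be made concrete with a few lines of plain Python. This is a minimal sketch, not how a quantum computer is programmed in practice: a single qubit is modeled as two amplitudes, and applying a Hadamard gate (a standard one-qubit operation) turns a definite 0 into an equal superposition of 0 and 1.

```python
import math

# A qubit is described by two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1.0, 0.0  # the definite state |0>, just like a classical bit

# The Hadamard gate maps (alpha, beta) to an equal superposition:
h = 1 / math.sqrt(2)
alpha, beta = h * (alpha + beta), h * (alpha - beta)

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(p0, p1)  # both approximately 0.5: a 50/50 chance of measuring 0 or 1
```

Until the qubit is measured, both amplitudes exist at once; measurement collapses it to a single classical bit with the probabilities shown.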

The effect of quantum computers on cryptography

Integer factorization, which underpins the security of public-key cryptographic systems, is believed to be computationally infeasible on an ordinary computer for large integers that are the product of a few large primes. A quantum computer could efficiently solve this problem using Shor's algorithm to find a number's factors, which would allow it to break some of the cryptographic systems in use today. Most popular public-key ciphers rest on the difficulty of integer factorization or the discrete logarithm problem, both of which Shor's algorithm solves. RSA, Diffie–Hellman, and elliptic curve Diffie–Hellman, for example, could all be broken, with significant ramifications for electronic privacy and security.
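Shor's algorithm works by reducing factoring to period finding: given N and a random base a, find the smallest r with a^r ≡ 1 (mod N); the factors then often fall out of gcd computations. The sketch below (with hypothetical helper names) runs the whole reduction classically on a tiny number. The brute-force `find_period` loop is exactly the step a quantum computer replaces with an exponentially faster subroutine.

```python
from math import gcd

def find_period(a, n):
    # Classical stand-in for the quantum subroutine: find the smallest
    # r > 0 with a^r = 1 (mod n). This loop takes exponential time in the
    # bit length of n; Shor's quantum period-finding does it efficiently.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_sketch(n, a):
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky pick: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None               # odd period: retry with another base a
    candidate = gcd(pow(a, r // 2) - 1, n)
    return candidate if 1 < candidate < n else None

print(shor_classical_sketch(15, 7))  # 3, since 15 = 3 * 5
```

For 15 with base 7 the period is 4, and gcd(7^2 - 1, 15) = 3 recovers a factor; for a 2048-bit RSA modulus only the quantum version of the period-finding step is feasible.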

However, not all cryptographic algorithms appear to be broken by quantum computers. Some public-key schemes are based on problems other than the integer factorization and discrete logarithm problems to which Shor's algorithm applies. Lattice-based cryptosystems, for instance, are not known to be breakable by quantum computers, and finding a polynomial-time algorithm for the dihedral hidden subgroup problem, which would break many lattice-based cryptosystems, remains a well-studied open problem.

Many essential aspects of security rely on encryption and public-key cryptography, which underpin e-commerce and the protection of secret electronic information. These rest in turn on mathematical problems that are very difficult to “break”. With suitable key lengths (e.g. AES-128, RSA-2048, ECDSA-256), modern algorithms are not susceptible to brute-force attacks: even with massive amounts of computing power, such attacks would take an infeasibly long time.

However, it is possible to create quantum algorithms that dramatically reduce the time it takes to break these schemes. Symmetric encryption algorithms such as AES are still thought to be safe with sufficient key length (e.g. AES-256 or larger), but current asymmetric algorithms such as RSA and ECDSA may become insecure once quantum computers reach a certain scale.
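The reason symmetric ciphers fare better is worth spelling out. The source does not name it, but the relevant quantum attack on symmetric keys is Grover's algorithm, which searches an unstructured space of 2^k keys in roughly 2^(k/2) queries, so it only halves the effective key length rather than breaking the cipher outright. A back-of-the-envelope sketch of that arithmetic:

```python
def grover_effective_bits(key_bits: int) -> int:
    # Grover's algorithm brute-forces a k-bit key in about 2^(k/2)
    # quantum queries, so a symmetric key offers roughly k/2 bits of
    # security against a quantum attacker. This is why AES-256 is
    # recommended over AES-128 for long-term, quantum-resistant use.
    return key_bits // 2

for k in (128, 192, 256):
    print(f"AES-{k}: ~{grover_effective_bits(k)} bits of quantum security")
```

AES-256 thus retains about 128 bits of security even against Grover, while Shor's algorithm reduces RSA and ECDSA to polynomial-time attacks with no comparable fallback.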

Quantum computers are expected to bring breakthroughs in medicine, manufacturing, artificial intelligence, machine learning, and more, but unfortunately, rogue actors could also use them for destructive purposes. The main disadvantage of quantum computing is that the technology required to build a large-scale quantum computer is not yet available. The reason is decoherence: a qubit loses its quantum state as soon as it interacts with its environment, yet maintaining that coherent state is essential for a quantum computer to function. In particular, qubits are highly susceptible to almost undetectable amounts of thermal and electromagnetic interference that are difficult to eliminate. Another disadvantage is, of course, the high cost.