
Wednesday 2 September 2015

Origin Of Quantum Computing

As technology evolves, the scale of integration increases and more transistors fit into the same space, so ever smaller microchips are made; and the smaller the chip, the faster it can process. However, we cannot make chips infinitely small. There is a limit below which they stop working correctly: at the scale of nanometers, electrons escape from the channels through which they are supposed to circulate. This effect is called quantum tunneling.

A classical particle that encounters an obstacle cannot cross it and simply bounces back. But electrons are quantum particles and behave like waves, so there is a probability that some of them will pass through a wall if it is thin enough; the signal can then leak into channels where it should not flow, and the chip stops working properly.
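As a rough illustration (not part of the original post), the chance of an electron getting through a thin barrier can be approximated, for a rectangular barrier much wider than the decay length, by T ≈ exp(−2·d·√(2m(V−E))/ħ). The sketch below uses assumed numbers (a 1 eV barrier, the free electron mass) purely to show how sharply that probability grows as the barrier gets thinner:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunneling_probability(barrier_nm, barrier_height_ev=1.0):
    """Rough wide-barrier estimate T ~ exp(-2*kappa*d) for a rectangular barrier.

    barrier_nm: barrier (channel wall) thickness in nanometres.
    barrier_height_ev: assumed barrier height above the electron energy, in eV.
    """
    kappa = math.sqrt(2 * M_E * barrier_height_ev * EV) / HBAR  # decay constant, 1/m
    d = barrier_nm * 1e-9
    return math.exp(-2 * kappa * d)

for nm in (5.0, 2.0, 1.0, 0.5):
    print(f"{nm:4.1f} nm barrier -> T ~ {tunneling_probability(nm):.2e}")
```

With these assumed values the probability is negligible at 5 nm but rises by many orders of magnitude below 1 nm, which is why leakage becomes a problem only at nanometer scales.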

Consequently, traditional digital computing will soon reach its limit, since it has already reached scales of only a few tens of nanometers. This raises the need for new technologies, and that is where quantum computing comes in.

The idea of quantum computation arose in 1981, when Paul Benioff set out his theory for harnessing the laws of quantum mechanics in the computing environment. Instead of working at the level of electrical voltages, it works at the level of quanta. In digital computing, a bit can take only one of two values: 0 or 1. In quantum computing, however, the laws of quantum mechanics apply and the particle can be in a coherent superposition: it can be 0, it can be 1, and it can be 0 and 1 at the same time (a combination of two orthogonal states of a subatomic particle). This allows several operations to be performed simultaneously, depending on the number of qubits.
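As a minimal sketch of that idea (an illustration added here, not the post's own code), a qubit can be represented as a two-component vector of complex amplitudes; applying a Hadamard gate to the state |0⟩ puts it into an equal superposition of 0 and 1:

```python
import numpy as np

# Basis states |0> and |1> as vectors of complex amplitudes.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Hadamard gate: sends |0> to the equal superposition (|0> + |1>) / sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ ket0
print("amplitudes:", qubit)                  # [0.707+0j, 0.707+0j]
print("probabilities:", np.abs(qubit) ** 2)  # [0.5, 0.5] -> measuring gives 0 or 1 equally often
```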

The number of qubits indicates how many bits can be held in superposition. With conventional bits, a three-bit register has eight possible values, but the register can hold only one of those values at a time. With a vector of three qubits, however, the state can take all eight values at once thanks to quantum superposition, so a three-qubit vector allows a total of eight parallel operations. As expected, the number of operations grows exponentially with the number of qubits.
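Continuing the same toy representation (again an assumption for illustration), a register of three qubits is the tensor product of three single-qubit states, which gives a vector of 2³ = 8 amplitudes; putting each qubit into superposition spreads the state over all eight values at once:

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

plus = H @ ket0                                 # one qubit in equal superposition
register = np.kron(np.kron(plus, plus), plus)   # three qubits -> 2**3 = 8 amplitudes

print(len(register))                            # 8
print(np.abs(register) ** 2)                    # each value 000..111 has probability 1/8
```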

To get an idea of the breakthrough, a 30-qubit quantum computer would be equivalent to a conventional processor of 10 teraflops (10 trillion floating-point operations per second), while today's computers work on the order of gigaflops (billions of operations per second).
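As a back-of-the-envelope check (the teraflop figure above is a popular rough comparison, not derived here), a 30-qubit register spans 2³⁰, roughly a billion, simultaneous amplitudes, which is the kind of count behind that comparison:

```python
for n in (3, 10, 30):
    print(f"{n:2d} qubits -> 2**{n} = {2**n:,} simultaneous values")
```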