What is Quantum Computing


Since the 1980s, scientists have been working to develop the quantum computer: a machine that exploits the laws of quantum mechanics to overcome the limits of today's supercomputers and open new horizons for artificial intelligence. This article is a complete guide to what quantum computing is.

Systems based on a few qubits are already available today. Still, the research challenge is to build quantum computers based on hundreds or thousands of qubits, a condition that would allow a real "quantum leap" in the number and quality of computations a quantum computer could perform.

Systems containing thousands of qubits could arrive within a decade. In the United States, IBM, Google, Microsoft, Intel, and research centers such as MIT and Harvard are leading the race, in a near cold-war rivalry with research centers in Russia and China. The European Union has also decided to take on an essential role by investing one billion euros over the next ten years, thanks in part to the push of the Italian physicist Tommaso Calarco, director of the Center for Integrated Quantum Sciences and Technologies of the University of Ulm (Germany).

What is quantum computing?


Quantum computing can be defined as the set of computational techniques, and their study, that use quantum states to store and process information. It differs from classical computer science on many points, starting with its founding principles.

Quantum computing is, first of all, based on the no-cloning principle: an unknown quantum state cannot be copied, and therefore cannot be read with absolute fidelity. Every measurement made on a quantum state destroys much of the information it contains, leaving it in a "base state." Finally, thanks to quantum entanglement, data can be encoded in non-local correlations between different parts of the physical system.

Although it may appear less defined and less "precise" than computer science as we know it today, it can be shown that quantum Turing machines reach the same degree of precision as classical Turing machines, while also performing calculations and operations impossible for the latter.

What is the qubit

If the basis of classical computing is the bit, understood as the minimum unit of information, at the base of quantum computing we find the qubit, or quantum bit. Compared to the classic bit, the qubit offers greater versatility and computing potential: while the former can only assume a well-defined value ("0" or "1", "open" or "closed," "on" or "off"), the qubit can also assume superposed values.

A quantum element such as the qubit does not have a single well-defined "value state" but exists in a superposition of values. In the famous paradox proposed by Schrödinger to illustrate this, the cat can be simultaneously alive and dead. In the case of electrons, the Heisenberg uncertainty principle dictates that it is impossible to determine with certainty, at a given instant of time, both the energy associated with them and their position. Depending on the parameters of the system, many combinations of states are therefore possible at once. By the same principle, in quantum computing the qubit can simultaneously assume the values "0" and "1".

This has a significant impact on computing. By being able to assume more values at the same time, the qubit allows a more substantial amount of information to be processed, guaranteeing a greater throughput of data compared to classical computing, which is limited by its binary scale ("0" or "1").
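The superposition-then-collapse behavior described above can be illustrated with a minimal classical simulation of a single qubit. The function names and the amplitude-pair representation below are illustrative choices, not any particular library's API:

```python
import math
import random

# A qubit's state can be represented by a pair of amplitudes (a, b) with
# |a|^2 + |b|^2 = 1: |a|^2 is the probability of reading "0" on
# measurement, |b|^2 the probability of reading "1".

def make_superposition():
    """Equal superposition of |0> and |1>: both outcomes equally likely."""
    a = 1.0 / math.sqrt(2.0)
    return (a, a)

def measure(state):
    """Measuring collapses the qubit: the outcome is random, and afterwards
    the state is exactly |0> or |1> -- the superposition is destroyed."""
    a, _b = state
    if random.random() < abs(a) ** 2:
        return 0, (1.0, 0.0)   # collapsed to the base state |0>
    return 1, (0.0, 1.0)       # collapsed to the base state |1>

outcome, collapsed = measure(make_superposition())
```

Repeating the measurement many times yields "0" roughly half the time, but any single measurement destroys the superposition, which is exactly why a quantum state cannot be read with absolute fidelity.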

Cooling of quantum computers

Following the principles of quantum mechanics and exploiting the laws of quantum physics, the quantum computer uses qubits to perform complex calculations in parallel at a speed unimaginable for today's supercomputers (taking seconds instead of years). The price is fragility: disturbances to the quantum state lead to the loss of data and information useful for the calculation process. This demands suitable hardware infrastructures (today, cooling the systems requires helium, and the machines must be kept in vibration-free environments) and algorithms specifically developed for quantum computing.


Regarding the practical operation of the quantum computer, especially its cooling, there are two predominant approaches to date:

  • the first cools the circuits close to so-called absolute zero (0 kelvin, corresponding to -273.15 degrees Celsius) so that they function as superconductors, without resistances that interfere with the current. In this case, we speak of "quantum dots" to indicate a nanostructure of semiconductor material inserted in another semiconductor with a larger energy gap;
  • the second method uses "trapped ions": an atom or molecule with an electric charge is "trapped" in electromagnetic fields and manipulated so that the movement of the electrons produces a change in the state of the ion, which can therefore function as a qubit.

The generalist quantum computer and quantum annealers

Some of the limitations of quantum computing discussed in the previous paragraphs lead the "purists" of the field to consider the quantum computers on the market today not real generalist computers but so-called "quantum annealers": computers that exploit quantum physics and mechanics for combinatorial optimization problems (i.e., machines that must be tuned to the specific problem they are meant to solve). The best-known example is D-Wave, the system chosen by NASA and Google; D-Wave Systems placed it on the market in 2013, advertising it as "the first public and commercial quantum computer in the world," even though the company had already developed others in the preceding couple of years.

Quantum annealers are good systems for combinatorial optimization (i.e., for solving problems by finding the best solution among all feasible solutions where the variables are discrete). In this sense, so-called "digital annealer" microchips are already on the market: traditional processors inspired by the operation of quantum computing to enable very fast computation. We are still far from true quantum parallel computing, though, and the superior performance of a digital annealer compared to normal CPUs depends on the processing itself, i.e., on how it is used and for what type of calculation, considering that modern CPUs and GPUs are already able to sustain massive calculations.
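Annealers, quantum or digital, minimize an energy function over discrete variables. The classical cousin of this idea, simulated annealing, can be sketched in a few lines. The tiny Ising-style problem, function names, and cooling schedule below are illustrative assumptions, not D-Wave's actual interface:

```python
import math
import random

# Classical simulated annealing on a toy Ising-style problem: a sketch of
# the *kind* of combinatorial optimization an annealer targets (real
# quantum annealing hardware works very differently).

def energy(spins, couplings):
    """Ising energy: sum of J * s_i * s_k over the coupled spin pairs."""
    return sum(j * spins[i] * spins[k] for (i, k), j in couplings.items())

def anneal(n, couplings, steps=5000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(spins, couplings)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        spins[i] = -spins[i]                  # propose flipping one spin
        e_new = energy(spins, couplings)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            spins[i] = -spins[i]              # reject: undo the flip
    return spins, e

# Anti-ferromagnetic triangle: frustrated, so the best reachable energy is -1.
couplings = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
best_spins, best_e = anneal(3, couplings)
```

The "frustrated triangle" is the classic minimal example: no assignment satisfies all three anti-ferromagnetic couplings at once, so the optimizer must settle for the best compromise, which is the defining feature of hard combinatorial problems.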

The Beginning of the Quantum Computer: Feynman and Deutsch

The history of the quantum computer goes back to 1982, the year in which the American physicist Richard Feynman set out the theory hypothesizing its realization thanks to the principle of superposition of states of elementary particles; David Deutsch took the idea up again in 1985 and succeeded in demonstrating the validity of Feynman's theory. The very first intuition of such a possibility came from the scientist Murray Gell-Mann, who in the early 1980s saw in the behavior of elementary particles the possibility of developing a new type of computer science. (The American physicist received the Nobel Prize in Physics in 1969 for his studies on elementary particles, in particular quark theory.)

In 1982, Richard Feynman theorized what today are classified as "quantum simulators": processing machines that allow the study of quantum systems that are difficult to study in the laboratory and impossible to model with a supercomputer.

Through his studies, Feynman went further and showed that a classical Turing machine would experience an exponential slowdown when simulating quantum phenomena, while his hypothetical universal quantum simulator would not. In his 1982 work, Feynman essentially said that a Turing machine could simulate a quantum system only with an exponential slowdown (in the sense that the required algorithmic complexity would overwhelm the available computational capacity).

The last twenty years of the quantum computer

The first prototype of a quantum computer was created in 1997 by IBM at its research center in Almaden (San Jose, California) by measuring the spin of atomic nuclei (i.e., the quantum number associated with those particles). Only a year later, Bruce Kane, an Australian physicist at the University of New South Wales, proposed building a quantum computer on phosphorus atoms arranged on a layer of silicon only 25 nanometers thick. In Kane's hypothesis, the old silicon chips are not retired but reused, with phosphorus atoms inserted inside them.


However, it was in 2001 that IBM's Almaden Research Center presented the first seven-qubit quantum processor (composed of a single molecule with seven nuclear spins).

Challenges in practice

In theory, quantum computing looks promising, but it is very difficult to implement in practice.

First of all, qubits are extremely unstable: even minor external influences break entanglement. The maximum lifetime of a quantum system during which it remains suitable for quantum computing (the decoherence time) is extremely short. According to the resource Quantum Computing Report, the decoherence time currently does not exceed hundreds of microseconds; the record, 148.5 μs, belongs to a 20-qubit IBM computer in Tokyo. After that time, the system starts producing white noise instead of probability distributions. And within this short period, it is necessary to initialize the system of qubits, carry out the calculations, and read out the result.
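To get a feel for what such a coherence budget means, one can model coherence as decaying roughly like exp(-t/T2). This exponential model, and the 0.1 μs gate time used below, are simplifying illustrative assumptions; only the 148.5 μs record figure comes from the text above:

```python
import math

# Back-of-the-envelope decoherence model: coherence ~ exp(-t / T2).
T2_US = 148.5  # record decoherence time quoted above, in microseconds

def remaining_coherence(t_us, t2_us=T2_US):
    """Fraction of coherence left after t_us microseconds of computation."""
    return math.exp(-t_us / t2_us)

# Assuming each gate takes ~0.1 us: how many sequential gates fit before
# coherence falls below 50%?
gate_us = 0.1
budget_us = -T2_US * math.log(0.5)   # time at which coherence drops to 0.5
max_gates = int(budget_us / gate_us)
```

Even under these generous assumptions, only on the order of a thousand sequential operations fit inside the window, which is why initialization, computation, and readout must all be squeezed into microseconds.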

Another obstacle to running complex, lengthy algorithms on quantum computers is errors. The likelihood of errors in calculation, and in reading and writing information, increases as the number of qubits grows. Standard error correction methods (duplicating computations and averaging) do not work in the quantum world. "If several physical qubits work as one logical qubit, the number of operations performed with such logical qubits can be increased by two orders of magnitude, at the cost of reducing the number of usable qubits by an order of magnitude," explains Stepan Snigirev, a former employee of the Institute of Quantum Optics of the Max Planck Society and candidate of physical and mathematical sciences.

Quantum computing for dummies

To solve practical problems, the number of qubits used must be increased radically. Increasing the number of qubits in a quantum computer is a complex technological process: in the best universal quantum computers designed for solving a wide range of problems, there are now no more than a hundred of them (see table).

But even IBM's quantum computers, the best by this parameter, manage to entangle only six neighboring qubits. To entangle more distant ones, a chain of additional quantum operations must be built using additional qubits, which accordingly increases the overall level of errors.


What is Quantum Computing: the fight for the qubit

Developers are striving to create computers that can control more qubits, for longer computations and with fewer errors. To do so, a variety of technologies are used to ensure the superposition and entanglement of qubits.

Leading the qubit race are superconducting qubit technologies, based on nanoscale discontinuities in a superconductor. The superconducting current circulating in such microcircuits behaves like one large quantum object and has exactly the two necessary basic states, distinguished by the number of charge carriers (pairs of electrons) on individual microcircuit elements.

Superconducting qubits can be arbitrarily placed on chips and fabricated using mature technology from the semiconductor industry, and they are easier to manipulate than individual atoms or ions. The qubits in IonQ's new quantum systems, by contrast, are individual atoms of the rare earth element ytterbium, suspended in a vacuum and addressed with precisely directed laser beams. This approach, called ion trapping, is effective in theory but technically difficult: such systems can only operate in ultra-high-vacuum installations.



Theoretically, photons in waveguides can also serve as qubits for a quantum computer, and it is possible to implement computational algorithms with such qubits, but serious problems arise when scaling the system. The possibility of using qubits based on the phenomenon of nuclear magnetic resonance (NMR) is also being studied: the spins of atomic nuclei in an external magnetic field encode the state, not of a single particle but of the totality of all molecules in the substance used (liquids, in the first experiments). The number of molecules interacting with each other in the working volume of a substance reaches several trillion, and further development of this approach raises doubts among specialists because of the complexity of controlling such a large number of quantum states.

Another candidate is qubits based on point defects in crystal lattices. The special electronic structure of these defects allows them to respond to laser irradiation and emit fluorescent radiation with a longer wavelength than the laser light. The most suitable for quantum computing are nitrogen defects in diamond and phosphorus defects in silicon.

The defect does not need to be confined by external electromagnetic fields, as ions do, or cooled to low temperatures, which opens up prospects for commercial solutions. However, scaling such solutions is also very difficult.

