Quantum computing aims to revolutionize computer technology by drawing on quantum mechanics. Principles such as superposition and quantum entanglement are expected to find practical application in quantum computers in the form of qubits, leading to high-performance machines with enormous computing power. The practical future of quantum computers still faces technological obstacles, such as the interconnection of qubits and the need for advanced cooling systems.
What is quantum computing?¶
A specter is haunting the world of computing, a specter called “quantum computing”. If the predictions one day come true and quantum computers reach market maturity, they are set to spark nothing less than a technological revolution. How is this supposed to work? Thanks to the laws of quantum mechanics, three principles of which can be cited as the pillars of quantum computing:
 Superposition: the ability of a quantum system to assume multiple states simultaneously, i.e. 1 and 0 instead of 1 or 0.
 Quantum entanglement: a quantum mechanical phenomenon in which two or more particles are linked into a single interconnected system; a change to one particle of the entangled system automatically affects all the others.
 Quantum decoherence: occurs when a system in superposition is measured and thereby collapses into a defined state, from 1 and 0 to 1 or 0.
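The three principles above can be illustrated with a minimal simulation sketch in Python (assuming NumPy is available; this is an illustration, not an actual quantum computation):

```python
import numpy as np

# A qubit's state is a 2-component complex vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. An equal superposition of 0 and 1:
state = np.array([1, 1]) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(probs)  # approximately [0.5 0.5]: outcomes 0 and 1 are equally likely

# Measuring collapses the superposition into one definite state
# (the transition from "1 and 0" to "1 or 0" described above).
rng = np.random.default_rng(seed=0)
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2)
collapsed[outcome] = 1.0
print("measured:", outcome, "state after collapse:", collapsed)
```

Note that the measured value is random: the amplitudes only determine the probabilities of each outcome.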
Conventional computers are based on the binary electrical principle of “on/off” or “1/0”. Quantum computers, by contrast, rely on non-binary, multidimensional quantum mechanical states. Unlike conventional computers, they do not solve problems sequentially but in parallel, even with complex inputs. In this way, they can offer greater computing power and significantly reduced calculation times.
If all goes according to plan, quantum computers will bring a technological leap forward with repercussions in all areas of complex data processing, including e-commerce, encryption, medicine, financial transactions, Big Data, artificial intelligence and machine learning.
How does quantum computing work?¶
Quantum computing is not easy to get into. Instead of binary bits, quantum computers use qubits (quantum bits) to solve mathematical problems and process data sets. The classical bit is based on binary code: a bit can only take one of two states, 1 or 0. Qubits, on the other hand, operate non-binarily and adopt both states simultaneously: 1 and 0. This quantum mechanical approach is expected to multiply the performance potential of quantum computers many times over compared to binary PCs. Qubits can adopt not only the states 1 and 0 simultaneously, but also an infinite number of intermediate states. Because quantum computers process information in parallel, they can solve complex tasks that are practically impossible for classical computers.
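The “infinite number of intermediate states” corresponds to the fact that a qubit's amplitudes vary continuously. A minimal sketch (illustrative only, assuming NumPy):

```python
import numpy as np

# A classical bit is exactly 0 or 1. A qubit can be cos(theta)|0> + sin(theta)|1>,
# where theta varies continuously -- an infinite number of intermediate states.
for theta in np.linspace(0, np.pi / 2, 5):
    p0, p1 = np.cos(theta) ** 2, np.sin(theta) ** 2  # Born-rule probabilities
    print(f"theta={theta:.2f}  P(0)={p0:.2f}  P(1)={p1:.2f}")
```

Only the endpoints (theta = 0 and theta = pi/2) correspond to classical bit values; everything in between is a genuine superposition.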
Superposition and quantum entanglement¶
A simple image illustrates this principle: picture the operation of classical and quantum computers by analogy with a coin toss. A classical computer must wait for the coin to land before it can continue; it understands only the heads state (representing the value 0) or the tails state (representing the value 1). A quantum computer, on the other hand, uses a coin that never lands but keeps spinning in the air, embodying heads and tails at the same time. It is in a state of superposition.
Qubits only adopt a binary state when a measurement is taken. Imagine the spinning coin again. As long as no one looks at it, it spins in the air and represents heads and tails simultaneously. As soon as its state is observed or measured, the coin falls to the ground and shows either heads or tails. Added to this is the entanglement of qubits: if one qubit changes, the interconnected qubits change with it due to quantum entanglement, which further increases the computing speed of quantum computers. Multiple qubits are grouped into quantum registers to perform arithmetic operations.
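Entanglement can also be sketched with a classical simulation (assuming NumPy; the two-qubit Bell state used here is a standard textbook example, not something from this article):

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the two-qubit basis 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # only 00 and 11 can ever be observed

# Sample 1000 measurements: the two qubits always agree, so measuring
# one immediately fixes the other -- the entanglement described above.
rng = np.random.default_rng(seed=1)
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs).tolist()
print(set(outcomes))  # only '00' and '11' occur, never '01' or '10'
```

The mixed outcomes '01' and '10' have zero probability: the state of one qubit is perfectly correlated with the other, no matter how many samples are drawn.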
What is the increase in performance brought by quantum computers?¶
Science and industry have high hopes for the performance of quantum computers. Some scientists even expect them to simulate the Big Bang or provide evidence for the existence of parallel universes. Despite the technical challenges involved, quantum computers offer enormous potential. A single qubit already carries more computing power than a bit, since it can simultaneously take the states 1 and 0 as well as many intermediate states, and computing power multiplies with each additional qubit: three qubits can adopt eight states in parallel, and 300 qubits already 2^300 states.
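The exponential scaling mentioned above is easy to verify, since an n-qubit register spans 2^n basis states (a plain Python sketch, not a quantum computation):

```python
# Each additional qubit doubles the number of basis states a quantum
# register can hold in superposition: n qubits span 2**n states.
for n in (1, 2, 3, 300):
    print(f"{n} qubit(s) -> {2 ** n} simultaneous basis states")
```

Three qubits give 2^3 = 8 states, and 2^300 is a number with more than 90 digits, far beyond what any classical machine could enumerate explicitly.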
What are the advantages and disadvantages of quantum computing?¶
| Advantages | Disadvantages |
|---|---|
| ✔ Greatly increased computing power and reduced computation times, even with large, complex data sets | ✘ High technical requirements for qubit cooling and entanglement |
| ✔ Processes large numbers of input values in parallel rather than sequentially | ✘ Requires a paradigm shift and new digital infrastructures, as quantum computers are based on different principles from traditional PCs |
| ✔ Promotes the development of artificial intelligence and machine learning | ✘ Such performance poses a threat in the wrong hands |
| ✔ Facilitates medical research, because quantum computers can precisely simulate molecules and genes and process Big Data | ✘ Calculation results span a range of outcomes and may be less precise than those of binary computers |
| ✔ Offers unprecedented potential for highly secure encryption based on prime factorization | |
Possible fields of application of quantum computers¶
It will be years before quantum computers can find practical application. However, the foreseeable advantages for complex data systems and data processing in general make it possible to foresee the following areas of application:
 quantum simulations for natural sciences and medicine
 quantum chemistry and quantum biology
 development of complex financial models
 optimization of artificial intelligence and selflearning systems
 optimization of encryption techniques in cryptography
 smart technologies such as smart grids, smart cities and smart homes
 autonomous driving
 data mining
 aerospace
Technical obstacles to quantum computers¶
The main reason quantum computers are still in the development stage is the technical requirements involved. Qubits are highly sensitive, volatile quantum systems. To achieve the most accurate results possible, quantum computers must be able to reliably interconnect millions of qubits. Another pitfall: quantum computers can only operate very close to absolute zero (−273.15 degrees Celsius). Cooling today’s quantum chips takes days and requires state-of-the-art cooling systems.
Quantum algorithms used to solve complex problems and process data are based on new paradigms compared with known algorithms. These include multidimensional computing and storage units and simulation spaces that contemporary computers cannot form. Quantum computers therefore require new hardware and software technologies to convert data sets into qubit-compatible forms and process them. Programming methods and programming languages will also have to find new ways to adapt to the principles of quantum mechanics.
Where is quantum computing today?¶
Quantum computing was first discussed in 1980, when physicist Paul Benioff described a variant of the Turing machine operating according to the principles of quantum mechanics. In the early 1980s, theoretical physicist Richard Feynman and mathematician Yuri Manin then formulated the performance potential of quantum computers compared to classical computers. Since then, interest in quantum computers has continued to grow, as shown by the millions of dollars that governments and companies such as IBM, Google and Microsoft are investing to make quantum computing a reality.
In 2019, IBM presented a quantum computer with 20 qubits. On October 23, 2019, Google in turn proclaimed that it had reached “quantum supremacy” with its Sycamore chip, as part of a cooperation between Google AI and NASA. Sycamore reportedly solved, for the first time, a task that even the best classical supercomputers cannot solve in a reasonable time. In 2020, IBM announced the development of one of the largest quantum computers to date, “Hummingbird”, with 65 qubits. The IBM “Eagle” model followed in 2021 with 127 qubits.
At the beginning of 2023, another major problem in quantum computing was addressed: until then, it had been difficult to transfer data between quantum-computer chips efficiently and consistently. It is now possible to achieve a success rate of up to 99.999993% when transferring data between two chips.
Despite the steady progress of these supercomputers, it is still too early to expect them to replace classical computers outright. A hybrid approach combining classical PCs and quantum computers is far more likely: quantum computers would extract initial results from enormous quantities of data, which more precise classical computers, working on the binary principle, would then refine.