Is Quantum Computing a Real Thing?

Quantum mechanics emerged as a branch of physics in the early 1900s to explain nature on the scale of atoms and led to advances such as transistors, lasers, and magnetic resonance imaging. The idea to merge quantum mechanics and information theory arose in the 1970s but garnered little attention until 1982 when physicist Richard Feynman gave a talk in which he reasoned that computing based on classical logic could not tractably process calculations describing quantum phenomena. Computing based on quantum phenomena configured to simulate other quantum phenomena, however, would not be subject to the same bottlenecks. Although this application eventually became the field of quantum simulation, it didn’t spark much research activity at the time.

What is Quantum Computing?

Quantum computers harness the unique behavior of quantum mechanics and apply it to computing, which introduces new concepts to traditional programming methods. Quantum computing uses the qubit as its basic unit of information.
A quantum computer has three primary parts:

  • An area that houses the qubits.
  • A method for transferring signals to the qubits.
  • A classical computer to run a program and send instructions.

Quantum computing is a type of computation that harnesses the collective properties of quantum states, such as superposition, interference, and entanglement, to perform calculations. The devices that perform quantum computations are known as quantum computers.
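The idea of superposition can be made concrete with a tiny classical simulation. The sketch below (a simplified illustration, not how a real quantum computer is programmed; the function names `hadamard` and `measure` are this example's own) models one qubit as a pair of amplitudes, puts it into an equal superposition, and samples a measurement outcome.

```python
import math
import random

# A single qubit is described by two amplitudes (a, b) with
# |a|^2 + |b|^2 = 1. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2 (the Born rule).

def hadamard(state):
    """Apply a Hadamard gate, which turns a definite basis state
    into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Sample a measurement outcome: 0 or 1 with the Born-rule
    probabilities."""
    a, _b = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)       # starts as a definite 0
qubit = hadamard(qubit)  # now an equal superposition of 0 and 1
probs = (abs(qubit[0]) ** 2, abs(qubit[1]) ** 2)
print(probs)             # roughly (0.5, 0.5): a 50/50 coin until measured
print(measure(qubit))    # collapses to a single 0 or 1
```

Interference, the other property mentioned above, shows up in the same model: applying `hadamard` twice returns the qubit to a definite 0, because the two amplitude paths to 1 cancel out.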

Why do we need Quantum Computers?

Until now, we’ve relied on supercomputers to solve most hard problems. These are very large classical computers, often with thousands of CPU and GPU cores. However, supercomputers aren’t very good at certain kinds of problems, even ones that seem easy at first glance. This is why we need quantum computers. Supercomputers don’t have the working memory to hold the myriad combinations that real-world problems generate, so they have to analyze each combination one after another, which can take a very long time.
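The "myriad combinations" point can be made concrete. A toy illustration (my own, not from the article): a problem with n independent yes/no choices has 2^n possible combinations, and a classical machine checking them one after another must touch every one.

```python
from itertools import product

# Count how many combinations an exhaustive classical search visits
# for a problem with n yes/no choices. The count doubles with each
# added choice, so checking them one by one quickly becomes
# infeasible, no matter how fast each individual check is.
def combinations_checked(n):
    checked = 0
    for _assignment in product([0, 1], repeat=n):
        checked += 1
    return checked

for n in (4, 10, 20):
    print(f"{n} choices -> {combinations_checked(n)} combinations to test")
```

At 20 choices there are already over a million combinations; at 100 choices the count exceeds the number of atoms in the visible universe, which is the regime where this one-after-another strategy breaks down.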

Quantum vs. Classical

Quantum computers process information differently. Classical computers use bits, implemented with transistors, each of which is either 1 or 0. Quantum computers use qubits, which can be in a superposition of 1 and 0 at the same time. Linking more qubits together grows a quantum computer's computational state space exponentially, whereas linking together more transistors grows a classical machine's capacity only linearly. Classical computers remain best for everyday computing tasks. Quantum computers, meanwhile, are promising for simulations and data analyses, such as modeling molecules for chemical or drug research. They must be kept ultra-cold, however, and they are far more expensive and difficult to build.
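The exponential-versus-linear contrast can be shown directly. A quick sketch (the function name `amplitudes_needed` is this example's own): describing n classical bits takes n values, one per bit, while fully describing the state of n qubits takes 2^n amplitudes.

```python
# n classical bits: n values suffice to describe the register.
# n qubits: the joint state is a vector of 2**n amplitudes, which is
# the sense in which capability grows exponentially with each qubit.
def amplitudes_needed(n_qubits):
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n} bits: {n} values; {n} qubits: {amplitudes_needed(n)} amplitudes")
```

This same exponential growth cuts both ways: it is why simulating even ~50 qubits strains the largest classical supercomputers, and why quantum simulation was the application Feynman originally proposed.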

Applications of Quantum Computing


  • Cybersecurity
  • Drug Development
  • Financial Modeling
  • Better Batteries
  • Cleaner Fertilization
  • Traffic Optimization
  • Weather Forecasting and Climate Change
  • Artificial Intelligence
  • Solar Capture
  • Electronic Materials Discovery


There are many problems still to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a persistent challenge. However, breakthroughs over the past 15 years have made some forms of quantum computing practical. There is still much debate as to whether useful, large-scale machines are less than a decade away or a hundred years into the future. Either way, the potential this technology offers is attracting tremendous interest from both governments and the private sector.