
State of Quantum Computing in 2024 for Business Leaders

Spare yourself the trouble and delay learning anything about quantum computing until next year unless you are working on:

  • A problem that is not solvable in a reasonable time with current computers (e.g., simulating molecular interactions). Such problems are common, and almost all Fortune 500 companies could benefit from quantum computers
  • Cryptography, or working at an intelligence agency, or needing to transmit nation-level or mega-corporation-level secrets
  • Quantum computing itself

If you are in one of these fields, quantum computing has the potential to transform your field within a few years. If not, check back later; the technology may have progressed to the point where you, too, need to learn about quantum computing.

As a non-technical corporate leader, what should I do about quantum computing?

If you work on cryptography, work at an intelligence agency, or need to transmit nation-level or mega-corporation-level secrets, stop relying on cryptographic methods whose security rests on factoring large integers. Quantum-ready alternatives already exist, as we discuss in the use cases section.

If you are working on a problem that is not solvable in a reasonable time with current computers, start exploring quantum computing. This article will explain quantum computing and its ecosystem so you can find partner companies who can help you explore how quantum computing can solve your computing challenges. Do not expect immediate results, though. Even though quantum computing companies cite many large companies among their customers, these engagements tend to be PoCs with limited scope. Quantum supremacy (quantum computers outperforming classical computers) has not yet been demonstrated for practical applications.

What is Quantum Computing?

Quantum computing involves non-classical computation that is theoretically superior to classical computation for certain important problems, such as factoring large integers.

Classical computation is the foundation of almost all computing done by machines today, whether in the cloud or on your desktop, mobile or wearable device. We say “almost all” because researchers also run quantum computing experiments, which should likewise be classified as computing. Classical computation relies on deterministic bits with observable states of 0 or 1. As with classical (or Newtonian) physics, this is a fairly intuitive and practical approach. However, it is not efficient at modelling quantum phenomena or dealing with probabilities.

As you may remember from high school, Newton’s formulas are quite accurate for macroscopic objects moving at speeds significantly slower than the speed of light. However, such a classical (or Newtonian) view of the world breaks down at the molecular level and at speeds close to the speed of light, because all matter displays wave-like properties and behaves non-deterministically. Modelling such systems with bits that also display wave-like properties is more efficient.

The capability to model molecules or particles moving close to the speed of light may sound interesting but not especially useful. In fact, it is extremely useful:

  • Modelling molecule-level interactions can unlock new insights in chemistry, biology and healthcare.
  • Quantum computing is effective at modelling probabilities and permutations because quantum mechanics is non-deterministic: the certainty of classical physics is replaced with probabilities. This is what would allow quantum computers to break RSA, possibly the most widely used cryptographic method. You rely on RSA, for example, as you transmit your credit card information to an online merchant (see the toy sketch below).
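
To make the cryptography point concrete, here is a toy RSA example in Python. It is a sketch for illustration only (the primes are absurdly small and nothing here is secure): it shows that whoever can factor the public modulus can recompute the private key, which is exactly the task Shor’s algorithm is expected to make tractable on a large, fault-tolerant quantum computer.

```python
# Toy RSA -- illustration only, NOT real cryptography. Requires Python 3.8+.
p, q = 61, 53             # secret primes (tiny on purpose)
n = p * q                 # public modulus: 3233
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message  # decrypt with the private key (d, n)

# An attacker who can factor n recovers p and q, recomputes phi and d,
# and reads the message. Classically this search is hopeless at real key
# sizes (2048+ bits); Shor's algorithm on a large quantum computer would not be.
for candidate in range(2, n):
    if n % candidate == 0:
        stolen_p, stolen_q = candidate, n // candidate
        break
stolen_d = pow(e, -1, (stolen_p - 1) * (stolen_q - 1))
assert pow(ciphertext, stolen_d, n) == message
```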

There are various approaches to building quantum computers, and photonics is one of them. For a more visual representation of how a photonic quantum computer works, you can check out this video from one of the vendors:

Why is quantum computing relevant now?

Quantum computers are hypothesized to be superior to classical computers for certain important problems. Recent developments suggest that this could become reality, even though timelines for scientific progress are hard to estimate.

  • There have been significant scientific advances in the field:
    • In 2019, Google claimed to have demonstrated quantum supremacy. The benchmark was simulating the outcome of a random sequence of logic gates on a quantum computer, which is the task where quantum computers can possibly have the largest advantage over classical computing. While this is not a commercially useful computation, the fact that a quantum computer surpassed state-of-the-art classical computers is still an important milestone.
    • In 2015, Google claimed that quantum annealers had performed orders of magnitude more efficiently on some optimization problems compared to simulated annealing on classical computers.
  • Quantum computing power grows exponentially with each added qubit, rather than linearly as with classical bits, thanks to the multi-state computational capability of qubits. For example, a quantum computer with 1 qubit can represent 2 classical states at once, 2 qubits can represent 4, 3 qubits can represent 8, and so on. This makes exponential growth in quantum computing power feasible (see the back-of-the-envelope sketch after this list).
  • There is significant investment in this space:
    • Most mega tech companies, such as Fujitsu, Google and IBM, have been investing in quantum computing. In addition, startups such as D-Wave Systems have raised hundreds of millions of dollars to tackle the problem.
    • The number of qubits in quantum computers has been increasing dramatically, from 2 qubits in 1998 to 433 qubits in 2023.1
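
The exponential scaling claim is easy to check with a back-of-the-envelope calculation: describing an n-qubit register classically requires 2^n complex amplitudes. Here is a minimal Python sketch (the 16-bytes-per-amplitude memory estimate is our own simplifying assumption):

```python
# Classically simulating n qubits requires tracking 2**n complex amplitudes,
# so the cost of simulation doubles with every qubit added.
for n_qubits in (1, 2, 3, 10, 20, 30):
    amplitudes = 2 ** n_qubits
    megabytes = amplitudes * 16 / 1e6   # ~16 bytes per complex amplitude
    print(f"{n_qubits:>2} qubits -> {amplitudes:>13,} amplitudes "
          f"(~{megabytes:,.3f} MB to simulate classically)")
```

At 30 qubits the state vector already needs roughly 17 GB, which is why classically simulating even mid-sized quantum computers quickly becomes infeasible.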

How does it work?

Quantum computing allows developers to leverage laws of quantum mechanics, such as quantum superposition and quantum entanglement, to compute solutions to certain important problems faster than classical computers. As usual, we kept this as simple as possible; even so, this was the hardest “how it works” section we have ever written!

Qubits, the bits of quantum computers, have 2 advantages over classical bits: they can hold more than one state during computation, and two qubits can be entangled (i.e., correlated so that measuring one determines the other, regardless of their locations).

Classical computing relies on bits for memory. A bit can be in either the 0 or the 1 state, typically represented physically as a voltage level.

Qubits are the unit of memory in quantum computers. Just like bits in classical computing, they are observed in one of two states: 0 or 1. However, they are subject to quantum mechanics, and when they are not observed, they hold both 0 and 1 with certain probabilities. These probabilities (probability amplitudes, to be precise) can be negative, positive or complex numbers, and they add up, or “superimpose”, like waves in classical physics. This is what allows a single qubit to carry more information during a computation than a classical bit.
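
To see amplitudes superimpose like waves, here is a minimal NumPy sketch of a single simulated qubit (plain Python, not tied to any quantum hardware or vendor SDK). A Hadamard gate puts the qubit into an equal superposition; applying it again makes the amplitudes interfere destructively and restores a definite state:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1:
# |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.
zero = np.array([1, 0], dtype=complex)                # the |0> state

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

superposed = H @ zero
print(np.abs(superposed) ** 2)      # [0.5 0.5] -> 50/50 chance of 0 or 1

# Amplitudes add like waves: a second Hadamard cancels the |1> component
# (destructive interference) and returns |0> with certainty.
print(np.abs(H @ superposed) ** 2)  # [1. 0.]
```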

The other advantage of qubits is quantum entanglement, which correlates a group of qubits so that their measurement outcomes agree, and they retain this correlation until they are disentangled (the Bell-state sketch below shows the simplest example).
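
As a concrete illustration of entanglement (again plain NumPy, not any vendor’s SDK), the following sketch prepares the simplest entangled state, a Bell state, by applying a Hadamard and then a CNOT gate to two simulated qubits:

```python
import numpy as np

# Two qubits live in a 4-dimensional space with basis |00>, |01>, |10>, |11>.
ket00 = np.array([1, 0, 0, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
H_on_first = np.kron(H, np.eye(2))   # Hadamard on qubit 0, identity on qubit 1

CNOT = np.array([[1, 0, 0, 0],       # flips qubit 1 iff qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

bell = CNOT @ H_on_first @ ket00
print(np.abs(bell) ** 2)   # [0.5 0. 0. 0.5]
# Only |00> and |11> remain possible: measure one qubit and the other is
# guaranteed to agree, however far apart the two qubits are.
```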

Though qubits are probabilistic by nature, they return a single, classical state when measured. Therefore, in most quantum computers, a series of quantum operations is performed before the measurement. Since measurement reduces a probabilistic state to a deterministic outcome, numerous runs are usually required to estimate the actual probability distribution computed by the quantum computer.
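
The sketch below simulates that repeated-measurement workflow for the Bell state above: each run (“shot”) collapses the state to one outcome, and only the statistics over many shots recover the underlying distribution (the seed and shot counts are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(seed=7)

outcomes = ["00", "01", "10", "11"]
probabilities = [0.5, 0.0, 0.0, 0.5]   # |amplitude|^2 of the Bell state

# One shot tells us almost nothing; many shots estimate the distribution.
for shots in (1, 100, 10_000):
    samples = rng.choice(outcomes, size=shots, p=probabilities)
    counts = {o: int((samples == o).sum()) for o in outcomes}
    print(f"{shots:>6} shots: {counts}")
```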

Qubits can be implemented using various quantum phenomena; this is an area of active research, and no mature solution has emerged yet. Different quantum computers use different qubit implementations.

What are its potential use cases/applications?

Primary applications include optimization & research in various industries, cryptography and espionage. Feel free to read our article on quantum computing applications for more.

How will quantum computing change AI?

Quantum computing and AI are two of the most hyped technologies of today. Combining them naturally raises sceptical eyebrows, as both fields have numerous sceptics doubting their potential. The sceptics are correct in that quantum computing is still a research field, and it is a long way from being applied to neural networks. However, in a decade, AI could run into another plateau due to insufficient computing power, and quantum computing could rise to help AI advance.

External sources

Wired


Cem Dilmegani
Principal Analyst

Cem has been the principal analyst at AIMultiple since 2017. AIMultiple informs hundreds of thousands of businesses (as per similarWeb) including 60% of Fortune 500 every month.

Cem's work has been cited by leading global publications including Business Insider, Forbes, Washington Post, global firms like Deloitte, HPE, NGOs like World Economic Forum and supranational organizations like European Commission. You can see more reputable companies and media that referenced AIMultiple.

Throughout his career, Cem served as a tech consultant, tech buyer and tech entrepreneur. He advised businesses on their enterprise software, automation, cloud, AI / ML and other technology related decisions at McKinsey & Company and Altman Solon for more than a decade. He also published a McKinsey report on digitalization.

He led technology strategy and procurement of a telco while reporting to the CEO. He has also led commercial growth of deep tech company Hypatos that reached a 7 digit annual recurring revenue and a 9 digit valuation from 0 within 2 years. Cem's work in Hypatos was covered by leading technology publications like TechCrunch and Business Insider.

Cem regularly speaks at international technology conferences. He graduated from Bogazici University as a computer engineer and holds an MBA from Columbia Business School.



Comments


2 Comments
William Main
Apr 21, 2022 at 14:58

There is a long standing axiom in software:
– you can have it right
– you can have it fast (both implications)
– you can have it inexpensively

pick two?
Wonder if this holds in Quantum space as well?

Judith Halnan
Feb 04, 2022 at 04:49

Marx and Jung are last century. Surely we have a relevant quote post 2021.

Cem Dilmegani
Feb 04, 2022 at 12:33

Possibly but important aspects of human experience change slowly
