A Brief Introduction To Quantum Computing
Quantum computing is one of the hottest topics in the technology sector today. The technology is enabling individuals and companies to solve computational problems that were previously considered intractable.
Quantum computing is a multi-disciplinary field comprising aspects of computer science, physics, and mathematics that utilises quantum mechanics to solve complex problems faster than on classical computers. Quantum computers are able to solve certain types of problems faster than classical computers by taking advantage of quantum mechanical effects, such as superposition and quantum interference.
Cryptography, chemistry, quantum simulation, optimisation, Machine Learning (ML), and numerous other fields have been significantly impacted by this technology. While quantum computers are not going to immediately replace the classical computers we use for everyday tasks such as browsing the web, quantum technology is significantly changing the way the world operates.
Background
Quantum computing is an area of computer science that uses the principles of quantum theory, which explains the behaviour of energy and material on the atomic and sub-atomic levels.
The quantum in "quantum computing" refers to the quantum mechanics that the system uses to calculate outputs. In physics, a quantum is the smallest possible discrete unit of any physical property. It usually refers to properties of atomic or sub-atomic particles, such as electrons, neutrinos, and photons.
Quantum computing uses specialised technology, including computer hardware and algorithms that take advantage of quantum mechanics, to solve complex problems that classical computers, even super-computers, can't solve, or can't solve quickly enough.
Classical computers encode information in bits using electrical impulses that are either on or off (1 or 0). For certain problems, this binary, one-state-at-a-time processing is a bottleneck that quantum computing can sidestep.
Quantum computing instead uses sub-atomic particles, such as electrons or photons, as quantum bits, or qubits, which can exist in more than one state, such as 1 and 0, at the same time.
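As an illustrative sketch only (a NumPy simulation, not any real quantum hardware or vendor SDK), a single qubit can be modelled as a two-element vector of complex amplitudes whose squared magnitudes sum to one:

```python
import numpy as np

# Computational basis states |0> and |1> as two-element amplitude vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition, (|0> + |1>) / sqrt(2): the qubit is "both at once"
psi = (ket0 + ket1) / np.sqrt(2)

# Squared amplitude magnitudes give the measurement probabilities
probs = np.abs(psi) ** 2
print(probs)  # 50% chance of reading 0, 50% chance of reading 1
```

The two probabilities always sum to one, which is what makes the vector a valid quantum state.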
Quantum Computers
Quantum computers are machines that use the properties of quantum physics to store data and perform computations. This can be extremely advantageous for certain tasks where they could vastly out-perform even our best super-computers.
A quantum computer is a device that performs quantum computations by manipulating the quantum states of qubits in a controlled way to run algorithms. A universal quantum computer is defined as a machine that can transform an arbitrary input quantum state into an arbitrary output quantum state.
Quantum computers use these quantum mechanical properties to accurately compute the behaviour of quantum systems, or very small particles, that follow the laws of quantum mechanics, for example the behaviour of electrons in a hydrogen molecule or more complex systems like how proteins fold.
Quantum computers can also run optimisation algorithms, execute machine learning algorithms, and perform pattern recognition far more efficiently than classical or super-computers can. The development of the quantum computer is currently in its infancy: systems consist of a few to a few tens of quantum bits (qubits).
The main challenges in further development are to make the quantum computer scalable and to make it fault-tolerant. This means that it will be able to perform universal quantum operations using unreliable components.
Classical Computers
Classical computers, which include smartphones and laptops, encode information in binary “bits” that can be either 0 or 1. In quantum computing, the basic unit of memory is a quantum bit, or qubit.
Qubits are made using physical systems, such as the spin of an electron or the orientation of a photon. These systems can be in many different arrangements all at once, a property known as quantum superposition.
Qubits can also be inextricably linked together using a phenomenon known as quantum entanglement. The result is that a series of qubits can represent different things simultaneously. For instance, eight bits is enough for a classical computer to represent any particular number between 0 and 255.
But eight qubits is enough for a quantum computer to represent every number between 0 and 255 at the same time. A few hundred entangled qubits would be enough to represent more numbers than there are atoms in the universe.
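A minimal sketch of this claim, again using a NumPy simulation rather than real hardware: putting each of eight qubits through a Hadamard gate produces a state vector with 2^8 = 256 amplitudes, one for every number from 0 to 255, each equally weighted.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Build the joint state of eight qubits, each starting in |0> and
# rotated into superposition, via repeated tensor (Kronecker) products
state = np.array([1.0])
for _ in range(8):
    state = np.kron(state, H @ np.array([1.0, 0.0]))

print(state.size)       # 256 amplitudes, one per number from 0 to 255
print(state[0] ** 2)    # each outcome carries probability 1/256
```

Note that the simulation's memory cost doubles with every qubit added, which is exactly why classical machines cannot keep up past a few dozen qubits.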
This is where quantum computers get their edge over classical ones. In situations where there are a large number of possible combinations, quantum computers can consider them simultaneously.
Examples include trying to find the prime factors of a very large number or the best route between two places. However, there may also be plenty of situations where classical computers will still outperform quantum ones. So the computers of the future may be a combination of both these types.
For now, quantum computers are highly sensitive: heat, electromagnetic fields and collisions with air molecules can cause a qubit to lose its quantum properties. This process, known as quantum decoherence, causes the system to crash, and it happens more quickly the more particles that are involved.
Quantum computers need to protect qubits from external interference, whether by physically isolating them, keeping them extremely cold, or controlling them with carefully timed pulses of energy. Additional qubits are needed to correct for errors that creep into the system.
History of Quantum Computing
The pre-history of quantum computing begins early in the 20th century, when physicists began to think that they had lost their grip on reality.
Quantum Leaps
1980: Physicist Paul Benioff suggests quantum mechanics could be used for computation.
1981: Nobel prize winner Richard Feynman, at Caltech, coins the term 'quantum computer.'
1985: Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.
1994: Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer’s power to break widely used forms of encryption.
2004: Barbara Terhal and David DiVincenzo, two physicists working at IBM, develop theoretical proofs showing that quantum computers can solve certain mathematical puzzles faster than classical computers.
2014: Google starts a new quantum hardware lab and hires the professor behind some of the best quantum computer hardware yet to lead it.
2016: IBM puts some of its prototype quantum processors on the Internet for anyone to experiment with, saying programmers need to get ready to write quantum code.
2019: Google’s quantum computer beats a classical super-computer at a commercially useless task based on Terhal and DiVincenzo’s 2004 proofs, in a feat many call “quantum advantage.”
2020: The University of New South Wales in Australia offers the first undergraduate degree in quantum engineering to train a workforce for the budding industry.
Quantum Mechanics
First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn’t just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like a wave instead.
Quantum mechanics arrived to explain such quirks, but introduced troubling questions of its own. To take just one example, this new math implied that physical properties of the subatomic world, like the position of an electron, existed as probabilities before they were observed.
Before you measure an electron’s location, it is neither here nor there, but some probability of everywhere. You can think of it like a quarter flipping in the air. Before it lands, the quarter is neither heads nor tails, but some probability of both.
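The coin analogy can be simulated directly. In this hypothetical NumPy sketch, repeatedly "measuring" an equal superposition means drawing outcomes according to the probabilities given by the squared amplitudes:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed for reproducibility

# Equal superposition: amplitudes (1/sqrt(2), 1/sqrt(2))
amplitudes = np.array([1, 1]) / np.sqrt(2)
probabilities = np.abs(amplitudes) ** 2

# Each measurement collapses the state to a definite 0 or 1
samples = rng.choice([0, 1], size=10_000, p=probabilities)
print(samples.mean())  # close to 0.5, like a fair coin landing heads or tails
```

Any single measurement yields a definite 0 or 1; only the statistics over many runs reveal the underlying probabilities.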
If you find that baffling, you’re in good company. A year before winning a Nobel Prize for his contributions to quantum theory, Caltech’s Richard Feynman said that “nobody understands quantum mechanics.” The way we experience the everyday world just isn’t compatible with quantum behaviour.
But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s, a few of them, including Feynman, began to wonder whether quantum phenomena like subatomic particles' probabilistic existence could be used to process information.
The basic theory or blueprint for quantum computers that took shape in the ’80s and ’90s still guides Google and other companies working on the technology.
Before we fall into the murky shallows of quantum computing 101, we should refresh our understanding of regular computers. As you know, smartwatches, iPhones, and the fastest super-computers all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.
Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.
Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically.
Popular examples, at least amongst a very select slice of humanity, include superconducting circuits, or individual atoms levitated inside electromagnetic fields.
The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.
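A hedged NumPy sketch of that "extra mode", not real quantum hardware: one Hadamard gate flips a qubit into superposition, and a second Hadamard makes the amplitudes interfere so the qubit lands back on a definite 0. This is the quantum interference effect that algorithms exploit.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

ket0 = np.array([1.0, 0.0])   # qubit starts as a definite 0
superposed = H @ ket0         # equal superposition of 0 and 1
recombined = H @ superposed   # amplitudes interfere: back to a definite 0

print(superposed)  # roughly [0.707, 0.707]
print(recombined)  # roughly [1, 0] -- the 1-amplitudes cancelled out
```

The second step is the key trick: the two paths to outcome 1 carry opposite signs and cancel, while the paths to outcome 0 reinforce. Quantum algorithms are choreographed so that wrong answers cancel this way and right answers reinforce.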
What the Future Holds for Quantum Computing
Error-prone but better than supercomputers at a cherry-picked task, quantum computers have entered their adolescence. It’s not clear how long this awkward phase will last, and like human puberty it can sometimes feel like it will go on forever.
Researchers broadly describe today’s technology as Noisy Intermediate-Scale Quantum (NISQ) computers, placing the field in the so-called NISQ era.
Existing quantum computers are too small and unreliable to execute the field’s dream algorithms, such as Shor’s algorithm for factoring numbers. The question remains whether researchers can wrangle their gawky teenage NISQ machines into doing something useful. Indeed, teams in both the public and private sector are betting so, as Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups including Xanadu and QuEra in hot pursuit.
The US, China, and the European Union each have new programs measured in the billions of dollars to stimulate quantum R&D. Some startups, such as Rigetti and IonQ, have even begun trading publicly on the stock market by merging with a so-called special-purpose acquisition company, or SPAC, a trick to quickly gain access to cash.
It’s not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there’s a sense that whichever company is first to make these machines useful will gain big economic and national security advantages.