Research: quantum advantage - optical examples
- F. Brice Dupuy
- 6 Jan. 2021
- 6 min read
Last updated: 9 Feb. 2021
Published by Olivier Le Moult in AOT on 14 August 2020, 09:55:10

Quantum supremacy (or quantum advantage, to limit ambition) is a subject discussed among the major actors in quantum computing. See, for example, last fall's confrontation between Google and IBM (https://plazza.orange.com/groups/aot/blog/2019/09/23/quantum-supremacy-google-and-nasa-cooperation). Here are the latest proposals, one from a mostly French team at the LIP6 laboratory in Paris and another from a Chinese team. Optical solutions for quantum advantage are very interesting because they do not require an imposing and very expensive quantum computer housed in a dilution refrigerator (see the IBM example, IBM Q System One and Quantum Ecosystem, and the picture below, right side). These experimental works are probably the first demonstrations of a computational quantum advantage obtained with linear optics alone. The goal is to verify an NP problem, using only limited information about the proof. With linear optics, this result is obtained in a few seconds, whereas a conventional computer solving the NP-complete problem would take exponential time (the "French" article cites a time greater than the age of the universe, about 13.8 billion years). The figure below shows the simple conceptual diagram of the linear-optics scheme ("French" article) and, as a counter-example, the Alphabet CEO next to the Google quantum computer:

The scheme on the left side is called the Sampling Machine (SM); it is used by the "Arthur" actor for his verification test ("French" article).
Be aware that these latest examples of a linear-optics solution for quantum advantage and the famous Google experiment address different problems.
The problem chosen to demonstrate quantum advantage depends on the method used for the computational comparison.
Previous tests used Boson Sampling, sparse commuting circuits and random quantum circuits (the Google test in the fall of 2019, see the article cited in the introduction).
The "French" article clearly insists on the conditions for choosing the parameters used in this kind of experiment, to avoid the controversies that followed the Google (and NASA) announcement last year.
To further clarify the particularities of their work, several points should be underlined:
1. the optical set-up is simple and reproducible,
2. the process is quite simple, as it corresponds to obtaining yes/no answers,
3. the analysis is made by comparison with classical methods, under the hypothesis that NP-complete problems do not have sub-exponential algorithms (a hypothesis presented as established),
4. efficient verification of NP-complete problems (in the case of limited information leaks) gives hope for interesting applications, such as quantum server-client computing, authentication systems, the application of ethical behavior and blockchain technologies.
"French" experimental set-up is figured below

The "french" article proposes classical components to show the success of verifying NP-complete problems (some explanations below).
On the far left of the diagram we start with a coherent light source (1560 nm for the wavelength), then an amplitude modulator (AM) (to create 10 ns coherent pulses, see the pulses in the very first figure in this article). The beam splitter just after releases only one percent of the light power for the second part of the set-up. Then a new beam splitter shares the light signal between Merlin and Arthur. Merlin (as a prover) encodes the proof in the phases of his pulses thanks to a phase modulator (PM), and Arthur (as a verifier) takes decision.
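To give an intuition of how a proof encoded in pulse phases can be checked interferometrically, here is a minimal numerical sketch in Python (an illustration only, not the authors' actual protocol, whose sampling machine and security analysis are much more involved; the pulse amplitude below is a hypothetical value): two weak coherent pulses, one phase-set by Merlin and one by Arthur, interfere on a balanced beam splitter, and a click in the "difference" output port reveals a phase mismatch.

```python
import numpy as np

def mismatch_click_probability(alpha, phase_merlin, phase_arthur):
    """Probability of at least one photon in the 'difference' port when two
    coherent pulses |alpha e^{i phase}> interfere on a 50/50 beam splitter.
    If the phases agree the port stays dark; a phase flip makes it bright."""
    # Amplitude routed to the difference port: alpha * (e^{i p1} - e^{i p2}) / sqrt(2)
    amp = alpha * (np.exp(1j * phase_merlin) - np.exp(1j * phase_arthur)) / np.sqrt(2)
    mean_photons = np.abs(amp) ** 2          # mean photon number in that port
    return 1.0 - np.exp(-mean_photons)       # Poissonian probability of >= 1 click

alpha = 0.9  # weak-pulse amplitude (illustrative value, not taken from the paper)
print(mismatch_click_probability(alpha, 0.0, 0.0))    # ~0.0 -> phases agree
print(mismatch_click_probability(alpha, 0.0, np.pi))  # >0   -> phases disagree
```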
The Chinese article has different diagrams (see above for their experimental set-up), but both works rely on a specific approach to show the quantum advantage.
The two-prover Quantum Merlin-Arthur protocol - QMA (Chinese article)
Let's go through some (partial) reminders.
Both articles address one question: "how to realise the task of verifying NP-complete problems (with a quantum solution but with "simple" optical components)?".
Why is it different from the famous Google experiment?
A figure may help:

The acronyms in this figure come from the theory of computational complexity, with a focus on quantum computing (NISQ, Noisy Intermediate-Scale Quantum).
We can speak of an efficient algorithm, sometimes called a polynomial-time algorithm.
To attempt a simple definition, an efficient algorithm requires a number of operations that is AT MOST polynomial in the size of the input.
The class of algorithmic tasks that admit efficient algorithms is the small P ball at the bottom of this figure.
Some examples of algorithmic tasks are matching (perfect or not) and the traveling salesman problem (without tolls in the visited cities; a simpler version is the Hamiltonian cycle problem).
For example, a solution of the simplified traveling salesman problem is a route called a Hamiltonian cycle.
A major conjecture in the theory of computational complexity is that there is no efficient (polynomial-time) algorithm for solving the traveling salesman problem, and no efficient algorithm to tell whether a Hamiltonian cycle exists.
As the number of cities visited by the traveling salesman grows, a computer will not be able to solve the problem (or find a Hamiltonian cycle) in a reasonable time, as the number of required steps grows exponentially.
Being in the computational class NP (the orange ball, bigger than the P ball) means that a given proof (here, a claimed Hamiltonian cycle) can be verified in a polynomial number of steps.
The task of deciding whether there exists a Hamiltonian cycle constitutes an NP-complete problem.
Being NP-complete means that any other NP problem can be reduced to this problem.
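To make the asymmetry between finding and verifying concrete, here is a small illustrative Python sketch (a toy graph, not taken from either article): checking a claimed Hamiltonian cycle takes a polynomial number of steps, while the naive search tries up to n! orderings of the cities.

```python
from itertools import permutations

def is_hamiltonian_cycle(graph, cycle):
    """Polynomial-time verification: check that 'cycle' visits every vertex of
    'graph' exactly once and that consecutive vertices are connected."""
    if sorted(cycle) != sorted(graph):
        return False
    return all(cycle[(i + 1) % len(cycle)] in graph[v] for i, v in enumerate(cycle))

def find_hamiltonian_cycle(graph):
    """Exponential-time search: try every ordering of the vertices."""
    for candidate in permutations(graph):
        if is_hamiltonian_cycle(graph, list(candidate)):
            return list(candidate)
    return None

# Toy 4-vertex graph (adjacency lists): a square with one diagonal.
graph = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
print(is_hamiltonian_cycle(graph, [0, 1, 2, 3]))  # fast to check a given proof
print(find_hamiltonian_cycle(graph))              # slow in general: up to n! orderings
```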
Another set, bigger than NP, is PH (the Polynomial Hierarchy), which generalises NP to problems with a constant number of alternating quantifiers; see for example Scott Aaronson's blog.
(To conclude on the right part of the previous figure, LDP (low-degree polynomial) and bounded-depth computation refer to the Google example in the quest for quantum advantage.)
Both articles on optical quantum advantage address the task of verifying an NP-complete problem, with the same choice of problem.
Both work on the Boolean satisfiability problem, or SAT, which asks whether a given Boolean formula with n variables has a satisfying assignment.
Explaining the details of both articles is far beyond the scope of this post, but we can underline the usefulness of these works.
SAT is a class of problems roughly equivalent to satisfying potentially conflicting constraints.
SAT may be useful in software/hardware verification, circuit design, model checking, automated theorem proving and artificial intelligence.
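As a concrete toy illustration (not one of the instances used in the articles), verifying a single proposed assignment of a small CNF formula is cheap, whereas a naive search must examine up to 2^n assignments:

```python
from itertools import product

# Toy CNF formula with 3 variables: (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
# Each clause lists literals as (variable_index, is_positive).
formula = [[(0, True), (1, False)], [(1, True), (2, True)], [(0, False), (2, False)]]

def satisfies(formula, assignment):
    """Polynomial-time check that 'assignment' (tuple of booleans) makes every clause true."""
    return all(any(assignment[i] == positive for i, positive in clause) for clause in formula)

def brute_force_sat(formula, n):
    """Exponential-time search over all 2^n candidate assignments."""
    for assignment in product([False, True], repeat=n):
        if satisfies(formula, assignment):
            return assignment
    return None

print(satisfies(formula, (True, True, False)))  # quick verification of a given proof
print(brute_force_sat(formula, 3))              # search: up to 2^n candidates
```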
Recalling the beginning of the article, several points have been underlined about the future implications of this type of quantum optics solution.
This research domain is very active, with many routes being explored, and will certainly be very fruitful for optical quantum processing devices.
It is also this type of work (but for communication purposes) that is carried out in the CiViQ project, part of the European Quantum Flagship (see CiViQ).
The "French" article was published on 31 July 2020 and the Chinese one on 12 August 2020:
Experimental demonstration of quantum advantage for NP verification
https://arxiv.org/abs/2007.15876
By the same authors (Eleni Diamanti and Iordanis Kerenidis), see the 2018 Nature paper:
Quantum superiority for verifying NP-complete problems with linear optics
https://www.nature.com/articles/s41534-018-0103-1
Quantum verification of NP problems with single photons and linear optics
https://arxiv.org/abs/2008.05453
As noted, it is a very active domain, with efforts towards integrated photonics solutions for processing features in several computing classes, with various quantum characteristics; see for example:
Demonstration of chip-based coupled degenerate optical parametric oscillators for realizing a nanophotonic spin-glass (Nature Communications, 17 August 2020):
https://www.nature.com/articles/s41467-020-17919-6
Citation:
"Many combinatorial optimization problems fall under this category and are classified as non-deterministic polynomial-time (NP) hard and have applications in areas including finance, scheduling, trajectory planning, and artificial intelligence. The ability to solve these problems efficiently has motivated the development of computing accelerators based on digital and physical systems"
This article proposes optical processing as coherent computing by simulating the Ising model; see also the Ising spin-glass Hamiltonian (and QUBO, Quadratic Unconstrained Binary Optimization), with a first application in the telecom domain in the TIM trial of D-Wave quantum annealing computers (end of a previous article, https://plazza.orange.com/groups/aot/blog/2019/09/03/ericsson-and-xanadu-their-views-on-quantum-computing).
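As a side note on the Ising/QUBO connection mentioned above, here is the generic textbook mapping (not tied to either paper) from a QUBO over binary variables x in {0,1} to an Ising Hamiltonian over spins s in {-1,+1}, via the substitution x = (1 + s)/2:

```python
import numpy as np
from itertools import product

def qubo_to_ising(Q):
    """Rewrite E(x) = x^T Q x (x_i in {0,1}) as
    E(s) = sum_{i!=j} J_ij s_i s_j + sum_i h_i s_i + offset, with s_i in {-1,+1}."""
    Q = np.asarray(Q, dtype=float)
    J = Q / 4.0
    np.fill_diagonal(J, 0.0)                     # pairwise couplings (i != j)
    h = (Q.sum(axis=0) + Q.sum(axis=1)) / 4.0    # local fields
    offset = (Q.sum() + np.trace(Q)) / 4.0       # constant energy shift
    return J, h, offset

# Toy 2-variable QUBO: E(x) = x0 + 3*x1 + 2*x0*x1
Q = np.array([[1.0, 2.0], [0.0, 3.0]])
J, h, offset = qubo_to_ising(Q)
for x in product([0, 1], repeat=2):
    s = 2 * np.array(x) - 1                      # x = (1 + s) / 2
    e_qubo = np.array(x) @ Q @ np.array(x)
    e_ising = s @ J @ s + h @ s + offset
    print(x, e_qubo, e_ising)                    # the two energies agree
```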
Update, December 4, 2020: Again from one of Jian-Wei Pan's teams, a more advanced test of optical quantum computing advantage (here closer to supremacy than ever for this type of processing). The publication is available from Science (early release, December 3, 2020): Quantum computational advantage using photons, https://science.sciencemag.org/content/early/2020/12/02/science.abe8770.full
They work with the Gaussian Boson Sampling (GBS) method (possibly usable for graph theory, quantum chemistry and machine learning), using parametric down-conversion (PDC) and single-mode squeezed states (SMSS) for the photons. The complete explanation is long and very complex, but their test for this publication reaches an output state space of dimension 10^30, with a sampling rate 10^14 times faster than the state-of-the-art simulation on a supercomputer (Sunway TaihuLight, here 200 s against 2.5 billion years), which is probably a more convincing experiment than Google's (see https://plazza.orange.com/groups/aot/blog/2019/09/23/quantum-supremacy-google-and-nasa-cooperation). See above the scheme of their set-up (which, for the time being, is far from real-world integration) and the time (in seconds) needed by the supercomputer used for the comparison:
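As a quick sanity check of the quoted speed-up, simple arithmetic with the numbers cited above (2.5 billion years against 200 s) indeed gives a factor of roughly 10^14:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.16e7 seconds
classical_time = 2.5e9 * SECONDS_PER_YEAR      # estimated supercomputer time, in seconds
gbs_time = 200.0                               # reported sampling time of the photonic set-up
print(f"speed-up factor ~ {classical_time / gbs_time:.1e}")  # ~3.9e+14, i.e. ~10^14
```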