Research Overview

Research in quantum computing (QC), the study of how quantum mechanics affects computation and information science, is undergoing a major transition. With prototype quantum machines now available, QC researchers can go beyond traditional theoretical studies and investigate implementations on real-world quantum hardware. A paramount goal of QC research is to bridge the gap between theoretical studies and the limitations of real-world QC hardware in order to achieve end-to-end quantum applications.

Research Goals

My research aims to enable end-to-end quantum applications by investigating both the theory of quantum information and computation and the software tool-chain and systems of quantum computers. My research has contributed to the study of quantum complexity, quantum entanglement, and quantum cryptography, and has recently focused on quantum algorithms for optimization and machine learning. I have also begun to apply formal methods and programming-language techniques to quantum computing.

Formal Methods and Programming Languages in Quantum Computing

Verification of Sequential, Parallel, and Concurrent Quantum Programs

Quantum programs are error-prone, and their verification is challenging because standard software assurance techniques are of limited use in the quantum setting. My research investigates the verification of quantum programs via static analysis with the help of quantum Hoare logic. A prominent approach to program verification is to generate invariants and inductive assertions, which is already a highly non-trivial task even classically. I made the first proposal of quantum invariants and demonstrated their use in quantum program verification; such invariants can be generated by solving a semidefinite program (SDP) whose dimension is exponential in the number of qubits. In ongoing projects, I am further improving invariant-based verification in terms of both its expressive power (e.g., accounting for ancillae in the Hoare logic) and its scalability.
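
To make the SDP-based generation concrete, the sketch below (in Python, assuming the third-party cvxpy library as the SDP interface) sets up a toy semidefinite feasibility problem: it searches for a one-qubit predicate O with 0 ⪯ O ⪯ I that is implied by a given precondition and preserved by one iteration of a loop body given as a quantum channel. The channel, the precondition, and the simplified preservation condition O ⪯ Φ*(O) are illustrative assumptions, not the exact conditions derived in the paper.

```python
# Toy sketch: invariant-style SDP for a one-qubit loop body (illustrative only).
# The constraints below are a simplified stand-in for the conditions that
# quantum Hoare logic induces; cvxpy is assumed as the SDP solver interface.
import numpy as np
import cvxpy as cp

gamma = 0.3
# Loop body: amplitude-damping channel with Kraus operators K0, K1.
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

# Precondition A: projector onto |0> (an arbitrary illustrative choice).
A = np.array([[1.0, 0.0], [0.0, 0.0]])

O = cp.Variable((2, 2), symmetric=True)   # candidate predicate (kept real for simplicity)
# Heisenberg-picture (dual) action of the channel: Phi*(O) = sum_i Ki^T O Ki.
dual_O = K0.T @ O @ K0 + K1.T @ O @ K1

constraints = [
    O >> 0,                  # 0 <= O
    np.eye(2) - O >> 0,      # O <= I   (O is a quantum predicate)
    O - A >> 0,              # the precondition implies the candidate invariant
    dual_O - O >> 0,         # the candidate is preserved by one loop-body step
]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status)        # "optimal" means a (toy) invariant exists
print(np.round(O.value, 3))
```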

Moving from sequential to parallel and concurrent quantum programs is a logical next step because of important quantum applications in communication and networking. Specifically, we aim to develop a quantum Owicki-Gries-style logic by formulating the concept of interference-free proofs of component quantum programs. However, there is a significant difference between classical and quantum interference, due to quantum correlations and causal relationships among component quantum programs. Research in quantum information on these topics (e.g., my own research on quantum correlations below) is likely to play a critical role in such a development.

Publications:

  • Invariants of Quantum Programs: Characterizations and Generation. Mingsheng Ying, Shenggang Ying, and Xiaodi Wu. POPL 2017.

Reliable and Resource-optimized Quantum Program Synthesis

Near-term quantum applications will likely run on Noisy Intermediate-Scale Quantum (NISQ) computers, where precisely controllable qubits are expensive, error-prone, and scarce. Existing work on quantum algorithms and programming languages assumes that this issue will be solved by the hardware, as in classical computers, or by independently designed fault-tolerance protocols, and is hence of limited help for implementing near-term quantum applications.

I believe that one way to attack these problems is to set aside a one-size-fits-all approach to fault tolerance and instead elevate the question of errors, and the related architecture-specific resource optimization, to the level of programming-language and algorithm design. In particular, I am inspired by techniques in approximate computing that optimize computation on unreliable classical hardware. As a first step, I have built a formal semantics and a logic for reasoning about reliability in the presence of noise in quantum computation.
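
As a toy numerical illustration of the kind of quantity such reasoning tracks (not the semantics or the metric used in our logic), the sketch below applies single-qubit depolarizing noise to the output of an ideal program and reports the trace distance between the ideal and noisy outputs; the noise model and parameters are assumptions chosen for illustration.

```python
# Toy illustration: how far does depolarizing noise push a program's output state?
# The noise model and the trace-distance metric are illustrative assumptions.
import numpy as np

def depolarize(rho, p):
    """Single-qubit depolarizing channel: keep rho w.p. 1-p, replace by I/2 w.p. p."""
    return (1.0 - p) * rho + p * np.eye(2) / 2.0

def trace_distance(rho, sigma):
    """(1/2) * trace norm of rho - sigma."""
    eigs = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.sum(np.abs(eigs))

# Ideal output of a tiny program: |+> = H|0>.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_ideal = np.outer(plus, plus)

for p in (0.01, 0.05, 0.1):
    rho_noisy = depolarize(rho_ideal, p)
    # For a pure input state, the trace distance works out to exactly p/2.
    print(f"p = {p:.2f}: trace distance = {trace_distance(rho_ideal, rho_noisy):.4f}")
```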

In continuing efforts, I aim to build an automated quantum program synthesis tool that takes the ideal program and reliability requirements as constraints (which could be generated in Coq via the implementation of our logic) and hardware-specific resource optimization as the objective, and that outputs an optimized quantum program, possibly with additional error-mitigation steps synthesized by a back-end optimizer. This automated tool will then be benchmarked on near-term quantum applications (e.g., variational methods) with specific hardware information.

Publications:

  • Quantitative Robustness Analysis of Quantum Programs. Shih-Han Hung, Kesha Hietala, Shaopeng Zhu, Mingsheng Ying, Michael Hicks, and Xiaodi Wu. POPL 2019.

Formally Verified Software Tool-chain for Quantum Computing

The complexity of quantum computing and the limitations of near-term quantum devices make mistakes in the development of sophisticated quantum algorithms and clever optimizations all the more likely. This calls for verifying every stage of quantum computation, from the software tools used to generate quantum circuits to the architecture and system design.

I am inspired by formal methods applied in safety-critical domains to ensure the correctness of code by construction, especially the examples of CompCert (a C compiler written and proved correct in Coq) and the NSF Deep Specification project (an effort to develop specifications of software tool-chains and to prove end-to-end correctness of whole systems). A verified quantum computing stack would ensure that each level of the quantum computation is implemented according to its specification, and hence that the final system is correct, which would have wide practical impact. This approach is especially appealing for quantum computing, since alternative software assurance techniques are severely limited by the substantial expense involved in the quantum setting. In an ongoing project, I am working on a proved-correct optimizing compiler for quantum algorithms that optimizes gate count, circuit depth, etc., while adhering to architectural constraints.
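
The compiler work itself is carried out in Coq; the short Python sketch below only illustrates the semantics-preservation property such a compiler must satisfy, using a toy peephole pass (cancelling adjacent self-inverse gates) together with a check that the optimized circuit denotes the same unitary. The gate set and the pass are assumptions chosen purely for illustration.

```python
# Toy illustration of the correctness property of an optimizing circuit compiler:
# the optimized circuit must denote the same unitary as the original.
# The gate set and the peephole pass below are illustrative assumptions.
import numpy as np

GATES = {
    "H": np.array([[1, 1], [1, -1]]) / np.sqrt(2),
    "X": np.array([[0, 1], [1, 0]]),
    "Z": np.array([[1, 0], [0, -1]]),
    "T": np.diag([1, np.exp(1j * np.pi / 4)]),
}
SELF_INVERSE = {"H", "X", "Z"}

def denote(circuit):
    """Unitary semantics of a single-qubit circuit (a list of gate names)."""
    u = np.eye(2, dtype=complex)
    for g in circuit:
        u = GATES[g] @ u
    return u

def cancel_adjacent(circuit):
    """Peephole pass: remove adjacent pairs of identical self-inverse gates."""
    out = []
    for g in circuit:
        if out and out[-1] == g and g in SELF_INVERSE:
            out.pop()          # G; G = identity for a self-inverse gate G
        else:
            out.append(g)
    return out

original = ["H", "H", "X", "T", "Z", "Z", "X"]
optimized = cancel_adjacent(original)
print(original, "->", optimized)
# Semantics preservation: both circuits implement the same unitary.
assert np.allclose(denote(original), denote(optimized))
```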

Theoretical Studies in Quantum Information and Computation

Quantum Algorithms for Optimization and Machine Learning

My recent research aims to understand the landscape of provable quantum advantage in optimization and machine learning, a major target domain for quantum applications. To that end, I have developed quantum algorithms with polynomial speed-ups over classical algorithms for semidefinite programs (SDPs), general convex optimization, and the training of linear and kernel-based classifiers. My results also hint at possible exponential quantum speed-ups when quantum data are used as the inputs/outputs of SDPs and for the principal component analysis problem.
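
For reference, these quantum SDP solvers target the standard primal form below, where n is the dimension of the matrix variable and m the number of constraints; speed-ups are measured in terms of these parameters (normalization conventions vary across the papers):

```latex
% Standard-form semidefinite program (normalizations vary across the papers).
\begin{aligned}
\max_{X \in \mathbb{C}^{n \times n}} \quad & \operatorname{tr}(C X) \\
\text{s.t.} \quad & \operatorname{tr}(A_j X) \le b_j, \qquad j = 1, \dots, m, \\
& X \succeq 0.
\end{aligned}
```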

Another well-motivated topic is property testing of quantum states and classical distributions. I have closed a long-standing gap between the upper and lower bounds on the sample complexity of learning a complete description of a quantum state, so-called quantum tomography, which is a fundamental step in verifying experimental state preparation. I also demonstrated quantum speed-ups for estimating the Shannon and Rényi entropies of classical distributions.

To accelerate these quantum applications on NISQ machines, I designed a systematic framework for decomposing a quantum computation into small pieces and then combining the simulation results from each piece into the final output. One application is a practical scheme to implement a 60-qubit quantum computation on machines with only 45 qubits.
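
One ingredient behind such decompositions is that the state on a cut wire can be reconstructed from the expectation values of a small operator basis. The sketch below numerically checks the standard single-qubit Pauli expansion rho = (1/2) * sum_P tr(P rho) P, a simple identity of the kind that such cutting schemes build on; the random test state is an illustrative assumption.

```python
# Numerical check of the single-qubit Pauli expansion
#   rho = (1/2) * sum_P tr(P rho) P,  P in {I, X, Y, Z},
# a simple identity of the kind that circuit-cutting schemes build on.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A random single-qubit density matrix (illustrative test input).
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = A @ A.conj().T
rho /= np.trace(rho)

reconstructed = sum(np.trace(P @ rho) * P for P in (I, X, Y, Z)) / 2
print("Pauli reconstruction matches:", np.allclose(rho, reconstructed))
```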

Publications:

  • Quantum SDP Solvers: Large Speed-ups, Optimality, and Applications to Quantum Learning. Fernando G. S. L. Brandao, Amir Kalev, Tongyang Li, Cedric Yen-Yu Lin, Krysta M. Svore, and Xiaodi Wu. arXiv: 1710.02581v2. To appear in QIP 2019.

  • Quantum algorithms and lower bounds for convex optimization. Shouvanik Chakrabarti, Andrew M. Childs, Tongyang Li, and Xiaodi Wu. arXiv: 1809.01731. To appear in QIP 2019.

  • Simulating large quantum circuits on a small quantum computer. Aram Harrow, Maris Ozols, Tianyi Peng, and Xiaodi Wu. Manuscript, 2018.

  • Quantum query complexity of entropy estimation. Tongyang Li and Xiaodi Wu. Accepted by IEEE Transactions on Information Theory. arXiv: 1710.06025.

  • Sample-optimal tomography of quantum states. Jeongwan Haah, Aram W. Harrow, Zhengfeng Ji, Xiaodi Wu, Nengkun Yu. QIP 2016 and STOC 2016.

Entanglement, Quantum Correlations, and Sum-of-Squares

Research on this topic studies many aspects of entanglement and non-locality, two key quantum features. I attack these questions by exploring a surprising connection between quantum information and the sum-of-squares (SOS) proof system used in approximation algorithms and in the study of the famous Unique Games Conjecture (UGC). This connection allows technical advances in one field to be leveraged in the other. Specific problems that I am working on include the characterization of separable (unentangled) states, the complexity of quantum Merlin-Arthur proof systems with unentangled provers (QMA(2)), and the possibility of a quantum-inspired approach to attacking the UGC.
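
A central quantity in this line of work is the maximum overlap of a measurement operator M with separable states, h_Sep(M), which (by convexity) equals the maximum of <ab|M|ab> over pure product states |a>|b>; my epsilon-net method (listed below) approximates this kind of optimization. The sketch below is only a brute-force random-sampling estimate of h_Sep for a small two-qubit operator, included to make the object concrete; the choice of M and the naive sampling are illustrative assumptions, not the algorithm from the paper.

```python
# Brute-force random-sampling estimate of
#   h_Sep(M) = max over product states |a>|b> of <ab| M |ab>,
# the optimization-over-separable-states quantity discussed above.
# The operator M and the naive sampling are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def random_qubit_state():
    """Haar-random pure single-qubit state."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    return v / np.linalg.norm(v)

# M: projector onto a maximally entangled state (a natural test case, since
# product states can overlap with it by at most 1/2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
M = np.outer(bell, bell.conj())

best = 0.0
for _ in range(20000):
    a, b = random_qubit_state(), random_qubit_state()
    ab = np.kron(a, b)
    best = max(best, float(np.real(ab.conj() @ M @ ab)))

print(f"sampled h_Sep estimate: {best:.3f}  (true value for this M is 1/2)")
```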

Publications:

  • Limitations of semidefinite programs for separable states and entangled games. Aram Harrow, Anand Natarajan, Xiaodi Wu. QIP 2017. Accepted by Communications in Mathematical Physics.

  • Tight SoS-degree bounds for approximate Nash equilibria. Aram Harrow, Anand Natarajan, Xiaodi Wu. CCC 2016.

  • An improved semidefinite programming hierarchy for testing entanglement. Aram Harrow, Anand Natarajan, Xiaodi Wu. Communications in Mathematical Physics, June 2017, Volume 352, Issue 3, pp. 881–904.

  • Epsilon-net method for optimizations over separable states. Yaoyun Shi and Xiaodi Wu. QIP 2012, ICALP 2012. Theoretical Computer Science, Volume 598, Pages 51–63.

Tamper-resilient Cryptography under Physical Assumptions

Devices, classical or quantum, are subject to tampering in cryptographic settings, especially due to the proliferation of side-channel attacks. These attacks exploit the fact that devices leak information to the outside world not just through input-output interaction, but through physical characteristics of computation such as power consumption, timing, and electromagnetic radiation. Research on this topic explores two possible solutions to protect cryptographic systems from (quantum) side-channel attacks.

  • Device-independent Cryptography: In this setting, none of the devices can be a priori trusted: security is based solely on simple tests performed by the honest users on the input-output behavior of their devices (the CHSH game, sketched at the end of this subsection, is the canonical example of such a test). The security analysis guarantees that any set of devices that meets reasonable physical assumptions (e.g., spatial separation prohibiting mutual communication) and produces an acceptable interaction will yield a secure outcome, regardless of the actual process inside the devices. My research focuses on generating uniformly random bits in this setting under minimal assumptions. See the recent review in Nature, "Certified randomness in quantum physics", for an overview of this research direction and my work.

  • Leakage-resilient Cryptography: Modern computing environments, such as cloud computing, where computation no longer takes place on private machines under our control, expose many classical cryptographic systems to all kinds of side-channel attacks. The leakage can be quantum: collecting and storing quantum side information is technically far less challenging than building full-fledged quantum computers, and could become realistic in the near future. My research focuses on the role of quantum side information in both the information-theoretic and the computational setting. In a recent result, I initiated the study of computational notions of entropy in the quantum setting and developed the first quantum leakage-resilient cryptographic protocol.

Both device-independent and leakage-resilient cryptography can be viewed as tamper-resilient cryptography under physical assumptions. My future plan is to bring these cryptographic designs closer to practice, with better efficiency and broader functionality.
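
As a concrete example of the simple tests underlying the device-independent setting above, the sketch below builds the CHSH Bell operator for the textbook choice of measurements and checks that its largest eigenvalue reaches Tsirelson's quantum bound 2*sqrt(2), strictly above the classical bound 2; violating the classical bound is what certifies the devices' behavior. This is an illustrative instance of the test, not a full protocol.

```python
# CHSH test: the canonical device-independent certification game.
# Classical (local hidden variable) strategies satisfy <CHSH> <= 2, while the
# quantum value can reach Tsirelson's bound 2*sqrt(2); the check below uses the
# textbook measurement choices (an illustrative instance, not a full protocol).
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

A0, A1 = Z, X                                # Alice's observables
B0 = (Z + X) / np.sqrt(2)                    # Bob's observables
B1 = (Z - X) / np.sqrt(2)

# CHSH Bell operator: A0(B0+B1) + A1(B0-B1).
chsh = np.kron(A0, B0 + B1) + np.kron(A1, B0 - B1)

quantum_value = np.max(np.linalg.eigvalsh(chsh))
print(f"max quantum CHSH value: {quantum_value:.4f}")   # ~2.8284 = 2*sqrt(2)
print("classical bound       : 2")
```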

Publications:

  • Computational Notions of Quantum Min-Entropy. Yi-Hsiu Chen, Kai-Min Chung, Ching-Yi Lai, Salil P. Vadhan, and Xiaodi Wu. QCrypt 2017. arXiv: 1704.07309.

  • General randomness amplification with non-signaling security. Kai-Min Chung, Yaoyun Shi, and Xiaodi Wu. QIP 2017.

  • Multi-Source Randomness Extractors Against Quantum Side Information, and their Applications. Kai-Min Chung, Xin Li, and Xiaodi Wu. arXiv:1411.2315.

  • Physical Randomness Extractors. Kai-Min Chung, Yaoyun Shi, and Xiaodi Wu. Plenary talk at QIP 2014.

Quantum Computational Complexity

Interactive proof systems have been a central model in complexity theory, with applications ranging from the PCP theorem in hardness of approximation to cryptography. Such systems capture problems whose proofs can be verified efficiently through interaction between a polynomial-time verifier and all-powerful provers, with the verifier determining the validity of the proof. My main contribution on this topic is the development of the Equilibrium Value Method, which yields space-efficient simulations of quantum interactive proof systems, including the results QIP = PSPACE and QRG(2) = PSPACE. Recently, I have been working on a quantum variant of the PCP theorem in the interactive proof setting. As a concrete first step, I have obtained a parallel repetition result for entangled k-player games.

Publications:

  • Parallel repetition for entangled k-player games via fast quantum search. Kai-Min Chung, Xiaodi Wu and Henry Yuen. CCC 2015.

  • Epsilon-net method for optimizations over separable states. Yaoyun Shi and Xiaodi Wu. QIP 2012, ICALP 2012. Theoretical Computer Science, Volume 598, Pages 51-63.

  • Parallel approximation of min-max problems with applications to classical and quantum zero-sum games. Gus Gutoski and Xiaodi Wu. QIP 2012, CCC 2012. Computational Complexity, 22(2):385–428, 2013, the special issue of CCC 2012.

  • Equilibrium Value Method for the proof of QIP=PSPACE. Xiaodi Wu. arXiv:1004.0264.