Research Overview

Quantum computing (QC), the study of the impact of quantum mechanics on computation and information science, is undergoing a major transition. With prototype quantum machines now available, QC researchers can go beyond traditional theoretical studies and investigate implementations on real-world quantum hardware. A paramount goal of QC research is to bridge the gap between theoretical studies and the limitations of real-world QC hardware in order to achieve end-to-end quantum applications.

Research Goals

My research aims to achieve end-to-end quantum applications by investigating both the theory of quantum information and computation and the software toolchain and systems of quantum computers. My research has contributed to the study of quantum complexity, quantum entanglement, and quantum cryptography, and has recently focused on quantum algorithms for optimization and machine learning. I have also begun to apply formal methods and programming-language techniques to quantum computing.

Formal Methods and Programming Languages in Quantum Computing

Verification of Sequential, Parallel, and Concurrent Quantum Programs

Quantum programs are error-prone, and their verification is challenging because standard software-assurance techniques are of limited use in the quantum setting. My research investigates the verification of quantum programs via static analysis with the help of quantum Hoare logic. A prominent approach to program verification is to generate invariants and inductive assertions, which is a highly nontrivial task even classically. I made the first proposal of quantum invariants and demonstrated their use in quantum program verification; such invariants can be generated by solving a semidefinite program (SDP) whose dimension is exponential in the number of qubits.
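To give a flavor of what an invariant means here, the sketch below is a toy pure-Python check, not the SDP-based generation itself: for a single-qubit program step implemented by a unitary U, a candidate invariant projector P is preserved whenever U P U† = P. The gate and projector choices (Pauli-Z with the projector onto |0⟩) are illustrative assumptions.

```python
# Toy check that a candidate invariant projector P is preserved by a
# unitary program step U, i.e. U P U^dagger == P. (Illustrative only;
# real quantum invariants are generated by solving an SDP.)

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dagger(a):
    n = len(a)
    return [[a[j][i].conjugate() for j in range(n)] for i in range(n)]

def preserves(u, p, tol=1e-9):
    upu = matmul(matmul(u, p), dagger(u))
    return all(abs(upu[i][j] - p[i][j]) < tol
               for i in range(len(p)) for j in range(len(p)))

Z = [[1, 0], [0, -1]]   # Pauli-Z gate
X = [[0, 1], [1, 0]]    # Pauli-X gate
P0 = [[1, 0], [0, 0]]   # projector onto |0>

print(preserves(Z, P0))  # True: Z leaves the |0> subspace invariant
print(preserves(X, P0))  # False: X maps |0> to |1>
```

The real setting replaces the equality check with a Löwner-order condition over superoperators and searches for P via semidefinite programming, which is where the exponential SDP dimension arises.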
In ongoing projects, I am further improving invariant-based verification in terms of both its expressive power (e.g., handling ancillae in Hoare logic) and its scalability. Moving from sequential to parallel and concurrent quantum programs is a logical next step because of important quantum applications in communication and networking. Specifically, we aim to develop a quantum Owicki-Gries-style logic by formulating the concept of interference-free proofs of component quantum programs. However, there is a significant difference between classical and quantum interference, owing to quantum correlations and causal relationships among component quantum programs. Research in quantum information on these topics (e.g., my own research on quantum correlations below) is likely to play a critical role in this development.

Publications:
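For reference, the classical Owicki-Gries interference-freedom condition that a quantum analogue would need to generalize can be stated as follows (this is the standard classical formulation, not our quantum logic):

```latex
% An assertion R used in the proof of one component program P_1 is not
% invalidated by any atomic action S (with precondition pre(S)) of a
% parallel component P_2:
\[
  \forall R \in \mathrm{assertions}(P_1),\
  \forall S \in \mathrm{actions}(P_2):
  \quad \{\, R \wedge \mathit{pre}(S) \,\}\ S\ \{\, R \,\}
\]
```

In the quantum setting, entanglement between components means a component's "assertion" (a Hermitian operator) can be disturbed by remote actions even without shared variables, which is precisely the difficulty noted above.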
Reliable and Resource-Optimized Quantum Program Synthesis

Near-term quantum applications will likely run on Noisy Intermediate-Scale Quantum (NISQ) computers, where precisely controllable qubits are expensive, error-prone, and scarce. Existing work on quantum algorithms and programming languages assumes this issue will be solved by the hardware, as in classical computers, or by independently designed fault-tolerant protocols; such work is therefore less helpful for implementing near-term quantum applications. I believe one way to attack these problems is to set aside a one-size-fits-all approach to fault tolerance and instead elevate the question of errors, and the related architecture-specific resource optimization, to the level of programming-language and algorithm design. In particular, I am inspired by techniques in approximate computing that optimize computation on unreliable classical hardware. As a first step, I have built a formal semantics and a logic for reasoning about reliability in the presence of noise in quantum computation. In continuing efforts, I aim to build an automated quantum program synthesizer that takes an ideal program and reliability requirements as constraints (which could be generated in Coq via an implementation of our logic), takes hardware-specific resource optimization as the objective, and outputs an optimized quantum program, with any additional error-mitigation steps synthesized by a backend optimizer. This automated tool will then be benchmarked on near-term quantum applications (e.g., variational methods) with specific hardware information.

Publications:
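As a toy illustration of the optimization at the heart of such a synthesizer: among candidate compilations of a program, each with an estimated reliability and a hardware cost, pick the cheapest one meeting the reliability requirement. All names and numbers below are invented for illustration; the real tool would search a space of programs and error-mitigation steps rather than a fixed list.

```python
# Toy resource-optimized selection: choose the cheapest candidate whose
# estimated reliability meets the requirement. (Illustrative stand-in
# for the synthesizer's constraint/objective structure.)

def synthesize(candidates, min_reliability):
    feasible = [c for c in candidates if c["reliability"] >= min_reliability]
    if not feasible:
        return None  # no candidate meets the reliability constraint
    return min(feasible, key=lambda c: c["cost"])

# Hypothetical candidates: reliability = estimated success probability,
# cost = e.g. two-qubit gate count on the target architecture.
candidates = [
    {"name": "unmitigated",      "reliability": 0.80, "cost": 120},
    {"name": "light-mitigation", "reliability": 0.92, "cost": 180},
    {"name": "heavy-mitigation", "reliability": 0.99, "cost": 400},
]

print(synthesize(candidates, 0.90)["name"])  # light-mitigation
```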
Formally Verified Software Toolchain for Quantum Computing

The complexity of quantum computing and the limitations of near-term quantum devices mean that the development of sophisticated quantum algorithms and clever optimizations is particularly prone to mistakes. This calls for verifying every stage of quantum computation, from the software tools used to generate quantum circuits to the architecture and system design. I am inspired by formal methods applied in safety-critical domains to ensure correctness of code by construction, especially the examples of CompCert (a C compiler written and proved correct in Coq) and the NSF Deep Specification project (which develops specifications of software toolchains to prove end-to-end correctness of whole systems). A verified quantum computing stack would ensure that each level of quantum computation is implemented according to its specification, and hence that the final system is correct, which would have wide practical impact. This approach is especially appealing for quantum computing, since alternative software-assurance techniques are severely limited by the substantial expense they incur in the quantum setting. In an ongoing project, I am working on a proved-correct optimizing compiler for quantum algorithms that reduces gate count, circuit depth, etc., while adhering to any architectural constraints.

Theoretical Studies in Quantum Information and Computation

Quantum Algorithms for Optimization and Machine Learning

My recent research aims to understand the landscape of provable quantum advantage in optimization and machine learning, a major target domain for quantum applications. To that end, I have developed quantum algorithms with polynomial speedups over classical ones for semidefinite programs (SDPs), general convex optimization, and training linear and kernelized classifiers.
My algorithms also hint at possible exponential quantum speedups when quantum data serve as the inputs/outputs of SDPs, and for the principal component analysis problem. Another well-motivated topic is property testing of quantum and classical distributions. I have closed a long-standing gap between the upper and lower bounds on the sample complexity of learning a complete description of quantum states, so-called quantum tomography, which is a fundamental step in verifying experimental state preparation. I have also demonstrated quantum speedups in estimating the Shannon and Rényi entropies of classical distributions. To accelerate these quantum applications on NISQ machines, I designed a systematic framework for decomposing a quantum computation into small pieces and then combining the simulation results from each piece into the final output. A particular application is, e.g., a practical scheme to implement a 60-qubit quantum computation using only 45-qubit quantum machines.

Publications:
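For context on the entropy-estimation results above, the classical baseline that quantum algorithms improve upon is the plug-in estimator: sample from the distribution, form the empirical distribution, and compute its entropy. A minimal sketch of that standard textbook estimator (not the quantum algorithm):

```python
import math
from collections import Counter

def plugin_shannon_entropy(samples):
    """Empirical (plug-in) Shannon entropy, in bits, of a list of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin has entropy 1 bit; on a perfectly balanced sample the
# plug-in estimate is exactly 1.0.
print(plugin_shannon_entropy(["H", "T"] * 5000))
```

The quantum speedup concerns how many samples (or queries) such an estimate requires at a given accuracy, which is where the classical plug-in approach becomes expensive for large alphabets.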
Entanglement, Quantum Correlations, and Sum-of-Squares

Research on this topic studies many aspects of one of the key quantum features: entanglement and nonlocality. I attack this topic by exploring a surprising connection between quantum information and the sum-of-squares (SOS) proof system from approximation algorithms and the famous Unique Games Conjecture (UGC). This connection allows one to leverage technical advances in one field to make progress in the other. Specific problems I am working on include the characterization of separable (unentangled) states, the complexity of quantum Merlin-Arthur games with unentangled provers (QMA(2)), and the possibility of a quantum-inspired approach to attacking the UGC.

Publications:
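To illustrate the SOS idea in its simplest classical form (a standard toy example, unrelated to the specific quantum results above): a polynomial p(x) = v(x)ᵀ Q v(x) with monomial vector v(x) = (1, x) is a sum of squares whenever the Gram matrix Q is positive semidefinite, which for a 2x2 symmetric matrix can be checked directly.

```python
# Toy sum-of-squares certificate: p(x) = x^2 + 2x + 2 equals
# v(x)^T Q v(x) with v(x) = (1, x) and Gram matrix Q = [[2, 1], [1, 1]].
# Q being positive semidefinite certifies p(x) >= 0 for all real x
# (indeed p(x) = (x + 1)^2 + 1).

def is_psd_2x2(q, tol=1e-12):
    # A symmetric 2x2 matrix is PSD iff its diagonal entries and
    # determinant are nonnegative.
    (a, b), (c, d) = q
    return a >= -tol and d >= -tol and a * d - b * c >= -tol

def eval_via_gram(q, x):
    v = (1, x)
    return sum(v[i] * q[i][j] * v[j] for i in range(2) for j in range(2))

Q = [[2, 1], [1, 1]]
print(is_psd_2x2(Q))                           # True: SOS certificate exists
print(eval_via_gram(Q, 3) == 3**2 + 2*3 + 2)   # True: Gram form matches p
```

Searching for such a PSD Gram matrix is itself an SDP, which is one face of the connection between SOS proofs and the semidefinite-programming questions arising in entanglement characterization.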
Tamper-Resilient Cryptography under Physical Assumptions

Devices, classical or quantum, are subject to tampering in cryptographic settings, especially given the proliferation of side-channel attacks. These attacks exploit the fact that devices leak information to the outside world not just through input-output interaction, but also through physical characteristics of computation such as power consumption, timing, and electromagnetic radiation. Research on this topic explores two possible solutions for protecting cryptographic systems from (quantum) side-channel attacks.
Both device-independent and leakage-resilient cryptography can be viewed as tamper-resilient cryptography under physical assumptions. My future plan is to bring these cryptographic designs closer to practice, with better efficiency and broader functionality.

Publications:
Quantum Computational Complexity

Interactive proof systems have been a central model in complexity theory, with applications ranging from the PCP theorem in hardness of approximation to cryptography. They capture problems with efficiently verifiable proofs via interaction between a polynomial-time verifier and all-powerful provers, where the verifier determines the validity of the proofs. My main contribution to this topic is the development of the Equilibrium Value Method for obtaining space-efficient simulations of quantum interactive proof systems, including QIP = PSPACE and QRG(2) = PSPACE. Recently, I have been working on a quantum variant of the PCP theorem in the interactive proof setting. As a concrete first step, I have obtained a parallel repetition result for entangled k-player games.

Publications:
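The equilibrium values in question generalize the classical min-max value of a zero-sum game. As a purely classical toy (not the quantum simulation itself), the value of a 2x2 zero-sum game can be computed in closed form:

```python
# Min-max (equilibrium) value of a 2x2 zero-sum game: if a pure saddle
# point exists it gives the value; otherwise the classical closed form
# value = (a*d - b*c) / (a + d - b - c) applies under mixed strategies.

def game_value(m):
    (a, b), (c, d) = m
    maximin = max(min(a, b), min(c, d))   # row player's pure guarantee
    minimax = min(max(a, c), max(b, d))   # column player's pure guarantee
    if maximin == minimax:                # pure saddle point
        return maximin
    return (a * d - b * c) / (a + d - b - c)

# Matching pennies: payoffs +1/-1, equilibrium value 0 under mixed play.
print(game_value([[1, -1], [-1, 1]]))  # 0.0
```

Space-efficient simulation results such as QIP = PSPACE rest on computing analogous equilibrium values for far larger, semidefinite-structured games, where no closed form exists and iterative methods are required.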
