Yihui Quek

Hello! I am a fifth-year PhD student at Stanford University. My research interests entangle Physics, information theory (quantum/classical) and algorithms.

My advisor is Prof. Tsachy Weissman and my undergraduate advisor at MIT was Prof. Peter W. Shor. I was a devoted quantum Shannon theorist for the first half of my PhD; after wandering into some excellent classes at Stanford, I also became interested in algorithms and theoretical computer science!

In June 2016, I graduated with a B.S. in Physics and Mathematics (Phi Beta Kappa) from MIT. I wrote my senior thesis on quantum and super-quantum enhancements to capacities of interference channels, supervised by Prof. Peter W. Shor (because of this, my Erdős number is 3!).

I am one of the two inaugural recipients of the Stanford Q-FARM Fellowship. I'm also supported by a National University of Singapore (NUS) Overseas Graduate Scholarship. From 2017 to 2020, I was supported by a Stanford Graduate Fellowship.

Twitter  /  Google Scholar /  GitHub


News

- (Oct '21) I will be giving a Spotlight talk at NeurIPS 2021 (top 3% of submissions) on our paper "Private Learning implies Quantum Stability"! The paper will be published in the conference proceedings.

- (May '21) Our paper "Private Learning implies Quantum Stability" won a $5000 research grant for excellence at the QC40: Physics of Computation Conference (40th Anniversary), organized by IBM.

- (May '21) Three of my works "Private learning implies quantum stability", "Quantum algorithm for Petz recovery channels and Pretty Good Measurements" and "Bounding the forward classical capacity of bipartite quantum channels" have been accepted for talks at TQC 2021! I will be talking about the first two projects.

- (May '21) I gave an invited talk on "Quantum algorithm for Petz recovery channels and Pretty Good Measurements" at the SIAM LA'21 Conference (Quantum Numerical Linear Algebra Minisymposium), and a talk on "Private learning implies quantum stability" at the QC40: Physics of Computation Conference (40th Anniversary), organized by IBM.

- (May '21) I will be an Area Chair for an ICML Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning.

- (May '21) Our work "Bounding the forward classical capacity of bipartite quantum channels" has been accepted as a contributed talk at ISIT 2021.

- (Feb '21) I gave an invited talk summarizing my PhD research at the Harvard University QuantumFest.

- (Oct '20) The workshop Beyond i.i.d. in Information Theory (for which I was an organizer and PC member) was successfully held at Stanford University! I also gave a lightning talk about our recent work on bipartite channel capacities, which gives the tightest known general bound on the classical capacity of a quantum channel assisted by classical feedback.

- (Jul '20 - May '21) I gave invited/contributed talks about our algorithms for implementing the Petz map, pretty good measurements and polar decomposition at the MIT QIP group meeting, the "Quantum Week of Fun" workshop (organized by Cambridge Quantum Computing), the Perimeter Institute Quantum Seminar and the seminar of Japan's QLEAP Consortium. Here are my slides.

- (Feb '20) I was one of 600 young scientists (globally, across all disciplines) invited to participate in the 2020 Lindau Nobel Laureate Meeting (Interdisciplinary) in Lindau, Germany.

  • 2017/1 - Present: Ph.D. candidate, Stanford University
  • 2012/8 - 2016/6: B.Sc., Massachusetts Institute of Technology (GPA: 4.9/5.0)
  • 2006/1 - 2011/1: NUS High School of Mathematics and Science, Singapore (Rank: 1/209)

Reviewer for Physical Review Letters, Physical Review A, Quantum, ISIT (International Symposium on Information Theory), IEEE Transactions on Information Theory, SODA (Symposium on Discrete Algorithms), ESA (European Symposium on Algorithms), IEEE Quantum Engineering.

I organized the Beyond I.I.D. in Information Theory workshop held at Stanford in 2020, and was an Area Chair for an ICML 2021 workshop.

Quantum Learning Theory
Learnability of the output distributions of local quantum circuits
Marcel Hinsche, Marios Ioannou, Alexander Nietner, Jonas Haferkamp, Yihui Quek, Dominik Hangleiter, Jean-Pierre Seifert, Jens Eisert, Ryan Sweke

How hard is it to learn the output distribution of a local quantum circuit? It depends on how one queries the circuit! We consider two types of access to the output distribution of such a circuit: regular samples, and statistical queries (where the learner can access only averaged statistical properties of the distribution to be learned). We then show an exponential separation between the complexities of learning under these two access models, adding to the few known examples of natural learning problems that separate them.
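As a purely classical illustration of the two access models (the names and numbers below are hypothetical, not from the paper): a sample oracle hands the learner raw draws from the distribution, while a statistical query (SQ) oracle only answers with the expectation of a bounded function of the outcome, up to some tolerance.

```python
import random

def sample_oracle(dist, n):
    """Sample access: return n raw draws from dist = {outcome: probability}."""
    outcomes, probs = zip(*dist.items())
    return random.choices(outcomes, weights=probs, k=n)

def sq_oracle(dist, f, tau=0.01):
    """Statistical query access: return E[f(x)] to within tolerance tau.
    (Here we return the exact expectation; a real SQ oracle may perturb it
    adversarially by up to tau.)"""
    return sum(p * f(x) for x, p in dist.items())

dist = {0: 0.25, 1: 0.75}
print(sample_oracle(dist, 5))        # e.g. [1, 1, 0, 1, 1]
print(sq_oracle(dist, lambda x: x))  # 0.75
```

The SQ learner never sees individual outcomes, only noisy averages, which is exactly what makes some distributions exponentially harder to learn in that model.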

Private Learning implies Quantum Stability
Srinivasan Arunachalam, Yihui Quek, John Smolin
[arXiv], [slides from QC40], [Limerick]
To appear at NeurIPS 2021 as a Spotlight talk (top 3% of submissions)

We show that a number of "reduced" models for learning quantum states, among them differentially private learning and online learning, are surprisingly connected through certain combinatorial dimensions; in particular, private learnability implies quantum stability.

Invited talk at the QC40: Physics of Computation conference (public event); contributed talk at TQC'21.

Quantum Algorithms
Quantum algorithm for Petz recovery channels and pretty good measurements
András Gilyén, Seth Lloyd, Iman Marvian, Yihui Quek, Mark M. Wilde
[arXiv], [talk], [slides], [Limerick]

We use the recently developed Quantum Singular Value Transformation technique to implement two ubiquitous theoretical tools: Petz recovery channels and pretty good measurements.
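For context, the two objects being implemented have standard textbook definitions (notation here is assumed, not copied from the paper): for a channel N and reference state σ, the Petz recovery channel is

```latex
% Petz recovery channel for channel N and reference state sigma:
\mathcal{P}_{\sigma,\mathcal{N}}(\omega)
  = \sigma^{1/2}\,
    \mathcal{N}^{\dagger}\!\bigl(\mathcal{N}(\sigma)^{-1/2}\,\omega\,\mathcal{N}(\sigma)^{-1/2}\bigr)\,
    \sigma^{1/2}

% Pretty good measurement for an ensemble {(p_i, rho_i)}
% with average state rho_bar = sum_i p_i rho_i:
M_i = \bar{\rho}^{\,-1/2}\, p_i \rho_i\, \bar{\rho}^{\,-1/2}
```

Both involve (inverse) square roots of density operators, which is what makes singular-value-transformation techniques a natural fit.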

Contributed talk at TQC'21; talk at the MIT QIS group meeting; invited talks at the seminar of Japan's QLEAP Consortium and the Perimeter Institute Quantum Seminar; contributed talk at the "Quantum Week of Fun" workshop.

Robust Quantum Minimum Finding with an Application to Hypothesis Selection
Yihui Quek, Clément Canonne, Patrick Rebentrost
[arXiv], [slides], [Limerick]

We show that the quantum minimum-finding algorithm of Dürr and Høyer can be made robust even in the presence of a noisy or imprecise comparator. We also give an application to hypothesis selection that runs in time sublinear in the number of hypotheses.
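A classical toy version of the robustness question (entirely illustrative; the actual algorithm is quantum, and the names below are made up): if each pairwise comparison is wrong with probability p < 1/2, a majority vote over repeated comparisons restores a reliable comparator, and minimum finding can proceed on top of it.

```python
import random

def noisy_less(a, b, p):
    """A comparator that returns the wrong answer with probability p."""
    truth = a < b
    return truth if random.random() >= p else (not truth)

def robust_less(a, b, p, trials=41):
    """Boost the noisy comparator by majority vote over repeated queries."""
    votes = sum(noisy_less(a, b, p) for _ in range(trials))
    return votes > trials // 2

def robust_min(xs, p=0.2):
    """Minimum finding using only (boosted) noisy pairwise comparisons."""
    best = xs[0]
    for x in xs[1:]:
        if robust_less(x, best, p):
            best = x
    return best

random.seed(0)
print(robust_min([7, 3, 9, 1, 5], p=0.2))  # 1, with overwhelming probability
```

The naive boosting above multiplies the query count by the number of repetitions; the quantum result is interesting precisely because it controls this overhead.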

Invited talk at the Algorithms and Complexity Seminar of IRIF (CNRS/University of Paris); talk at an online seminar of the University of Technology Sydney's QSI.

Quantum Shannon theory

Bounding the forward classical capacity of bipartite quantum channels
Dawei Ding, Sumeet Khatri, Yihui Quek, Peter W. Shor, Xin Wang, Mark M. Wilde
Short version in 2021 IEEE International Symposium on Information Theory (ISIT), Melbourne, Australia, 2021, pp. 906-911; long version submitted. [arXiv], [short talk], [long talk (by Mark)]

We derive an SDP-computable upper bound on the classical capacity of a bipartite channel. As a corollary, we also obtain the tightest known upper bound on the classical capacity of a quantum channel assisted by classical feedback, in a sequel to our first paper on the topic.

Contributed talk at TQC'21; lightning talk at the Beyond i.i.d. in Information Theory workshop; contributed talk at ISIT 2021, with a long version submitted to the corresponding transactions.

Entropy Bound for the Classical Capacity of a Quantum Channel aided by Classical Feedback
Dawei Ding, Yihui Quek, Peter W. Shor, Mark M. Wilde
2019 IEEE International Symposium on Information Theory (ISIT), Paris, France, 2019, pp. 250-254
[arXiv], [IEEE]

The first general bound on the classical capacity of a quantum channel assisted by classical feedback, stated in terms of the maximum output entropy of that channel.
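In symbols (notation assumed here, not copied from the paper), the bound says the classical capacity of a channel N assisted by classical feedback is at most its maximum output entropy:

```latex
C_{\text{FB}}(\mathcal{N}) \;\le\; \max_{\rho}\, H\bigl(\mathcal{N}(\rho)\bigr)
```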

Contributed talk at the International Symposium on Information Theory (ISIT) 2019.

Quantum and Super-Quantum Enhancements to Two-sender, Two-receiver Channels
Yihui Quek and Peter W. Shor
Physical Review A, Vol. 95, No. 5, May 1, 2017
[arXiv], [Physical Review A]

Poster at Young Quantum Information Scientists Symposium in Barcelona, 2016

Quantum Information Theory

Adaptive Quantum State Tomography with Neural Networks
Yihui Quek, Stanislav Fort, Hui Khoon Ng
[arXiv], [slides (by Stanislav)]

We design a recurrent neural network architecture for adaptive quantum state tomography, achieving an orders-of-magnitude speedup over an existing Bayesian algorithm for realistic numbers of measurements while retaining the same reconstruction accuracy.

Invited talk at the Stanford Q-FARM seminar (Feb 2020); contributed talks at the 3rd Quantum Techniques in Machine Learning conference (QTML 2019) in Korea and the McGill Physics-AI conference in Montreal; poster at the 4th Seefeld Workshop on Quantum Information, the 22nd Annual Conference on Quantum Information Processing (QIP 2019), and the Machine Learning and the Physical Sciences workshop at NeurIPS 2019.

Signal processing, Biophysics, Linguistics
Minimum Power to Maintain a Nonequilibrium Distribution of a Markov Chain
Dmitri Pavlichin, Yihui Quek, Tsachy Weissman

Inspired by a question of Feynman, we propose KL-divergence between Markov chains as a notion of energy cost for maintaining a nonequilibrium distribution in biological systems.
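A small numerical sketch of the kind of quantity involved (toy transition matrices; the formula below is the generic KL divergence rate between Markov chains, assumed rather than taken from the paper): it averages the row-wise KL divergences between the two transition kernels under the stationary distribution of the first chain.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of transition matrix P (rows sum to 1)."""
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmax(np.real(evals))])
    return v / v.sum()

def kl_rate(P, Q):
    """KL divergence rate between Markov chains with kernels P and Q:
    sum_i pi(i) * sum_j P(i,j) log(P(i,j)/Q(i,j))."""
    pi = stationary(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        logratio = np.where(P > 0, np.log(P / Q), 0.0)
    return float(np.sum(pi[:, None] * P * logratio))

P = np.array([[0.9, 0.1], [0.2, 0.8]])  # a "sticky" chain
Q = np.array([[0.5, 0.5], [0.5, 0.5]])  # a memoryless reference chain
print(kl_rate(P, Q))
```

The rate is zero iff the two chains have the same kernel on the support of P's stationary distribution, which is what makes it a natural "cost of staying out of equilibrium".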

Body size-dependent energy storage causes Kleiber's law scaling of the metabolic rate in planarians
Albert Thommen, Steffen Werner, Olga Frank, Jenny Philipp, Oskar Knittelfelder, Yihui Quek, Karim Fahmy, Andrej Shevchenko, Benjamin M. Friedrich, Frank Jülicher, Jochen C. Rink
eLife, 8 Art. No. e38187 (2019)
[bioRxiv] [e-Life]

Contributed talk at 1st Crick-Beddington Developmental Biology Symposium, 2019

Severing focus form and meaning in Standard and Colloquial Singapore English
Yihui Quek and Aron Hirsch
Proceedings of the 47th meeting of the North-east Linguistics Society (NELS 47), 2016

Poster at NELS47, 2016

Generalized Robust Shrinkage Estimator and its application to STAP detection problem
Frédéric Pascal, Yacine Chitour and Yihui Quek
IEEE Transactions on Signal Processing, Vol. 62, No. 21, Nov 1, 2014
