Comparing bounds for the Mean-field Gap for the Ising model, advised by Professor Andrea Montanari. This project compared various upper bounds on the so-called “mean-field gap” that arises when estimating the log-partition function of the Ising model. The existing bounds in the literature are often complicated (involving covering numbers and variational formulas) and at first glance incomparable, so it was not known whether any one of them dominated the others.
You can find a sequence of related blog posts here.
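For a sense of the quantity involved, here is a small numerical sketch (an illustration under standard conventions, not code from the project): for a small instance one can compute the gap directly, since the naive mean-field value lower-bounds log Z by the Gibbs variational principle. Note that the fixed-point iteration below only finds a local optimum of the mean-field objective, so the printed number may overestimate the true gap.

```python
import itertools
import numpy as np

def log_Z(J, h):
    """Exact log-partition function by enumeration (small n only).
    J is symmetric with zero diagonal, so 0.5 * s @ J @ s = sum_{i<j} J_ij s_i s_j."""
    n = len(h)
    vals = [0.5 * np.array(s) @ J @ np.array(s) + h @ np.array(s)
            for s in itertools.product([-1, 1], repeat=n)]
    return np.logaddexp.reduce(vals)

def naive_mean_field(J, h, iters=2000):
    """Naive mean-field value via the fixed point m_i = tanh(h_i + (Jm)_i).
    This is a local optimum of the Gibbs variational objective over product
    measures, hence a lower bound on log_Z."""
    m = np.full(len(h), 0.01)
    for _ in range(iters):
        m = np.tanh(h + J @ m)
    m = np.clip(m, -1 + 1e-12, 1 - 1e-12)  # avoid log(0) in the entropy term
    p = (1 + m) / 2
    entropy = -(p * np.log(p) + (1 - p) * np.log(1 - p)).sum()
    return 0.5 * m @ J @ m + h @ m + entropy

rng = np.random.default_rng(0)
n = 8
J = rng.normal(scale=1 / np.sqrt(n), size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)
h = rng.normal(size=n)
print("mean-field gap ~", log_Z(J, h) - naive_mean_field(J, h))
```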
Results on Sum-Free Sequences with Aaron Kaufer and Jonathan Akaba, supervised by Rodrigo Sanches Angelo. This project was about sum-free sequences, which are built from a starting stub by the following greedy rule: each new term is the smallest integer larger than the previous term that cannot be written as a sum of two (not necessarily distinct) earlier terms.
As a concrete example, starting from the stub “1, 4”, the next term is 6, since 5 = 1 + 4 is not allowed. The sequence continues 1, 4, 6, 9, 11, 14, … and so on. Such sequences are often “regular” (following a periodic pattern, just as this example has “period 5”), but there are suspected counterexamples which are sum-free yet not regular.
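Here is a minimal sketch of this greedy construction (a toy illustration, assuming the rule above where sums of two possibly-equal earlier terms are forbidden):

```python
def sum_free_sequence(stub, n_terms):
    """Greedy sum-free sequence: each new term is the smallest integer
    above the last that is not a sum of two (possibly equal) earlier terms."""
    seq = list(stub)
    forbidden = {a + b for a in seq for b in seq}  # includes a == b
    candidate = seq[-1] + 1
    while len(seq) < n_terms:
        if candidate not in forbidden:
            # Adding `candidate` forbids its sums with all current terms.
            forbidden.update(candidate + a for a in seq)
            forbidden.add(2 * candidate)
            seq.append(candidate)
        candidate += 1
    return seq

print(sum_free_sequence([1, 4], 6))  # [1, 4, 6, 9, 11, 14]
```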
A quick observation is that the numbers missing from such sequences can be expressed as a sum in many different ways (e.g. 9 = 1 + 8 = 2 + 7 = 4 + 5). Our main results (Theorems 5.2 and 5.4) establish a converse to this observation: if every missing term of a sum-free sequence can be expressed as a sum in only linearly many ways, then the sequence must be regular.
More projects coming soon!
Final Project with Jensen Wang. Our project reviewed and implemented “Optimal Quantile Approximation in Streams” (Karnin, Lang and Liberty 2016), which settles the optimal space complexity of single-quantile approximation. Slides from our presentation.
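For flavor, here is a toy sketch of the compactor idea at the core of KLL-style sketches. It is simplified (every level gets the same buffer capacity), whereas the actual KLL algorithm shrinks capacities geometrically across levels to achieve its optimal space bound.

```python
import random

class KLLSketch:
    """Toy KLL-style quantile sketch. Items live in level buffers; when a
    level overflows it is sorted and a random half is promoted, so an item
    at level h stands in for 2**h original stream items."""

    def __init__(self, k=64):
        self.k = k           # buffer capacity (uniform here; real KLL decays it)
        self.levels = [[]]   # levels[h] holds items of weight 2**h

    def update(self, x):
        self.levels[0].append(x)
        h = 0
        while h < len(self.levels) and len(self.levels[h]) >= self.k:
            self.levels[h].sort()
            offset = random.randint(0, 1)          # unbiased choice of half
            promoted = self.levels[h][offset::2]   # keep every other item
            self.levels[h] = []
            if h + 1 == len(self.levels):
                self.levels.append([])
            self.levels[h + 1].extend(promoted)
            h += 1

    def quantile(self, q):
        """Approximate q-quantile from the weighted surviving items."""
        weighted = sorted((x, 2 ** h)
                          for h, level in enumerate(self.levels) for x in level)
        if not weighted:
            return None
        total = sum(w for _, w in weighted)
        cum = 0
        for x, w in weighted:
            cum += w
            if cum >= q * total:
                return x
        return weighted[-1][0]

sketch = KLLSketch(k=32)
for v in range(10000):
    sketch.update(v)
print(sketch.quantile(0.5))  # roughly 5000
```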
Final Project with Jensen Wang and Siah Yong Tan. Our project was on deletion codes in the constant-number-of-errors regime; these are the analogue of error-correcting codes for deletion errors. We reviewed several key ideas in the literature leading up to the breakthrough result of Sima, Gabrys and Bruck (2020), an explicit construction matching the lower bound up to a constant factor.
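As a taste of the area (this is the classical single-deletion starting point, not the Sima-Gabrys-Bruck construction itself): the Varshamov-Tenengolts code corrects one deletion using a positional checksum, decoded by Levenshtein's rule.

```python
def vt_syndrome(x):
    """VT checksum: sum of i * x_i over 1-indexed positions."""
    return sum(i * b for i, b in enumerate(x, start=1))

def vt_decode(y, n, a=0):
    """Recover the length-n codeword of VT_a (strings x with
    vt_syndrome(x) == a mod n+1) from y, which has lost one bit."""
    w = sum(y)
    d = (a - vt_syndrome(y)) % (n + 1)
    if d <= w:
        # Deleted bit was 0: reinsert it with exactly d ones to its right.
        pos, ones_right = len(y), 0
        while ones_right < d:
            pos -= 1
            ones_right += y[pos]
        return y[:pos] + [0] + y[pos:]
    # Deleted bit was 1: reinsert it with d - w - 1 zeros to its left.
    pos, zeros_left = 0, 0
    while zeros_left < d - w - 1:
        zeros_left += 1 - y[pos]
        pos += 1
    return y[:pos] + [1] + y[pos:]

# Example: x = [1,0,1,1,0] has syndrome 8 == 2 (mod 6), so x lies in VT_2.
# Deleting its second bit leaves y = [1,1,1,0]; decoding restores x.
print(vt_decode([1, 1, 1, 0], n=5, a=2))  # [1, 0, 1, 1, 0]
```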
Additionally, there are two original insights in our paper:
Scribed notes for Lecture 6 with Ryan Tolsma. This was for a guest lecture by Jay Mardia on the statistical-computational gap.
Reading Project. For the final project, I reviewed a recent paper by Polyanskiy and Wu (2021) titled “Dualizing Le Cam’s method for functional estimation, with applications to estimating the unseens”. My exposition includes proofs of the two main results, which become significantly shorter with the tools learnt in class.
Scribed notes for Lecture 8 with Jensen Wang. These notes cover a lecture on a refined upper bound for neural networks and its various consequences.
Update: A good chunk of our notes made it into the revised set of lecture notes here, and we are listed in the acknowledgements!
Reading Project with Jensen Wang. We reviewed a paper by Du et al. (2019) titled “Gradient descent finds global minima of deep neural networks”.
This was a short, self-contained proof of a theorem that we used in Math 154: Analytic Number Theory.
Combinatorial Problems via Information Theory. This was an article I wrote for the final project; it shows how information theory can help prove combinatorial statements.
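A canonical example of this technique (included here as an illustration, not necessarily one drawn from the article): entropy counting bounds binomial sums. Let X = (X_1, …, X_n) be uniform over binary strings with at most k ≤ n/2 ones. Then log_2 of the number of such strings equals H(X), subadditivity gives H(X) ≤ Σ_j H(X_j), and each coordinate is Bernoulli with bias Pr[X_j = 1] ≤ k/n ≤ 1/2, so:

```latex
\log_2 \sum_{i=0}^{k} \binom{n}{i} \;=\; H(X)
\;\le\; \sum_{j=1}^{n} H(X_j)
\;\le\; n\, h\!\left(\frac{k}{n}\right),
\qquad h(p) := -p \log_2 p - (1-p) \log_2 (1-p).
```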