Recent Projects
I am currently working on two broad projects aimed at understanding the behavior of neural networks:
Comparative Study of Neural Networks, Random Feature Regression, and Neural Tangent Kernel
Understanding the Optimization Landscape of Deep Neural Networks
Past Projects
Understanding the Behavior of Variational Inference in Topic Models
Optimal Eigenvalue Shrinkage for the Spiked Model
Publications
You can access my Google Scholar profile here.
2019
Ghorbani, B., Mei, S., Misiakiewicz, T., Montanari, A. “Limitations of Lazy Training of Two-layers Neural Networks” NeurIPS (2019) (Accepted for Spotlight) link
Ghorbani, B., Mei, S., Misiakiewicz, T., Montanari, A. “Linearized Two-Layers Neural Networks in High Dimension” Submitted to the Annals of Statistics (2019) link
Ghorbani, B., Xiao, Y., Krishnan, S. “An Investigation into Neural Net Optimization Via Hessian Eigenvalue Density” ICML (2019) link code
Ghorbani, B., Xiao, Y., Krishnan, S. “The Effect of Network Depth on the Optimization Landscape” ICML Workshop on Deep Phenomena (2019) link
Ghorbani, B., Javadi, H., Montanari, A. “An Instability in Variational Inference for Topic Models” ICML (2019) link code
2018
Donoho, D., Ghorbani, B. “Optimal Covariance Estimation for Condition Number Loss in the Spiked Model” Submitted to the Annals of Statistics (2018) link
2015
Ghorbani, B., Yilmaz, O. “Sparse Regression With Highly Correlated Predictors” Undergraduate Thesis link