Scott W. Linderman

Stanford University

I'm an assistant professor of Statistics and a member of the Wu Tsai Neurosciences Institute at Stanford University. I work in the fields of machine learning and computational neuroscience. My research focuses on developing probabilistic models and inference algorithms to better understand complex biological data. Previously, I was a postdoctoral fellow with Liam Paninski and David Blei at Columbia University, and I completed my PhD in Computer Science at Harvard University with Ryan Adams and Leslie Valiant.

E-mail: scott.linderman@stanford.edu   CV

Research Group

Here are some highlights of our work.

Hierarchical Recurrent State Space Models of Neural Activity

With Annika Nichols, David Blei, Manuel Zimmer, and Liam Paninski. bioRxiv, 2019

We develop hierarchical and recurrent state space models for whole-brain recordings of neural activity in C. elegans. We find brain states that correspond to discrete elements of worm behavior, as well as dynamics that are modulated by brain state and sensory input.

Tree-structured Recurrent SLDS

With Josue Nassar, Monica Bugallo, and Il Memming Park. ICLR, 2019

We develop an extension of the rSLDS to capture hierarchical, multi-scale structure in dynamics via a tree-structured stick-breaking model. We recursively partition the latent space to obtain a piecewise linear approximation of nonlinear dynamics. A hierarchical prior smooths dynamics estimates, and inference is performed via an augmented Gibbs sampling algorithm.
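The recursive partition can be illustrated with a toy stick-breaking pass over a balanced binary tree. This is a hedged sketch, not the paper's implementation: the heap-ordered layout and the logistic gates (`gate_weights`, `gate_biases`) are assumptions made for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tree_stick_breaking(x, gate_weights, gate_biases):
    """Leaf probabilities of a balanced binary tree in heap order.

    Each internal node d splits its remaining stick mass with a logistic
    gate sigmoid(gate_weights[d] @ x + gate_biases[d]); a leaf's
    probability is the product of the gate probabilities along its path.
    """
    D = len(gate_weights)           # number of internal nodes; leaves = D + 1
    probs = np.ones(2 * D + 1)      # heap-indexed node masses, root mass = 1
    for d in range(D):
        p = sigmoid(gate_weights[d] @ x + gate_biases[d])
        probs[2 * d + 1] = probs[d] * p         # left child keeps fraction p
        probs[2 * d + 2] = probs[d] * (1 - p)   # right child gets the rest
    return probs[D:]                # the last D + 1 entries are the leaves

# Example: four leaves split by three gates as a function of a latent state x.
rng = np.random.default_rng(0)
pi = tree_stick_breaking(rng.normal(size=2), rng.normal(size=(3, 2)), rng.normal(size=3))
assert np.isclose(pi.sum(), 1.0)
```

Because each gate only splits the mass that reaches it, the leaf probabilities sum to one by construction, and nearby leaves share ancestors, which is what lets a hierarchical prior smooth their dynamics.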

Point process latent variable models of larval zebrafish behavior

With Anuj Sharma, Robert Johnson, and Florian Engert. NeurIPS, 2018

We develop deep state space models with point process observation models to capture structure in larval zebrafish behavior. The models combine discrete and continuous latent variables. We marginalize the discrete states with message passing and perform inference with bidirectional LSTM recognition networks.
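The discrete-state marginalization can be illustrated with the standard forward algorithm on a plain HMM, a simplified stand-in for the deep state space models described above (the function and its arguments are made up for this sketch):

```python
import numpy as np

def hmm_log_marginal(log_pi0, log_P, log_liks):
    """Sum the discrete states out of an HMM likelihood with the forward
    algorithm in log space.

    log_pi0:  (K,) initial state log probabilities
    log_P:    (K, K) transition log probabilities (rows index current state)
    log_liks: (T, K) per-timestep observation log likelihoods
    Returns log p(y_{1:T}).
    """
    log_alpha = log_pi0 + log_liks[0]
    for t in range(1, len(log_liks)):
        # Sum over the previous state, then weight by the new observation.
        log_alpha = np.logaddexp.reduce(log_alpha[:, None] + log_P, axis=0) + log_liks[t]
    return np.logaddexp.reduce(log_alpha)
```

Marginalizing the discrete states this way yields a differentiable objective, which is what allows a recognition network (here, a bidirectional LSTM) to be trained end to end over the continuous latents.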

Variational Sequential Monte Carlo

With Christian Naesseth, Rajesh Ranganath, and David Blei. AISTATS, 2018

We view SMC as a variational family indexed by the parameters of its proposal distribution and show how this generalizes the importance weighted autoencoder. As the number of particles goes to infinity, the variational approximation approaches the true posterior.
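The quantity being optimized is the log of the SMC normalizing-constant estimate. The paper treats the proposal parameters as variational parameters; the toy sketch below instead fixes a bootstrap proposal on an assumed 1D linear Gaussian model, just to show the estimator itself.

```python
import numpy as np

def bootstrap_smc_log_Z(y, num_particles, a=0.9, q=1.0, r=1.0, rng=None):
    """Log of the SMC estimate of p(y) for the model
    x_1 ~ N(0, q),  x_t = a * x_{t-1} + N(0, q),  y_t = x_t + N(0, r),
    with a bootstrap proposal and multinomial resampling. In expectation
    this lower-bounds log p(y), and the bound tightens as particles grow.
    """
    rng = np.random.default_rng() if rng is None else rng
    log_Z = 0.0
    ancestors = np.arange(num_particles)
    x = np.zeros(num_particles)
    for t, y_t in enumerate(y):
        noise = rng.normal(0.0, np.sqrt(q), size=num_particles)
        x = noise if t == 0 else a * x[ancestors] + noise
        log_w = -0.5 * (y_t - x) ** 2 / r - 0.5 * np.log(2 * np.pi * r)
        log_Z += np.logaddexp.reduce(log_w) - np.log(num_particles)
        w = np.exp(log_w - log_w.max())                  # stabilized weights
        ancestors = rng.choice(num_particles, size=num_particles, p=w / w.sum())
    return log_Z
```

With a single particle this reduces to the standard ELBO; with many particles it recovers the importance weighted autoencoder bound extended over time via resampling.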

Rejection Sampling Variational Inference

With Christian Naesseth, Francisco Ruiz, and David Blei. AISTATS, 2017
Best Paper Award

We derive reparameterization gradients through rejection samplers, enabling automatic variational inference in models with gamma, beta, and Dirichlet latent variables.
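A sketch of the key idea for the gamma case, via the Marsaglia–Tsang proposal transform. For brevity this omits the acceptance step and the score-function correction term from the paper; since the proposal's acceptance rate is high, the pathwise term alone already illustrates the mechanism.

```python
import numpy as np

def h(eps, alpha):
    """Marsaglia–Tsang proposal transform: eps ~ N(0, 1) -> approx Gamma(alpha, 1)."""
    return (alpha - 1.0 / 3.0) * (1.0 + eps / np.sqrt(9.0 * alpha - 3.0)) ** 3

def dh_dalpha(eps, alpha):
    """Pathwise derivative of the transform with respect to the shape alpha."""
    s = np.sqrt(9.0 * alpha - 3.0)
    c = 1.0 + eps / s
    return c ** 3 - (alpha - 1.0 / 3.0) * 3.0 * c ** 2 * 4.5 * eps / s ** 3

# Pathwise Monte Carlo estimate of d/d alpha E[z] for z ~ Gamma(alpha, 1);
# the true gradient is 1 because E[z] = alpha.
rng = np.random.default_rng(0)
eps = rng.standard_normal(500_000)
grad_estimate = dh_dalpha(eps, 2.0).mean()
```

Because the noise eps is independent of alpha, gradients flow through the transform itself, which is what makes these non-reparameterizable distributions usable in stochastic variational inference.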

Recurrent Switching Linear Dynamical Systems

With Matt Johnson, Andy Miller, Ryan Adams, David Blei, and Liam Paninski. AISTATS, 2017

We develop Bayesian learning and inference algorithms for models in which discrete and continuous latent states co-evolve.

Dependent Multinomial Models Made Easy

With Matt Johnson and Ryan Adams. NIPS, 2015

We use a stick-breaking construction and Pólya-gamma augmentation to derive block Gibbs samplers for linear Gaussian models with multinomial observations.
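A minimal sketch of the stick-breaking construction (illustrative only; the function names are made up for this example): the multinomial probability vector is built from K−1 logistic sticks, and a multinomial count vector decomposes into K−1 binomials, each of which admits Pólya-gamma augmentation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stick_breaking_probs(psi):
    """Map K-1 real stick weights psi to a K-dim probability vector:
    pi_k = sigmoid(psi_k) * prod_{j<k} (1 - sigmoid(psi_j)), with the
    final category taking the remaining stick mass."""
    p = sigmoid(psi)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - p)])
    return np.concatenate([p, [1.0]]) * remaining

def multinomial_to_binomials(x):
    """Decompose multinomial counts x into K-1 binomial (successes, trials)
    pairs: the k-th binomial draws x[k] successes from the trials not yet
    assigned to an earlier category."""
    trials = x.sum() - np.concatenate([[0], np.cumsum(x[:-1])[:-1]])
    return x[:-1], trials

counts = np.array([2, 3, 1, 4])
successes, trials = multinomial_to_binomials(counts)   # (2, 3, 1) of (10, 8, 5)
```

Each binomial likelihood in psi_k is logistic, so Pólya-gamma augmentation renders it conditionally Gaussian, which is what yields the block Gibbs updates for the linear Gaussian part of the model.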

Studying Synaptic Plasticity with Time-Varying GLMs

With Chris Stock and Ryan Adams. NIPS, 2014

We propose a time-varying generalized linear model whose weights evolve according to synaptic plasticity rules, and we perform Bayesian inference with particle MCMC.
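As a toy version of the idea (hedged: a random-walk drift stands in for a learned plasticity rule such as STDP, and all names and parameters here are hypothetical), a two-neuron simulation shows the structure of a GLM whose weight changes over time:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_time_varying_glm(T, b=-2.0, w0=0.5, eta=0.02, rng=None):
    """Simulate two neurons where the synaptic weight from neuron 1 to
    neuron 2 drifts over time. Neuron 2 spikes from a Bernoulli GLM driven
    by a bias and neuron 1's previous spike; a random walk stands in for
    the plasticity rule governing the weight.
    """
    rng = np.random.default_rng() if rng is None else rng
    s1 = np.zeros(T, dtype=int)     # presynaptic spikes
    s2 = np.zeros(T, dtype=int)     # postsynaptic spikes
    w = np.zeros(T)                 # time-varying synaptic weight
    w_t, s1_prev = w0, 0
    for t in range(T):
        s1[t] = rng.random() < 0.2                      # constant presynaptic rate
        s2[t] = rng.random() < sigmoid(b + w_t * s1_prev)
        w[t] = w_t
        w_t += eta * rng.standard_normal()              # weight drift
        s1_prev = s1[t]
    return s1, s2, w
```

Inference then targets the weight trajectory (and the parameters of the plasticity rule) given only the spike trains, which is where particle MCMC comes in.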

Publications

2019

  1. Sun*, R., Linderman*, S. W., Kinsella, I., & Paninski, L. (2019). Scalable Bayesian inference of dendritic voltage via spatiotemporal recurrent state space models. Advances in Neural Information Processing Systems (NeurIPS). Selected for Oral Presentation (0.5% of all submissions)
  2. Apostolopoulou, I., Linderman, S. W., Miller, K., & Dubrawski, A. (2019). Mutually regressive point processes. Advances in Neural Information Processing Systems (NeurIPS).
  3. Schein, A., Linderman, S. W., Zhou, M., Blei, D., & Wallach, H. (2019). Poisson-randomized gamma dynamical systems. Advances in Neural Information Processing Systems (NeurIPS).
    arXiv
  4. Batty*, E., Whiteway*, M., Saxena, S., Biderman, D., Abe, T., Musall, S., … Paninski†, L. (2019). BehaveNet: nonlinear embedding and Bayesian neural decoding of behavioral videos. Advances in Neural Information Processing Systems (NeurIPS).
  5. Johnson*, R. E., Linderman*, S. W., Panier, T., Wee, C. L., Song, E., Herrera, K. J., … Engert, F. (2019). Probabilistic Models of Larval Zebrafish Behavior: Structure on Many Scales. bioRxiv. https://doi.org/10.1101/672246
    bioRxiv
  6. Linderman, S. W., Nichols, A. L. A., Blei, D. M., Zimmer, M., & Paninski, L. (2019). Hierarchical recurrent state space models reveal discrete and continuous dynamics of neural activity in C. elegans. bioRxiv. https://doi.org/10.1101/621540
    bioRxiv
  7. Nassar, J., Linderman, S. W., Park, I. M., & Bugallo, M. (2019). Tree-structured locally linear dynamics model to uproot Bayesian neural data analysis. Computational and Systems Neuroscience (Cosyne) Abstracts.
  8. Raju, R. V., Li, Z., Linderman, S. W., & Pitkow, X. (2019). Inferring implicit inference. Computational and Systems Neuroscience (Cosyne) Abstracts.
  9. Glaser, J., Linderman, S. W., Whiteway, M., Perich, M., Dekleva, B., Miller, L., & Cunningham, J. P. (2019). State space models for multiple interacting neural populations. Computational and Systems Neuroscience (Cosyne) Abstracts.
  10. Markowitz, J., Gillis, W., Murmann, J., Linderman, S. W., Sabatini, B., & Datta, S. (2019). Resolving the neural mechanisms of reinforcement learning through new behavioral technologies. Computational and Systems Neuroscience (Cosyne) Abstracts.
  11. Linderman, S. W., Sharma, A., Johnson, R. E., & Engert, F. (2019). Point process latent variable models of larval zebrafish behavior. Computational and Systems Neuroscience (Cosyne) Abstracts.
  12. Nassar, J., Linderman, S. W., Bugallo, M., & Park, I. M. (2019). Tree-Structured Recurrent Switching Linear Dynamical Systems for Multi-Scale Modeling. In International Conference on Learning Representations (ICLR).
    Paper arXiv

2018

  1. Tansey, W., Li, K., Zhang, H., Linderman, S. W., Rabadan, R., Blei, D. M., & Wiggins, C. H. (2018). Dose-response modeling in high-throughput cancer drug screenings: A case study with recommendations for practitioners. arXiv preprint arXiv:1812.05691.
    arXiv
  2. Sharma, A., Johnson, R. E., Engert, F., & Linderman, S. W. (2018). Point process latent variable models of freely swimming larval zebrafish. Advances in Neural Information Processing Systems (NeurIPS).
    Paper Code
  3. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., … Datta, S. R. (2018). The Striatum Organizes 3D Behavior via Moment-to-Moment Action Selection. Cell. https://doi.org/10.1016/j.cell.2018.04.019
    Paper
  4. Linderman, S. W., Nichols, A., Blei, D. M., Zimmer, M., & Paninski, L. (2018). Hierarchical recurrent models reveal latent states of neural activity in C. elegans. Computational and Systems Neuroscience (Cosyne) Abstracts.
  5. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., … Datta, S. R. (2018). Complementary Direct and Indirect Pathway Activity Encodes Sub-Second 3D Pose Dynamics in Striatum. Computational and Systems Neuroscience (Cosyne) Abstracts.
  6. Johnson*, R. E., Linderman*, S. W., Panier, T., Wee, C., Song, E., Herrera, K., … Engert, F. (2018). Revealing multiple timescales of structure in larval zebrafish behavior. Computational and Systems Neuroscience (Cosyne) Abstracts.
  7. Mena, G. E., Belanger, D., Linderman, S. W., & Snoek, J. (2018). Learning Latent Permutations with Gumbel-Sinkhorn Networks. International Conference on Learning Representations (ICLR).
    Paper Code
  8. Linderman, S. W., Mena, G. E., Cooper, H., Paninski, L., & Cunningham, J. P. (2018). Reparameterizing the Birkhoff Polytope for Variational Permutation Inference. In Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper arXiv
  9. Naesseth, C. A., Linderman, S. W., Ranganath, R., & Blei, D. M. (2018). Variational Sequential Monte Carlo. In Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper arXiv Code

2017

  1. Linderman, S. W., Wang, Y., & Blei, D. M. (2017). Bayesian inference for latent Hawkes processes. Advances in Approximate Bayesian Inference Workshop at the 31st Conference on Neural Information Processing Systems.
    Paper
  2. Buchanan, E. K., Lipschitz, A., Linderman, S. W., & Paninski, L. (2017). Quantifying the behavioral dynamics of C. elegans with autoregressive hidden Markov models. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  3. Mena, G. E., Linderman, S. W., Belanger, D., Snoek, J., Cunningham, J. P., & Paninski, L. (2017). Toward Bayesian permutation inference for identifying neurons in C. elegans. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  4. Linderman, S. W., & Johnson, M. J. (2017). Structure-Exploiting Variational Inference for Recurrent Switching Linear Dynamical Systems. IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing.
    Paper
  5. Linderman, S. W., & Blei, D. M. (2017). Comment: A Discussion of “Nonparametric Bayes Modeling of Populations of Networks.” Journal of the American Statistical Association, 112(520), 1543–1547.
    Paper Code
  6. Linderman, S. W., & Gershman, S. J. (2017). Using computational theory to constrain statistical models of neural data. Current Opinion in Neurobiology, 46, 14–24.
    Paper bioRxiv Code
  7. Linderman, S. W., Miller, A. C., Adams, R. P., Blei, D. M., Johnson, M. J., & Paninski, L. (2017). Neuro-behavioral Analysis with Recurrent switching linear dynamical systems. Computational and Systems Neuroscience (Cosyne) Abstracts.
  8. Linderman*, S. W., Johnson*, M. J., Miller, A. C., Adams, R. P., Blei, D. M., & Paninski, L. (2017). Bayesian learning and inference in recurrent switching linear dynamical systems. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper Slides Talk Code
  9. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2017). Reparameterization Gradients through Acceptance-Rejection Sampling Algorithms. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Best Paper Award.
    Paper Blog Post Code

2016

  1. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2016). Rejection sampling variational inference. Advances in Approximate Bayesian Inference Workshop at the 30th Conference on Neural Information Processing Systems.
    Paper
  2. Chen, Z., Linderman, S. W., & Wilson, M. A. (2016). Bayesian Nonparametric Methods For Discovering Latent Structures Of Rat Hippocampal Ensemble Spikes. IEEE Workshop on Machine Learning for Signal Processing.
    Paper Code
  3. Elibol, H. M., Nguyen, V., Linderman, S. W., Johnson, M. J., Hashmi, A., & Doshi-Velez, F. (2016). Cross-Corpora Unsupervised Learning of Trajectories in Autism Spectrum Disorders. Journal of Machine Learning Research, 17(133), 1–38.
    Paper
  4. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2016). Bayesian latent structure discovery from multi-neuron recordings. In Advances in Neural Information Processing Systems (NIPS).
    Paper Code
  5. Linderman, S. W. (2016). Bayesian methods for discovering structure in neural spike trains (PhD thesis). Harvard University. Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology from the International Society for Bayesian Analysis
    Thesis Code
  6. Linderman, S. W., Johnson, M. J., Wilson, M. A., & Chen, Z. (2016). A Bayesian nonparametric approach to uncovering rat hippocampal population codes during spatial navigation. Journal of Neuroscience Methods, 263, 36–47.
    Paper Code
  7. Linderman, S. W., Tucker, A., & Johnson, M. J. (2016). Bayesian Latent State Space Models of Neural Activity. Computational and Systems Neuroscience (Cosyne) Abstracts.

2015

  1. Linderman*, S. W., Johnson*, M. J., & Adams, R. P. (2015). Dependent Multinomial Models Made Easy: Stick-Breaking with the Pólya-gamma Augmentation. In Advances in Neural Information Processing Systems (NIPS) (pp. 3438–3446).
    Paper arXiv Code
  2. Linderman, S. W., & Adams, R. P. (2015). Scalable Bayesian Inference for Excitatory Point Process Networks. arXiv preprint arXiv:1507.03228.
    arXiv Code
  3. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2015). Inferring structured connectivity from spike trains under negative-binomial generalized linear models. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Johnson, M. J., Linderman, S. W., Datta, S. R., & Adams, R. P. (2015). Discovering switching autoregressive dynamics in neural spike train recordings. Computational and Systems Neuroscience (Cosyne) Abstracts.

2014

  1. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. In Advances in Neural Information Processing Systems (NIPS) (pp. 2330–2338).
    Paper arXiv
  2. Linderman, S. W. (2014). Discovering Latent States of the Hippocampus with Bayesian Hidden Markov Models. CBMM Memo 024: Abstracts of the Brains, Minds, and Machines Summer School.
    Paper
  3. Linderman, S. W., & Adams, R. P. (2014). Discovering Latent Network Structure in Point Process Data. In Proceedings of the International Conference on Machine Learning (ICML) (pp. 1413–1421).
    Paper arXiv Talk Code
  4. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. Annual Meeting of the Society for Neuroscience.
    Paper
  5. Nemati, S., Linderman, S. W., & Chen, Z. (2014). A Probabilistic Modeling Approach for Uncovering Neural Population Rotational Dynamics. Computational and Systems Neuroscience (Cosyne) Abstracts.

2013

  1. Linderman, S. W., & Adams, R. P. (2013). Fully-Bayesian Inference of Structured Functional Networks in GLMs. Acquiring and Analyzing the Activity of Large Neural Ensembles Workshop at Neural Information Processing Systems (NIPS).
  2. Linderman, S. W., & Adams, R. P. (2013). Discovering structure in spiking networks. New England Machine Learning Day.
  3. Linderman, S. W., & Adams, R. P. (2013). Inferring functional connectivity with priors on network topology. Computational and Systems Neuroscience (Cosyne) Abstracts.