Linderman Lab

Stanford University

Welcome to the Linderman Lab! We are part of the Statistics Department and the Wu Tsai Neurosciences Institute at Stanford University. We work at the intersection of machine learning and computational neuroscience, developing models and algorithms to better understand complex biological data. Check out some of our research below, and reach out if you'd like to learn more!

Scott W. Linderman

I'm an Assistant Professor of Statistics and, by courtesy, Electrical Engineering and Computer Science at Stanford University. I'm also an Institute Scholar in the Wu Tsai Neurosciences Institute and a member of Stanford Bio-X and the Stanford AI Lab. Previously, I was a postdoctoral fellow with Liam Paninski and David Blei at Columbia University, and I completed my PhD in Computer Science at Harvard University with Ryan Adams and Leslie Valiant. I obtained my undergraduate degree in Electrical and Computer Engineering from Cornell University and spent three years as a software engineer at Microsoft prior to graduate school.

E-mail: scott.linderman@stanford.edu   CV Scholar Twitter

Research Group

Alisa Levin

PhD Student (CS, co-advised by Prof. Jaimie Henderson)

Alumni

Ben Antin

Research Assistant (2019-20)
(now PhD Student at Columbia)

Dieterich Lawson

Graduate Student (2019-23)
(now Research Scientist at Google)

Alex Williams

Postdoc (2019-21)
(now Asst. Prof. at NYU and Research Scientist at the Flatiron Institute)

Here are some highlights of our work.

SIXO: Smoothing Inference with Twisted Objectives

With Dieterich Lawson, Allan Raventos, and Andy Warrington. NeurIPS 2022
Oral Presentation

We improve Variational Sequential Monte Carlo (VSMC) methods by targeting smoothing distributions, which condition on the full observation sequence, rather than filtering distributions, which condition only on observations up to the current time.
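For context (standard state space model notation, not notation from the paper itself): given latent states z_{1:T} and observations x_{1:T}, the two target distributions differ only in how much data they condition on:

```latex
\underbrace{p(z_t \mid x_{1:t})}_{\text{filtering}}
\qquad \text{vs.} \qquad
\underbrace{p(z_t \mid x_{1:T})}_{\text{smoothing}}
```

Targeting the smoothing distribution lets the proposal at time t account for future observations, which filtering-based objectives cannot.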

Jacobian Switching Linear Dynamical Systems

With Jimmy Smith and David Sussillo. NeurIPS 2021

We bridge the gap between recurrent neural networks and switching linear dynamical systems with a new model, the Jacobian SLDS.
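For readers unfamiliar with the model class: a vanilla switching linear dynamical system (SLDS) pairs a discrete Markov chain with per-state linear dynamics. This is a minimal generative sketch of a standard SLDS with made-up parameters for illustration, not the Jacobian SLDS from the paper:

```python
import numpy as np

def sample_slds(T, K=2, D=2, seed=0):
    """Sample T steps from a toy switching linear dynamical system.

    K discrete states, each with its own 2D rotation-like dynamics
    matrix; the discrete state z_t follows a sticky Markov chain and
    selects which linear dynamics drive the continuous state x_t.
    """
    rng = np.random.default_rng(seed)
    # Sticky transition matrix: stay in the current state w.p. 0.95
    P = np.full((K, K), 0.05 / (K - 1))
    np.fill_diagonal(P, 0.95)
    # One stable rotation matrix per discrete state
    As = [0.95 * np.array([[np.cos(th), -np.sin(th)],
                           [np.sin(th),  np.cos(th)]])
          for th in np.linspace(0.1, 0.5, K)]
    z = np.zeros(T, dtype=int)
    x = np.zeros((T, D))
    x[0] = rng.normal(size=D)
    for t in range(1, T):
        z[t] = rng.choice(K, p=P[z[t - 1]])                      # discrete switch
        x[t] = As[z[t]] @ x[t - 1] + 0.1 * rng.normal(size=D)    # linear dynamics
    return z, x

z, x = sample_slds(T=200)
print(z.shape, x.shape)  # (200,) (200, 2)
```

Each discrete state thus carves the continuous state space into regimes with locally linear dynamics, which is what makes the SLDS a natural tool for reverse engineering trained RNNs.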


Check out more of our greatest hits here!

Publications

2023

  1. Smith, J. T. H., Mello, S. D., Kautz, J., Linderman, S. W., & Byeon, W. (2023). Convolutional State Space Models for Long-Range Spatiotemporal Modeling. arXiv preprint arXiv:2310.19694 (to appear at NeurIPS 2023).
    arXiv
  2. Hennig, J., Pinto, S. A. R., Yamaguchi, T., Linderman, S. W., Uchida, N., & Gershman, S. J. (2023). Emergence of belief-like representations through reinforcement learning. PLoS Computational Biology (in press). https://doi.org/10.1101/2023.04.04.535512
    bioRxiv
  3. Wang, Y., Degleris, A., Williams, A. H., & Linderman, S. W. (2023). Spatiotemporal Clustering with Neyman-Scott Processes via Connections to Bayesian Nonparametric Mixture Models. Journal of the American Statistical Association (in press).
    arXiv Code
  4. Bukwich, M., Campbell, M. G., Zoltowski, D., Kingsbury, L., Tomov, M. S., Stern, J., Kim, H. G. R., Drugowitsch, J., Linderman, S. W., & Uchida, N. (2023). Competitive integration of time and reward explains value-sensitive foraging decisions and frontal cortex ramping dynamics. bioRxiv.
    bioRxiv
  5. Lawson, D., Li, M., & Linderman, S. (2023). NAS-X: Neural Adaptive Smoothing via Twisting. arXiv preprint arXiv:2308.14864 (to appear at NeurIPS 2023).
    arXiv
  6. Lee, H. D., Warrington, A., Glaser, J. I., & Linderman, S. W. (2023). Switching Autoregressive Low-rank Tensor Models. arXiv preprint arXiv:2306.03291 (to appear at NeurIPS 2023).
    arXiv
  7. Zhao, Y., & Linderman, S. W. (2023). Revisiting Structured Variational Autoencoders. International Conference on Machine Learning (ICML).
    Paper arXiv
  8. Liu, M., Nair, A., Linderman, S. W., & Anderson, D. J. (2023). Periodic hypothalamic attractor-like dynamics during the estrus cycle. bioRxiv. https://doi.org/10.1101/2023.05.22.541741
    bioRxiv
  9. Weinreb, C., Osman, M. A. M., Zhang, L., Lin, S., Pearl, J., Annapragada, S., Conlin, E., Gillis, W. F., Jay, M., Ye, S., Mathis, A., Mathis, M. W., Pereira, T., Linderman*, S. W., & Datta*, S. R. (2023). Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics. bioRxiv. https://doi.org/10.1101/2023.03.16.532307
    bioRxiv
  10. Smith, J. T. H., Warrington, A., & Linderman, S. W. (2023). Simplified State Space Layers for Sequence Modeling. International Conference on Learning Representations (ICLR). Selected for Oral Presentation (top 5% of accepted papers, top 1.5% of all submissions)
    Paper arXiv Code
  11. Markowitz, J., Gillis, W., Jay, M., Wood, J., Harris, R., Cieszkowski, R., Scott, R., Brann, D., Koveal, D., Kuila, T., Weinreb, C., Osman, M., Pinto, S. R., Uchida, N., Linderman, S. W., Sabatini, B., & Datta, S. R. (2023). Spontaneous behavior is structured by reinforcement without exogenous reward. Nature. https://doi.org/10.1038/s41586-022-05611-2
    Paper
  12. Nair, A., Karigo, T., Yang, B., Ganguli, S., Schnitzer, M. J., Linderman, S. W., Anderson, D. J., & Kennedy, A. (2023). An approximate line attractor in the hypothalamus encodes an aggressive state. Cell, 186(1), 178–193.
    Paper bioRxiv

2022

  1. Lawson, D., Raventos, A., Warrington, A., & Linderman, S. (2022). SIXO: Smoothing Inference with Twisted Objectives. Advances in Neural Information Processing Systems. Selected for Oral Presentation
    Paper arXiv Code
  2. Costacurta, J. C., Duncker, L., Sheffer, B., Gillis, W., Weinreb, C., Markowitz, J. E., Datta, S. R., Williams, A. H., & Linderman, S. (2022). Distinguishing discrete and continuous behavioral variability using warped autoregressive HMMs. Advances in Neural Information Processing Systems.
    Paper bioRxiv
  3. Beller, A., Xu, Y., Linderman, S. W., & Gerstenberg, T. (2022). Looking into the past: Eye-tracking mental simulation in physical inference. Proceedings of the Annual Meeting of the Cognitive Science Society, 44(44).
    Paper arXiv
  4. Beron, C. C., Neufeld, S. Q., Linderman*, S. W., & Sabatini*, B. L. (2022). Mice exhibit stochastic and efficient action switching during probabilistic decision making. Proceedings of the National Academy of Sciences, 119(15), e2113961119. https://doi.org/10.1073/pnas.2113961119
    Paper bioRxiv
  5. Lin, A., Witvliet, D., Hernandez-Nunez, L., Linderman, S. W., Samuel, A. D. T., & Venkatachalam, V. (2022). Imaging whole-brain activity to understand behaviour. Nature Reviews Physics, 1–14.
    Paper
  6. Linderman, S. W. (2022). Weighing the evidence in sharp-wave ripples. Neuron, 110(4), 568–570. https://doi.org/10.1016/j.neuron.2022.01.036
    Paper Code

2021

  1. Williams, A. H., & Linderman, S. W. (2021). Statistical neuroscience in the single trial limit. Current Opinion in Neurobiology, 70, 193–205. https://doi.org/10.1016/j.conb.2021.10.008
    Paper arXiv
  2. Smith, J. T. H., Linderman, S. W., & Sussillo, D. (2021). Reverse engineering recurrent neural networks with Jacobian switching linear dynamical systems. Advances in Neural Information Processing Systems (NeurIPS).
    Paper arXiv Code
  3. Williams, A. H., Kunz, E., Kornblith, S., & Linderman, S. W. (2021). Generalized Shape Metrics on Neural Representations. Advances in Neural Information Processing Systems (NeurIPS).
    Paper arXiv Code
  4. Yu, X., Creamer, M. S., Randi, F., Sharma, A. K., Linderman, S. W., & Leifer, A. M. (2021). Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training. eLife, 10, e66410.
    Paper arXiv
  5. Low, I. I. C., Williams, A. H., Campbell, M. G., Linderman, S. W., & Giocomo, L. M. (2021). Dynamic and reversible remapping of network representations in an unchanging environment. Neuron.
    Paper bioRxiv
  6. Zhang, L., Marshall, J. D., Dunn, T., Ölveczky, B., & Linderman, S. W. (2021). Animal pose estimation from video data with a hierarchical von Mises-Fisher-Gaussian model. Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper

2020

  1. Mittal, A., Linderman, S. W., Paisley, J., & Sajda, P. (2020). Bayesian recurrent state space model for rs-fMRI. Machine Learning for Health (ML4H) Workshop at NeurIPS 2020.
    arXiv
  2. Williams, A. H., Degleris, A., Wang, Y., & Linderman, S. W. (2020). Point process models for sequence detection in high-dimensional neural spike trains. Advances in Neural Information Processing Systems (NeurIPS). Selected for Oral Presentation (1.1% of all submissions)
    Paper arXiv Code
  3. Glaser, J. I., Whiteway, M., Cunningham, J. P., Paninski, L., & Linderman, S. W. (2020). Recurrent switching dynamical systems models for multiple interacting neural populations. Advances in Neural Information Processing Systems (NeurIPS).
    Paper bioRxiv Code
  4. Tansey, W., Li, K., Zhang, H., Linderman, S. W., Rabadan, R., Blei, D. M., & Wiggins, C. H. (2020). Dose-response modeling in high-throughput cancer drug screenings: An end-to-end approach. Biostatistics.
    Paper arXiv
  5. Zoltowski, D. M., Pillow, J. W., & Linderman, S. W. (2020). A general recurrent state space framework for modeling neural dynamics during decision-making. Proceedings of the International Conference on Machine Learning (ICML).
    Paper arXiv Code
  6. Johnson*, R. E., Linderman*, S. W., Panier, T., Wee, C. L., Song, E., Herrera, K. J., Miller, A., & Engert, F. (2020). Probabilistic models of larval zebrafish behavior reveal structure on many scales. Current Biology, 30(1), 70–82.
    Paper bioRxiv

2019

  1. Sun*, R., Linderman*, S. W., Kinsella, I., & Paninski, L. (2019). Scalable Bayesian inference of dendritic voltage via spatiotemporal recurrent state space models. Advances in Neural Information Processing Systems (NeurIPS). Selected for Oral Presentation (0.5% of all submissions)
    Paper Code
  2. Apostolopoulou, I., Linderman, S. W., Miller, K., & Dubrawski, A. (2019). Mutually regressive point processes. Advances in Neural Information Processing Systems (NeurIPS).
    Paper Code
  3. Schein, A., Linderman, S. W., Zhou, M., Blei, D., & Wallach, H. (2019). Poisson-randomized gamma dynamical systems. Advances in Neural Information Processing Systems (NeurIPS).
    Paper arXiv Code
  4. Batty*, E., Whiteway*, M., Saxena, S., Biderman, D., Abe, T., Musall, S., Gillis, W., Markowitz, J., Churchland, A., Cunningham, J., Linderman†, S. W., & Paninski†, L. (2019). BehaveNet: nonlinear embedding and Bayesian neural decoding of behavioral videos. Advances in Neural Information Processing Systems (NeurIPS).
    Paper Code
  5. Linderman, S. W., Nichols, A. L. A., Blei, D. M., Zimmer, M., & Paninski, L. (2019). Hierarchical recurrent state space models reveal discrete and continuous dynamics of neural activity in C. elegans. bioRxiv. https://doi.org/10.1101/621540
    bioRxiv
  6. Nassar, J., Linderman, S. W., Park, M., & Bugallo, M. (2019). Tree-structured locally linear dynamics model to uproot Bayesian neural data analysis. Computational and Systems Neuroscience (Cosyne) Abstracts.
  7. Raju, R. V., Li, Z., Linderman, S. W., & Pitkow, X. (2019). Inferring implicit inference. Computational and Systems Neuroscience (Cosyne) Abstracts.
  8. Glaser, J., Linderman, S. W., Whiteway, M., Perich, M., Dekleva, B., Miller, L., & Cunningham, J. P. (2019). State space models for multiple interacting neural populations. Computational and Systems Neuroscience (Cosyne) Abstracts.
  9. Markowitz, J., Gillis, W., Murmann, J., Linderman, S. W., Sabatini, B., & Datta, S. (2019). Resolving the neural mechanisms of reinforcement learning through new behavioral technologies. Computational and Systems Neuroscience (Cosyne) Abstracts.
  10. Linderman, S. W., Sharma, A., Johnson, R. E., & Engert, F. (2019). Point process latent variable models of larval zebrafish behavior. Computational and Systems Neuroscience (Cosyne) Abstracts.
  11. Nassar, J., Linderman, S. W., Bugallo, M., & Park, I. M. (2019). Tree-Structured Recurrent Switching Linear Dynamical Systems for Multi-Scale Modeling. International Conference on Learning Representations (ICLR).
    Paper arXiv

2018

  1. Sharma, A., Johnson, R. E., Engert, F., & Linderman, S. W. (2018). Point process latent variable models of freely swimming larval zebrafish. Advances in Neural Information Processing Systems (NeurIPS).
    Paper Code
  2. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., Peterson, R. E., Peterson, E., Hyun, M., Linderman, S. W., Sabatini, B. L., & Datta, S. R. (2018). The Striatum Organizes 3D Behavior via Moment-to-Moment Action Selection. Cell. https://doi.org/10.1016/j.cell.2018.04.019
    Paper
  3. Linderman, S. W., Nichols, A., Blei, D. M., Zimmer, M., & Paninski, L. (2018). Hierarchical recurrent models reveal latent states of neural activity in C. elegans. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., Peterson, R. E., Peterson, E., Hyun, M., Linderman, S. W., Sabatini, B. L., & Datta, S. R. (2018). Complementary Direct and Indirect Pathway Activity Encodes Sub-Second 3D Pose Dynamics in Striatum. Computational and Systems Neuroscience (Cosyne) Abstracts.
  5. Johnson*, R. E., Linderman*, S. W., Panier, T., Wee, C., Song, E., Herrera, K., Miller, A. C., & Engert, F. (2018). Revealing multiple timescales of structure in larval zebrafish behavior. Computational and Systems Neuroscience (Cosyne) Abstracts.
  6. Mena, G. E., Belanger, D., Linderman, S. W., & Snoek, J. (2018). Learning Latent Permutations with Gumbel-Sinkhorn Networks. International Conference on Learning Representations (ICLR).
    Paper Code
  7. Linderman, S. W., Mena, G. E., Cooper, H., Paninski, L., & Cunningham, J. P. (2018). Reparameterizing the Birkhoff Polytope for Variational Permutation Inference. Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper arXiv
  8. Naesseth, C. A., Linderman, S. W., Ranganath, R., & Blei, D. M. (2018). Variational Sequential Monte Carlo. Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper arXiv Code

2017

  1. Linderman, S. W., Wang, Y., & Blei, D. M. (2017). Bayesian inference for latent Hawkes processes. Advances in Approximate Bayesian Inference Workshop at the 31st Conference on Neural Information Processing Systems.
    Paper
  2. Buchanan, E. K., Lipschitz, A., Linderman, S. W., & Paninski, L. (2017). Quantifying the behavioral dynamics of C. elegans with autoregressive hidden Markov models. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  3. Mena, G. E., Linderman, S. W., Belanger, D., Snoek, J., Cunningham, J. P., & Paninski, L. (2017). Toward Bayesian permutation inference for identifying neurons in C. elegans. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  4. Linderman, S. W., & Johnson, M. J. (2017). Structure-Exploiting Variational Inference for Recurrent Switching Linear Dynamical Systems. IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing.
    Paper
  5. Linderman, S. W., & Blei, D. M. (2017). Comment: A Discussion of “Nonparametric Bayes Modeling of Populations of Networks.” Journal of the American Statistical Association, 112(520), 1543–1547.
    Paper Code
  6. Linderman, S. W., & Gershman, S. J. (2017). Using computational theory to constrain statistical models of neural data. Current Opinion in Neurobiology, 46, 14–24.
    Paper bioRxiv Code
  7. Linderman, S. W., Miller, A. C., Adams, R. P., Blei, D. M., Johnson, M. J., & Paninski, L. (2017). Neuro-behavioral Analysis with Recurrent switching linear dynamical systems. Computational and Systems Neuroscience (Cosyne) Abstracts.
  8. Linderman*, S. W., Johnson*, M. J., Miller, A. C., Adams, R. P., Blei, D. M., & Paninski, L. (2017). Bayesian learning and inference in recurrent switching linear dynamical systems. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper Slides Talk Code
  9. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2017). Reparameterization Gradients through Acceptance-Rejection Sampling Algorithms. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Best Paper Award.
    Paper Blog Post Code

2016

  1. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2016). Rejection sampling variational inference. Advances in Approximate Bayesian Inference Workshop at the 30th Conference on Neural Information Processing Systems.
    Paper
  2. Chen, Z., Linderman, S. W., & Wilson, M. A. (2016). Bayesian Nonparametric Methods For Discovering Latent Structures Of Rat Hippocampal Ensemble Spikes. IEEE Workshop on Machine Learning for Signal Processing.
    Paper Code
  3. Elibol, H. M., Nguyen, V., Linderman, S. W., Johnson, M. J., Hashmi, A., & Doshi-Velez, F. (2016). Cross-Corpora Unsupervised Learning of Trajectories in Autism Spectrum Disorders. Journal of Machine Learning Research, 17(133), 1–38.
    Paper
  4. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2016). Bayesian latent structure discovery from multi-neuron recordings. Advances in Neural Information Processing Systems (NIPS).
    Paper Code
  5. Linderman, S. W. (2016). Bayesian methods for discovering structure in neural spike trains [PhD thesis]. Harvard University. Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology from the International Society for Bayesian Analysis
    Thesis Code
  6. Linderman, S. W., Johnson, M. J., Wilson, M. A., & Chen, Z. (2016). A Bayesian nonparametric approach to uncovering rat hippocampal population codes during spatial navigation. Journal of Neuroscience Methods, 263, 36–47.
    Paper Code
  7. Linderman, S. W., Tucker, A., & Johnson, M. J. (2016). Bayesian Latent State Space Models of Neural Activity. Computational and Systems Neuroscience (Cosyne) Abstracts.

2015

  1. Linderman*, S. W., Johnson*, M. J., & Adams, R. P. (2015). Dependent Multinomial Models Made Easy: Stick-Breaking with the Pólya-gamma Augmentation. Advances in Neural Information Processing Systems (NIPS), 3438–3446.
    Paper arXiv Code
  2. Linderman, S. W., & Adams, R. P. (2015). Scalable Bayesian Inference for Excitatory Point Process Networks. arXiv preprint arXiv:1507.03228.
    arXiv Code
  3. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2015). Inferring structured connectivity from spike trains under negative-binomial generalized linear models. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Johnson, M. J., Linderman, S. W., Datta, S. R., & Adams, R. P. (2015). Discovering switching autoregressive dynamics in neural spike train recordings. Computational and Systems Neuroscience (Cosyne) Abstracts.

2014

  1. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. Advances in Neural Information Processing Systems (NIPS), 2330–2338.
    Paper arXiv
  2. Linderman, S. W. (2014). Discovering Latent States of the Hippocampus with Bayesian Hidden Markov Models. CBMM Memo 024: Abstracts of the Brains, Minds, and Machines Summer School.
    Paper
  3. Linderman, S. W., & Adams, R. P. (2014). Discovering Latent Network Structure in Point Process Data. Proceedings of the International Conference on Machine Learning (ICML), 1413–1421.
    Paper arXiv Talk Code
  4. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. Annual Meeting of the Society for Neuroscience.
    Paper
  5. Nemati, S., Linderman, S. W., & Chen, Z. (2014). A Probabilistic Modeling Approach for Uncovering Neural Population Rotational Dynamics. Computational and Systems Neuroscience (Cosyne) Abstracts.

2013

  1. Linderman, S. W., & Adams, R. P. (2013). Fully-Bayesian Inference of Structured Functional Networks in GLMs. Acquiring and Analyzing the Activity of Large Neural Ensembles Workshop at Neural Information Processing Systems (NIPS).
  2. Linderman, S. W., & Adams, R. P. (2013). Discovering structure in spiking networks. New England Machine Learning Day.
  3. Linderman, S. W., & Adams, R. P. (2013). Inferring functional connectivity with priors on network topology. Computational and Systems Neuroscience (Cosyne) Abstracts.