Erik M Lindgren

I am a Software Engineer at Databricks in New York. Before that, I worked at Google Research. I received my PhD from the University of Texas at Austin in the Department of Electrical and Computer Engineering, where I was advised by Prof. Alex Dimakis. I received my bachelor's degree from Boston University. I am interested in machine learning, combinatorial optimization, and information theory.

Publications

Efficient Training of Retrieval Models Using Negative Cache
E. M. Lindgren, S. Reddi, R. Guo, S. Kumar
Neural Information Processing Systems, 2021
[paper]

Composing Normalizing Flows for Inverse Problems
J. Whang, E. M. Lindgren, A. G. Dimakis
International Conference on Machine Learning, 2021
+ Best Paper Award at UAI 2021 Workshop on Tractable Probabilistic Modeling
[paper]

Accelerating Large-Scale Inference with Anisotropic Vector Quantization
R. Guo, P. Sun, E. M. Lindgren, Q. Geng, D. Simcha, F. Chern, S. Kumar
International Conference on Machine Learning, 2020
[paper] [code]

On Robust Learning of Ising Models
E. M. Lindgren, V. Shah, Y. Shen, A. G. Dimakis, A. Klivans
NeurIPS Workshop on Relational Representation Learning, 2018
[paper]

Experimental Design for Cost-Aware Learning of Causal Graphs
E. M. Lindgren, M. Kocaoglu, A. G. Dimakis, S. Vishwanath
Neural Information Processing Systems, 2018
[paper] [code] [video] [poster]

Exact MAP Inference by Avoiding Fractional Vertices
E. M. Lindgren, A. G. Dimakis, A. Klivans
International Conference on Machine Learning, 2017
[paper]

Leveraging Sparsity for Efficient Submodular Data Summarization
E. M. Lindgren, S. Wu, A. G. Dimakis
Neural Information Processing Systems, 2016
[paper] [video]

A Rule-Based Design Specification Language for Synthetic Biology
E. Oberortner, S. Bhatia, E. M. Lindgren, D. Densmore
ACM Journal on Emerging Technologies in Computing Systems, 2014
[paper]