Preprints

  1. Niles-Weed, J., and Zadik, I. (2021), “It was ‘all’ for ‘nothing’: sharp phase transitions for noiseless discrete channels.” [PDF]
  2. Chen, H.-B., Chewi, S., and Niles-Weed, J. (2021), “Dimension-free log-Sobolev inequalities for mixture distributions.” [PDF]
  3. Huang, D., Niles-Weed, J., and Ward, R. (2021), “Streaming k-PCA: Efficient guarantees for Oja’s algorithm, beyond rank-one updates.” [PDF]
  4. Altschuler, D. J., and Niles-Weed, J. (2021), “The Discrepancy of Random Rectangular Matrices.” [PDF]
  5. Mena, G., Nejatbakhsh, A., Varol, E., and Niles-Weed, J. (2020), “Sinkhorn EM: An Expectation-Maximization algorithm based on entropic optimal transport.” [PDF]
  6. Huang, D., Niles-Weed, J., Tropp, J. A., and Ward, R. (2020), “Matrix Concentration for Products.” [video] [PDF]
  7. Niles-Weed, J., and Rigollet, P. (2019), “Estimation of Wasserstein distances in the Spiked Transport Model.” [video] [PDF]
  8. Bandeira, A. S., Blum-Smith, B., Kileel, J., Perry, A., Weed, J., and Wein, A. S. (2017), “Estimation under group actions: recovering orbits from invariants.” [PDF]

Conference Articles

  1. Niles-Weed, J., and Zadik, I. (2020), “The All-or-Nothing Phenomenon in Sparse Tensor PCA,” in Advances in Neural Information Processing Systems 33 (NeurIPS 2020). [PDF]
  2. Liu, S., Niles-Weed, J., Razavian, N., and Fernandez-Granda, C. (2020), “Early-Learning Regularization Prevents Memorization of Noisy Labels,” in Advances in Neural Information Processing Systems 33 (NeurIPS 2020). [PDF]
  3. Cuturi, M., Teboul, O., Niles-Weed, J., and Vert, J.-P. (2020), “Supervised Quantile Normalization for Low-rank Matrix Approximation,” in Thirty-seventh International Conference on Machine Learning (ICML 2020). [PDF]
  4. Altschuler, J., Bach, F., Rudi, A., and Weed, J. (2019), “Massively scalable Sinkhorn distances via the Nyström method,” in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). [PDF]
  5. Mena, G., and Weed, J. (2019), “Statistical bounds for entropic optimal transport: sample complexity and the central limit theorem,” in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). [PDF] (Selected for spotlight presentation)
  6. Weed, J., and Berthet, Q. (2019), “Estimation of smooth densities in Wasserstein distance,” in Proceedings of the 32nd Conference on Learning Theory (COLT 2019). [PDF]
  7. Goldfeld, Z., Greenewald, K., Weed, J., and Polyanskiy, Y. (2019), “Optimality of the plug-in estimator for differential entropy estimation under Gaussian convolutions,” in 2019 IEEE International Symposium on Information Theory (ISIT).
  8. Forrow, A., Hütter, J.-C., Nitzan, M., Rigollet, P., Schiebinger, G., and Weed, J. (2019), “Statistical optimal transport via factored couplings,” in 22nd International Conference on Artificial Intelligence and Statistics (AISTATS 2019). [PDF]
  9. Weed, J. (2018), “An explicit analysis of the entropic penalty in linear programming,” in Proceedings of the 31st Conference on Learning Theory (COLT 2018). [video] [PDF]
  10. Mao, C., Weed, J., and Rigollet, P. (2018), “Minimax rates and efficient algorithms for noisy sorting,” in Algorithmic Learning Theory (ALT 2018). [PDF]
  11. Altschuler, J., Weed, J., and Rigollet, P. (2017), “Near-linear time approximation algorithms for optimal transport via Sinkhorn iteration,” in Advances in Neural Information Processing Systems 30 (NIPS 2017). [PDF] (Selected for spotlight presentation)
  12. Weed, J., Perchet, V., and Rigollet, P. (2016), “Online learning in repeated auctions,” in Proceedings of the 29th Conference on Learning Theory (COLT 2016). [video] [PDF]

Journal Articles

  1. Chen, H.-B., and Niles-Weed, J. (2021), “Asymptotics of smoothed Wasserstein distances,” Potential Analysis. To appear. [PDF]
  2. Klassen, S., Carter, A. K., Evans, D. H., Ortman, S., Stark, M. T., Loyless, A. A., Polkinghorne, M., Heng, P., Hill, M., Wijker, P., Niles-Weed, J., Marriner, G. P., Pottier, C., and Fletcher, R. J. (2021), “Diachronic modeling of the population within the medieval Greater Angkor Region settlement complex,” Science Advances, 7(19).
  3. Goldfeld, Z., Greenewald, K., Polyanskiy, Y., and Weed, J. (2020), “Convergence of smoothed empirical measures with applications to entropy estimation,” IEEE Trans. Inform. Theory, 66(7), 4368–4391. [PDF]
  4. Weed, J., and Bach, F. (2019), “Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance,” Bernoulli, 25(4A), 2620–2648. [PDF]
  5. Rigollet, P., and Weed, J. (2019), “Uncoupled isotonic regression via minimum Wasserstein deconvolution,” Inf. Inference, 8(4), 691–717. [video] [PDF]
  6. Perry, A., Weed, J., Bandeira, A. S., Rigollet, P., and Singer, A. (2019), “The Sample Complexity of Multireference Alignment,” SIAM J. Math. Data Sci., 1(3), 497–517. [PDF]
  7. Bandeira, A., Rigollet, P., and Weed, J. (2019), “Optimal rates of estimation for multi-reference alignment,” Mathematical Statistics and Learning, 2, 25–75. [PDF]
  8. Weed, J. (2018), “Approximately certifying the restricted isometry property is hard,” IEEE Trans. Inform. Theory, 64(8), 5488–5497.
  9. Klassen, S., Weed, J., and Evans, D. (2018), “Semi-supervised machine learning approaches for predicting the chronology of archaeological sites: A case study of temples from medieval Angkor, Cambodia,” PLoS ONE, 13(11).
  10. Rigollet, P., and Weed, J. (2018), “Entropic optimal transport is maximum-likelihood deconvolution,” Comptes Rendus Mathématique, 356(11-12), 1228–1235.
  11. Sawhney, M., and Weed, J. (2017), “Further results on arc and bar \(k\)-visibility graphs,” The Minnesota Journal of Undergraduate Mathematics, 3(1). Project mentored through MIT PRIMES. [PDF]
  12. Woo, A. (2009), “Permutations with Kazhdan-Lusztig polynomial \(P_{id,w}(q)=1+q^h\),” Electronic Journal of Combinatorics, 16(2). With an appendix by S. Billey and J. Weed. [PDF]

Book Chapters

  1. Weed, J. (2017), “Multinational War is Hard,” in The Mathematics of Various Entertaining Subjects, eds. J. Beineke and J. Rosenhouse, Princeton University Press. [PDF]

Miscellaneous

  1. Weed, J. (2018), “Sharper rates for estimating differential entropy under Gaussian convolutions.” [PDF]