Chen Lu
Verified email at mit.edu - Homepage
Title | Cited by | Year
Optimal dimension dependence of the Metropolis-adjusted Langevin algorithm
S Chewi, C Lu, K Ahn, X Cheng, T Le Gouic, P Rigollet
Conference on Learning Theory, 1260-1300, 2021
61 | 2021
SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence
S Chewi, T Le Gouic, C Lu, T Maunu, P Rigollet
Advances in Neural Information Processing Systems 33, 2098-2109, 2020
60 | 2020
Exponential ergodicity of mirror-Langevin diffusions
S Chewi, T Le Gouic, C Lu, T Maunu, P Rigollet, A Stromme
Advances in Neural Information Processing Systems 33, 19573-19585, 2020
48 | 2020
Contextual stochastic block model: Sharp thresholds and contiguity
C Lu, S Sen
Journal of Machine Learning Research 24 (54), 1-34, 2023
18 | 2023
Robust nonparametric difference-in-differences estimation
C Lu, X Nie, S Wager
arXiv preprint arXiv:1905.11622, 2019
17 | 2019
The query complexity of sampling from strongly log-concave distributions in one dimension
S Chewi, PR Gerber, C Lu, T Le Gouic, P Rigollet
Conference on Learning Theory, 2041-2059, 2022
13 | 2022
Nonparametric heterogeneous treatment effect estimation in repeated cross-sectional designs
X Nie, C Lu, S Wager
arXiv preprint arXiv:1905.11622, 2019
10 | 2019
Fisher information lower bounds for sampling
S Chewi, P Gerber, H Lee, C Lu
International Conference on Algorithmic Learning Theory, 375-410, 2023
8 | 2023
Query lower bounds for log-concave sampling
S Chewi, JDD Pont, J Li, C Lu, S Narayanan
2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS …, 2023
6 | 2023
Rejection sampling from shape-constrained distributions in sublinear time
S Chewi, PR Gerber, C Lu, T Le Gouic, P Rigollet
International Conference on Artificial Intelligence and Statistics, 2249-2265, 2022
5 | 2022
Fisher information lower bounds for sampling
S Chewi, P Gerber, H Lee, C Lu
arXiv preprint arXiv:2210.02482, 2022
4 | 2022
Upper and Lower Bounds for Sampling
C Lu
Massachusetts Institute of Technology, 2023
- | 2023