
Thursday, July 10, 2025 - 16:15 in U2-232


From Hypothesis Testing to Distribution Estimation

A talk in the Oberseminar Probability Theory and Mathematical Statistics series by
Nikita Zhivotovskiy from the University of California, Berkeley

Abstract: Distinguishing between two distributions based on observed data is a classical problem in statistics and computer science. But what if we aim to go further—not just test, but actually estimate a distribution close to the true one in, say, Kullback-Leibler divergence? Can we do this knowing only that the true distribution lies in a known class, without structural assumptions on the individual densities? In this talk, I will review classical results and present recent developments on this question. The focus will be on high-probability error bounds that are optimal up to constants in this general setting.
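
For readers unfamiliar with the divergence mentioned above, a standard definition (not specific to this talk): the Kullback-Leibler divergence between the true density p and an estimate q is KL(p || q) = ∫ p(x) log( p(x) / q(x) ) dx. A "high-probability error bound" in this setting is, schematically, a guarantee of the form P( KL(p || p̂_n) ≤ C · r_n(𝒫, δ) ) ≥ 1 − δ, where p̂_n is the estimator built from n observations, 𝒫 is the known class, δ is the failure probability, and C is an absolute constant; the precise rate r_n is the subject of the talk.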



