DSpace@MIT
Statistics in high dimensions without IID samples: truncated statistics and minimax optimization

Author(s)
Zampetakis, Emmanouil.
Download
1227782503-MIT.pdf (4.239Mb)
Other Contributors
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor
Constantinos Daskalakis.
Terms of use
MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. http://dspace.mit.edu/handle/1721.1/7582
Abstract
A key assumption underlying many methods in Statistics and Machine Learning is that data points are independently and identically distributed (i.i.d.). However, this assumption ignores two recurring challenges in data collection: (1) censoring and truncation of data, and (2) strategic behavior. In this thesis we provide a mathematical and computational theory for designing solutions, or proving impossibility results, for challenges (1) and (2). For the classical statistical problem of estimation from truncated samples, we provide the first statistically and computationally efficient algorithms that provably recover an estimate of the whole non-truncated population. We design algorithms both in the parametric setting, e.g., Gaussian estimation and linear regression, and in the non-parametric setting. In the non-parametric setting, we provide techniques that bound the extrapolation error of multivariate polynomial log-densities. Our main tool for the non-parametric setting is a Statistical Taylor Theorem that is based on sample access to a probability distribution with a smooth probability density function. We believe that this theorem can have applications beyond the topic of Truncated Statistics. In the context of learning from strategic behavior, we consider the problem of min-max optimization, a central problem in Game Theory and Statistics that has recently found interesting applications in Machine Learning tasks such as Generative Adversarial Networks. Our first result is the PPAD-hardness of min-max optimization, which implies the first exponential separation between finding an approximate local minimum and finding an approximate local min-max solution. Our second result is a second-order algorithm that provably asymptotically converges to an approximate local min-max solution. Given our PPAD-hardness result, an algorithm with stronger guarantees is out of reach.
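
To make the truncated-estimation setting concrete, here is a minimal illustrative sketch, not taken from the thesis: it recovers the parameters of an untruncated 1-D Gaussian by maximizing the truncated log-likelihood over a known observation window [a, b]. All numerical values and variable names are hypothetical; the thesis treats far more general (and high-dimensional) settings.

```python
# Illustrative sketch (not the thesis's algorithm): maximum-likelihood
# estimation of a 1-D Gaussian from samples truncated to a known
# interval [a, b], by maximizing the truncated log-likelihood directly.
import numpy as np
from scipy.stats import norm, truncnorm
from scipy.optimize import minimize

# Hypothetical ground truth and truncation window.
mu_true, sigma_true = 1.0, 2.0
a, b = 0.0, 3.0

# Samples are only observed when they fall inside [a, b].
a_std, b_std = (a - mu_true) / sigma_true, (b - mu_true) / sigma_true
x = truncnorm.rvs(a_std, b_std, loc=mu_true, scale=sigma_true,
                  size=5000, random_state=0)

def neg_log_likelihood(theta):
    """Negative truncated-Gaussian log-likelihood in (mu, log sigma)."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    # Mass the untruncated Gaussian places on the observation window.
    mass = norm.cdf(b, mu, sigma) - norm.cdf(a, mu, sigma)
    return -(norm.logpdf(x, mu, sigma).sum() - len(x) * np.log(mass))

# Recover estimates of the *untruncated* population parameters.
res = minimize(neg_log_likelihood, x0=np.array([x.mean(), np.log(x.std())]))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu_hat={mu_hat:.3f}, sigma_hat={sigma_hat:.3f}")
```

The point of the sketch is the correction term involving the window mass: without it, the naive MLE estimates the truncated population rather than the whole one.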
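
For the min-max side, the toy example below (again not the thesis's second-order method) shows why plain first-order dynamics can fail: on the bilinear game f(x, y) = x * y, simultaneous gradient descent-ascent spirals away from the min-max point (0, 0), while an extragradient update converges to it.

```python
# Toy illustration: gradient descent-ascent (GDA) vs. extragradient
# on the bilinear game f(x, y) = x * y, whose min-max point is (0, 0).
import numpy as np

def grad(x, y):
    # df/dx = y (min player descends), df/dy = x (max player ascends).
    return y, x

eta = 0.1
x_gda, y_gda = 1.0, 1.0   # GDA iterate
x_eg,  y_eg  = 1.0, 1.0   # extragradient iterate

for _ in range(200):
    # Simultaneous gradient descent-ascent.
    gx, gy = grad(x_gda, y_gda)
    x_gda, y_gda = x_gda - eta * gx, y_gda + eta * gy

    # Extragradient: take a half-step, then update with the midpoint gradient.
    gx, gy = grad(x_eg, y_eg)
    xm, ym = x_eg - eta * gx, y_eg + eta * gy
    gx, gy = grad(xm, ym)
    x_eg, y_eg = x_eg - eta * gx, y_eg + eta * gy

print("GDA distance from (0,0):         ", np.hypot(x_gda, y_gda))  # grows
print("Extragradient distance from (0,0):", np.hypot(x_eg, y_eg))   # shrinks
```

This only illustrates the instability of naive dynamics; the thesis's hardness and convergence results concern the much harder problem of approximate local min-max solutions in general nonconvex-nonconcave objectives.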
Description
Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, September 2020
 
Cataloged from student-submitted PDF of thesis.
 
Includes bibliographical references (pages 367-380).
 
Date issued
2020
URI
https://hdl.handle.net/1721.1/129316
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Electrical Engineering and Computer Science.

Collections
  • Doctoral Theses