Stochastic pursuit algorithms and empirical concentration inequalities for statistical learning.

Authors
  • PEEL Thomas
  • RALAIVOLA Liva
  • ANTHOINE Sandrine
  • DENIS Francois
  • KOWALSKI Matthieu
  • DEBREUVE Eric
  • DAUDET Laurent
  • VAYATIS Nicolas
Publication date
2013
Publication type
Thesis
Summary The first part of this thesis introduces new algorithms for sparse signal decomposition. Based on Matching Pursuit (MP), they address the following problem: how to reduce the computational cost of the often very expensive MP selection step. In response, we subsample the dictionary at each iteration, in both rows and columns. We show that this theoretically grounded approach performs well in practice. We then propose an iterative block gradient descent algorithm for feature selection in multi-class classification. It relies on error-correcting output codes that transform the problem into one of simultaneous sparse signal representation. The second part presents new empirical concentration inequalities of Bernstein type. The first concern the theory of U-statistics and are used to derive generalization bounds for ranking algorithms. These bounds exploit a variance estimator for which we propose an efficient computation algorithm. We then present an empirical version of the Bernstein-type inequality proposed by Freedman [1975] for martingales. Here again, the strength of our bound lies in the introduction of a variance estimator computable from the data. This allows us to derive generalization bounds for online learning algorithms that improve on the state of the art, and opens the door to a new family of learning algorithms exploiting this empirical information.
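The dictionary-subsampling idea from the first part can be illustrated with a minimal sketch. This is not the thesis's exact algorithm; it is a plain Matching Pursuit loop in which, at each iteration, the atom-selection step scores only a random subset of dictionary rows and columns (the function name, parameters, and sampling fractions below are illustrative assumptions).

```python
import numpy as np

def subsampled_mp(x, D, n_iter=10, row_frac=0.5, col_frac=0.5, seed=0):
    """Greedy sparse decomposition of x over dictionary D (columns = atoms).

    Illustrative sketch: the selection step scores only a random subsample
    of the dictionary's rows and columns, as a cheap proxy for the full
    inner-product search; the coefficient update uses the full atom.
    """
    rng = np.random.default_rng(seed)
    m, n = D.shape
    residual = x.astype(float).copy()
    coeffs = np.zeros(n)
    for _ in range(n_iter):
        rows = rng.choice(m, size=max(1, int(row_frac * m)), replace=False)
        cols = rng.choice(n, size=max(1, int(col_frac * n)), replace=False)
        # Pick the best atom among the sampled columns, using only sampled rows.
        scores = D[np.ix_(rows, cols)].T @ residual[rows]
        k = cols[np.argmax(np.abs(scores))]
        # Standard MP update on the full selected atom.
        atom = D[:, k]
        c = atom @ residual / (atom @ atom)
        coeffs[k] += c
        residual -= c * atom
    return coeffs, residual
```

Because each update projects the residual onto the selected atom, the residual norm never increases; the trade-off studied in the thesis is how aggressively one can subsample while still selecting good atoms.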
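To make the second part concrete, here is the classical empirical Bernstein confidence radius of Maurer and Pontil (2009) for i.i.d. bounded variables. It is not the thesis's result (which extends this flavour of bound to U-statistics and to Freedman-type martingale settings), but it shows the key ingredient the summary highlights: a variance term computed from the data rather than an a priori variance bound.

```python
import math
import numpy as np

def empirical_bernstein_radius(sample, b, delta):
    """Maurer-Pontil empirical Bernstein radius for the mean of n i.i.d.
    variables taking values in [0, b]: with probability >= 1 - delta,
    |sample mean - true mean| <= radius.
    """
    n = len(sample)
    v = np.var(sample, ddof=1)  # data-dependent variance term
    log_term = math.log(2.0 / delta)
    return math.sqrt(2.0 * v * log_term / n) + 7.0 * b * log_term / (3.0 * (n - 1))
```

When the sample variance is small, the first (square-root) term shrinks accordingly, which is exactly the advantage over a plain Hoeffding radius that only sees the range b.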