Ecole d’été de probabilités de Saint-Flour XXXI
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view has its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included; they are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
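As an illustrative aside (not drawn from the book itself), the two objects the description names can be sketched as follows: a Gibbs posterior reweights a prior π on the model class by exponentiated empirical risk, and entropy enters through the Kullback-Leibler term of a PAC-Bayesian bound. One standard (McAllester-type) form for losses in [0,1] with n i.i.d. observations is:

```latex
% Sketch only; notation assumed, not quoted from the book.
% Gibbs posterior: prior \pi reweighted by the empirical risk r_n
% at inverse temperature \beta > 0.
\[
  \rho_\beta(d\theta)
  = \frac{e^{-\beta r_n(\theta)}\,\pi(d\theta)}
         {\int_\Theta e^{-\beta r_n(\theta')}\,\pi(d\theta')}.
\]
% McAllester-type PAC-Bayesian bound: with probability at least
% 1 - \delta, simultaneously for every posterior \rho,
\[
  \mathbb{E}_{\theta\sim\rho}[R(\theta)]
  \le \mathbb{E}_{\theta\sim\rho}[r_n(\theta)]
    + \sqrt{\frac{\operatorname{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
\]
% where R is the true risk, r_n the empirical risk, and KL the
% Kullback-Leibler divergence (the "entropy" the description refers to).
```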
Main author: | Picard, Jean |
---|---|
Language: | eng |
Published: | Springer, 2004 |
Subjects: | Mathematical Physics and Mathematics |
Online access: | https://dx.doi.org/10.1007/b99352 http://cds.cern.ch/record/1695914 |
_version_ | 1780936011465883648 |
---|---|
author | Picard, Jean |
author_facet | Picard, Jean |
author_sort | Picard, Jean |
collection | CERN |
description | Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view has its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included; they are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results. |
id | cern-1695914 |
institution | European Organization for Nuclear Research (CERN) |
language | eng |
publishDate | 2004 |
publisher | Springer |
record_format | invenio |
spelling | cern-1695914 | 2021-04-25T16:39:34Z | doi:10.1007/b99352 | http://cds.cern.ch/record/1695914 | eng | Picard, Jean | Ecole d’été de probabilités de Saint-Flour XXXI | Mathematical Physics and Mathematics | Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view has its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included; they are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results. | Springer | oai:cds.cern.ch:1695914 | 2004 |
spellingShingle | Mathematical Physics and Mathematics Picard, Jean Ecole d’été de probabilités de Saint-Flour XXXI |
title | Ecole d’été de probabilités de Saint-Flour XXXI |
title_full | Ecole d’été de probabilités de Saint-Flour XXXI |
title_fullStr | Ecole d’été de probabilités de Saint-Flour XXXI |
title_full_unstemmed | Ecole d’été de probabilités de Saint-Flour XXXI |
title_short | Ecole d’été de probabilités de Saint-Flour XXXI |
title_sort | ecole d’été de probabilités de saint-flour xxxi |
topic | Mathematical Physics and Mathematics |
url | https://dx.doi.org/10.1007/b99352 http://cds.cern.ch/record/1695914 |
work_keys_str_mv | AT picardjean ecoledetedeprobabilitesdesaintflourxxxi |