Less is more: regularization perspectives on large scale machine learning
Modern datasets are often huge, possibly high-dimensional, and require complex non-linear parameterization to be modeled accurately. Examples include image and audio classification, but also data analysis problems in the natural sciences, e.g. high energy physics or biology. Deep learning based techniques provide a possible solution, at the expense of theoretical guidance and, especially, of computational requirements. It is then a key challenge for large scale machine learning to devise approaches that are guaranteed to be accurate and yet computationally efficient. In this talk, we will consider a regularization perspective on machine learning, appealing to classical ideas in linear algebra and inverse problems to dramatically scale up nonparametric methods such as kernel methods, often dismissed because of prohibitive costs. Our analysis derives optimal theoretical guarantees while providing experimental results on par with or outperforming state-of-the-art approaches.
Main author: | Rosasco, Lorenzo |
---|---|
Language: | eng |
Published: | 2017 |
Subjects: | EP-IT Data science seminars |
Online access: | http://cds.cern.ch/record/2269969 |
_version_ | 1780954813983358976 |
---|---|
author | Rosasco, Lorenzo |
author_facet | Rosasco, Lorenzo |
author_sort | Rosasco, Lorenzo |
collection | CERN |
description | Modern datasets are often huge, possibly high-dimensional, and require complex non-linear parameterization to be modeled accurately. Examples include image and audio classification, but also data analysis problems in the natural sciences, e.g. high energy physics or biology. Deep learning based techniques provide a possible solution, at the expense of theoretical guidance and, especially, of computational requirements. It is then a key challenge for large scale machine learning to devise approaches that are guaranteed to be accurate and yet computationally efficient. In this talk, we will consider a regularization perspective on machine learning, appealing to classical ideas in linear algebra and inverse problems to dramatically scale up nonparametric methods such as kernel methods, often dismissed because of prohibitive costs. Our analysis derives optimal theoretical guarantees while providing experimental results on par with or outperforming state-of-the-art approaches. |
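The abstract's central claim is that classical regularization plus linear-algebra tricks can make kernel methods tractable at scale. As a minimal illustrative sketch (our own, not the speaker's code; all function and parameter names here are assumptions), the following shows kernel ridge regression with Nyström subsampling, one standard way to replace an O(n³) exact kernel solve with an O(nm²) solve over m ≪ n landmark points:

```python
# Sketch only: Nystrom-subsampled kernel ridge regression.
# Illustrates the generic "subsample + regularize" idea; not the
# speaker's actual algorithm or implementation.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def nystrom_krr(X, y, m=50, lam=1e-3, sigma=1.0, rng=None):
    """Fit regularized kernel regression using m << n landmark points.
    Cost is O(n m^2) instead of the O(n^3) of the exact kernel solve."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)  # random landmarks
    Xm = X[idx]
    Knm = gaussian_kernel(X, Xm, sigma)   # n x m cross-kernel
    Kmm = gaussian_kernel(Xm, Xm, sigma)  # m x m landmark kernel
    # Solve the m x m regularized normal equations:
    # (Knm^T Knm + n*lam*Kmm) alpha = Knm^T y
    alpha = np.linalg.solve(Knm.T @ Knm + len(X) * lam * Kmm, Knm.T @ y)
    return lambda Xnew: gaussian_kernel(Xnew, Xm, sigma) @ alpha

# Toy usage: regress a noisy sine from 500 one-dimensional inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
predict = nystrom_krr(X, y, m=50, lam=1e-4, sigma=0.5, rng=1)
mse = np.mean((predict(X) - y) ** 2)
```

The regularization parameter `lam` and the number of landmarks `m` jointly control the accuracy/cost trade-off, which is the "less is more" tension the talk examines.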
id | cern-2269969 |
institution | European Organization for Nuclear Research |
language | eng |
publishDate | 2017 |
record_format | invenio |
spelling | cern-2269969 2022-11-02T22:31:44Z http://cds.cern.ch/record/2269969 oai:cds.cern.ch:2269969 2017 |
title | Less is more: regularization perspectives on large scale machine learning |
topic | EP-IT Data science seminars |
url | http://cds.cern.ch/record/2269969 |