The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems
Main author: | Hildebrandt, Mireille |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Artificial Intelligence |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9096719/ https://www.ncbi.nlm.nih.gov/pubmed/35573902 http://dx.doi.org/10.3389/frai.2022.789076 |
_version_ | 1784706038350479360 |
---|---|
author | Hildebrandt, Mireille |
author_facet | Hildebrandt, Mireille |
author_sort | Hildebrandt, Mireille |
collection | PubMed |
description | Recommendations are meant to increase sales or ad revenue, as these are the first priority of those who pay for them. As recommender systems match their recommendations with inferred preferences, we should not be surprised if the algorithm optimizes for lucrative preferences and thus co-produces the preferences they mine. This relates to the well-known problems of feedback loops, filter bubbles, and echo chambers. In this article, I discuss the implications of the fact that computing systems necessarily work with proxies when inferring recommendations and raise a number of questions about whether recommender systems actually do what they are claimed to do, while also analysing the often-perverse economic incentive structures that have a major impact on relevant design decisions. Finally, I will explain how the choice architectures for data controllers and providers of AI systems as foreseen in the EU's General Data Protection Regulation (GDPR), the proposed EU Digital Services Act (DSA) and the proposed EU AI Act will help to break through various vicious circles, by constraining how people may be targeted (GDPR, DSA) and by requiring documented evidence of the robustness, resilience, reliability, and the responsible design and deployment of high-risk recommender systems (AI Act). |
format | Online Article Text |
id | pubmed-9096719 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-9096719 2022-05-13 The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems Hildebrandt, Mireille Front Artif Intell Artificial Intelligence Frontiers Media S.A. 2022-04-28 /pmc/articles/PMC9096719/ /pubmed/35573902 http://dx.doi.org/10.3389/frai.2022.789076 Text en Copyright © 2022 Hildebrandt. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Artificial Intelligence Hildebrandt, Mireille The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title | The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title_full | The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title_fullStr | The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title_full_unstemmed | The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title_short | The Issue of Proxies and Choice Architectures. Why EU Law Matters for Recommender Systems |
title_sort | issue of proxies and choice architectures. why eu law matters for recommender systems |
topic | Artificial Intelligence |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9096719/ https://www.ncbi.nlm.nih.gov/pubmed/35573902 http://dx.doi.org/10.3389/frai.2022.789076 |
work_keys_str_mv | AT hildebrandtmireille theissueofproxiesandchoicearchitectureswhyeulawmattersforrecommendersystems AT hildebrandtmireille issueofproxiesandchoicearchitectureswhyeulawmattersforrecommendersystems |