
Crowdsourcing as a Screening Tool to Detect Clinical Features of Glaucomatous Optic Neuropathy from Digital Photography


Bibliographic Details
Main Authors: Mitry, Danny, Peto, Tunde, Hayat, Shabina, Blows, Peter, Morgan, James, Khaw, Kay-Tee, Foster, Paul J.
Format: Online Article Text
Language: English
Published: Public Library of Science 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4334897/
https://www.ncbi.nlm.nih.gov/pubmed/25692287
http://dx.doi.org/10.1371/journal.pone.0117401
author Mitry, Danny
Peto, Tunde
Hayat, Shabina
Blows, Peter
Morgan, James
Khaw, Kay-Tee
Foster, Paul J.
author_sort Mitry, Danny
collection PubMed
description AIM: Crowdsourcing is the process of simplifying and outsourcing numerous tasks to many untrained individuals. Our aim was to assess the performance and repeatability of crowdsourcing in the classification of normal and glaucomatous discs from optic disc images. METHODS: Optic disc images (N = 127) with pre-determined disease status were selected by consensus agreement of grading experts from a large cohort study. After reading brief illustrative instructions, we requested that knowledge workers (KWs) from a crowdsourcing platform (Amazon MTurk) classify each image as normal or abnormal. Each image was classified 20 times by different KWs. Two study designs were examined to assess the effect of varying KW experience, and both study designs were conducted twice for consistency. Performance was assessed by comparing the sensitivity, specificity and area under the receiver operating characteristic curve (AUC). RESULTS: Overall, 2,540 classifications were received in under 24 hours at minimal cost. The sensitivity ranged between 83% and 88% across both trials and study designs; however, the specificity was poor, ranging between 35% and 43%. In trial 1, the highest AUC (95% CI) was 0.64 (0.62–0.66), and in trial 2 it was 0.63 (0.61–0.65). There were no significant differences between study designs or between trials. CONCLUSIONS: Crowdsourcing represents a cost-effective method of image analysis which demonstrates good repeatability and high sensitivity. Optimisation of variables such as reward schemes, mode of image presentation, expanded response options and incorporation of training modules should be examined to determine their effect on the accuracy and reliability of this technique in retinal image analysis.
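The analysis described in the abstract (20 crowd classifications per image, scored against an expert reference standard) can be illustrated with a minimal sketch. The vote data and the majority-vote aggregation rule here are assumptions for illustration, not the authors' published analysis code.

```python
# Minimal sketch: aggregate repeated crowd labels per image by majority vote,
# then score sensitivity and specificity against the expert reference standard.
# The toy vote data and the majority-vote rule are illustrative assumptions.

from collections import Counter

def majority_vote(labels):
    """Return the most common label ('normal' or 'abnormal') among crowd votes."""
    return Counter(labels).most_common(1)[0][0]

def sensitivity_specificity(truth, predicted):
    """truth/predicted: dicts mapping image id -> 'normal' or 'abnormal'."""
    tp = sum(1 for i in truth if truth[i] == "abnormal" and predicted[i] == "abnormal")
    fn = sum(1 for i in truth if truth[i] == "abnormal" and predicted[i] == "normal")
    tn = sum(1 for i in truth if truth[i] == "normal" and predicted[i] == "normal")
    fp = sum(1 for i in truth if truth[i] == "normal" and predicted[i] == "abnormal")
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 20 votes per image, as in the study design.
votes = {
    "img1": ["abnormal"] * 17 + ["normal"] * 3,
    "img2": ["normal"] * 9 + ["abnormal"] * 11,   # crowd over-calls disease
    "img3": ["normal"] * 15 + ["abnormal"] * 5,
}
truth = {"img1": "abnormal", "img2": "normal", "img3": "normal"}

predicted = {img: majority_vote(v) for img, v in votes.items()}
sens, spec = sensitivity_specificity(truth, predicted)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")  # sensitivity=1.00 specificity=0.50
```

The toy numbers mimic the study's pattern: the crowd rarely misses a truly abnormal disc (high sensitivity) but over-calls normal discs as abnormal (low specificity).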
format Online
Article
Text
id pubmed-4334897
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-4334897 2015-02-24 Crowdsourcing as a Screening Tool to Detect Clinical Features of Glaucomatous Optic Neuropathy from Digital Photography. PLoS One, Research Article. Public Library of Science 2015-02-18 /pmc/articles/PMC4334897/ /pubmed/25692287 http://dx.doi.org/10.1371/journal.pone.0117401 Text en © 2015 Mitry et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
title Crowdsourcing as a Screening Tool to Detect Clinical Features of Glaucomatous Optic Neuropathy from Digital Photography
topic Research Article