Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist
Main Authors: | Martins, Ana Isabel; Santinha, Gonçalo; Almeida, Ana Margarida; Ribeiro, Óscar; Silva, Telmo; Rocha, Nelson; Silva, Anabela G |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | JMIR Publications, 2023 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10282913/ https://www.ncbi.nlm.nih.gov/pubmed/37279047 http://dx.doi.org/10.2196/44326 |
_version_ | 1785061215168364544 |
---|---|
author | Martins, Ana Isabel Santinha, Gonçalo Almeida, Ana Margarida Ribeiro, Óscar Silva, Telmo Rocha, Nelson Silva, Anabela G |
author_facet | Martins, Ana Isabel Santinha, Gonçalo Almeida, Ana Margarida Ribeiro, Óscar Silva, Telmo Rocha, Nelson Silva, Anabela G |
author_sort | Martins, Ana Isabel |
collection | PubMed |
description | BACKGROUND: Usability evaluation, both by experts and by target users, is an integral part of developing and assessing digital solutions. Usability evaluation improves the probability of producing digital solutions that are easier, safer, more efficient, and more pleasant to use. However, despite the widespread recognition of the importance of usability evaluation, there is a lack of research and consensus on related concepts and reporting standards. OBJECTIVE: The aim of this study was to generate consensus on the terms and procedures that should be considered when planning and reporting a usability evaluation of health-related digital solutions, both by users and by experts, and to provide a checklist that researchers can easily use when conducting their usability studies. METHODS: A Delphi study with 2 rounds was conducted with a panel of international participants experienced in usability evaluation. In the first round, participants were asked to comment on definitions, rate the importance of preidentified methodological procedures on a 9-point Likert scale, and suggest additional procedures. In the second round, they were asked to reappraise the relevance of each procedure, informed by the round 1 results. Consensus on the relevance of each item was defined a priori as at least 70% of experienced participants scoring the item 7 to 9 and less than 15% scoring it 1 to 3 (see the illustrative sketch after this record). RESULTS: A total of 30 participants (20 female) from 11 different countries entered the Delphi study, with a mean age of 37.2 (SD 7.7) years. Agreement was achieved on the definitions for all proposed usability evaluation–related terms (usability assessment moderator, participant, usability evaluation method, usability evaluation technique, tasks, usability evaluation environment, usability evaluator, and domain evaluator). A total of 38 procedures related to usability evaluation planning and reporting were identified across rounds (28 related to usability evaluation involving users and 10 related to usability evaluation involving experts). Consensus on relevance was achieved for 23 (82%) of the procedures involving users and for 7 (70%) of those involving experts. A checklist was proposed to guide authors when designing and reporting usability studies. CONCLUSIONS: This study proposes a set of terms and their definitions, as well as a checklist, to guide the planning and reporting of usability evaluation studies, constituting an important step toward a more standardized approach in the field that may enhance the quality of planning and reporting usability studies. Future studies can further validate this work by refining the definitions, assessing the practical applicability of the checklist, or assessing whether using the checklist results in higher-quality digital solutions. |
format | Online Article Text |
id | pubmed-10282913 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | JMIR Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-10282913 2023-06-22 Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist Martins, Ana Isabel Santinha, Gonçalo Almeida, Ana Margarida Ribeiro, Óscar Silva, Telmo Rocha, Nelson Silva, Anabela G J Med Internet Res Original Paper
JMIR Publications 2023-06-06 /pmc/articles/PMC10282913/ /pubmed/37279047 http://dx.doi.org/10.2196/44326 Text en ©Ana Isabel Martins, Gonçalo Santinha, Ana Margarida Almeida, Óscar Ribeiro, Telmo Silva, Nelson Rocha, Anabela G Silva. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 06.06.2023. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included. |
spellingShingle | Original Paper Martins, Ana Isabel Santinha, Gonçalo Almeida, Ana Margarida Ribeiro, Óscar Silva, Telmo Rocha, Nelson Silva, Anabela G Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title | Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title_full | Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title_fullStr | Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title_full_unstemmed | Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title_short | Consensus on the Terms and Procedures for Planning and Reporting a Usability Evaluation of Health-Related Digital Solutions: Delphi Study and a Resulting Checklist |
title_sort | consensus on the terms and procedures for planning and reporting a usability evaluation of health-related digital solutions: delphi study and a resulting checklist |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10282913/ https://www.ncbi.nlm.nih.gov/pubmed/37279047 http://dx.doi.org/10.2196/44326 |
work_keys_str_mv | AT martinsanaisabel consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT santinhagoncalo consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT almeidaanamargarida consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT ribeirooscar consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT silvatelmo consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT rochanelson consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist AT silvaanabelag consensusonthetermsandproceduresforplanningandreportingausabilityevaluationofhealthrelateddigitalsolutionsdelphistudyandaresultingchecklist |
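The abstract above specifies an a priori consensus rule for each Delphi item: at least 70% of experienced participants must rate it 7 to 9 on the 9-point scale, and fewer than 15% may rate it 1 to 3. The following is a minimal illustrative sketch of that rule in Python; it is not the authors' analysis code, and the function name and example ratings are hypothetical.

```python
# Illustrative sketch of the a priori consensus rule described in the abstract:
# consensus on an item is declared when at least 70% of ratings fall in 7-9
# and fewer than 15% fall in 1-3 (ratings on a 9-point scale, 1-9).

def has_consensus(ratings: list[int]) -> bool:
    """Return True if the item meets the consensus criterion."""
    if not ratings:
        return False
    n = len(ratings)
    high = sum(1 for r in ratings if 7 <= r <= 9) / n  # proportion rating 7-9
    low = sum(1 for r in ratings if 1 <= r <= 3) / n   # proportion rating 1-3
    return high >= 0.70 and low < 0.15

# Hypothetical example: 30 panellists, most rating the item highly.
example_ratings = [9, 8, 8, 7, 9, 7, 8, 9, 7, 8,
                   7, 9, 8, 7, 8, 9, 7, 8, 6, 5,
                   8, 7, 9, 8, 7, 9, 8, 7, 4, 8]
print(has_consensus(example_ratings))  # True: 27/30 = 90% rate 7-9, 0% rate 1-3
```

With a panel of 30, as in this study, the thresholds translate to at least 21 ratings of 7 to 9 and at most 4 ratings of 1 to 3 per item.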