Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial
Main Authors: | Chen, Kay-Yut; Lang, Yan; Zhou, Yuan; Kosmari, Ludmila; Daniel, Kathryn; Gurses, Ayse; Xiao, Yan |
Format: | Online Article Text |
Language: | English |
Published: | JMIR Publications, 2023 |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10375278/ https://www.ncbi.nlm.nih.gov/pubmed/37440308 http://dx.doi.org/10.2196/41431 |
_version_ | 1785079001386057728 |
author | Chen, Kay-Yut Lang, Yan Zhou, Yuan Kosmari, Ludmila Daniel, Kathryn Gurses, Ayse Xiao, Yan |
author_facet | Chen, Kay-Yut Lang, Yan Zhou, Yuan Kosmari, Ludmila Daniel, Kathryn Gurses, Ayse Xiao, Yan |
author_sort | Chen, Kay-Yut |
collection | PubMed |
description | BACKGROUND: Engaging patients in health behaviors is critical for better outcomes, yet many patient partnership behaviors are not widely adopted. Behavioral economics–based interventions offer potential solutions, but it is challenging to assess the time and cost needed for different options. Crowdsourcing platforms can efficiently and rapidly assess the efficacy of such interventions, but it is unclear if web-based participants respond to simulated incentives in the same way as they would to actual incentives. OBJECTIVE: The goals of this study were (1) to assess the feasibility of using crowdsourced surveys to evaluate behavioral economics interventions for patient partnerships by examining whether web-based participants responded to simulated incentives in the same way they would have responded to actual incentives, and (2) to assess the impact of 2 behavioral economics–based intervention designs, psychological rewards and loss framing, on simulated medication reconciliation behaviors in a simulated primary care setting. METHODS: We conducted a randomized controlled trial using a between-subject design on a crowdsourcing platform (Amazon Mechanical Turk) to evaluate the effectiveness of behavioral interventions designed to improve medication adherence in primary care visits. The study included a control group that represented the participants’ baseline behavior and 3 simulated interventions, namely monetary compensation, a status effect as a psychological reward, and a loss frame as a modification of the status effect. Participants’ willingness to bring medicines to a primary care visit was measured on a 5-point Likert scale. A reverse-coding question was included to ensure response intentionality. RESULTS: A total of 569 study participants were recruited. There were 132 in the baseline group, 187 in the monetary compensation group, 149 in the psychological reward group, and 101 in the loss frame group. All 3 nudge interventions increased participants’ willingness to bring medicines significantly when compared to the baseline scenario. The monetary compensation intervention caused an increase of 17.51% (P<.001), psychological rewards on status increased willingness by 11.85% (P<.001), and a loss frame on psychological rewards increased willingness by 24.35% (P<.001). Responses to the reverse-coding question were consistent with the willingness questions. CONCLUSIONS: In primary care, bringing medications to office visits is a frequently advocated patient partnership behavior that is nonetheless not widely adopted. Crowdsourcing platforms such as Amazon Mechanical Turk support efforts to efficiently and rapidly reach large groups of individuals to assess the efficacy of behavioral interventions. We found that crowdsourced survey-based experiments with simulated incentives can produce valid simulated behavioral responses. The use of psychological status design, particularly with a loss framing approach, can effectively enhance patient engagement in primary care. These results support the use of crowdsourcing platforms to augment and complement traditional approaches to learning about behavioral economics for patient engagement. |
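For readers who want to see how the headline figures in the abstract are formed, the sketch below shows how a percentage increase in mean willingness relative to the baseline group could be computed from 5-point Likert responses. This is not the authors' analysis code: only the group sizes (132/187/149/101) come from the abstract, the response vectors are random placeholders, and the Mann-Whitney U test is one plausible choice for ordinal Likert data since the abstract does not name the test used.

```python
# Minimal sketch (not the authors' analysis): percent increase in willingness
# vs. the baseline group, plus a significance test, from 5-point Likert data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical Likert responses (1 = not willing ... 5 = very willing).
# Group sizes match the abstract; the values themselves are NOT study data.
groups = {
    "monetary_compensation": rng.integers(1, 6, size=187),
    "psychological_reward": rng.integers(1, 6, size=149),
    "loss_frame": rng.integers(1, 6, size=101),
}
baseline = rng.integers(1, 6, size=132)
baseline_mean = baseline.mean()

for name, responses in groups.items():
    # Percent increase in mean willingness relative to baseline, i.e. the form
    # in which the abstract reports 17.51%, 11.85%, and 24.35%.
    pct_increase = 100 * (responses.mean() - baseline_mean) / baseline_mean
    # Mann-Whitney U test as an assumed (not stated) test for ordinal data.
    _, p_value = mannwhitneyu(responses, baseline, alternative="greater")
    print(f"{name}: {pct_increase:+.2f}% vs baseline, P = {p_value:.3f}")
```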
format | Online Article Text |
id | pubmed-10375278 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | JMIR Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-10375278 2023-07-29 Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial Chen, Kay-Yut Lang, Yan Zhou, Yuan Kosmari, Ludmila Daniel, Kathryn Gurses, Ayse Xiao, Yan J Med Internet Res Original Paper
JMIR Publications 2023-07-13 /pmc/articles/PMC10375278/ /pubmed/37440308 http://dx.doi.org/10.2196/41431 Text en ©Kay-Yut Chen, Yan Lang, Yuan Zhou, Ludmila Kosmari, Kathryn Daniel, Ayse Gurses, Yan Xiao. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 13.07.2023. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included. |
spellingShingle | Original Paper Chen, Kay-Yut Lang, Yan Zhou, Yuan Kosmari, Ludmila Daniel, Kathryn Gurses, Ayse Xiao, Yan Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title | Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title_full | Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title_fullStr | Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title_full_unstemmed | Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title_short | Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial |
title_sort | assessing interventions on crowdsourcing platforms to nudge patients for engagement behaviors in primary care settings: randomized controlled trial |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10375278/ https://www.ncbi.nlm.nih.gov/pubmed/37440308 http://dx.doi.org/10.2196/41431 |
work_keys_str_mv | AT chenkayyut assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT langyan assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT zhouyuan assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT kosmariludmila assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT danielkathryn assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT gursesayse assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial AT xiaoyan assessinginterventionsoncrowdsourcingplatformstonudgepatientsforengagementbehaviorsinprimarycaresettingsrandomizedcontrolledtrial |