Arizona Psychological Association Convention Poster
Stakeholder Insights Improve Digital Intervention Acceptability and Engagement
Vijaya M. Nandiwada-Hofer, BS, Sydni A. Basha, MA, Joanna J. Kim, PhD
Introduction: Digital mental health interventions have been heralded as one solution for reaching and providing services to underserved populations (e.g., rural, impoverished, or non-English-speaking).1, 5-7 However, for digital tools to be effective, they must be usable and acceptable to the target audience.2-3,9 User experience research (UXR) is a crucial component of developing high-quality, persuasive digital interventions.2 Prior usability research reports that approximately 80% of app issues can be detected with five users, making this process cost-effective and necessary for ensuring future user acceptability and engagement.4,8 The current study describes the process of involving multiple stakeholders in UXR to conduct rapid-cycle refinements to a smartphone app intervention supporting caregiver home practice of evidence-based parenting skills. Study aims were to (Aim 1) gather UX feedback on the digital intervention’s user interface, including visual presentation, interaction usability, and task flow, and (Aim 2) improve digital intervention usability and engagement strategies by streamlining app flow, logic, and user interface.
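The five-user figure follows from a simple problem-discovery model (cf. Virzi, 1992; Nielsen, 1994): if each user independently uncovers a given usability problem with probability λ, the expected proportion of problems found after testing n users is

    P(found) = 1 - (1 - λ)^n

With λ ≈ 0.31, an average per-problem detection rate commonly reported in this literature, five users yield 1 - 0.69^5 ≈ 0.84, consistent with the roughly 80% figure.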
Method: We conducted in-person usability testing with five participants, all of whom were either parents who had previously completed an evidence-based parenting intervention or interventionists themselves. Sessions consisted of an introduction (e.g., study overview, app introduction), a usability session (e.g., guided tasks and scenarios with success tracking), and a freeform phase (e.g., impressions and feedback). Participants also evaluated iconography and user-feedback language (e.g., praise notifications, encouragement) throughout the app. Following each UX session, study staff conducted debriefings in which they discussed key findings and summarized action items for the development team to increase usability. Developers then made rapid changes to the digital intervention to ensure iterative design improvement.
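To make the success tracking concrete, a minimal sketch (in Python) of how guided-task outcomes might be logged per participant is below; the task names, fields, and data are hypothetical illustrations, not the study's actual instrument.

    from dataclasses import dataclass, field

    @dataclass
    class TaskResult:
        task: str          # guided scenario, e.g., "mark a practice complete"
        success: bool      # completed unassisted?
        notes: str = ""    # observer notes captured during the session

    @dataclass
    class Session:
        participant_id: str
        results: list = field(default_factory=list)

        def success_rate(self) -> float:
            # Proportion of guided tasks completed successfully.
            if not self.results:
                return 0.0
            return sum(r.success for r in self.results) / len(self.results)

    # Example: one participant's guided-task phase (hypothetical data)
    s = Session("P01")
    s.results.append(TaskResult("locate skill content", True))
    s.results.append(TaskResult("mark practice complete", False, "button label unclear"))
    print(f"{s.participant_id}: {s.success_rate():.0%} task success")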
Results: All users successfully completed all guided scenarios and provided feedback throughout the sessions. UX feedback was summarized into action items for improvement and into aspects that users reported facilitated intervention engagement. The research team ranked action items by frequency of the requested change, feasibility of implementation, and potential for tangible impact. Action items included icon and text changes, reorganization, and shortening content length (e.g., changing the button label from “Practice it” to “I did it” for marking a practice complete). Aspects that users enjoyed included the visual presentation (e.g., celebratory memes) and interaction usability (e.g., ease of locating content and completing tasks) throughout the app.
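One way to operationalize this ranking is a weighted score across the three criteria; a minimal sketch follows, with hypothetical weights and illustrative items rather than the study's actual rubric.

    # Hypothetical weighted ranking of UX action items by the three criteria
    # described above: frequency of request, feasibility, and potential impact.
    WEIGHTS = {"frequency": 0.4, "feasibility": 0.3, "impact": 0.3}

    action_items = [  # criterion scores on a 1-5 scale (illustrative values)
        {"item": "rename 'Practice it' to 'I did it'", "frequency": 5, "feasibility": 5, "impact": 4},
        {"item": "shorten skill content", "frequency": 3, "feasibility": 2, "impact": 4},
        {"item": "reorganize home screen icons", "frequency": 4, "feasibility": 3, "impact": 3},
    ]

    def score(item: dict) -> float:
        # Weighted sum across the three criteria.
        return sum(WEIGHTS[k] * item[k] for k in WEIGHTS)

    for item in sorted(action_items, key=score, reverse=True):
        print(f"{score(item):.1f}  {item['item']}")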
Discussion: Usability testing gave the research team and app developers direct user feedback with which to build a user-friendly app, persuasively designed to promote task completion and usability. Rapid changes to the app design halfway through usability testing improved usability for the remaining users. Usability testing was an efficient, cost-effective step that yielded extensive feedback to promote future app engagement and improve task completion rates for our digital intervention. It also produced action items to be implemented in future versions of the app, including additional engagement tools (e.g., push notifications, confetti effects, and vibration). We plan to conduct another round of UX testing with low-literacy populations in preparation for a future RCT.
References
1. Anderson-Lewis, C., Darville, G., Mercado, R. E., Howell, S., & Di Maggio, S. (2018). mHealth technology use and implications in historically underserved and minority populations in the United States: systematic literature review. JMIR mHealth and uHealth, 6(6), e8383.
2. Fuller-Tyszkiewicz, M., Richardson, B., Klein, B., Skouteris, H., Christensen, H., Austin, D., ... & Ware, A. (2018). A mobile app–based intervention for depression: End-user and expert usability testing study. JMIR Mental Health, 5(3), e9445.
3. Lemon, C., Huckvale, K., Carswell, K., & Torous, J. (2020). A narrative review of methods for applying user experience in the design and assessment of mental health smartphone interventions. International Journal of Technology Assessment in Health Care, 36(1), 64-70.
4. Nielsen, J. (1994). Usability engineering. Morgan Kaufmann.
5. Sarkar, U., Gourley, G. I., Lyles, C. R., Tieu, L., Clarity, C., Newmark, L., ... & Bates, D. W. (2016). Usability of commercially available mobile applications for diverse patients. Journal of General Internal Medicine, 31, 1417-1426.
6. Schueller, S. M., Hunter, J. F., Figueroa, C., & Aguilera, A. (2019). Use of digital mental health for marginalized and underserved populations. Current Treatment Options in Psychiatry, 6, 243-255.
7. van Velthoven, M. H., Car, J., Zhang, Y., & Marušić, A. (2013). mHealth series: New ideas for mHealth data collection implementation in low- and middle-income countries. Journal of Global Health, 3(2), 020101. https://doi.org/10.7189/jogh.03.020101
8. Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34(4), 457-468.
9. Zhong, S., Yang, X., Pan, Z., Fan, Y., Chen, Y., Yu, X., & Zhou, L. (2023). The usability, feasibility, acceptability, and efficacy of digital mental health services in the COVID-19 pandemic: scoping review, systematic review, and meta-analysis. JMIR Public Health and Surveillance, 9(1), e43730.