- Study protocol
The design and methodology of a usability protocol for the management of medications by families for aging older adults
BMC Medical Informatics and Decision Making, volume 19, article number 181 (2019)
Health research apps often do not treat usability as a design priority. This is problematic when the population of interest is disproportionately underrepresented among users of mobile apps, as is especially true of aging older adults (aged 75 and older). Challenges with the adoption of health information technology (HIT) among this group are exacerbated by poor design and user interface/user experience (UI/UX) choices. This protocol describes the testing and evaluation process for one HIT app, part of the family-based collaboration platform InfoSAGE.
We aim to recruit twenty subjects, drawn from both informal family caregivers and aging older adults, to examine the usability of the InfoSAGE mobile medication manager. Participants will be audio- and video-recorded, in addition to screen-capture recordings, while ‘thinking aloud’ as they complete eight common use-case scenarios. Multiple independent reviewers will code the video and audio recordings for thematic analysis, and use problems will be evaluated. Success or failure of each scenario will be determined by completion of its sub-events. Time-to-complete analysis will be used to ascertain the learning curve associated with the app.
Frequently observed problem areas will be used as the basis of further evolution of the app and will inform generalized recommendations for the design of HIT apps for research and public use. This study aims to improve the model of development for dual user populations with dissimilar technological literacy, in order to improve retention and use. Results of this study will form the foundation of a design framework for mobile health apps.
Longevity is increasing worldwide, and in the United States the ‘baby-boomer’ generation is rapidly approaching the ages at which medical costs balloon as a result of higher utilization due to age-related morbidities [2,3,4]. The challenges associated with these coming stressors will require scalable solutions across the healthcare spectrum. One promising area is the use of mobile health (mHealth) technology to bring efficient, cost-effective care to a wide audience [5, 6]. While older adults are increasingly using mobile phones [7,8,9], challenges remain in adoption and technical literacy, especially as age rises [10,11,12]. The digital divide has narrowed over the last ten years, but there is a scarcity of systematically designed studies evaluating the usability of mHealth solutions in mixed-age populations, aimed at both informal caregivers and aging older adults.
Many studies have shown the potential of poorly designed information technology to facilitate medical errors [13, 14], and specific usability problems have been reported for mobile medication apps [6, 15]. Usability testing has been applied in the assessment of health information system safety to identify and prevent medical errors and patient safety risks that may arise from the use of such systems. Specifically, these methods have begun to be applied to assessing the impact of user interface features and design choices on medical error. One study evaluated a prototype app that simulates medication tracking on an iPad and found that users struggled with screen glare, button activation, and the “drag and drop” function, making it difficult for the significant number of users with poor vision to use such apps correctly: an estimated 1 in 5 North American adults aged 75 or over have a self-reported “seeing disability”. Low health literacy is another barrier to effective interaction with technology: about one-half of North American adults have low literacy, meaning they lack the literacy skills needed for everyday life, and between 46 and 60% also have low health literacy, struggling to “obtain, process, and understand basic health information and services needed to make appropriate health decisions” [19, 20].
Usability problems in mobile apps lead to reduced utilization, lower rates of user retention, and increased user frustration; users who have difficulties navigating an app or understanding its button configuration or layout, or who find features too convoluted or complex, are less likely to continue use [21, 22].
The InfoSAGE platform is a free, mobile and web-based application providing features and tools for the informal caregiving of aging older adults through a shared family network. One of the primary tools of InfoSAGE is the mobile medication manager (Fig. 1), which enables collection of prescription and over-the-counter medications, facilitates documentation of current and discontinued medications, pill images, and dosages, and permits scheduling reminders. Although the app is free and publicly available, its usability has not, to date, been formally evaluated within the dual populations served by InfoSAGE.
To these ends, this protocol describes a systematic method of evaluating a mobile medication management system within the InfoSAGE platform, usable on both the iOS and Android operating systems. While the specifics focus on the InfoSAGE app, the method and approach should be widely applicable to any app under evaluation for age-appropriate usability.
The result of this study will be an evaluation of common problems encountered by frail older adults and their informal caregivers using this mobile app, and a set of recommendations for the design of mobile apps for elders based on our observations and analysis.
Ethical approval and recruitment
The study has been approved by the Beth Israel Deaconess Medical Center Institutional Review Board. Recruitment is ongoing, beginning in April 2018, with a target convenience sample of 10 informal caregivers and 10 frail older adults drawn from local advertising, flyers, and online message boards, and through a word-of-mouth, grassroots approach with collaborating partners.
Demographic and baseline comfort with the Internet, technology, and apps will be gathered prior to testing. After completion of the test scenarios, participants will be asked to complete a modified standard usability survey on a Likert scale and will have the opportunity to give open-ended feedback on the testing process, the app, and the scenarios (Table 1). Acceptability of use will be evaluated from the responses to the surveys.
Testing will take place in a controlled office environment on supplied development iPads running the publicly available InfoSAGE app. Audio and video recording will be utilized, but video recording will be limited to hands only. Participants will be asked to ‘think aloud’ while interacting with the app during the testing scenarios. No faces or other identifiers will be recorded. During testing, touches and actions performed on the development iPad will be captured by screen recording software. Participants will be supplied with a test account, with full logging enabled, to remove potential problems with registration or log-in. Prior to testing, participants will be given a brief overview of the InfoSAGE platform, the tiered access system, and its main features. Participants will not be instructed in how to use the app and will only receive help if they are unable to continue with the testing process. Any help provided will be noted.
Eight individual scenarios were developed based on use cases frequently observed on InfoSAGE and were refined through internal testing with naïve users (Table 2). Each scenario is divided into subevents, or checkpoints, for greater granularity of evaluation. Any staff help received will be evaluated against these subevents, and if the help is deemed instrumental the subevent will be marked as failed. All subevents must be completed for the entire scenario to be considered passed. Scenarios one and two, which differ only in medication name, are designed to gauge how quickly participants become familiar with the app. Scenarios four, five, seven, and eight are designed to evaluate navigation, the tactility and location of navigation elements (buttons, switches, links), and the descriptive language used. Scenarios three (Fig. 2) and seven use more advanced medication entry elements and will be used to evaluate technological and health literacy.
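The subevent-based scoring rule described above can be sketched as follows. This is a minimal illustration only; the subevent names are hypothetical, but the pass/fail logic (instrumental staff help fails a subevent, and all subevents must pass for the scenario to pass) follows the protocol.

```python
# Sketch of the scenario scoring rule: a scenario passes only if every
# subevent is completed, and a subevent in which staff help was deemed
# instrumental is marked as failed. Subevent names are illustrative.

def score_scenario(subevents):
    """subevents: list of dicts with 'name', 'completed', 'instrumental_help'."""
    results = {}
    for ev in subevents:
        # Instrumental staff help counts as a failure of that subevent.
        results[ev["name"]] = ev["completed"] and not ev["instrumental_help"]
    # The whole scenario passes only if all subevents passed.
    return all(results.values()), results

scenario_one = [
    {"name": "open medication list", "completed": True, "instrumental_help": False},
    {"name": "search for medication", "completed": True, "instrumental_help": True},
    {"name": "save entry", "completed": True, "instrumental_help": False},
]

passed, detail = score_scenario(scenario_one)
# passed is False here: the search subevent required instrumental help.
```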
Evaluation and analysis
Audio recordings will be transcribed and the transcripts coded for broad themes. A heuristic method of coding will be applied to the video recordings and screen captures. Two analysts will independently evaluate the videos using a shared code dictionary, noting events and codes in the Behavioral Observation Research Interactive Software (BORIS, v.7). Review of the coded videos will be conducted by the entire team. Initial cases will be used to further develop the analysis methodology, with codes undergoing assessment for applicability and appropriateness. This method will ensure the relevance, and maximize the value, of the categorizing and coding.
Each case will be reviewed by the team, and differences in codes and events noted. Although this is a qualitative study and we expect to have differences in coding between analysts, we will compare each event and code against the code definition for appropriateness. Incorrectly applied codes will be adjudicated as a group. Broadly, we expect the codes to fall into the following categories: data display visibility issues, navigation problems, data entry problems, content comprehension, attention problems or other cognitive confusion, and issues of health literacy. A qualitative assessment will be completed on the problems users had with cognitive confusion based on the verbal comments made by the test subjects.
Interrater reliability will be evaluated with a two-way intraclass correlation coefficient. Individual codes will be thematically grouped and evaluated by scenario and subject. Additionally, time-series visualization will be used to compare clustering of all subjects’ aggregated events to identify commonly observed navigation or cognitive problems. Finally, intraclass correlation between coders will be determined for the failure/success of each scenario.
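The two-way intraclass correlation mentioned above can be computed from standard ANOVA mean squares. The following sketch implements one common form, ICC(2,1) (two-way random effects, absolute agreement, single rater), in plain Python; the ratings matrix is illustrative, and the exact ICC variant used in the study may differ.

```python
# ICC(2,1): two-way random-effects, absolute-agreement, single-rater form,
# computed from ANOVA mean squares (Shrout & Fleiss convention).

def icc_2_1(ratings):
    """ratings: list of rows (subjects), each a list of k rater scores."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]

    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)  # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)  # raters
    sse = sum((ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                               # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two coders rating four videos; identical ratings give perfect agreement.
print(icc_2_1([[1, 1], [2, 2], [3, 3], [4, 4]]))  # → 1.0
```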
Secondary to the evaluation of common problems identified, we will also examine the uptake of learning by comparing the differences in time to complete for scenarios one and two in aggregate, and by comparing the time to a group of familiar, expert users. Additionally, navigation errors, hesitation, and comments or observations of frustration and annoyance will be quantified between the two scenarios and analyzed as a condition of learning.
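One simple way to quantify the learning effect described above is to compare per-participant completion times between scenarios one and two, and then compare scenario-two times against the familiar, expert baseline. A sketch follows; all timing values are hypothetical and the summary statistics (medians, ratio to expert baseline) are one possible choice, not the protocol's prescribed analysis.

```python
# Sketch: per-participant improvement from scenario one to scenario two,
# and the remaining gap to a familiar/expert baseline. Times (seconds)
# are hypothetical.
from statistics import median

scenario_one_times = [210, 185, 240, 305]   # first exposure
scenario_two_times = [150, 140, 200, 230]   # same task, new medication name
expert_times       = [60, 75, 70]           # familiar, expert users

# Fraction of time saved on the repeated task, per participant.
improvements = [(t1 - t2) / t1
                for t1, t2 in zip(scenario_one_times, scenario_two_times)]
median_improvement = median(improvements)

# How much slower participants remain relative to expert users.
gap_to_expert = median(scenario_two_times) / median(expert_times)

print(f"median improvement: {median_improvement:.0%}")
print(f"scenario-two time vs expert baseline: {gap_to_expert:.1f}x")
```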
Post-testing feedback from participants will be used to inform future design decisions, and comments will be assessed along with observations and quantitative metrics to modify navigation flow and the user interface/user experience. Demographics, Internet comfort level, and app expertise will be examined for correlation with the age and technological literacy of the participants.
We aim to produce a generalizable set of recommendations for the future design of mHealth apps targeted towards mixed age populations of informal caregivers and their aging older adults. Through design and iteration, we hope to form the basis for a framework to support further usability testing of mobile health applications.
Availability of data and materials
Not applicable. Our manuscript does not contain any data.
BORIS: Behavioral Observation Research Interactive Software
HIT: Health Information Technology
InfoSAGE: Information Sharing Across Generations and Environments
United Nations, Department of Economic and Social Affairs, population division. World population prospects: the 2015 revision, key findings and advance tables. Working paper no. ESA/P/WP.241 29 July 2015, New York. Available at URL: http://www.un.org/en/development/desa/publications/world-population-prospects-2015-revision.html [Last Accessed 9 January 2019].
US Census. 2017 National Population Projections Tables - Projections for the United States: 2017 to 2060. Available at URL: https://www.census.gov/data/tables/2017/demo/popproj/2017-summary
King DE, Matheson E, Chirina S, Shankar A, Broman-Fulks J. The status of baby boomers' health in the United States: the healthiest generation? JAMA Intern Med. 2013;173(5):385–6.
Keehan SP, Poisal JA, Cuckler GA, Sisko AM, Smith SD, Madison AJ, Stone DA, Wolfe CJ, Lizonitz JM. National Health Expenditure Projections, 2015-25: economy, prices, and aging expected to shape spending and enrollment. Health Aff (Millwood). 2016;35(8):1522–31.
Quintana Y, Crotty B, Fahy D, Lipsitz L, Davis RB, Safran C. Information sharing across generations and environments (InfoSAGE): study design and methodology protocol. BMC Med Inform Decis Mak. 2018;18(1):105. https://doi.org/10.1186/s12911-018-0697-4.
Grindrod KA, Li M, Gates A. Evaluating user perceptions of mobile medication management applications with older adults: a usability study. JMIR Mhealth Uhealth. 2014;2(1):e11.
Pew Research Center (2014). Older Adults and Technology Use. April 3, 2014. Available at URL: http://www.pewinternet.org/2014/04/03/older-adults-and-technology-use/ [Last Accessed 9 January 2019].
Pew Research Center. Americans’ Internet Access: 2000–2015. June 26, 2015. Available at URL: http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/ [Last Accessed 9 January 2019].
Pew Research Center, May 2017, “Tech adoption climbs among older adults”. URL: http://www.pewinternet.org/2017/05/17/tech-adoption-climbs-among-older-adults/
Lundberg S. The results from a two-year case study of an information and communication technology support system for family caregivers. Disabil Rehabil Assist Technol. 2014;9(4):353–8.
Arcury TA, Sandberg JC, Melius KP, Quandt SA, Leng X, Latulipe C, Miller DP Jr, Smith DA, Bertoni AG. Older adult internet use and eHealth literacy. J Appl Gerontol. 2018;733464818807468.
McCloskey R, Jarrett P, Stewart C, Keeping-Burke L. Recruitment and retention challenges in a technology-based study with older adults discharged from a geriatric rehabilitation unit. Rehabil Nurs. 2015;40(4):249–59.
Kushniruk A, Triola MM, Stein B, Borycki EM, Kannry JL. The relationship of usability to medical error: an evaluation of errors associated with usability problems in the use of a handheld application for prescribing medications. Medinfo. 2004;7:1073–6.
Svanæs D, Alsos OA, Dahl Y. Usability testing of mobile ICT for clinical settings: methodological and practical challenges. Int J Med Inform. 2010;79(4):e24–34.
Reynoldson C, Stones C, Allsop M, Gardner P, Bennett MI, Closs SJ, et al. Assessing the quality and usability of smartphone apps for pain self-management. Pain Med. 2014;15(6):898–909.
Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: the relationship between usability problems and prescription errors when using a handheld application. Int J Med Inform. 2005;74(7–8):519–26.
Lethbridge-Çejku M, Rose D, Vickerie JL. Summary health statistics for US adults: National Health Interview Survey; 2004.
Kutner M, Greenburg E, Jin Y, Paulsen C. The health literacy of America's adults: results from the 2003 National Assessment of adult literacy. NCES 2006–483. National Center for Education Statistics. 2006 Sep.
Morris NS, Grant S, Repp A, MacLean C, Littenberg B. Prevalence of limited health literacy and compensatory strategies used by hospitalized patients. Nurs Res. 2011;60(5):361.
HP USDHHS. Understanding and improving health. Washington, DC: US Department of Health and Human Services, US Government Printing Office; 2000.
Thies K, Anderson D, Cramer B. Lack of adoption of a mobile app to support patient self-management of diabetes and hypertension in a federally qualified health center: interview analysis of staff and patients in a failed randomized trial. JMIR human factors. 2017;4(4).
Liew MS, Zhang J, See J, Ong YL. Usability challenges for health and wellness Mobile apps: mixed-methods study among mHealth experts and consumers. JMIR mHealth and uHealth. 2019;7(1):e12160.
Friard O, Gamba M. BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods Ecol Evol. 2016;7:1325–30.
Hayes AF, Krippendorff K. Answering the call for a standard reliability measure for coding data. Commun Methods Meas. 2007;1:77–89.
The authors would like to thank Warner Slack, Maxwell Gorenberg, and current and former members of the Division of Clinical Informatics at Beth Israel Deaconess Medical Center.
This study was made possible through the Agency for Healthcare Research and Quality (AHRQ) grants # R01 HS021495-01A1 and R18 HS24869–01. The authors are solely responsible for this document’s contents, findings, and conclusions.
Ethics approval and consent to participate
The institutional review board of Beth Israel Deaconess Medical Center approved this protocol (2014P000296). Informed consent will be obtained from all study participants using an approved online informed consent process.
Consent for publication
Not applicable.
Competing interests
The authors have no competing interests to declare.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.