International Journal of Medical Informatics 110 (2018) 10–18

Research Paper

Usability evaluation of a commercial inpatient portal

Po-Yin Yen (a,*), Daniel M. Walker (b), Jessica M. Garvey Smith (a), Michelle P. Zhou (a), Terri L. Menser (b), Ann Scheck McAlearney (b)

(a) Department of Biomedical Informatics, College of Medicine, The Ohio State University, United States
(b) Department of Family Medicine, College of Medicine, The Ohio State University, United States
ARTICLE INFO

Keywords: Usability; Patient portal; Personal health record; Inpatient

ABSTRACT

Objectives: Patient portals designed for inpatients have the potential to increase patient engagement. However, little is known about how patients use inpatient portals. To address this gap, we aimed to understand how users 1) interact with, 2) learn to use, and 3) communicate with their providers through an inpatient portal.
Materials and methods: We conducted a usability evaluation using think-aloud protocol to study user interactions with a commercially available inpatient portal, MyChart Bedside (MCB). Study participants (n = 19) were given a tablet with MCB installed. They explored MCB and completed eight assigned tasks. Each session's recordings were coded and analyzed for task completion, errors, and user feedback. We categorized errors into operational errors, system errors, and tablet-related errors, and identified their violations of Nielsen's ten heuristic principles.
Results: Participants frequently made operational errors, most involving navigation and assumptions of non-existent functionality. Participants' learning styles varied, with age as a potential factor influencing how they learned MCB. Participants also preferred to message providers individually and wanted feedback on message status.
Conclusion: The design of an inpatient portal can greatly affect how patients navigate and comprehend the information it presents; poor design can result in a frustrating user experience. For inpatient portals to be effective in promoting patient engagement, it remains critical for technology developers and hospital administrators to understand how users interact with this technology and the resources that may be necessary to support its use.
1. Background and significance

A patient portal is defined as "a secure online website that gives patients convenient 24-hour access to personal health information from anywhere with an Internet connection" [1]. Distinct from conventional personal health records (PHRs) [2,3], patient portals are owned and managed by the health care organization to provide the most current data for patients [1,4]. As patient-centered care is promoted across the nation, patient portals play an important role in facilitating patient engagement, encouraging patients to take control of their own health, and improving patient-provider communication [5–7]. Features provided in patient portals include checking lab results, scheduling appointments, refilling medications, obtaining referrals, accessing educational materials, sending secure messages to providers, and paying bills [7,8]. Both patients and providers have been positive about patient portals; however, usability has been reported as a major obstacle to adoption [5,9,10]. Patient portals have predominantly been available to outpatients [11]. Recently, usage and research have moved to the inpatient
setting. Given that 68.0% of U.S. adults currently have smartphones and 45.0% have tablets [12], offering commonly used Android or iOS tablets allows inpatients to access their health records during their hospital stay [6]. A variety of inpatient portals have been developed and assessed for their feasibility and benefits for inpatients. Fifteen studies have evaluated the effectiveness of eleven unique inpatient portals (Table 1). Among them, four were ongoing studies without findings yet [13–16]; eleven reported positive patient experiences [10,17–25], including increased patient satisfaction [10,17,19,20,23,25], increased patient engagement [20,21], decreased anxiety [17,20], increased ownership of their own health condition [17,20,23,25], and improved safety and quality of care [17,23]. However, while most features generally received positive feedback, there were mixed opinions regarding communication with health providers via the messaging feature in the inpatient portal [17,18,23], which also had a low usage rate (5.6%) [23]. One study reported a high usage rate (72.9%) of the messaging function but did not specify when and how the function was used [26].
Correspondence to: Po-Yin Yen, PhD, RN, Institute for Informatics, Washington University School of Medicine, Goldfarb School of Nursing, BJC HealthCare, St. Louis, United States.
https://doi.org/10.1016/j.ijmedinf.2017.11.007 Received 9 June 2017; Received in revised form 20 October 2017; Accepted 12 November 2017 1386-5056/ © 2017 Elsevier B.V. All rights reserved.
Table 1. Feature Comparison of Selected Studies.

Portals compared (rows): BMT Roadmap [14,27]; Electronic Bedside Communication Center [24]; Health Feed [20]; Personal Health Record [13] (ongoing study); PowerChart [21,22]; My NYP Inpatient [17–19]; MyChart [15] (ongoing study); MyChart Bedside (Wisconsin) [23]; MyChart along with Educational Modules [10]; MyChart Bedside (St Rita's Medical Center) [25]; MyChart Bedside (The Ohio State University Wexner Medical Center) (ongoing study) [16].

Features compared (columns): Medications; Care Team; Discharge Information; Test Results; Schedule Appointments; Notes; Education; Messaging; Schedule; I Would Like/Requests; Allergies; Dining/Diet; Vital Signs; Hospital Map; Pay Bills; Clinical Trial Participation; Search for Physicians.

[The per-portal feature marks of the original matrix are not recoverable from this extraction.]
Also, these studies mainly investigated user experience, defined as "a person's perceptions and responses that result from the use and/or anticipated use of a product, system or service" [28], and explored an overall understanding of user satisfaction, perceptions, and responses, but did not capture user performance [29]. Little is known about how patients interact with and learn to use an inpatient portal. Therefore, we conducted a usability evaluation of a commercially available inpatient portal.

1.1. Usability evaluation

The International Organization for Standardization (ISO) 9241 defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" [30]. Usability is evaluated by observing the interaction between user, tool, and task in a specified environment [31,32], with measures of task completeness and errors (effectiveness), completion time (efficiency), and perception (satisfaction). In addition, learnability has been introduced as an essential aspect of usability [33–37]. Usability evaluation is used to assess user performance and identify specific problems in products, and is thus commonly used for product redesign [38,39]. Usability evaluation helps improve the predictability of user interactions with products, leading to greater productivity with fewer user errors and savings in development time and cost [34,38,40].

1.2. Age

Age has been considered a major factor influencing the usage of patient portals [41–45]. Two recent studies investigated the feasibility of providing patient portals to older adults [41,42]. They found that older adults were willing to use tablets, but only a small portion could use them without assistance [41,42]. Older adults often feel that they are too old to learn new technology [43]. As older adults comprise the largest patient population, it is essential to discover age-related issues that might prohibit the use of an inpatient portal and to learn how to engage older adults.

2. Objectives

We conducted a usability evaluation to understand how patients interact with and learn to use inpatient portals. In addition, as the messaging feature has been a concern in prior studies, we were also interested in users' feedback on patient-provider communication. In summary, the purposes of the study were to investigate how users 1) interact with, 2) learn to use, and 3) communicate with their providers using an inpatient portal. We also sought to understand potential age-related issues that could influence user performance.

3. Materials and methods

In a stratified view of health IT usability evaluation, Level 1 targets understanding user-task interaction for system development; Level 2 examines task performance to assess system validation and human-computer interaction for system improvement; and Level 3 incorporates environmental factors to identify work processes and system impact in a real setting [46]. In this study, we focused on a Level 2 evaluation aiming to identify usability issues based on the interaction between users and an inpatient portal. Specifically, we conducted an end-user usability evaluation using think-aloud protocol. Think-aloud protocol is grounded in research conducted by the cognitive psychologists Ericsson and Simon [47] and by Clayton Lewis in the usability field [48]. It encourages users to express out loud what they are looking at, thinking, doing, and feeling as they perform a specified task [47–49]. This approach allowed us to observe the cognitive processes associated with task completion. Studying actual or intended users with think-aloud protocol provides a close view of how users use technology and reveals practical usability problems related to their performance [50]. Think-aloud protocol has been demonstrated to be effective in assessing users' interaction with technology [51–55].

3.1. Setting

We conducted the study at a large, Midwestern, academic medical center that provides comprehensive care across the life-cycle at six hospitals and 53 ambulatory locations, including 30 community-based clinics.

3.2. Inpatient portal

The inpatient portal being evaluated was a commercially available product, Epic's MyChart Bedside (Epic, Verona, WI), which provides better generalizability to other healthcare organizations. MyChart Bedside (MCB) was made available on an Android tablet. The MCB functions provided to inpatients are shown in Fig. 1.

Fig. 1. Functions in MyChart Bedside.
- Bedside Tutorial: Watch an introduction video explaining the specific features of MyChart Bedside.
- Happening Soon: Add personal events, such as a family visit or a phone call to friends.
- To Learn: Review educational materials and videos.
- Taking Care of Me: View care team photos and profiles.
- Messages: Send non-urgent messages to the care team.
- My Health: View lab results and track vitals trends.
- Dining on Demand: Order food directly from the cafeteria (unavailable during the study).
- Notes: Create personal notes (written, audio, or video).
- I Would Like: Send requests for help with patient education and entertainment options, or to talk to someone from patient experience, pastoral care, social work/discharge planning, or …

3.3. Sample

Participants were recruited through the organization's Patient and Family Experience Advisors Program, which consists of current and previous patients or primary caregivers who provide feedback on their experience at the medical center. Participants were eligible for the study if they spoke English, were over 18 years of age, did not have a cognitive impairment, and had no previous experience with MCB. Interested volunteers were provided a contact to learn more about the study, and if they agreed to participate, a usability session was scheduled. Participants were recruited and participated in usability sessions between May and September 2016. The usability sessions were conducted in a quiet and private conference room. Institutional Review Board approval was obtained from the study organization to conduct this research.
Table 2. Task description.

Task 1. View your scheduled medication/tests/procedures: participants should go to the Happening Soon page and read the details and times of upcoming medications and events.
Task 2. Check your most recent test results: participants should go to the My Health page and interpret the test results while giving feedback on the page.
Task 3. View your healthcare providers: participants should go to the Taking Care of Me page and comment on and explore the page's content.
Task 4. Communicate with your care team: participants should go to the Messages tab, write a message to send to the care team, and offer their opinions.
Task 5. Write a note to yourself: participants should go to the Notes tab, write a note, and provide feedback on the page.
Task 6. Request a patient education session: participants should go to the I Would Like page and request a patient education session while offering their opinions of the page.
Task 7. Read the diabetes education documents: participants should go to the To Learn page, then find and read the diabetes education packet and provide feedback.
Task 8. Add an event to your schedule: participants should go to the Happening Soon tab and add an event while describing their thoughts on the page.
3.4. Data collection procedure (usability sessions)
Participants were scheduled and interviewed individually. Sessions were led by at least one experienced investigator and supported by a student research assistant. Following informed consent, participants were asked to log in to a test patient account, explore the application, and then perform assigned tasks using MCB. Participants were asked to think aloud as they interacted with MCB and were prompted to elicit additional information on why or what they were doing. If they were unable to perform tasks due to a lack of understanding, we provided navigational hints and took note of the difficulties experienced. All sessions were recorded using a digital camera on a tripod in order to observe participants' hand movements on the tablet and their comments on specific features or screens. The assigned tasks are listed in Table 2. In addition, demographic information was collected, including age, race, gender, education level, and experience with computers. At the end of the session, participants received a $30 gift card in appreciation of their participation.

3.5. Data analysis

Videos generated from think-aloud protocol were analyzed using content analysis, a systematic and objective approach to describing and quantifying specific phenomena [56], to code observed events. The coding scheme was developed through an iterative process in which one experienced coder (PY) and two trained coders (JGS & MZ) collectively reviewed eight videos. The remaining sessions were then independently coded by two coders (JGS & MZ), with discrepancies discussed throughout the coding process. During video playback, Microsoft Excel was used to document and organize all coding and time stamps. Errors and instances of participants requesting assistance were noted. We evaluated participants' interactions with MCB by task completion and errors (effectiveness). Errors were defined as participants going to the wrong page, checking the wrong places, or clicking on the wrong button. We further categorized errors into operational errors, system errors, and tablet-related errors, and used Nielsen's ten heuristic principles [57] to classify them. Nielsen's heuristic principles are widely used for guidance in interaction design [48]. The classification of errors helped us determine issues and potential solutions for future redesign. We also assessed users' feedback (satisfaction) regarding learning and communication with providers in MCB. We did not consider task completion time (efficiency), as think-aloud protocol requires participants to express their thoughts during task performance, which generally prolongs task completion time.
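To make the aggregation step concrete, the following is a minimal sketch (not from the study) of how coded events could be summarized into the per-participant error statistics reported in the Results. It assumes a hypothetical CSV export of the coding spreadsheet, with one row per observed error and illustrative column names (participant_id, page, error_category).

```python
# Minimal sketch: summarize coded think-aloud events into error statistics.
# The file name and column names are hypothetical, not from the study.
import csv
from collections import Counter
from statistics import mean, stdev

with open("coded_events.csv", newline="") as f:
    events = list(csv.DictReader(f))

# Per-participant counts within one error category (e.g., operational).
op_counts = Counter(
    e["participant_id"] for e in events if e["error_category"] == "operational"
)
counts = list(op_counts.values())
print(f"operational errors: total = {sum(counts)}, "
      f"mean = {mean(counts):.1f} +/- {stdev(counts):.1f}, "
      f"range = {min(counts)}-{max(counts)}")

# Error counts by MCB page, analogous to Fig. 3.
by_page = Counter(e["page"] for e in events if e["error_category"] == "operational")
for page, n in by_page.most_common():
    print(f"{page}: {n}")
```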
4. Results

A total of 19 participants participated in the study (Table 3). Eleven (58.9%) participants were female, and ages ranged from 18 to 74 years, with a mean of 51.4 years. Sixteen (84.2%) participants had experience using a tablet; the mean comfort level using a tablet was 2.8 ± 2.4 on a scale of 1–10 (1 = very comfortable, 10 = not at all comfortable). Sixteen (84.2%) participants had experience using a PHR, with a mean comfort level of 2.9 on the same scale. The average duration of the usability sessions was 50.6 min.

Table 3. Demographics of Study Participants.

Age
  <40: 6 (31.6%)
  40–60: 4 (21.1%)
  >60: 9 (47.3%)
Gender
  Female: 11 (58.9%)
  Male: 8 (41.1%)
Education
  Some college but no degree: 5 (26.3%)
  Technical degree or certification: 1 (5.2%)
  Bachelor: 4 (21.1%)
  Master: 5 (26.3%)
  Doctoral: 2 (10.7%)
  Professional degree: 1 (5.2%)
  Prefer not to answer: 1 (5.2%)
Computer Use
  Daily: 18 (94.8%)
  A few times/month: 1 (5.2%)
Own a Tablet
  Yes: 16 (84.2%)
  No: 3 (15.8%)
Tablet Comfort Level (n = 17)
  Low (7–10): 3 (17.6%)
  Medium (4–6): 2 (11.8%)
  High (1–3): 12 (70.6%)
PHR Comfort Level (n = 17)
  Low (7–10): 2 (11.8%)
  Medium (4–6): 2 (11.8%)
  High (1–3): 13 (76.4%)

4.1. Interacting with MyChart Bedside

4.1.1. Task completion

All participants were able to complete all assigned tasks. However, 8 (42.1%) participants needed help a total of 12 times. The three tasks participants needed the most help with were Tasks 2, 7, and 8. When asked to check their lab results (Task 2), My Health was not an intuitive term for participants, causing them to struggle to find the lab results. When completing the To Learn page (Task 7), three participants struggled to exit the page. When asked to manage their own schedule by adding a family event (Task 8), three participants were lost and could not complete the task. Among the 8 participants who needed
assistance, six were above 60 years old. We offered help through obvious hints, or by specifically telling participants where to go if they had been struggling for a significant period of time.

4.1.2. Errors

We categorized errors into operational errors, system errors, and tablet-related errors. Operational errors were due to incorrect operation of MCB by users, such as going to a wrong page, clicking on a wrong button, or assuming a non-existent function. System errors were due to system malfunctions in MCB, such as being directed to a wrong page unexpectedly. Tablet-related errors were associated with users being unfamiliar with tablets, such as not knowing how to turn up the volume or how to use the back button on an Android tablet. Fig. 2 shows the error categories with counts and examples.

Fig. 2. Error Categories and Examples.

4.1.2.1. Operational errors. There were a total of 224 operational errors (Fig. 2). The mean count per participant was 11.8 ± 8.5 (SD), ranging from 2 to 27 errors. Participants encountered the most errors on the To Learn, Home, and Messages pages, though errors varied by participant and page (Fig. 3). Thirty-four (64.2%) errors on the To Learn page involved trouble closing the page. Specifically, 13 errors related to clicking on the medical center logo in an attempt to return to the MCB Home page, which instead took participants to the medical center webpage. Another 13 errors occurred while trying to exit the To Learn page using the page navigators within the electronic book. On the Home page, 17 (42.5%) errors were navigational errors, where participants were not able to find the tab associated with the assigned task. Similar to the To Learn page, 14 (36.8%) errors on the Home page involved closing pages, including 11 errors due to assuming that the grey arrows (page navigators) at the bottom of the screen were back arrows. On the Messages page, 8 (32.0%) errors were navigational, and 15 (60.0%) related to assumed non-existent functionality: patients thought they could send messages to specific clinicians on their care team.

Fig. 3. Error Count by Study Participant and by Each MCB Page.

We also found that the number of errors encountered varied by age (Fig. 4). Participants below 40 had the lowest error counts. Participants above 60 had the highest average error count but also the most diverse individual error counts: this group included the four participants with the highest numbers of errors and four participants with the lowest, demonstrating that the above-60 age group had the highest variability in technology comfort and competency, although they tended to have lower technology competency than the younger age groups.
Fig. 4. Number of Operational Errors by Age.
4.1.2.2. System errors. There were 68 system errors, experienced by 13 participants. The mean error count among these 13 participants was 5.3 ± 6.3 (SD), ranging from 1 to 25 errors. We did not see any age-related difference in system errors. Twenty-nine (42.6%) were errors such as non-functional zoom-in, videos not playing or stopping, or slow page loading. Twenty-two (32.4%) errors related to users being logged out of the application unexpectedly. Seventeen (25.0%) errors were sensitivity issues, where the application did not respond to users' tapping until after several attempts.

4.1.2.3. Tablet-related errors. Nine tablet-related errors were made by six participants. Older participants committed most of these errors: three of the six were above 60 years old and two were between 40 and 60 years old. The mean error count among these six participants was 1.5 ± 0.5 (SD), ranging from 1 to 2 errors. Six (66.7%) errors were issues related to being unfamiliar with the on-screen keyboard, two (22.2%) related to not finding the volume button, and one (11.1%) related to trouble with camera positioning and use.

Table 4 categorizes all errors identified in MCB using Nielsen's ten heuristic principles [57]. We found the most numerous violations to be Match between System and Real World, and Consistency and Standards. The redesign of inpatient portals should pay attention to violations of these heuristic principles to achieve a more intuitive user interaction.

4.2. Learning to use MyChart Bedside

MCB provided an 11-min tutorial video for participants to learn how to use MCB, but not all participants found it useful. Among the 17 (89.5%) participants who went to the tutorial page, 12 (63.1%) started to watch the video, but only 5 (26.3%) watched it in its entirety. A participant who stopped the video midway wanted it to "get to the point" and said that she "could probably do it faster [by exploring]." Eight (42.1%) participants indicated that the video was too long, and 4 (21.0%) suggested that it be split up by tabs or into smaller segments to make it easier to find the instructions they needed. Eight (42.1%) participants enjoyed the tutorial; one described it as "very detailed, long but explained content thoroughly, good video." Many participants proposed alternative learning approaches to those provided with MCB, including self-guided learning (free exploration), in-person instruction, and guided handouts.

4.2.1. Self-guided learning

Six (31.6%) participants expressed a preference for self-guided learning; three of them were under 40 years old. They considered MCB "easy to go through" and "did not need any further instructions to use the app."

4.2.2. In-person instruction

Eight (42.1%) participants expressed a preference for in-person instruction; six of them were above 60 years old. A participant commented, "It'd be best if somebody came to go through the basics…or show my caregiver or my family member how to use it, and they can help me." Another participant commented that "[MCB] is hard to figure out how to send a message without any tablet or computer experience. […] should have someone to show people how to do it."

4.2.3. Guided handouts

Five (26.3%) participants requested a guided handout explaining MCB pages and functionalities; four of them were above 60 years old. A participant commented, "You might want to also have a very basic fact sheet…even though you've got a tutorial…Something that says what's in each tab."

Overall, participants were roughly equally divided in their preferences for self-guided learning, in-person instruction, and guided handouts. However, half of the participants below 40 years old preferred self-guided learning, while two-thirds of participants over 60 preferred in-person instruction or guided handouts. A third of those over 60 also had concerns that they might not be able to learn and use MCB when not feeling well. One participant commented, in reference to the extensive educational material, "First page I had [on the To Learn tab] was very nice and has a lot of information, but if I'm lying there scared and in pain…I don't think I want to read a ton on diabetes." Another, concerned about his lack of motivation to learn MCB while sick in the hospital, commented that "[MCB] is confusing to understand if I was in the hospital… going to take a while to learn the tablet." Another participant had trouble using MCB and commented that she was "[…] concerned about not being able to use the tablet if I was on pain meds, very sick, or not energetic."

4.3. Communicating with providers

When asked about communicating with their providers through the Messages function, participants had mixed opinions. The Messages function allows users to send messages to the entire care team, but 16 (84.2%) participants stated that they would prefer to choose who received a message. Participants described messaging the whole care team as "overkill"; as one put it, "if I needed a pillow I would only want to send it to my nurse." Sixteen (84.2%) also felt confused about when to use Messages, what to request, and who would respond: "I assume a nurse or one of care team members will respond, but I don't know when." Nine (47.4%) specifically suggested the need to know the status of the communication, for example, whether the message had been read, who would respond, and when their request would be fulfilled. One participant said they would be more inclined to call: "I'm not really crazy about the messages thing. To me that's like e-mailing back and forth with somebody over the course of a day when you talk with somebody for 3 minutes you'd be done." We did not see any age-related differences in communicating with providers.
Table 4. Usability Problem Summary. For each of Nielsen's heuristic principles: total error count (n), definition, and itemized error counts with examples.

Visibility of System Status (n = 7): notifying users about system status through feedback, such as the loading status of a page.
  5: The tutorial video did not show the video loading status.
  2: Pages did not load and did not show a loading status.

Match between System and Real World (n = 82): the system should use language, concepts, features, and functionalities that users are familiar with from the real world and other technologies.
  3: Could not find the volume button to increase or lower the video volume.
  6: Had problems using the keyboard with the accents, apostrophes, space bar, and return key.
  9: Tried to zoom the screen using two fingers, but the zoom functionality was not available in MCB.
  18: Assumed common smartphone functionalities, such as tapping icons to open more information.
  22: Incorrectly tried going to the previous page by clicking the arrows that moved the pages forward and backward in the educational book.
  24: Incorrectly tried to go to the MCB Home page by clicking the institution logo.

User Control and Freedom (n = 2): provide users the option to undo their choices within the system, such as undo and exit buttons, to promote user control of mistakes.
  2: Did not find a place to exit a page or go back to a previous page and was stuck.

Consistency and Standards (n = 120): use consistent language and designs so users are not confused about what actions or words mean.
  2: Misinterpreted the maximize window button as a button to exit the page.
  118: Went to the wrong page after misunderstanding the page title or expecting a feature to be under a page where it was not.

Error Prevention (n = 27): precise system design that avoids common user problems by eliminating error-prone situations through thoughtful navigation and design.
  17: Touch screen was not sensitive enough, and participants had trouble tapping the buttons to make them work.
  10: Clicking the schedule tab from the sidebar sent the user to the wrong page.

Recognition Rather than Recall (n = 15): the system should be usable without instructions and should be simple to remember and understand.
  8: Had trouble with the buttons used for saving and exiting a note.
  4: Could not find the correct buttons to complete a task, such as the "add" or "submit" buttons.
  3: Could not tell which day they were viewing on the calendar.

Flexibility and Efficiency of Use (n = 17): the system should be designed so that both inexperienced and experienced users can have meaningful system interactions.
  15: Tried to message an individual provider instead of sending a message to the whole care team.
  2: Tapped the sidebar in the video to try to skip to different sections, but this was not an available functionality.

Aesthetic and Minimalist Design (n = 5): only include relevant information, buttons, and designs.
  5: Gray arrows, which were not functional, looked like back and forward buttons between pages and confused participants.

Help Users Recognize, Diagnose, and Recover from Errors (n = 24): display error messages that help users recover from errors by explaining the problem and solutions.
  2: Error messages were not clear on the To Learn page concerning the bookshelf availability.
  22: Forced logouts confused participants and needed an explanation and error message.

Help and Documentation (n = 2): provide assistance when users are confused or lost, often in the form of a help button.
  1: Clicked on the help button and nothing showed up.
  1: Clicked on the "I have questions" button and nothing was offered.
5. Discussion

Patient portals can enhance patient engagement [58]. However, poor design can greatly impact how patients navigate and comprehend information in a portal, resulting in a frustrating experience [59]. In our study, we identified common errors, categorized error types, and indicated their violations of Nielsen's heuristic principles [57]. Our results provide insight for application redesign and for other organizations implementing inpatient portals. For example, operational errors could be eliminated by redesigning the interface to be more intuitive and user-friendly. Designing features based on the needs of older adults is also key to maximizing the usability of patient portals for this group [45]. System errors could be corrected by the development team and incorporated in a system upgrade. Tablet-related errors could be reduced with proper training in tablet use. Although some participants may report a high comfort level with tablets, courteous reminders or a basic tablet tutorial would be helpful for individuals, particularly older adults, who are less familiar with or have less confidence in using new technology. It has been recommended that healthcare organizations investigate and incorporate proper tablet and application training for patients in order to maximize their use of patient portals [4,43,60].

Even though a tutorial video was provided as the training material for using the inpatient portal, some participants did not respond positively to this type of learning. We suggest offering multiple forms of learning materials to address varying learning preferences. This would accommodate individuals' learning habits and needs, and improve their ability to use the portal.

Our participants, like patients in previous studies, expressed a preference to send individual messages to specific care providers instead of messaging the whole care team [18,61]. However, clinical workflow in the inpatient environment could be too complex for private messaging. Participants also wanted to know whether a message had been read, who would respond, and when their request would be fulfilled. While immediate responses or status updates would increase patient satisfaction, they might add to clinicians' workload. Providers have expressed concerns about the impact of patient portals on patient-provider communication and workflow, including patients misusing messages for serious health concerns such as not being able to breathe [21]. Although one study reported a high usage rate (72.9%) of the messaging function, it is unclear when and how the function was used [26]. It is essential to ensure that patients use the messaging function appropriately. It should also be noted that patient portals should not replace regular in-person patient-provider communication, but rather provide a means to increase patients' engagement with their own health [43]. Future studies are needed to investigate workflow and communication from a sociotechnical perspective [62] to understand proper patient-provider communication through inpatient portals.

Existing studies have reported that patients are less likely to use inpatient portals when they are not feeling well [17]. Although our participants were not inpatients during the evaluation, they reported similar concerns that it would be challenging to learn and use MCB when in pain or lacking energy. Lastly, usability evaluation can help assess whether patients can easily adopt patient portals [59]. As the effectiveness of patient portals in improving health outcomes is still inconclusive [7,8,63], patients' perceived benefits have to outweigh the cost of learning the new technology in order to maximize the usage of patient portals [43].

Summary points
What was already known on the topic
• Patient portals designed for inpatients have the potential to increase patient engagement.
• The design of inpatient portals can impact how patients navigate and comprehend information in inpatient portals.

What this study added to our knowledge
• The role of the inpatient portal in patient-provider communication should be clearly defined in order to maximize the use of an inpatient portal.
• Participants preferred to message their providers individually.
• Participants' learning styles varied, with age as a potential factor that influenced how they learned and used an inpatient portal.
5.1. Study limitations

Our study is subject to several limitations. First, the small sample size (n = 19) precludes statistical conclusions; we were not able to examine demographic or socioeconomic factors that might impact the use of the inpatient portal. However, a minimum of 15 participants has been recommended for revealing 80–95% of usability issues [64], so our sample size was sufficient to detect usability issues for future improvement of MCB and similar inpatient portals. In addition, our participants were volunteers who were not inpatients during the usability sessions; their interactions and feedback may differ from those of hospitalized patients, although half of our participants had been inpatients in the past. Lastly, the study aimed to identify usability problems at the Level 2 evaluation [46] for technology improvement, so it did not address organizational or environmental issues that would influence real-world usage of the inpatient portal. Future studies should investigate workflow and system impact in a real setting.
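As background for the sample-size recommendation cited above, usability sample-size discussions commonly rely on a cumulative problem-discovery model in which the proportion of problems found by n users is 1 − (1 − p)^n, where p is the probability that a single user encounters a given problem. The sketch below illustrates that general model only; it is not a calculation from [64], and the p values are hypothetical.

```python
# Cumulative problem-discovery model: share of usability problems expected
# to surface after n test users, for several hypothetical per-user
# detection probabilities p. Illustrative only; not derived from [64].
n = 15
for p in (0.15, 0.25, 0.31):
    found = 1 - (1 - p) ** n
    print(f"p = {p:.2f}: about {found:.0%} of problems found after {n} users")
```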
6. Conclusion

We conducted a usability evaluation of a commercial inpatient portal and identified usability problems that can be used to facilitate technology upgrades. Usability issues should be addressed to make inpatient portals more user-friendly. In addition, as some errors resulted from insufficient user knowledge of the system and tablet technology, providing various options for learning can accommodate individuals' different learning habits and needs and improve their interaction with inpatient portals. Organizations should clearly define the role of an inpatient portal with regard to patient-provider communication and continuously assess its impact on patient satisfaction and clinician workflow. Organizations should also work closely with the development team so that system design, upgrades, and training are adjusted in the best interests of inpatients.

Authors' contributions

PY and ASM designed the study. PY, DW, JGS, MPZ, and TLM collected the data. PY, JGS, and MPZ analyzed the data and drafted the manuscript. DW, TLM, and ASM provided feedback. All authors read and approved the final manuscript.

Conflicts of interest

The authors report no conflicts of interest.

Acknowledgements

The authors are extremely grateful to the informants who participated in this study. We also thank our research team members who assisted at various stages of this project: Megan Chamberlain, Sharon Cross, and Pamela Thompson. This work was supported by the Agency for Healthcare Research and Quality (AHRQ) Grants R01 HS024091-01 and R21 HS024349-01, as well as AHRQ Grant P30-HS024379 through The Ohio State University Institute for the Design of Environments Aligned For Patient Safety (IDEA4PS). While this research was funded by AHRQ, the study sponsor had no involvement in the collection, analysis, or interpretation of data; in the writing of this manuscript; nor in the decision to submit the manuscript for publication.

References

[1] Office of the National Coordinator, What is a Patient Portal? (2017). Available from: https://www.healthit.gov/providers-professionals/faqs/what-patient-portal.
[2] Office of the National Coordinator, What is a Personal Health Record? (2017). Available from: https://www.healthit.gov/providers-professionals/faqs/what-personal-health-record.
[3] F. Pinciroli, C. Pagliari, Understanding the evolving role of the Personal Health Record, Comput. Biol. Med. 59 (2015) 160–163.
[4] C.S. Kruse, K. Bolton, G. Freriks, The effect of patient portals on quality outcomes and its implications to meaningful use: a systematic review, J. Med. Internet Res. 17 (2) (2015) e44.
[5] C.S. Kruse, D.A. Argueta, L. Lopez, A. Nair, Patient and provider attitudes toward the use of patient portals for the management of chronic disease: a systematic review, J. Med. Internet Res. 17 (2) (2015) e40.
[6] J.E. Prey, J. Woollen, L. Wilcox, A.D. Sackeim, G. Hripcsak, S. Bakken, et al., Patient engagement in the inpatient setting: a systematic review, J. Am. Med. Inf. Assoc. 21 (4) (2014) 742–750.
[7] M. Rigby, A. Georgiou, H. Hypponen, E. Ammenwerth, N. de Keizer, F. Magrabi, et al., Patient portals as a means of information and communication technology support to patient-centric care coordination – the missing evidence and the challenges of evaluation. A joint contribution of IMIA WG EVAL and EFMI WG EVAL, Yearb. Med. Inf. 10 (1) (2015) 148–159.
[8] E. Ammenwerth, P. Schnell-Inderst, A. Hoerbst, The impact of electronic patient portals on patient care: a systematic review of controlled trials, J. Med. Internet Res. 14 (6) (2012) e162.
[9] B. Hattink, R.M. Droes, S. Sikkes, E. Oostra, A.W. Lemstra, Evaluation of the Digital Alzheimer Center: testing usability and usefulness of an online portal for patients with dementia and their carers, JMIR Res. Protoc. 5 (3) (2016) e144.
[10] S.R. Greysen, R.R. Khanna, R. Jacolbia, H.M. Lee, A.D. Auerbach, Tablet computers for hospitalized patients: a pilot study to improve inpatient engagement, J. Hosp. Med. 9 (6) (2014) 396–399.
[11] Centers for Medicare and Medicaid Services, Step 5: Achieve Meaningful Use Stage 2, HealthIT.gov (2015). Available from: https://www.healthit.gov/providers-professionals/achieve-meaningful-use/core-measures-2/patient-ability-electronically-view-download-transmit-vdt-health-information.
[12] Pew Research Center, Technology Device Ownership, (2015). Available from: http://www.pewinternet.org/2015/10/29/technology-device-ownership-2015/.
[13] R. Masterson Creber, J. Prey, B. Ryan, I. Alarcon, M. Qian, S. Bakken, et al., Engaging hospitalized patients in clinical care: study protocol for a pragmatic randomized controlled trial, Contemp. Clin. Trials 47 (2016) 165–171.
[14] M. Maher, D.A. Hanauer, E. Kaziunas, M.S. Ackerman, H. Derry, R. Forringer, et al., A novel health information technology communication system to increase caregiver activation in the context of hospital-based pediatric hematopoietic cell transplantation: a pilot study, JMIR Res. Protoc. 4 (4) (2015) e119.
[15] S.R. Greysen, Y. Magan Mendoza, J. Rosenthal, R. Jacolbia, A. Rajkomar, H. Lee, et al., Using tablet computers to increase patient engagement with electronic personal health records: protocol for a prospective, randomized interventional study, JMIR Res. Protoc. 5 (3) (2016) e176.
[16] A.S. McAlearney, C.J. Sieck, J.L. Hefner, A.M. Aldrich, D.M. Walker, M.K. Rizer, et al., High Touch and High Tech (HT2) proposal: transforming patient engagement throughout the continuum of care by engaging patients with portal technology at the bedside, JMIR Res. Protoc. 5 (4) (2016) e221.
[17] J. Woollen, J. Prey, L. Wilcox, A. Sackeim, S. Restaino, S.T. Raza, et al., Patient experiences using an inpatient personal health record, Appl. Clin. Inf. 7 (2) (2016) 446–460.
[18] L. Wilcox, J. Woollen, J. Prey, S. Restaino, S. Bakken, S. Feiner, et al., Interactive tools for inpatient medication tracking: a multi-phase study with cardiothoracic surgery patients, J. Am. Med. Inf. Assoc. 23 (1) (2016) 144–158.
[19] D.K. Vawdrey, L.G. Wilcox, S.A. Collins, S. Bakken, S. Feiner, A. Boyer, et al., A tablet computer application for patients to participate in their hospital care, AMIA Annu. Symp. Proc. 2011 (2011) 1428–1435.
[20] L. Vardoulakis, A. Karlson, D. Morris, G. Smith, J. Gatewood, D. Tan, Using mobile phones to present medical information to hospital patients, Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems (2012).
[21] K.J. O'Leary, R.K. Sharma, A. Killarney, L.S. O'Hara, M.E. Lohman, E. Culver, et al., Patients' and healthcare providers' perceptions of a mobile portal application for hospitalized patients, BMC Med. Inf. Decis. Mak. 16 (1) (2016) 123.
[22] K.J. O'Leary, M.E. Lohman, E. Culver, A. Killarney, G. Randy Smith Jr., D.M. Liebovitz, The effect of tablet computers with a mobile patient portal application on hospitalized patients' knowledge and activation, J. Am. Med. Inf. Assoc. 23 (1) (2016) 159–165.
[23] M.M. Kelly, P.L. Hoonakker, S.M. Dean, Using an inpatient portal to engage families in pediatric hospital care, J. Am. Med. Inf. Assoc. 24 (1) (2017) 153–161.
[24] P.C. Dykes, D.L. Carroll, A.C. Hurley, A. Benoit, F. Chang, R. Pozzar, et al., Building and testing a patient-centric electronic bedside communication center, J. Gerontol. Nurs. 39 (1) (2013) 15–19.
[25] E.L. Winstanley, M. Burtchin, Y. Zhang, P. Campbell, J. Pahl, S. Beck, et al., Inpatient experiences with MyChart Bedside, Telemed. J. E-Health 23 (8) (2017) 691–693.
[26] J.R. Robinson, S.E. Davis, R.M. Cronin, G.P. Jackson, Use of a patient portal during hospital admissions to surgical services, AMIA Annu. Symp. Proc. 2016 (2016) 1967–1976.
[27] L. Runaas, D. Hanauer, M. Maher, E. Bischoff, A. Fauer, T. Hoang, et al., BMT Roadmap: a user-centered design health information technology tool to promote patient-centered care in pediatric hematopoietic cell transplantation, Biol. Blood Marrow Transplant. 23 (5) (2017) 813–819.
[28] E.L.-C. Law, V. Roto, M. Hassenzahl, A.P.O.S. Vermeeren, J. Kort, Understanding, scoping and defining user experience: a survey approach, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, ACM, 2009, pp. 719–728.
[29] N. Bevan, What is the difference between the purpose of usability and user experience evaluation methods? Proceedings of the Workshop UXEM (2009).
[30] ISO 9241-11, Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs) – Part 11: Guidance on Usability, (1998).
[31] A. Abran, A. Khelifi, W. Suryn, A. Seffah, Usability meanings and interpretations in ISO standards, Softw. Qual. J. 11 (4) (2003) 325–338.
[32] J. Bennett, Visual Display Terminals: Usability Issues and Health Concerns, Prentice-Hall, Englewood Cliffs, New Jersey, 1984.
[33] A. Dix, J. Finlay, G. Abowd, R. Beale, Human-Computer Interaction, Pearson Education Limited, Harlow, 1998.
[34] J. Nielsen, Usability Engineering, Academic Press, Cambridge, 1993.
[35] J. Preece, Y. Rogers, D. Benyon, S. Holland, T. Carey, Human Computer Interaction, Addison-Wesley, Wokingham, 1994.
[36] B. Shackel, Usability – context, framework, definition, design and evaluation, in: B. Shackel, S.J. Richardson (Eds.), Human Factors for Informatics Usability, Cambridge University Press, New York, NY, 1991, pp. 21–37.
[37] B. Shneiderman, Designing the User Interface, Addison-Wesley Publishing Company, 1998.
[38] J.S. Dumas, J.C. Redish, A Practical Guide to Usability Testing, revised ed., Intellect Ltd, Portland, 1999.
[39] P. Yao, P.N. Gorman, Discount usability engineering applied to an interface for web-based medical knowledge resources, Proc. AMIA Symp. (2000) 928–932.
[40] M.E. Wiklund, Usability in Practice, Academic Press, Cambridge, 1994.
[41] J. Barron, M. Bedra, J. Wood, J. Finkelstein, Exploring three perspectives on feasibility of a patient portal for older adults, Stud. Health Technol. Inf. 202 (2014) 181–184.
[42] S. Brahmandam, W.C. Holland, S.A. Mangipudi, V.A. Braz, R.P. Medlin, K.M. Hunold, et al., Willingness and ability of older adults in the emergency department to provide clinical information using a tablet computer, J. Am. Geriatr. Soc. 64 (11) (2016) 2362–2367.
[43] C. Latulipe, A. Gatto, H.T. Nguyen, D.P. Miller, S.A. Quandt, A.G. Bertoni, et al., Design considerations for patient portal adoption by low-income, older adults, Proc. SIGCHI Conf. Hum. Factor Comput. Syst. 2015 (2015) 3859–3868.
[44] J. Taha, J. Sharit, S.J. Czaja, The impact of numeracy ability and technology skills on older adults' performance of health management tasks using a patient portal, J. Appl. Gerontol. 33 (4) (2014) 416–436.
[45] A. Turner, K. Osterhage, J. Joe, A. Hartzler, L. Lin, G. Demiris, Use of patient portals: personal health information management in older adults, Stud. Health Technol. Inf. 216 (2015) 978.
[46] P.Y. Yen, S. Bakken, Review of health information technology usability study methodologies, J. Am. Med. Inf. Assoc. 19 (3) (2012) 413–422.
[47] K.A. Ericsson, H.A. Simon, Verbal reports as data, Psychol. Rev. 87 (3) (1980) 215–251.
[48] C. Lewis, Using the Think Aloud Method in Cognitive Interface Design, IBM, New York, 1982.
[49] M.W. Jaspers, A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence, Int. J. Med. Inf. 78 (5) (2009) 340–353.
[50] A. Holzinger, Usability engineering methods for software developers, Commun. ACM 48 (1) (2005) 71–74.
[51] T. Cohen, D. Kaufman, T. White, G. Segal, A.B. Staub, V. Patel, et al., Cognitive evaluation of an innovative psychiatric clinical knowledge enhancement system, Stud. Health Technol. Inf. 107 (Pt. 2) (2004) 1295–1299.
[52] L.W.P. Peute, M.W.M. Jaspers, The significance of a usability evaluation of an emerging laboratory order entry system, Int. J. Med. Inf. 76 (2–3) (2007) 157–168.
[53] N. Elhadad, K. McKeown, D. Kaufman, D. Jordan, Facilitating physicians' access to information via tailored text summarization, AMIA Annu. Symp. Proc. 2005 (2005) 226–230.
[54] H. Yu, M. Lee, D. Kaufman, J. Ely, J.A. Osheroff, G. Hripcsak, et al., Development, implementation, and a cognitive evaluation of a definitional question answering system for physicians, J. Biomed. Inf. 40 (3) (2007) 236–251.
[55] J. Horsky, D.R. Kaufman, V.L. Patel, The cognitive complexity of a provider order entry interface, AMIA Annu. Symp. Proc. 2003 (2003) 294–298.
[56] B. Downe-Wamboldt, Content analysis: method, applications, and issues, Health Care Women Int. 13 (3) (1992) 313–321.
[57] J. Nielsen, Ten Usability Heuristics, (2005). Available from: http://www.useit.com/papers/heuristic/heuristic_list.html.
[58] J.S. Holtrop, W. Corser, G. Jones, G. Brooks, M. Holmes-Rovner, M. Stommel, Health behavior goals of cardiac patients after hospitalization, Am. J. Health Behav. 30 (4) (2006) 387–399.
[59] T. Irizarry, A. DeVito Dabbs, C.R. Curran, Patient portals and patient engagement: a state of the science review, J. Med. Internet Res. 17 (6) (2015) e148.
[60] A.L. Laccetti, B. Chen, J. Cai, S. Gates, Y. Xie, S.J. Lee, et al., Increase in cancer center staff effort related to electronic patient portal use, J. Oncol. Pract. 12 (12) (2016) e981–e990.
[61] J.N. Haun, J.D. Lind, S.L. Shimada, T.L. Martin, R.M. Gosline, N. Antinori, et al., Evaluating user experiences of the secure messaging tool on the Veterans Affairs' patient portal system, J. Med. Internet Res. 16 (3) (2014) e75.
[62] D.F. Sittig, H. Singh, A new sociotechnical model for studying health information technology in complex adaptive healthcare systems, Qual. Saf. Health Care 19 (Suppl. 3) (2010) i68–i74.
[63] C.L. Goldzweig, G. Orshansky, N.M. Paige, A.A. Towfigh, D.A. Haggstrom, I. Miake-Lye, et al., Electronic patient portals: evidence on health outcomes, satisfaction, efficiency, and attitudes: a systematic review, Ann. Intern. Med. 159 (10) (2013) 677–687.
[64] L. Faulkner, Beyond the five-user assumption: benefits of increased sample sizes in usability testing, Behav. Res. Methods Instrum. Comput. 35 (3) (2003) 379–383.