IG-3721-16-0003 - UL Transaction Security

Transcription

Test Results Summary for 2014 Edition EHR Certification, 16‐3721‐R‐0003‐PRA V1.0, March 3, 2016

2.2 Gap Certification

The following identifies the criterion or criteria certified via gap certification (§170.314): (a)(1), (a)(6), (a)(7), (a)(17), (a)(18), (a)(19), (a)(20), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1), (f)(7)*

*Gap certification allowed for Inpatient setting only

No gap certification: (h)(1), (h)(2)

2.3
Inherited Certification

The following identifies the criterion or criteria certified via inherited certification (§170.314): (a)(1)–(a)(15), (a)(16) (Inpt. only), (a)(17) (Inpt. only), (a)(18)–(a)(20), (b)(1)–(b)(5), (b)(6) (Inpt. only), (b)(7)–(b)(9), (c)(1)–(c)(3), (d)(1)–(d)(8), (d)(9) (Optional), (e)(1), (e)(2) (Amb. only), (e)(3) (Amb. only), (f)(1)–(f)(3), (f)(4) (Inpt. only), (f)(5) (Optional & Amb. only), (f)(6) (Optional & Amb. only), (f)(7) (Amb. only), (g)(1)–(g)(4), (h)(1)–(h)(3)

©2016 InfoGard. May be reproduced only in its original entirety, without revision

3.2.2
Test Tools

Test Tool | Version:
• Cypress | 2.6.1
• ePrescribing Validation Tool | 1.0.6
• HL7 CDA Cancer Registry Reporting Validation Tool | 1.8.2
• HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.7.2
• HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.7.2
• HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 182
• HL7 v2 Syndromic Surveillance Reporting Validation Tool | 3.0.4
• Transport Testing Tool
• Direct Certificate Discovery Tool
• Edge Testing Tool

3.2.3
Test Data

Alteration (customization) to the test data was necessary and is described in Appendix A.

3.2.4
Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that have been successfully tested where more than one standard is permitted.

Criterion # | Standard(s) Successfully Tested:
• (a)(8)(ii)(A)(2): §170.204(b)(1) HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain; §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide
• (a)(13): §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
• (a)(15)(i): §170.204(b)(1) HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain; §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide
• (a)(16)(ii): §170.210(g) Network Time Protocol Version 3 (RFC 1305); §170.210(g) Network Time Protocol Version 4 (RFC 5905)
• (b)(2)(i)(A): §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions; §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
• (b)(7)(i): §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions; §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
• (e)(1)(i): Annex A of the FIPS Publication 140‐2: TLS RSA; CBC SHA 128
• (e)(1)(ii)(A)(2): §170.210(g) Network Time Protocol Version 3 (RFC 1305)
• (e)(3)(ii): §170.210(g) Network Time Protocol Version 4 (RFC 5905); Annex A of the FIPS Publication 140‐2: TLS RSA; CBC SHA 128
• Common MU Data Set (15): §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT‐4)

3.2.4.2
Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested: No newer version of a minimum standard was tested.

3.2.5
Optional Functionality

Criterion # | Optional Functionality Successfully Tested:
• (a)(4)(iii): Plot and display growth charts
• (b)(1)(i)(B): Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
• (b)(1)(i)(C): Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
• (b)(2)(ii)(B): Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
• (b)(2)(ii)(C): Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
• (f)(3): Ambulatory only – Create syndrome‐based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
• Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
• Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD‐10‐PCS)

3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria #: (a)(1)–(a)(15), (a)(16) (Inpt. only), (a)(17) (Inpt. only), (a)(18)–(a)(20), (b)(1)–(b)(5), (b)(6) (Inpt. only), (b)(7)–(b)(9), (c)(1)–(c)(3), (d)(1)–(d)(8), (d)(9) (Optional), (e)(1), (e)(2) (Amb. only), (e)(3) (Amb. only), (f)(1)–(f)(3), (f)(4) (Inpt. only), (f)(5) (Optional & Amb. only), (f)(6) (Optional & Amb. only), (f)(7) (Amb. only), (g)(1)–(g)(4), (h)(1)–(h)(3)

Test Procedure (TP**) and Test Data (TD***) version numbers, in print order for the (a)–(b) column: 1.2, 1.2, 1.4, 1.4, 1.3, 1.3, 1.2, 1.3, 1.3, 1.2, 1.2, 1.5, 1.4, 1.3, 1.3, 1.3, 1.4, 1.7, 1.4, 1.4, 1.3, 1.4, 1.4, 1.6, 1.2, 1.4, 1.7.2, 1.5, 1.7; for the (c)–(h) column: 1.12, 1.12, 1.12, 2.6.1, 2.6.1, 2.6.1, 1.6, 1.3, 1.2, 1.2, 1.11, 1.2, 1.3, 1.5, 1.6, 1.3, 1.3, 1.3.0, 1.3.0, 1.1, 2.0, 1.4, 1.2, 1.1, 1.1, 1.1, 2.0

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)

3.2.7
2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested: Ambulatory

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs, CMS IDs as listed: 2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169, 177, 179, 182. Version entries in print order: v4, v3, v3, v4, v4, v4, v4, v3, v2; v3, v3, v3, v3, v3, v3, v4, v3, v3, v3; v3, v3, v3, v3, v4, v3, v4; v3, v3, v3, v3, v4, v3, v3

Inpatient CQMs, CMS IDs as listed (no versions recorded): 9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190

3.2.8
Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

3.2.9
Attestation

Attestation Forms (as applicable) | Appendix:
• Safety‐Enhanced Design* | B
• Quality Management System** | C
• Privacy and Security | D

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product

Appendix A: Alteration of Test Data

Criteria | Explanation:
• (b)(7) | Determined that modified Test Data had an equivalent level of robustness to the NIST Test Data
• (e)(1) | Determined that modified Test Data had an equivalent level of robustness to the NIST Test Data

Appendix B: Safety Enhanced Design

User Centered Design Process Attestation
EnableMyHealth
EHR
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Steve Rothschild
540.588.0016
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
Enablemyhealth User Centered Design Process Attestation ........................................................................ 2
Summary ....................................................................................................................................................... 2
Intended Purpose of Document.................................................................................................................... 2
Overview ....................................................................................................................................................... 2
Usability Definition ....................................................................................................................................... 3
Enablemyhealth UCD Process ....................................................................................................................... 3
Organizational ............................................................................................................................................... 5
Enabledoc LLC ©2015 Confidential and Proprietary
Version 1
1
Enablemyhealth User Centered Design Process Attestation
Summary
EnableMyHealth attests to the use and implementation of a User Centered Design process in the ongoing development of its EHR software. Our process follows the approach outlined in NISTIR 7741.
Reference:
National Institute of Standards and Technology. (2010). NIST Guide to the Processes Approach for
Improving the Usability of Electronic Health Records (NISTIR 7741). Gaithersburg, MD.
www.nist.gov/manuscript-publication-search.cfm?pub_id=907313
The details of the EnableMyHealth implementation of NISTIR 7741 are described in the remainder of this
document. This UCD approach has been applied in the development and measurement of the 7
ambulatory EHR criteria described in 170.314(g)(3) Safety-enhanced design. Specifically these are:
• CPOE (medications, labs and radiology orders)
• Drug-drug, drug-allergy interaction checks
• Medication list
• Medication allergy list
• Clinical decision support
• Electronic prescribing
• Clinical information reconciliation
The primary artifacts demonstrating this process are the summative usability test reports describing in
detail the usability testing conducted on the 7 safety enhanced design criteria.
Intended Purpose of Document
This document is to attest to the ONC that EnableMyHealth has applied a User Centered Design process
in the development of our EHR offering.
Overview
An established UCD process ensures that designed EHRs are efficient, effective, and satisfying to the
user. Such a process would also ensure an understanding of end user needs, workflows, and
environments. Users are engaged early and often to provide guidance and understanding of their work,
formative feedback during an iterative design and development process, and summative usability
testing to assess if performance goals have been met. Usability is defined by the International
Organization for Standardization (ISO) as “…the effectiveness, efficiency, and satisfaction with which
the intended users can achieve their tasks in the intended context of product use.” This definition
establishes a framework for setting usability goals and specific evaluation measures. For a given
application, measures of these attributes enable comparing the application‘s progress over time. UCD
is an iterative process that serves to continually improve the application. In each iteration, critical points
and issues are uncovered which can be improved upon and implemented in subsequent releases.
Established UCD processes are followed by organizations that have a culture of usability.
Usability testing is a core component of user-centered design. The point of a usability test is to improve the EHR, whether in its workflow, navigation, screen layout, interaction, or visual design. Early and frequent (formative) testing confirms or refutes design decisions at a time when they can still easily be changed. Usability testing also guides the development process, helping to avoid problems in the final software such as critical user errors, excessive time spent on a task, or unsuccessful task completion.
The EnableMyHealth UCD process can be illustrated as follows:
IMAGE
[From: NISTIR 7741. http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313]
The remainder of this document will outline this process as implemented at EnableMyHealth.
Usability Definition
The term “usability” appears repeatedly in this document. To ensure a common understanding, “usability” is defined as follows:
• Effectiveness – the degree to which users are able to successfully accomplish their tasks and goals. This is measured by collecting the task completion rate, the types and numbers of errors, and deviations from the optimal path.
• Efficiency – the time a task takes to complete successfully. The desired or optimal time is either contained within predetermined usability goals or considered acceptable by end users. This is measured by collecting time on task.
• Satisfaction – the degree to which users are frustrated or pleased, how learnable they consider the system, and the perceived ease of use. This is measured primarily by the SUS questionnaire and by a post-task single ease-of-use question.
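As an illustration only, the three measures defined above can be computed from per-participant task logs along these lines. The log values below are hypothetical, not data from this report:

```python
from statistics import mean, stdev

# Hypothetical per-participant log for one task:
# (completed, error_count, steps_observed, steps_optimal, seconds_on_task)
task_logs = [
    (True, 0, 20, 19, 80.0),
    (True, 2, 23, 19, 86.0),
    (True, 1, 21, 19, 84.0),
]

# Effectiveness: completion rate, errors, and deviation from the optimal path
completion_rate = 100.0 * sum(c for c, *_ in task_logs) / len(task_logs)
errors = [e for _, e, *_ in task_logs]
deviation_ratio = mean(obs / opt for _, _, obs, opt, _ in task_logs)

# Efficiency: time on task
times = [t for *_, t in task_logs]

print(f"completion {completion_rate:.1f}%, "
      f"errors {mean(errors):.2f} (SD {stdev(errors):.2f}), "
      f"path deviation {deviation_ratio:.2f}, "
      f"time {mean(times):.1f}s (SD {stdev(times):.1f})")
```

Satisfaction is scored separately from the SUS questionnaire responses.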
Enablemyhealth UCD Process
1. Understand user needs, workflows, and environments
a. This is accomplished in a variety of ways during the analysis and design stages of a feature
or enhancement. EnableMyHealth regularly conducts field visits to end user clients to
observe their work practices and conduct interviews to identify user needs and concerns.
Competitive analysis is also an ongoing activity to identify areas of opportunity.
EnableMyHealth also relies on internal subject matter experts who are able to inform the
design process.
b. EnableMyHealth has established several Physician Advisory Groups: one for clinicians that
primarily use the EHR and one that focuses on the Practice Management component. All
groups meet periodically to provide input and feedback on their work practices and how
our software integrates with and facilitates those practices. Areas of improvement are also
discussed.
2. Engage users early and often
User reaction and feedback is regularly sought during the development phase of the software. A
variety of methods are used including presentation and discussion of proposed designs (screen visual
design as well as interaction model), formative usability testing, surveys, and discussion forums.
EnableMyHealth is actively working to conduct these activities even earlier in the development
process, specifically during the initial design phase of a project.
3. Set usability objectives
As part of the process of starting work on a new feature or enhancement, a discussion occurs among
Product Management, Sales, and User Experience/Development groups to establish desired usability
goals and objectives.
4. Design from known behavior principles and UI patterns
i. The User Experience group consists of two usability and design professionals who collectively have over 40 years of experience in the field. Their skillsets comprise: healthcare workflow, computer science, technical writing, graphic design, and system architecture.
ii. The UI group owns the user experience of the products and as such produces or reviews all aspects of the design of the user interface. The UI design process focuses on:
1. Speed of performing functionality with an optimal number of steps
2. Screen and functional design consistency
3. Modern user controls
4. Windows/Web User Experience Guidelines
5. Obtain user feedback via formative usability studies
User feedback, either from members of the advisory group or internal users is obtained from
formative usability studies during the design and development phase of a feature. Currently this is
typically done by demonstrating screens and functional workflow. We obtain input from user tests and from selected customers and industry experts.
6. Adapt and modify designs iteratively
User feedback and other formative usability test results are presented and discussed with product
teams. User experience and functionality is discussed and modified for optimal usability and speed.
7. Measure performance against objectives
Once a feature is sufficiently coded to the point where relevant usability metrics can be measured
and collected, small user groups are conducted to assess performance and determine if the
solution meets usability goals. The design is reassessed if there is significant deviation.
Organizational
1. Data as a basis for decision
As much as possible, EnableMyHealth bases design decisions on data from both formative and comprehensive usability tests rather than on opinion alone. When competing design options
arise, a quick small-scale usability study is conducted to inform the appropriate design direction. The
specific type of study is determined by the research question. It may be as simple as a wording
preference question or as involved and rigorous as a summative study.
2. Management support
The User Experience group was founded approximately 4 years ago when EnableMyHealth had
attained a software development maturity level where management recognized the strategic
importance of usable products. Since then there has been a 150% growth in the size of the group as
workload increased, due in no small part to management championship. EnableMyHealth was also
acquired by a much larger corporation during this time period which already had user experience and
software usability as an established component of the development process.
3. Design team
The User Experience team consists of a Lead Developer who is responsible for the interaction model, screen layout, and both internal and external software behavior consistency, and
a designer who has primary responsibility for color palette, iconography, visual language, and screen
layout consistency. Our development group comprises approximately 10 to 15 software developers and testers that
have multiple roles and responsibilities. Product Management works closely with the User
Experience team to ensure proper work flow, inputs and outputs and provides the subject matter
expertise to ensure customer, regulatory, and legal requirements are met.
EHR Usability Test
Report of
Enablemyhealth
EHR
Medication
Allergy List
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1 Executive Summary ............ 2
2 INTRODUCTION ............ 3
3 METHOD ............ 4
3.1 PARTICIPANTS ............ 4
3.2 STUDY DESIGN ............ 5
3.3 TASKS ............ 5
3.4 PROCEDURES ............ 6
3.5 TEST LOCATION ............ 7
3.6 TEST ENVIRONMENT ............ 7
3.7 TEST FORMS AND TOOLS ............ 7
3.8 PARTICIPANT INSTRUCTIONS ............ 7
3.9 USABILITY METRICS ............ 8
3.10 DATA SCORING ............ 8
4 RESULTS ............ 10
4.1 DATA ANALYSIS AND REPORTING ............ 10
4.2 DISCUSSION OF THE FINDINGS ............ 10
4.3 EFFECTIVENESS ............ 11
4.4 EFFICIENCY ............ 11
4.5 SATISFACTION ............ 11
4.6 MAJOR FINDINGS ............ 11
4.7 AREAS FOR IMPROVEMENT ............ 11
5 APPENDICES ............ 12
Appendix 1: PARTICIPANT RECRUITING SCREENER ............ 12
Appendix 2: PARTICIPANT DEMOGRAPHICS ............ 15
Appendix 3: Informed Consent Form ............ 16
Appendix 4: Task Scenarios ............ 17
Appendix 5: Usability Test Administration Moderator’s Guide ............ 18
Appendix 6: Sample data collection form ............ 20
Appendix 7: System Usability Scale Questionnaire ............ 21
1 Executive Summary
A usability test of EnableMyHealth – 2014.1, an ambulatory EHR, was conducted on 15 Aug 2015 – 14
Sep 2015 in the McLean, VA and Rochester MN offices of EnableDoc LLC. The purpose of this test was to
test and validate the usability of the current user interface for the medication allergy list, and provide
evidence of usability in the EHR Under Test (EHRUT).
During the usability test, seven healthcare providers and clinical staff matching the target demographic
criteria served as participants and used the EHRUT in simulated, but representative tasks.
This study collected performance data on 3 medication allergies and 2 modified medication allergies in an EHR where the user is required to:
• Enter 3 medication allergies
• Modify 2 medication allergies
• Review allergy history
All test participants conducted the test sessions remotely via on-line conferencing software. During the
15 minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR as they
are current users/customers. No additional training materials were provided other than that usually
given to customers. The administrator introduced the test and instructed participants to complete a
series of tasks (given one at a time) using Enablemyhealth. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically.
The administrator did not give the participant assistance in how to complete the task. Participant
screens and audio were recorded for subsequent analysis. The following types of data were collected for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test SUS questionnaire and were compensated with a $50 gift card for their time.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix – refer to Appendix 4 for complete task descriptions

Measures per task:
• Create 3 medication allergies | Number: 3 | Task Success: 100.00 | Path Deviations (Observed/Optimal) Mean (SD): 1.1 (21.3/19) | Task Time Mean (SD): 84 (2.8) | Task Time Deviations (Observed/Optimal): 1.1 (289/275) | Errors Mean (SD): 1.86 (1.1) | Task Rating (5=Easy) Mean (SD): 3.7 (0.5)
• Make inactive 2 medication allergies | Number: 2 | Task Success: 100.00 | Path Deviations: 1.1 (16.8/16) | Task Time Mean (SD): 75 (2.7) | Task Time Deviations: 1.3 (75/60) | Errors Mean (SD): 0.57 (0.7) | Task Rating Mean (SD): 4.1 (0.6)
• Review Medication Allergy List | Number: 1 | Task Success: 100.00 | Path Deviations: 2.1 (2.1/1) | Task Time Mean (SD): 14 (7.7) | Task Time Deviations: 1.2 (18/15) | Errors Mean (SD): 0.43 (0.7) | Task Rating Mean (SD): 4.7 (0)
• Mean across tasks | Task Success: 100.00 | Path Deviations: 1.4 (13.4/12) | Task Time Mean (SD): 57.7 (4.5) | Task Time Deviations: 1.2 (128/116) | Errors Mean (SD): 1 (0.9) | Task Rating Mean (SD): 4.2 (0.4)
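As an arithmetic cross-check, the "Mean across tasks" row appears to be the unweighted mean of the three per-task values; a short sketch using values from the matrix:

```python
from statistics import mean

# Per-task values taken from the result data matrix
task_time_means = [84, 75, 14]          # seconds
path_dev_ratios = [1.1, 1.1, 2.1]       # observed/optimal
error_means     = [1.86, 0.57, 0.43]
rating_means    = [3.7, 4.1, 4.7]       # 5 = Easy

print(round(mean(task_time_means), 1))  # 57.7, matching the matrix
print(round(mean(path_dev_ratios), 1))  # 1.4
print(round(mean(error_means), 2))      # 0.95 (reported as 1)
print(round(mean(rating_means), 1))     # 4.2
```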
The System Usability Scale score for subjective satisfaction with the system, based on performance of these tasks, was 81.4.
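For reference, standard SUS scoring works as follows: odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5 to yield a 0–100 score per respondent. A minimal sketch; the responses shown are hypothetical, not this study's raw data:

```python
from statistics import mean

def sus_score(responses):
    """Score one respondent's ten SUS ratings (1-5 Likert, item 1 first)."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # odd items sit at even indexes
                for i, r in enumerate(responses))
    return total * 2.5  # scale 0-40 raw points to 0-100

# Hypothetical respondents
participants = [
    [5, 2, 4, 1, 5, 2, 5, 1, 4, 2],
    [4, 2, 4, 2, 4, 1, 4, 2, 4, 2],
]
scores = [sus_score(r) for r in participants]
print(scores)                  # [87.5, 77.5]
print(round(mean(scores), 1))  # 82.5
```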
In addition to the performance data, the following qualitative observations were made:

Major findings
• Issues with adding medication allergies:
1. Providers tend to just want to enter the allergies and not want to provide other information.
2. Editing allergies required existing allergies to be made inactive, which was more time consuming.

Areas for improvement
• Allow editing of an existing allergy without making it inactive.
• Allow default settings for medication allergies.
• Use speech recognition and NLP to make entering data more natural.

2 INTRODUCTION
The EHRUT tested for this study was the EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to present medical information to healthcare providers in primarily group practices, with a focus on family practice, surgery, eye, and PT/OT/chiropractic medicine, the EHRUT consists of modules and features that include, but are not limited to, support for the tested functionality:
· Patient demographics
· Patient problem and allergy lists
· Creating lab orders and receiving lab results
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, LPNs, etc.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability of the entry and modification of medication allergies in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as successful task completion rate, time on task, number and types of errors, and participant satisfaction, were captured during the usability testing.
3 METHOD

3.1 PARTICIPANTS
A total of 7 participants were tested on the Medication Allergy List feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus have the same orientation and level of training as other
non-participant customers. For the test purposes, end-user characteristics were identified and
translated into an internal recruitment screener used to solicit potential participants; an example of a
screener is provided in Appendix 1.
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Participant ID  Gender  Age  Occupation/Role    Professional Exp. (yrs)  Computer Experience  Product Exp. (yrs)
C1              M       48   MD                 27                       intermediate         3
C2              F       58   Medical Assistant  35                       intermediate         3
C3              F       38   Medical Assistant  17                       intermediate         4
C4              F       37   MD                 16                       intermediate         3
C5              F       74   MD                 53                       intermediate         3
C6              M       35   Medical Assistant  14                       intermediate         3
C7              F       25   Medical Assistant  4                        intermediate         1
(Computer experience was rated as intermediate, advanced, or expert.)
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all
participated in the usability test; they were instructed that they could withdraw at any time.
Participants all had prior experience with the EHR. No participants failed to show for the study.
Participants were scheduled for 15 minute sessions with 15 minutes in between each session for debrief
by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves as both a means to record or benchmark current usability, but also to
identify areas where improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed to be realistic and representative of the kinds of activities a user might do with
this EHR in the area of medication allergy list management, heavily based on the CMS test scripts for
Meaningful Use Stage 2. The tasks for this study included:
· Enter 3 medication allergies
· Modify 2 medication allergies
· Review allergy history
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those that may be most
troublesome for users. The tasks were directly modeled on the CMS Medication Allergy List test
procedure §170.314(a)(7).
Tasks were ordered and prioritized based on their impact on patient safety, in which the tasks that had
the greatest potential for patient harm due to critical errors were performed first.
The task scenario document is contained in Appendix 4.
3.4 PROCEDURES
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, took notes on participant
comments, path deviations, number and type of errors, and comments.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible.
· Without assistance; administrators were allowed to give immaterial guidance and clarification on
tasks, but not instructions on use.
· Without using a think-aloud technique.
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.5 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing
software. Thus the actual test location was at the discretion of the test participants. The test
administrator conducted the test from Enabledoc LLC offices in Rochester, MN.

3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was
conducted in a healthcare office. The test computer was a Windows laptop running the Chrome browser, and
the participants used a mouse and keyboard when interacting with the EHRUT.
The Enablemyhealth application was displayed on a 13-inch laptop screen at 1366 x 768 resolution and
32-bit color. The application was set up by the vendor according to the vendor's documentation describing
the system set-up and preparation. The application itself ran on a Windows 2012 Server using a test
database over a WAN connection. Technically, the system performance (i.e., response time) was
representative of what actual users would experience in a field implementation, with a minor lag caused
by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
• Informed Consent
• Test task scenarios
• Moderator’s Guide
• Observer’s data collection template
• Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the join.me session.
3.8 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full
moderator's guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last
about 15 minutes. During that time you will use a version of EnableMyHealth EHR and work with
specific features. Our goal is to determine where there are areas of difficulty and design aspects that
can be improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we
are testing the system. Therefore, if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help, but
we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that
you provide will be kept confidential and your name will not be associated with your comments at any
time. Should you feel it necessary you are able to withdraw at any time during the testing for any
reason.
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior
to giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly
ask questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
3.9 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is for
users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing. The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.10 DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed:
Measure — Rationale and Scoring

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without
assistance, within the overall time allotted for the entire set of tasks. The total number of successes
was calculated for each task and then divided by the total number of times that task was attempted; the
results are provided as a percentage. Task times were recorded for successes. Observed task time divided
by the optimal time for each task is a measure of optimal efficiency. Due to the variability of multiple
correct paths and a large variety of user-settable preferences, all of which can affect time on task,
optimal task times and deviations from these times were unrealistic to assess. All participants were
trained on Enablemyhealth EHR, so a performance factor of 1.0 was used. The overall usability goal for
any feature of the EHR is a task time deemed acceptable by the end user with no critical errors. Thus, if
expert, optimal performance on a task was [60] seconds, then allotted task time was [60 * 1.0] seconds.
This ratio is aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or was
unsure whether they had completed the task, the task was counted as a "Critical Failure." No task times
were taken for failed tasks. Minor errors were defined as an errant click, initial selection of an
incorrect menu option, or incorrect entries that the participant noticed and corrected. The total number
of errors was calculated for each task and then divided by the total number of times that task was
attempted. Minor errors and deviations were noted but not counted as significant errors. This is
expressed as the mean number of failed tasks per participant.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the
participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an
incorrect link, or interacted incorrectly with an on-screen control. These path deviations were included
in the minor error count. The minor error count is expressed as an average across all participants.

Efficiency: Task Time
Each task was timed from when the participant indicated they were beginning the task until the
participant said "Done." If he or she failed to say "Done," the time was stopped when the participant
stopped performing the task. Only task times for tasks that were successfully completed were included in
the average task time analysis. Average time per task was calculated for each task, as was variance
(standard deviation).

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering
both a simple post-task question and a post-session questionnaire. After each task, the participant was
asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). Average
difficulty ratings per task were calculated, as was variance. Common convention is that average ratings
for systems judged easy to use should be 3.3 or above. To measure participants' confidence in and
likeability of the ENABLEMYHEALTH EHR feature overall, the testing team administered the System Usability
Scale (SUS) post-test questionnaire. Questions included, "I think I would like to use this system
frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to
use this system very quickly." See the full System Usability Scale questionnaire in Appendix 7.
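To make the scoring rules of this section concrete, the sketch below computes a task's success
percentage, the mean and standard deviation of successful task times, and the observed/optimal efficiency
ratio. The attempt records and the optimal time are hypothetical illustrations, not the study's raw data.

```python
from statistics import mean, pstdev

def score_task(attempts, optimal_time):
    """Score one task from (succeeded, seconds) attempt records.

    Returns the success rate (%), the mean and standard deviation of
    successful task times, and the observed/optimal efficiency ratio.
    Failed attempts contribute no task time, matching the scoring rules.
    """
    successful_times = [t for ok, t in attempts if ok]
    success_rate = 100.0 * len(successful_times) / len(attempts)
    avg = mean(successful_times)
    sd = pstdev(successful_times)
    ratio = avg / optimal_time  # performance factor of 1.0 assumed
    return success_rate, avg, sd, ratio

# Hypothetical attempt records: (completed successfully, time in seconds)
attempts = [(True, 82), (True, 86), (True, 84)]
rate, avg, sd, ratio = score_task(attempts, optimal_time=80)
```

With these sample records the success rate is 100% and the efficiency ratio is 84/80 = 1.05; in the
report the same ratio is aggregated across tasks and reported with mean and variance.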
4 RESULTS
4.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions would have had
their data excluded from the analyses; however, there were no instances of this occurring. The usability
testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results should be seen in
light of the objectives and goals outlined in Section 3.2, Study Design. The data should yield actionable
results that, if addressed, will have a material, positive impact on user performance.
Table 1

Task                                      Success  Path Deviations     Task Time     Task Time Deviations  Errors      Task Ratings (5=Easy)
(number of subtasks)                      (%)      (Obs/Optimal) Mean  Mean (SD), s  (Obs/Optimal) Mean    Mean (SD)   Mean (SD)
Create 3 medication allergies (3)         100.00   1.1 (21.3/19)       84 (2.8)      1.1 (289/275)         1.86 (1.1)  3.7 (0.5)
Make inactive 2 medication allergies (2)  100.00   1.1 (16.8/16)       75 (2.7)      1.3 (75/60)           0.57 (0.7)  4.1 (0.6)
Review Medication Allergy List (1)        100.00   2.1 (2.1/1)         14 (7.7)      1.2 (18/15)           0.43 (0.7)  4.7 (0)
Mean across tasks                         100.00   1.4 (13.4/12)       57.7 (4.5)    1.2 (128/116)         1 (0.9)     4.2 (0.4)
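The "Mean across tasks" row in Table 1 is the unweighted mean of the three per-task rows. For example,
the reported mean task time (57.7 s) and mean rating (4.2) can be checked directly:

```python
# Per-task values taken from Table 1.
task_times = [84, 75, 14]   # mean task time (s): create / make inactive / review
ratings = [3.7, 4.1, 4.7]   # mean task rating (5 = easy)

mean_time = round(sum(task_times) / len(task_times), 1)
mean_rating = round(sum(ratings) / len(ratings), 1)
```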
The SUS (System Usability Scale) score for subjective satisfaction with the system, based on performance
with these tasks, was 81.4. Broadly interpreted, this indicates above-average usability, but with further
room for improvement.
4.2 DISCUSSION OF THE FINDINGS
Overall, the Medication Allergy feature of the Enablemyhealth user experience exhibits good
effectiveness, with no critical errors out of 42 attempted tasks and a mean of 1 minor (deviation) error
across all tasks. This is within internal usability goal guidelines. Of the total 20 deviation errors, 13
were attributable to entering new medication allergies.
Efficiency measures indicate that, while acceptable, there is room for improvement, specifically in the
editing of medication allergies. Adding medication allergies can be expedited only by defaulting field
values.
Satisfaction, measured by an SUS score of 81.4, is very good but can be improved. Making the user
experience more natural and intelligent will improve the overall user experience and help guide both
novice and advanced users.
4.3 EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the medication allergies feature
has an acceptable level of effectiveness. The usability test consisted of 35 tasks.
Out of the 42 attempted tasks, there were no critical errors, giving a successful completion rate of 100%.
The 20 deviation (minor) errors in the medication allergies tasks were difficulties associated with
selecting the correct data and with editing and making allergies inactive. The deviation error mean of 1
is within the acceptable range of effectiveness.
4.4 EFFICIENCY
Observation of task time is the primary area for improvement. The mean task time across all tasks was
57.7 seconds against an optimal 53.6 seconds; the difference was primarily caused by searching for
medications and selecting effects and severity.
4.5 SATISFACTION
Based on task difficulty ratings and SUS results data, user satisfaction with the feature is considered
favorable. Users' perception of ease of use is considered easy, with a mean rating of 4.2 across all
tasks (1 = very difficult, 5 = very easy). SUS scoring is very good, with a score of 81.4. These scores
describe the average participant satisfaction with the Enablemyhealth feature set.
SUS Scoring
Mean    81.43
StDev    4.40

4.6 MAJOR FINDINGS
The major usability problems encountered in this study were:
· Issues with adding medication allergies:
1. Providers tend to just want to enter the allergies and not want to provide other information.
2. Editing allergies required existing allergies to be made inactive, which was more time consuming.

4.7 AREAS FOR IMPROVEMENT
Entering medication allergy improvements:
· Allow edit of an existing allergy without making it inactive.
· Allow default settings for medication allergies.
· Use speech recognition and NLP to make entering data more natural.

5 APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 45 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of ADP Advancedmd Practice Manager & EHR, what percentage of time is spent
in each product? (per day)
Domain Knowledge
· Rate your expertise or comfort in using ADP Advancedmd software on a scale of 1 to 5 (1 = just
starting, 5 = expert) for:
  § PM
  § EHR (overall)
  § EHR e-Prescribing
  [Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those products [it
is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
  Men                   2
  Women                 5
  Total (participants)  7

Occupation/Role
  RN/NP/MA              4
  Physician             3
  Total (participants)  7

Years of Experience (average)
  Professional          23.71
  EHR Product Use        2.86
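The reported averages follow directly from the per-participant years in the demographic detail table
below; a quick arithmetic check (rounded to two decimals, as reported):

```python
# Per-participant years of experience, from the demographic detail table (C1-C7).
professional_yrs = [27, 35, 17, 16, 53, 14, 4]
product_yrs = [3, 3, 4, 3, 3, 3, 1]

avg_professional = round(sum(professional_yrs) / len(professional_yrs), 2)
avg_product = round(sum(product_yrs) / len(product_yrs), 2)
```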
Demographic detail
Participant ID  Gender  Age  Occupation/Role    Professional Exp. (yrs)  Computer Experience  Product Exp. (yrs)
C1              M       48   MD                 27                       intermediate         3
C2              F       58   Medical Assistant  35                       intermediate         3
C3              F       38   Medical Assistant  17                       intermediate         4
C4              F       37   MD                 16                       intermediate         3
C5              F       74   MD                 53                       intermediate         3
C6              M       35   Medical Assistant  14                       intermediate         3
C7              F       25   Medical Assistant  4                        intermediate         1
(Computer experience was rated as intermediate, advanced, or expert.)
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to
evaluate an electronic health records system. If you decide to participate, you will be asked to perform
several tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I am
free to withdraw consent or discontinue participation at any time. I understand and agree to participate
in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Usability Task Scenarios – Medication Allergies
Access the clinical information for Garth Brooks by clicking Patient Center, then click Open Note and
perform these steps:
1. Click Allergies and select Sulfasalazine.
2. Select wheezing.
3. Select moderate severity and click Save.
4. Review interaction guidance and click Close.
5. Select Penicillin V.
6. Select Dizziness.
7. Select Severe and click Save.
8. Select Carbamazepine.
9. Select Rash.
10. Select moderate and click Save.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Modify medication allergies by performing these steps:
1. Click edit next to Penicillin V.
2. Select Penicillin G and click Save.
3. Click X next to Penicillin V.
4. Type wrong med and click Make inactive.
5. Click edit next to Carbamazepine.
6. Select Codeine and click Save.
7. Click X next to Carbamazepine.
8. Type wrong med and click Make inactive.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
View medication allergies history by performing these steps:
1. Click Allergy History.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n. After each task or task set (see specific scenarios), ask for a response to the single ease-of-use
question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
55 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on
your own, following the instructions very closely. Please note that we are not testing you; we are
testing the system. Therefore, if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help, but
we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
Critical Error count:
Minor error count:
Optimal path deviations:
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
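As a convenience, the standard SUS computation referenced above can be sketched in Python. This is an illustrative helper only, not part of the test instrumentation; the function name is our own:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from the ten item
    ratings (each 1-5, in questionnaire order).

    Odd-numbered (positively worded) items contribute (rating - 1);
    even-numbered (negatively worded) items contribute (5 - rating);
    the sum is scaled by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten ratings between 1 and 5")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: a uniformly favorable respondent scores 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral respondent (all 3s) scores 50; the 83.6 reported later in this document therefore sits well above the midpoint of the scale.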
Each statement is rated on a five-point scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
EHR Usability Test Report of
Enablemyhealth EHR
Clinical Information Reconciliation
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1 Executive Summary ............................................................ 2
2 INTRODUCTION ................................................................. 3
3 METHOD ....................................................................... 4
  3.1 PARTICIPANTS ............................................................. 4
  3.2 STUDY DESIGN ............................................................. 5
  3.3 TASKS .................................................................... 6
  3.4 TEST LOCATION ............................................................ 7
  3.5 TEST ENVIRONMENT ......................................................... 7
  3.6 TEST FORMS AND TOOLS ..................................................... 7
  3.7 PARTICIPANT INSTRUCTIONS ................................................. 8
  3.8 USABILITY METRICS ........................................................ 9
  3.9 DATA SCORING ............................................................. 9
4 RESULTS ...................................................................... 11
  4.1 DATA ANALYSIS AND REPORTING ............................................. 11
  4.2 DISCUSSION OF THE FINDINGS .............................................. 11
  4.3 EFFECTIVENESS ........................................................... 12
  4.4 EFFICIENCY ............................................................... 12
  4.5 SATISFACTION ............................................................. 12
  4.6 MAJOR FINDINGS ........................................................... 12
  4.7 AREAS FOR IMPROVEMENT .................................................... 13
5 APPENDICES ................................................................... 14
Appendix 1: PARTICIPANT RECRUITING SCREENER .................................... 15
Appendix 2: PARTICIPANT DEMOGRAPHICS ........................................... 17
Appendix 3: Informed Consent Form .............................................. 18
Appendix 4: Task Scenarios ..................................................... 19
Appendix 5: Usability Test Administration Moderator's Guide .................... 20
Appendix 6: Sample data collection form ........................................ 23
Appendix 7: System Usability Scale Questionnaire ............................... 24
1 Executive Summary
A usability test of EnableMyHealth – 2014.1, an ambulatory EHR, was conducted from 15 Aug 2015 to 14
Sep 2015 in the McLean, VA and Rochester, MN offices of EnableDoc LLC. The purpose of this test was to
test and validate the usability of the current user interface for Clinical Information Reconciliation (CIR),
and provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, four healthcare providers and clinical staff matching the target
demographic criteria served as participants and used Enablemyhealth EHR in simulated, but
representative tasks.
This study collected performance data on two Clinical Information Reconciliation tasks that
enable accepting and merging incoming clinical information obtained from an external provider
for a new or existing patient. In these tasks, an EHR user is required to:
· Locate the reconciliation feature, identify the correct incoming clinical record, and match the
  incoming record to the correct patient, if not properly mapped.
· Reconcile current medications, allergies, and problems from the current patient record and the
  CDA, then import and verify.
All test participants conducted the test sessions remotely via on-line conferencing software. During the
15 minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR, as they
are current customers, with an average of 2.85 years of use. No additional training materials were provided other
than that usually given to customers. The administrator introduced the test and instructed participants
to complete a series of tasks (given one at a time) using Enablemyhealth. During the testing, the
administrator timed the test and, along with the data logger(s) recorded user performance data on
paper and electronically. The administrator did not give the participant assistance in how to complete
the task. Participant screens and audio were recorded for subsequent analysis. The following types of
data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test SUS questionnaire and were compensated with a $50 gift card for their
participation.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions:
Task 1: Find CDA and check or map to patient
  Task Success: 100.00%
  Path Deviations (Observed / Optimal): 1.15 (4.6/4)
  Task Time Mean (SD): 22.6 (1.8)
  Task Time Deviations (Observed / Optimal): 1.1 (22.6/20)
  Number of Errors Mean (SD): 1.86 (1.1)
  Task Rating, 5=Easy, Mean (SD): 4.6 (0.5)

Task 2: Reconcile current medications, allergies, and problems list
  Task Success: 100.00%
  Path Deviations (Observed / Optimal): 1.18 (4.7/4)
  Task Time Mean (SD): 12.3 (1.8)
  Task Time Deviations (Observed / Optimal): 1.2 (12.3/10)
  Number of Errors Mean (SD): 0.57 (0.7)
  Task Rating, 5=Easy, Mean (SD): 4.7 (0.5)
The results from the System Usability Scale scored the subjective satisfaction with the system
based on performance with these tasks to be: 83.6.
In addition to the performance data, the following qualitative observations were made:
Every participant found the process of reconciling to be very easy and completed the process
very quickly, but there is always room for improvement.
The major usability difficulties encountered were:
· Knowing how to select individual medications, allergies, or problems, or all items in each
  category, appeared to be most confusing.
· Selecting an encounter was also confusing.
· One user clicked import by accident before finishing, then went back and imported the rest.

2 INTRODUCTION
The EHRUT tested for this study was EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to
present medical information to healthcare providers in primarily group practices with a focus on family
practice, surgery, eye, and PT/OT/Chiropractic medicine, the EHRUT consists of modules and features
that include, but are not limited to, support for:
· Patient demographics
· Patient problem and allergy lists
· Patient encounter notes and charts
· Patient clinical documents
· Creating lab orders and receiving lab results
· Growth charting
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
· Quality reporting
· Patient population reporting
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, LPNs, etc.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and provide
evidence of usability of Clinical Information Reconciliation in the EHR Under Test (EHRUT). To this end,
measures of effectiveness, efficiency, and user satisfaction, such as successful task completion rate, time
on task, number and types of errors, and participant satisfaction, were captured during the usability
testing.
3 METHOD

3.1 PARTICIPANTS
A total of 7 participants were tested on the Clinical Information Reconciliation (CIR) feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus have the same orientation and level of training as other
non-participant customers. For the test purposes, end-user characteristics were identified and
translated into an internal recruitment screener used to solicit potential participants; an example of a
screener is provided in Appendix [1].
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Enabledoc LLC ©2015 Confidential and Proprietary
Version 1
4
Participant ID  Gender  Age  Occupation/role  Professional Experience (yrs)  Computer Experience  Product Experience (yrs)
C1              M       48   MD               27                             intermediate         3
C2              F       58   Medical Asst.    35                             intermediate         3
C3              F       38   Medical Asst.    17                             intermediate         4
C4              F       37   MD               16                             intermediate         3
C5              F       74   MD               53                             intermediate         3
C6              M       35   Medical Asst.    14                             intermediate         3
C7              F       25   Medical Asst.    4                              intermediate         1
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all
participated in the usability test; they were instructed that they could withdraw at any time.
Participants all had prior experience with the EHR. No participants failed to show for the study.
Participants were scheduled for 150-minute sessions with 30 minutes in between each session for
debrief by the administrator and data logger, and to reset systems to proper test conditions. All testing
was performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves both to record or benchmark current usability and to identify areas where
improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
· Number of tasks successfully completed within the allotted time without assistance
· Time to complete the tasks
· Number and types of errors
· Path deviations
· Participant's verbalizations (comments)
· Participant's satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed that would be realistic and representative of the kinds of activities a user might
do with this EHR in the area of Clinical Information Reconciliation of incoming clinical data, and were
heavily based on the CMS test scripts for Meaningful Use Phase 2. The tasks for this study broadly included:
· Locate the reconciliation feature, identify the correct incoming clinical record, and verify the
  record is matched to the correct patient.
· Select the correct medications, allergies, and problems for that patient's clinical record and click
  import.
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those that may be most
troublesome for users. The tasks were directly modeled on the CMS Clinical Information Reconciliation
test procedure [170.314(b)(4)].
Tasks were ordered and prioritized based on their impact on patient safety, in which the tasks that had
the greatest potential for patient harm due to critical errors were performed first. CIR of patient records
is accomplished with the use of the new “Incoming CDA” tool. The interactions to reconcile clinical
information from the patient’s active medication list, active medication allergy list, and problem list are
identical, and all are included within the Wizard interaction sequence. Therefore, reconciliation of the
individual areas was not broken out into separate tasks, and results can be generalized across these list
types. These tasks assessed whether and how the clinician participants reacted to the intervention alerts.
In conjunction with this usability study were tasks relating to Clinical Decision Support. The results of
that study are described in a separate report and not included here. The seven tasks of that study are
included in the task scenario document contained in appendix 4 but are irrelevant to this report.
Once a participant was scheduled, an email was sent to the participant that:
• Confirmed the time of the session
• Provided Join.Me access codes
• Included a copy of the informed consent form
• Included a document containing the tasks for the test session
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, took notes on participant
comments, path deviations, number and type of errors, and comments.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible.
· Without assistance; administrators were allowed to give immaterial guidance and clarification
  on tasks, but not instructions on use.
· Without using a think-aloud technique.
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.4 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing
software. Thus the actual test location was at the discretion of the test participants. The test
administrator conducted the test from Enabledoc LLC offices in Rochester MN.

3.5 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was
conducted in a healthcare office. For testing, participants used a laptop running the Windows OS and
the Chrome browser, with a mouse and keyboard to interact with the EHRUT.
The Enablemyhealth application was used on a 13 inch laptop with 1366 by 768 resolution and 32 bit
color. The application was set up by the vendor according to the vendor’s documentation describing the
system set-up and preparation. The application itself was running on a Windows 2012 Server using a
test database over a WAN connection. Technically, the system performance (i.e., response time) was
representative of what actual users would experience in a field implementation, with a minor lag caused
by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
3.6 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
· Informed Consent
· Test task scenarios
· Moderator’s Guide
· Observer’s data collection template
· Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the join.me session.
3.7 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full
moderator’s guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last about
15 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we
are testing the system, so if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help,
but we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that you
provide will be kept confidential, and your name will not be associated with your comments at any time.
Should you feel it necessary, you may withdraw at any time during the testing for any reason.
In today’s session, we will be using some new features that are currently being developed. Let’s go
over those now so you will have some familiarity.
The first new feature is called the Infobutton, denoted by a small blue “i” icon that appears as
necessary to provide additional clinical information.
The second new feature is a tool to retrieve incoming clinical information for a patient that is sent
electronically from another provider outside your practice. This tool is to resolve discrepancies or
duplicate information between the incoming record and the clinical record that is in your EHR.
[important: only ask the following question if they indicated that they review/read release
notes, view training videos, etc. based on their earlier answer regarding how they find out
about new features and how to use them. Be ready to display the help file.]
Would you like to review the help topic for these features now?
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior to
giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly ask
questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
3.8 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is
for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing.
The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.9 DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed.

Measure: Effectiveness: Task Success
Rationale and Scoring: A task was counted as a “Success” if the participant was able to achieve the
correct outcome, without assistance, within the overall time allotted for the entire set of tasks. The
total number of successes was calculated for each task and then divided by the total number of times
that task was attempted. The results are provided as a percentage.
Task times were recorded for successes. Observed task time divided by the optimal time for each task
is a measure of optimal efficiency. Due to the variability of multiple correct paths and a large variety of
user-settable preferences, all of which can affect time on task, optimal task times and deviations from
these times were unrealistic to assess. All participants were trained on Enablemyhealth EHR, so a
performance factor of 1.0 was used. The overall usability goal for any feature of the EHR is task time
that is deemed acceptable by the end user with no critical errors. Thus, if expert, optimal performance
on a task was [60] seconds, then allotted task time performance was [60 * 1.0] seconds. This ratio is
aggregated across tasks and reported with mean and variance scores.

Measure: Effectiveness: Task Failures
Rationale and Scoring: If the participant abandoned the task, did not reach the correct answer,
performed it incorrectly, or was unsure if they had completed the task, the task was counted as a
“Critical Failure.” No task times were taken for failed tasks. Minor errors were defined as an errant
click, initial selection of an incorrect menu option, or incorrect entries that the participant noticed and
corrected. The total number of errors was calculated for each task and then divided by the total
number of times that task was attempted. Minor errors and deviations were noted but not counted as
significant errors. This is expressed as the mean number of failed tasks per participant.

Measure: Efficiency: Task Deviations
Rationale and Scoring: The participant’s path (i.e., steps) through the application was recorded.
Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu
item, followed an incorrect link, or interacted incorrectly with an on-screen control. These path
deviations were included in the minor error count. The minor error count is expressed as an average
across all participants.

Measure: Efficiency: Task Time
Rationale and Scoring: Each task was timed from when the participant indicated they were beginning
the task until the participant said “Done.” If he or she failed to say “Done,” the time was stopped when
the participant stopped performing the task. Only task times for tasks that were successfully completed
were included in the average task time analysis. Average time per task was calculated for each task, as
was variance (standard deviation).

Measure: Satisfaction: Task Rating
Rationale and Scoring: Participants’ subjective impression of the ease of use of the application was
measured by administering both a simple post-task question and a post-session questionnaire. After
each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5
(Very Easy). Average difficulty ratings per task were calculated, as was variance. Common convention is
that average ratings for systems judged easy to use should be 3.3 or above.
To measure participants’ confidence in and likeability of the ENABLEMYHEALTH EHR feature overall,
the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions
included, “I think I would like to use this system frequently,” “I thought the system was easy to use,”
and “I would imagine that most people would learn to use this system very quickly.” See the full System
Usability Scale questionnaire in Appendix 7.
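The scoring rules above reduce to simple arithmetic over the per-participant logs. A minimal sketch in Python, for illustration only — the field names and the `score_task` helper are our own, not part of the test instrumentation:

```python
from statistics import mean, stdev

def score_task(attempts, optimal_time):
    """Summarize one task across participants.

    attempts: list of dicts with keys 'success' (bool),
              'time' (seconds, meaningful for successful attempts),
              and 'minor_errors' (int).
    optimal_time: expert task time scaled by the performance
              factor (1.0 for trained users, per Section 3.9).
    """
    successes = [a for a in attempts if a["success"]]
    times = [a["time"] for a in successes]        # times kept only for successes
    errors = [a["minor_errors"] for a in attempts]
    return {
        "success_pct": 100.0 * len(successes) / len(attempts),
        "time_mean": mean(times),
        "time_sd": stdev(times) if len(times) > 1 else 0.0,
        "time_ratio": mean(times) / optimal_time,  # observed / optimal
        "errors_mean": mean(errors),
    }

# Hypothetical two-participant log for a task with a 10-second optimal time:
log = [{"success": True, "time": 10, "minor_errors": 0},
       {"success": True, "time": 14, "minor_errors": 1}]
print(score_task(log, optimal_time=10))
```

Each metric in the results matrices that follow (success percentage, mean time with standard deviation, observed/optimal ratio, mean error count) corresponds to one field of this summary.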
4 RESULTS

4.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions would have had
their data excluded from the analyses; however, there were no instances of this occurring.
The usability testing results for the ENABLEMYHEALTH EHR are detailed below (Table [x]). The results
should be seen in light of the objectives and goals outlined in Section 3.2 Study Design. The data should
yield actionable results that, if corrected, yield material, positive impact on user performance.
Task 1: Find CDA and check or map to patient
  Task Success: 100.00%
  Path Deviations (Observed / Optimal): 1.15 (4.6/4)
  Task Time Mean (SD): 22.6 (1.8)
  Task Time Deviations (Observed / Optimal): 1.1 (22.6/20)
  Number of Errors Mean (SD): 1.86 (1.1)
  Task Rating, 5=Easy, Mean (SD): 4.6 (0.5)

Task 2: Reconcile current medications, allergies, and problems list
  Task Success: 100.00%
  Path Deviations (Observed / Optimal): 1.18 (4.7/4)
  Task Time Mean (SD): 12.3 (1.8)
  Task Time Deviations (Observed / Optimal): 1.2 (12.3/10)
  Number of Errors Mean (SD): 0.57 (0.7)
  Task Rating, 5=Easy, Mean (SD): 4.7 (0.5)

Mean across tasks
  Task Success: 100.00%
  Path Deviations (Observed / Optimal): 1.16 (4.65/4)
  Task Time Mean (SD): 17.45 (1.8)
  Task Time Deviations (Observed / Optimal): 1.2 (17.5/15)
  Number of Errors Mean (SD): 1.2 (0.9)
  Task Rating, 5=Easy, Mean (SD): 4.7 (.5)
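Assuming the aggregate row is the unweighted mean of the two task rows, it can be spot-checked with a few lines of Python (the dictionaries below simply restate the per-task values from the results matrix):

```python
# Per-task values restated from the results matrix above.
task1 = {"time_mean": 22.6, "path_dev": 1.15, "errors_mean": 1.86, "rating": 4.6}
task2 = {"time_mean": 12.3, "path_dev": 1.18, "errors_mean": 0.57, "rating": 4.7}

# Unweighted mean of the two task rows for each metric.
agg = {k: (task1[k] + task2[k]) / 2 for k in task1}
for name, value in agg.items():
    print(name, round(value, 3))
```

This reproduces the reported aggregates (17.45 s mean task time, path deviation ratio of about 1.16, mean errors of about 1.2, mean task rating of about 4.65) to within rounding.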
The SUS (System Usability Scale) scored subjective satisfaction with the system, based on
performance of these tasks, at 83.6. Broadly interpreted, this indicates above-average
usability, but with further room for improvement.
4.2
DISCUSSION OF THE FINDINGS
Clinical Information Reconciliation is performed using the “Import CDA” tool, which we call Referral.
This feature allows:
· Selecting an incoming clinical summary document from an external source.
· Verifying or remapping the CDA to a patient in the EHRUT.
· Adding a new patient and appointment.
· Selecting the medications, medication allergies, and problems to consolidate into a patient’s
clinical record.
· Importing the data with a single click.
· Opening the patient encounter/note with a single click.
Since we implemented the CDA as a referral on the appointment calendar, it was easier for participants
to understand how this feature worked. The automated mapping and the use of one screen to reconcile
medications, allergies, and problems made the entire import process very easy. The minor error rate,
which includes optimal path deviations, is acceptably low, with a consolidated mean of 1.2 and a
standard deviation of 0.9. Only 8 steps are needed to perform all tasks. This can vary if individual
medications, allergies, and problems are selected instead of all, but we counted the steps to add all,
since this is the more common usage. The mean task difficulty rating was 4.6 out of a high of 5.0.
Satisfaction, measured by an SUS score of 83.6, is very good.
4.3
EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the CIR feature has a highly
acceptable level of effectiveness after a short amount of training. We did not test use of this feature
without training, but with only 8 steps the process is very easy. CIR exhibited 100% success with a low
mean error rate of 1.2 across all tasks. The usability test consisted of 2 tasks, both of which were
successfully completed. One participant did not select all the problems because they inadvertently
clicked Import before finishing, so they then selected the remaining problems and re-imported.
4.4
EFFICIENCY
Observation of task times and minor errors, which include path deviations, indicates strong acceptance
and efficiency. There is room for improvement by adding tool tips to provide more direction and by
showing the patient and sending provider on the selection screen.
The combined mean task time was 17.45 seconds with a standard deviation of 1.8 seconds, which is very
close to the optimal time of 15 seconds to perform both tasks. If more time is used to think about the
medications, allergies, and problems a patient has, then the task time could increase substantially, but
this would not be a usability issue; it would merely reflect consideration of the implications of the
information provided.
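The observed/optimal deviation ratios quoted in the results table can be reproduced directly; this sketch uses the combined mean-across-tasks figures from the table above:

```python
# Combined figures from the results table: mean observed task time vs the
# optimal time, and mean observed path length vs the optimal path length.
observed_time, optimal_time = 17.45, 15.0   # seconds
observed_steps, optimal_steps = 4.65, 4.0   # steps

# A ratio of 1.0 means participants matched the optimal time/path exactly.
print(round(observed_time / optimal_time, 1))    # task time deviation
print(round(observed_steps / optimal_steps, 2))  # path deviation
```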
4.5
SATISFACTION
Based on the task difficulty ratings and SUS results data, user satisfaction with the feature is considered
favorable. Users’ perception of ease of use is considered easy, with a mean rating of 4.04 across all tasks
(1=very difficult, 5=very easy). SUS scoring is very good, with a score of 83.6. These scores describe the
average participant satisfaction with the Enablemyhealth feature set.
SUS Scoring: Mean 83.57, StDev 5.80
4.6
MAJOR FINDINGS
The major usability difficulties encountered were:
· Knowing how to select individual medications, allergies, or problems, or all items in a
category, appeared to be the most confusing.
· Participants were confused that they had to select an encounter to import the data
into.
· One user clicked Import by accident before finishing, then went back and imported
the rest.
4.7
AREAS FOR IMPROVEMENT
· Display the patient name and referring provider name in the table to make it easier to find the CDA.
· Add step text to help guide novice users.
· When the Import button is clicked, first ask whether the user has selected the appropriate
medications, allergies, and problems to import; perhaps also add tool tips for more information.
· Make Open Encounter active only once the import has been performed.
5
APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 15 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of EnableMyHealth Practice Manager & EHR, what percentage of time is spent in
each product? (per day)
Domain Knowledge
· Rate your expertise or comfort in using EnableMyHealth software on a scale of 1 to 5 (1=just
starting, 5=expert) for
§ PM
§ EHR (overall)
§ EHR e-Prescribing
[Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those
products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
Men: 2
Women: 5
Total (participants): 7
Occupation/Role
RN/NP/MA: 4
Physician: 3
Total (participants): 7
Years of Experience (average)
Professional: 23.71
EHR Product Use: 2.86
Demographic detail
Participant ID | Gender | Age | Occupation/role | Professional Experience (yrs) | Computer Experience (intermediate, advanced, expert) | Product Experience (yrs)
C1 | M | 48 | MD | 27 | intermediate | 3
C2 | F | 58 | Medical Assistant | 35 | intermediate | 3
C3 | F | 38 | Medical Assistant | 17 | intermediate | 4
C4 | F | 37 | MD | 16 | intermediate | 3
C5 | F | 74 | MD | 53 | intermediate | 3
C6 | M | 35 | Medical Assistant | 14 | intermediate | 3
C7 | F | 25 | Medical Assistant | 4 | intermediate | 1
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to evaluate
an electronic health records system. If you decide to participate, you will be asked to perform several
tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I
am free to withdraw consent or discontinue participation at any time. I understand and
agree to participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Note: In conjunction with this usability study were tasks relating to Clinical Information Reconciliation.
Usability Task Scenarios – Clinical Information Reconciliation
Note: in all tasks, please complete only to the point where any new information is ready to be saved or
signed. Please do not sign any prescriptions or perform the final completion step!
1. A new patient referral is received on the appointment calendar. The referring provider has
electronically sent you this patient’s clinical information. Using a new feature, access this
incoming patient’s information for Hilary Clinton. You realize you have already received a
CDA for this patient from another provider, which was already imported. Check that the system
mapped to the correct patient and select the encounter already created.
Overall, how difficult or easy did you find this task?
Very Difficult 1 2 3 4 5 Very Easy
2. The current medications in the system are displayed in the left list. The received medications
appear in the center box. Click on the medications and the right arrow to add to the
consolidated list or click the double right arrow to add all of them. Perform the same steps for
allergies and problems, then click the Import Button.
Overall, how difficult or easy did you find this task?
Very Difficult 1 2 3 4 5 Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n. After each task or task set (see specific scenarios), ask for a response to the single
ease-of-use question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
15 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your
own, following the instructions closely. Please note that we are not testing you; we are testing the
system, so if you have difficulty, all this means is that something needs to be improved in the
system. There are no wrong answers! We will be here in case you need specific help, but we will not be
able to instruct you or provide help in how to use the application; however, we may provide specific hints
as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
[Show informed consent document and get verbal acceptance]
Do you have any questions or concerns before we begin?
Let’s start with some basic information:
What is your job title?
How long have you been in this profession?
What is your specialty?
How long have you used the EHR?
How do you learn about new features as they are added to the EHR?
[if necessary, prompt about reading release notes, viewing training videos, using the help system,
or do they just start using the feature]
In today’s session, we will be using some new features that are currently being developed. Let’s go over
those now so you will have some familiarity.
The first new feature is called the Infobutton, denoted by a small blue “i” icon that appears as necessary
to provide additional clinical information.
The second new feature is a tool to retrieve incoming clinical information for a patient that is sent
electronically from another provider outside your practice. This tool is used to resolve discrepancies or
duplicate information between the incoming record and the clinical record in your EHR.
[important: only ask the following question if they indicated that they review/read release notes, view
training videos, etc. based on their earlier answer regarding how they find out about new features
and how to use them. Be ready to display the help file.]
Would you like to review the help topic for this feature now?
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
·
·
·
·
·
Critical Error count:
Optimal path deviations (minor error):
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
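As a sketch, each row of this data log can be represented as a small record for later analysis. The field names below are assumptions mirroring the form above, not Enabledoc's actual schema:

```python
from dataclasses import dataclass


@dataclass
class TaskLogEntry:
    """One task's entry in the summative usability test data log."""
    participant_code: str
    task: int
    critical_errors: int    # anything that constitutes task failure
    minor_errors: int       # optimal path deviations the user recovers from
    time_on_task_sec: float
    seouq_response: int     # single ease-of-use question, 1 (hard) to 5 (easy)
    comments: str = ""


# Hypothetical entry for illustration only:
entry = TaskLogEntry("C1", 1, 0, 1, 22.6, 5, "found Import button quickly")
print(entry.participant_code, entry.time_on_task_sec)
```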
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
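Brooke's scoring rule is simple enough to sketch: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum of contributions is multiplied by 2.5 to give a 0-100 score. The example responses below are hypothetical, not a participant's actual data:

```python
def sus_score(responses):
    """Compute the 0-100 SUS score from ten 1-5 responses (Brooke, 1996)."""
    assert len(responses) == 10
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5


# Hypothetical, fairly favorable response set:
print(sus_score([4, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # 80.0
```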
Each statement is rated from 1 (Strongly Disagree) to 5 (Strongly Agree):
1. I think that I would like to use this system frequently. 1 2 3 4 5
2. I found the system unnecessarily complex. 1 2 3 4 5
3. I thought the system was easy to use. 1 2 3 4 5
4. I think that I would need the support of a technical person to be able to use this system. 1 2 3 4 5
5. I found the various functions in this system were well integrated. 1 2 3 4 5
6. I thought there was too much inconsistency in this system. 1 2 3 4 5
7. I would imagine that most people would learn to use this system very quickly. 1 2 3 4 5
8. I found the system very cumbersome to use. 1 2 3 4 5
9. I felt very confident using the system. 1 2 3 4 5
10. I needed to learn a lot of things before I could get going with this system. 1 2 3 4 5
Usability Test Report of Enablemyhealth® EHR Safety Enhanced Design
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
Table of Contents .......................................................................................................................................... 1
1 Executive Summary .................................................................................................................................... 2
2 INTRODUCTION .......................................................................................................................................... 4
3 METHOD ..................................................................................................................................................... 4
3.1 PARTICIPANTS ......................................................................................................................................... 4
3.2 STUDY DESIGN ........................................................................................................................................ 5
3.3 TASKS ...................................................................................................................................................... 6
3.4 PROCEDURES .......................................................................................................................................... 6
3.5 TEST LOCATION ....................................................................................................................................... 7
3.6 TEST ENVIRONMENT ............................................................................................................................... 7
3.7 TEST FORMS AND TOOLS ........................................................................................................................ 8
3.8 PARTICIPANT INSTRUCTIONS .................................................................................................................. 8
3.9 USABILITY METRICS ................................................................................................................................ 9
3.10 DATA SCORING ...................................................................................................................................... 9
4 RESULTS .................................................................................................................................................... 11
4.1 DATA ANALYSIS AND REPORTING ......................................................................................................... 11
4.2 DISCUSSION OF THE FINDINGS ............................................................................................................. 11
4.3 EFFECTIVENESS ..................................................................................................................................... 12
4.4 EFFICIENCY ............................................................................................................................................ 12
4.5 SATISFACTION ....................................................................................................................................... 13
4.6 MAJOR FINDINGS .................................................................................................................................. 13
4.7 AREAS FOR IMPROVEMENT .................................................................................................................. 14
5 APPENDICES ............................................................................................................................................. 14
Appendix 1: PARTICIPANT RECRUITING SCREENER ................................................................................ 15
Appendix 2: PARTICIPANT DEMOGRAPHICS ........................................................................................... 17
Appendix 3: Informed Consent Form...................................................................................................... 18
Appendix 4: Task Scenarios ..................................................................................................................... 19
Appendix 5: Usability Test Administration Moderator’s Guide .............................................................. 22
Appendix 6: Sample data collection form ............................................................................................... 24
Appendix 7: System Usability Scale Questionnaire ................................................................................ 25
1
Executive Summary
A usability test of EnableMyHealth version EMH4, an ambulatory EHR, was conducted between August
15, 2015 and September 14, 2015 in the Rochester, MN office of EnableDoc LLC and in each provider's
office. The purpose of this test was to test and validate the usability of the current user interface for
Computerized Provider Order Entry, and to provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, seven (7) healthcare providers and clinical staff that met the target
demographic and professional profile served as participants and used the Enablemyhealth EHR in simulated
but representative tasks. This study collected performance data on the following four tasks
typically conducted in an EHR:
• Prescription entry and modifying prescriptions
• Entry and modification of laboratory orders
• Entry and modification of imaging orders
• View prior orders
All test participants conducted the test sessions remotely via online conferencing software. During the
30-minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR, as they
are current users/customers. No additional training materials were provided other than those usually
given to customers. The administrator introduced the test and instructed participants to complete a
series of tasks (given one at a time) using Enablemyhealth. During the testing, the administrator timed
the test and, along with the data logger(s), recorded user performance data on paper and electronically.
The administrator did not give the participant assistance in how to complete the task. Participant
screens and audio were recorded for subsequent analysis. The following types of data were collected for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test SUS questionnaire and were compensated with a $50 gift card for their time.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions
MEASURES
TASKS | Task Number | Task Success | Path Deviations (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviations (Observed/Optimal) | Errors Mean (SD) | Task Ratings (5=Easy) Mean (SD)
Create 3 RX | 3 | 100.00 | 1.06 (31.8/30) | 117 (16.7) | 1.3 (117/90) | 1.86 (1.1) | 3.7 (0.5)
Create 3 Lab orders | 3 | 100.00 | 1.05 (11.57/11) | 75 (19) | 1.4 (75/55) | 0.57 (0.7) | 4.1 (0.6)
Create 2 imaging orders | 2 | 100.00 | 1.05 (8.43/8) | 44 (10.7) | 1.5 (44/55) | 0.43 (0.7) | 4.6 (0.5)
Modify Rx | 1 | 100.00 | 1.04 (13.57/13) | 32 (7.5) | 1.1 (32/30) | 0.57 (0.7) | 3.6 (0.5)
Modify lab order | 1 | 100.00 | 1.05 (8.43/8) | 36 (7.2) | 1.4 (36/25) | 0.43 (0.7) | 3.9 (0.6)
Modify image order | 1 | 100.00 | 1.00 (8/8) | 32 (6.1) | 1.3 (32/25) | 0 (0) | 4.1 (0.4)
View imaging order | 1 | 100.00 | 1.05 (3.14/3) | 13 (1.3) | 1.3 (13/10) | 0.14 (0.4) | 4.1 (0.4)
View prior Rx | 1 | 100.00 | 1.00 (3/3) | 12 (1.3) | 1.2 (12/10) | 0 (0) | 4.1 (0.4)
The results from the System Usability Scale scored the subjective satisfaction with the system, based on
performance of these tasks, at 80. In addition to the performance data, the following qualitative
observations were made:
- Major findings
Electronic prescribing difficulties:
· Providers tend to just want to write or dictate the script rather than select a frequency of use.
· Medications other than tablets (e.g.: injectables, inhalers) presented the most serious
challenges.
· The most frequent challenge was determining units of medication when none is defaulted.
· Changing a prescription's frequency of use was not easily understood, and the preference is just to
modify the script directions.
Lab and imaging order difficulties:
· Lab orders can be confusing because the descriptions and codes are specific to the lab company.
These can be modified to improve usability.
· The user is required to know the structure/set-up to avoid excessive and confusing searching.
Lab orders are associated with a lab order company. Further, groups of lab orders can be set up
as lab order sets to simplify the process further.
· Search is not global across all lab companies, which would prevent having to select the lab. The
lab name would need to be appended to the lab order filter.
- Areas for improvement
Electronic prescribing:
· Add default units for all prescriptions.
· Provide aids for recommended dosage based on patient BSA, age, gender, and diagnosis.
· Provide the ability to type or speak the script and parse the data to build the prescription script.
· Add a wizard for first-time or novice users.
Lab and imaging orders:
· Allow search for labs across lab companies, with the lab company listed on the test order.
· Add a wizard for first-time use or novice users.
· Add support for speech-recognition searching.
2
INTRODUCTION
The EHRUT tested for this study was the EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to
present medical information to healthcare providers, primarily in group practices with a focus on family
practice, surgery, eye, and PT/OT/chiropractic medicine, the EHRUT consists of modules and features
that include, but are not limited to, support for the tested functionality:
· Patient demographics
· Patient problem and allergy lists
· Creating lab orders and receiving lab results
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, LPNs, etc.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and to provide
evidence of the usability of the entry and modification of electronic prescriptions, laboratory orders, and
imaging orders in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and
user satisfaction, such as successful task completion rate, time on task, number and types of errors, and
participant satisfaction, were captured during the usability testing.
3
METHOD
3.1
PARTICIPANTS
A total of 7 participants were tested on the CPOE feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus had the same orientation and level of training as other
non-participant customers. For the test purposes, end-user characteristics were identified and
translated into an internal recruitment screener used to solicit potential participants; an example of a
screener is provided in Appendix [1].
Enabledoc LLC ©2015 Confidential and Proprietary
Version 1
4
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
| Participant ID | Gender | Age | Occupation / Role | Professional Experience (yrs) | Computer Experience (intermediate, advanced, expert) | Product Experience (yrs) |
|---|---|---|---|---|---|---|
| C1 | M | 48 | MD | 27 | intermediate | 3 |
| C2 | F | 58 | Medical Assistant | 35 | intermediate | 3 |
| C3 | F | 38 | Medical Assistant | 17 | intermediate | 4 |
| C4 | F | 37 | MD | 16 | intermediate | 3 |
| C5 | F | 74 | MD | 53 | intermediate | 3 |
| C6 | M | 35 | Medical Assistant | 14 | intermediate | 3 |
| C7 | F | 25 | Medical Assistant | 4 | intermediate | 1 |
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all
participated in the usability test; they were instructed that they could withdraw at any time.
Participants all had prior experience with the EHR. No participants failed to show for the study.
Participants were scheduled for 30 minute sessions with 30 minutes in between each session for debrief
by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves both as a means to benchmark current usability and as a way to identify
areas where improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed that would be realistic and representative of the kinds of activities a user might
do with this EHR in the area of CPOE and based on the CMS test scripts for Meaningful Use Phase 2. The
tasks for this study broadly included:
1. Create 3 prescriptions
2. Create 3 lab orders
3. Create 2 imaging orders
4. Modify a prescription
5. Modify a lab order
6. Modify an image order
7. View a lab and an imaging order
8. View a prior prescription
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those that may be most
troublesome for users. The tasks were directly modeled on the CMS CPOE test procedure
[170.314(a)(1)].
Tasks were ordered and prioritized based on their impact on patient safety, in which the tasks that had
the greatest potential for patient harm due to critical errors were performed first. Thus the creation
tasks of prescriptions and lab orders were done first, followed by prescription and lab order
modifications, and finally viewing orders and prescriptions.
The task scenario document is contained in appendix 4.
3.4 PROCEDURES
During the time a participant was scheduled, an email was sent to the participant that:
• Confirmed the time of the session
• Provided Join.Me access codes
• Included a copy of the informed consent form
• Included a document containing the tasks for the test session
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session, including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on participant
comments, path deviations, and the number and type of errors.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible.
· Without assistance; administrators were allowed to give immaterial guidance and clarification
on tasks, but not instructions on use.
· Without using a think-aloud technique.
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.5 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing
software. Thus the actual test location was at the discretion of the test participants. The test
administrator conducted the test from Enabledoc LLC offices in Rochester, MN.
3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was
conducted in a healthcare office. For testing, the computer was a Windows laptop running Windows OS
and Chrome browser. The participants used a mouse and keyboard when interacting with the EHRUT.
The Enablemyhealth application was used on a 13 inch laptop with 1366 by 768 resolution and 32 bit
color. The application was set up by the vendor according to the vendor’s documentation describing the
system set-up and preparation. The application itself was running on a Windows 2012 Server using a
test database on a WAN connection. Technically, the system performance (i.e., response time) was
representative to what actual users would experience in a field implementation with a minor lag caused
by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
• Informed Consent
• Test task scenarios
• Moderator’s Guide
• Observer’s data collection template
• Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the Join.me session.
3.8 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full
moderator’s guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last
about 30 minutes. During that time you will use a version of EnableMyHealth EHR and work with
specific features. Our goal is to determine where there are areas of difficulty and design aspects that
can be improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we
are testing the system, therefore if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help,
but we will not be able to instruct you or provide help in how to use the application, however we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that
you provide will be kept confidential and your name will not be associated with your comments at any
time. Should you feel it necessary you are able to withdraw at any time during the testing for any
reason.
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior
to giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly
ask questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 8 tasks to complete. Tasks are listed in Appendix 4.
3.9 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is for
users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing. The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.10 DATA SCORING
The following details how tasks were scored, errors evaluated, and the time data analyzed:

Effectiveness: Task Success
A task was counted as a “Success” if the participant was able to achieve the correct outcome, without
assistance, within the overall time allotted for the entire set of tasks. The total number of successes was
calculated for each task and then divided by the total number of times that task was attempted. The
results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or was
unsure whether they had completed the task, the task was counted as a “Critical Failure.” No task times
were taken for failed tasks. This is expressed as the mean number of failed tasks per participant.

Efficiency: Task Deviations
Minor errors were defined as an errant click, initial selection of an incorrect menu option, or incorrect
entries that the participant noticed and corrected. The total number of errors was calculated for each
task and then divided by the total number of times that task was attempted. Minor errors and deviations
were noted but not counted as significant errors. The participant’s path (i.e., steps) through the
application was recorded. Deviations occur if the participant, for example, went to a wrong screen,
clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen
control. These path deviations were included in the minor error count, which is expressed as an average
across all participants.

Efficiency: Task Time
Each task was timed from when the participant indicated they were beginning the task until the
participant said “Done.” If he or she failed to say “Done,” the time was stopped when the participant
stopped performing the task. Only task times for tasks that were successfully completed were included
in the average task time analysis. Average time per task was calculated, as was variance (standard
deviation). Task times were recorded for successes; observed task time divided by the optimal time for
each task is a measure of optimal efficiency. Due to the variability of multiple correct paths and a large
variety of user-settable preferences, all of which can affect time on task, optimal task times and
deviations from these times were unrealistic to assess. All participants were trained on Enablemyhealth
EHR, so a performance factor of 1.0 was used. The overall usability goal for any feature of the EHR is a
task time deemed acceptable by the end user with no critical errors. Thus, if expert, optimal
performance on a task was [60] seconds, then the allotted task time was [60 * 1.0] seconds. This ratio is
aggregated across tasks and reported with mean and variance scores.

Satisfaction: Task Rating
Participants’ subjective impression of the ease of use of the application was measured by administering
both a simple post-task question and a post-session questionnaire. After each task, the participant was
asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). Average difficulty
ratings per task were calculated, as was variance. Common convention is that average ratings for
systems judged easy to use should be 3.3 or above. To measure participants’ confidence in and
likeability of the ENABLEMYHEALTH EHR feature overall, the testing team administered the System
Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system
frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn
to use this system very quickly.” See the full System Usability Scale questionnaire in Appendix 7.
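The scoring described above reduces to simple arithmetic. As a rough sketch only — the attempt records below are hypothetical placeholders, not the study’s raw data — the per-task success rate, mean task time, and observed/optimal time ratio could be computed like this:

```python
# Illustrative scoring sketch; the attempt records are hypothetical,
# not the study's raw data.
from statistics import mean, stdev

# Each record: (completed_successfully, observed_seconds)
attempts = [(True, 110), (True, 125), (True, 98), (True, 140),
            (True, 121), (True, 103), (True, 122)]
optimal_seconds = 90        # expert benchmark time for this task
performance_factor = 1.0    # all participants were trained, so 1.0

success_times = [t for ok, t in attempts if ok]   # failed tasks are not timed
success_rate = 100.0 * len(success_times) / len(attempts)  # reported as %

allotted = optimal_seconds * performance_factor
time_mean = mean(success_times)
time_sd = stdev(success_times)                    # sample standard deviation
deviation_ratio = time_mean / allotted            # observed / optimal efficiency

print(f"success {success_rate:.2f}%  time {time_mean:.0f} ({time_sd:.1f})  "
      f"ratio {deviation_ratio:.2f}")
```

With these placeholder values the ratio works out to 1.30, the same form as the “Task Time Deviations (Observed/Optimal)” entries reported in Table 1.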
4 RESULTS
4.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions would have had
their data excluded from the analyses; however, there were no instances of this occurring. The usability
testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results should be seen in
light of the objectives and goals outlined in Section 3.2, Study Design. The data yield actionable results
that can have a material, positive impact on user performance.
Table 1

| Task | Number | Task Success (%) | Path Deviations (Observed/Optimal) Mean | Task Time Mean (SD), sec | Task Time Deviations (Observed/Optimal) Mean | Errors Mean (SD) | Task Ratings (5=Easy) Mean (SD) |
|---|---|---|---|---|---|---|---|
| Create 3 RX | 3 | 100.00 | 1.06 (31.8/30) | 117 (16.7) | 1.3 (117/90) | 1.86 (1.1) | 3.7 (0.5) |
| Create 3 Lab orders | 3 | 100.00 | 1.05 (11.57/11) | 75 (19) | 1.4 (75/55) | 0.57 (0.7) | 4.1 (0.6) |
| Create 2 imaging orders | 2 | 100.00 | 1.05 (8.43/8) | 44 (10.7) | 1.5 (44/55) | 0.43 (0.7) | 4.6 (0.5) |
| Modify Rx | 1 | 100.00 | 1.04 (13.57/13) | 32 (7.5) | 1.1 (32/30) | 0.57 (0.7) | 3.6 (0.5) |
| Modify lab order | 1 | 100.00 | 1.05 (8.43/8) | 36 (7.2) | 1.4 (36/25) | 0.43 (0.7) | 3.9 (0.6) |
| Modify image order | 1 | 100.00 | 1.00 (8/8) | 32 (6.1) | 1.3 (32/25) | 0 (0) | 4.1 (0.4) |
| View imaging order | 1 | 100.00 | 1.05 (3.14/3) | 13 (1.3) | 1.3 (13/10) | 0.14 (0.4) | 4.1 (0.4) |
| View prior Rx | 1 | 100.00 | 1.00 (3/3) | 12 (1.3) | 1.2 (12/10) | 0 (0) | 4.1 (0.4) |
| Mean across tasks | | 100.00 | 1.46 (11/10) | 45 (8.7) | 1.3 (45/34) | 0.5 (0.6) | 4.0 (0.5) |
The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system,
based on performance with these tasks, at 80.7. Broadly interpreted, this indicates above-average
usability, but with further room for improvement.
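The reported figure follows the standard SUS computation (ten 1–5 ratings; odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the 0–40 sum is multiplied by 2.5). A minimal sketch, using made-up responses rather than any participant’s actual answers:

```python
# Standard System Usability Scale scoring; the example responses are
# illustrative placeholders, not actual participant data.
def sus_score(responses):
    """responses: list of 10 ratings, each 1-5, in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings between 1 and 5")
    # i is 0-based, so i % 2 == 0 corresponds to odd-numbered (positively
    # worded) items; the rest are even-numbered (negatively worded) items.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5  # scale the 0-40 sum to the 0-100 SUS range

example = [5, 2, 4, 1, 5, 2, 5, 1, 4, 2]
print(sus_score(example))  # one participant's score on the 0-100 scale
```

Individual scores are averaged across participants to produce the study-level figure; 68 is the commonly cited average SUS score, so 80.7 sits well above it.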
4.2 DISCUSSION OF THE FINDINGS
The CPOE feature is a combination of electronic prescribing, lab order entry, and imaging order entry. As
such, there are two distinctly different user interfaces involved. Particularly noteworthy is that the
participants were familiar with the EHRUT and had been using it for up to 3 years prior to this study.
Overall, the CPOE feature of the Enablemyhealth user experience exhibits acceptable effectiveness, with
an acceptably low critical error rate and minimal variance among users. The critical error rate is of
greatest concern for an EHR; in the case of the EHRUT there were no task failures due to critical errors
out of 91 attempted tasks, but there were 2 deviations from normal operations. This is within internal usability
goal guidelines. Of the deviation errors, 13 were attributable to electronic prescribing, primarily
involving finding the correct prescription and dosage/form, and a non-defaulting measurement unit.
Efficiency measures indicate that, while acceptable, there is room for improvement, specifically in the
electronic prescription interface, although searching for lab and radiology orders can also be difficult if
the names of the tests do not match the terminology used by the participant. One older user took
significantly longer to perform every task, but this may also be attributable to nervousness and
second-guessing.
Satisfaction, measured by an SUS score of 80.7, is very good but can be improved. Working to make the
user experience more natural and intelligent will improve the overall user experience and help guide
both novice and advanced users.
4.3 EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the CPOE feature has an
acceptable level of effectiveness. The usability test consisted of 91 tasks of which 91 were attempted.
Out of the 91 attempted tasks, there were no critical errors, giving a successful completion rate of 100%.
The 13 deviation (minor) errors in the electronic prescriptions were difficulties associated with
non-defaulting scripts that required making changes in each field to build the script. Lab errors involved
selecting a lab company and then finding the desired lab test. Clearly, the shorter the lab test list, the
easier the user experience. Because our system can customize the lab and radiology order names, the
tests can be tailored to the user. Order sets, which were not used in this testing, can eliminate the
problem of complex prescriptions and other types of orders. The deviation error mean of 1.86 is within
the acceptable range of effectiveness, with the errors most often involving finding the order by its
spelling.
4.4 EFFICIENCY
Observation of task time is the primary area for improvement. The mean task time for three prescription
orders of 117.14 seconds is largely due to:
• time to find the correct prescription, dose, and form
• one complex prescription script
• the number of fields on the prescription screen
Simple prescription tasks, such as 1 tablet of a drug once per day by mouth for 30 days, posed little
difficulty and did not greatly affect task time. More complex prescriptions, such as those that required
complex scripts, selection of measurement units, and detailed directions, increased the task time by
30%. We know that prescription order sets address this issue, but we also know that there are other
solutions for improving usability.
Three lab orders had a mean entry time of 75 seconds with a standard deviation of 19 seconds, which
was significantly impacted by one participant who does not order labs and took more than twice as long
as the mean participant. None of the labs had Ask on Entry questions, which would have added
significantly more time. Again, lab order sets were not used; they would also simplify and speed the
process.
Two radiology orders had a mean entry time of 44 seconds with a standard deviation of 10.7 seconds.
Again, one participant took more than twice as long as the mean participant. On average, lab orders
took three seconds longer than radiology orders per task. This was caused by the confusion of having to
select a lab company to order from, while radiology orders require no such group selection. This is
something we may change in our next usability test to make the two workflows consistent.
4.5 SATISFACTION
Based on task difficulty ratings and SUS results data, user satisfaction with the feature is considered
favorable. Users’ perception of ease of use is considered easy, with a mean rating of 4.04 across all tasks
(1=very difficult, 5=very easy). SUS scoring is very good, with a score of 80.7. These scores describe the
average participant satisfaction with the Enablemyhealth feature set.
SUS Scoring: Mean 80.71, StDev 4.57

4.6 MAJOR FINDINGS
The major usability problems encountered in this study were:
Electronic prescribing difficulties:
• Time on task continues to be a serious problem because there are so many fields required to write a script.
• Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges.
• Non-tablet prescriptions consistently lowered ease-of-use scores.
• Participants did not readily understand that the frequency, dose, and days fields calculate the quantity
and auto-generate the directions.
Lab and imaging order difficulties:
• The concept of Lab Order Sets can be confusing for some
• The user is required to know the structure/set-up to avoid excessive and confusing searching. Lab
orders are associated with a lab order company. Further, groups of lab orders can be set up as lab order
sets, but there is no easy way to determine the set membership of a given type of lab procedure
• Search is not global across all lab companies. Searching is strictly within a single lab company, thus if a
desired lab procedure is not within the selected company, search returns a null
• Hidden search functionality – search is explicitly invoked in all other screens in the EHR except in the
lab order search function where it is performed implicitly upon hitting the enter key
• Only one user was able to search by lab order description successfully
• Lab orders are arranged in a horizontal alphabetic layout which was unexpected and confusing
• Required field(s) not indicated – the “facility” field is required and not indicated. Many users missed
this, increasing the minor error rate
4.7 AREAS FOR IMPROVEMENT
Electronic prescribing:
1. Always default a measurement unit for every prescription type.
2. Add speech recognition to allow prescription to be spoken and the script to be automatically
filled in with natural language processing.
3. Add more intelligence to recommend prescriptions and scripts based on diagnosis and patient
demographics.
4. Provide a wizard option that leads users through the prescription scripting process.
Lab and imaging orders:
1. Improve the search function to allow lab orders to be searched across lab companies without
requiring a lab company to be selected first.
2. Make the print function also print electronic lab orders.
3. Add a New button to clear the lab order form.
4. Add speech recognition to allow lab/radiology orders to be spoken.
5. Add more intelligence to recommend lab and radiology test based on diagnosis and patient
demographics.
6. Provide a wizard option that leads users through the order process.
5 APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 45 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of EnableMyHealth Practice Manager & EHR, what percentage of time is spent in
each product? (per day)
Domain Knowledge
· Rate your expertise or comfort in using EnableMyHealth software on a scale of 1 to 5 (1=just
starting, 5=expert) for:
§ PM
§ EHR (overall)
§ EHR e-Prescribing
[Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those
products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
| Gender | Total (participants) |
|---|---|
| Men | 2 |
| Women | 5 |
| Total | 7 |

Occupation/Role
| Occupation/Role | Total (participants) |
|---|---|
| RN/NP/MA | 4 |
| Physician | 3 |
| Total | 7 |

Years of Experience (average)
| Category | Years |
|---|---|
| Professional | 23.71 |
| EHR Product Use | 2.86 |

Demographic detail
| Participant ID | Gender | Age | Occupation / Role | Professional Experience (yrs) | Computer Experience (intermediate, advanced, expert) | Product Experience (yrs) |
|---|---|---|---|---|---|---|
| C1 | M | 48 | MD | 27 | intermediate | 3 |
| C2 | F | 58 | Medical Assistant | 35 | intermediate | 3 |
| C3 | F | 38 | Medical Assistant | 17 | intermediate | 4 |
| C4 | F | 37 | MD | 16 | intermediate | 3 |
| C5 | F | 74 | MD | 53 | intermediate | 3 |
| C6 | M | 35 | Medical Assistant | 14 | intermediate | 3 |
| C7 | F | 25 | Medical Assistant | 4 | intermediate | 1 |
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to
evaluate an electronic health records system. If you decide to participate, you will be asked to perform
several tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I
am free to withdraw consent or discontinue participation at any time. I understand and agree to
participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
[ ] YES, I have read the above statement and agree to be a participant.
[ ] NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Usability Task Scenarios – CPOE
For the CPOE test we will use the Garth Brooks encounter dated 7/15/2015 to create prescriptions, labs,
and radiology orders, then modify them and view them.
Create Orders
1. Medications – create and save the following 3 prescriptions. The pharmacy can use generics if
they choose to do so. Do not sign these prescriptions.
a. Simvastatin, 20 mg tablet by mouth once daily; dispense 30, 1 refill
b. Lorazepam, 0.5 mg tablet by mouth three times daily; dispense 20, 1 refill
c. Insulin Glargine, 10 units once daily; package of 5, 2 refills
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
2. Laboratory Orders – Click Orders on the main menu, select Lab, then select each lab order and save
each one:
a. Select: Creatinine 24H renal clearance panel and click save.
b. Select Cholesterol in HDL in serum and click save.
c. Select Fasting glucose in serum and click save.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
3. Radiology Orders - Click Orders on main menu then:
a. Select Radiologic examination knee 3 views (cpt 73562) and click save.
b. Select CT head/brain w/o contrast material (cpt 70450) and click save.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
Modify Orders
4. Modify Prescription for Garth Brooks encounter 7/15/2015: change Simvastatin 20 mg tablet by mouth
once daily; dispense 30, 1 refill to:
Atorvastatin 20 mg tablet by mouth once daily; dispense 30, 2 refills
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
5. Modify Lab order: change Cholesterol in HDL in serum or plasma to:
Cholesterol in LDL in serum or plasma by direct assay
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
6. Modify Radiology order: change CT head/brain w/o contrast material to:
CT head/brain w/ contrast material (Computed tomography, head or brain; with contrast material)
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
View Orders:
7. Find and view radiology orders for patient Garth Brooks on 7/15/2015 in Patient Center.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
8. Find and view prescription orders for patient Garth Brooks on 7/15/2015 in Patient Center.
Overall, how difficult or easy did you find this task?
Very Difficult   1   2   3   4   5   Very Easy
Task Time : _____ seconds
Comments:
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like softphone
c. Turn off notifications – audible and screen – on any software that may be running during
the test e.g.: Outlook
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n. After each task or task set (see specific scenarios) ask for response to single ease of use
question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
30 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your
own, following the instructions closely. Please note that we are not testing you; we are testing the
system, so if you have difficulty, all this means is that something needs to be improved in the
system. There are no wrong answers! We will be here in case you need specific help, but we will not be
able to instruct you or provide help in how to use Enablemyhealth; however, we may provide specific
hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
Critical Error count:
Minor error count:
Optimal path deviations:
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
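The per-task fields of this log map naturally onto a simple record structure. The sketch below is a hypothetical Python representation of the form (the class and field names are our own invention, not part of any Enabledoc tooling), illustrating how one entry per task could be tallied:

```python
from dataclasses import dataclass, field

@dataclass
class TaskRecord:
    """One per-task entry from the summative usability test data log."""
    task_number: int
    critical_errors: int = 0        # anything that constitutes task failure
    minor_errors: int = 0           # errors the user detects and recovers from
    path_deviations: int = 0        # departures from the optimal path
    time_on_task_sec: float = 0.0   # stopwatch time in seconds
    seouq_response: int = 0         # Single Ease of Use Question rating (1-5)
    comments: list = field(default_factory=list)

@dataclass
class SessionLog:
    """Header fields of the data log plus one TaskRecord per task."""
    test_type: str
    participant_code: str
    session_start: str
    file_name: str
    tasks: list = field(default_factory=list)

# Example: record task 1 for a hypothetical participant
log = SessionLog(test_type="CDS summative", participant_code="C1",
                 session_start="2015-08-15 09:00", file_name="C1_session.mp4")
log.tasks.append(TaskRecord(task_number=1, minor_errors=1,
                            time_on_task_sec=58.7, seouq_response=4,
                            comments=["hesitated on screen selection"]))
```

One record per task per participant keeps the spreadsheet export straightforward: each `TaskRecord` becomes a row keyed by participant code and task number.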
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
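The computation Brooke describes can be sketched in a few lines of Python. This is a standard implementation of the published scoring rule, not code from this study: odd-numbered (positively worded) items contribute (rating - 1), even-numbered (negatively worded) items contribute (5 - rating), and the summed contributions are multiplied by 2.5 to yield a 0-100 score.

```python
def sus_score(responses):
    """Score ten 1-5 SUS ratings per Brooke (1996)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd-numbered items: rating - 1; even-numbered items: 5 - rating.
        total += (rating - 1) if item % 2 else (5 - rating)
    return total * 2.5

# Neutral ratings on every item yield the midpoint score:
print(sus_score([3] * 10))  # 50.0
```

Note that the raw item ratings are not averaged directly; the alternating positive/negative wording is normalized first, which is why a single overall score can be reported (such as the 87.5 in this study).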
Each statement is rated from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently                1   2   3   4   5
2. I found the system unnecessarily complex                               1   2   3   4   5
3. I thought the system was easy to use                                   1   2   3   4   5
4. I think that I would need the support of a technical person to be
   able to use this system                                                1   2   3   4   5
5. I found the various functions in this system were well integrated      1   2   3   4   5
6. I thought there was too much inconsistency in this system              1   2   3   4   5
7. I would imagine that most people would learn to use this system
   very quickly                                                           1   2   3   4   5
8. I found the system very cumbersome to use                              1   2   3   4   5
9. I felt very confident using the system                                 1   2   3   4   5
10. I needed to learn a lot of things before I could get going with
    this system                                                           1   2   3   4   5
EHR Usability Test Report of Enablemyhealth EHR Clinical Decision Support
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1  Executive Summary ................................................... 2
2  INTRODUCTION ........................................................ 4
3  METHOD .............................................................. 4
   3.1  PARTICIPANTS ................................................... 4
   3.2  STUDY DESIGN ................................................... 5
   3.3  TASKS .......................................................... 6
   3.4  PROCEDURES ..................................................... 6
   3.5  TEST LOCATION .................................................. 7
   3.6  TEST ENVIRONMENT ............................................... 7
   3.7  TEST FORMS AND TOOLS ........................................... 8
   3.8  PARTICIPANT INSTRUCTIONS ....................................... 8
   3.9  USABILITY METRICS .............................................. 9
   3.10 DATA SCORING ................................................... 9
4  RESULTS ............................................................ 10
   4.1  DATA ANALYSIS AND REPORTING .................................. 11
   4.2  DISCUSSION OF THE FINDINGS ................................... 11
   4.3  EFFECTIVENESS ................................................ 12
   4.4  EFFICIENCY ................................................... 12
   4.5  SATISFACTION ................................................. 12
   4.6  MAJOR FINDINGS ............................................... 12
   4.7  AREAS FOR IMPROVEMENT ........................................ 13
5  APPENDICES ........................................................ 13
Appendix 1: PARTICIPANT RECRUITING SCREENER .......................... 14
Appendix 2: PARTICIPANT DEMOGRAPHICS ................................. 16
Appendix 3: Informed Consent Form .................................... 17
Appendix 4: Task Scenarios ........................................... 18
Appendix 5: Usability Test Administration Moderator’s Guide .......... 20
Appendix 6: Sample data collection form .............................. 23
Appendix 7: System Usability Scale Questionnaire ..................... 24
1 Executive Summary
A usability test of EnableMyHealth – 2014.1, an ambulatory EHR, was conducted on 15 Aug 2015 – 14
Sep 2015 in the McLean, VA and Rochester MN offices of EnableDoc LLC. The purpose of this test was to
test and validate the usability of the current user interface for Clinical Decision Support (CDS), and
provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, seven healthcare providers and clinical staff matching the target demographic
criteria served as participants and used the EHRUT in simulated, but representative tasks.
This study collected performance data on 7 clinical decision support tasks that trigger clinical
intervention warnings, provide clinical reference information, and allow the CDS status to be updated in
an EHR where the user is required to:
· Trigger a problem intervention, click Infobutton, and change status to completed.
· Trigger a medication intervention, click Infobutton, and change status to completed.
· Trigger a medication allergy intervention, click Infobutton, and change status to completed.
· Trigger a lab test and value intervention, click Infobutton, and change status to completed.
· Trigger a vital intervention and change status to completed.
· Trigger a demographic intervention, click Infobutton, and change status to completed.
· Trigger a demographic with a problem intervention, click Infobutton, and change status to
completed.
All test participants conducted the test sessions remotely via on-line conferencing software. During the
30 minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR as they
are current users/customers. No additional training materials were provided other than that usually
given to customers. The administrator introduced the test and instructed participants to complete a
series of tasks (given one at a time) using Enablemyhealth. During the testing, the administrator timed
the test and, along with the data logger(s) recorded user performance data on paper and electronically.
The administrator did not give the participant assistance in how to complete the task. Participant
screens and audio were recorded for subsequent analysis. The following types of data were collected for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test SUS questionnaire and were compensated with a $50 gift card for their time.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions:

Table 1

TASKS                                  Task     Path Deviations     Task Time   Task Time Deviations  Errors     Task Ratings
                                       Success  (Observed/Optimal)  Mean (SD)   (Observed/Optimal)    Mean (SD)  5=Easy Mean (SD)
Trigger a problem intervention         100.00   1.2 (10.6/9)        58.7 (5.3)  1.2 (58.7/50)         1.9 (1.1)  4.1 (0.4)
Trigger a medication intervention      100.00   1.1 (12/11)         58.6 (2)    1.1 (58.6/55)         0.6 (0.7)  4.3 (0.5)
Trigger a medication allergy
intervention                           100.00   1.1 (13.7/13)       61.9 (3.4)  1.1 (61.8/58)         0.4 (0.7)  4.6 (0.5)
Trigger a lab test and value
intervention                           100.00   1.0 (7.3/7)         32.7 (2.1)  1.1 (32.7/30)         0.6 (0.7)  4.6 (0.5)
Trigger a vital intervention           100.00   1.1 (12/11)         44.3 (3.3)  1.1 (44.3/40)         0.4 (0.7)  4.7 (0.5)
Trigger a demographic intervention     100.00   1.0 (8/8)           31.9 (1.6)  1.1 (31.9/30)         0.00       5.0 (0)
Trigger a demographic with a
problem intervention                   100.00   1.1 (9.6/9)         53.6 (2.2)  1.1 (53.6/50)         0.1 (0.4)  4.9 (0.4)
Mean across tasks                      100.00   1.1 (10.5/9.7)      48.8 (2.9)  1.1 (48.8/44.7)       0.6 (0.6)  4.6 (0.4)
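As a consistency check, the "Mean across tasks" row can be recomputed from the per-task values reported in Table 1. This short Python sketch uses the observed mean task times and the optimal times implied by the deviation ratios:

```python
# Observed mean task times (sec) and optimal times for the seven CDS
# tasks, taken from Table 1 row by row.
observed = [58.7, 58.6, 61.9, 32.7, 44.3, 31.9, 53.6]
optimal = [50, 55, 58, 30, 40, 30, 50]

mean_observed = sum(observed) / len(observed)   # mean time on task
mean_optimal = sum(optimal) / len(optimal)      # mean optimal time

print(round(mean_observed, 1))                  # 48.8, matching the table
print(round(mean_optimal, 1))                   # 44.7
print(round(mean_observed / mean_optimal, 1))   # 1.1, the deviation ratio
```

The recomputed means agree with the table's summary row, and the 1.1 ratio indicates participants took on average about 10% longer than the optimal times.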
The System Usability Scale scored the subjective satisfaction with the system, based on performance
with these tasks, at 87.5.
In addition to the performance data, the following qualitative observations were made:
The CDS capability of the EHRUT exhibits high usability. The trigger functionality is automated. Most
deviations involved selecting the wrong screen to enter data. Longer task times reflected participants
being careful in their selections or selecting the incorrect screen.
The major usability difficulties encountered were:
· Users were confused why they had to go to the Superbill to trigger the CDS alert. We explained
that all providers go to the Superbill screen at some point in the clinical documentation process
to document the diagnosis and billing codes.
· There was confusion concerning the Infobutton, the value it provided, and how it worked with
Education Materials. Functionally it worked well, but some labs and problems did not have links
in MedlinePlus.
Areas for Improvement
· Provide more textual description on the CDS popup screen.
· Allow users to set which screens trigger decision support.
2 INTRODUCTION
The EHRUT tested for this study was EnableMyHealth, Fall 2013 Release, an ambulatory EHR. Designed
to present medical information to healthcare providers in primarily small private practice facilities (1-5
providers) with a focus on family practice, Ob/GYN, Pediatrics and Internal Medicine, the EHRUT consists
of modules and features that include, but are not limited to, support for:
· Patient demographics
· Patient problem and allergy lists
· Patient encounter notes and charts
· Patient clinical documents
· Creating lab orders and receiving lab results
· Growth charting
· Inoculation recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
· Quality reporting
· Patient population reporting
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, LPNs, etc.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and provide
evidence of usability of clinical decision support in the EHR Under Test (EHRUT). To this end, measures
of effectiveness, efficiency and user satisfaction, such as successful task completion rate, time on task,
number and types of errors, and participant satisfaction were captured during the usability testing.
3 METHOD
3.1 PARTICIPANTS
A total of 7 participants were tested on the CDS feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus have the same orientation and level of training as other
non-participant customers. For the test purposes, end-user characteristics were identified and
translated into an internal recruitment screener used to solicit potential participants; an example of a
screener is provided in Appendix [1].
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Participant ID  Gender  Age  Occupation/role    Professional      Computer Experience        Product
                                                Experience (yrs)  (intermediate, advanced,   Experience (yrs)
                                                                  expert)
C1              M       48   MD                 27                intermediate               3
C2              F       58   Medical Assistant  35                intermediate               3
C3              F       38   Medical Assistant  17                intermediate               4
C4              F       37   MD                 16                intermediate               3
C5              F       74   MD                 53                intermediate               3
C6              M       35   Medical Assistant  14                intermediate               3
C7              F       25   Medical Assistant  4                 intermediate               1
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all
participated in the usability test. Participants all had prior experience with the EHR. No participants
failed to show for the study.
Participants were scheduled for 30 minute sessions with 30 minutes in between each session for debrief
by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves both to benchmark current usability and to identify areas where
improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with join.me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
· Number of tasks successfully completed within the allotted time without assistance
· Time to complete the tasks
· Number and types of errors
· Path deviations
· Participant’s verbalizations (comments)
· Participant’s satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed to be realistic and representative of the kinds of activities a user might do with
this EHR in the area of clinical decision support interventions, and were heavily based on the CMS test
scripts for Meaningful Use Phase 2. The tasks for this study broadly included:
· Trigger a problem intervention, click Infobutton, and change status to completed.
· Trigger a medication intervention, click Infobutton, and change status to completed.
· Trigger a medication allergy intervention, click Infobutton, and change status to completed.
· Trigger a lab test and value intervention, click Infobutton, and change status to completed.
· Trigger a vital intervention and change status to completed.
· Trigger a demographic intervention, click Infobutton, and change status to completed.
· Trigger a demographic with a problem intervention, click Infobutton, and change status to
completed.
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those most troublesome
to users. Tasks were directly modeled on the CMS Clinical Decision Support test procedure
[170.314(a)(8)].
3.4 PROCEDURES
When a participant was scheduled, an email was sent to the participant that:
• Confirmed the time of the session
• Provided Join.Me access codes
• Included a copy of the informed consent form
• Included a document containing the tasks for the test session
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session, including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on participant
comments, path deviations, and the number and type of errors.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible making as few errors and deviations as possible.
· Without assistance; administrators were allowed to give immaterial guidance and clarification
on tasks, but not instructions on use.
· Without using a think aloud technique.
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.5 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing
software. Thus the actual test location was at the discretion of the test participants. The test
administrator conducted the test from Enabledoc LLC offices in Rochester MN.
3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was
conducted in a healthcare office. For testing, the computer used was a Windows laptop running Windows OS
and Chrome browser. The participants used a mouse and keyboard when interacting with the EHRUT.
The Enablemyhealth application was used on a 13 inch laptop with 1366 by 768 resolution and 32 bit
color. The application was set up by the vendor according to the vendor’s documentation describing the
system set-up and preparation. The application itself was running on a Windows 2012 Server using a
test database on a WAN connection. Technically, the system performance (i.e., response time) was
representative of what actual users would experience in a field implementation, with a minor lag caused
by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
• Informed Consent
• Test task scenarios
• Moderator’s Guide
• Observer’s data collection template
• Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the Join.me session.
3.8 PARTICIPANT INSTRUCTIONS
The administrator reads the following instructions aloud to each participant (also see the full
moderator’s guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last about
30 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible making as few errors as possible. Please try to complete the
tasks on your own, following the instructions closely. Please note that we are not testing you; we
are testing the system, so if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help,
but we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that you
provide will be kept confidential and your name will not be associated with your comments at any time.
Should you feel it necessary you are able to withdraw at any time during the testing for any reason.
In today’s session, we will be using the decision support functionality that has added some new
features. Let’s go over those now so you will have some familiarity.
The first new feature is called the Infobutton, which provides additional clinical information from
MedlinePlus. The second new feature is a set of links to reference and diagnostic resources. Lastly,
CDS alerts have a status that is saved in the Patient Center screen for each patient, to track when
alerts are made and when the status of an alert is changed.
Enabledoc LLC ©2015 Confidential and Proprietary
Version 1
8
[important: only ask the following question if they indicated that they review/read release
notes, view training videos, etc. based on their earlier answer regarding how they find out
about new features and how to use them. Be ready to display the help file.]
Would you like to review the help topic for these features now?
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior to
giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly ask
questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
3.9
USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is
for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing.
The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.10 DATA SCORING
The following table details how tasks were scored, how errors were evaluated, and how the time data were analyzed.

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the overall time allotted for the entire set of tasks. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or was unsure whether they had completed the task, the task was counted as a "Critical Failure." No task times were taken for failed tasks. This is expressed as the mean number of failed tasks per participant. Minor errors were defined as an errant click, initial selection of an incorrect menu option, or incorrect entries that the participant noticed and corrected. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Minor errors and deviations were noted but not counted as significant errors.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. These path deviations were included in the minor error count. The minor error count is expressed as an average across all participants.

Efficiency: Task Time
Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of efficiency. Due to the variability of multiple correct paths and a large variety of user-settable preferences, all of which can affect time on task, optimal task times and deviations from those times were unrealistic to assess. All participants were trained on Enablemyhealth EHR, so a performance factor of 1.0 was used. The overall usability goal for any feature of the EHR is a task time deemed acceptable by the end user with no critical errors. Thus, if expert, optimal performance on a task was [60] seconds, then the allotted task time was [60 * 1.0] seconds. This ratio is aggregated across tasks and reported with mean and variance scores.

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 7 (Very Easy). Average difficulty ratings per task were calculated, as was the variance. Common convention is that average ratings for systems judged easy to use should be 3.3 or above. To measure participants' confidence in and likeability of the ENABLEMYHEALTH EHR feature overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 7.
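As a concrete illustration of the scoring rules above, the following sketch computes one task's effectiveness and efficiency metrics. The data and variable names here are hypothetical examples of ours, not the study's actual observations or tooling.

```python
# Illustrative sketch only: per-task metrics as described in Data Scoring.
# All data below are hypothetical, not taken from this study.
from statistics import mean, pstdev

successes = [True] * 7                            # all 7 participants succeeded
observed_steps = [10, 11, 10, 12, 10, 11, 10]     # steps each participant took
optimal_steps = 9                                 # expert (optimal) path length
task_times = [58.2, 60.1, 52.3, 63.0, 55.4, 61.9, 59.8]  # seconds, successes only
optimal_time = 50.0                               # expert time; factor 1.0 applied

success_pct = 100.0 * sum(successes) / len(successes)     # percentage of attempts
step_ratio = mean(observed_steps) / optimal_steps         # path deviations, obs/opt
time_ratio = mean(task_times) / (optimal_time * 1.0)      # task time, obs/opt

print(f"Task success: {success_pct:.2f}%")
print(f"Path deviations (observed/optimal): {step_ratio:.1f}")
print(f"Task time mean (SD): {mean(task_times):.1f} ({pstdev(task_times):.1f})")
print(f"Task time ratio (observed/optimal): {time_ratio:.1f}")
```

This mirrors the table conventions in the Results section: a ratio near 1.0 indicates performance close to the optimal path and time.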
4
RESULTS
4.1
DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions would have had their data
excluded from the analyses; however, there were no instances of this.
The usability testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results
should be seen in light of the objectives and goals outlined in Section 3.2, Study Design. The data should
yield actionable findings that, when addressed, have a material, positive impact on user performance.
Table 1

Task | Task Success (%) | Path Deviations (Observed/Optimal) | Task Time Mean sec (SD) | Task Time Deviations (Observed/Optimal) | Errors Mean (SD) | Task Rating, 5=Easy, Mean (SD)
Trigger a problem intervention | 100.00 | 1.2 (10.6/9) | 58.7 (5.3) | 1.2 (58.7/50) | 1.9 (1.1) | 4.1 (0.4)
Trigger a medication intervention | 100.00 | 1.1 (12/11) | 58.6 (2) | 1.1 (58.6/55) | 0.6 (0.7) | 4.3 (0.5)
Trigger a medication allergy intervention | 100.00 | 1.1 (13.7/13) | 61.9 (3.4) | 1.1 (61.8/58) | 0.4 (0.7) | 4.6 (0.5)
Trigger a lab test and value intervention | 100.00 | 1.0 (7.3/7) | 32.7 (2.1) | 1.1 (32.7/30) | 0.6 (0.7) | 4.6 (0.5)
Trigger a vital intervention | 100.00 | 1.1 (12/11) | 44.3 (3.3) | 1.1 (44.3/40) | 0.4 (0.7) | 4.7 (0.5)
Trigger a demographic intervention | 100.00 | 1.0 (8/8) | 31.9 (1.6) | 1.1 (31.9/30) | 0.00 | 5.0 (0)
Trigger a demographic with a problem intervention | 100.00 | 1.1 (9.6/9) | 53.6 (2.2) | 1.1 (53.6/50) | 0.1 (0.4) | 4.9 (0.4)
Mean across tasks | 100.00 | 1.1 (10.5/9.7) | 48.8 (2.9) | 1.1 (48.8/44.7) | 0.6 (0.6) | 4.6 (0.4)
The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system,
based on performance with these tasks, at 87.5. This SUS score is considered above average, which
participants attributed to the automated and simple alerts.
4.2
DISCUSSION OF THE FINDINGS
Clinical Decision Support interventions are triggered when a patient meets the rule criteria and the provider
accesses the Superbill screen or the Diagnosis or Orders screen. A CDS alert is triggered as a popup that
describes the alert, provides the guidance source, displays reference information and links, provides
Infobutton links to MedlinePlus, allows an order set to be selected, and allows the status of the alert to be
changed. The rules that trigger CDS alerts are based on configurable criteria, such as encounter dates,
lab tests and values, vital signs, demographics, problems, current medications, and allergies.
CDS has been available in Enablemyhealth since 2010, but new information is now presented to comply
with the 2014 requirements. Since this is an automated feature, participants had no difficulty triggering
the alerts, viewing the Infobutton links, and changing the status.
Efficiency measures indicate that users had minor deviations from the optimal steps (average across tasks
of 1.1). The minor error rate, which includes optimal path deviations, is acceptably low, with an overall
mean of 0.6.
Satisfaction, measured by an SUS score of 87.5, is high. This score appeared to be influenced by the
value participants placed on CDS. Participant-perceived ease of use is acceptably high, with a mean
rating of 4.6 on a 1-5 scale across all tasks.
4.3
EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the interactions module has an
acceptable level of effectiveness after sufficient training. Without training, however, the CDS features
exhibit unacceptably low usability, as demonstrated by a successful completion rate of less than 50%.
The usability test consisted of 49 tasks across all participants, all of which were attempted. During the
test, one participant spent more time evaluating the information rather than focusing on completing
the task.
Of the 49 attempted tasks, the successful completion rate was 100%. On some tasks, participants
produced deviation errors by mistakenly clicking an incorrect menu option. The minor error mean of 0.6
is well within the acceptable range of effectiveness.
4.4
EFFICIENCY
Observation of task times and minor errors, which include path deviations, indicates acceptable
performance. The mean task times compared to optimal were very consistent across all tasks. Some
participants wanted to examine the information in the links and references. Each task had very few
steps to trigger the alert, which reduced the time it took to perform the task, but more time was spent
reviewing the information.
4.5
SATISFACTION
Based on task difficulty ratings and SUS results data, user satisfaction with the feature is considered
favorable. Users' perception of ease of use is considered easy, with a mean rating of 4.04 across all tasks
(1=very difficult, 5=very easy). SUS scoring is very good, with a score of 87.5. These scores describe the
average participant satisfaction with the Enablemyhealth feature set.

SUS Scoring: Mean 84.64, StDev 4.90

4.6
MAJOR FINDINGS
The major usability difficulties encountered were:
· Users were confused why they had to go to the Superbill to trigger the CDS alert. We explained
that all providers go to the Superbill screen at some point in the clinical documentation process
to document the diagnosis and billing codes.
· There was confusion concerning the Infobutton, the value it provided, and how it worked
with Education Materials. Functionally it worked well, but some labs and problems did not have
links in MedlinePlus.
4.7
AREAS FOR IMPROVEMENT
· Provide more textual description on the CDS popup screen.
· Allow users to set which screens trigger decision support.
5
APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 45 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of EnableMyHealth Practice Manager & EHR, what percentage of your time is
spent in each product (per day)?
Domain Knowledge
· Rate your expertise or comfort in using EnableMyHealth software on a scale of 1 to 5 (1=just
starting, 5=expert) for:
§ PM
§ EHR (overall)
§ EHR e-Prescribing
[Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those
products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.

Gender
Men: 2
Women: 5
Total (participants): 7

Occupation/Role
RN/NP/MA: 4
Physician: 3
Total (participants): 7

Years of Experience (average)
Professional: 23.71
EHR Product Use: 2.86

Demographic detail

Participant ID | Gender | Age | Occupation/role | Professional Experience (yrs) | Computer Experience | Product Experience (yrs)
C1 | M | 48 | MD | 27 | intermediate | 3
C2 | F | 58 | Medical Asst | 35 | intermediate | 3
C3 | F | 38 | Medical Asst | 17 | intermediate | 4
C4 | F | 37 | MD | 16 | intermediate | 3
C5 | F | 74 | MD | 53 | intermediate | 3
C6 | M | 35 | Medical Asst | 14 | intermediate | 3
C7 | F | 25 | Medical Asst | 4 | intermediate | 1

(Computer experience categories: intermediate, advanced, expert.)
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to evaluate
an electronic health records system. If you decide to participate, you will be asked to perform several
tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I
am free to withdraw consent or discontinue participation at any time. I understand and agree to
participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Note: In conjunction with this usability study were tasks relating to Clinical Information Reconciliation.
The results of that study are described in a separate report and not included here. The two tasks of
that study (#8 & #9) are included in this task scenario document in light grey text and are irrelevant to
this report.
Usability Task Scenarios – CDS
Note: in all tasks, please complete only to the point where any new information is ready to be saved or
signed. Please do not sign any prescriptions or perform the final completion step!
1. Patient George Burns is coming in for a diabetes checkup. After code E11.00 is selected, click
Superbill to trigger the Diabetes Type II alert. Click the information link at the bottom of the screen.
Change the status to Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
2. Patient Hally Clinton is taking Warfarin 10 mg oral tablets, 1 tablet once a day for 30 days. The
current medication is entered and saved. Clicking Superbill triggers an alert. Review the
detailed information regarding warnings with Warfarin by clicking the medication link, then
change the status to Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
3. Patient Jesus Trump has a severe allergy to aspirin that causes a rash. Select Aspirin as the
allergy, select rash, select severe, then click Save. Click Superbill to activate the alert. Click the
information link to see more information about aspirin allergies, then click Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
4. Patient Max Carson had a WBC lab test which returned a hemoglobin result of 10. Clicking
Superbill triggers an alert; review the information on low hemoglobin results, then click Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
5. Donald Obama has vitals entered for 6 feet and 250 pounds, which generates a BMI over 30.
Click Superbill to trigger an alert. Review the guidance and select Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
7. Sam Adams is now over 50 years old and has a physical. Clicking Superbill provides guidance on
other tests Sam should have. Select Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
8. Bo Hussien is over 50 years old and has straining on urination. Clicking Superbill triggers guidance
to have a PSA test performed to check for prostate cancer. Click the information link to find
other possible diagnoses. Select Completed.
Overall, how difficult or easy did you find this task?
Very Difficult 1  2  3  4  5 Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to the participant
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n.
After each task or task set (see specific scenarios) ask for response to single ease of use
question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Introductory script:
Thank you for participating in this study. Your input is very important. Our session today will last about
30 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your
own, following the instructions very closely. Please note that we are not testing you; we are testing the
system, so if you have difficulty, all this means is that something needs to be improved in the
system. There are no wrong answers! We will be here in case you need specific help, but we will not be
able to instruct you or provide help in how to use the application; however, we may provide specific hints
as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
[Show informed consent document and get verbal acceptance]
Do you have any questions or concerns before we begin?
Let’s start with some basic information:
What is your job title?
How long have you been in this profession?
What is your specialty?
How long have you used the EHR?
How do you learn about new features as they are added to the EHR?
[if necessary, prompt about reading release notes, viewing training videos,
using the help system, or do they just start using the feature]
In today’s session, we will be using some new features that are currently being developed. Let’s go over
those now so you will have some familiarity.
The first new feature is called the Infobutton, denoted by a small blue “i” icon that appears as necessary
to provide additional clinical information.
The second new feature is a tool to retrieve incoming clinical information for a patient that is sent
electronically from another provider outside your practice. This tool is to resolve discrepancies or
duplicate information between the incoming record and the clinical record that is in your EHR.
[important: only ask the following question if they indicated that they review/read release notes, view
training videos, etc. based on their earlier answer regarding how they find out about new features
and how to use them. Be ready to display the help file.]
Would you like to review the help topic for this feature now?
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
·
·
·
·
·
·
Critical Error count:
Minor error count:
Optimal path deviations:
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
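For reference, Brooke's scoring procedure converts the ten 1-5 responses into a single 0-100 score: positively worded (odd-numbered) items contribute the response minus 1, negatively worded (even-numbered) items contribute 5 minus the response, and the sum is multiplied by 2.5. A minimal sketch follows; the function name and sample responses are illustrative and hypothetical, not this study's data.

```python
# Minimal sketch of Brooke's SUS scoring. Responses below are hypothetical.
def sus_score(responses):
    """responses: list of ten 1-5 ratings, in questionnaire item order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        # Items 1,3,5,7,9 (index even) are positively worded: contribute r - 1.
        # Items 2,4,6,8,10 (index odd) are negatively worded: contribute 5 - r.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best case: 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 5, 1, 4, 2]))  # 80.0
```

Per-participant scores computed this way are then averaged to produce study-level figures such as the mean and standard deviation reported in Section 4.5.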
Each statement is rated on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
EHR Usability Test Report of
Enablemyhealth EHR
Drug-Drug & Drug-Allergy Interaction Checking
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1 Executive Summary ................................................. 2
2 INTRODUCTION ...................................................... 3
3 METHOD ............................................................ 4
3.1 PARTICIPANTS .................................................... 4
3.2 STUDY DESIGN .................................................... 5
3.3 TASKS ........................................................... 5
3.4 PROCEDURES ...................................................... 6
3.5 TEST LOCATION ................................................... 7
3.6 TEST ENVIRONMENT ................................................ 7
3.7 TEST FORMS AND TOOLS ............................................ 7
3.8 PARTICIPANT INSTRUCTIONS ........................................ 7
3.9 USABILITY METRICS ............................................... 8
3.10 DATA SCORING ................................................... 8
4 RESULTS ........................................................... 10
4.1 DATA ANALYSIS AND REPORTING .................................... 10
4.2 DISCUSSION OF THE FINDINGS ..................................... 10
4.3 EFFECTIVENESS ................................................... 11
4.4 EFFICIENCY ...................................................... 11
4.5 SATISFACTION .................................................... 11
4.6 MAJOR FINDINGS .................................................. 11
4.7 AREAS FOR IMPROVEMENT .......................................... 12
5 APPENDICES ........................................................ 12
Appendix 1: PARTICIPANT RECRUITING SCREENER ........................ 12
Appendix 2: PARTICIPANT DEMOGRAPHICS ............................... 15
Appendix 3: Informed Consent Form .................................. 16
Appendix 4: Task Scenarios ......................................... 17
Appendix 5: Usability Test Administration Moderator's Guide ........ 17
Appendix 6: Sample data collection form ............................ 20
Appendix 7: System Usability Scale Questionnaire ................... 21
Enabledoc LLC ©2015 Confidential and Proprietary
Version 1
1 Executive Summary
A usability test of EnableMyHealth – 2014.1, an ambulatory EHR, was conducted from 15 Aug 2015 to 14 Sep 2015 in the McLean, VA and Rochester, MN offices of EnableDoc LLC. The purpose of this test was to test and validate the usability of the current user interface for Drug-Drug and Drug-Allergy Interaction Checking, and to provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, five healthcare providers and clinical staff matching the target demographic
criteria served as participants and used the EHRUT in simulated, but representative tasks.
This study collected performance data on two electronic prescription tasks that trigger drug-drug or drug-allergy interaction warnings in an EHR, in which the user is required to:
· Enter a Drug-Drug intervention notification
· Enter a Drug-Allergy intervention notification
All test participants conducted the test sessions remotely via online conferencing software. During the 15-minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR as they
are current users/customers. No additional training materials were provided other than that usually
given to customers. The administrator introduced the test and instructed participants to complete a
series of tasks (given one at a time) using Enablemyhealth. During the testing, the administrator timed
the test and, along with the data logger(s) recorded user performance data on paper and electronically.
The administrator did not give the participant assistance in how to complete the task. Participant
screens and audio were recorded for subsequent analysis. The following types of data were collected for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test SUS questionnaire and were compensated with a $50 gift card for their time.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions
Task | Number | Task Success (%) | Path Deviations (Observed/Optimal) | Task Time Mean (SD), sec | Task Time Deviations (Observed/Optimal) | Errors Mean (SD) | Task Rating Mean (SD), 5=Easy
Drug-Drug Interaction | 6 | 100.00 | 1.1 (14.7/13) | 72 (3.5) | 1.1 (72/65) | 1.86 (1.1) | 3.7 (0.5)
Drug-Allergy Interaction | 3 | 100.00 | 1.1 (12.1/11) | 68.4 (3.3) | 1.1 (68.4/60) | 0.57 (0.7) | 4.1 (0.6)
Mean across tasks | – | 100.00 | 1.1 (13.4/12) | 70 (3.4) | 1.1 (70/62.5) | 1.2 (0.9) | 3.9 (0.6)
The System Usability Scale score for subjective satisfaction with the system, based on performance with these tasks, was 81.
In addition to the performance data, the following qualitative observations were made:
Major findings
· Same issues with adding CPOE prescriptions and medications:
  1. Providers tended to want to write or dictate the script rather than select a frequency of use.
  2. Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges.
  3. The most frequent challenge was determining units of medication when none is defaulted.
  4. Changing a prescription's frequency of use was not easily understood; the preference is to modify the script directions directly.
· Participants asked why they had to override the notifications by typing a reason.
Areas for improvement
· Add default units for all prescriptions.
· Provide aids for recommended dosage based on patient BSA, age, gender, and diagnosis.
· Provide the ability to type or speak the script and parse the data to build the prescription.
· Add a wizard for first-time or novice users adding prescriptions.
· Add a drop-down of common override reasons, and do not require the box to also be checked.

2 INTRODUCTION
The EHRUT tested for this study was the EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to present medical information to healthcare providers, primarily in group practices with a focus on family practice, surgery, eye care, and PT/OT/chiropractic medicine, the EHRUT consists of modules and features that include, but are not limited to, support for the tested functionality:
· Patient demographics
· Patient problem and allergy lists
· Creating lab orders and receiving lab results
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, and LPNs.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and to provide evidence of usability of the entry and modification of electronic prescriptions, laboratory orders, and imaging orders in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency and
user satisfaction, such as successful task completion rate, time on task, number and types of errors, and
participant satisfaction were captured during the usability testing.
3 METHOD

3.1 PARTICIPANTS
A total of 7 participants were tested on the CPOE feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of, or the organization producing, the EHRUT other than being current customers. Participants were not from the testing or supplier organization. Participants were actual end users and thus had the same orientation and level of training as other non-participant customers. For test purposes, end-user characteristics were identified and translated into an internal recruitment screener used to solicit potential participants; an example screener is provided in Appendix 1.
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Participant ID | Gender | Age | Occupation/Role | Professional Experience (yrs) | Computer Experience | Product Experience (yrs)
C1 | M | 48 | MD | 27 | Intermediate | 3
C2 | F | 58 | Medical Assistant | 35 | Intermediate | 3
C3 | F | 38 | Medical Assistant | 17 | Intermediate | 4
C4 | F | 37 | MD | 16 | Intermediate | 3
C5 | F | 74 | MD | 53 | Intermediate | 3
C6 | M | 35 | Medical Assistant | 14 | Intermediate | 3
C7 | F | 25 | Medical Assistant | 4 | Intermediate | 1
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all participated in the usability test. Participants all had prior experience with the EHR. No participants failed to show for the study.
Participants were scheduled for 15-minute sessions with 15 minutes between sessions for debrief by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves both to record or benchmark current usability and to identify areas where improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed to be realistic and representative of the kinds of activities a user might perform with this EHR in the area of electronic prescribing with drug-drug and drug-allergy interaction checking, and were heavily based on the CMS test scripts for Meaningful Use Phase 2. The tasks for this study included:
• Create a prescription that triggers a drug-drug interaction check
• Create a prescription that triggers a drug-allergy interaction check
Task Selection and Priority
Tasks were selected based on frequency of use, criticality of function, and whether they may be most troublesome for users. The tasks were directly modeled on the CMS Drug-Drug/Drug-Allergy Interaction test procedure [170.314(a)(2)].
Tasks were ordered and prioritized based on their impact on patient safety: the tasks with the greatest potential for patient harm due to critical errors were performed first. Thus, the entry of prescriptions that triggered drug-drug or drug-allergy interaction checks was followed by modifying a prescription to a new drug that then triggered an interaction check.
The task scenario document is contained in Appendix 4.
3.4 PROCEDURES
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments, path deviations, and the number and type of errors.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible.
· Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
· Without using a think-aloud technique.
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.5 TEST LOCATION

The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing software; thus, the actual test location was at the discretion of the test participants. The test administrator conducted the test from Enabledoc LLC offices in Rochester, MN.

3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a healthcare office. For testing, a Windows laptop running the Windows OS and the Chrome browser was used. The participants used a mouse and keyboard when interacting with the EHRUT.
The Enablemyhealth application was used on a 13-inch laptop with 1366 by 768 resolution and 32-bit
color. The application was set up by the vendor according to the vendor’s documentation describing the
system set-up and preparation. The application itself was running on a Windows 2012 Server using a
test database over a WAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation, with a minor lag caused by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
• Informed Consent
• Test task scenarios
• Moderator’s Guide
• Observer’s data collection template
• Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the join.me session.
3.8 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full moderator's guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last
about 15 minutes. During that time you will use a version of EnableMyHealth EHR and work with
specific features. Our goal is to determine where there are areas of difficulty and design aspects that
can be improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. There are no wrong answers! We will be here in case you need specific help, but we will not be able to instruct you or provide help in how to use the application; however, we may provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that
you provide will be kept confidential and your name will not be associated with your comments at any
time. Should you feel it necessary you are able to withdraw at any time during the testing for any
reason.
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior
to giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly
ask questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
3.9 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is for
users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing. The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.10 DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed:
Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the overall time allotted for the entire set of tasks. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or was unsure whether they had completed the task, the task was counted as a "Critical Failure." No task times were taken for failed tasks. Minor errors were defined as an errant click, initial selection of an incorrect menu option, or incorrect entries that the participant noticed and corrected. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Minor errors and deviations were noted but not counted as significant errors. This is expressed as the mean number of failed tasks per participant.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. These path deviations were included in the minor error count. The minor error count is expressed as an average across all participants.

Efficiency: Task Time
Each task was timed from when the participant indicated they were beginning the task until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the task. Only task times for successfully completed tasks were included in the average task time analysis. Average time per task was calculated, as was variance (standard deviation). Task times were recorded for successes; observed task time divided by the optimal time for each task is a measure of optimal efficiency. Due to the variability of multiple correct paths and a large variety of user-settable preferences, all of which can affect time on task, optimal task times and deviations from these times were unrealistic to assess. All participants were trained on the Enablemyhealth EHR, so a performance factor of 1.0 was used. The overall usability goal for any feature of the EHR is a task time deemed acceptable by the end user with no critical errors. Thus, if expert, optimal performance on a task was [60] seconds, then the allotted task time performance was [60 * 1.0] seconds. This ratio is aggregated across tasks and reported with mean and variance scores.

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 7 (Very Easy). Average difficulty ratings per task were calculated, as was variance. Common convention is that average ratings for systems judged easy to use should be 3.3 or above. To measure participants' confidence in and likeability of the ENABLEMYHEALTH EHR feature overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 7.
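The scoring rules in this section reduce to simple arithmetic over the per-task observations. The sketch below is illustrative only: the per-participant times and step counts are hypothetical sample data (chosen to land near the Drug-Drug row of the results table), not the study's raw records.

```python
from statistics import mean, stdev

# Hypothetical per-task observations -- illustrative sample data only,
# not the study's raw records.
tasks = {
    "Drug-Drug Interaction": {
        "optimal_time": 65, "optimal_steps": 13,   # expert baseline
        "successes": 6, "attempts": 6,
        "times": [70, 75, 68, 74, 71, 74],         # successful attempts only (sec)
        "steps": [14, 15, 14, 16, 15, 14],         # observed path lengths
    },
}

for name, t in tasks.items():
    # Task success: successes / attempts, reported as a percentage.
    success_pct = 100.0 * t["successes"] / t["attempts"]
    # Task time: mean and standard deviation over successful attempts.
    mean_time, sd_time = mean(t["times"]), stdev(t["times"])
    # Observed/optimal ratios for time and path, as in the results table.
    time_ratio = mean_time / t["optimal_time"]
    path_ratio = mean(t["steps"]) / t["optimal_steps"]
    print(f"{name}: success={success_pct:.2f}% "
          f"time={mean_time:.1f} ({sd_time:.1f}) "
          f"time ratio={time_ratio:.1f} path ratio={path_ratio:.1f}")
```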
4 RESULTS

4.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions would have had their data excluded from the analyses; however, there were no instances of this occurring. The usability testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results should be seen in light of the objectives and goals outlined in Section 3.2, Study Design. The data should yield actionable results that, if addressed, produce a material, positive impact on user performance.
Table 1
Task | Number | Task Success (%) | Path Deviations (Observed/Optimal) | Task Time Mean (SD), sec | Task Time Deviations (Observed/Optimal) | Errors Mean (SD) | Task Rating Mean (SD), 5=Easy
Drug-Drug Interaction | 6 | 100.00 | 1.1 (14.7/13) | 72 (3.5) | 1.1 (72/65) | 1.86 (1.1) | 3.7 (0.5)
Drug-Allergy Interaction | 3 | 100.00 | 1.1 (12.1/11) | 68.4 (3.3) | 1.1 (68.4/60) | 0.57 (0.7) | 4.1 (0.6)
Mean across tasks | – | 100.00 | 1.1 (13.4/12) | 70 (3.4) | 1.1 (70/62.5) | 1.2 (0.9) | 3.9 (0.6)
The SUS (System Usability Scale) score for subjective satisfaction with the system, based on performance with these tasks, was 81. Broadly interpreted, this indicates above-average usability, but with further room for improvement.
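The SUS score is derived from the standard SUS calculation: each of the ten questionnaire items is answered on a 1-5 scale; odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the raw sum (0-40) is multiplied by 2.5 to yield a 0-100 score. A minimal sketch, using a hypothetical participant's responses rather than the study's data:

```python
def sus_score(responses):
    """Standard System Usability Scale score for one participant.

    responses: ten answers, 1 (strongly disagree) .. 5 (strongly agree),
    in questionnaire order (item 1 first).
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded (r - 1); even items are
        # negatively worded (5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Hypothetical participant: mostly agrees with the positive items and
# disagrees with the negative ones.
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))
```

The study's reported SUS of 81 would be the mean of such per-participant scores across all seven participants.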
4.2 DISCUSSION OF THE FINDINGS
Overall, the CPOE feature of the Enablemyhealth user experience exhibits good effectiveness, with no critical errors out of 14 attempted tasks, though there was a mean of 1.2 deviation errors across all tasks. This is within internal usability goal guidelines. Of the total 17 deviation errors, 13 were attributable to electronic prescribing, primarily finding the correct prescription, dosage/form, and measurement unit, which was consistent with the CPOE and eRx test results.
Efficiency measures indicate that while acceptable, there is room for improvement, specifically in the
electronic prescription interface, although searching for labs and radiology orders can also be difficult if
the names of the tests do not match the terminology used by the participant. One older user took significantly longer to perform every task, but this may also be attributable to nervousness and second-guessing.
Satisfaction, measured by an SUS score of 81, is very good but can be improved. Making the user experience more natural and intelligent will improve the overall experience and help guide both novice and advanced users.
4.3 EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the CPOE feature has an
acceptable level of effectiveness. The usability test consisted of 14 tasks of which 14 were attempted.
Out of the 14 attempted tasks, there were no critical errors, giving a successful completion rate of 100%.
The 13 deviation (minor) errors in the electronic prescriptions were difficulties associated with non-defaulting scripts that required making changes in each field to build the script. Order sets were not used in this testing and could eliminate the problem of complex prescription and other types of orders. The deviation error mean of 1.2 is within the acceptable range of effectiveness, with the errors most often being finding the order by spelling it.
4.4 EFFICIENCY
Observed task time is the primary area for improvement. The mean task time across all tasks of 70.2 seconds, against an optimal 62.5 seconds, was primarily caused by searching for medications and reading the interaction guidance. Drug-drug interaction testing took 4 seconds longer for entering the medication than drug-allergy testing, but the results were just about 1 standard deviation apart. Overall, improving efficiency is largely a matter of medication CPOE improvements.
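The "about 1 standard deviation apart" observation can be checked directly against the figures in Table 1 (means 72 and 68.4 seconds; SDs 3.5 and 3.3). A quick check, using a simple average of the two SDs as the pooling assumption:

```python
# Mean task times and SDs taken from Table 1 of this report.
dd_mean, dd_sd = 72.0, 3.5   # Drug-Drug Interaction
da_mean, da_sd = 68.4, 3.3   # Drug-Allergy Interaction

diff = dd_mean - da_mean           # difference in mean task time (sec)
pooled_sd = (dd_sd + da_sd) / 2    # simple average of the two SDs

print(f"difference = {diff:.1f} s, i.e. {diff / pooled_sd:.2f} SDs")
```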
4.5 SATISFACTION
Based on task difficulty ratings and SUS results data, user satisfaction with the feature is considered favorable. Users' perception of ease of use is favorable, with a mean rating of 3.9 across all tasks (1=very difficult, 5=very easy). SUS scoring is very good, with a score of 81. These scores describe the average participant satisfaction with the Enablemyhealth feature set.
SUS Scoring: Mean 81.07, StDev 3.98

4.6 MAJOR FINDINGS
The major usability problems encountered in this study were:
· Same issues with adding CPOE prescriptions and medications:
  1. Providers tended to want to write or dictate the script rather than select a frequency of use.
  2. Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges.
  3. The most frequent challenge was determining units of medication when none is defaulted.
  4. Changing a prescription's frequency of use was not easily understood; the preference is to modify the script directions directly.
· Participants asked why they had to override the notifications by typing a reason.

4.7 AREAS FOR IMPROVEMENT
Electronic prescribing with alerts:
· Add default units for all prescriptions.
· Provide aids for recommended dosage based on patient BSA, age, gender, and diagnosis.
· Provide the ability to type or speak the script and parse the data to build the prescription.
· Add a wizard for first-time or novice users adding prescriptions.
· Add a drop-down of common override reasons, and do not require the box to also be checked.

5 APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 45 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer for medical-practice-related work?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of ADP Advancedmd Practice Manager & EHR, what percentage of time is spent
in each product? (per day)
Domain Knowledge
· Rate your expertise or comfort in using ADP Advancedmd software on a scale of 1 to 5 (1 = just starting, 5 = expert) for:
  § PM
  § EHR (overall)
  § EHR e-Prescribing
  [Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
  Men: 2
  Women: 5
  Total (participants): 7

Occupation/Role
  RN/NP/MA: 4
  Physician: 3
  Total (participants): 7

Years of Experience (average)
  Professional: 23.71
  EHR Product Use: 2.86
Demographic detail

Participant ID | Gender | Age | Occupation/role | Professional Experience (yrs) | Computer Experience (intermediate/advanced/expert) | Product Experience (yrs)
C1 | M | 48 | MD | 27 | intermediate | 3
C2 | F | 58 | Medical Assistant | 35 | intermediate | 3
C3 | F | 38 | Medical Assistant | 17 | intermediate | 4
C4 | F | 37 | MD | 16 | intermediate | 3
C5 | F | 74 | MD | 53 | intermediate | 3
C6 | M | 35 | Medical Assistant | 14 | intermediate | 3
C7 | F | 25 | Medical Assistant | 4 | intermediate | 1
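The averaged experience figures reported above (23.71 years professional, 2.86 years product) follow directly from the per-participant detail. A minimal sketch of the computation, using values transcribed from this appendix:

```python
# Illustrative recomputation of the Appendix 2 averages from the
# per-participant detail table (values transcribed from this report).
participants = {
    "C1": {"professional_yrs": 27, "product_yrs": 3},
    "C2": {"professional_yrs": 35, "product_yrs": 3},
    "C3": {"professional_yrs": 17, "product_yrs": 4},
    "C4": {"professional_yrs": 16, "product_yrs": 3},
    "C5": {"professional_yrs": 53, "product_yrs": 3},
    "C6": {"professional_yrs": 14, "product_yrs": 3},
    "C7": {"professional_yrs": 4,  "product_yrs": 1},
}

def mean(values):
    return sum(values) / len(values)

avg_professional = round(mean([p["professional_yrs"] for p in participants.values()]), 2)
avg_product = round(mean([p["product_yrs"] for p in participants.values()]), 2)

print(avg_professional)  # 23.71
print(avg_product)       # 2.86
```

Both values match the "Years of Experience (average)" table above.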
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to
evaluate an electronic health records system. If you decide to participate, you will be asked to perform
several tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study will not be shared outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data (i.e., identification numbers, not names) will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Usability Task Scenarios: Drug-Drug and Drug-Allergy Interaction
The participant account is set for interactions of High, Medium, and Low severity. Access the clinical information for Garth Brooks by clicking Patient Center, then click Open Note and perform these steps to trigger a Drug-Drug Notification:
1. Click Current Medications, select Warfarin 10 mg tablet, and click Save.
2. Click the Prescription tab, select aspirin 325 mg oral tablet, and click Save.
3. Review the interaction guidance and click Close.
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
Next, perform these steps to trigger a Drug-Allergy Notification:
1. Click Allergies, select amoxicillin, select fever, select severe, and click Save.
2. Click the Prescription tab, select azithromycin 250 mg oral tablet, enter 1 for dosage and 10 days, and click Save.
3. Review the interaction guidance and click Close.
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n. After each task or task set (see specific scenarios), ask for a response to the single ease-of-use question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct a debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
45 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. There are no wrong answers! We will be here in case you need specific help, but we will not be able to instruct you or provide help in how to use the application; however, we may provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
Critical Error count:
Minor error count:
Optimal path deviations:
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
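The data log above maps naturally onto a simple per-session record structure. An illustrative sketch only; the field names are assumptions mirroring the form, not part of the study tooling:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskLog:
    """One repeated block of the data log: measures recorded per task."""
    task_name: str
    critical_errors: int = 0
    minor_errors: int = 0
    path_deviations: int = 0
    time_on_task_sec: float = 0.0
    seouq_response: int = 0          # single ease-of-use question, rated 1-5
    comments: List[str] = field(default_factory=list)

@dataclass
class SessionLog:
    """Header fields of the form, plus one TaskLog entry per task."""
    test_type: str
    participant_code: str
    session_start: str
    file_name: str
    tasks: List[TaskLog] = field(default_factory=list)

# Hypothetical example entry (not actual study data):
session = SessionLog("summative", "C1", "2015-08-15 09:00", "C1_session.mp4")
session.tasks.append(
    TaskLog("Send prescription 1", minor_errors=1,
            time_on_task_sec=96, seouq_response=4)
)
print(len(session.tasks))  # 1
```

One `TaskLog` is appended per task, matching the form's "repeat for each task" note.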
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
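Brooke's computation can be sketched as follows: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. The response set in the example is hypothetical, not data from this study:

```python
# SUS scoring as described by Brooke (1996).
def sus_score(responses):
    """responses: list of ten ratings, each 1-5, in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical response set (not from this study):
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

A uniformly "best" response set (5 on odd items, 1 on even items) yields the maximum score of 100.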
Each statement is rated on a five-point scale from Strongly Disagree (1) to Strongly Agree (5):
1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
EHR Usability Test Report of EnableMyHealth EHR Electronic Prescribing
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1 Executive Summary ..................................... 2
2 INTRODUCTION .......................................... 3
3 METHOD ................................................ 4
  3.1 PARTICIPANTS ...................................... 4
  3.2 STUDY DESIGN ...................................... 5
  3.3 TASKS ............................................. 6
  3.4 PROCEDURES ........................................ 6
  3.5 TEST LOCATION ..................................... 7
  3.6 TEST ENVIRONMENT .................................. 7
  3.7 TEST FORMS AND TOOLS .............................. 8
  3.8 PARTICIPANT INSTRUCTIONS .......................... 8
  3.9 USABILITY METRICS ................................. 9
  3.10 DATA SCORING ..................................... 9
4 RESULTS ............................................... 10
  4.1 DATA ANALYSIS AND REPORTING ...................... 10
  4.2 DISCUSSION OF THE FINDINGS ....................... 11
  4.3 EFFECTIVENESS ..................................... 11
  4.4 EFFICIENCY ........................................ 12
  4.5 SATISFACTION ...................................... 12
  4.6 MAJOR FINDINGS .................................... 12
  4.7 AREAS FOR IMPROVEMENT ............................. 13
5 APPENDICES ............................................ 13
Appendix 1: PARTICIPANT RECRUITING SCREENER ............. 13
Appendix 2: PARTICIPANT DEMOGRAPHICS .................... 15
Appendix 3: Informed Consent Form ....................... 17
Appendix 4: Task Scenarios .............................. 18
Appendix 5: Usability Test Administration Moderator's Guide ... 18
Appendix 6: Sample data collection form ................. 21
Appendix 7: System Usability Scale Questionnaire ........ 22
1 Executive Summary
A usability test of EnableMyHealth version EMH4, an ambulatory EHR, was conducted between August 15, 2015 and September 14, 2015 in the Rochester, MN office of EnableDoc LLC and in each provider's office. The purpose of this test was to test and validate the usability of the current user interface for Computerized Provider Order Entry, and to provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, seven (7) healthcare providers and clinical staff who met the target demographic and professional profile served as participants and used the EnableMyHealth EHR in simulated but representative tasks. This study collected performance data on the following tasks typically conducted in an EHR:
· Enter and send one prescription
· Enter and send a second prescription
· Enter and send a third prescription
All test participants conducted the test sessions remotely via on-line conferencing software. During the
15 minute one-on-one usability test, each participant was greeted by the administrator and asked to
review and verbally acknowledge an informed consent/release form (included in Appendix 3); they were
instructed that they could withdraw at any time. Participants had prior experience with the EHR as they
are current users/customers. No additional training materials were provided other than that usually
given to customers. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using EnableMyHealth. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically.
The administrator did not give the participant assistance in how to complete the task. Participant
screens and audio were recorded for subsequent analysis. The following types of data were collected for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Participant’s subjective assessment of the ease of each task
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified: no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test SUS questionnaire and were compensated with a $50 gift card for their time.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions
Task | Number | Task Success (%) | Path Deviations (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviations (Observed/Optimal) | Errors Mean (SD) | Task Ratings (5=Easy) Mean (SD)
Send prescription 1 | 1 | 100.00 | 1.9 (12.9/11) | 96 (8.8) | 1.2 (96/80) | 1.9 (1.1) | 3.9 (0.6)
Send prescription 2 | 1 | 100.00 | 1.3 (13.1/12) | 92 (3.1) | 1.1 (92/81) | 0.6 (0.7) | 3.9 (0.6)
Send prescription 3 | 1 | 100.00 | 1.5 (18/17) | 90.1 (2.6) | 1.0 (90.1/86) | 0.4 (0.7) | 3.7 (0.7)
Mean across tasks | 1 | 100.00 | 1.6 (14.7/13.3) | 92.7 (4.8) | 1.1 (92.7/82.3) | 1 (0.9) | 3.8 (0.7)
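For reference, the task-time figures in the matrix can be recomputed as follows; each deviation ratio is the observed value divided by the optimal value, and the times are transcribed from the matrix above:

```python
# Illustrative recomputation of the task-time columns in the results matrix.
observed_times = [96.0, 92.0, 90.1]  # mean observed time per task, from the matrix
optimal_times = [80.0, 81.0, 86.0]   # optimal time per task, from the matrix

mean_observed = round(sum(observed_times) / len(observed_times), 1)
mean_optimal = round(sum(optimal_times) / len(optimal_times), 1)

# Per-task deviation ratio: observed time / optimal time.
deviation_ratios = [round(o / p, 1) for o, p in zip(observed_times, optimal_times)]
mean_deviation = round(mean_observed / mean_optimal, 1)

print(deviation_ratios)             # [1.2, 1.1, 1.0]
print(mean_observed, mean_optimal)  # 92.7 82.3
print(mean_deviation)               # 1.1
```

These reproduce the "Task Time" and "Task Time Deviations" entries, including the 1.1 (92.7/82.3) mean across tasks.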
The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, at 79.6. In addition to the performance data, the following qualitative observations were made:
Major findings:
· Providers tend to just want to write or dictate the script rather than select a frequency of use
· Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges
· The most frequent challenge was determining units of medication when none is defaulted
· Changing a prescription's frequency of use was not easily understood; the preference is just to modify the script directions
· Finding a pharmacy took some time to search

Areas for improvement:
· Add default units for all prescriptions
· Provide aids in recommended dosage based on patient BSA, age, gender, and diagnosis
· Provide the ability to type or speak the script and parse the data to build the prescription script
· Add a wizard for first-time or novice users
· Add a short list of common pharmacies they order from

2 INTRODUCTION
The EHRUT tested for this study was the EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to present medical information to healthcare providers in primarily group practices, with a focus on family practice, surgery, eye, and PT/OT/chiropractic medicine, the EHRUT consists of modules and features that include, but are not limited to, support for the tested functionality:
· Patient demographics
· Patient problem and allergy lists
· Creating lab orders and receiving lab results
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, and LPNs.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and to provide evidence of usability of the entry and modification of electronic prescriptions, laboratory orders, and imaging orders in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as successful task completion rate, time on task, number and types of errors, and participant satisfaction, were captured during the usability testing.
3 METHOD

3.1 PARTICIPANTS
A total of 7 participants were tested on the CPOE feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus have the same orientation and level of training as other
non-participant customers. For the test purposes, end-user characteristics were identified and translated into an internal recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1.
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Participant ID | Gender | Age | Occupation/role | Professional Experience (yrs) | Computer Experience (intermediate/advanced/expert) | Product Experience (yrs)
C1 | M | 48 | MD | 27 | intermediate | 3
C2 | F | 58 | Medical Assistant | 35 | intermediate | 3
C3 | F | 38 | Medical Assistant | 17 | intermediate | 4
C4 | F | 37 | MD | 16 | intermediate | 3
C5 | F | 74 | MD | 53 | intermediate | 3
C6 | M | 35 | Medical Assistant | 14 | intermediate | 3
C7 | F | 25 | Medical Assistant | 4 | intermediate | 1
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all participated in the usability test; they were instructed that they could withdraw at any time. Participants all had prior experience with the EHR. No participants failed to show for the study.
Participants were scheduled for 30 minute sessions with 30 minutes in between each session for debrief
by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
3.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves as both a means to record or benchmark current usability, but also to
identify areas where improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
Tasks were constructed that would be realistic and representative of the kinds of activities a user might
do with this EHR in the area of Electronic Prescribing and based on the CMS test scripts for Meaningful
Use Phase 2. The tasks for this study broadly included creating the following prescriptions:
· Enter one type of prescription using different script fields and send it electronically
· Enter a second type of prescription using different script fields and send it electronically
· Enter a third type of prescription using different script fields and send it electronically
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those that may be most
troublesome for users. The tasks were directly modeled on the CMS Electronic Prescribing test
procedure [170.314(b)(3)].
Tasks were ordered and prioritized based on their impact on patient safety, in which the tasks that had
the greatest potential for patient harm due to critical errors were performed first. Thus the prescription creation tasks preceded the prescription modification tasks. Within the prescription creation
tasks, the first task was intentionally fairly simple as this was the first time users have seen the
redesigned prescription user interface. Following the first prescription task were the more complex tasks
where user errors could have a large potential impact on patient safety. Specifically, these were
prescriptions involving pharmacy instructions, dosage calculations, patient instructions, and drug
interaction warnings.
The task scenario document is contained in appendix 4.
3.4 PROCEDURES
During the time a participant was scheduled, an email was sent to the participant that:
• Confirmed the time of the session
• Provided Join.Me access codes
• Included a copy of the informed consent form
• Included a document containing the tasks for the test session
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments, path deviations, and the number and type of errors.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible
· Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use
· Without using a think-aloud technique
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
3.5 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing software. Thus the actual test location was at the discretion of the test participants. The test administrator conducted the test from the Enabledoc LLC offices in Rochester, MN.

3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a healthcare office. For testing, a Windows laptop running the Windows OS and the Chrome browser was used. The participants used a mouse and keyboard when interacting with the EHRUT. The EnableMyHealth application ran on a 13 inch laptop with 1366 by 768 resolution and 32 bit color. The application was set up by the vendor according to the vendor's documentation describing the system set-up and preparation. The application itself was running on a Windows 2012 Server using a test database on a WAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation, with a minor lag caused by video screen sharing and recording. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
• Informed Consent
• Test task scenarios
• Moderator’s Guide
• Observer’s data collection template
• Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of Join.Me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the Join.Me session.
3.8
PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (see also the full
moderator’s guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last about
15 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we
are testing the system, so if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help,
but we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that you
provide will be kept confidential and your name will not be associated with your comments at any time.
Should you feel it necessary you are able to withdraw at any time during the testing for any reason.
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior to
giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly ask
questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
3.9
USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is
for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing.
The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
3.10 DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed:

Measure: Effectiveness: Task Success
Rationale and Scoring: A task was counted as a “Success” if the participant was able to achieve the
correct outcome, without assistance, within the overall time allotted for the entire set of tasks.
The total number of successes was calculated for each task and then divided by the total number of
times that task was attempted. The results are provided as a percentage.
Task times were recorded for successes. Observed task time divided by the optimal time for each task
is a measure of optimal efficiency. Due to the variability of multiple correct paths and a large
variety of user-settable preferences, all of which can affect time on task, optimal task times and
deviations from these times were unrealistic to assess. All participants were trained on
Enablemyhealth EHR, so a performance factor of 1.0 was used. The overall usability goal for any
feature of the EHR is a task time that is deemed acceptable by the end user with no critical errors.
Thus, if expert, optimal performance on a task was [60] seconds, then allotted task time performance
was [60 * 1.0] seconds. This ratio is aggregated across tasks and reported with mean and variance
scores.

Measure: Effectiveness: Task Failures
Rationale and Scoring: If the participant abandoned the task, did not reach the correct answer,
performed it incorrectly, or was unsure whether they had completed the task, the task was counted as
a “Critical Failure.” No task times were taken for failed tasks.

Measure: Efficiency: Task Deviations
Rationale and Scoring: Minor errors were defined as an errant click, initial selection of an
incorrect menu option, or incorrect entries that the participant noticed and corrected. The total
number of errors was calculated for each task and then divided by the total number of times that
task was attempted. Minor errors and deviations were noted but not counted as significant errors.
This is expressed as the mean number of failed tasks per participant.
The participant’s path (i.e., steps) through the application was recorded. Deviations occurred if
the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an
incorrect link, or interacted incorrectly with an on-screen control. These path deviations were
included in the minor error count. The minor error count is expressed as an average across all
participants.

Measure: Efficiency: Task Time
Rationale and Scoring: Each task was timed from when the participant indicated they were beginning
the task until the participant said “Done.” If he or she failed to say “Done,” the time was stopped
when the participant stopped performing the task. Only task times for tasks that were successfully
completed were included in the average task time analysis. Average time per task was calculated for
each task, as was variance (standard deviation).

Measure: Satisfaction: Task Rating
Rationale and Scoring: Participants’ subjective impression of the ease of use of the application was
measured by administering both a simple post-task question and a post-session questionnaire. After
each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very
Difficult) to 5 (Very Easy). Average difficulty ratings per task were calculated, as was variance.
Common convention is that average ratings for systems judged easy to use should be 3.3 or above.
To measure participants’ confidence in and likeability of the ENABLEMYHEALTH EHR feature overall,
the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions
included, “I think I would like to use this system frequently,” “I thought the system was easy to
use,” and “I would imagine that most people would learn to use this system very quickly.” See the
full System Usability Scale questionnaire in Appendix 7.
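The scoring rules above can be sketched in a few lines of Python. This is a minimal illustration with invented data: the success flags, task times, and optimal time below are hypothetical values chosen for the example, not the study's raw observations.

```python
from statistics import mean, stdev

# Hypothetical per-participant observations for ONE task (illustrative values
# only, not the study's raw data): success flags and task times in seconds.
successes = [True, True, True, True, True, True, True]   # 7 attempts
times = [85.0, 90.0, 105.0, 92.0, 98.0, 88.0, 114.0]     # successful tasks only
optimal_time = 80.0       # assumed expert (optimal) task time
performance_factor = 1.0  # all participants were trained, per the table

# Effectiveness: successes divided by attempts, reported as a percentage.
success_rate = 100.0 * sum(successes) / len(successes)

# Efficiency: mean and standard deviation of observed task times, plus the
# observed/optimal ratio described under "Task Success" above.
mean_time = mean(times)
sd_time = stdev(times)
time_ratio = mean_time / (optimal_time * performance_factor)

print(f"Success rate: {success_rate:.2f}%")
print(f"Task time: {mean_time:.1f} s (SD {sd_time:.1f})")
print(f"Observed/optimal ratio: {time_ratio:.2f}")
```

Aggregating these per-task figures across tasks yields the kind of summary rows reported in Table 1.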
4
RESULTS
4.1
DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions had their data
excluded from the analyses; however, there were no instances of this occurring.
The usability testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results
should be seen in light of the objectives and goals outlined in Section 3.2 Study Design. The data should
yield actionable findings that, when addressed, will have a material, positive impact on user performance.
Table 1

Task                 Number  Task Success (%)  Path Deviations (Observed/Optimal)  Task Time Mean (SD)  Task Time Deviations (Observed/Optimal)  Errors Mean (SD)  Task Ratings (5=Easy) Mean (SD)
Send prescription 1  1       100.00            1.9 (12.9/11)                       96 (8.8)             1.2 (96/80)                               1.9 (1.1)         3.9 (0.6)
Send prescription 2  1       100.00            1.3 (13.1/12)                       92 (3.1)             1.1 (92/81)                               0.6 (0.7)         3.9 (0.6)
Send prescription 3  1       100.00            1.5 (18/17)                         90.1 (2.6)           1.0 (90.1/86)                             0.4 (0.7)         3.7 (0.7)
Mean across tasks    1       100.00            1.6 (14.7/13.3)                     92.7 (4.8)           1.1 (92.7/82.3)                           1 (0.9)           3.8 (0.7)
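As a sanity check, the "Mean across tasks" row of Table 1 can be reproduced from the per-task values it reports; the following Python sketch uses only the figures already given in the table:

```python
from statistics import mean

# Per-task values reported in Table 1 (Send prescription 1-3).
path_deviations = [1.9, 1.3, 1.5]   # mean path deviations per task
task_times = [96.0, 92.0, 90.1]     # mean task time in seconds
time_ratios = [1.2, 1.1, 1.0]       # observed/optimal task time ratios
errors = [1.9, 0.6, 0.4]            # mean errors per task
ratings = [3.9, 3.9, 3.7]           # mean ease-of-use ratings (5 = Easy)

# The "Mean across tasks" row is the simple average of the per-task values,
# rounded to the precision used in the table.
print(round(mean(path_deviations), 1))  # 1.6
print(round(mean(task_times), 1))       # 92.7
print(round(mean(time_ratios), 1))      # 1.1
print(round(mean(errors), 1))           # 1.0
print(round(mean(ratings), 1))          # 3.8
```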
The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system
based on performance with these tasks to be: 79.6 with a standard deviation of 5.08. Broadly
interpreted, this score is 0.04 from being considered above average.
4.2
DISCUSSION OF THE FINDINGS
The eprescribe feature is a combination of electronic prescribing and Send screens. Particularly
noteworthy is that the participants were familiar with the feature and had been using it for up to
three years prior to this study.
Overall, the user experience of the Enablemyhealth CPOE feature exhibits acceptable effectiveness, with an
acceptably low critical error rate and minimal variance among users. The critical error rate is of greatest
concern for an EHR, and in the case of the EHRUT there were no task failures due to critical errors out of
21 attempted tasks. Of the deviation errors, each prescription averaged 1.6 deviations per task; these
primarily involved finding the correct prescription, dosage, and form, and a non-defaulting measurement unit.
Efficiency measures indicate that, while acceptable, there is room for improvement. One older user took
significantly longer to perform every task, but this can also be attributed to nervousness and
second-guessing. Finding the pharmacy also increased the time.
Satisfaction, measured by an SUS score of 79.6, is very good but can be improved. Working to make the
user experience more natural and intelligent will improve the overall user experience and help guide
both novice and advanced users.
4.3
EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the CPOE feature has an
acceptable level of effectiveness. The usability test consisted of 21 tasks of which 21 were attempted.
Out of the 21 attempted tasks, there were no critical errors, giving a successful completion rate of 100%.
The deviation (minor) errors in the electronic prescriptions were difficulties associated with
non-defaulting scripts, which required making changes in each field to build the script, and with
finding the pharmacy. 4 out of the 7 users do not prescribe regularly, so we felt these results were
very positive, and users improved the more they performed the tasks. Order sets were not used in this
testing; they can eliminate the problem of complex prescriptions and other types of orders. The
deviation error mean of 1.0 is within the acceptable range of effectiveness, with the most common
error being finding the order by spelling it.
4.4
EFFICIENCY
Observation of task time is the primary area for improvement. The mean task time for three prescription
orders of 92.7 seconds is largely due to
• time to find the correct prescription, dose and form
• one complex prescription script
• number of fields on the prescription screen
• time to find pharmacy
Simple prescription tasks, such as 1 tablet of a drug once per day by mouth for 30 days, posed little
difficulty and did not greatly affect task time. More complex prescriptions, such as those that required
complex scripts, selection of measurement units, and detailed directions, increased the task time by
30%. We know that prescription order sets address this issue, but we also know that there are other
solutions for improving usability.
4.5
SATISFACTION
Based on task difficulty ratings and SUS results, user satisfaction with the feature is considered
favorable. Users’ perception of ease of use is considered easy, with a mean rating of 3.8 across all tasks
(1=very difficult, 5=very easy). SUS scoring is very good, with a score of 79.6. These scores describe
the average user satisfaction with the Enablemyhealth feature set.
SUS Scoring
Mean: 79.64
StDev: 5.08
4.6
MAJOR FINDINGS
The major usability problems encountered in this study were:
· Providers tend to just want to write or dictate the script rather than select a frequency of use
· Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges
· The most frequent challenge was determining units of medication when none is defaulted
· Changing a prescription frequency of use was not easily understood, and the preference is just to modify the script directions
· Finding a pharmacy took some time to search
4.7
AREAS FOR IMPROVEMENT
· Add default units for all prescriptions
· Provide aids in recommended dosage based on patient BSA, age, gender, and diagnosis
· Provide the ability to type or speak a script and parse the data to build the prescription script
· Add a wizard for first-time or novice users
· Add a short list of common pharmacies they order from
5
APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 45 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of EnableMyHealth Practice Manager & EHR, what percentage of time is spent in
each product? (per day)
Domain Knowledge
·
Rate your expertise or comfort in using EnableMyHealth software on a scale of 1 to 5 (1=just
starting, 5=expert) for
§ PM
§ EHR (overall)
§ EHR e-Prescribing
[Terminate if 1 or 2]
·
If you have used similar or competing products, describe your level of expertise in those
products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
  Men: 2
  Women: 5
  Total (participants): 7
Occupation/Role
  RN/NP/MA: 4
  Physician: 3
  Total (participants): 7
Years of Experience (average)
  Professional: 23.71
  EHR Product Use: 2.86
Demographic detail
Participant ID  Gender  Age  Occupation/Role    Professional Experience (yrs)  Computer Experience  Product Experience (yrs)
C1              M       48   MD                 27                             intermediate         3
C2              F       58   Medical Assistant  35                             intermediate         3
C3              F       38   Medical Assistant  17                             intermediate         4
C4              F       37   MD                 16                             intermediate         3
C5              F       74   MD                 53                             intermediate         3
C6              M       35   Medical Assistant  14                             intermediate         3
C7              F       25   Medical Assistant  4                              intermediate         1
(Computer experience was recorded as intermediate, advanced, or expert.)
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to evaluate
an electronic health records system. If you decide to participate, you will be asked to perform several
tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I
am free to withdraw consent or discontinue participation at any time. I understand and
agree to participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Usability Task Scenarios – eRx
Click Patient Center and select Garth Brooks, click the first encounter and Click Open Note button. Click
New Prescriptions and enter these prescriptions:
1. Simvastatin, 20 mg tablet by mouth once daily; dispense 30, 1 refill, allow generic medications.
Click Save, select a pharmacy Brian Testing, and then click Send.
Overall, how difficult or easy did you find this task?
Very Difficult  1    2    3    4    5  Very Easy
2. Azithromycin 500 mg tablet, 1 tablet by mouth once a day for 10 days, dispense 10, 1 refill,
allow generic medications. Click Save, select a pharmacy Brian Testing, and then click Send.
Overall, how difficult or easy did you find this task?
Very Difficult  1    2    3    4    5  Very Easy
3. Insulin Glargine, 10 units once daily; package of 5, 2 refills, allow generic medications. Click
Save, select a pharmacy Brian Testing, and then click Send.
Overall, how difficult or easy did you find this task?
Very Difficult  1    2    3    4    5  Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n.
After each task or task set (see specific scenarios) ask for response to single ease of use
question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
15 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on
your own, following the instructions very closely. Please note that we are not testing you; we are
testing the system, so if you have difficulty, all this means is that something needs to be improved in
the system. There are no wrong answers! We will be here in case you need specific help, but we will not
be able to instruct you or provide help in how to use the application; however, we may provide specific
hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
· Critical Error count:
· Minor error count:
· Optimal path deviations:
· Time on task:
· SEoUQ response:
· Comments/observations:
<repeat for each task>
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
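For reference, the standard Brooke scoring can be sketched in a few lines of Python. This is the generic published formula, not Enabledoc-specific code: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is scaled by 2.5 to give a 0-100 score.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 responses.

    Standard Brooke (1996) scoring: odd-numbered items contribute
    (response - 1), even-numbered items contribute (5 - response),
    and the sum is multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = sum(
        r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return 2.5 * total

# Example: an all-favorable response sheet scores the maximum.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral sheet (all 3s) scores 50.0, which is why raw item averages are not comparable to the 0-100 SUS scale.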
Each statement is rated on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
EHR Usability Test Report of EnableMyHealth EHR Medication List Management
Safety-Enhanced Design 170.314(g)(3)
Report based on ISO/IEC 25062:2006 Common Industry Format for
Usability Test Reports
Date of Usability Test: 15 AUG 2015 – 14 SEP 2015
Date of Report: 14 SEP 2015
Report Prepared By: Stephen Rothschild
877.540.0933
[email protected]
EnableDoc LLC
7700 Falstaff Road
McLean VA 22104
Table of Contents
1 Executive Summary .......... 2
2 METHOD .......... 4
  2.1 PARTICIPANTS .......... 4
  2.2 STUDY DESIGN .......... 5
  2.3 TASKS .......... 5
  2.4 PROCEDURES .......... 6
  2.5 TEST LOCATION .......... 7
  2.6 TEST ENVIRONMENT .......... 7
  2.7 TEST FORMS AND TOOLS .......... 7
  2.8 PARTICIPANT INSTRUCTIONS .......... 7
  2.9 USABILITY METRICS .......... 8
  2.10 DATA SCORING .......... 8
3 RESULTS .......... 10
  3.1 DATA ANALYSIS AND REPORTING .......... 10
  3.2 DISCUSSION OF THE FINDINGS .......... 10
  3.3 EFFECTIVENESS .......... 11
  3.4 EFFICIENCY .......... 11
  3.5 SATISFACTION .......... 11
  3.6 MAJOR FINDINGS .......... 12
  3.7 AREAS FOR IMPROVEMENT .......... 12
4 APPENDICES .......... 12
Appendix 2: PARTICIPANT DEMOGRAPHICS .......... 15
Appendix 3: Informed Consent Form .......... 16
Appendix 4: Task Scenarios .......... 17
Appendix 5: Usability Test Administration Moderator’s Guide .......... 19
Appendix 6: Sample data collection form .......... 21
Appendix 7: System Usability Scale Questionnaire .......... 22
1
Executive Summary
A usability test of EnableMyHealth version EMH4, an ambulatory EHR, was conducted between August
15, 2015 and September 14, 2015 in the Rochester, MN office of EnableDoc LLC and in each provider’s
office. The purpose of this test was to test and validate the usability of the current user interface for
Computerized Provider Order Entry, and to provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, seven (7) healthcare providers and clinical staff who met the target
demographic and professional profile served as participants and used Enablemyhealth EHR in simulated
but representative tasks. This study collected performance data on the following tasks
typically conducted in an EHR:
· Enter current medications
· Modify current medications or mark them as inactive
· View current or historical medications
All test participants conducted the test sessions remotely via online conferencing software. During the
one-on-one usability tests, each participant was greeted by the administrator and asked to review and
verbally acknowledge an informed consent/release form (included in Appendix 3); they were instructed
that they could withdraw at any time. Participants had prior experience with the EHR as they are current
users/customers. No additional training materials were provided other than that usually given to
customers. The administrator introduced the test and instructed participants to complete a series of
tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and,
along with the data logger(s) recorded user performance data on paper and electronically. The
administrator did not give the participant assistance in how to complete the task. Participant screens
and audio were recorded for subsequent analysis.
The following types of data were collected for each participant:
· Number of tasks successfully completed within the allotted time without assistance
· Time to complete the tasks
· Participant’s subjective assessment of the ease of each task
· Number and types of errors
· Path deviations
· Participant’s verbalizations
· Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test SUS questionnaire.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the
usability of the EHRUT.
Result data matrix - refer to Appendix 4 for complete task descriptions
Task | Number | Success % | Path Deviations (Observed/Optimal) Mean | Task Time Mean (SD), s | Task Time (Observed/Optimal) Mean | Task Errors Mean (SD) | Task Ratings (5=Easy) Mean (SD)
Create 6 medications | 6 | 100.00 | 1.1 (29.1/26) | 289 (8) | 1.1 (289/275) | 1.86 (1.1) | 3.7 (0.5)
Modify 3 medications | 3 | 100.00 | 1.3 (11.6/9) | 75 (19) | 1.3 (75/60) | 0.57 (0.7) | 4.1 (0.6)
Review Current Meds and History | 1 | 100.00 | 1.4 (2.9/2) | 18 (6) | 1.2 (18/15) | 0.43 (0.7) | 4.7 (0)
Mean across tasks | | 100.00 | 1.3 (14.5/12) | 128 (11) | 1.2 (128/116) | 1 (0.9) | 4.2 (0.4)
The System Usability Scale scored the subjective satisfaction with the system, based on
performance with these tasks, at 81.4.
In addition to the performance data, the following qualitative observations were made:
Major findings
· Providers tend to just want to write or dictate the script rather than select a frequency of use
· Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges
· The most frequent challenge was determining units of medication when none is defaulted
· Changing a prescription's frequency of use was not easily understood; the preference is to simply modify the script directions
Areas for improvement
· Add default units for all prescriptions
· Provide aids in recommended dosage based on patient BSA, age, gender, and diagnosis.
· Provide ability to type or speak script and parse data to build the prescription script.
· Add a wizard for first time or novice users.
The EHRUT tested for this study was EnableMyHealth EHR4 Release, an ambulatory EHR. Designed to
present medical information to healthcare providers, primarily in group practices with a focus on family
practice, surgery, eye, and PT/OT/chiropractic medicine, the EHRUT consists of modules and features
that include, but are not limited to, support for the tested functionality:
· Patient demographics
· Patient problem and allergy lists
· Creating lab orders and receiving lab results
· Immunization recording
· Electronic prescribing
· Drug-drug, drug-allergy, and drug-problem interaction checking
· Clinical decision support
The EHRUT is used predominantly by physicians and clinical staff such as MAs, RNs, and LPNs.
The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and provide
evidence of usability of the entry and modification of electronic prescriptions, laboratory orders, and
imaging orders in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and
user satisfaction, such as successful task completion rate, time on task, number and types of errors, and
participant satisfaction were captured during the usability testing.
2 METHOD
2.1 PARTICIPANTS
A total of 7 participants were tested on the CPOE feature of the EHRUT. Participants in the test were
physicians and clinical staff such as RNs and MAs. Participants were recruited by Enabledoc staff from
existing customers and paid a $50 gift card for their help in testing.
Participants had no direct connection to the development of or organization producing the EHRUT other
than being current customers. Participants were not from the testing or supplier organization.
Participants were actual end users and thus had the same orientation and level of training as other
non-participant customers. For test purposes, end-user characteristics were identified and
translated into an internal recruitment screener used to solicit potential participants; an example of a
screener is provided in Appendix 1.
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including demographics,
professional experience, computing experience and user needs for assistive technology. Participant
names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Participant ID | Gender | Age | Occupation/Role | Professional Experience (yrs) | Computer Experience (intermediate, advanced, expert) | Product Experience (yrs)
C1 | M | 48 | MD | 27 | intermediate | 3
C2 | F | 58 | Medical Asst | 35 | intermediate | 3
C3 | F | 38 | Medical Asst | 17 | intermediate | 4
C4 | F | 37 | MD | 16 | intermediate | 3
C5 | F | 74 | MD | 53 | intermediate | 3
C6 | M | 35 | Medical Asst | 14 | intermediate | 3
C7 | F | 25 | Medical Asst | 4 | intermediate | 1
Seven (7) participants (matching the demographics in the section on Participants) were recruited and all
participated in the usability test. Participants reviewed an informed consent form (see Appendix 3); they
were instructed that they could withdraw at any time. Participants all had prior experience with the
EHR. No participants failed to show for the study.
Participants were scheduled for 30-minute sessions with 30 minutes between each session for debriefing
by the administrator and data logger, and to reset systems to proper test conditions. All testing was
performed over several days to allow the participants to schedule time at their convenience. A
spreadsheet was used to keep track of the participant schedule.
The administrator introduced the test, and instructed participants to complete a series of tasks (given
one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the
data logger(s) recorded user performance data on paper and electronically. The administrator did not
give the participant assistance in how to complete the task.
2.2 STUDY DESIGN
The objective of this test was to perform summative testing to measure the key usability metrics of
effectiveness, efficiency, and user satisfaction. These metrics will uncover areas where the application
performed well and areas where the application failed to meet the needs of the participants in achieving
our internal usability goals. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used.
In short, this testing serves both as a means to record or benchmark current usability and as a way to
identify areas where improvements should be made.
During the usability test, participants interacted exclusively with the EHRUT. Each participant used the
system in their preferred location, and was provided with the same instructions. All sessions were
conducted remotely with Join.Me conferencing software. Screens with associated interaction and the
audio stream were recorded for later analysis. The system was evaluated for effectiveness, efficiency
and satisfaction as defined by measures collected and analyzed for each participant:
· Number of tasks successfully completed within the allotted time without assistance
· Time to complete the tasks
· Number and types of errors
· Path deviations
· Participant's verbalizations (comments)
· Participant's satisfaction ratings of the system
Additional information about the various measures can be found in Section 2.9, Usability Metrics.
2.3 TASKS
Tasks were constructed that would be realistic and representative of the kinds of activities a user might
do with this EHR in the area of Medication List Management and based on the CMS test scripts for
Meaningful Use Phase 2. The tasks for this study broadly included:
1. Enter 6 current medications
2. Change 3 current medications
3. Review current medications and history for a patient
Task Selection and Priority
Tasks were selected based on their frequency of use, criticality of function, and those that may be most
troublesome for users. The tasks were directly modeled on the CMS Medication List test procedure
[170.314(a)(6)].
Tasks were ordered and prioritized based on their impact on patient safety: the tasks with the greatest
potential for patient harm due to critical errors were performed first. Thus the entry of current
medications was followed by modification of some of those medications, and the session finished with
the review of current medications and then the medication history review task.
The task scenario document is contained in Appendix 4.
2.4 PROCEDURES
Once a participant was scheduled, an email was sent to the participant that:
· Confirmed the time of the session
· Provided Join.Me access codes
· Included a copy of the informed consent form
· Included a document containing the tasks for the test session
Just prior to the scheduled time, the test administrator started the join.me session and greeted the
participant on arrival; their identity was verified and matched with a name on the participant schedule.
Participants were then assigned a participant ID. Recording of the session was started using the join.me
recording feature.
Each participant reviewed and agreed to the informed consent and release form via verbal
acknowledgment (See Appendix 3). A representative from the test team witnessed the participant’s
verbal agreement.
To ensure that the test ran smoothly, testing was performed remotely by an experienced usability
practitioner with over 25 years of experience in healthcare user interface and workflow design.
The administrator moderated the session, including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on participant
comments, path deviations, and the number and type of errors.
Participants were instructed to perform the tasks (see specific instructions below):
· As quickly as possible, making as few errors and deviations as possible
· Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use
· Without using a think-aloud technique
For each session, the participants were given an electronic copy of the tasks for that session. They were
requested to not read the tasks prior to the session. Participants were asked to read the task aloud prior
to each task; task timing began once the participant finished reading the question and verbally indicated
they were starting the task. The task time was stopped once the participant indicated they had
successfully completed the task. Scoring is discussed below in Section 2.10, Data Scoring.
Following the session, the administrator gave the participant the post-test questionnaire (the System
Usability Scale, see Appendix 7), solicited any further comments, and thanked each individual for their
participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
2.5 TEST LOCATION
The test was conducted remotely through the use of Join.Me virtual conferencing and screen sharing
software. Thus the actual test location was at the discretion of the test participants. The test
administrator conducted the test from the Enabledoc LLC offices in Rochester, MN.
2.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, testing was
conducted in a healthcare office. The test computer was a Windows laptop running the Chrome browser.
The participants used a mouse and keyboard when interacting with the EHRUT.
The Enablemyhealth application was used on a 13 inch laptop with 1366 by 768 resolution and 32 bit
color. The application was set up by the vendor according to the vendor’s documentation describing the
system set-up and preparation. The application itself was running on a Windows 2012 Server using a
test database over a WAN connection. Technically, the system performance (i.e., response time) was
representative of what actual users would experience in a field implementation, with a minor lag caused
by video screen sharing and recording. Additionally, participants were instructed not to change any of
the default system settings (such as control of font size).
2.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
· Informed Consent
· Test task scenarios
· Moderator's Guide
· Observer's data collection template
· Post-test SUS Questionnaire
Examples of these documents can be found in Appendices 3 – 7 respectively.
The participant’s interaction with the EHRUT was captured and recorded digitally using the screen
recording capability of join.me running on the test machine. This recording included the audio stream of
verbalizations. The test sessions were electronically transmitted to any additional observers who logged
into the join.me session.
2.8 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full
moderator's guide in Appendix 5):
Thank you for participating in this study. Your input is very important. Our session today will last about
30 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the
tasks on your own, following the instructions very closely. Please note that we are not testing you; we
are testing the system, so if you have difficulty, all this means is that something needs to be
improved in the system. There are no wrong answers! We will be here in case you need specific help,
but we will not be able to instruct you or provide help in how to use the application; however, we may
provide specific hints as necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could
improve it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screen interaction of our session today. All of the information that you
provide will be kept confidential and your name will not be associated with your comments at any time.
Should you feel it necessary you are able to withdraw at any time during the testing for any reason.
Following the procedural instructions, participants were started with a specific patient’s chart data. Prior to
giving the participant mouse and keyboard control, the moderator gave the following instructions:
For each task, I will ask you to read the task and indicate when you begin. At that point, please
perform the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. You may certainly ask
questions if necessary and we may provide guidance or a hint, however we will not provide direct
instruction during the tasks. I will ask you your impressions about the task once you are done.
Participants were then given 4 tasks to complete. Tasks are listed in Appendix 4.
2.9 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is
for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing.
The goals of the test were to assess:
1. Effectiveness of ENABLEMYHEALTH EHR by measuring participant success rates and errors
2. Efficiency of ENABLEMYHEALTH EHR by measuring the average task time and path deviations
3. Satisfaction with ENABLEMYHEALTH EHR by measuring ease of use ratings
2.10 DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed:
Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without
assistance, within the overall time allotted for the entire set of tasks. The total number of successes
was calculated for each task and then divided by the total number of times that task was attempted.
The results are provided as a percentage.
Task times were recorded for successes. Observed task time divided by the optimal time for each task
is a measure of optimal efficiency. Due to the variability of multiple correct paths and a large variety
of user-settable preferences, all of which can affect time on task, optimal task times and deviations
from these times were unrealistic to assess. All participants were trained on Enablemyhealth EHR, so a
performance factor of 1.0 was used. The overall usability goal for any feature of the EHR is a task time
deemed acceptable by the end user with no critical errors. Thus, if expert, optimal performance on a
task was [60] seconds, then the allotted task time was [60 * 1.0] seconds. This ratio is aggregated
across tasks and reported with mean and variance scores.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or
was unsure if they had completed the task, the task was counted as a "Critical Failure." No task times
were taken for failed tasks. Minor errors were defined as an errant click, initial selection of an
incorrect menu option, or incorrect entries that the participant noticed and corrected. The total
number of errors was calculated for each task and then divided by the total number of times that task
was attempted. Minor errors and deviations were noted but not counted as significant errors. This is
expressed as the mean number of failed tasks per participant.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the
participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an
incorrect link, or interacted incorrectly with an on-screen control. These path deviations were included
in the minor error count. The minor error count is expressed as an average across all participants.

Efficiency: Task Time
Each task was timed from when the participant indicated they were beginning the task until the
participant said "Done." If he or she failed to say "Done," the time was stopped when the participant
stopped performing the task. Only task times for tasks that were successfully completed were included
in the average task time analysis. Average time per task was calculated for each task, as was variance
(standard deviation).

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering
both a simple post-task question and a post-session questionnaire. After each task, the participant was
asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 7 (Very Easy).
Average difficulty ratings per task were calculated, as was variance. Common convention is that
average ratings for systems judged easy to use should be 3.3 or above.
To measure participants' confidence in and likeability of the ENABLEMYHEALTH EHR feature overall, the
testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included,
"I think I would like to use this system frequently," "I thought the system was easy to use," and "I
would imagine that most people would learn to use this system very quickly." See the full System
Usability Scale questionnaire in Appendix 7.
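The success-rate, task-time, and observed/optimal ratio computations described above can be sketched in code. This is an illustrative sketch only: the record layout, function name, and sample values are our own, not the study's actual tooling or data.

```python
from statistics import mean, stdev

def score_task(attempts, optimal_time):
    """Score one task from a list of attempt records.

    Each attempt is a dict with:
      'success' - True if completed correctly, unassisted, within time
      'time'    - observed task time in seconds
      'errors'  - count of minor errors / path deviations observed
    """
    successes = [a for a in attempts if a["success"]]
    success_pct = 100.0 * len(successes) / len(attempts)
    # Task times are taken for successes only, per the scoring rules.
    times = [a["time"] for a in successes]
    return {
        "success_pct": success_pct,
        "time_mean": mean(times),
        "time_sd": stdev(times) if len(times) > 1 else 0.0,
        # Observed mean time divided by optimal time (performance factor 1.0).
        "time_ratio": mean(times) / optimal_time,
        "errors_mean": mean(a["errors"] for a in attempts),
    }

# Hypothetical attempts for a task with a 60-second optimal time.
attempts = [
    {"success": True, "time": 66, "errors": 1},
    {"success": True, "time": 72, "errors": 0},
    {"success": True, "time": 60, "errors": 2},
]
print(score_task(attempts, optimal_time=60))
```

With these hypothetical values, the task scores 100% success, a 66-second mean time, and an observed/optimal ratio of 1.1, mirroring the shape of the figures reported in Table 1.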
3 RESULTS
3.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above. Participants who failed to follow session and task instructions would have had
their data excluded from the analyses; however, there were no instances of this occurring.
The usability testing results for the ENABLEMYHEALTH EHR are detailed below (Table 1). The results
should be seen in light of the objectives and goals outlined in Section 2.2, Study Design. The data
should yield actionable results that, if corrected, yield a material, positive impact on user performance.
Table 1

Task | Number | Success % | Path Deviations (Observed/Optimal) Mean | Task Time Mean (SD), s | Task Time (Observed/Optimal) Mean | Task Errors Mean (SD) | Task Ratings (5=Easy) Mean (SD)
Create 6 medications | 6 | 100.00 | 1.1 (29.1/26) | 289 (8) | 1.1 (289/275) | 1.86 (1.1) | 3.7 (0.5)
Modify 3 medications | 3 | 100.00 | 1.3 (11.6/9) | 75 (19) | 1.3 (75/60) | 0.57 (0.7) | 4.1 (0.6)
Review Current Meds and History | 1 | 100.00 | 1.4 (2.9/2) | 18 (6) | 1.2 (18/15) | 0.43 (0.7) | 4.7 (0)
Mean across tasks | | 100.00 | 1.3 (14.5/12) | 128 (11) | 1.2 (128/116) | 1 (0.9) | 4.2 (0.4)
The SUS (System Usability Scale) scored the subjective satisfaction with the system, based on
performance with these tasks, at 81.4 with a standard deviation of 4.4. Broadly interpreted, this
indicates above-average usability, but with further room for improvement.
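The report gives only the aggregate SUS score; for reference, the standard SUS scoring rule (Brooke's method) used to produce such a score is sketched below. The responses shown are hypothetical, not the study's raw data.

```python
def sus_score(responses):
    """Standard System Usability Scale scoring (Brooke, 1996).

    responses: ten item ratings on a 1-5 scale, in questionnaire order.
    Odd-numbered items contribute (rating - 1); even-numbered items
    contribute (5 - rating). The sum is scaled by 2.5 to give 0-100.
    """
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based even index = odd item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# Hypothetical participants:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible -> 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

The study's reported 81.4 would be the mean of such per-participant scores across the seven participants.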
3.2 DISCUSSION OF THE FINDINGS
The Current Medication feature works similarly to electronic prescribing for entering medication scripts.
The findings and issues are identical to ePrescribe. Particularly noteworthy is that the participants were
familiar with this interface, and some had been using it for up to 3 years prior to this study.
Overall, the user experience of the Enablemyhealth CPOE feature exhibits acceptable effectiveness, with
an acceptably low critical error rate and minimal variance among users. The critical error rate is of
greatest concern for an EHR; in the case of the EHRUT there were no task failures due to critical errors
out of 91 attempted tasks, but there were 20 deviations from normal operations. This is within internal
usability goal guidelines. Of the deviation errors, 13 were attributable to current medications, primarily
in finding the correct medication and dosage/form, and a non-defaulting measurement unit.
Efficiency measures indicate that, while acceptable, there is room for improvement in the time to enter
medications. One older user took significantly longer to perform every task, but this can also be
attributed to nervousness and second-guessing themselves.
Satisfaction, measured by an SUS score of 81.4, is very good but can be improved. Working to make the
user experience more natural and intelligent will improve the overall user experience and help guide
both novice and advanced users.
3.3 EFFECTIVENESS
Based on the success rate, minor error rate, and critical error rate data, the medication list feature has
an acceptable level of effectiveness. The usability test consisted of 70 tasks, of which 70 were attempted.
Out of the 70 attempted tasks, there were no critical errors, but there were 20 deviations from optimal
operation. The minor error mean of 1.86 (1.1 standard deviation) is within the acceptable range of
effectiveness, with the errors most often associated with entry of medications. These numbers are
consistent with prescription CPOE.
3.4 EFFICIENCY
Observation of task time and minor errors (which include path deviations) indicates acceptable
performance; however, there is ample room for improvement to reduce time on task. The mean task
time across all tasks is 127.6 seconds (11-second standard deviation), but the mean task time for
entering the 6 medications was 289 seconds (8-second standard deviation). This is largely due to making
sure all the medication field data was entered correctly. The average eRx took 39 seconds per
prescription versus 48 seconds per current medication entered, but this is within one standard
deviation. We had expected some improvement, but the time to enter medications varies depending on
the complexity of the script.
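As a quick arithmetic check on the per-entry figure cited above (values taken from Table 1):

```python
# The "Create 6 medications" task had a 289 s mean task time,
# which implies the per-medication entry time cited in the text.
per_med = 289 / 6
print(round(per_med, 1))  # 48.2 s per current medication entered
```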
3.5 SATISFACTION
Based on task difficulty ratings and SUS results data, user satisfaction with the feature is considered
favorable. Users' perception of ease of use is considered easy, with a mean rating of 4.04 across all
tasks (1=very difficult, 5=very easy). SUS scoring is very good, with a score of 81.4. These scores
describe the average participant satisfaction with the Enablemyhealth feature set.
SUS Scoring: Mean 81.43, StDev 4.40
3.6 MAJOR FINDINGS
The major usability problems encountered in this study were:
Current Medications list difficulties:
· Time on task continues to be a serious problem, because there are so many fields required to write a script
· Medications other than tablets (e.g., injectables, inhalers) presented the most serious challenges
· Non-tablet prescriptions consistently received lower ease-of-use scores
· It was not easily understood how the frequency field works, nor that this field, together with dose and days, calculates the quantity and auto-generates the directions
3.7 AREAS FOR IMPROVEMENT
Current medications improvements:
1. Always default a measurement unit for every prescription type.
2. Add speech recognition to allow prescription to be spoken and the script to be automatically
filled in with natural language processing.
3. Add more intelligence to recommend prescriptions and scripts based on diagnosis and patient
demographics.
4. Provide a wizard option that leads users through the prescription scripting process.
4 APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1: Participant Recruiting Screener
2: Participant demographics
3: Informed Consent Form
4: Task Scenarios
5: Moderator’s Guide
6: Sample data collection form
7: System Usability Scale Questionnaire
Appendix 1: PARTICIPANT RECRUITING SCREENER
Introduction
Hello, my name is __. Enabledoc is seeking doctors and clinicians who are users of
• Electronic prescribing software
• Computerized order entry (such as labs and imaging)
• Medication list and medication allergy list management
to take part in a usability study of that portion of the EnableMyHealth EHR.
This study will assist us in designing and developing a solution that meets your needs. Your experiences
in using this particular design will greatly help our designers and developers. The testing of our design
will take place in your office using remote meeting technology, requiring only your time, thoughts, and
suggestions. We expect the session to last approximately 30 minutes.
Does this sound like something that interests you? Before I schedule you for a session, do you have a
few moments to answer some questions?
General Questions
1. Are you male or female? [Recruit a mix of participants]
2. Have you participated in a focus group or usability test in the past three months? [Note but do not
terminate if yes]
3. Which of the following best describes your age? [25 or less; 26 to 39; 40 to 59; 60 to 74; 75 and
older][Recruit a mix of ages]
Professional Demographics
4. What is your current position/role in your practice?
5. How long have you been in this role?
6. Do you currently perform e-prescribing? [computerized order entry, manage medication and
medication allergy lists – pick appropriate for test][Terminate if no to specific task for test] How many
prescriptions [or lab orders – use appropriate choice for test] do you write per day (or week if that is a
better estimate)?
7. What year did you receive your medical degree?
Computer Expertise
8. About how many hours per week do you spend on the computer that is medical practice related?
[Recruit a range of use, e.g., 0 to 10, 11 to 25, 26+ hours per week][Terminate if less than 5]
9. Do you use a computer outside of your medical work?
10. If so, about how many hours per week do you spend using a computer for non-work related endeavors?
11. Regarding your use of EnableMyHealth Practice Manager & EHR, what percentage of time is spent in
each product? (per day)
Domain Knowledge
· Rate your expertise or comfort in using EnableMyHealth software on a scale of 1 to 5 (1=just starting, 5=expert) for
  § PM
  § EHR (overall)
  § EHR e-Prescribing
  [Terminate if 1 or 2]
· If you have used similar or competing products, describe your level of expertise in those products [it is not necessary to name the product(s) unless they want to].
Contact Information
[If the person matches your qualifications, ask for any info we do not have] May I have your contact
information?
· Name of participant:
· Office Key:
· Best phone number:
· Email address:
Those are all the questions I have for you. Your background matches the people we're looking for.
Would you be able to participate on [date, time]?
Alternative: select from a list of sessions [ideal approach]
Alternative: What would be the best date and time for you?
Before your session starts, we will ask you to verbally acknowledge a release form allowing us to record
your session. The recording will only be used internally for further study if needed and will never be
used for advertising or marketing purposes. Also, you will not be personally identified with any
recording. Will you consent to be recorded? [Terminate if no]
This study will take place remotely via conferencing software, allowing you to participate in the session
at the place of your choosing. I will confirm your appointment a couple of days before your session and
provide you with any additional information. What is the best time to contact you?
Appendix 2: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.

Gender
  Men: 2
  Women: 5
  Total (participants): 7

Occupation/Role
  RN/NP/MA: 4
  Physician: 3
  Total (participants): 7

Years of Experience (average)
  Professional: 23.71
  EHR Product Use: 2.86
Demographic detail

Participant ID  Gender  Age  Occupation/role    Professional      Computer Experience         Product
                                                Experience (yrs)  (intermediate, advanced,    Experience (yrs)
                                                                  expert)
C1              M       48   MD                 27                intermediate                3
C2              F       58   Medical Assistant  35                intermediate                3
C3              F       38   Medical Assistant  17                intermediate                4
C4              F       37   MD                 16                intermediate                3
C5              F       74   MD                 53                intermediate                3
C6              M       35   Medical Assistant  14                intermediate                3
C7              F       25   Medical Assistant  4                 intermediate                1
Appendix 3: Informed Consent Form
Enabledoc would like to thank you for participating in this study. The purpose of this study is to evaluate
an electronic health records system. If you decide to participate, you will be asked to perform several
tasks using the prototype and give your feedback. The study will last up to 60 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by Enabledoc, I
am free to withdraw consent or discontinue participation at any time. I understand and agree to
participate in the study conducted by Enabledoc.
I understand and consent to the use and release of the recording by Enabledoc. I understand that the
information and recording is for research purposes only and that my name and image will not be used
for any purpose other than research. I relinquish any rights to the recording and understand the
recording may be copied and used by Enabledoc without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and
usable in the future. I understand and agree that the data collected from this study will not be shared
outside of Enabledoc.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e.,
identification numbers not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I
understand that I can leave at any time.
Please check or verbally indicate one of the following:
[ ] YES, I have read the above statement and agree to be a participant.
[ ] NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________
Appendix 4: Task Scenarios
Usability Task Scenarios – Medication List
Access the clinical information for a patient of your choice. Make note of the patient’s name as that will
be required in later tasks.
1: Record Patient Active Medication List -- this patient provides their current medications. Each one
below is selected, modified, and then saved:
Simvastatin 20 mg tablet by mouth once daily
Lorazepam 0.5 mg tablet by mouth three times daily
Insulin Glargine 10 units mL once daily for 10 days
Metoprolol Tartrate 50 mg tablet by mouth once daily
Warfarin 5 mg tablet by mouth once daily Monday, Wednesday, Friday, Sunday
Warfarin 2.5 mg tablet by mouth once daily Tuesday, Thursday, Saturday
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
2: Change Patient Active Medication List – make the following changes to the patient’s medication list.
Click Current Medications, then click Edit next to each one of the medications below and Update each
one with these changes:
· Simvastatin 20 mg tablet was discontinued (uncheck Taking box)
· The frequency of Lorazepam 0.5 mg tablet was changed from three times daily to every six
hours (select Frequency)
· The dose of Insulin Glargine was changed from 10 units to 20 units (make Dose amount 2)
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
3: Access Patient Active Medication List – retrieve the clinical records for patient Angelica C Taber and
review their active medications with the usability session facilitator
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
4: Access Patient Medication History – Click Patient Center then click Medication tab and review the
medications:
· Simvastatin 20 mg tablet by mouth once daily - Discontinued
· Lorazepam 0.5 mg tablet by mouth every six hours – Active
· Insulin Glargine 20 units once daily - Active
· Metoprolol Tartrate 50 mg tablet by mouth once daily - Active
· Warfarin Sodium 5 mg tablet by mouth once daily Monday, Wednesday, Friday, Sunday - Active
· Warfarin Sodium 2.5 mg tablet by mouth once daily Tuesday, Thursday, Saturday – Active
Overall, how difficult or easy did you find this task?
Very Difficult  1   2   3   4   5  Very Easy
Appendix 5: Usability Test Administration Moderator’s Guide
1. Items often forgotten
a. Stop watch for recording time on task
b. Elapsed time timer to time code observations/comments in the recording
2. Office Key Setup
a. Environment
b. Obtain copy of user key database if required for test
c. Any required feature access
d. Create and enter any necessary test data
e. Create Database / Office Key snapshot
3. Schedule Join.Me sessions
4. Schedule participants
5. Supply test participants with materials prior to test
a. Join.Me access
b. Task scenarios (be prepared to resend at the start of the test – do not assume the
participant has multiple monitors, so they may also need to print the scenarios)
c. Informed consent form (can be done verbally during the session)
6. Set up administrative / test PC
a. Join.Me installed and up to date
b. Shut down all programs not needed for the test – remember things like Communicator
c. Turn off notifications – audible and screen – on any software that may be running during
the test (e.g.: Outlook)
d. Ensure the application is installed and functional
7. Running a test session
a. Ensure office key/database snapshot of base data exists
b. Start EHR session on test PC and arrive at starting screen for test
c. Select first patient if selection is not part of the task scenario
d. Start Join.Me session
e. Share appropriate screen
f. Greet user
g. Administer introductory materials to user (see below, following checklist)
h. Start Join.Me recording
i. Display informed consent form on shared screen
j. Obtain consent via verbal acceptance
k. Give mouse and keyboard control to ALL
l. Have user work through tasks
m. For each task, record
i. Time on task
ii. Critical errors (anything that constitutes task failure)
iii. Minor errors (anything the user detects and recovers)
iv. Deviation(s) from optimal path
v. Interesting comments (try to include recording time stamp for reference)
n. After each task or task set (see specific scenarios), ask for a response to the single ease of use
question
o. Allow verbal response to any questions contained in task scenarios
p. At end of all tasks, display SUS questionnaire and have the participant place an X in
appropriate response to each statement
i. Save SUS with participant code as part of file name
q. Conduct debrief to solicit any additional feedback, comments, Q&A, etc.
Thank you for participating in this study. Your input is very important. Our session today will last about
45 minutes. During that time you will use a version of EnableMyHealth EHR and work with specific
features. Our goal is to determine where there are areas of difficulty and design aspects that can be
improved.
I will ask you to complete a few tasks using this system and answer some questions. You should complete
the tasks as quickly as possible while making as few errors as possible. Please try to complete the tasks
on your own, following the instructions very closely. Please note that we are not testing you; we are
testing the system, so if you have difficulty, all this means is that something needs to be improved in
the system. There are no wrong answers! We will be here in case you need specific help, but we will not
be able to instruct you on how to use the application; however, we may provide specific hints as
necessary.
Overall, we are interested in how easy (or how difficult) this system is to use, and how we could improve
it. I did not have any involvement in its creation, so please be honest with your opinions.
We are recording the audio and screenshots of our session today. All of the information that you provide
will be kept confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing for any reason.
Appendix 6: Sample data collection form
Summative Usability Test Data Log
Test type:
Participant:
Participant code:
Session Date/start time:
File name:
Task 1:
·
·
·
·
·
·
Critical Error count:
Minor error count:
Optimal path deviations:
Time on task:
SEoUQ response:
Comments/observations:
<repeat for each task>
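The measures on this log (time on task, critical and minor errors, path deviations, and the Single Ease of Use Question response) are typically rolled up per task across participants. A minimal sketch of that aggregation, using hypothetical records with the same field names as the form above:

```python
from statistics import mean

# Hypothetical per-task records mirroring the data log fields:
# time on task (seconds), critical errors (task failure), minor
# errors (detected and recovered), deviations, and SEoUQ response.
task_logs = [
    {"participant": "C1", "task": 1, "time": 95, "critical": 0,
     "minor": 1, "deviations": 0, "seouq": 4},
    {"participant": "C2", "task": 1, "time": 130, "critical": 1,
     "minor": 2, "deviations": 1, "seouq": 3},
]

def summarize(logs, task):
    """Aggregate one task's logs into common summative measures."""
    rows = [r for r in logs if r["task"] == task]
    return {
        # average completion time across participants
        "mean_time": mean(r["time"] for r in rows),
        # success = no critical errors on the task
        "success_rate": sum(r["critical"] == 0 for r in rows) / len(rows),
        # average Single Ease of Use Question rating (1-5)
        "mean_seouq": mean(r["seouq"] for r in rows),
    }

print(summarize(task_logs, 1))
```

The field names and records here are illustrative only; the actual log is filled in by hand during each session.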
Appendix 7: System Usability Scale Questionnaire
In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems
usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have
elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at
http://www.usabilitynet.org/trump/documents/Suschapt.doc or in Tullis and Albert (2008).
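The scoring rule from Brooke's paper can be sketched as follows. This is an illustrative implementation of the standard published formula, not part of the original study materials: odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the sum is multiplied by 2.5 to yield a 0-100 score.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from the
    ten 1-5 Likert responses, per Brooke's original formula."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Strongly agreeing with every positive item and strongly
# disagreeing with every negative item gives the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A neutral response of 3 to every item yields a score of 50.0, which is why SUS scores are interpreted against published benchmarks rather than as percentages.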
Rate each statement from 1 (Strongly Disagree) to 5 (Strongly Agree):

1. I think that I would like to use this system frequently            1  2  3  4  5
2. I found the system unnecessarily complex                           1  2  3  4  5
3. I thought the system was easy to use                               1  2  3  4  5
4. I think that I would need the support of a technical person
   to be able to use this system                                      1  2  3  4  5
5. I found the various functions in this system were well integrated  1  2  3  4  5
6. I thought there was too much inconsistency in this system          1  2  3  4  5
7. I would imagine that most people would learn to use this system
   very quickly                                                       1  2  3  4  5
8. I found the system very cumbersome to use                          1  2  3  4  5
9. I felt very confident using the system                             1  2  3  4  5
10. I needed to learn a lot of things before I could get going with
    this system                                                       1  2  3  4  5
Appendix C: Quality Management System

Appendix D: Privacy and Security

Test Results Summary Document History
Version: V1.0
Description of Change: Initial release
Date: 3/03/2016

END OF DOCUMENT
