Report - Drummond Group

ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144 Rev 01-Apr-2015
Part 1: Product and Developer Information
1.1 Certified Product Information
Product Name: Florida Department of Health, Health Management System
Product Version: 2015.1.05
Domain: Ambulatory
Test Type: Modular EHR
1.2 Developer/Vendor Information
Developer/Vendor Name: Florida Department of Health
Address: 4052 Bald Cypress Way, Tallahassee, FL 32399
Website: www.floridahealth.gov
Email: [email protected]
Phone: 850-245-4255
Developer/Vendor Contact: Michael Cragg
Part 2: ONC-Authorized Certification Body Information
2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith
This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:
ONC-ACB Authorized Representative: Bill Smith
Function/Title: Certification Body Manager
Signature and Date: 5/4/2015
2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification (§170.314):
(a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)
*Gap certification allowed for Inpatient setting only
x No gap certification
2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification (§170.314; x = inherited):
x (a)(1), x (a)(2), x (a)(3), x (a)(4), x (a)(5), x (a)(6), x (a)(7), x (a)(8), x (a)(9), x (a)(10), x (a)(11), (a)(12), (a)(13), x (a)(14), x (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only
x (b)(1), x (b)(2), x (b)(3), x (b)(4), x (b)(5), x (b)(6) Inpt. only, x (b)(7)
x (c)(1), x (c)(2), x (c)(3)
x (d)(1), x (d)(2), x (d)(3), x (d)(4), x (d)(5), x (d)(6), x (d)(7), x (d)(8), x (d)(9) Optional
(e)(1), x (e)(2) Amb. only, x (e)(3) Amb. only
x (f)(1), x (f)(2), x (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only
(g)(1), x (g)(2), x (g)(3), x (g)(4)
No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: SG-04292015-2462
Test Date(s): N/A
3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 512-335-5606
ATL Contact: Beth Morrow
For more information on scope of accreditation, please reference NVLAP Lab Code 200979-0.
Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
ATL Authorized Representative: Sonia Galvan
Function/Title: Test Proctor
Location Where Test Conducted: N/A
Signature and Date: 5/4/2015
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification

Additional Software | Applicable Criteria | Functionality Provided by Additional Software
Emdeon | 170.314.b.3; 170.314.f.3 | e-Prescribing vendor
EMR Direct phiMail | 170.314.b.1, 2 | HISP
Microsoft Health Vault | 170.314.e.3 | Patient Portal Messaging
Microsoft Excel | 170.314.a.14 | Patient List

No additional software required
3.2.2 Test Tools

Test Tool | Version
x Cypress | 2.4.1
x ePrescribing Validation Tool | 1.0.4
HL7 CDA Cancer Registry Reporting Validation Tool | 1.0.3
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.8
x HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.8
x HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 1.7
x HL7 v2 Syndromic Surveillance Reporting Validation Tool | 1.7
x Transport Testing Tool | 179
x Direct Certificate Discovery Tool | 3.0.2
No test tools required
3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
No alteration (customization) to the test data was necessary
3.2.4 Standards
3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that has been successfully tested where more than one standard is permitted (x = successfully tested):

Criterion # | Standard
(a)(8)(ii)(A)(2) | §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
(a)(8)(ii)(A)(2) | §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
x (a)(13) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(a)(13) | §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
x (a)(15)(i) | §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
(a)(15)(i) | §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(16)(ii) | §170.210(g) Network Time Protocol Version 3 (RFC 1305)
(a)(16)(ii) | §170.210(g) Network Time Protocol Version 4 (RFC 5905)
(b)(2)(i)(A) | §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
x (b)(2)(i)(A) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(b)(7)(i) | §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
x (b)(7)(i) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(e)(1)(i) | Annex A of the FIPS Publication 140-2 [list encryption and hashing algorithms]
(e)(1)(ii)(A)(2) | §170.210(g) Network Time Protocol Version 3 (RFC 1305)
(e)(1)(ii)(A)(2) | §170.210(g) Network Time Protocol Version 4 (RFC 5905)
(e)(3)(ii) | Annex A of the FIPS Publication 140-2: AES, SHA-256
Common MU Data Set (15) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
x Common MU Data Set (15) | §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable
3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:
Newer Version | Applicable Criteria
No newer version of a minimum standard was tested
3.2.5 Optional Functionality

Criterion # | Optional Functionality Successfully Tested
x (a)(4)(iii) | Plot and display growth charts
(b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)
No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # | TP** | TD***
(a)(1) | 1.2 |
(a)(2) | 1.2 |
(a)(3) | 1.2 | 1.4
(a)(4) | 1.4 | 1.3
(a)(5) | 1.4 | 1.3
(a)(6) | 1.3 | 1.4
(a)(7) | 1.3 | 1.3
(a)(8) | 1.2 |
(a)(9) | 1.3 | 1.3
(a)(10) | 1.2 | 1.4
(a)(11) | 1.3 |
(a)(12) | 1.3 |
(a)(13) | 1.2 |
(a)(14) | 1.2 |
(a)(15) | 1.5 |
(a)(16) Inpt. only | 1.3 |
(a)(17) Inpt. only | 1.2 |
(b)(1) | 1.7 | 1.4
(b)(2) | 1.4 | 1.6
(b)(3) | 1.4 | 1.2
(b)(4) | 1.3 | 1.4
(b)(5) | 1.4 | 1.7
(b)(6) Inpt. only | 1.3 | 1.7
(b)(7) | 1.4 | 1.6
(c)(1) | 1.6 | 1.6
(c)(2) | 1.6 | 1.6
(c)(3) | 1.6 |
(d)(1) | 1.2 |
(d)(2) | 1.5 |
(d)(3) | 1.3 |
(d)(4) | 1.3 |
(d)(5) | 1.2 |
(d)(6) | 1.2 |
(d)(7) | 1.2 |
(d)(8) | 1.2 |
(d)(9) Optional | 1.2 |
(e)(1) | 1.8 | 1.5
(e)(2) Amb. only | 1.2 | 1.6
(e)(3) Amb. only | 1.3 |
(f)(1) | 1.2 | 1.2
(f)(2) | 1.3 | 1.7.1
(f)(3) | 1.3 | 1.7
(f)(4) Inpt. only | 1.3 | 1.7
(f)(5) Optional & Amb. only | 1.2 | 1.2
(f)(6) Optional & Amb. only | 1.3 | 1.0.3
(g)(1) | 1.7 | 1.9
(g)(2) | 1.7 | 1.9
(g)(3) | 1.3 |
(g)(4) | 1.2 |

x No criteria tested
*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested:
x Ambulatory
Inpatient
No CQMs tested
*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)
Ambulatory CQMs successfully tested (CMS ID): 117, 124, 137 (v2), 146 (v2), 156, 166 (v2)
3.2.8 Automated Numerator Recording and Measure Calculation
3.2.8.1 Automated Numerator Recording
Automated Numerator Recording Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)
x Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation
Automated Measure Calculation Successfully Tested (x = tested):
x (a)(1), x (a)(3), x (a)(4), x (a)(5), x (a)(6), x (a)(7), x (a)(9), x (a)(11), x (a)(12), x (a)(13), x (a)(14), x (a)(15), (a)(16), (a)(17), x (b)(2), x (b)(3), x (b)(4), x (b)(5), (b)(6), (e)(1), x (e)(2), x (e)(3)
Automated Measure Calculation was not tested
3.2.9 Attestation
Attestation Forms (as applicable) | Appendix
x Safety-Enhanced Design* | A
x Quality Management System** | B
x Privacy and Security | C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product
3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID | Description of Change | Date
2014 Edition Test Report Summary | |
USER-CENTERED DESIGN REPORT – TEST REPORT UPDATE
This test report was updated in December 2015 to satisfy the User-Centered Design Report specifications set by ONC. The new Test Report ID is amended as follows: "Part 3: NVLAP-Accredited Testing Laboratory Information: Report Number" plus the suffix "_Dec2015".

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
The Florida Department of Health, Health Management System Version 2014.4.00
Date of Usability Test: 8/28/2014, 9/8/2014, 9/9/2014
Date of Report: 9/20/2014
Report Prepared By: The Florida Department of Health
Michael Cragg, Project Manager, Contract Staff
850-245-4255
[email protected]
4052 Bald Cypress Way 110R.02
Tallahassee, FL 32399-1733
EHR Usability Test Report of
The Health Management System (HMS) Version 2014.4.00
Table of Contents
1. Executive Summary
   1.1. Major findings
   1.2. Areas for improvement
2. Introduction
3. Method
   3.1. Participants
   3.2. Study Design
   3.3. Tasks
   3.4. Procedure
   3.5. Test Location
   3.6. Test Environment
   3.7. Test Forms and Tools
   3.8. Participant Instructions
   3.9. Usability Metrics
   3.10. Data Scoring
4. Results
   4.1. Data Analysis and Reporting
   4.2. Discussion of the Findings
      4.2.1. Effectiveness
      4.2.2. Efficiency
      4.2.3. Satisfaction
      4.2.4. Major Findings
      4.2.5. Areas for Improvement
5. Appendices
   A. Appendix 1: Recruiting screener
   B. Appendix 2: Participant demographics
   C. Appendix 3: Informed Consent Form
   D. Appendix 4: Moderator's Guide
   E. Appendix 5: System Usability Scale Questionnaire
1. Executive Summary
A usability test of The Florida Department of Health, Health Management System Ambulatory EHR
was conducted on 8/28/2014, 9/8/2014 and 9/9/2014 at the Florida Department of Health Central
Office, Hillsborough and Pinellas county health departments by The Florida Department of Health.
The purpose of this test was to test and validate the usability of the current user interface, and
provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, six healthcare
providers matching the target demographic criteria served as participants and used the EHRUT in
simulated, but representative tasks.
This study collected performance data on 25 tasks typically conducted on an EHR. The tasks were
concentrated in the following areas:
• § 170.314(a)(1) Computerized provider order entry
• § 170.314(a)(2) Drug-drug, drug-allergy interaction checks
• § 170.314(a)(6) Medication list
• § 170.314(a)(7) Medication allergy list
• § 170.314(a)(8) Clinical decision support
• § 170.314(b)(3) Electronic prescribing
• § 170.314(b)(4) Clinical information reconciliation
The tasks listed above were forced-ranked and tested in order of the degree of associated risk and the potential to cause patient harm if performed incorrectly by the user.
The elements used to evaluate the risk included:
• Number of individuals that would see/be able to check the request prior to the patient's involvement
• The number of steps required to complete the effort
• The base medical knowledge required by the user for each criterion
• Frequency of the need to perform each function
The resultant forced ranking was as follows:
• § 170.314(a)(7) Medication allergy list
• § 170.314(a)(6) Medication list
• § 170.314(b)(3) Electronic prescribing
• § 170.314(a)(1) Computerized provider order entry (Medications only)
• § 170.314(a)(2) Drug-drug, drug-allergy interaction checks
• § 170.314(a)(1) Computerized provider order entry (Labs, Radiology)
• § 170.314(a)(8) Clinical decision support
• § 170.314(b)(4) Clinical information reconciliation
During the hour-long one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included as Appendix 3). Participants were instructed that they could withdraw at any time. Most participants had prior experience with the EHR; however, no participants had experience performing all of the tasks
included in the study. One week prior to the study, each participant was provided instructional
design materials that explained the functionality within each of the modules involved with the test.
These materials are similar in form and content to the materials provided to the user community with
each EHR upgrade or release. The administrator introduced the test, and instructed participants to
complete a series of tasks (given one at a time) using the EHRUT. During the testing, the
administrator timed the test while the data logger recorded user performance data on paper. The
administrator did not give the participant assistance on how to complete the task but rather directed
the participant to the instructional design materials. Note: The administrator made available
additional copies of the materials if the training materials previously provided were not readily
available.
Participant screen navigation and audio were recorded for subsequent analysis. The following types
of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time taken to complete the tasks
• Number and types of errors
• Path deviations
• Participant's verbalizations
• Participant's satisfaction ratings of the system
Following the conclusion of the test, participants were asked to complete a post-test questionnaire.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the
Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate
the usability of the EHRUT. Following is a summary of the performance and rating data collected on
the EHRUT, represented in TABLE 1: EHRUT Performance and Rating data summarized by application
module. All participant data was de-identified – no correspondence could be made from the identity
of the participant to the data collected.
Module | Task # | Task Success Mean (%) | Path Deviation (Observed/Optimal) | Path Deviation (%) | Task Time Mean (s) | Optimal Time (s) | Task Time Deviation (%) | Errors Mean | Task Rating Mean (1=Easy)
Allergy | 1.x | 94% | 16.6/13 | +27.7% | 50 | 17 | +194% | 1.02 | 1.7
Medication | 2.x | 92% | 37/29 | +27.6% | 103 | 32 | +222% | 1.82 | 2.3
DUR | 3.x | 92% | 29.55/24 | +23% | 113 | 59 | +91.5% | 1.38 | 2.2
CPOE-Lab | 4.x | 89% | 39.48/27 | +46% | 96 | 31 | +236% | 1.52 | 2
CPOE-Rad | 4.x | 75% | 23.7/18 | +31.7% | 153 | 42 | +264% | 1.3 | 2.7
CDS | 5.x | 89% | 66/61 | +8% | 65 | 45 | +44% | .2 | 1.7
Reconciliation | 6.x | 100% | 18.42/16 | +15% | 66 | 10 | +560% | .67 | 2.3
TABLE 1: EHRUT Performance and Rating data summarized by application module.
The SUS (System Usability Scale) results score the subjective satisfaction with the system based on performance. The SUS score for the tasks in this test was 58 (as seen in Table 5 below). Broadly interpreted, scores under 68 represent systems with below-average usability; scores over 68 would be considered above average.
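For reference, SUS scoring follows fixed arithmetic: each odd-numbered (positively worded) item contributes its raw rating minus 1, each even-numbered (negatively worded) item contributes 5 minus its raw rating, and the per-respondent contribution sum is scaled by 2.5 to a 0-100 score. A minimal Python sketch, using illustrative responses rather than the study's raw questionnaire data:

```python
# Minimal sketch of standard SUS scoring. The responses below are
# illustrative examples, not the study's actual questionnaire data.

def sus_score(responses: list[list[int]]) -> float:
    """responses: one 10-item list of 1-5 ratings per respondent."""
    per_respondent = []
    for r in responses:
        contrib = sum(
            (raw - 1) if i % 2 == 0 else (5 - raw)  # items 1,3,5,... positive; 2,4,6,... negative
            for i, raw in enumerate(r)
        )
        per_respondent.append(contrib * 2.5)  # scale the 0-40 contribution sum to 0-100
    return sum(per_respondent) / len(per_respondent)

# Two illustrative respondents:
print(sus_score([[3, 2, 3, 2, 3, 3, 2, 3, 3, 2],
                 [4, 1, 4, 2, 4, 2, 3, 2, 3, 2]]))  # -> 63.75
```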
In addition to the performance data, the following qualitative observations were made:
1.1. Major findings
The following are the major findings of the study, based on interpretation of the qualitative findings, verbal reports of the participants, and observations from the administrator and data logger.
• Users experienced inconsistency in the look, feel and functionality of pages within the application.
• The Computerized Provider Order Entry process is spread across multiple pages, creating difficulty in tracking new orders, orders yet to be completed, orders in process and resulted orders.
• Efficiency is diminished when users are required to navigate to a separate page to complete workflow tasks. This is most apparent in the 'additional information' required in e-prescribing and in navigating to the clinical decision support module.
• The criteria that cause Clinical Decision Support (CDS) rules to fire are inconsistent.
1.2. Areas for improvement
Based upon participant feedback, the major areas of improvement for the EHRUT have been identified as: application consistency, workflow integration, consolidation of the CPOE ordering process, and enhancements to new functionality, including clinical decision support.
Application Consistency
The study demonstrated that as users navigated through the application inconsistencies in the
way information was displayed reduced the overall quality of the user experience. In addition,
the controls used on the pages govern the way users interact with the application. The variation
in control behavior requires extra effort on the part of the user to complete tasks and increases
the potential for associated errors.
Another inconsistency is the presence of outdated functionality. As the application has evolved,
the addition of new functionality has sometimes necessitated the development of new forms and
pages to perform tasks. In certain cases, the superseded pages were not removed. This is acknowledged as a safety concern.
Consolidation of Ordering Information for CPOE
The Clinician Portal contains a menu selection for orders that includes a page for lab, medication
and radiology orders as well as in-house services. The links from this section direct the user to
the order entry page. A separate workflow must be followed to see outstanding orders and
results. The study showed that this additional workflow was not always recognized by the study
participants. The disjointed nature of the CPOE order entry in HMS may result in duplicate orders
entered and/or missed results.
Workflow
The Clinical Decision Support (CDS) module contains all clinical decision alerts for a patient. The
user is alerted that CDS alerts exist for the patient by an alert counter that is located on the Clinic
Visit banner. Certain workflows within the application do not require that the user access the
specific pages where the presence of an alert is noted.
During the electronic ordering process, the Florida Department of Health requires collection of
additional data elements before an order can be completed. The method used to collect this data
is not well-integrated with the application’s workflow. As a result, this selection was commonly
missed by users, prompting an alert and additional step.
Clinical Decision Support
The Clinical Decision Support module is designed to alert clinicians to take action based upon
values entered into the EHR including demographics, allergies, medications, vital signs, problems
or a combination of this information. The application does not provide any differentiation
between alerts based on severity. The majority of users completed Clinical Decision Support tasks without difficulty; however, inconsistencies in the triggering criteria added non-value-added steps to some alerts.
2. Introduction
The EHRUT tested for this study was The Florida Department of Health, Health Management System
Version 2014.4.00 Ambulatory EHR. The EHRUT is designed to present medical information to
healthcare providers in Florida’s 67 county health departments. The EHRUT consists of standard
clinical functionality, programmatic templates representing the programs within the Florida
Department of Health and a complete practice management module. The usability testing attempted
to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and
provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness,
efficiency and user satisfaction, such as the time spent on tasks, frequency that a successful task
execution occurred and the total number of path deviations, were captured during the usability
testing.
3. Method
3.1. Participants
A total of six participants were tested on the EHRUT. Participants in the test were doctors and
nurses. Participants were recruited by the Florida Department of Health and were compensated
for their time through their existing salaries. In addition, participants had no direct connection to
the development of the EHRUT(s). Participants were not from the testing or supplier
organization. Participants were given the opportunity to have the same orientation and level of
training as the actual end users would have received.
For test purposes, end-user characteristics were identified and translated into a recruitment
screening tool used to solicit potential participants (included as Appendix 1).
Recruited participants had a mix of backgrounds and demographic characteristics conforming to
the recruitment screener. TABLE 2: Study participant demographics and experience, lists
participants by characteristics, including demographics, professional experience, computing
experience and user needs for assistive technology. Participant names were replaced with
Participant IDs so that an individual’s data could not be tied back to individual participants.
Part ID | Gender | Age | Education | Occupation/Role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
3456777516 | Male | 40-49 | Postgraduate | Physician | 15-20 yrs. | Extensive | Extensive | None
3437124919 | Female | 50-59 | Postgraduate | Registered Nurse | 20-25 yrs. | Moderate | Moderate | None
3430418030 | Female | 30-39 | College graduate | Registered Nurse | 10-15 yrs. | Minimal | Minimal | None
3428591363 | Female | >60 | Postgraduate | Advanced Registered Nurse Practitioner | >25 yrs. | Extensive | Moderate | None
3478167954 | Female | 40-49 | Postgraduate | Physician | 15-20 yrs. | Moderate | Extensive | None
3495586237 | Female | 30-39 | Some college | Licensed Practical Nurse | 10-15 yrs. | Moderate | Moderate | None
TABLE 2: Study participant demographics and experience.
Seven participants (matching the demographics in the Participants section) were recruited and six
participated in the usability test. One participant was unwilling to perform the required tasks to
be included within the study.
Participants were scheduled for one hour sessions with sufficient time allocated between each
session for debrief by the administrator(s) and data logger(s), and reset systems to proper test
conditions. A spreadsheet was used to keep track of the participant schedule, and included each
participant’s demographic characteristics as provided by the recruiters.
3.2. Study Design
Overall, the objective of this test was to uncover areas where the application performed well
(effectively, efficiently, and with satisfaction) and areas where the application failed to meet the
needs of the participants. The data from this test may serve as a baseline for future tests with an
updated version of the same EHR and/or comparison with other EHRs provided the same tasks
are used. In short, this testing serves both to record or benchmark current usability and to identify areas where improvements must be made.
During the usability test, participants interacted with only The Florida Department of Health,
Health Management System. The participants used the system in different locations and worked
from their usual work location using the technical resources (computer and peripherals) that
were familiar to them. Participants were provided with the same instructions and a common
training curriculum. The system was evaluated for effectiveness, efficiency and satisfaction as
defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant's verbalizations (comments)
• Participant's satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability
Metrics.
3.3. Tasks
A number of tasks were constructed that would be realistic and representative of the kinds of
activities a user might complete with this EHR, including:
§170.314(a)(7) Medication allergy list
1.1. Allergy- Record: Enter an allergy [Norco]
1.2. Allergy- Change: Change the status of an allergy [Norco] from active to inactive
1.3. Allergy- Access: View the history of an allergy [Norco]
§170.314(a)(6) Medication list /§170.314(b)(3) Electronic prescribing /§170.314(a)(1)(i)
Computerized provider order entry (CPOE)
2.1. Record (Medication List): Add a medication [Metformin/reported] to the patient's list of
home medications
2.2. Record (ERx/CPOE): ePrescribe a medication [Tylenol 325-600mg PO q4-6 hours]
2.3. Change (Medication List/ERx/CPOE): Change the program component for a medication
[Tylenol]
2.4. Access (Medication List/ERx/CPOE): View a list of all medications including active, inactive,
on hold and voided medications
§170.314(a)(2) Drug-drug, drug-allergy interaction checks
3.1. DUR- Drug- Drug: Indicate a Drug-Drug Interaction [Tylenol + add Warfarin] (Moderate Alert)
3.2. DUR- Drug-Allergy: Indicate a severe Drug-Allergy Alert [Warfarin + add Cipro] (Severe Alert)
– Cancel prescription prior to issue
§170.314(a)(1)(i-ii) Computerized provider order entry
4.1. CPOE- Laboratory Order-Record: Order a lab [CBC]
4.2. CPOE- Laboratory Order-Change: Delete the previous lab [CBC] order
4.3. CPOE- Laboratory Order-View: View all lab orders (resulted and non-resulted) for the active
client
4.4. CPOE- Radiology Order-Record: Add a radiological order [Chest X-Ray, Single View, Frontal]
status: planned
4.5. CPOE- Radiology Order- Access: View all outstanding radiology orders
§170.314(a)(8) Clinical Decision Support (CDS)
5.1. CDS: Lab Tests/Incorporate CCD: Incorporate a CCD into the client's record [includes lab result for HgA1c/Alert will appear as a result of import]. Navigate to the CDS module to view the alert.
5.2. CDS: Problem List/Linked Referential Materials: Add a problem [Diabetes] to the client's
problem list. Navigate to the CDS alert to view details and bibliographic information.
5.3. CDS -Medication list: Add medication [Warfarin] to the medication list, navigate to the CDS
module to view this alert
5.4. CDS-Medication Allergy List: Add an allergy [eggs] to the allergy list, navigate to the CDS
module to view this alert
5.5. CDS- Vital Signs: Add a pulse rate [220] to the vitals and measures record, navigate to the
CDS module to view this alert
5.6. CDS-Combo: Add a height [5’5’’] and weight [300lbs] to the vitals and measures record. Add a
medication [Metformin] to the medication list. Navigate to the CDS module to view this alert.
5.7. CDS-Demographics: [Female client, age 50, no prior service/med hx] Navigate to the CDS
module to view this alert.
§170.314(b)(4) Clinical Information Reconciliation (CIR)
6.1. CIR-Electronically and simultaneously display data from at least 2 sources (Medications,
Allergies, Problems): Incorporate a CCD [Browse for File/Reconcile]
6.2. CIR-Problem List: Select a problem from the imported list and add to the list of current
problems.
6.3. CIR-Allergies: Select an allergy [eggs] from the imported file and add to the list of current
allergies
6.4. CIR-Medications- Select a medication [Warfarin] which is missing from your active
medication list, add this medication to the list of current medications.
Tasks were selected based on their frequency of use, criticality of function, and their potential to be troublesome for users. In addition, tasks were constructed in light of the study objectives.
The seven areas of the system where Safety Enhanced Design was specifically applied were the
main focus of the study.
3.4. Procedure
At the beginning of each observational session, participants were greeted. Participants were then
assigned a participant ID. Each participant reviewed and signed an informed consent and release
form (included as Appendix 3) prior to the start of the observation session. A representative
from the test team witnessed the participant’s signature.
To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced, with the following qualifications:
Data Logger: Kelly Leffler
Kelly is a Registered Nursing Consultant with 8 years of nursing experience, including work in a hospital setting and serving as the lead nurse for EHR development. She earned her bachelor's degree in Nursing in 2012 and is pursuing a Master of Science in Nursing at the University of Central Florida. In her
role as lead nurse for EHR development, Kelly has provided clinical expertise to the Florida
Department of Health as well as facilitating multiple user groups and clinician based focus groups
that provide input into the development of the department’s home-grown EHR.
Administrator: Michael Cragg
Michael is a Certified Project Management Professional with 15 years of experience managing
information technology development projects. He earned his bachelor’s degree in Computer
Science Information Systems from the State University of New York in 2000 and was certified as a
Project Management Professional in 2008. As the project manager for the department’s EHR and
Health Information Exchange projects, Michael has worked extensively with the user community
as well as functioning in a facilitative role overseeing a development team of 40.
The administrator moderated the session including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on
participant comments. A second person served as the data logger and took notes on task success,
path deviations, number and type of errors, and comments.
Participants were instructed to perform the tasks (see specific instructions below):
• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think-aloud technique.
For each task, the participants were given a written copy of the task. Task timing began once the
administrator finished reading the question. The task time was stopped once the participant
indicated they had successfully completed the task. Scoring is discussed below in Section 3.10.
Following the session, the administrator gave the participant the post-test questionnaire
(included as Appendix 5), and thanked each individual for their participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire responses were recorded into a spreadsheet.
3.5. Test Location
The test facility included a quiet testing room with a table, computer for the participant, and
ample working space for the administrator and data logger. Only the participant, data logger and
administrator were in the observation room. To ensure that the environment was comfortable
for users, noise levels were kept to a minimum with the ambient temperature within a normal
range. All of the safety instruction and evacuation procedures were valid, in place, and visible to
the participants.
3.6. Test Environment
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing
was conducted in actual healthcare settings within The Florida Department of Health’s county
health departments. For testing, the computer was a Dell model running Windows 7 with
Internet Explorer 8 or higher. The participants used a mouse and keyboard when interacting with
the EHRUT. The application was set up by the Florida Department of Health clinical team
according to the documentation describing the system set-up and preparation. The application
itself was running on a Cache platform using a training / test database on a LAN connection.
Technically, the system performance (i.e., response time) was representative of what actual users
would experience in a field implementation. Additionally, participants were instructed not to
change any of the default system settings (such as control of font size).
3.7. Test Forms and Tools
During the usability test, various documents and instruments were used, including:
1. Informed Consent
2. Moderator’s Guide
3. Post-test Questionnaire
Examples of these documents can be found in Appendices 3-5, respectively.
The Moderator's Guide was devised to capture the required data. The participant's
interaction with the EHRUT was captured and recorded digitally with screen capture software
running on the test machine. A [video / web] camera recorded each participant’s facial
expressions synced with the screen capture, and verbal comments were recorded with a
microphone. The test sessions were electronically transmitted to a nearby observation room
where the data logger observed the test session.
3.8. Participant Instructions
The administrator read the following instructions aloud to each participant (included as
Appendix 4).
Introduction
I want to thank you so much for agreeing to participate in this study on the Department of Health’s
Electronic Health Record, HMS. As you may know already, the purpose of the study is to help us
understand what the user experience is like. The results will be used to help make improvements to
the design of the site.
Training Q/A
Before we get started, are there any questions regarding the training materials that you were
provided?
- If yes, spend no more than 10 minutes answering questions
- If no, proceed to test
Session Recording
(Optional - Turn on recording)
Agenda for session
What I’d like to do is ask you a few questions about your background up front. Then we’re going to
step through a series of tasks in which you will use the HMS to complete the action. I’m going to ask
you to work through the tasks the best that you can, like you would typically do without someone
watching you. At the end of each section, we will discuss any comments about the action. At the end
of the series we will talk about your overall experience with the HMS EHR and I’ll have you fill out a
short questionnaire. The whole process should take about an hour.
We’re testing our ideas, not you
I want you to know that we are testing HMS today. We are in no way testing you. There are no right or
wrong answers to the questions I’ll ask you today. We’re interested in your honest impressions, so
please feel free to share what you think of your experience. You’re not going to hurt anyone’s
feelings – we just want to find out what is and is not working so we can make it better. Some of the
functionality that you will see in the version of HMS that you work with today will not be available to
you at the county health department until further development to ensure ease of use is completed.
Discussion
Participants should not use a think-aloud protocol during the testing. Excessive verbalization or
attempts to converse with the moderator during task performance should be strongly discouraged.
Participants will naturally provide commentary, but they should do so, ideally, after the testing. Some
verbal commentary may be acceptable between tasks, but again should be minimized by the
moderator.
If at any time you need me to repeat a question, or if you want to stop or take a break, just say so and
we’ll do that. If there are any required values that are not stated in the instructions, use your best
judgment. Do you have any questions? All right, let’s get started.
Participants were then given 25 tasks to complete. Tasks are listed in the moderator’s guide in
Appendix 4.
3.9. Usability Metrics
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic
Health Records, EHRs should support a process that provides a high level of usability for all
users. The goal is for users to interact with the system effectively, efficiently, and with an
acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user
satisfaction were captured during the usability testing.
The goals of the test were to assess:
1. Effectiveness of the Florida Department of Health, Health Management System
by measuring participant success rates and errors
2. Efficiency of the Florida Department of Health, Health Management System by
measuring the average task time and path deviations
3. Satisfaction with the Florida Department of Health, Health Management System by measuring ease-of-use ratings
3.10. Data Scoring
The following table (Table 3) details how tasks were scored, errors evaluated, and time data analyzed.

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times were divided by the optimal time for each task as a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, was recorded when constructing tasks. Target task times used in the Moderator's Guide were operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 125%, which allows some time buffer because the participants were not trained to expert performance. Thus, if expert, optimal performance on a task was 100 seconds, then the allotted task time was 125 seconds. This ratio was aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This is expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types was collected.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occurred if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. All task deviations were recorded within the detailed notes of the study. Optimal paths (i.e., procedural steps) were recorded when constructing tasks.

Efficiency: Task Time
Each task was timed from when the administrator said "Begin" until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating
Participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Easy) to 5 (Very Difficult). This data was averaged across participants. Common convention is that average ratings for systems judged easy to use should be 1.8 or below. To measure participants' confidence in and likeability of the Florida Department of Health, Health Management System overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 5.

TABLE 3: Rationale and scoring.
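To make the Table 3 arithmetic concrete, here is a minimal Python sketch; the attempt records, field names, and benchmark values are hypothetical stand-ins, not the study's actual logs:

```python
# Minimal sketch of the Table 3 scoring arithmetic. The per-attempt
# records and field names below are hypothetical, not the study's
# actual data-logging format.
from statistics import mean, stdev

attempts = [  # one record per participant attempt at a single task
    {"success": True,  "seconds": 75, "steps": 10},
    {"success": True,  "seconds": 60, "steps": 8},
    {"success": False, "seconds": None, "steps": 12},  # no time kept for failures
]
OPTIMAL_SECONDS, OPTIMAL_STEPS = 40, 7  # expert benchmarks set when constructing the task

# Allotted time = expert optimal time multiplied by the 125% buffer.
allotted_seconds = OPTIMAL_SECONDS * 1.25

# Effectiveness: successes divided by attempts, as a percentage.
success_pct = 100 * sum(a["success"] for a in attempts) / len(attempts)

# Efficiency: mean time over successful attempts only, with its
# deviation from optimal as a percentage; same ratio for path steps.
times = [a["seconds"] for a in attempts if a["success"]]
mean_time, time_sd = mean(times), stdev(times)
time_dev_pct = 100 * (mean_time - OPTIMAL_SECONDS) / OPTIMAL_SECONDS
mean_steps = mean(a["steps"] for a in attempts)
step_dev_pct = 100 * (mean_steps - OPTIMAL_STEPS) / OPTIMAL_STEPS

print(f"success {success_pct:.0f}%, time {mean_time:.1f}s (sd {time_sd:.1f}, "
      f"+{time_dev_pct:.1f}%), path {mean_steps:.2f}/{OPTIMAL_STEPS} (+{step_dev_pct:.1f}%)")
```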
4. Results
4.1. Data Analysis and Reporting
The results of the usability test were calculated according to the methods specified in the
Usability Metrics section above. Participants who failed to follow session and task instructions
had their data excluded from the analyses.
During the usability study, one participant demonstrated an unwillingness to cooperate
effectively with the intent of the study. Training materials provided in advance of the
observation were not reviewed and the participant was reluctant to utilize them during the
observation. As a result, the participant skipped a very high number of tasks without making a legitimate effort to accomplish them. For those reasons, the evaluation team felt it would not be
worthwhile to include this participant’s study results within the EHR Usability Test Report.
To facilitate the usability tests of the Clinical Decision Support modules, data was preloaded to
simulate typical clinical scenarios. The participants were asked to add additional data or change
existing data in order to initiate each alert. Following the firing of the alerts, the participants
were asked to use the application to acknowledge each of the alerts. It was determined during
the test that data required for the firing of the Problem List-based alert was not configured
correctly. Therefore, the tasks related to this alert were not included within the overall mean
scores tabulated. The task is represented in the documentation with a null score. The evaluation
team felt that there were a sufficient number of tasks related to the Clinical Decision Support
module to enable a critical evaluation of the usability without inclusion of the Problem List-based
alert.
The usability testing results for the EHRUT are detailed below (see Table 4). The results should be
seen in light of the objectives and goals outlined in Section 3.2 Study Design. The data should yield actionable findings that, when corrected, will have a material, positive impact on user performance.
Task | # | Task Success (%) | Path Deviation (Observed/Optimal) | Path Deviation (%) | Task Time Mean (s) | Optimal Time (s) | Task Time Deviation (%) | Errors Mean | Task Rating Mean (1=Easy)
Allergy-Record | 1.1 | 100% | 9.3/7 | +32% | 75 | 40 | +87.5% | 1.16 | 2.3
Allergy-Change | 1.2 | 83% | 4.8/4 | +20% | 58 | 9 | +500% | 1.4 | 1.7
Allergy-Access | 1.3 | 100% | 2.5/2 | +25% | 18 | 3 | +500% | .5 | 1.3
Medication-Record | 2.1 | 67% | 16.5/12 | +37.5% | 177 | 54 | +227% | 1.8 | 3
Medication-Prescribe | 2.2 | 100% | 12.83/11 | +16.6% | 135 | 54 | +150% | 1 | 2.5
Medication-Access | 2.3 | 100% | 2.5/2 | +25% | 38 | 8 | +375% | 1.16 | 2
Medication-Change | 2.4 | 100% | 4.66/4 | +15% | 62 | 13 | +377% | 3.33 | 1.83
DUR-Drug-Drug | 3.1 | 83% | 10.8/9 | +20% | 116 | 66 | +75.7% | 1.6 | 2.4
DUR-Drug-Allergy | 3.2 | 100% | 18.75/15 | +25% | 110 | 51 | +115% | 1.16 | 2
CPOE-Lab Record | 4.1 | 100% | 20.58/13 | +58.3% | 171 | 35 | +389% | 2.16 | 2
CPOE-Lab Change | 4.2 | 100% | 7/7 | 0% | 35 | 10 | +250% | 0 | 1.5
CPOE-Lab Access | 4.3 | 67% | 11.9/7 | +70% | 83 | 49 | +69.3% | 2.4 | 2.6
CPOE-Rad Record | 4.4 | 67% | 10.5/7 | +50% | 180 | 53 | +239% | 1.6 | 2.8
CPOE-Rad Access | 4.5 | 83% | 13.2/11 | +20% | 126 | 31 | +306% | 1 | 2.6
CDS-Lab | 5.1 | 100% | 2.2/2 | +10% | 57 | 10 | +470% | .66 | 1.66
CDS-Problem | 5.2 | 0%* | 0* | 0%* | 0* | 0* | 0%* | 0* | 0*
CDS-Med Allergy | 5.3 | 83% | 28.6/26 | +10% | 68 | 31 | +119% | .2 | 1.6
CDS-Vitals | 5.4 | 100% | 10.83/10 | +8% | 50 | 19 | +163% | .33 | 2
CDS-Combo | 5.5 | 67% | 9/9 | 0% | 50 | 30 | +66.6% | 0 | 1.5
CDS-Demographics | 5.6 | 100% | 6.5/6 | +8% | 53 | 31 | +70.9% | 0 | 1.5
CDS-Medication | 5.7 | 83% | 8.8/8 | +10% | 112 | 148 | -24.3% | 0 | 2.2
Reconciliation-Select | 6.1 | 100% | 5.42/5 | +8% | 81 | 15 | +440% | .5 | 2.33
Reconciliation-Problems | 6.2 | 100% | 3/3 | 0% | 45 | 5 | +800% | 0 | 2.16
Reconciliation-Allergies | 6.3 | 100% | 5/4 | +25% | 76 | 9 | +744% | 1.16 | 2.66
Reconciliation-Meds | 6.4 | 100% | 5/4 | +25% | 62 | 10 | +520% | 1 | 2
TABLE 4: EHRUT Performance and Rating data detail
*Technical problems within the application environment prevented all participants from completing this task. The metrics for this task were not included within any study data components.
The SUS (System Usability Scale) score for subjective satisfaction with the system, based on performance with these tasks, was 58 (as seen in Table 5 below). Broadly interpreted, scores under 68 represent systems with below-average usability; scores over 68 would be considered above average.
System Usability Score Results

# | Statement | Score Contribution Sum
1 | I think that I would like to use this system frequently | 15.0
2 | I found the system unnecessarily complex | 21.0
3 | I thought the system was easy to use | 15.0
4 | I think that I would need the support of a technical person to be able to use this system | 15.0
5 | I found the various functions in this system were well integrated | 15.0
6 | I thought there was too much inconsistency in this system | 18.0
7 | I would imagine that most people would learn to use this system very quickly | 13.0
8 | I found the system very cumbersome to use | 20.0
9 | I felt very confident using the system | 15.0
10 | I needed to learn a lot of things before I could get going with this system | 16.0
Total Score Contributions: 163
System Usability Score: 58
TABLE 5: System Usability Scale Results
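As a quick arithmetic check on Table 5, the reported score follows from the standard SUS scaling, assuming (as the extracted layout suggests) seven scored respondent columns:

```python
# Arithmetic check on Table 5 under the assumption that the seven
# "User" columns shown each contributed a full set of responses.
total_contributions = 163   # "Total Score Contributions" row of Table 5
respondents = 7             # User 1..7 columns as extracted
sus = total_contributions * 2.5 / respondents   # standard SUS scaling
print(round(sus, 1))        # -> 58.2, matching the reported score of 58
```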
4.2. Discussion of the Findings
4.2.1. EFFECTIVENESS
The data collected regarding task success and path deviations, represented in TABLE 4: EHRUT Performance and Rating data detail, is a good indicator of the effectiveness of the functionality present in the application. For the purposes of this study, we consider a task success percentage of 83% or better acceptable. The following tasks demonstrate 'ineffective' functionality in the application:
2.1 Record (Medication List) – (67%)
4.3 CPOE- Laboratory Order-View – (67%)
4.4 CPOE- Radiology Order-Record – (67%)
5.5 CDS-Combo – (67%)
As acknowledged below in 'Areas for Improvement', and as witnessed by the study administrator and data logger, the disjointed nature of the application's ordering modules caused a reduced success percentage on these tasks. Some of these tasks also had a heightened degree of path deviation, which is indicative of the participants' need to 'look' for what they were trying to find. Some participants abandoned these tasks, which can explain why only some and not all of the path deviation scores were elevated.
4.2.2. EFFICIENCY
The data collected regarding task time and path deviations, represented in TABLE 4: EHRUT Performance and Rating data detail, is a good indicator of the efficiency of the functionality present in the application. For the purposes of this study, we consider a task time deviation of +300% or more a cause for concern: such a task required more than three times as long to complete as when performed by the 'expert' user. The following tasks demonstrate 'inefficient' functionality in the application:
1.2 Allergy- Change – (500%)
1.3 Allergy- Access – (500%)
2.3 Change (Medication List/ERx/CPOE) – (375%)
2.4 Access (Medication List/ERx/CPOE) – (377%)
4.1 CPOE- Laboratory Order-Record – (389%)
4.5 CPOE- Radiology Order- Access – (306%)
5.1 CDS: Lab Tests/Incorporate CCD – (470%)
6.1 CIR-Electronically and simultaneously display data from at least 2 sources (Medications,
Allergies, Problems) – (440%)
6.2 CIR-Problem List – (800%)
6.3 CIR-Allergies – (744%)
6.4 CIR-Medications – (520%)
When analyzing the data, some consideration must be given to task numbers 5.1, 6.1, 6.2, 6.3
and 6.4 as these tasks are related to an entirely new module. Besides the pre-study training
materials, no study participants had any prior exposure to these modules.
Within the allergy module, although the tasks themselves were not long, the study
participants did struggle to determine exactly how to perform a task. The module should be reviewed from this perspective to determine whether there are ways to make the module's functionality more apparent and easier to use. The application employs a coded data set that is
used for defining allergies. This caused some extended task times as participants needed
additional time to identify the specific allergy within a long list; however, this was not a
contributor to the time variance for the change and access tasks.
Within the medication module, the functions that permit the accessing of medication history
and updating of a medication order are not initiated from the same location where the
original medication orders are generated. Although seasoned users may be familiar with the
design, it is not intuitive and caused delays in the study participants' ability to complete the
task.
The radiology ordering module is one of the oldest in the application, and its design
differs markedly from the more recently created modules. This variance in design, along
with the fact that none of the study participants had ever used the module, is the likely
cause of the time variance.
The lab ordering process also demonstrated an extraordinarily high task time variance.
The administrator and data logger noted that the cause was likely the design of the task
itself: study participants were instructed to select a lab order that was difficult to find.
All participants had trouble locating the particular test to order, and as a result the task
took longer than expected. The remaining activities in the task were not problematic for
the study participants.
4.2.3. SATISFACTION
The data collected regarding system usability, represented in TABLE 5: System
Usability Scale Results, is a good indicator of the satisfaction that users derive from using
the application. Broadly interpreted, scores under 68 represent systems with below-average
usability; scores over 68 would be considered above average. This system’s score was 10%
below the benchmark of 68, indicating a less than acceptable degree of user satisfaction
with the system (a sketch of the standard SUS scoring calculation appears at the end of
this section).
Based on the SUS results, Study Participants indicated that:
• The system was not easy to use.
• They would need the support of a technical person to use the system.
• Functions within the system were not well integrated.
• People would not learn to use the system quickly.
• They did not feel confident using the system.
Evaluation of the study participants’ task ratings indicates that participants do not find
the system easy to use overall: 16 out of 25 tasks had an average score of 2 or higher,
and no task was rated as easy by all study participants. It is highly likely that the
efficiency and effectiveness concerns cited above are a major contributor to the overall
lack of user satisfaction. Users do not feel confident that they will be able to easily use the
system to get their work completed.
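For reference, a minimal sketch of the standard SUS scoring rule (Brooke’s method) that produces the 0–100 scores discussed above; the example ratings are hypothetical and are not drawn from this study’s data.

```python
# Hedged sketch of standard SUS scoring: odd-numbered items are positively
# worded (item score = rating - 1), even-numbered items are negatively
# worded (item score = 5 - rating); the summed item scores are scaled by
# 2.5 onto a 0-100 range.

def sus_score(ratings: list[int]) -> float:
    """Compute one participant's SUS score from ten 1-5 ratings."""
    assert len(ratings) == 10, "SUS uses exactly ten items"
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(ratings, start=1))
    return total * 2.5

# Hypothetical participant: these ratings yield 60.0, below the 68
# benchmark and close to the '10% below 68' result reported above.
print(sus_score([4, 2, 3, 2, 3, 3, 3, 2, 3, 3]))  # -> 60.0
```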
4.2.4. MAJOR FINDINGS
The following are the major findings of the study based on interpretation of the qualitative
findings, verbal reports of the participants and observations from the administrator and data
logger.
• Users experienced inconsistency in the look, feel and functionality of pages within the
application. The application exhibits design inconsistencies in both form and function,
with several visual design strategies employed application-wide. Although these
inconsistencies are primarily superficial, they increase the time it takes a user to
become familiar with the application. The controls used on the application’s pages do not
always react the same way to user interaction: a library of standard controls has not
been implemented, and as a result the behavior of these controls varies from page to
page within the application. For example, a text box in the application will sometimes
resize itself based on data input and sometimes not. Type-ahead functionality for drop-down
lists is another example where inconsistency was observed.
• The Computerized Provider Order Entry process is spread across multiple pages,
creating difficulty in tracking new orders, orders yet to be completed, orders in process
and resulted orders. Users expect that if information is available within the
application, it will be easily accessible from the same general location. Some users may
venture out and look for additional information if they know it is there, but the majority
of users tend to assume that the information is not available.
• Efficiency is diminished when users are required to navigate to a separate page to
complete workflow tasks. This is most apparent in the additional information required in
e-prescribing and in navigating to the Clinical Decision Support module. The extra
navigation may also result in some users disregarding the alerts, greatly diminishing
their value. However, the application did employ a good strategy in that the Clinical
Decision Support module does not rely on pop-up messages to alert users. Many of the
participants commented that they liked the way the alerts were presented but wished
they were more closely integrated into the clinical workflow.
• Clinical Decision Support (CDS) rules are inconsistent in the criteria that cause them to
fire. Efficiency can be gained in this module by adding a severity to each alert and
allowing the user to take action on the CDS list page. Indicators of clinical decision
support functions should be enhanced to ensure that they are present in the applicable
workflow. Some users indicated that in the case of a very severe alert, the application
should require some action before proceeding.
4.2.5. AREAS FOR IMPROVEMENT
Based upon participant feedback, the major areas of improvement for the EHRUT have been
identified as: application consistency, workflow integration, consolidation of the CPOE
process, and enhancements to new functionality, including Clinical Decision Support.
Application Consistency
As users navigated through the application, the study administrator and data logger observed
inconsistencies in the way information is displayed and in the way users interact
with the application. For example, in the Allergy module and in the Clinical Visit, selections
from a table are captured via the ‘look-ahead’ search function, whereas the application’s
Record a Service order entry page requires the user to enter the first few characters and
press Enter. Users who were not familiar with the Record a Service page expected the
search function to operate in the same manner as the Clinic Visit; as a result, they
assumed that the item they were looking for was not present in the list. When allowed,
users often responded by creating a free-text entry for the item. The team should
implement a standard set of controls that exhibit consistent behavior, which will result in a
greater degree of predictability within the application.
Another inconsistency discovered in the application is the presence of outdated functionality
in the Allergy module. When attempting to add allergies, some users navigated to the old
functionality, which displays incomplete data. This is recognized as a safety concern. As
the application evolves, effort should be made to retire such pages completely. This
may require adding functionality to other areas of the application, but it will cut down on
‘dead ends’ within the application.
Consolidation of Ordering Information for CPOE
The Clinician Portal contains a menu selection for orders, with pages for lab,
medication and radiology orders as well as in-house services. The links from this section
direct the user to the order entry page, and a separate workflow must be followed to see
outstanding orders and results. The disjointed nature of CPOE order entry in HMS may
result in duplicate orders being entered and/or results being missed. The ordering functions
were integrated into the application over several years, and it shows: no effort was made
to create a single ordering interface, and users often express the desire to have one.
Although it may require a significant effort, it is recommended that a consolidated ordering
interface be developed that provides users the ability to generate new orders and check the
status of existing orders.
Workflow
The Clinical Decision Support (CDS) module contains all clinical decision alerts for a patient.
The user is alerted that CDS alerts exist for the patient by an alert counter located on
the clinic visit banner. One user noted that she would not have seen the alert regarding
a flu vaccine allergy in her normal workflow for that type of visit. It is recommended
that the presence of alerts be indicated in multiple places within the application to ensure
that users executing a variety of workflows have the opportunity to benefit from the
information.
In the e-prescribing module, Florida Department of Health users are required to capture
organization-specific information, including program component, sub-program component
and shipping indicators. This information is not part of the industry standard and is therefore
not present on the main prescription page. The user is forced to navigate to a separate
screen, known as “Additional Information”, to enter this information. This screen was
commonly missed by users, requiring additional steps to be taken. A different method of
capturing similar information is used in the electronic lab ordering module. Study participants
indicated that they would like to see changes in the way this data is collected so that it does
not result in several additional steps. One participant indicated that it would be an
improvement if the collection method used in the two areas were at least consistent.
Clinical Decision Support
The clinical decision support module is designed to alert clinicians to take action based upon
values entered into the EHR, including demographics, allergies, medications, vital signs,
problems or a combination of this information. Study participants requested that a severity
be added to alerts in the clinical decision support module to decrease alert fatigue. The
majority of users completed clinical decision support tasks without difficulty; however,
inconsistencies in the triggering criteria (some CDS alerts require that a clinical visit be
added after the information is entered) added non-value-added steps to some alerts. One
user suggested that CDS alerts be handled directly from the list page to prevent the user
from having to access the detail of each alert. Efficiency can be gained in this module by
adding a severity to each alert and allowing the user to take action on the CDS list page.
Indicators of clinical decision support functions should be enhanced to ensure that they are
present in the applicable workflow.
5. Appendices
The following appendices provide supplemental data for this usability test report:
• Sample Recruiting screener
• Participant demographics
• Non-Disclosure Agreement (NDA) and Informed Consent Form
• Example Moderator’s Guide
• System Usability Scale Questionnaire
A. Appendix 1: Recruiting screener
Recruiting Script for Florida Department of Health Electronic Health Record Usability Study
Hello, my name is Kelly Leffler, calling from the Bureau of Clinical Management and Informatics in the
Division of Public Health Statistics and Performance Management within the Florida Department of
Health. We are recruiting individuals to participate in a usability study for the Health Management
System electronic health record. We would like to ask you a few questions to see if you qualify and if
you would like to participate. This should only take a few minutes of your time. This is strictly for research
purposes. If you are interested and qualify for the study, we would welcome your participation in the
study and you will be able to meet the requirements of the study within the normal course of your
work day. Can I ask you a few questions?
1. Are you male or female? [We are attempting to recruit a mix of participants]
_______________________________________________________________________________
2. Have you participated in a focus group or usability test in the past 12 months? [If
yes, Terminate]
_______________________________________________________________________________
3. Do you, or does anyone in your home, work in marketing research, usability research,
web design […etc.]? [If yes, Terminate]
_______________________________________________________________________________
4. Do you, or does anyone in your home, have a commercial or research interest in an
electronic health record software or consulting company? [If yes, Terminate]
_______________________________________________________________________________
5. Which of the following best describes your age? [23 to 39; 40 to 59; 60 to 74; 75 and
older] [We are attempting to recruit a mixed demographic]
_______________________________________________________________________________
6. Which of the following best describes your race or ethnic group? [e.g., Caucasian,
Asian, Black/African-American, Latino/a or Hispanic, etc.]
_______________________________________________________________________________
7. Do you require any assistive technologies to use a computer? [if so, please describe]
_______________________________________________________________________________
Professional Demographics
8. What is your current position and title? (Must be healthcare provider or system administrator)
RN: Specialty ___________________________
Physician: Specialty ___________________________
Resident: Specialty ___________________________
Administrative Staff ___________________________
Other [Terminate]
9. How long have you held this position?
_______________________________________________________________________________
10. Describe your work location (or affiliation) and environment.
_______________________________________________________________________________
11. Which of the following describes your highest level of education? [e.g., high school
graduate/GED, some college, college graduate (RN, BSN), postgraduate (MD/PhD),
other (explain)]
_______________________________________________________________________________
Computer Expertise
12. Besides reading email, what professional activities do you do on the computer? [e.g.,
access EHR, research; reading news; shopping/banking; digital pictures;
programming/word processing, etc.] [If no computer use at all, Terminate]
_______________________________________________________________________________
13. About how many hours per week do you spend on the computer? [Recruit according to
the demographics of the intended users, e.g., 0 to 10, 11 to 25, 26+ hours per week]
_______________________________________________________________________________
14. What computer platform do you usually use? [e.g., Mac, Windows, etc.]
_______________________________________________________________________________
15. What Internet browser(s) do you usually use? [e.g., Firefox, IE, AOL, etc.]
_______________________________________________________________________________
16. In the last month, how often have you used an electronic health record?
_______________________________________________________________________________
17. How many years have you used an electronic health record?
_______________________________________________________________________________
18. How many EHRs do you use or are you familiar with?
_______________________________________________________________________________
Contact Information
[If the person matches our qualifications, ask:]
Those are all the questions I have for you. Your background matches the people we're
looking for. Would you be able to participate on [date, time]? [If so collect contact
information]
May I get your contact information?
Name of participant: _____________________________________
Address: _____________________________________
City, State, Zip: _____________________________________
Daytime phone number: _____________________________________
Evening phone number: _____________________________________
Alternate [cell] phone number: _____________________________________
Email address: _____________________________________
B. Appendix 2: Participant demographics

Part ID      Gender  Age    Education          Occupation/Role                          Professional  Computer    Product     Assistive Technology
                                                                                        Experience    Experience  Experience  Needs
3456777516   Male    40-49  postgraduate       Physician                                15-20 yrs.    Extensive   Extensive   None
3437124919   Female  50-59  postgraduate       Registered Nurse                         20-25 yrs.    Moderate    Moderate    None
3430418030   Female  30-39  college graduate   Registered Nurse                         10-15 yrs.    Minimal     Minimal     None
3428591363   Female  >60    postgraduate       Advanced Registered Nurse Practitioner   >25 yrs.      Extensive   Moderate    None
3478167954   Female  40-49  postgraduate       Physician                                15-20 yrs.    Moderate    Extensive   None
3495586237   Female  30-39  some college       Licensed Practical Nurse                 10-15 yrs.    Moderate    Moderate    None
C. Appendix 3: Informed Consent Form
The Florida Department of Health would like to thank you for participating in this study.
The purpose of this study is to evaluate the Health Management System electronic
health record system. If you decide to participate, you will be asked to perform several
tasks using the prototype and give your feedback. The study will last about 60 minutes.
Agreement
I understand and agree that as a voluntary participant in the present study conducted by the
Florida Department of Health, I am free to withdraw consent or discontinue participation at
any time. I understand and agree to participate in the study conducted and agree to be
observed by the Florida Department of Health.
I understand and consent to the use and release of any and all information collected by
the Florida Department of Health. I understand that the information is for research
purposes only and that my name and image will not be used for any purpose other than
research. I relinquish any rights to the information and understand the information may be
used by the Florida Department of Health without further permission.
I understand and agree that the purpose of this study is to make software applications
more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside
of the Florida Department of Health. I understand and agree that data confidentiality is
assured, because only de-identified data – i.e., identification numbers, not names – will be
used in the analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study
administrator. I understand that I can leave at any time.
Please check one of the following:
____YES, I have read the above statement and agree to be a participant.
____NO, I choose not to participate in this study.
Signature:_______________________________________________________
D. Appendix 4: Moderator’s Guide
See ‘EHR Usability Test Moderator’s Guide’ included as an external attachment.
E. Appendix 5: System Usability Scale Questionnaire

#    Statement                                                          Strongly Disagree ... Strongly Agree
1    I think that I would like to use this system frequently                  1    2    3    4    5
2    I found the system unnecessarily complex                                 1    2    3    4    5
3    I thought the system was easy to use                                     1    2    3    4    5
4    I think that I would need the support of a technical
     person to be able to use this system                                     1    2    3    4    5
5    I found the various functions in this system were
     well integrated                                                          1    2    3    4    5
6    I thought there was too much inconsistency in this system                1    2    3    4    5
7    I would imagine that most people would learn to use
     this system very quickly                                                 1    2    3    4    5
8    I found the system very cumbersome to use                                1    2    3    4    5
9    I felt very confident using the system                                   1    2    3    4    5
10   I needed to learn a lot of things before I could get
     going with this system                                                   1    2    3    4    5