Report - Drummond Group

Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144 Rev 01-Nov-2014
ONC HIT Certification Program
Part 1: Product and Developer Information
1.1 Certified Product Information
Product Name: TransMed Client Software
Product Version: 5.0
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: TransMed Network
Address: 201 South Buena Vista St., Suite #245, Burbank, CA 91505
Website: http://www.transmed.net
Email: [email protected]
Phone: (877) 999-8633
Developer/Vendor Contact: Jay Luong
Part 2: ONC-Authorized Certification Body Information
2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

ONC-ACB Authorized Representative: Bill Smith
Function/Title: Certification Committee Chair
Signature and Date: 12/8/2014
2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification
§170.314:
[x] (a)(1)    [ ] (a)(17)   [x] (d)(5)
[x] (a)(6)    [ ] (b)(5)*   [x] (d)(6)
[x] (a)(7)    [ ] (d)(1)    [x] (d)(8)
[x] (d)(9)    [x] (f)(1)
*Gap certification allowed for Inpatient setting only
[ ] No gap certification
2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification
§170.314 criteria listed, none checked: (a)(1) through (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1) through (b)(5), (b)(6) Inpt. only, (b)(7), (c)(1) through (c)(3), (d)(1) through (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1) through (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1) through (g)(4)
[x] No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: SG-12022014-2589
Test Date(s): 10/23/2014, 10/24/2014, 11/5/2014, 12/1/2014
3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 512-335-5606
ATL Contact: Beth Morrow

For more information on scope of accreditation, please reference NVLAP Lab Code 200979-0.

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

ATL Authorized Representative: Sonia Galvan
Function/Title: Test Proctor
Location Where Test Conducted: Houston, TX
Signature and Date: 12/8/2014
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification
Additional Software | Applicable Criteria | Functionality Provided by Additional Software
NewCrop | 170.314.a.1, 2, 10; 170.314.b.3 | e-Prescribing
EMR Direct phiMail | 170.314.b.1, 2; 170.314.e.1 | HISP
PopHealth | 170.314.c.1-3 | Calculate Clinical Quality Measures
[ ] No additional software required
3.2.2 Test Tools
Test Tool | Version
[x] Cypress | 2.4.1
[x] ePrescribing Validation Tool | 1.0.4
[ ] HL7 CDA Cancer Registry Reporting Validation Tool | 1.0.3
[ ] HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.8
[x] HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.8
[x] HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 1.7
[x] HL7 v2 Syndromic Surveillance Reporting Validation Tool | 1.7
[x] Transport Testing Tool | 179
[x] Direct Certificate Discovery Tool | 3.0.2
[ ] No test tools required
3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
No alteration (customization) to the test data was necessary
3.2.4 Standards
3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that has been successfully tested where more than one standard is permitted ([x] marks the standard successfully tested):

(a)(8)(ii)(A)(2):
- §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
- §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13):
- [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
- §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

(a)(15)(i):
- [x] §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
- §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii):
- §170.210(g) Network Time Protocol Version 3 (RFC 1305)
- §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A):
- §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
- [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i):
- §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
- [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i):
- Annex A of the FIPS Publication 140-2 (encryption and hashing algorithms): AES, SHA-1

(e)(1)(ii)(A)(2):
- §170.210(g) Network Time Protocol Version 3 (RFC 1305)
- [x] §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii):
- Annex A of the FIPS Publication 140-2 (encryption and hashing algorithms): AES, SHA-1

Common MU Data Set (15):
- [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
- §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

[ ] None of the criteria and corresponding standards listed above are applicable
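For the (e)(1)(i) and (e)(3)(ii) rows above, the attested FIPS 140-2 Annex A algorithms are AES and SHA-1. As a minimal illustration of the hashing algorithm named there, Python's standard hashlib exposes SHA-1; the audit-log entry below is hypothetical and unrelated to the certified product:

```python
import hashlib

# Hypothetical audit-log entry; SHA-1 is one of the FIPS 140-2 Annex A
# hashing algorithms listed in the report.
entry = b"2014-12-08T10:15:00Z user=jdoe action=view-chart"
digest = hashlib.sha1(entry).hexdigest()

# SHA-1 always yields a 160-bit digest (40 hex characters).
print(len(digest))  # 40
```

Note that SHA-1 is shown only because the 2014 Edition report names it; current FIPS guidance favors the SHA-2 family for new systems.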
3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:
Newer Version | Applicable Criteria
No newer version of a minimum standard was tested
3.2.5 Optional Functionality
Criterion # | Optional Functionality Successfully Tested
[x] (a)(4)(iii) | Plot and display growth charts
[ ] (b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[ ] (b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[x] (f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)
[ ] No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested
Criteria # | TP** Version | TD*** Version ([x] = successfully tested)
[ ] (a)(1) | TP 1.2
[x] (a)(2) | TP 1.2
[x] (a)(3) | TP 1.2 | TD 1.4
[x] (a)(4) | TP 1.4 | TD 1.3
[x] (a)(5) | TP 1.4 | TD 1.3
[ ] (a)(6) | TP 1.3 | TD 1.4
[ ] (a)(7) | TP 1.3 | TD 1.3
[x] (a)(8) | TP 1.2
[x] (a)(9) | TP 1.3 | TD 1.3
[x] (a)(10) | TP 1.2 | TD 1.4
[x] (a)(11) | TP 1.3
[x] (a)(12) | TP 1.3
[x] (a)(13) | TP 1.2
[x] (a)(14) | TP 1.2
[x] (a)(15) | TP 1.5
[ ] (a)(16) Inpt. only | TP 1.3
[ ] (a)(17) Inpt. only | TP 1.2
[x] (b)(1) | TP 1.7 | TD 1.4
[x] (b)(2) | TP 1.4 | TD 1.6
[x] (b)(3) | TP 1.4 | TD 1.2
[x] (b)(4) | TP 1.3 | TD 1.4
[x] (b)(5) | TP 1.4 | TD 1.7
[ ] (b)(6) Inpt. only | TP 1.3 | TD 1.7
[x] (b)(7) | TP 1.4 | TD 1.6
[x] (c)(1) | TP 1.6 | TD 1.6
[x] (c)(2) | TP 1.6 | TD 1.6
[x] (c)(3) | TP 1.6 | TD 1.5
[x] (d)(1) | TP 1.2 | TD 1.6
[x] (d)(2) | TP 1.5
[x] (d)(3) | TP 1.3
[x] (d)(4) | TP 1.3
[ ] (d)(5) | TP 1.2
[ ] (d)(6) | TP 1.2
[ ] (d)(7) | TP 1.2
[ ] (d)(8) | TP 1.2
[ ] (d)(9) Optional | TP 1.2
[x] (e)(1) | TP 1.8 | TD 1.5
[x] (e)(2) Amb. only | TP 1.2 | TD 1.6
[x] (e)(3) Amb. only | TP 1.3
[ ] (f)(1) | TP 1.2 | TD 1.2
[x] (f)(2) | TP 1.3 | TD 1.7.1
[x] (f)(3) | TP 1.3 | TD 1.7
[ ] (f)(4) Inpt. only | TP 1.3 | TD 1.7
[ ] (f)(5) Optional & Amb. only | TP 1.2 | TD 1.2
[ ] (f)(6) Optional & Amb. only | TP 1.3 | TD 1.0.3
[ ] (g)(1) | TP 1.7 | TD 1.9
[x] (g)(2) | TP 1.7 | TD 1.9
[x] (g)(3) | TP 1.3
[x] (g)(4) | TP 1.2
[ ] No criteria tested
*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested:
[x] Ambulatory
[ ] Inpatient
[ ] No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs successfully tested (CMS ID and version): CMS2 v3, CMS22, CMS50, CMS90, CMS122 v2, CMS131 v2, CMS155 v2, CMS156 v2, CMS165

Ambulatory CQMs listed but not tested: CMS52, CMS56, CMS61, CMS62, CMS64, CMS65, CMS66, CMS68, CMS69, CMS74, CMS75, CMS77, CMS82, CMS117, CMS123, CMS124, CMS125, CMS126, CMS127, CMS128, CMS129, CMS130, CMS132, CMS133, CMS134, CMS135, CMS136, CMS137, CMS138, CMS139, CMS140, CMS141, CMS142, CMS143, CMS144, CMS145, CMS146, CMS147, CMS148, CMS149, CMS153, CMS154, CMS157, CMS158, CMS159, CMS160, CMS161, CMS163, CMS164, CMS166, CMS167, CMS169, CMS177, CMS179, CMS182

Inpatient CQMs listed, none tested: CMS9, CMS26, CMS30, CMS31, CMS32, CMS53, CMS55, CMS60, CMS71, CMS72, CMS73, CMS91, CMS100, CMS102, CMS104, CMS105, CMS107, CMS108, CMS109, CMS110, CMS111, CMS113, CMS114, CMS171, CMS172, CMS178, CMS185, CMS188, CMS190
3.2.8 Automated Numerator Recording and Measure Calculation
3.2.8.1 Automated Numerator Recording
Automated Numerator Recording Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3): none checked
[x] Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation
Automated Measure Calculation Successfully Tested:
[x] (a)(1)   [x] (a)(9)   [ ] (a)(16)  [ ] (b)(6)
[x] (a)(3)   [x] (a)(11)  [ ] (a)(17)  [x] (e)(1)
[x] (a)(4)   [x] (a)(12)  [x] (b)(2)   [x] (e)(2)
[x] (a)(5)   [x] (a)(13)  [x] (b)(3)   [x] (e)(3)
[x] (a)(6)   [x] (a)(14)  [x] (b)(4)
[x] (a)(7)   [x] (a)(15)  [x] (b)(5)
[ ] Automated Measure Calculation was not tested
3.2.9 Attestation
Attestation Forms (as applicable) | Appendix
[x] Safety-Enhanced Design* | A
[x] Quality Management System** | B
[x] Privacy and Security | C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product
3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID | Description of Change | Date
2014 Edition Test Report Summary | (no entries) | (no entries)
TransMed Client Software 5 (CS5) Safety-Enhanced Design Usability Report

Report based on NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records

Product: TransMed Client Software 5 (CS5)
Date of Usability Test: multiple tests were conducted between July 24, 2014 and October 1, 2014
Date of Report: October 7, 2014
Report Prepared By: TransMed Network, 201 South Buena Vista St., Suite #425, Burbank, CA 91505

Table of Contents
1. EXECUTIVE SUMMARY
2. INTRODUCTION
3. USER-CENTERED DESIGN PROCESS
4. METHOD
   4.1 PARTICIPANTS
   4.2 STUDY DESIGN
   4.3 TASKS
   4.4 PROCEDURE
   4.5 TEST LOCATION
   4.6 TEST ENVIRONMENT
   4.7 TEST FORMS AND TOOLS
   4.8 PARTICIPANT INSTRUCTIONS
   4.9 USABILITY METRICS
5. RESULTS
   5.1 DATA ANALYSIS AND REPORTING
   5.2 DISCUSSION OF THE FINDINGS
       5.2.1 MAJOR FINDINGS AND AREAS FOR IMPROVEMENT
6. APPENDICES
   6.1 APPENDIX 1: RECRUITING SCREENER
   6.2 APPENDIX 2: PARTICIPANT DEMOGRAPHICS
   6.3 APPENDIX 3: NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM
   6.4 APPENDIX 4: MODERATOR'S GUIDE
   6.5 APPENDIX 5: SYSTEM USABILITY SCALE QUESTIONNAIRE
   6.6 APPENDIX 6: INCENTIVE RECEIPT
1. EXECUTIVE SUMMARY

TransMed Network conducted usability testing of Client Software 5 (CS5) on a set of features outlined in §170.314.g.3. The purpose of this test was to validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, ten healthcare providers matching the target demographic criteria served as participants and used the EHRUT in simulated but representative tasks.

This study collected performance data on tasks typically conducted on an EHR under the following seven sections:
● Computerized provider order entry
● Drug-drug, drug-allergy interaction checks
● Medication list
● Medication allergy list
● Clinical decision support
● Electronic prescribing
● Clinical information reconciliation

During the one-hour, one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 3); they were instructed that they could withdraw at any time. Participants had at least one year of prior experience with TransMed Client Software. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the Client Software. During the testing, the administrator timed the tasks while data loggers recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task.

The following types of data were collected for each participant:
● Number of tasks successfully completed within the allotted time without assistance
● Time to complete the tasks
● Number and types of errors
● Path deviations
● Participant's verbalizations
● Participant's satisfaction ratings of the system

Source: http://www.nist.gov/healthcare/usability/upload/LowryNISTIR-7742Customized_CIF_Template_for_EHR_Usability_Testing_Publicationl_Version-doc.pdf

All participant data was de-identified: no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire and were compensated with $75 for their time. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the Client Software.

Following is a summary of the performance and rating data collected on the Client Software.

Test results summary (M = mean, SD = standard deviation; Task Time in seconds; Task Efficiency rated 1-5 with 1 = Very Efficient; Task Ratings rated 1-5 with 1 = Very Easy):

Criterion / Task | Success | Path Deviations | Task Time (s) | Errors | Task Efficiency | Task Rating
CPOE (314.a.1): Record a medication order | M 1, SD 0 | M 0.38, SD 0.74 | M 118, SD 23 | M 0.25, SD 0.46 | M 1.25, SD 0.71 | M 1.25, SD 0.46
CPOE (314.a.1): Change/access medication order | M 1, SD 0 | M 0.25, SD 0.46 | M 123, SD 29 | M 0.13, SD 0.35 | M 1.38, SD 0.74 | M 1.50, SD 1.07
CPOE (314.a.1): Record, change, access laboratory order | M 1, SD 0 | M 0.13, SD 0.35 | M 303, SD 80 | M 0.25, SD 0.71 | M 3.50, SD 0.76 | M 3.50, SD 0.93
CPOE (314.a.1): Record, change, access radiology/imaging order | M 1, SD 0 | M 0.13, SD 0.35 | M 358, SD 49 | M 0.13, SD 0.35 | M 3.50, SD 0.76 | M 3.50, SD 0.93
Drug-drug, drug-allergy interaction checks (314.a.2): Create and adjust severity of interventions | M 1, SD 0 | M 0, SD 0 | M 51, SD 12 | M 0, SD 0 | M 1, SD 0 | M 1, SD 0
Drug-drug, drug-allergy interaction checks (314.a.2): Adjust severity level of drug-drug interventions | M 1, SD 0 | M 0, SD 0 | M 25, SD 6 | M 0, SD 0 | M 1, SD 0 | M 1, SD 0
Medication list (314.a.6): Record, change, access medication list | M 1, SD 0 | M 0, SD 0 | M 193, SD 63 | M 0, SD 0 | M 1.13, SD 0.35 | M 1.13, SD 0.35
Medication allergy list (314.a.7): Record, change, access medication allergy list | M 1, SD 0 | M 0, SD 0 | M 224, SD 47 | M 0, SD 0 | M 1, SD 0 | M 1, SD 0
Clinical decision support (314.a.8): Interventions (problem, medication, medication allergy, demographics, lab tests, vital signs) | M 1, SD 0 | M 0, SD 0 | M 105, SD 32 | M 0.25, SD 0.46 | M 1.50, SD 0.76 | M 1.50, SD 0.76
Clinical decision support (314.a.8): Intervention configuration | M 1, SD 0 | M 0, SD 0 | M 131, SD 16 | M 0.25, SD 0.46 | M 1.38, SD 0.52 | M 1.75, SD 1.04
E-prescribing (314.b.3): Creating prescriptions | M 1, SD 0 | M 0.38, SD 0.74 | M 110, SD 20 | M 0.25, SD 0.71 | M 1.25, SD 0.46 | M 1.38, SD 0.52
Clinical information reconciliation (314.b.4) | M 1, SD 0 | M 0.25, SD 0.46 | M 354, SD 69 | M 0.38, SD 0.52 | M 1.63, SD 0.74 | M 1.75, SD 1.04
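The M (mean) and SD (standard deviation) figures in the table above can be reproduced from per-participant measurements with Python's statistics module. A minimal sketch; the raw task times below are invented for illustration, since the report does not publish per-participant data:

```python
from statistics import mean, stdev

# Hypothetical raw task times (seconds) for one task across participants;
# not the study's actual per-participant data.
task_times = [95, 110, 118, 130, 102, 125, 140, 124]

m = mean(task_times)    # arithmetic mean (the "M" column)
sd = stdev(task_times)  # sample standard deviation (the "SD" column)

print(f"M {m:.0f}, SD {sd:.0f}")
```

The same two statistics would be computed per task for each measure (success, deviations, time, errors, and the two ratings) to fill one row of the table.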
2. INTRODUCTION

These studies were conducted on TransMed Client Software 5 (CS5). CS5 is a full-featured EHR which encompasses practice management, charting, billing management, and standard operations for medications and test ordering. Our chronological chart keeps a sequential history of all activity related to a patient's health.

The usability testing attempted to represent realistic exercises and conditions. This study supports our usability program and helps satisfy the Safety-Enhanced Design requirements for 2014 ONC EHR certification. Our focus is on measures of effectiveness, efficiency, and user satisfaction.
3. USER-CENTERED DESIGN PROCESS

Our user-centered design process is based on the ISO 9241-210 (Ergonomics of human-system interaction) standard. At each stage of the design process, we make extensive use of user models, user-centered phrasing and error messages, and an effective feedback loop to drive refinements; TransMed is currently building a core architecture that will allow these usability elements to be integrated more seamlessly while remaining robust. It is an iterative process which includes design, task analysis, implementation, documentation, and evaluation phases.

Given the difficulty of EHR interfaces in general, user-centered design is an integral part of our design process: the benefits to users encourage a strong, continuous cycle of feedback, results, and implementation. This helps us accumulate a healthy amount of data we can use to improve the experience for our users. As new features are built and current features are enhanced, usability analysis is performed before each is released to production. From our past usability tests, we've found that most of our UCD goals are reached through continuous user feedback. The importance of maintaining and monitoring this feedback system cannot be overstated.

TransMed UCD Phases:

Requirements Analysis / Modeling
A large part of our system design is based on user roles and models. This helps us build a more comprehensible system which provides consistent interfaces, user control, and predictable behavior. A task list helps paint a scenario through which we can understand user behavior and perform workflow analysis. This helps us align our solutions with our goals and objectives and build ideal experience scenarios for our users.

Refinement
Throughout the development process, developers and designers work cohesively to determine the ideal direction of component design and to make sure components align with our goals.

Feedback
Continuous feedback from our users is our primary source of direction and design. When multiple users "upvote" a feature request or a bug fix, those items get moved to the top of the queue and are implemented first. The status of each item is continually updated so users are informed of its current state and progression. Developers can also leave messages for the posters of these requests; the messages are public to everyone and allow a conversation to develop. We've found this helps keep our users more proactive in our UCD process.

Testing
On many occasions, usability can only be evaluated while watching the user interact with the system. The path the user takes and the comments the user makes are strong indicators of deficiencies and room for improvement. They also tell us what we're doing right and which design patterns should be utilized more frequently in which scenarios.
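The "upvote" feedback mechanism described above can be pictured as a vote-ranked work queue. This is an illustrative sketch only; the class and method names are hypothetical, not TransMed's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Request:
    """A user-submitted feature request or bug fix with an upvote count."""
    title: str
    votes: int = 0
    status: str = "open"  # kept visible so users can track progression

class FeedbackQueue:
    def __init__(self):
        self._items = []

    def submit(self, title):
        self._items.append(Request(title))

    def upvote(self, title):
        for item in self._items:
            if item.title == title:
                item.votes += 1

    def next_up(self):
        # The most-upvoted item moves to the top and is implemented first.
        return max(self._items, key=lambda item: item.votes)

queue = FeedbackQueue()
queue.submit("faster chart loading")
queue.submit("bulk prescription renewal")
queue.upvote("bulk prescription renewal")
queue.upvote("bulk prescription renewal")
print(queue.next_up().title)  # bulk prescription renewal
```

The design choice worth noting is that ranking is driven entirely by user votes, which matches the report's claim that user feedback is the primary source of direction.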
METHOD 4.1 ­ PARTICIPANTS A total of ten participants (with admin privileges) were tested on CS5. Participants in the test were physician users of CS5 and were compensated $75 for their participation. In addition, participants had no direct connection to the development of or organization producing the CS5. Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. For the test purposes, end­user characteristics were identified and translated into a recruitment screener used to solicit potential participants; the screener is provided in Appendix 1. All participants match the previously stated description of the intended users. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities. Each session was an hour long. ID Specialty Role Professional Experience (Years) TransMed CS Experience (Years) Assistive Technology Needs 1 Neurology Physician 20 8 None 2 General Practice Physician 10 8 None 3 Endocrinology Physician 21 7 None 4 Dermatology Physician 5 3 None 5 Internal Medicine Physician 8 7 None 6 Pulmonary Medicine Physician 16 5 None 7 Family Practice Physician 30 3 None 8 Cardiology Physician 18 7 None TransMed SED Report | 10 9 Pediatrics Physician 26 4 None 10 Family Practice Physician 32 5 None 4.2 ­ STUDY DESIGN Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. 
The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves as both a means to record or benchmark current usability, but also to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system remotely, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant: ● Number of tasks successfully completed within the allotted time without assistance ● Time to complete the tasks ● Number and types of errors ● Path deviations ● Participant’s verbalizations (comments) ● Participant’s satisfaction ratings of the system 4.3 ­ TASKS A number of tasks were constructed that would be realistic and representative of each feature specified by the ONC. Identical test data for each section was provided to each participant. User tasks employed in the study are prioritized in accordance with the risk associated with the user. These tasks are also prioritized on our request inclusion and enhancement. Each task was also evaluated based on the potential level of adverse risk to the patient. Selection of user path for tasks is prioritized in accordance with identifying the risk associated with any possible user errors. The possible values are: ● High ● Moderate ● Low Involvement with medications are generally high however risks associated with laboratory orders would be considered low. 
TransMed SED Report | 11 Task / Path Risk Assessment CPOE (314.a.1) ­ Medication Order High ● Record medication order ○ Start an encounter, medication order under CPOE using High ● Change/access medication order ○ Go into the encounter, modify medication order in CPOE High CPOE (314.a.1) ­ Laboratory/Radiology/Imaging Order Low ● Record laboratory/radiology/imaging order ○ Start an encounter, create an order under CPOE Low ● Change/access laboratory/radiology/imaging order ○ Access encounter, modify order under CPOE Low Drug­drug, drug­allergy interactions check (314.a.2) High ● Create drug­drug and drug­allergy interventions prior to CPOE completion under provider settings. High ● Adjustment of severity level of drug­drug interventions under the High Newcrop admin interface when modifying drug. Medication list (314.a.6) High ● Record medication list ○ Add medication to the patient’s chart High ● Change/access medication list ○ Access and the change the patient’s medication list. High Medication allergy list (314.a.7) High ● Record medication allergy list: Add medication allergy list to patient’s allergy list under their profile. ­
High Change/access medication allergy list: Access/change medication High allergy list on patient’s profile. Clinical decision support (314.a.7) ● Access interventions depending on the intervention from the encounter or chart. Moderate Moderate TransMed SED Report | 12 ● Identify user diagnostic and therapeutic reference information in the encounter or chart. Low ­
Moderate Configuration of CDS by user by accessing provider CDS engine. Electronic prescribing (314.b.3) High ● Prescribe patient medication from the encounter under CPOE module. Clinical information reconciliation (314.b.4) High Moderate ● Reconcile patient’s active medication list with another source via Moderate CCDA ● Reconcile patient’s active problem list with another source via CCDA Moderate ● Reconcile patient’s active medication allergy list with another source via CCDA Moderate 4.4 ­ PROCEDURES Before each session the administrator gives an overview to the data logger. This overview consists of a review of the flow and the key usability elements for observation to make sure all important data would be recorded. All ten participants were remote. The remote connections were ensured to be secure and stable. The participants screen was displayed to the loggers and administrator through a screen­sharing program, this also included the sharing of audio for identification and listening purposes. Steps the administrator took prior to test: 1. Verbally ask them about their expectations for the test 2. Provide instructions to the participants 3. Give a high­level overview of the tasks required of them Steps during the test: 1. The timer starts when the administrator directs the participant to work on the task 2. The timer ends when the participant had successfully completed the task. The administrator moderated the session including administering instructions and tasks. The administrator also monitored task times, obtained post­task rating data, and took notes on participant comments. The data logger took notes on task success, path deviations, number and TransMed SED Report | 13 type of errors, and comments. Following the session, the administrator gave the participant the post­test questionnaire (e.g., the System Usability Scale, see Appendix 5), compensated them for their time, and thanked each individual for their participation. 
Participants were instructed to perform the tasks ● As quickly as possible making as few errors and deviations as possible. ● Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use. ● Verbally comment if necessary Participants' demographic information, task success rate, time on task, errors, deviations, verbal responses, and post­test questionnaire were recorded into a spreadsheet. Participants were thanked for their time and compensated. Participants receipts were recorded (See Appendix 6) indicating that they had received the compensation. 4.5 ­ TEST LOCATION The remote tests were conducted from TransMed’s headquarters in Burbank, California. The administrator, observers, and the data logger worked from a separate room where they could see the participant’s screen and face shot, and listen to the audio of the session. To ensure that the environment was comfortable for users, noise levels were kept to a minimum with the ambient temperature within a normal range. All of the safety instruction and evacuation procedures were valid, in place, and visible to the participants. Remote participants were allowed to use their own systems for the testing which would be more representative of what they would experience in a field implementation. 4.6 ­ TEST ENVIRONMENT Remote participants were allowed to use their own system they normally would use to access the EHR system. The same reliable and stable screen­sharing software was used and a stable connection was ensured. All participants used a dial­in conference number. 4.7 ­ TEST FORMS AND TOOLS During the usability test, various documents and instruments were used, including: 1. Informed Consent 2. Moderator’s Guide 3. Post­test Questionnaire TransMed SED Report | 14 4. 
Incentive Receipt TransMed SED Report | 15 4.8 ­ PARTICIPANT INSTRUCTIONS The test session were electronically transmitted via a screen­sharing application to our remote location where the data logger observed the test session. The administrator reads the following instructions aloud to the each participant (also see the full moderator’s guide in Appendix 4): Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own following the instructions very closely. Please note that we are not testing you we are testing the system, therefore if you have difficulty all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing. The administrator gave the following instructions: For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. Participants were then given 12 tasks to complete. Tasks are listed in the moderator’s guide in Appendix 4. 
I will ask you your impressions about the task once you are done.

4.9 - USABILITY METRICS
Participants helped evaluate the usability of TransMed CS against predefined metrics. Our focus was to measure both the effectiveness of the system and the subjective satisfaction of the participants:
● Satisfaction ratings
● Successful outcomes:
  ○ Within the allotted time
  ○ Without assistance
● Task ratings (ease of use): ease and efficiency
● Efficiency ratings
● Time to complete the tasks
● Path deviations
● Number/types of errors
● Participants' verbalizations/comments

Source: http://www.nist.gov/healthcare/usability/upload/Draft_EUP_09_28_11.pdf

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:
1. Effectiveness of CS5, by measuring participant success rates and errors
2. Efficiency of CS5, by measuring the average task time and path deviations
3. Satisfaction with CS5, by measuring ease-of-use ratings

Data Scoring
The following details how tasks were scored, errors evaluated, and the time data analyzed.

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes.
Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide are operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 1.25, which allows a time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 120 seconds, the allotted task time was 120 * 1.25 = 150 seconds. This ratio is aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported; optimal paths (i.e., procedural steps) should be recorded when constructing tasks.
Efficiency: Task Time
Each task was timed from when the administrator said "Begin" until the participant said "Done." If the participant failed to say "Done," the time was stopped when he or she stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating
Participants' subjective impressions of the ease of use of the application were measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants. To measure participants' confidence in and likeability of TransMed CS5 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire.

5.
RESULTS

5.1 - DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Two participants had difficulty completing the session, due to an assortment of connection problems and misunderstandings of the instructions. Task Rating and Task Efficiency are scored from 1 to 5, with 1 the easiest/most efficient and 5 the most difficult/least efficient.

CPOE (314.a.1) - Record Medication Order

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 119.00 | 0    | 1    | 1
2    | 1    | 2    | 150.00 | 1    | 3    | 2
3    | 1    | 0    | 85.00  | 0    | 1    | 1
4    | 1    | 0    | 101.00 | 0    | 1    | 1
5    | 1    | 1    | 139.00 | 1    | 1    | 2
6    | 1    | 0    | 136.00 | 0    | 1    | 1
7    | 1    | 0    | 122.00 | 0    | 1    | 1
8    | 1    | 0    | 95.00  | 0    | 1    | 1
Mean | 1.00 | 0.38 | 118.38 | 0.25 | 1.25 | 1.25
SD   | 0.00 | 0.74 | 23.03  | 0.46 | 0.71 | 0.46

CPOE (314.a.1) - Change/Access Medication Order

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 105.00 | 0    | 1    | 1
2    | 1    | 1    | 180.00 | 1    | 3    | 4
3    | 1    | 0    | 110.00 | 0    | 1    | 1
4    | 1    | 0    | 100.00 | 0    | 1    | 1
5    | 1    | 1    | 150.00 | 0    | 2    | 2
6    | 1    | 0    | 130.00 | 0    | 1    | 1
7    | 1    | 0    | 97.00  | 0    | 1    | 1
8    | 1    | 0    | 112.00 | 0    | 1    | 1
Mean | 1.00 | 0.25 | 123.00 | 0.13 | 1.38 | 1.50
SD   | 0.00 | 0.46 | 28.90  | 0.35 | 0.74 | 1.07

CPOE (314.a.1) - Record/Change/Access Laboratory Order

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 250.00 | 0    | 3    | 3
2    | 1    | 0    | 295.00 | 0    | 3    | 4
3    | 1    | 0    | 400.00 | 0    | 4    | 5
4    | 1    | 0    | 249.00 | 0    | 3    | 3
5    | 1    | 0    | 276.00 | 0    | 4    | 4
6    | 1    | 1    | 450.00 | 2    | 5    | 2
7    | 1    | 0    | 275.00 | 0    | 3    | 3
8    | 1    | 0    | 225.00 | 0    | 3    | 4
Mean | 1.00 | 0.13 | 302.50 | 0.25 | 3.50 | 3.50
SD   | 0.00 | 0.35 | 79.64  | 0.71 | 0.76 | 0.93

CPOE (314.a.1) - Record/Change/Access Radiology/Imaging Order

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 235.00 | 0    | 3    | 3
2    | 1    | 0    | 275.00 | 0    | 3    | 4
3    | 1    | 1    | 350.00 | 1    | 4    | 5
4    | 1    | 0    | 220.00 | 0    | 3    | 3
5    | 1    | 0    | 235.00 | 0    | 4    | 4
6    | 1    | 0    | 300.00 | 0    | 5    | 2
7    | 1    | 0    | 250.00 | 0    | 3    | 3
8    | 1    | 0    | 195.00 | 0    | 3    | 4
Mean | 1.00 | 0.13 | 257.50 | 0.13 | 3.50 | 3.50
SD   | 0.00 | 0.35 | 49.35  | 0.35 | 0.76 | 0.93
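The Data Scoring rules described in Section 4.9 can be sketched in code. The following is an illustrative sketch only — not part of the report's tooling — with function names of our own invention. The sample task times are the Record Medication Order times reported above, and the SUS formula is the standard Brooke scoring for the questionnaire in Appendix 5.

```python
# Illustrative sketch of the Data Scoring rules (Section 4.9); function names
# are invented for this example and are not part of the report's tooling.
from math import sqrt
from statistics import mean, stdev

def allotted_time(optimal_seconds: float) -> float:
    """Allotted time = expert optimal time * 1.25 buffer."""
    return optimal_seconds * 1.25

def success_rate(successes: int, attempts: int) -> float:
    """Task success as a percentage of attempts."""
    return 100.0 * successes / attempts

def deviation_ratio(observed_steps: int, optimal_steps: int) -> float:
    """Observed path length divided by optimal path length."""
    return observed_steps / optimal_steps

def sus_score(responses: list) -> float:
    """Standard SUS scoring: odd items contribute (r - 1), even items (5 - r);
    the sum is scaled by 2.5 to a 0-100 score."""
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Record Medication Order task times (seconds) from the table above.
times = [119, 150, 85, 101, 139, 136, 122, 95]
avg = mean(times)           # 118.375, reported as 118.38
sd = stdev(times)           # sample standard deviation, reported as 23.03
se = sd / sqrt(len(times))  # standard error of the mean

print(allotted_time(120))   # 150.0, matching the 120 * 1.25 example
print(round(avg, 2), round(sd, 2))
```

Note that the reported SD values match the sample (n - 1) standard deviation, which is why `statistics.stdev` rather than the population formula is used here.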
Drug-drug, Drug-allergy Interaction Checks - Creating Interventions (314.a.2)

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 65.00 | 0    | 1    | 1
2    | 1    | 0    | 70.00 | 0    | 1    | 1
3    | 1    | 0    | 55.00 | 0    | 1    | 1
4    | 1    | 0    | 35.00 | 0    | 1    | 1
5    | 1    | 0    | 50.00 | 0    | 1    | 1
6    | 1    | 0    | 49.00 | 0    | 1    | 1
7    | 1    | 0    | 39.00 | 0    | 1    | 1
8    | 1    | 0    | 46.00 | 0    | 1    | 1
Mean | 1.00 | 0.00 | 51.13 | 0.00 | 1.00 | 1.00
SD   | 0.00 | 0.00 | 11.97 | 0.00 | 0.00 | 0.00

Drug-drug, Drug-allergy Interaction Checks - Adjustment of Severity Level (314.a.2)

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 25.00 | 0    | 1    | 1
2    | 1    | 0    | 15.00 | 0    | 1    | 1
3    | 1    | 0    | 30.00 | 0    | 1    | 1
4    | 1    | 0    | 23.00 | 0    | 1    | 1
5    | 1    | 0    | 18.00 | 0    | 1    | 1
6    | 1    | 0    | 27.00 | 0    | 1    | 1
7    | 1    | 0    | 32.00 | 0    | 1    | 1
8    | 1    | 0    | 29.00 | 0    | 1    | 1
Mean | 1.00 | 0.00 | 24.88 | 0.00 | 1.00 | 1.00
SD   | 0.00 | 0.00 | 5.94  | 0.00 | 0.00 | 0.00

Medication List (314.a.6) - Record/Change/Access Medication List

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 250.00 | 0    | 1    | 1
2    | 1    | 0    | 138.00 | 0    | 1    | 1
3    | 1    | 0    | 295.00 | 0    | 1    | 1
4    | 1    | 0    | 150.00 | 0    | 1    | 1
5    | 1    | 0    | 198.00 | 0    | 2    | 2
6    | 1    | 0    | 201.00 | 0    | 1    | 1
7    | 1    | 0    | 215.00 | 0    | 1    | 1
8    | 1    | 0    | 100.00 | 0    | 1    | 1
Mean | 1.00 | 0.00 | 193.38 | 0.00 | 1.13 | 1.13
SD   | 0.00 | 0.00 | 62.96  | 0.00 | 0.35 | 0.35

Medication Allergy List (314.a.7) - Record/Change/Access Medication Allergy List

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 234.00 | 0    | 1    | 1
2    | 1    | 0    | 313.00 | 0    | 1    | 1
3    | 1    | 0    | 175.00 | 0    | 1    | 1
4    | 1    | 0    | 185.00 | 0    | 1    | 1
5    | 1    | 0    | 235.00 | 0    | 1    | 1
6    | 1    | 0    | 260.00 | 0    | 1    | 1
7    | 1    | 0    | 185.00 | 0    | 1    | 1
8    | 1    | 0    | 207.00 | 0    | 1    | 1
Mean | 1.00 | 0.00 | 224.25 | 0.00 | 1.00 | 1.00
SD   | 0.00 | 0.00 | 46.55  | 0.00 | 0.00 | 0.00

Clinical Decision Support (314.a.8) - Interventions

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 142.00 | 1    | 3    | 2
2    | 1    | 0    | 110.00 | 0    | 1    | 1
3    | 1    | 0    | 150.00 | 0    | 2    | 2
4    | 1    | 0    | 76.00  | 0    | 1    | 1
5    | 1    | 0    | 101.00 | 0    | 1    | 1
6    | 1    | 0    | 123.00 | 1    | 2    | 3
7    | 1    | 0    | 74.00  | 0    | 1    | 1
8    | 1    | 0    | 66.00  | 0    | 1    | 1
Mean | 1.00 | 0.00 | 105.25 | 0.25 | 1.50 | 1.50
SD   | 0.00 | 0.00 | 31.81  | 0.46 | 0.76 | 0.76

Clinical Decision Support (314.a.8) - Configuration

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 135.00 | 0    | 2    | 3
2    | 1    | 0    | 133.00 | 0    | 1    | 1
3    | 1    | 0    | 151.00 | 1    | 1    | 3
4    | 1    | 0    | 102.00 | 0    | 2    | 1
5    | 1    | 0    | 140.00 | 0    | 1    | 1
6    | 1    | 0    | 144.00 | 1    | 2    | 3
7    | 1    | 0    | 115.00 | 0    | 1    | 1
8    | 1    | 0    | 127.00 | 0    | 1    | 1
Mean | 1.00 | 0.00 | 130.88 | 0.25 | 1.38 | 1.75
SD   | 0.00 | 0.00 | 15.96  | 0.46 | 0.52 | 1.04

Electronic Prescribing (314.b.3) - Create Prescriptions

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 90.00  | 0    | 1    | 1
2    | 1    | 0    | 110.00 | 0    | 1    | 1
3    | 1    | 0    | 120.00 | 0    | 1    | 2
4    | 1    | 2    | 150.00 | 2    | 2    | 2
5    | 1    | 0    | 101.00 | 0    | 1    | 1
6    | 1    | 1    | 123.00 | 0    | 2    | 2
7    | 1    | 0    | 98.00  | 0    | 1    | 1
8    | 1    | 0    | 90.00  | 0    | 1    | 1
Mean | 1.00 | 0.38 | 110.25 | 0.25 | 1.25 | 1.38
SD   | 0.00 | 0.74 | 20.33  | 0.71 | 0.46 | 0.52

Clinical Information Reconciliation (314.b.4) - Medication, Problem, and Med. Allergy List

ID   | Success (1 = Yes) | Path Dev. | Time (sec) | Errors | Rating (1-5) | Efficiency (1-5)
1    | 1    | 0    | 333.00 | 0    | 1    | 1
2    | 1    | 1    | 500.00 | 0    | 3    | 3
3    | 1    | 0    | 350.00 | 0    | 1    | 1
4    | 1    | 0    | 398.00 | 0    | 2    | 1
5    | 1    | 0    | 349.00 | 0    | 1    | 1
6    | 1    | 1    | 332.00 | 0    | 1    | 1
7    | 1    | 0    | 275.00 | 0    | 1    | 1
8    | 1    | 0    | 298.00 | 0    | 1    | 1
Mean | 1.00 | 0.25 | 354.38 | 0.00 | 1.38 | 1.25
SD   | 0.00 | 0.46 | 69.28  | 0.00 | 0.74 | 0.71

5.2 - DISCUSSION OF THE FINDINGS
The System Usability Scale score for subjective satisfaction with the system, based on performance with these tasks, was 70. (Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.) In addition to the performance data, the following qualitative observations were made:

5.2.1 - MAJOR FINDINGS AND AREAS FOR IMPROVEMENT
Major Findings
● Some of the core infrastructure implementations were convenient but also made usability problematic. One example was our treatment test and module library.
Although it provided a convenient way to select and customize common tests, one had to leave the encounter in order to do so.
● Advanced users prefer to have many buttons listed rather than having to select from a pull-down menu.
● Some sections, such as Immunizations, were a bit confusing to navigate and insert data into. This is mostly because each record has to be filed under an event (in this case a contact), and the flow did not make this apparent.
● Some options within module setup (which affects a majority of the tasks) were problematic because of the drag-and-drop feature. At first we thought it was useful and favorable, but we quickly realized that a static button was more effective.
● Some items that users expect to be part of the encounter modules section took longer to find because of illogically placed menu items.
● The location of the search boxes was a bit cumbersome. Participants would prefer to have all the searches on the left.
● Some error messages were not as specific as users wanted them to be.
● Participants seemed to be happy with the module flow within an encounter.
● Many of the participants found the CDS system a bit awkward because of the lack of textual guidance and the disorganization of the UI; it was not especially intuitive.
● Labs/Radiology/Imaging tasks were difficult for the participants, largely due to having to leave the event in order to add additional tests.

Areas for Improvement
● TransMed is in the middle of a complete revamp of our UI. We have been collecting feedback from our users for the past few years and have considered implementing many of the most frequently requested changes.
● We want to provide more informative alerts and informational guidance throughout the flow of the system.
● We want to minimize the number of clicks and page views the user has to go through (bringing the user closer to the optimal path).
● We want to organize the interface into more logical sections, enabling the user to rely more on intuition.
● Orders will be made much easier in the near future. We recognize the deficiencies of not tying on-the-fly configurations into an encounter.

Effectiveness
The accuracy and completeness with which specified users can achieve specified goals in particular environments.
● Task completion rates were high, there were few path deviations, and error rates were low. Most paths were obvious; however, when an incorrect path was taken, some overlapping paths had to be retaken. For example, when a new laboratory order needs to be imported on the fly, the user has to exit the encounter, enter the lab with standard codes, and re-enter the encounter. This greatly disrupts the optimal path. Fortunately, in a real-world scenario, most of these labs are entered once and rarely change.

Efficiency
The resources expended in relation to the accuracy and completeness of goals achieved.
● A few awkward sections affected how efficiently a user could complete the tasks. For example, there are times when data has to be re-entered; depending on the resolution of the monitor, the user has to scroll down to the form again and resubmit. There are also a few cases where color could be used to differentiate the various sections, especially in the medications module. We are planning a complete redesign of our core architecture in the upcoming version to address these deficiencies, built primarily with usability in mind.

Satisfaction
The comfort and acceptability of the work system to its users and other people affected by its use.
● Overall, the participants were somewhat mixed about the user interface and the flow of some parts of the system. All parties thought the system was feature-rich.
The more advanced users were able to find optimal paths and workarounds for questionable flows more easily; these users are very comfortable working with web-based systems. General users were satisfied with the capabilities of the system but felt the friendliness of the flows was lacking.

APPENDIX 1: RECRUITING SCREENER
The importance of human performance and user experience with healthcare systems cannot be overstated. Effective user interfaces enable healthcare providers to make more informed decisions and potentially save patients' lives. We at TransMed are looking for participants in our ongoing usability design process. You will be asked to complete a set of tasks using our CS5 system. Our goal is to determine how usable the system is for you. At the end of the session we will gather additional feedback and suggestions pertaining to the system's UI/UX as well as any other relevant aspects of this test.

● Have you participated in a [focus group or usability test] in the past six months? [If yes, terminate]
● Do you, or does anyone in your home, work [in a field or employer that would disqualify you from participating, e.g., web designer, government employee, etc.]? [If yes, terminate]
● What is your current position and title?
● How long have you held this position?
● How long have you used TransMed Client Software? [If less than a year, terminate]
● What computer platform do you usually use? [e.g., Mac, Windows XP, etc.]
● What Internet browser(s) do you usually use? [e.g., Firefox, IE v.6, AOL, etc.]
● Do you, or does anyone in your home, have a commercial or research interest in an electronic health record software or consulting company? [If yes, terminate]

Contact Information
[If the person matches your qualifications, ask:] May I have your contact information?
Name of participant:
Address:
City, State, Zip:
Daytime phone number:
Evening phone number:
Alternate [cell] phone number:
Email address:

Those are all the questions I have for you. Your background matches the people we're looking for. Would you be able to participate on [date, time]? Before your session starts, we will ask you to sign a release form allowing us to capture your screen for the duration of the test. The video file will only be used internally for further study if needed. Do you give us consent to capture this information? After the session you will receive a check in the mail in the amount of $75.

The tests are done remotely via an online meeting application, which allows us to use the telephone for voice and to share screens during the testing. You will be given a test account that will not contain any production-level data, including real patient records.

Session times: Monday through Friday, 9 am to 7 pm. [Dependent upon staff/participant availability]

Sources
http://www.usability.gov/sites/default/files/usability-test-screener-website-example-1_0.docx
http://www.nist.gov/healthcare/usability/upload/Draft_EUP_09_28_11.pdf

APPENDIX 2: PARTICIPANT DEMOGRAPHICS
1. Participant has used TransMed Client Software for at least a year.
2. Has not participated in a focus group or usability test in the last three months.
3. Does not, nor does anyone in their home, work in marketing research, usability research, or web design.
4. Does not, nor does anyone in their home, have a commercial or research interest in an electronic health record software or consulting company.

APPENDIX 3: NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM

Non-Disclosure Agreement
THIS AGREEMENT is entered into as of _ _, 2010, between _________________________ ("the Participant") and the testing organization TransMed Network.
The Participant acknowledges his or her voluntary participation in today’s usability study may bring the Participant into possession of Confidential Information. The term "Confidential Information" means all technical and commercial information of a proprietary or confidential nature which is disclosed by TransMed Network, or otherwise acquired by the Participant, in the course of today’s study. By way of illustration, but not limitation, Confidential Information includes trade secrets, processes, formulae, data, know­how, products, designs, drawings, computer aided design files and other computer files, computer software, ideas, improvements, inventions, training methods and materials, marketing techniques, plans, strategies, budgets, financial information, or forecasts. Any information the Participant acquires relating to this product during this study is confidential and proprietary to TransMed Network and is being disclosed solely for the purposes of the Participant’s participation in today’s usability study. By signing this form the Participant acknowledges that s/he will receive monetary compensation for feedback and will not disclose this confidential information obtained today to anyone else or any other organizations. 
Participant's printed name: ___________________________________________
Signature: _____________________________________ Date: ____________________

APPENDIX 4: MODERATOR'S GUIDE
Administrator ________________________ Data Logger ________________________
Date _____________________________ Time _________ Participant # ________

Prior to testing
● Confirm schedule with participants
● Ensure EHRUT lab environment is running properly
● Ensure lab and data recording equipment is running properly

Prior to each participant
● Reset application
● Start session recordings with tool

Prior to each task
● Reset application to starting point for next task

After each participant
● End session recordings with tool

After all testing
● Back up all video and data files

Orientation (5 minutes)
Thank you for participating in this study. Our session today will last 90 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is [describe the state of the application, i.e., production version, early prototype, etc.]. Some of the data may not make sense, as it is placeholder data. We are recording the audio and video of our session today.
All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Do you have any questions or concerns?

Preliminary Questions (X minutes)
1. What is your job title / appointment?
2. How long have you been working in this role?
3. Years in healthcare
4. Years using TransMed CS
5. Assistive technology needs?

Sign In
● Sign into TransMed CS with your test account and password
  ○ Admin: User Name: [xxxxx] / Password: [xxxxx]
  ○ Non-Admin: User Name: [xxxxx] / Password: [xxxxx]

Task 1: Order Medication
Instructions: Create a medication order inside the CPOE of an encounter.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 2: Change/Access Medication Order
Instructions: Change/access a medication order from the CPOE of the encounter.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 3: Record/Change/Access Laboratory Order
Instructions: Enter a lab order within an encounter.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 4: Record/Change/Access Radiology/Imaging
Instructions: Perform radiology/imaging order operations within an encounter.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments
Task 5: Drug-Drug, Drug-Allergy Interaction Checks
Instructions: Create drug-drug/drug-allergy interventions.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 6: Drug-Drug, Drug-Allergy Interaction Checks
Instructions: Adjust the severity level of drug-drug interventions.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 7: Medication List
Instructions: Record/Change/Access the patient's medication list.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 8: Clinical Decision Support
Instructions: Create interventions for the various sections and activate the interventions.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 9: Clinical Decision Support
Instructions: Configure the CDS interventions.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 10: Electronic Prescribing
Instructions: Enter an electronic prescription for a patient from the medication list provided.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 11: Medication Allergies
Instructions: Enter medication allergies for a patient.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

Task 12: Clinical Reconciliation
Instructions: Given the CCDA for a patient, reconcile the Problems, Medications, and Medication Allergies for the patient.
Comments
Outcome:
Task Time: ________ Seconds
Optimal Path:
Observed Errors and Verbalizations
Task Rating: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Task Efficiency: Show participant written scale: "Very Easy" (1) to "Very Difficult" (5)
Administrator / Notetaker Comments

APPENDIX 5: SYSTEM USABILITY SCALE QUESTIONNAIRE
Rate each statement from 1 (Strongly disagree) to 5 (Strongly agree):
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.

Source: http://www.measuringusability.com/sus.php

APPENDIX 6: INCENTIVE RECEIPTS
Here are a few examples of the checks written to our participants.

TransMed CS5 - Quality Management System
This system was used throughout the development of the EHR and encompasses all criteria for 2014 Edition MU certification. We use a homegrown QMS that was built over time; many documents, procedures, and processes were developed to help develop and maintain the EHR. This system allows for long-term consistency and promotes due diligence and proper quality control.

Software Development Processes
We take an agile approach to software development using the Scrum framework. It harnesses change for a competitive advantage. The team works together in short and sustainable development workloads called Sprints. The focus is on continuous improvement with the product goals in mind. This promotes higher levels of cohesiveness, self­organization, and performance. We try to keep the high priority stories small so it’s easier for everyone to understand and complete in a short period of time. A retrospective meeting is held to identify a few strategic changes that could be made for process improvement. This system is designed to provide a network where feedback from customers, team members, and from the market can be utilized to gain knowledge and promote growth. Roles
Product Owner
●
●
●
●
Responsible for maximizing the ROI. ○ Holds the vision for the product. ○ Represents the interest of the business and the customers Directs the team towards valuable work (Prioritize items in the backlog) Construct clear requirements: Role, feature, and reason/end­goal Answer team members’ questions Scrum Master
●
●
Deliverable is a high­performing, self­organizing team, much like a team coach. Remove any impediments/road­blocks ●
Not the team’s boss but is a peer position who is set apart by knowledge and responsibilities. Team Member
●
●
●
●
Have total authority over how the work gets done Complete user stories to increase the value of the product Self­organizes with proper estimates Contributing work in their area of speciality but also taking up other tasks to help the team get more things done. The Product Backlog
●
●
●
●
●
Who is the for? (Which users benefit from this?) What is the desired functionality? What is the purpose of this functionality? Estimate implementation time What is the acceptance criteria to determine whether it has been implemented correctly? One hour per week is spent to define and redefine the criteria. These are examples to show the story is done. The Sprint Backlog
●
●
●
A story consists of many tasks from the backlog. These tasks are units of work the individuals in a team are responsible for. These to­do lists have a finite life­span: typically 2 weeks. Burn Charts
●
●
Scope Vs Time charts Keeps track of how far the team has gone on the project. Task Board
●
We use a tracker system which keeps track of works in progress, completed, and any required feedback requests from other members of the team. ● Product owner decides which stories will be worked on and team members decide how much work they can take on. Daily Scrums
●
●
●
What tasks were completed since the last daily scrum? What tasks I expect to complete by the next daily scrum? What obstacles are slowing you down? Software Testing
●
●
We have a development environment, staging environment, and production. After code has been committed, a ticket is sent to our QA to ensure proper testing has been done including all edge cases. ●
Once this is done, the code is synced to our staging server (which mimics the specifications of our production server) and additional testing is performed. ● Unit testing and regression testing is developed alongside the task itself. Before any commit, a running of all tests is required. Software Implementation
●
Once testing has completed, the entire team involved with the update/patch/development will be present. ● A launch is done with an automated script which will sync both the code and the database schema changes (if any). ● A quick test run is done by developers and QA on production to make sure the implementation was successful. Software Maintenance
● We have a centralized system administration system which ensures:
○ All security patches are up-to-date.
○ Critical configuration files are consistent. If alterations are detected, the file is reverted.
○ If software packages are inconsistent, the system automatically rolls them back to their proper state.
○ Checks are run on database, web server, disk storage, network, and RAID statuses, along with other health indicators available within the system.
○ Network and system access is monitored, including VPN and sudoers.
● The EHR allows users to submit feedback and feature requests. These requests are prioritized and distributed based on the software development cycle described above.

Technical Documentation
All technical documentation for server setups, installation, security rules, network protocols, backup & failover policies, monitoring, and billing processes is shared among the proper roles. Updates to these documents are automatically tracked, and any change requires a stated rationale and approval.

Name: Jay Luong
Signature:
Title: Chief Operating Officer
Company: TransMed Network

TransMed Network
201 South Buena Vista St, Suite #425
Burbank, CA 91505
(877) 999-8633
August 15, 2014

Audit Logging - Privacy and Security

Jay Luong, Chief Operating Officer, TransMed Network, attests to the validity of the information below to satisfy the documentation requirements for testing and certification of the ONC 2014 Edition criteria: 170.314(d)(2).

Does the EHR SUT allow the following?
Disabling the audit log:
The EHR technology does not allow any users to disable the audit logs. The logs are created transactionally with actions taken within the EHR, without exception. The audit log system is a core part of the EHR system.

Monitoring and recording of audit log status changes (if disabling is possible):
The EHR technology does not allow any users to disable audit log status recording. All events are recorded and cannot be disabled.

Monitoring and recording of status changes to encryption, if encryption is used to satisfy the end-user device encryption (d)(7) criteria:
Encryption is not used to satisfy the (d)(7) criteria. There is no data stored locally on end-user devices which requires encryption. Cookies with PHI are removed at the end of the session.

If the audit log can be disabled, is the default state for audit log and audit log status recording enabled by default?
The audit log cannot be disabled.

If applicable, and if the EHR also allows it to be disabled, is the encryption of electronic health information on end-user devices enabled by default?
As a web-based EHR, session cookies are stored on the client machine. When the EHR session is closed, all PHI on the client machine is removed.

Does the EHR SUT permit any users to delete electronic health information?

The EHR does not allow the deletion of any electronic health information, including accidental entries. All "deletions" are treated as updates in the system; the actual rows are either archived or flagged as deleted/void.

Does the EHR SUT audit logging capability monitor each of the
required actions for all instances of electronic health information utilized by the EHR SUT in accordance with the specified standard ASTM E2147-01?
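The answer below maps database actions (INSERT, SELECT, UPDATE) to audited events, with deletions handled as flag-setting updates. A hypothetical append-only sketch of such an audit trail, not TransMed's actual schema:

```python
# Hypothetical append-only audit trail: every action is logged as INSERT,
# SELECT, or UPDATE, and a "delete" is an UPDATE that flags the row
# deleted/void (no DELETE action exists).
from datetime import datetime, timezone

audit_log = []          # append-only: entries are never changed or removed

def record(action: str, user: str, record_id: int, detail: str = "") -> dict:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,     # one of: INSERT, SELECT, UPDATE
        "user": user,
        "record_id": record_id,
        "detail": detail,     # e.g. "Print", "E-Mail", "flagged deleted/void"
    }
    audit_log.append(entry)
    return entry

def soft_delete(user: str, record_id: int) -> dict:
    # Deletion is recorded as an UPDATE that voids the row.
    return record("UPDATE", user, record_id, detail="flagged deleted/void")
```

The `detail` field corresponds to the extra output-type information (Print, E-Mail) the answer mentions for SELECT-style events.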
UPDATE (Deletion, Changes): The EHR does not have the capability to delete a record permanently. This action type encompasses both updates to a record and deletions (where the rows are flagged as deleted).

SELECT (Queries, Copy, Print): This action type encompasses views of PHI, including on-screen records and records prepared for print. If a record is printed, additional information on the output type is provided in the logs (i.e., Print, E-Mail). Copying of PHI within the EHR is done through these methods and therefore also falls under this action.

INSERT (Addition): This indicates a new record was created.

Describe the method(s) through which the audit logs are protected from being changed, overwritten, or deleted by the EHR technology itself.
The audit log cannot be modified, overwritten, or deleted by the EHR technology itself or by any user within the EHR technology. The backend system only allows inserts and selects; this storage engine does not have the capability of changing rows once inserted. In the event that the audit storage user used by the EHR technology is compromised, the audit log will remain intact and no alteration of rows is possible.

● The logs are stored on a redundant storage array (RAID-10). Hot-spares and monitoring are in place in case of hardware failure.
● Logs are also synced across multiple data centers independently. Manual (outside of EHR) alterations to the file in one location will not affect the other locations.
● The logs are synced periodically to a remote location to which the same administrators have access.
● In the event that the log files get overwritten or deleted, we have periodic as well as synced copies.

Describe the method(s) through which the EHR SUT is capable of detecting whether the audit logs have been altered.
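The answer below describes a hash-chained, notarized audit log. As a rough illustration of linked hashing (an assumed sketch, not TransMed's actual SHA-1/SHA-256 scheme), each entry's hash covers the previous entry's hash, so recomputing the chain against an off-site notarized value exposes any modified or deleted row:

```python
# Hypothetical linked-hash chain over audit rows: each entry's hash covers
# the previous entry's hash, so altering or removing any row changes every
# later hash. Illustrative only; row formats and the seed are made up.
import hashlib

def chain_hashes(rows, seed: str) -> list:
    """Return the linked hash for each row, starting from a trusted seed."""
    hashes, prev = [], seed
    for row in rows:
        prev = hashlib.sha256((prev + row).encode()).hexdigest()
        hashes.append(prev)
    return hashes

def tampered(rows, seed: str, trusted_final_hash: str) -> bool:
    """Recompute the chain and compare its head against the notarized value."""
    return chain_hashes(rows, seed)[-1] != trusted_final_hash
```

Storing only the latest chained hash with an independent notarization service is what keeps the verification cheap: the validator replays the chain and compares one value.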
The EHR does not allow any user the ability to change, overwrite, or delete audit logs within the EHR technology. Encryption and logging are enabled by default and are immutable. The EHR is also capable of detecting tampering with the audit log from outside the EHR, including hacking/injection, manual tampering by an intruder/employee/auditor, and even software bugs. We assume the intruder could have access to the DBMS, the operating system, the hardware, and the data in the database. Our audit logs are designed to be correct and inalterable; we understand that the integrity of the audit log must be guaranteed and verifiable. Our approach is based on cryptographically strong one-way hash functions (SHA-1 for row tuples and SHA-256 for the Notary ID). In addition to replicating logs and keeping full & differential backups at multiple physical sites, we use a periodic validator and a digital notarization service (geographically separate).

We use a temporal database in which modifications only add information and no information is ever deleted (only INSERTs and SELECTs are allowed). A modification is treated as flagging the old record deleted while adding a new record.

Interaction with the Notarization Service:
1. Log records are recorded in SELECT/INSERT-only tables. The only possible way to modify such a table would be a manual back-to-back conversion performed with the highest level of permissions that can be granted; we assume an intruder will have this level of access. Each tuple's hash is created via linked hashing, and these hashes are stored off-site and replicated to multiple physical locations.
2. The initial state of the audit schema is hashed with the timestamp and notarized by our off-site notarization service. This Notary ID is stored in the notarization history table. We use SHA-256 for the Notary ID.
3. All future tuple hashes are based on this trusted initial hash.
4. Currently we have a scheduled notarization event every day at 1:00 am. At this time, a service is called to take the latest transaction/row, send it to the notarization service, and update the notarization history table. This helps us maintain acceptable performance while maintaining a detection system.

Detection Process:
1. Our validator runs automatically prior to sending a Notary ID, both to ensure the hash related to the record is valid before notarizing and to detect, and be notified of, any tampering with the logs as soon as possible.
2. The steps the notarization server takes:
a. Connects to the audit server.
b. Checks the current notarization history to ensure all hash + Notary ID values are valid. The notarized hash is passed in along with the Notary ID.
c. Traverses all of the records in the audit log and computes the hash values along the way (linked hashing).
3. A few scenarios:
a. Generally, if any records were modified or deleted along the way, the recomputed hash values will not match the ones stored off-site. An alert is sent to the security team.
b. If the intruder somehow knew the initial valid notarized hash and used it to recompute all the hashes up until the notarization service runs, the only way to detect modification is to compare against the hash values taken at the time of record insertion. We keep multiple copies of the original hashes in multiple secure physical locations, and additional logging is in place at each location to track any modifications.
c. If an intruder modifies a hash in the notarization history, then when this hash is passed along with the Notary ID for verification there will not be a match, and the tampering is detected.
d. If any records were modified after a notary checkpoint, tampering can easily be determined by re-calculating the hashes and passing in the relevant hash value and Notary ID.

Network Time Protocol - Synchronized Clock

The web server and database servers are synced to the following NIST (ITS) servers:

server nist1-la.ustiming.org minpoll 6 maxpoll 10
server nist1-sj.ustiming.org minpoll 6 maxpoll 10
server nist1.symmetricom.com minpoll 6 maxpoll 10

The EHR syncs directly with the system clock of the operating system. Timestamps stored in the logs are generated from this system clock via transactional triggers. Our scripting technology stack uses the system time, and our databases use the UTC_TIMESTAMP() function, which also uses the system clock. Historic records of the NTPv4 (RFC 5905) sync activities can be found in the system logs:

ntpd - NTP daemon program - Ver. 4.2.6p3
ntpd [email protected] Tue Jun 5 20:12:08 UTC 2012 (1)
statsdir /var/log/ntpstats/
statistics loopstats peerstats clockstats cryptostats rawstats sysstats
filegen loopstats file loopstats type day enable
filegen peerstats file peerstats type day enable
filegen clockstats file clockstats type day enable
filegen cryptostats file cryptostats type day enable
filegen rawstats file rawstats type day enable
filegen sysstats file sysstats type day enable

loopstats: day, second, offset, drift compensation, estimated error, stability, polling interval
peerstats: day, second, address, status, offset, delay, dispersion, skew (variance)

I hereby attest that all above statements are true, as an authorized signing authority on behalf of my organization.

Jay Luong, Chief Operating Officer
August 15, 2014