EHR Final Test Report

Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144 Rev 01-Apr-2015
ONC HIT Certification Program
Part 1: Product and Developer Information
1.1 Certified Product Information
Product Name: AccuMed
Product Version: v12.11
Domain: Ambulatory
Test Type: Complete EHR
1.2 Developer/Vendor Information
Developer/Vendor Name: Accumedic Computer Systems, Inc.
Address: 11 Grace Ave, Suite 401, Great Neck, NY 11021
Website: http://www.accumedic.com
Email: [email protected]
Phone: 516-466-6800
Developer/Vendor Contact: [email protected]
Part 2: ONC-Authorized Certification Body Information
2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith
This test results summary is approved for public release by the following ONC-Authorized Certification
Body Representative:
Bill Smith
ONC-ACB Authorized Representative
Certification Body Manager
Function/Title
6/1/2015
Signature and Date
2.2 Gap Certification
The following identifies the criterion or criteria certified via gap certification:

§170.314:
[ ] (a)(1)   [x] (a)(17)   [x] (d)(5)   [x] (d)(9)
[ ] (a)(6)   [x] (b)(5)*   [x] (d)(6)   [ ] (f)(1)
[ ] (a)(7)   [x] (d)(1)    [x] (d)(8)

*Gap certification allowed for Inpatient setting only
[ ] No gap certification
2.3 Inherited Certification
The following identifies the criterion or criteria certified via inherited certification:

§170.314:
(a)(1)   (a)(14)             (c)(3)            (f)(1)
(a)(2)   (a)(15)             (d)(1)            (f)(2)
(a)(3)   (a)(16) Inpt. only  (d)(2)            (f)(3)
(a)(4)   (a)(17) Inpt. only  (d)(3)            (f)(4) Inpt. only
(a)(5)   (b)(1)              (d)(4)            (f)(5) Optional & Amb. only
(a)(6)   (b)(2)              (d)(5)            (f)(6) Optional & Amb. only
(a)(7)   (b)(3)              (d)(6)            (g)(1)
(a)(8)   (b)(4)              (d)(7)            (g)(2)
(a)(9)   (b)(5)              (d)(8)            (g)(3)
(a)(10)  (b)(6) Inpt. only   (d)(9) Optional   (g)(4)
(a)(11)  (b)(7)              (e)(1)
(a)(12)  (c)(1)              (e)(2) Amb. only
(a)(13)  (c)(2)              (e)(3) Amb. only

[x] No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: GI-05072015-118
Test Date(s): 4/7/2015; 4/30/2015; 5/7/2015
3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 512-335-5606
ATL Contact: Beth Morrow
For more information on the scope of accreditation, please reference NVLAP Lab Code 200979-0.
Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
Gary Isaac
ATL Authorized Representative
Test Proctor
Function/Title
6/1/2015
Signature and Date
Sarasota, FL
Location Where Test Conducted
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification
Additional Software | Applicable Criteria | Functionality Provided by Additional Software
DrFirst Rcopia | 170.314 (a)(1), (a)(2), (a)(10) | eRx-related functionality
EMR Direct phiMail® | 170.314 (b)(1), (b)(2) | DIRECT transport
[ ] No additional software required
3.2.2 Test Tools
Test Tool | Version
[x] Cypress | 2.6
[x] ePrescribing Validation Tool | 1.0.4
[ ] HL7 CDA Cancer Registry Reporting Validation Tool | 1.0.3
[ ] HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.8
[x] HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.8
[x] HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 1.7
[x] HL7 v2 Syndromic Surveillance Reporting Validation Tool | 1.7
[x] Transport Testing Tool | 179
[x] Direct Certificate Discovery Tool | 3.0.2
[ ] No test tools required
3.2.3 Test Data
[ ] Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
[ ] No alteration (customization) to the test data was necessary
3.2.4 Standards
3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted:

Criterion # | Standard(s) Successfully Tested

(a)(8)(ii)(A)(2)
  [ ] §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  [ ] §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

[x] (a)(13)
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
  [ ] §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

[x] (a)(15)(i)
  [x] §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  [ ] §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)
  [ ] §170.210(g) Network Time Protocol Version 4 (RFC 5905)

[x] (b)(2)(i)(A)
  [ ] The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

[x] (b)(7)(i)
  [ ] The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)
  Annex A of the FIPS Publication 140-2; encryption and hashing algorithms tested: AES-128, SHA-1

[x] (e)(1)(ii)(A)(2)
  §170.210(g) Network Time Protocol Version 3 (RFC 1305) / Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
  Annex A of the FIPS Publication 140-2; encryption and hashing algorithms tested: AES-128, SHA-1

[x] Common MU Data Set (15)
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
  [ ] §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

[ ] None of the criteria and corresponding standards listed above are applicable
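The hashing half of the Annex A algorithms cited above for (e)(1) and (e)(3) can be exercised directly from Python's standard library; a minimal sketch (the payload string is illustrative only, not data from this report):

```python
import hashlib

# SHA-1 is one of the Annex A (FIPS 140-2) hashing algorithms named above
# for §170.314(e)(1) and (e)(3); here it fingerprints an illustrative payload.
payload = b"patient summary document"
digest = hashlib.sha1(payload).hexdigest()
print(digest)  # 40 hex characters (a 160-bit digest)
```

AES-128, the encryption algorithm listed alongside it, is not in the Python standard library and would require a third-party package such as `cryptography`.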
3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version | Applicable Criteria
[ ] No newer version of a minimum standard was tested
3.2.5 Optional Functionality
Criterion # | Optional Functionality Successfully Tested
[x] (a)(4)(iii) | Plot and display growth charts
[ ] (b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[ ] (b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[ ] (f)(3) | Ambulatory setting only: Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)
[ ] No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested
Criteria # | TP** | TD***
[x] (a)(1): TP 1.2
[x] (a)(2): TP 1.2
[x] (a)(3): TP 1.2, TD 1.4
[x] (a)(4): TP 1.4, TD 1.3
[x] (a)(5): TP 1.4, TD 1.3
[ ] (a)(6): TP 1.3, TD 1.4
[ ] (a)(7): TP 1.3, TD 1.3
[x] (a)(8): TP 1.2
[x] (a)(9): TP 1.3, TD 1.3
[x] (a)(10): TP 1.2, TD 1.4
[x] (a)(11): TP 1.3
[x] (a)(12): TP 1.3
[x] (a)(13): TP 1.2
[x] (a)(14): TP 1.2
[x] (a)(15): TP 1.5
[ ] (a)(16) Inpt. only: TP 1.3
[ ] (a)(17) Inpt. only: TP 1.2
[x] (b)(1): TP 1.7, TD 1.4
[x] (b)(2): TP 1.4, TD 1.6
[x] (b)(3): TP 1.4, TD 1.2
[x] (b)(4): TP 1.3, TD 1.4
[x] (b)(5): TP 1.4, TD 1.7
[ ] (b)(6) Inpt. only: TP 1.3, TD 1.7
[x] (b)(7): TP 1.4, TD 1.6
[x] (c)(1): TP 1.6, TD 1.6
[x] (c)(2): TP 1.6, TD 1.6
[x] (c)(3): TP 1.6, TD 1.5
[x] (d)(1): TP 1.2, TD 1.6
[x] (d)(2): TP 1.5
[x] (d)(3): TP 1.3
[x] (d)(4): TP 1.3
[ ] (d)(5): TP 1.2
[ ] (d)(6): TP 1.2
[ ] (d)(7): TP 1.2
[ ] (d)(8): TP 1.2
[ ] (d)(9) Optional: TP 1.2
[x] (e)(1): TP 1.8, TD 1.5
[x] (e)(2) Amb. only: TP 1.2, TD 1.6
[x] (e)(3) Amb. only: TP 1.3
[ ] (f)(1): TP 1.2, TD 1.2
[x] (f)(2): TP 1.3, TD 1.7.1
[x] (f)(3): TP 1.3, TD 1.7
[ ] (f)(4) Inpt. only: TP 1.3, TD 1.7
[ ] (f)(5) Optional & Amb. only: TP 1.2, TD 1.2
[ ] (f)(6) Optional & Amb. only: TP 1.3, TD 1.0.3
[ ] (g)(1): TP 1.7, TD 1.9
[x] (g)(2): TP 1.7, TD 1.9
[x] (g)(3): TP 1.3
[x] (g)(4): TP 1.2
[ ] No criteria tested
*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested:
[x] Ambulatory
[ ] Inpatient
[ ] No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs successfully tested: CMS 68, CMS 69 (v4), CMS 117, CMS 125, CMS 130, CMS 137, CMS 138, CMS 146, CMS 147

Other ambulatory CQMs listed on the form (not tested): CMS 2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 74, 75, 77, 82, 90, 122, 123, 124, 126, 127, 128, 129, 131 (v3), 132, 133, 134, 135, 136 (v3), 139, 140, 141, 142, 143, 144, 145, 148, 149, 153, 154, 155, 156 (v3), 157, 158, 159, 160, 161, 163, 164, 165 (v4), 166 (v3), 167, 169, 177, 179, 182

Inpatient CQMs (none tested): CMS 9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190
3.2.8 Automated Numerator Recording and Measure Calculation
3.2.8.1 Automated Numerator Recording
Automated Numerator Recording Successfully Tested:
[ ] (a)(1)   [ ] (a)(9)    [ ] (a)(16)   [ ] (b)(6)
[ ] (a)(3)   [ ] (a)(11)   [ ] (a)(17)   [ ] (e)(1)
[ ] (a)(4)   [ ] (a)(12)   [ ] (b)(2)    [ ] (e)(2)
[ ] (a)(5)   [ ] (a)(13)   [ ] (b)(3)    [ ] (e)(3)
[ ] (a)(6)   [ ] (a)(14)   [ ] (b)(4)
[ ] (a)(7)   [ ] (a)(15)   [ ] (b)(5)
[x] Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation
Automated Measure Calculation Successfully Tested:
[x] (a)(1)   [x] (a)(9)    [ ] (a)(16)   [ ] (b)(6)
[x] (a)(3)   [x] (a)(11)   [ ] (a)(17)   [x] (e)(1)
[x] (a)(4)   [x] (a)(12)   [x] (b)(2)    [x] (e)(2)
[x] (a)(5)   [x] (a)(13)   [x] (b)(3)    [x] (e)(3)
[x] (a)(6)   [x] (a)(14)   [x] (b)(4)
[x] (a)(7)   [x] (a)(15)   [x] (b)(5)
[ ] Automated Measure Calculation was not tested
3.2.9 Attestation
Attestation Forms (as applicable) | Appendix
[x] Safety-Enhanced Design* | A
[x] Quality Management System** | B
[x] Privacy and Security | C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product
3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID | Description of Change | Date
2014 Edition Test Report Summary | |
USER-CENTERED DESIGN REPORT – TEST REPORT UPDATE
This test report was updated in November 2015 to satisfy ONC's User-Centered Design Report specifications.
The Test Report ID is amended as follows:
"Part 3: NVLAP-Accredited Testing Laboratory Information: Report Number" plus the suffix "_Nov2015".
11/17/15
Accumedic Computer Systems, Inc.
AccuMed v12.11 EHR Letter of Attestation
To Whom It May Concern:
By this letter, Accumedic Computer Systems Inc. attests to the veracity and
authenticity of the NISTIR 7741 Usability Report provided to Drummond Group Inc.
for the purposes of certification under ONC Test Procedure 170.314(g) Safety-Enhanced
Design.
Citation:
http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313
http://www.nist.gov/itl/iad/vug/iusr.cfm
Best Regards,
Peter Matulich, COO
5/7/2015
To Whom It May Concern:
Please accept this letter as our attestation that the product meets the requirements of product certification
under the ONC HIT Certification Program for 170.314(g)(3) Safety-Enhanced Design, and as confirmation of
the validity of the submitted Usability Test Report.
Respectfully,
Peter Matulich, COO
EHR Usability Test Report of AccuMed EHR V12.10
Report based on ISO/IEC 25062:2006 Common Industry
Format for Usability Test Reports
Date of Usability Test: 2/2/2015, 5/1/2015
Date of Report: 5/1/2015
Report Prepared By: Accumedic Computer Systems Inc.
Peter Matulich
516-466-6800
[email protected]
11 Grace Ave, STE 401, Great Neck, NY 11021
Contents
EXECUTIVE SUMMARY ..................................................................................................................................... 3
INTRODUCTION ................................................................................................................................................ 4
METHOD ........................................................................................................................................................... 4
PARTICIPANTS .......................................................................................................................................... 4
STUDY DESIGN .......................................................................................................................................... 5
TASKS ........................................................................................................................................................ 5
PROCEDURES ............................................................................................................................................ 6
TEST LOCATION ........................................................................................................................................ 7
TEST ENVIRONMENT ................................................................................................................................ 7
TEST FORMS AND TOOLS ......................................................................................................................... 7
PARTICIPANT INSTRUCTIONS ................................................................................................................... 7
USABILITY METRICS .................................................................................................................................. 8
DATA SCORING ......................................................................................................................................... 9
RESULTS .................................................................................................................................................... 9
EFFECTIVENESS ....................................................................................................................................... 16
EFFICIENCY ............................................................................................................................................. 17
SATISFACTION ........................................................................................................................................ 17
MAJOR FINDINGS ................................................................................................................................... 17
AREAS FOR IMPROVEMENT ................................................................................................................... 17
APPENDICES ................................................................................................................................................... 18
APPENDIX 1 ............................................................................................................................................ 18
APPENDIX 2 ............................................................................................................................................ 18
APPENDIX 3 ............................................................................................................................................ 21
APPENDIX 4 ............................................................................................................................................ 22
EXECUTIVE SUMMARY
A usability test of AccuMed EHR V12.10 was conducted on 2/2/2015 in Great Neck by Accumedic
Computer Systems Inc. The purpose of this test was to test and validate the usability of the current user
interface, and provide evidence of usability in the EHR Usability Test (EHRUT). During the usability test, 5
users matching the target demographic criteria served as participants and used the EHRUT in simulated,
but representative tasks.
This study collected performance data on four tasks typically conducted on an EHR:
 Find information in Patient Chart screen
 Use the patient chart to find lab results
 Check the patient's vital signs
 Prescribe a medication
During the one-on-one usability test, each participant was greeted by the administrator and asked to
review and sign an informed consent/release form (included in Appendix 3); they were instructed that
they could withdraw at any time. Participants had prior experience with the EHR. The administrator
introduced the test, and instructed participants to complete a series of tasks given one at a time using the
EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded
user performance data on paper and electronically. The administrator did not give the participant
assistance in how to complete the task.
If provided, a description of the training or help materials is included. The recommendation is that all
participants be given the opportunity to complete training similar to what a real end user would receive
prior to participating in the usability test.

The following measures were collected for each participant:

 Number of tasks successfully completed within the allotted time without assistance
 Time to complete the tasks
 Number and types of errors
 Path deviations
 Participant's verbalizations
 Participant's satisfaction ratings of the system
All participant data was de-identified; no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test questionnaire and were not compensated for their time. Various recommended
metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for
Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT.
Following is a summary of the performance and rating data collected on the EHRUT.
INTRODUCTION
The EHRUT tested for this study was AccuMed EHR V12.10, designed to present medical information to
healthcare providers in Outpatient Ambulatory Clinics. The usability testing attempted to represent
realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and provide
evidence of usability in the EHR Usability Test (EHRUT). To this end, measures of effectiveness, efficiency
and user satisfaction, such as time on task, were captured during the usability testing.
METHOD
PARTICIPANTS
A total of 5 participants were tested on the EHRUT(s). Participants had varying levels of experience as EHR
users. Participants were volunteers recruited by AccuMedic administrative staff and were not
compensated for their time. Participants were not from the testing or supplier organization. Participants
were given the opportunity to have the same orientation and level of training as the actual end users
would have received.
Recruited participants had a mix of backgrounds and demographic characteristics. The following is a table
of participants by characteristics, including demographics, professional experience, computing experience
and user needs for assistive technology. Participant names were replaced with Participant IDs so that an
individual's data cannot be tied back to individual identities.
Part ID | Age | Education | Professional Experience | Computer Experience | Product Experience
1 A1 | 43 | College Grad | 8 years | Moderate | 1 year
2 E1 | 56 | College Grad | 6 years | Novice | 3 years
3 H1 | 44 | College Grad | 10 years | Novice | 4 months
4 M1 | 21 | College Grad | 6 months | Moderate | 6 months
5 K1 | 38 | College Grad | 5 years | Moderate | 3 years
STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well - that is,
effectively, efficiently, and with satisfaction - and areas where the application failed to meet the needs of
the participants. The data from this test may serve as a baseline for future tests with an updated version of
the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing
serves as both a means to record or benchmark current usability, but also to identify areas where
improvements must be made.
During the usability test, participants interacted with 1 EHR(s). Each participant used the system in the
same location, and was provided with the same instructions. The system was evaluated for effectiveness,
efficiency and satisfaction as defined by measures collected and analyzed for each participant:
 Number of tasks successfully completed within the allotted time without assistance
 Time to complete the tasks
 Number and types of errors
 Path deviations
 Participant's verbalizations (comments)
 Participant's satisfaction ratings of the system
Additional information about the various measures can be found in the Usability Metrics section.
TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a
user might do with this EHR, including:
1. Computerized Provider Order Entry (CPOE)
 Medication, laboratory and imaging orders were entered
 Medication, laboratory and imaging orders were changed
 Test script was identical to ONC (170.314.a.1) test procedure
2. Drug-drug, drug-allergy interaction checks
 Invoke/trigger Drug-drug and drug-allergy interventions
 An administrative user logged on to demonstrate restricted ability to change
Interactions
3. Medication List
 Record
 Change
 Access
 Test procedure was identical to ONC test procedure 170.314.a.6
4. Medication Allergy List
 Record
 Change
 Access
 Test procedure was identical to ONC test procedure 170.314.a.7
5
EHR Usability Test Report of AccuMed EHR V12.10
5. Clinical Decision Support (CDS)
 Information to trigger CDS warnings was entered and the warnings observed in accordance
with ONC Test Procedure 170.314.a.8
 The ability to control who sees warnings, and when, was tested and confirmed
 Testing was completed to
6. Electronic Prescribing
 Electronic prescriptions were created.
 A specific medication was added at the proctor's instruction
 The medications were observed as being correct
7. Clinical Information Reconciliation
 Information from a received C-CDA was compared to existing data in a chart
o Problem List
o Medication List
o Medication Allergy List
 Only some data was selected to be merged/consolidated into the active chart
 The user closed the active chart and reopened it to verify the data was saved
PROCEDURES
Upon arrival, participants were greeted; their identity was verified and matched with a name on the
participant schedule. Participants were then assigned a participant ID. Each participant reviewed and
signed an informed consent and release form (See Appendix 3). A representative from the test team
witnessed the participant's signature.
To ensure that the test ran smoothly, two staff members participated in this test: the usability
administrator and the data logger. The usability testing staff conducting the test were experienced usability
practitioners.
The administrator moderated the session including administering instructions and tasks. The administrator
also monitored task times, obtained post-task rating data, and took notes on participant comments. A
second person served as the data logger and took notes on task success, path deviations, number and type
of errors, and comments.
Participants were instructed to perform the tasks (see specific instructions below):

 As quickly as possible making as few errors and deviations as possible.
 Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
 Without using a think-aloud technique.
For each task, the participants were given a written copy of the task. Task timing began once the
administrator finished reading the question. The task time was stopped once the participant indicated
they had successfully completed the task. Scoring is discussed below in the Data Scoring section.
6
EHR Usability Test Report of AccuMed EHR V12.10
Following the session, the administrator gave the participant the post-test questionnaire (e.g., the System
Usability Scale, see Appendix 5), and thanked each individual for their participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a Word document.
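The System Usability Scale mentioned above has a fixed scoring rule: odd-numbered items are positively worded, even-numbered items negatively worded, and the total is scaled to a 0-100 score. A minimal sketch, with illustrative responses rather than this study's data:

```python
def sus_score(responses):
    """Score ten 1-5 Likert responses on the System Usability Scale.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the total is scaled by 2.5 to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Illustrative responses only, not data from this study:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # → 80.0
```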
TEST LOCATION
The test lab included a quiet testing room with a table, computer for the participant and recording
computer for the administrator. Only the participant, data logger and administrator were in the test room.
To ensure that the environment was comfortable for users, noise levels were kept to a minimum with the
ambient temperature within a normal range.
TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was
conducted in a lab setting comprised of computer workstations and a table. For testing, the computer was
a Dell Vostro running the Windows 7 operating system. The participants used a mouse and keyboard when
interacting with the EHRUT.
The [EHRUT] used a 22-inch monitor set at a resolution of 1920 x 1080 with 32-bit color depth. The
application was set up by Accumedic according to the vendor's documentation describing the system
set-up and preparation. The application itself was running on a 64-bit operating system in Internet
Explorer, using a training database over a LAN connection. Technically, the system performance (i.e.,
response time) was representative of what actual users would experience in a field implementation.
Additionally, participants were not able to change any of the default system settings (such as control of
font size).
TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
1. Moderator's Guide
2. Post-test Questionnaire
3. Incentive Receipt and Acknowledgment Form

Examples of these documents can be found in Appendices 3-6 respectively. The Moderator's Guide was
devised so as to be able to capture required data.
PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full
moderator's guide in Appendix [B4]):
Thank you for participating in this study. Your input is very important. I will ask you to complete a few
tasks using an instance of an electronic health record system and answer some questions. You should
complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks
on your own following the instructions very closely. Please note that we are not testing you, we are testing
the system, therefore if you have difficulty all this means is that something needs to be improved in the
system. I will be here in case you need specific help, but I am not able to instruct you or provide help in
how to use the application.
Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to
you, and how we could improve it. I did not have any involvement in its creation, so please be honest with
your opinions. All of the information that you provide will be kept confidential and your name will not be
associated with your comments at any time. Should you feel it necessary you are able to withdraw at any
time during the testing.
Following the procedural instructions, participants were shown the EHR and as their first task, were given
time (2-3 minutes) to explore the system and make comments. Once this task was complete, the
administrator gave the following instructions:
For each task, I will read the description to you and say "Begin." At that point, please perform the task and
say "Done" once you believe you have successfully completed the task. I would like to request that you not
talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once
you are done.
Participants were then given 4 tasks to complete. Tasks are listed in the moderator's guide in Appendix
[B4].
USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health
Records, EHRs should support a process that provides a high level of usability for all users. The goal is for
users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To
this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability
testing.
The goals of the test were to assess:
1. Effectiveness of [EHRUT] by measuring participant success rates and errors
2. Efficiency of [EHRUT] by measuring the average task time and path deviations
3. Satisfaction with [EHRUT] by measuring ease of use ratings
DATA SCORING
The following table (Table 1) details how tasks were scored, errors were evaluated, and the time data were analyzed.
Table 1.
Measure | Rationale and Scoring
Effectiveness: Task Success | A task was counted as a "success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis.
Effectiveness: Task Failures | If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "failure". No task times were taken for errors.
Efficiency: Task Deviations | The participant's path through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control.
Efficiency: Task Time | Each task was timed from when the administrator said "begin" until the participant said "done". If he or she failed to say "done", the time was stopped when the participant stopped performing the task.
Satisfaction: Task Rating | The participant's subjective impression of the ease of use of the application was measured by administering post-task questions. After the test the participant was asked to rate each task on a scale of 1 (very difficult) to 5 (very easy).
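As an illustration, the scoring rules in Table 1 could be expressed in code roughly as follows. The report does not include any scoring software; the record structure and field names below are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskObservation:
    """One participant's attempt at one task (hypothetical structure)."""
    completed_correctly: bool   # correct outcome, without assistance
    seconds_taken: float
    seconds_allotted: float
    deviations: int             # wrong screens, menu items, links, controls
    post_task_rating: int       # 1 (very difficult) .. 5 (very easy)

def task_success(obs: TaskObservation) -> bool:
    """A task counts as a success only if completed correctly within the allotted time."""
    return obs.completed_correctly and obs.seconds_taken <= obs.seconds_allotted

def task_time(obs: TaskObservation) -> Optional[float]:
    """Per the scoring rules, no task time is recorded for failures."""
    return obs.seconds_taken if task_success(obs) else None

obs = TaskObservation(True, 315.0, 600.0, 0, 4)
print(task_success(obs), task_time(obs))  # True 315.0
```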
RESULTS
DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics
section above.
The usability testing results for the EHRUT are detailed below. The results should be seen in light of the objectives and goals outlined in the Study Design section. The data should yield actionable findings that, once addressed, will have a material, positive impact on user performance.
Task: Computerized Provider Order Entry (CPOE)
Medications
Record
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 5.25 | Easy | None | 4
E1 | Success | None | 5.00 | Very Easy | None | 4
H1 | Success | None | 10.50 | Easy | None | 4
M1 | Success | None | 6.10 | Easy | None | 4
K1 | Success | None | 6.00 | Easy | None | 4
Change
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.00 | Very Easy | None | 4
E1 | Success | None | 2.00 | Easy | None | 4
H1 | Success | None | 3.00 | Easy | None | 4
M1 | Success | None | 1.00 | Very Easy | None | 4
K1 | Success | None | 1.00 | Very Easy | None | 4
Access
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | .30 | Very Easy | None | 5
E1 | Success | None | .30 | Very Easy | None | 5
H1 | Success | None | .30 | Very Easy | None | 5
M1 | Success | None | .30 | Very Easy | None | 5
K1 | Success | None | .30 | Very Easy | None | 5
Laboratory and Radiology/Imaging
Record
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 3.00 | Easy | None | 4
E1 | Success | None | 5.00 | Very Easy | None | 4
H1 | Success | None | 10.50 | Easy | None | 4
M1 | Success | None | 6.10 | Easy | None | 4
K1 | Success | None | 6.00 | Easy | None | 4
Change
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.00 | Very Easy | None | 4
E1 | Success | None | 2.00 | Easy | None | 4
H1 | Success | None | 3.00 | Easy | None | 4
M1 | Success | None | 1.00 | Very Easy | None | 4
K1 | Success | None | 1.00 | Very Easy | None | 4
Access
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.00 | Very Easy | None | 4
E1 | Success | None | 1.00 | Very Easy | None | 4
H1 | Success | None | 1.00 | Very Easy | None | 4
M1 | Success | None | 1.00 | Very Easy | None | 4
K1 | Success | None | 1.00 | Very Easy | None | 4
Task: Drug-drug, Drug-allergy Interactions
Participant K1 served as an administrative user.
Trigger
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 2.20 | Easy | None | 5
E1 | Success | None | 2.40 | Easy | None | 5
H1 | Success | None | 2.30 | Easy | None | 5
M1 | Success | None | 2.20 | Easy | None | 5
K1 | Success | None | 1.50 | Very Easy | None | 5
Access
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | .30 | Very Easy | None | 5
K1 | Success | None | .30 | Very Easy | None | 5
Task: Medication List
Record
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 2.00 | Easy | None | 4
E1 | Success | None | 3.00 | Easy | None | 4
H1 | Success | None | 7.20 | Easy | None | 4
M1 | Success | None | 3.30 | Easy | None | 4
K1 | Success | None | 5.40 | Easy | None | 4
Change
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.30 | Easy | None | 4
E1 | Success | None | 2.20 | Easy | None | 4
H1 | Success | None | 5.00 | Easy | None | 4
M1 | Success | None | 1.00 | Easy | None | 4
K1 | Success | None | 4.00 | Easy | None | 4
Access
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | .40 | Very Easy | None | 5
E1 | Success | None | .40 | Very Easy | None | 5
H1 | Success | None | .40 | Very Easy | None | 5
M1 | Success | None | .30 | Very Easy | None | 5
K1 | Success | None | .30 | Very Easy | None | 5
Task: Allergy List
Record
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.20 | Easy | None | 4
E1 | Success | None | 2.10 | Easy | None | 4
H1 | Success | None | 3.10 | Easy | None | 4
M1 | Success | None | 2.00 | Easy | None | 4
K1 | Success | None | 3.10 | Easy | None | 4
Change
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.20 | Easy | None | 4
E1 | Success | None | 2.20 | Easy | None | 4
H1 | Success | None | 3.40 | Easy | None | 4
M1 | Success | None | 2.40 | Easy | None | 4
K1 | Success | None | 3.00 | Easy | None | 4
Access
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | .40 | Very Easy | None | 5
E1 | Success | None | .40 | Very Easy | None | 5
H1 | Success | None | .40 | Very Easy | None | 5
M1 | Success | None | .30 | Very Easy | None | 5
K1 | Success | None | .30 | Very Easy | None | 5
Task: Clinical Decision Support
Patient documentation was created to trigger Problem, Medication, Medication Allergy, Demographic, Lab Test and Results, and Vital Sign interventions. Participants were not required to trigger all interventions; the proctor provided scenarios in which the triggered intervention was pre-determined and expected. The proctor verified that the appropriate interventions were triggered by the participant.
Trigger
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 15.10 | Moderate | None | 3
E1 | Success | None | 25.40 | Moderate | None | 3
H1 | Success | None | 31.40 | Moderate | None | 3
M1 | Success | None | 20.30 | Moderate | None | 3
K1 | Success | None | 15.10 | Moderate | None | 3
Access/Permission
K1 served in the administrator role and turned access on and off for other roles. A1 and M1 served as users either having or not having access. The proctor evaluated K1's ability to modify permissions and verified the expected results for users M1 and A1.
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 3.00 | Moderate | None | 3
M1 | Success | None | 2.40 | Moderate | None | 3
K1 | Success | None | .45 | Very Easy | None | 5
Task: Electronic Prescribing
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 3.30 | Easy | None | 4
E1 | Success | None | 4.20 | Easy | None | 4
H1 | Success | None | 8.10 | Easy | None | 4
M1 | Success | None | 4.00 | Easy | None | 4
K1 | Success | None | 4.00 | Easy | None | 4
Task: Clinical Information Reconciliation
Part ID | Success/Failure | Task Deviation | Task Time (mins) | Rating | Errors | Risk
A1 | Success | None | 1.20 | Easy | None | 4
E1 | Success | None | 1.00 | Easy | None | 4
H1 | Success | None | 1.00 | Easy | None | 4
M1 | Success | None | 1.00 | Easy | None | 4
K1 | Success | None | 1.00 | Easy | None | 4
The risk value assigned is based on the risk of a user or software failure causing harm to the patient. No risk was graded as 0 and high risk was graded as 5. The grading system used two components. The first was how obvious the failure is to the user of the application. If the failure is obvious, the risk is decreased, as the user will identify the failure easily. If the failure is not obvious, and incorrect data or items are put in place that are not obviously incorrect, the risk is raised.
The second component was how significant the item is to overall patient care if the component does not function properly or data is entered incorrectly. If the data is used in the clinical decision-making process, the risk is increased. If the tested item assists in delivering consistent care but does not add critical data, the risk is lower.
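As an illustration only, the two-component grading could be sketched as follows. The report gives no numeric formula, so the baseline and weights below are assumptions made for this sketch.

```python
def risk_grade(failure_is_obvious: bool, critical_to_decisions: bool) -> int:
    """Illustrative 0-5 risk grade (assumed weighting, not the report's formula)."""
    grade = 2  # assumed baseline
    if not failure_is_obvious:
        grade += 2  # hidden failures are riskier: users will not catch bad data
    if critical_to_decisions:
        grade += 1  # data feeding clinical decision-making raises the stakes
    return max(0, min(5, grade))

# An obvious failure in a non-critical item grades low;
# a hidden failure in decision-critical data grades high.
print(risk_grade(True, False), risk_grade(False, True))
```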
EFFECTIVENESS
The EHRUT was very effective in revealing the gaps we currently have in the EHR application, because:
1. Knowing the time it took each participant to execute a certain task allows us to measure the complexity of the task. Our goal is to minimize complexity and offer a simple but effective solution to the users' needs.
2. The final questions section of the EHRUT gives us a good summary of what the user experienced during the session, which provides good information for us to begin improving the EHR application. Many participants mentioned better labeling and suggested possible improvement options.
3. Recording path deviations gives us insight into what users see first on the screen and what they feel should be the correct sequence of actions. This will allow us to improve our user interface. Most participants took different paths to reach the same screen for information; some took more efficient paths than others. Path deviations often result from confusion about the placement of certain links or buttons and their functionality.
EFFICIENCY
The EHRUT is quite efficient in assessing the usability of the EHR, as it focuses on the most important and common tasks a user will face on a day-to-day basis.
SATISFACTION
Most participants were quite satisfied with the EHR and its usability for the tasks assigned during the session.
MAJOR FINDINGS
Most of the essential patient information, such as vitals, documents, medications, allergies, and appointments, was easy to spot for all participants.
Every participant found the AccuMed EHR to be quite user friendly.
Those who had previous experience with other EHR systems found the AccuMed EHR to be less complex and more user friendly.
Almost every participant had no problem locating the patient lab orders, but had slight difficulty locating the lab results screen, as the link to the page was not obvious enough for first-time users.
Another area where almost every participant stumbled a bit was the "Done" button on the medications screen. It was meant to take the user back to our EHR when they were finished with the medications module (DrFirst); however, participants were confused about its functionality because the medication module had a "Continue" button to complete a prescription.
AREAS FOR IMPROVEMENT
Some areas of the system will require better labeling in order to improve clarity of the user interface.
APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of the
appendices provided:
1. Participant demographics
2. Example Moderator's Guide
3. System Usability Scale Questionnaire
4. Incentive receipt and acknowledgment form
APPENDIX 1
PARTICIPANT DEMOGRAPHICS
Gender
Men: [x]
Women: [x]
Total (participants): [x]

Occupation/Role
Technical Support Analyst: [x]
Implementation Specialist: [x]
Product Analyst: [x]
Total (participants):

Years of Experience
Electronic Health Records (≥ 10 Yrs): [x]
Electronic Health Records (≥ 5 Yrs): [x]
Electronic Health Records (> 1 Yr): [x]
Electronic Health Records (≤ 1 Yr): [x]
Total (participants): [x]
APPENDIX 2
EHRUT Moderator’s Guide
PREPARATION GUIDE
Prior to testing:
• Confirm schedule with Participants
• Ensure EHRUT lab environment is running properly
• Ensure lab and data recording equipment is running properly

Prior to each participant:
• Reset application
• Start session recordings with tool

Prior to each task:
• Reset application to starting point for next task

After each session:
• End session recordings
• Thank the participant for his/her time
ORIENTATION GUIDE
Thank you for participating in this study. Our session today will last between 40 minutes and an hour. During that time you will take a look at an electronic health record system.
I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot answer questions or help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely.
I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is [describe the state of the application, i.e., production version, early prototype, etc.]. Some of the data may not make sense as it is placeholder data.
We are recording the audio and screenshots of our session today. All of the information
that you provide will be kept confidential and your name will not be associated with your
comments at any time.
Do you have any questions or concerns?
DURING THE TEST
• Show participant the EHRUT.
• Jot down notes/comments about the way the participant is navigating the application.
• Ask the participants to rate each task on a scale: "Very Difficult" (1) to "Very Easy" (5).
APPENDIX 3
SYSTEM USABILITY SCALE QUESTIONNAIRE
In 1996, Brooke published a "low-cost usability scale that can be used for global assessments of systems usability" known as the System Usability Scale, or SUS [16]. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke's paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).
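Brooke's computation is simple enough to state here: odd-numbered (positively worded) items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute the 0-100 System Usability Scale score from ten 1-5 responses.

    Items 1, 3, 5, 7, 9 (positively worded): contribution = response - 1.
    Items 2, 4, 6, 8, 10 (negatively worded): contribution = 5 - response.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = sum(
        r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([3] * 10))  # 50.0
```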
Each statement is rated from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently | 1 2 3 4 5
2. I found the system unnecessarily complex | 1 2 3 4 5
3. I thought the system was easy to use | 1 2 3 4 5
4. I think that I would need the support of a technical person to be able to use this system | 1 2 3 4 5
5. I found the various functions in this system were well integrated | 1 2 3 4 5
6. I thought there was too much inconsistency in this system | 1 2 3 4 5
7. I would imagine that most people would learn to use this system very quickly | 1 2 3 4 5
8. I found the system very cumbersome to use | 1 2 3 4 5
9. I felt very confident using the system | 1 2 3 4 5
10. I needed to learn a lot of things before I could get going | 1 2 3 4 5
[16] Brooke, J.: SUS: A "quick and dirty" usability scale. In: Jordan, P. W., Thomas, B., Weerdmeester, B. A., McClelland (eds.) Usability Evaluation in Industry, pp. 189-194. Taylor & Francis, London, UK (1996). SUS is copyrighted to Digital Equipment Corporation, 1986.
Lewis, J. R. & Sauro, J. (2009) "The Factor Structure of the System Usability Scale." In: Proceedings of the Human Computer Interaction International Conference (HCII 2009), San Diego, CA, USA.
APPENDIX 4
NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM
Non-Disclosure Agreement
THIS AGREEMENT is entered into as of ____________, between ____________ ("the Participant") and the testing organization Accumedic Computer Systems Inc., located at 11 Grace Ave, Great Neck, NY 11021.
The Participant acknowledges his or her voluntary participation in today’s usability
study may bring the Participant into possession of Confidential Information. The term
"Confidential Information" means all technical and commercial information of a
proprietary or confidential nature which is disclosed by Accumedic Computer
Systems Inc., or otherwise acquired by the Participant, in the course of today’s study.
By way of illustration, but not limitation, Confidential Information includes trade secrets,
processes, formulae, data, know-how, products, designs, drawings, computer aided
design files and other computer files, computer software, ideas, improvements,
inventions, training methods and materials, marketing techniques, plans, strategies,
budgets, financial information, or forecasts.
Any information the Participant acquires relating to this product during this study is
confidential and proprietary to Accumedic Computer Systems Inc. and is being
disclosed solely for the purposes of the Participant’s participation in today’s usability
study. By signing this form the Participant acknowledges that s/he will not disclose this
confidential information obtained today to anyone else or any other organizations.
Participant’s printed name:
Signature:
Date:
Informed Consent
Accumedic Computer Systems Inc. would like to thank you for participating in this study.
The purpose of this study is to evaluate an electronic health records system. If you decide
to participate, you will be asked to perform several tasks using the prototype and give your
feedback. The study will last about 30 minutes.
Agreement
I understand and agree that as a voluntary participant in the present study conducted by
Accumedic Computer Systems Inc. I am free to withdraw consent or discontinue participation
at any time. I understand and agree to participate in the study conducted and recorded by the
Accumedic Computer Systems Inc.
I understand and consent to the use and release of the recording by Accumedic Computer Systems Inc. I understand that the information recorded is for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the recording and understand that it may be copied and used by Accumedic Computer Systems Inc. without further permission.
I understand and agree that the purpose of this study is to make software applications
more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside of Accumedic Computer Systems Inc. and with Accumedic Computer Systems Inc.'s clients. I understand and agree that data confidentiality is assured, because only de-identified data (i.e., identification numbers, not names) will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator.
I understand that I can leave at any time.
Please check one of the following:
 YES, I have read the above statement and agree to be a participant.
 NO, I choose not to participate in this study.
Signature:
4/7/2015
To whom it may concern,
Accumedic Computer Systems Inc. (Accumedic) utilizes home-grown QMS software and processes. Accumedic has successfully utilized its QMS process for over 30 years. The home-grown software and policies have changed in accordance with corporate policy changes, customer needs, and industry changes.
The current process, generally speaking, is as follows:
• A development issue is brought to light
o Customer reports an issue
- Through support channels
- Customer portal
- Technical Support Analyst
o Corporate decision to develop a new module or change functionality
• The issue is categorized in our system
o Product
o Module
o Type
- Customer Request (Change Request)
- Internal Feature (Change Request)
- Bug
- Government Regulation Change
• The issue is prioritized by severity or need.
• If the issue is a "bug", the reporting personnel enter the issue statement along with internal, reproducible steps. The issue is assigned to a developer. The developer performs the necessary tasks to provide a solution.
• If the issue is a program change request, the development team collectively determines the viability of the request. If accepted, it moves to an information-gathering phase, which typically results in a functional specification document that is shared with the requesting party. Upon approval of the change, the task is assigned to a developer and development begins.
• Upon completion of development tasks, development notes are completed and the issue is moved to QA.
• All issues moved to QA are put into our automated build process, which yields a new software revision.
• Each feature or remedy included in the revision is then analyzed by a QA staff member.
• The QA staff member performs testing of functionality and determines whether the expected results and functionality operate as intended. If yes, the issue is "PASSED"; if no, the feature is "FAILED", and additional information about the failure is reported and returned to the development team.
• All issues that are PASSED are collectively released to the customer base at large on a quarterly basis, or more frequently if necessary.
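The workflow above can be sketched as a small state model. This is an illustrative sketch of the described process only; Accumedic's QMS software is home-grown and proprietary, and every name below is an assumption.

```python
from enum import Enum, auto

class IssueType(Enum):
    CUSTOMER_REQUEST = auto()    # change request from a customer
    INTERNAL_FEATURE = auto()    # change request originating internally
    BUG = auto()
    REGULATION_CHANGE = auto()

class IssueState(Enum):
    REPORTED = auto()
    IN_DEVELOPMENT = auto()
    IN_QA = auto()
    PASSED = auto()              # eligible for the next quarterly release
    FAILED = auto()              # returned to the development team

# Legal transitions in the described workflow
TRANSITIONS = {
    IssueState.REPORTED: {IssueState.IN_DEVELOPMENT},
    IssueState.IN_DEVELOPMENT: {IssueState.IN_QA},
    IssueState.IN_QA: {IssueState.PASSED, IssueState.FAILED},
    IssueState.FAILED: {IssueState.IN_DEVELOPMENT},  # rework after a QA failure
    IssueState.PASSED: set(),
}

def advance(current: IssueState, target: IssueState) -> IssueState:
    """Move an issue to a new state, rejecting transitions the workflow forbids."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```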
Respectfully,
Peter Matulich, COO
Peter Matulich
Chief Operating Officer
[email protected]
516-466-6800
4/8/2015
Dear Gary Isaac,
[Include the following information based on the EHR System under Test’s functionality]
1. Are the default settings for the audit log and audit log status record enabled by default?
Yes. End users do not have access to enabling or disabling the audit functionality.
2. Is encryption of electronic health information on end-user devices enabled by default?
We prevent information from being stored locally because we disable caching in code for all our web pages.
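A common way to keep browsers from storing pages locally (not necessarily Accumedic's exact implementation, which the letter does not show) is to send standard no-store cache headers with every response; a hypothetical sketch:

```python
def no_cache_headers() -> dict:
    """HTTP response headers telling browsers and proxies not to cache a page.

    Illustrative only; the values follow standard HTTP caching semantics.
    """
    return {
        "Cache-Control": "no-store, no-cache, must-revalidate, max-age=0",
        "Pragma": "no-cache",   # for legacy HTTP/1.0 clients
        "Expires": "0",         # treat the page as already expired
    }

# e.g., merged into every response by a web-framework middleware hook
print(no_cache_headers()["Cache-Control"])
```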
3. Does the EHR SUT allow a user to disable the following?
 audit log
 audit log status
 encryption status
No, users cannot access these features.
4. Does the EHR SUT permit any users to delete electronic health information?
No, users do not have the ability to delete electronic health information.
5. Describe how the audit logs are protected from being changed, overwritten or deleted
by the EHR technology.
The audit log is stored in the database and is protected by database user controls.
6. Describe how the EHR is capable of detecting whether the audit logs have been
altered.
The audit log is not accessible to users, and the EHR implements user permissions that constrain users from unauthorized sections of the system.
Sincerely,
Peter Matulich