Draft Manuscript - Society for Academic Emergency Medicine
Research Priorities in the Utilization and Interpretation of Diagnostic Imaging:
Education, Assessment, and Competency
The ability to select and interpret diagnostic imaging is an integral skill for all emergency
medicine (EM) practitioners.(1) Competency should be achieved, assessed, and
maintained for both trainees and experienced practitioners.(2) In recent years, the
complexity of proper imaging selection has increased due to the broad mix of modalities
and the associated risks and benefits of each test. In addition to appropriate selection,
emergency physicians must be proficient in image interpretation in order to provide
rapid care for Emergency Department (ED) patients. The goal of this consensus group
was to identify priority research areas related to education, assessment, and
competency in the utilization and interpretation of diagnostic imaging within the
practice of emergency medicine.
Methods
The research priorities were developed through an iterative, consensus-driven process
using a modified nominal group technique that culminated in a breakout session on
education, assessment, and competency at the 2015 Academic Emergency Medicine
consensus conference on diagnostic imaging. A workgroup of emergency physicians,
radiologists, and physicists participated in preconference meetings and conference calls
to develop an initial set of research questions. The
group then ranked the importance of each question (5-point Likert scale, 1 = not very
important to 5 = very important). Questions with a mean score of 3 or greater were
circulated to conference registrants for further ranking and feedback using the Society
for Academic Emergency Medicine website. These responses were tabulated and
presented to the attendees of the consensus conference on May 12, 2015. During the
breakout session, we used verbal discussion and a blinded voting technique to arrive at
the final set of prioritized research questions. This article represents the results of the
preconference surveys and the breakout session at the consensus conference.
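To make the scoring rule explicit, the tabulation step can be expressed in a few lines of code; the sketch below uses hypothetical ratings and abbreviated question labels rather than the actual conference data.

```python
from statistics import mean

# Hypothetical 5-point Likert ratings (1 = not very important, 5 = very important),
# keyed by abbreviated question labels; neither reflects the actual conference data.
ratings = {
    "Imaging curriculum: critical features": [5, 4, 4, 3, 5],
    "Standardized utilization curriculum": [3, 2, 4, 3, 3],
    "Competencies for board-eligible residents": [2, 2, 3, 1, 2],
}

THRESHOLD = 3.0  # questions with a mean score of 3 or greater advanced to the next round

advanced = {q: mean(r) for q, r in ratings.items() if mean(r) >= THRESHOLD}

for question, score in sorted(advanced.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {question}")
```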
Consensus Results
Recommendation 1: Develop a diagnostic imaging curriculum for emergency medicine
residency training.
Research on diagnostic imaging in emergency medicine has focused primarily on
emergency ultrasound. For other imaging modalities, the Accreditation Council for
Graduate Medical Education (ACGME) does not define specific outcomes for
competency with respect to image utilization and interpretation in their requirements
for emergency medicine residency programs in the United States.(3) There is also no
defined curriculum or scope of practice for emergency medicine. In contrast, the
Canadian Royal College of Physicians and Surgeons does require emergency physicians
to demonstrate the ability to select appropriate diagnostic imaging (plain radiography,
ultrasound, and computed tomography [CT]) and to interpret results accurately.(4) The
only core curriculum in emergency radiology was established by the American Society of
Emergency Radiology.(5) That curriculum, however, was designed primarily for radiology
residents preparing to take after-hours “call” in a hospital, not for practicing
emergency physicians.
Questions:
1. What are the critical features of an imaging curriculum that support effective
teaching and student learning of image interpretation?
2. Should a standardized imaging utilization curriculum be developed in emergency
medicine? If so, should it be combined with imaging interpretation or be
developed and taught separately?
3. What diagnostic image interpretation competencies (e.g., specific imaging
modalities [radiograph, CT, MRI], an organ system-based list, or a disease-based
list of findings or diagnoses) could be expected of a board-eligible EM resident? For
an emergency physician’s maintenance of certification?
4. What evaluation tools are most effective for assessing the impact of an imaging
curriculum on patient outcomes?
5. As imaging technology evolves, how can the imaging curriculum materials be
designed to keep pace with these advances?
Recommendation 2: Develop, study, and validate tools to assess competency in
diagnostic imaging interpretation.
Emergency medicine residents are assessed semiannually on the milestone
competencies, which include diagnostic imaging utilization and interpretation (Figure
1).(6) To date, there is no agreed-upon means of assessing this competency. One
approach is the utilization of radiology image banks representing normal and
pathological findings for education and assessment.(7) Another approach treats
radiograph interpretation as an “interpretive” skill that requires deliberate practice and
assumes that individual learners need to interpret different numbers of images before
they reach a maximum level of competency. For example, residents reading ankle images from a
case bank will demonstrate a steady improvement in their accuracy by reading more
images until they reach a plateau in their learning.(8) The number of radiographic
images at which the plateau occurs is different for each learner, so while there is no
optimal number required to achieve expertise, a learning curve can be a useful graphical
representation of an individual learner’s skill acquisition (Figure 2). Competency,
however, is not always determined by having interpreted a certain number of images.
Standardized patient cases that include radiograph interpretation may also be a reliable
method to assess clinical skills and identify areas for improvement for the individual
learner.(9) As every resident has a unique learning curve, it has been suggested that
continuous quality assurance with individualized feedback may be necessary to maintain
proficiency, rather than simply requiring a minimum number of examinations.(10,11)
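The shape of such a learning curve can be summarized compactly. One common parameterization, offered here only as an illustrative assumption rather than the model fitted in reference 8, is a negative-exponential saturation:

\[
A(n) \;=\; A_{\max} - \left(A_{\max} - A_{0}\right) e^{-n/\tau}
\]

where \(A(n)\) is interpretation accuracy after \(n\) practice images, \(A_{0}\) is baseline accuracy, \(A_{\max}\) is the upper asymptote (Line D in Figure 2), and \(\tau\) governs how quickly the learner approaches the plateau. Learners differ chiefly in \(A_{0}\), \(A_{\max}\), and \(\tau\), which is why no single image count guarantees competency.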
Studies of image interpretation comparing emergency physicians and radiologists show
poor to good agreement between the specialties, varying by diagnostic modality. Emergency
physicians approach image interpretation differently, prioritizing a very low miss rate
for emergent diagnoses. The clinical circumstances (e.g., degree of urgency, time
sensitivity of the diagnosis) sometimes require the emergency physician to narrow the
scope of image interpretation and focus on the highest-impact and most clinically
significant findings. Furthermore, discrepancies are rarely clinically significant; reported
rates of clinically significant discrepancy range from 0.1% to 4%.(12,13)
Questions:
1. Which components of image interpretation practice confer competency
(mentoring, ratio of normal to abnormal images, mode of practice delivery, timing of
practice during different career stages, or a combination thereof)?
2. What is the learning curve for image interpretation of commonly used imaging
modalities (i.e., x-ray, CT, ultrasound)? Are there modalities that require more
practice or maintenance training to attain (and retain) proficiency? What
individual factors affect the learning curve for image interpretation?
3. What assessment tools are most effective for evaluating competency in image
interpretation for trainees and independent practitioners?
4. How does the use of various gold standards affect workplace-based assessments of competency in image interpretation?
5. To what extent does technical and diagnostic accuracy of image interpretation
improve patient outcomes?
6. How can future iterations of the EM Milestones be adapted to incorporate the
complexities of image interpretation educational objectives at the individual,
department, system, and healthcare policy levels?
Figure 1. Relevant EM milestone competencies for diagnostic imaging. (6)
Figure 2. The Thurstone Learning Curve. This curve illustrates a measure of performance
graphed against time spent learning. Point A is the amount of practice required before a
learner reaches the efficient phase of learning. The slope from A to B represents the
most efficient phase of learning. Point C is the number of repetitions required to reach a
level of performance after which learning becomes less efficient. Line D is the upper
asymptote representing maximal performance. (Reprinted with permission) (8) (will
need to get permission from the authors/publisher)
Recommendation 3: Simulation should play a significant role in education, assessment,
and competency measures for diagnostic imaging.
The Institute of Medicine’s 1999 report “To Err Is Human: Building a Safer Health
System”(14) was a wake-up call for medical practitioners. Simulation-based training
provides learning opportunities without putting patients at risk and has been used
successfully to teach laparoscopic surgery,(15) endoscopy,(16)
anesthesiology,(17) and emergency medicine.(18)
Teaching programs in various specialties have residents work on basic skills with
simulators before their time with actual patients.(19) Initial focus on basic tasks allows
more complex skills to be added over time until the learner acquires the necessary level
of expertise. Simulation also helps residents develop familiarity and confidence in
handling critical events that happen infrequently in the clinical setting.
The authors of the book “Simulation in Radiology”(20) outline a potential protocol for
simulation-based training. First, a didactic lecture highlights the procedure’s hazards
and potential risks. A multiple-choice examination tests basic knowledge of the training
topic. Next, the trainee performs the simulated procedure with expert supervision and
receives objective grading of his or her procedure skills. Finally, the trainee takes a
written posttest to assess his or her improvement in the subject knowledge and then
participates in a private interview with the person who administered the test to receive
direct feedback. The training session is repeated at selected intervals, and a logbook is
kept to record performance. In this model, the learner receives multiple opportunities
over time to develop skills until the established goals are met.
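Because this protocol depends on repeated sessions and a running logbook, a minimal record structure makes the bookkeeping concrete; the field names below are illustrative assumptions rather than part of the published protocol.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class SimulationSession:
    trainee: str
    topic: str                     # procedure or imaging scenario covered
    session_date: date
    pretest_score: float           # multiple-choice pretest, percent correct
    supervised_performance: float  # expert's objective grading of the simulated procedure
    posttest_score: float          # written posttest, percent correct
    feedback_notes: str = ""       # summary of the private feedback interview

@dataclass
class Logbook:
    trainee: str
    sessions: List[SimulationSession] = field(default_factory=list)

    def goal_met(self, target: float) -> bool:
        """True once the most recent session meets the established performance goal."""
        return bool(self.sessions) and self.sessions[-1].supervised_performance >= target
```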
It has been suggested(21) that, despite the financial resources required, simulation
offers a reproducible environment in which to assess procedural and interpretive skills.
Much of the previous research on simulation in emergency ultrasound concerns task
completion. Placing central lines and performing pelvic ultrasounds on task
trainers(22,23) are only a fraction of the competencies that can be measured in a
clinician’s training. Additional research is needed in utilizing simulation to evaluate
learner knowledge, performance, interpretation, and subsequent decision-making in
emergency diagnostic imaging.
Simulation training has also been used in diagnostic emergency radiology to ensure the
clinical skills of physicians and residents in emergency departments are sufficient to
detect the most important and life-threatening conditions. No standard curriculum
exists to prepare radiology residents to take independent call. Ganguli et al.(24) suggest
that a training curriculum should include core rotations, didactic lectures, image review,
senior-led supervision, and simulation training at computer workstations to mimic the
cases seen during emergency radiology call. This group recognized that such computer
simulations are limited because they are conducted in a protected setting, away from
the realistic distractions of consultative phone calls, pagers, and normal radiographic
findings. Furthermore, no associations were drawn between residents’ computer scores
and actual performance during radiology call.
Although simulation is a means to provide training, its true effect on radiology practice
has not been widely studied. Research efforts should explore different ways simulation
can demonstrate mastery of multiple competencies, including team
communication, diagnostic or interpretive skills, and clinical decision-making.
Priority Questions:
1. Does simulation address competency in diagnostic imaging? Through which imaging
scenarios?
2. Does meeting competency for image interpretation during simulation predict
success in image interpretation in patient care? What are the unintended consequences of
an increased focus on imaging in simulation?
3. Is simulation “success” sustained? For how long?
4. What are the learning curves for imaging utilization and interpretation for
radiographs and CT examinations?
Recommendation 4: The American College of Radiology Appropriateness Criteria
(ACR-AC) should be utilized as an evidence-based, peer-reviewed resource in
determining the use of diagnostic imaging.
Diagnostic imaging is an integral part of healthcare delivery. However, despite proven
favorable benefit-to-risk ratios in symptomatic patients(25–33), overutilization is a
major concern with increasing focus on radiation exposure, contrast-related reactions,
and the economics of healthcare.(34) Overutilization is defined as the use of imaging
procedures in cases that are less likely to result in improved patient outcomes.(35)
Some authors suggest that 20%-50% of high-tech imaging procedures may be at least in
part unnecessary.(35–37) Selection of appropriate imaging tests plays a key role in
increasing diagnostic yield, guiding clinical decisions, and affecting the value of
healthcare.
The American College of Radiology Appropriateness Criteria (ACR-AC) is an evidence-based,
web-based, peer-reviewed resource designed in 1993 to guide the use of imaging
for more than 200 clinical scenarios.(38) The ACR website lists the methods through
which the ACR-AC are derived and the American College of Emergency Physicians
contributes to the development and maintenance of these criteria.(39)
Numerous studies from the pediatric literature indicate that 20%-44% of CT scans could
be avoided if the ACR appropriateness criteria were applied or decision guidelines
followed, without compromising patient care.(38,40–42) Nearly 44% of overall radiation
dose and 39% of total fees incurred in one cohort of trauma patients could have been
avoided if ACR-AC were used.(38) Similarly, decision guidelines, such as the Canadian
Head and C-spine rules, are only useful when applied; however, clinicians may fail to
employ them.(43) Finally, strict application of NEXUS criteria could potentially reduce
the number of screening cervical spine CT scans in the setting of blunt trauma.(44)
While some radiology clerkships include educational sessions on imaging utilization(45),
only 25% of medical schools require radiology as a clinical rotation.(46) Moreover,
most second-year medical students (88%, or 21/24) in one radiology elective had
never heard of the ACR-AC.(47) Most senior medical students (96% of third- and fourth-year
students) in a single medical school were not aware of the ACR-AC as a resource.(48)
In graduate medical education, only 60% of radiology residents in a single institution
knew how to obtain the ACR-AC, and 90% were unaware of their contents.(49) Several years
later, a study showed that 74% of radiology residency program directors promote the
ACR-AC in education and that 84% of residents have read at least a few of the criteria.
Despite this high regard for the ACR-AC, about half of radiology program directors and
residents rarely or never refer to the ACR-AC when recommending studies to referring
clinicians. Most radiology resident trainees (88.6%, or 333/376) were not aware of program
requirements to use the ACR-AC in their education.(50)
Little is known about imaging utilization practices and awareness of the ACR-AC in
emergency medicine. According to a survey of academic emergency departments in the
United States, the use of contrast for CT of the abdomen/pelvis reported by a substantial
number of respondents deviated from ACR-AC recommendations.(51) The only study directly
evaluating the proficiency of emergency medicine residents in selecting appropriate
radiologic examinations failed to show significant improvement over the course of
residency training. The authors concluded that emergency medicine residency training
may be lacking in education focused on appropriate imaging utilization.(52)
However, several studies evaluating educational interventions targeting appropriate
imaging utilization have shown promise. Scheiner and Novelline showed that formal
didactic exposure during radiology clerkship can significantly enhance medical students’
ability to choose appropriate imaging procedures and that this exposure may translate
into improved cost-effectiveness and patient care.(45)
While 96% of MS3s and MS4s at Boston University School of Medicine were not aware
of the ACR-AC as a resource, 89% reported having a solid understanding of indications for
imaging tests following two didactic sessions on principles of evidence-based imaging
and/or a small-group session and self-directed learning exercise using the ACR-AC.(53)
A study of radiology residents showed resident-prepared conferences on imaging
utilization to be an effective way to teach imaging utilization guidelines to peers.(54)
Furthermore, the introduction of the ACR-AC, which were applicable to approximately
50% of MRI requests in an MRI preauthorization center, resulted in an increase in the
rate of appropriate MRIs and a decrease in inappropriate MRIs performed by
nonradiology physicians.(55)
In order to promote more widespread use of ACR-AC, a call has been made to
encourage the ACR to make the AC more “user-friendly”, specifically by developing an
alternative approach to the numeric ratings.(56) Furthermore, the AC list, which is
organized by organ systems such as cardiac and gastrointestinal, is not visually
optimized for an emergency physician who finds colorectal cancer screening and
suspected liver metastasis guidelines in the same section as those of blunt abdominal
trauma and suspected appendicitis. Instead, ACR-AC should be subdivided into
specialty-specific sections so that an emergency physician, or any medical specialist, can
quickly find what he or she is looking for without the distraction of guidelines irrelevant to
his or her practice. Collaboration with the ACR to develop specialty-specific AC sections is the
crucial next step.
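As a simple illustration of such an approach, numeric ratings could be translated into plain-language categories and filtered into a specialty-specific view; the sketch below uses hypothetical scenarios and ratings, while the 9-point scale and its category labels follow the ACR's published convention.

```python
def rating_category(rating: int) -> str:
    # The 1-9 scale and category labels follow published ACR convention; the
    # scenarios and ratings below are hypothetical and for illustration only.
    if not 1 <= rating <= 9:
        raise ValueError("ACR-AC ratings range from 1 to 9")
    if rating <= 3:
        return "Usually not appropriate"
    if rating <= 6:
        return "May be appropriate"
    return "Usually appropriate"

# Hypothetical emergency medicine subset of clinical scenarios and ratings.
em_scenarios = {
    "Blunt abdominal trauma, stable - CT abdomen/pelvis with IV contrast": 8,
    "Suspected appendicitis, adult - CT abdomen/pelvis with IV contrast": 8,
    "Minor head injury, low risk by decision rule - CT head without contrast": 2,
}

for scenario, rating in em_scenarios.items():
    print(f"{rating_category(rating):>24}  ({rating})  {scenario}")
```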
Priority Questions:
1. What are the barriers to EM physicians adopting the use of the ACR-AC?
Would greater adoption of the ACR-AC in ED settings have unintended
consequences? What is the most effective way to measure adoption of the
ACR-AC into EM practice?
2. Does use of the ACR-AC by emergency physicians improve patient outcomes?
3. How does use of the ACR-AC compare to an EM modified (or specialty
modified) version of the AC with respect to EM physician use and patient
outcomes?
Bibliography
1. 2013 Model of the Clinical Practice of Emergency Medicine [Internet]. 2013. Available from: http://www.acep.org/uploadedFiles/ACEP/Practice_Resources/policy_statements/2013%20EM%20Model%20%20Website%20Document%281%29.pdf?__taxonomyid=117952
2. Miller GE. The assessment of clinical skills/competence/performance. Acad Med J Assoc Am Med Coll. 1990 Sep;65(9 Suppl):S63–7.
3. ACGME Program Requirements for Graduate Medical Education in Emergency Medicine [Internet]. 2013. Available from: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/2013-PR-FAQPIF/110_emergency_medicine_07012013.pdf
4. Objectives of Training in the Specialty of Emergency Medicine [Internet]. 2014. Available from: http://www.royalcollege.ca/cs/groups/public/documents/document/y2vk/mdaw/~edisp/tztest3rcpsced000895.pdf
5. Novelline RA, ed. Core Curriculum in Emergency Radiology [Internet]. Available from: http://www.aseronline.org/curriculum/index.htm
6. The Emergency Medicine Milestone Project [Internet]. Accreditation Council for Graduate Medical Education and American Board of Emergency Medicine; 2012 [cited 2013 May 23]. Available from: https://www.abem.org/PUBLIC/_Rainbow/Documents/EMMilestonesMeeting4_Final1092012.pdf
7. Rifenburg RP, Ambroz KG, Chan SB. Resident interpretation of plain radiographs: a comparison of emergency medicine residents by year of training. Am J Emerg Med. 2008 Mar;26(3):366–7.
8. Pusic M, Pecaric M, Boutis K. How much practice is enough? Using learning curves to assess the deliberate practice of radiograph interpretation. Acad Med J Assoc Am Med Coll. 2011 Jun;86(6):731–6.
9. Burdick WP, Ben-David MF, Swisher L, Becher J, Magee D, McNamara R, et al. Reliability of performance-based clinical skill assessment of emergency medicine residents. Acad Emerg Med Off J Soc Acad Emerg Med. 1996 Dec;3(12):1119–23.
10. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians
using emergency ultrasonography for cholelithiasis and cholecystitis. Acad
Emerg Med Off J Soc Acad Emerg Med. 2010 Nov;17(11):1247–52.
11. Jang TB, Ruggeri W, Dyne P, Kaji AH. Learning curve of emergency physicians
using emergency bedside sonography for symptomatic first-trimester
pregnancy. J Ultrasound Med Off J Am Inst Ultrasound Med. 2010
Oct;29(10):1423–8.
12. Lufkin KC, Smith SW, Matticks CA, Brunette DD. Radiologists’ review of
radiographs interpreted confidently by emergency physicians infrequently
leads to changes in patient management. Ann Emerg Med. 1998
Feb;31(2):202–7.
13. Suh RS, Maglinte DT, Lavonas EJ, Kelvin FM. Emergency Abdominal
Radiography: Discrepancies of preliminary and final interpretation and
management relevance. Emerg Radiol. 1995;2(6):315–8.
14. Kohn JT, Corrigan JM, Donaldson MS. To Err is Human: Building a Safer Health
System [Internet]. Washington DC: Institute of Medicine; 1999 Nov p. 1–8.
Available from:
https://www.iom.edu/~/media/Files/Report%20Files/1999/To-Err-isHuman/To%20Err%20is%20Human%201999%20%20report%20brief.pdf
15. Seymour NE, Gallagher AG, Roman SA, O’Brien MK, Bansal VK, Andersen DK, et
al. Virtual reality training improves operating room performance: results of a
randomized, double-blinded study. Ann Surg. 2002 Oct;236(4):458–63;
discussion 463–4.
16. Gerson LB, Van Dam J. Technology review: the use of simulators for training in
GI endoscopy. Gastrointest Endosc. 2004 Dec;60(6):992–1001.
17. Gaba DM. Improving anesthesiologists’ performance by simulating reality.
Anesthesiology. 1992 Apr;76(4):491–4.
18. Halamek LP, Kaegi DM, Gaba DM, Sowb YA, Smith BC, Smith BE, et al. Time for a
new paradigm in pediatric medical education: teaching neonatal resuscitation
in a simulated delivery room environment. Pediatrics. 2000 Oct;106(4):E45.
19. Issenberg SB. The scope of simulation-based healthcare education. Simul
Healthc J Soc Simul Healthc. 2006;1(4):203–8.
20. Robertson HJ, Paige T, Bok R (eds). Simulation in Radiology. New York: Oxford
University Press; 2012.
21. Lewiss RE, Hoffmann B, Beaulieu Y, Phelan MB. Point-of-Care Ultrasound
Education: The Increasing Role of Simulation and Multimedia Resources. J
Ultrasound Med Off J Am Inst Ultrasound Med. 2014 Jan;33(1):27–32.
22. Evans LV, Dodge KL, Shah TD, Kaplan LJ, Siegel MD, Moore CL, et al. Simulation
training in central venous catheter insertion: improved performance in clinical
practice. Acad Med J Assoc Am Med Coll. 2010 Sep;85(9):1462–9.
23. Girzadas DV Jr, Antonis MS, Zerth H, Lambert M, Clay L, Bose S, et al. Hybrid
simulation combining a high fidelity scenario with a pelvic ultrasound task
trainer enhances the training and evaluation of endovaginal ultrasound skills.
Acad Emerg Med Off J Soc Acad Emerg Med. 2009 May;16(5):429–35.
24. Ganguli S, Camacho M, Yam C-S, Pedrosa I. Preparing first-year radiology
residents and assessing their readiness for on-call responsibilities: results over
5 years. AJR Am J Roentgenol. 2009 Feb;192(2):539–44.
25. Godoy MCB, Cayne NS, Ko JP. Endovascular repair of the thoracic aorta:
preoperative and postoperative evaluation with multidetector computed
tomography. J Thorac Imaging. 2011 Feb;26(1):63–73.
26. Wagner HN, Conti PS. Advances in medical imaging for cancer diagnosis and
treatment. Cancer. 1991 Feb 15;67(4 Suppl):1121–8.
27. Wittenberg J, Fineberg HV, Black EB, Kirkpatrick RH, Schaffer DL, Ikeda MK, et
al. Clinical efficacy of computed body tomography. AJR Am J Roentgenol. 1978
Jul;131(1):5–14.
28. Philipp MO, Kubin K, Hörmann M, Metz VM. Radiological emergency room
management with emphasis on multidetector-row CT. Eur J Radiol. 2003
Oct;48(1):2–4.
29. Furukawa A, Kanasaki S, Kono N, Wakamiya M, Tanaka T, Takahashi M, et al. CT
diagnosis of acute mesenteric ischemia from various causes. AJR Am J
Roentgenol. 2009 Feb;192(2):408–16.
30. Batlle JC, Hahn PF, Thrall JH, Lee SI. Patients imaged early during admission
demonstrate reduced length of hospital stay: a retrospective cohort study of
patients undergoing cross-sectional imaging. J Am Coll Radiol JACR. 2010
Apr;7(4):269–76.
31. Saini M, Butcher K. Advanced imaging in acute stroke management-Part I:
Computed tomographic. Neurol India. 2009 Oct;57(5):541–9.
32. Winchester DE, Wymer DC, Shifrin RY, Kraft SM, Hill JA. Responsible use of
computed tomography in the evaluation of coronary artery disease and chest
pain. Mayo Clin Proc. 2010 Apr;85(4):358–64.
33. Obuchowski NA, Graham RJ, Baker ME, Powell KA. Ten criteria for effective
screening: their application to multislice CT screening for pulmonary and
colorectal cancers. AJR Am J Roentgenol. 2001 Jun;176(6):1357–62.
34. Bautista AB, Burgos A, Nickel BJ, Yoon JJ, Tilara AA, Amorosa JK, et al. Do
clinicians use the American College of Radiology Appropriateness criteria in
the management of their patients? AJR Am J Roentgenol. 2009
Jun;192(6):1581–5.
35. Hendee WR, Becker GJ, Borgstede JP, Bosma J, Casarella WJ, Erickson BA, et al.
Addressing overutilization in medical imaging. Radiology. 2010
Oct;257(1):240–5.
36. Brenner DJ, Hall EJ. Computed tomography--an increasing source of radiation
exposure. N Engl J Med. 2007 Nov 29;357(22):2277–84.
37. Picano E. Sustainability of medical imaging. BMJ. 2004 Mar 6;328(7439):578–
80.
38. Hadley JL, Agola J, Wong P. Potential impact of the American College of
Radiology appropriateness criteria on CT for trauma. AJR Am J Roentgenol.
2006 Apr;186(4):937–42.
39. ACR Appropriateness Criteria Process for AC Methodology Subcommittee to
Review Documentation [Internet]. 2013 [cited 2015 Mar 12]. Available from:
http://www.acr.org/~/media/ACR/Documents/AppCriteria/Process%20for
%20Document%20ReviewApr%202013.pdf
40. Kuppermann N, Holmes JF, Dayan PS, Hoyle JD, Atabaki SM, Holubkov R, et al.
Identification of children at very low risk of clinically-important brain injuries
after head trauma: a prospective cohort study. Lancet. 2009 Oct
3;374(9696):1160–70.
41. Garcia Peña BM, Cook EF, Mandl KD. Selective imaging strategies for the
diagnosis of appendicitis in children. Pediatrics. 2004 Jan;113(1 Pt 1):24–8.
42. Stein SC, Fabbri A, Servadei F, Glick HA. A critical comparison of clinical
decision instruments for computed tomographic scanning in mild closed
traumatic brain injury in adolescents and adults. Ann Emerg Med. 2009
Feb;53(2):180–8.
43. Eagles D, Stiell IG, Clement CM, Brehaut J, Taljaard M, Kelly A-M, et al.
International survey of emergency physicians’ awareness and use of the
Canadian Cervical-Spine Rule and the Canadian Computed Tomography Head
Rule. Acad Emerg Med Off J Soc Acad Emerg Med. 2008 Dec;15(12):1256–61.
44. Griffith B, Kelly M, Vallee P, Slezak M, Nagarwala J, Krupp S, et al. Screening
cervical spine CT in the emergency department, Phase 2: A prospective
assessment of use. AJNR Am J Neuroradiol. 2013 Apr;34(4):899–903.
45. Scheiner JD, Novelline RA. Radiology clerkships are necessary for teaching
medical students appropriate imaging work-ups. Acad Radiol. 2000
Jan;7(1):40–5.
46. Poot JD, Hartman MS, Daffner RH. Understanding the US medical school
requirements and medical students’ attitudes about radiology rotations. Acad
Radiol. 2012 Mar;19(3):369–73.
47. Leschied JR, Knoepp US, Hoff CN, Mazza MB, Klein KA, Mullan PB, et al.
Emergency radiology elective improves second-year medical students’
perceived confidence and knowledge of appropriate imaging utilization. Acad
Radiol. 2013 Sep;20(9):1168–76.
48. Dillon JE, Slanetz PJ. Teaching evidence-based imaging in the radiology
clerkship using the ACR appropriateness criteria. Acad Radiol. 2010
Jul;17(7):912–6.
49. Logie CI, Smith SE, Nagy P. Evaluation of resident familiarity and utilization of
the ACR musculoskeletal study appropriateness criteria in the context of
medical decision support. Acad Radiol. 2010 Feb;17(2):251–4.
50. Powell DK, Silberzweig JE. The use of ACR Appropriateness Criteria: a survey of
radiology residents and program directors. Clin Imaging. 2014 Oct 22;
51. Broder JS, Hamedani AG, Liu SW, Emerman CL. Emergency department contrast
practices for abdominal/pelvic computed tomography-a national survey and
comparison with the american college of radiology appropriateness
criteria(®). J Emerg Med. 2013 Feb;44(2):423–33.
52. Dym RJ, Burns J, Taragin BH. Appropriateness of imaging studies ordered by
emergency medicine residents: results of an online survey. AJR Am J
Roentgenol. 2013 Oct;201(4):W619–25.
53. Slanetz PJ. Teaching appropriate, cost-effective care using the American College
of Radiology appropriateness criteria. Acad Med J Assoc Am Med Coll. 2011
Nov;86(11):e14.
54. Mainiero MB, Collins J, Primack SL. Effectiveness of resident-prepared
conferences in teaching imaging utilization guidelines to radiology residents.
Acad Radiol. 1999 Dec;6(12):748–51.
55. Rao VM, Levin DC, Parker L, Frangos AJ, Sunshine JH. Trends in utilization rates
of the various imaging modalities in emergency departments: nationwide
Medicare data from 2000 to 2008. J Am Coll Radiol JACR. 2011 Oct;8(10):706–
9.
56. Blackmore CC, Medina LS. Evidence-based radiology and the ACR
Appropriateness Criteria. J Am Coll Radiol JACR. 2006 Jul;3(7):505–9.