Contents
Knowledge assessments: written examinations
Extended matching questions (EMQs)
Useful weblinks
Knowledge-based and clinical skills-based assessments
Useful papers
Writing assessment questions
Assessment of knowledge
Knowledge assessments: Written examinations
There are a number of practical question formats for investigating the depth and breadth of understanding in a given discipline:
for example, short written answers, longer essay types, and question ‘stems’ with a selection of answers from a number of options
(i.e. multiple choice questions or MCQs). It is therefore worth exploring how different question constructs suit the nature of the
knowledge being tested and its synthesis. It is rare that a single question format can fulfil the requirements of testing both
breadth and depth in a curriculum. It is therefore advisable to use more than one test format to achieve a balanced approach
to assessing the various dimensions of knowledge acquisition – their configuration, utility and benefits are discussed
throughout Chapter 2.
This section of the website brings together a range of resources about written assessments. It includes links to online tests, online
presentations and reference papers, together with helpful tips and example questions.
Extended matching questions (EMQs)
Abnormal Illness Behaviour
1. Alcohol abuse
2. Anxiety disorder
3. Depressive psychosis
4. Drug abuse
5. Factitious disorder
6. Malignancy
7. Malingering
8. Munchausen syndrome
9. Psychopathic personality disorder
10. Somatisation
For each of the following scenarios select the most likely cause from the above list. Use each option only once.
A – Multiple consultations with GP, a heavy smoker and a macrocytosis
B – A patient with recurrent abdominal pains and a history of three laparotomies who presents to casualty wanting further
surgery
C – Multiple and different physical complaints in a young woman whose father died when she was 10 years old
D – A middle-aged man, recently unemployed which has caused financial difficulties, who complains of palpitations
E – A woman who claims she is not worthy of medical help
Useful weblinks
Knowledge-based and clinical skills-based assessments
This website is an excellent resource with a range of free assessments that are not exclusively developed for surgeons.
• Multiple choice questions (MCQs) covering a range of clinical topics. Each question can be given a level of difficulty ranging
from student to experienced trainee.
• Extended Matching Questions over a range of topics and different levels of complexity.
• OSCE scenarios are in development and there is a range of useful links to books and tutorials.
http://www.themastersurgeon.com/
Companion website material for How to Assess Students and Trainees in Medicine and Health, First Edition. Edited by Olwyn M. R. Westwood,
Ann Griffin, and Frank C. Hay.
© 2013 John Wiley & Sons, Ltd. Published 2013 by John Wiley & Sons, Ltd.
A British Medical Journal resource:
This website is designed primarily for medical students wishing to practise for finals examinations. It is supported by BMJ
learning and contains a host of valuable resources that can be subscribed to.
http://www.onexamination.com/students/medical-student-finals?utm_source=Bing&utm_medium=cpc&utm_
term=medical%20final&utm_campaign=Medical-Students-Final
Useful papers
Writing assessment questions
Constructing EMQ:
This is a paper describing how extended matching questions can be written and gives some examples from the author’s speciality, which is psychiatry.
George, S. (2003). Extended matching items (EMIs): solving the conundrum. Psychiatric Bulletin; 27: 230–2.
http://pb.rcpsych.org/content/27/6/230.full.pdf+html
An e-book by Case & Swanson:
This is a detailed and comprehensive guide for item writing and is a recognised source of expert guidance. It covers examples
from basic medical sciences as well as from clinical practice and talks the reader through a step-by-step process to writing high
quality single-best answers.
Case, S, & Swanson, D. (2002). Constructing Written Test Questions For Basic and Clinical Sciences. Third Edition. Philadelphia:
NBME (an e-book)
http://www.nbme.org/pdf/itemwriting_2003/2003iwgwhole.pdf
Case, S, & Swanson, D. Recommendations for better practice in item writing
http://www.worldscientific.com/doi/pdf/10.1142/9781848162624_bmatter
Presentation by Dustin Krutsinger:
Basic guidance on writing MCQs
http://prezi.com/pbsjkfewdm-x/exam-writing/
Prezi PowerPoint presentations:
Assessment Methods by Erik Langenau:
http://prezi.com/2yaaylyyslfl/assessment-methods/
Assessment methods written with osteopaths in mind but covering the range of assessment methodologies in healthcare
Miller’s Pyramid: assessing clinical competence by Erik Langenau:
http://prezi.com/givu2cndyfep/millers-pyramid-assessing-clinical-competence/?utm_source=website&utm_medium=
prezi_landing_related&utm_campaign=prezi_landing_related_author
Assessment of knowledge
Fischer, M, Kopp, V, Holzer, M, Ruderich, F, Jünger, J. (2005). A modified electronic key feature examination for undergraduate
medical students: validation threats and opportunities. Med Teach; 27(5):450–5.
http://informahealthcare.com/doi/pdfplus/10.1080/01421590500078471
Fournier JP, Demeester A, Charlin, B. (2008). Script concordance tests: guidelines for construction. BMC Medical Informatics
and Decision Making; 8: 18.
http://www.biomedcentral.com/content/pdf/1472-6947-8-18.pdf
Gagnon, R, Charlin, B, Coletti, M, Sauvé, E, van der Vleuten, C. (2005). Assessment in the context of uncertainty: how many
members are needed on the panel of reference of a script concordance test? Med Educ; 39(3):284–91.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2005.02092.x/pdf
Mackillop L, Parker-Swift J, Crossley J. (2011). Getting the questions right: non-compound questions are more reliable than
compound questions on matched multi-source feedback instruments. Med Educ; 45: 843–8.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2011.03996.x/pdf
McCoubrie, P. (2004). Improving the fairness of multiple-choice questions: a literature review. Med Teach; 26(8): 709–12.
http://www.drcog.co.uk/MCQ%20fulltext.pdf
Muijtjens, AMM, van Mameren, H, Hoogenboom, RJI, Evers, JLH and van der Vleuten, CPM. (1999). The effect of a ‘don’t
know’ option on test scores: number-right and formula scoring compared. Med Educ; 33: 267–75.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.1999.00292.x/pdf
Palmer, EJ, Devitt, PG. (2007). Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple
choice questions? Research paper. BMC Med Educ; 7: 49.
http://www.biomedcentral.com/content/pdf/1472-6920-7-49.pdf
Palmer, EJ, Duggan, P, Devitt, P, Russell, R. (2010). The modified essay question: Its exit from the exit examination? Med Teach;
32: e300–7.
http://www.cpass.umontreal.ca/documents/Recherche/TCS_articles/Palmer%20E-MEQ%20exam-SCT-2010.pdf
Schuwirth LW, van der Vleuten CP. (2003). ABC of learning and teaching in medicine: Written assessment. Br Med J; 326: 643–5.
http://www.son.washington.edu/faculty/preceptors/docs/written-assessment.pdf
Schuwirth LW, van der Vleuten CP. (2004). Different written assessment methods: what can be said about their strengths and
weaknesses? Med Educ; 38(9): 974–9.
http://harvardmacy.org/Upload/pdf/Schuwirth%20article.pdf
Tigelaar, D, Dolmans, D, Wolfhagen, I, & van der Vleuten, C. (2004). Using a conceptual framework and the opinions of portfolio
experts to develop a teaching portfolio prototype. Studies in Educational Evaluation; 30: 305–21.
http://eder603.wikispaces.com/file/view/using+a+conceptual+framework+and+the+opinions+of+portfolio+experats+to+
develop+a+teaching+portfolio+prototype.pdf
Contents
Observations
Clinical competence
The objective structured clinical examination (OSCE)
Checklist for writing OSCE stations
Examples of OSCE mark sheets
Setting up an OSCE
Preparation, set-up and running of a short-station OSCE:
what we do and what we have learned not to do
Useful papers cited: assessing competence
Papers cited and weblinks: structured clinical examinations
Workplace-based assessments
Forms used for workplace-based assessments
Direct observation of procedures
Case-based discussion
Papers and weblinks: workplace-based assessment tools
Simulation
Stages to planning a simulation episode
The learning environment for simulation sessions
Using actors as simulated patients
Making the simulation realistic
Making a scenario authentic to students and trainees
Responsibilities of a facilitator
Debriefing
Papers and weblinks for simulation
Observations
Clinical competence
Competence assessments are designed to test what a professional is able to do in clinical practice, while performance assessments
are used to test what they actually do in their clinical practice. Competence assessments are generally used in high-stakes assessments such as finals examinations or postgraduate assessment to gain membership of a Royal College. They are summative and
convened at an allocated time in a contained environment. Accordingly, candidates are judged on their performance of the
assigned task at a specific time.
The resources in this section are supplementary to Chapters 3 and 4 in the book.
Assessment of performance may happen as:
• Objective structured clinical examinations (OSCEs)
• Workplace-based assessment in practice
• Simulation events.
The objective structured clinical examination (OSCE)
An OSCE comprises a circuit of short (usually 5–15 minutes) stations, in which each candidate is examined on a one-to-one
basis by one or two assessors. Each station has a different assessor or pair of assessors and candidates move around sequentially
to complete all stations on the OSCE circuit in a fixed time (Figure 1). At each station the candidate is asked to perform a
specific task and the assessors mark the candidate using a structured mark sheet. Each station usually tests a combination of
abilities, for example communication skills and clinical method, and can involve real patients, simulated patients, manikins or
specific equipment, a video recording or interpretation of a radiological image.
Figure 1 A typical OSCE circuit: six numbered stations arranged in a loop, around which candidates rotate in sequence.
Checklist for writing OSCE stations
Here are some questions that you need to ask yourself when constructing valid and reliable OSCE stations for the assessment
of clinical communication and skills competences:
• What do I want the station to test?
• Who are the students in the programme who will be taking the test?
A similar station may be used in Year 2 and in the final year, but with greater complexity in the final year.
• Does the content as written demonstrate bias that might be offensive?
For example: are you making assumptions about the patient’s sexual orientation if you refer to ‘wife’ rather than ‘partner’?
• How long should this station be? Is it feasible for a student to complete the station tasks in the allotted time frame?
• Who should I ask to peer review the station content for accuracy, validity and reliability?
• Who should I use as an examiner for the station?
Competence of examiner that ensures reliability?
Will it stand up to legal/public scrutiny – this is particularly significant for high-stakes examinations?
• What resources are needed to implement the station? This needs to be in a list that accompanies the OSCE examining
materials:
Equipment?
Simulated patient – script to act the part, age, gender?
It is expedient to carry out additional checks and gain feedback on the station in practice
• Ensure there are no discrepancies between the instructions to the:
Candidate
Examiner
Simulated patient.
• Have feedback forms for the examiner and the simulated patient to ascertain how the station performed in practice.
• Adjust the content as appropriate.
Examples of OSCE mark sheets
http://www.scribd.com/doc/92517024/OSCE-Mark-Sheet-2
OSCE Home website
This website is one that students might have seen and is a guide to understanding OSCEs and their use to test competence in
a clinical encounter with a patient.
http://www.oscehome.com/OSCEs_Examiner_Checklist.html
The following page shows an example of an OSCE station and mark sheet.
STATION
STUDENT INSTRUCTIONS
This patient has had non-insulin dependent diabetes for 7 years and has come for their annual check-up with you, their GP. Please
examine their feet.
Tell the examiner what you are doing and why, as you go along. Explain the importance of foot care to the patient.
ASSESSOR’S MARK SHEET: OSCE STATION – DIABETIC FOOT
When giving marks for the explanation, give half of the total available for knowledge, and half for clarity of explanation to the
patient (communication skills). Marks available for each item are shown in brackets.

Introduces themselves to the patient (1)
Checks consent has been given for examination

General inspection:
Looks all over the foot, explains they are looking for infection or damage (1)
Compares feet (1)
Checks colour, warmth, hair loss (1)
Checks between toes for fungal infection (1)
Checks toe nails for length, fungal infection, state of skin (1)
Explanation, along the lines of: good foot care can help prevent complications such as ulcers; foot care is important to patients
with diabetes because patients frequently suffer with skin infections, or damage their feet without being aware of this (2)

Vascular:
Palpates for pedal pulses:
Dorsalis pedis, left and right (1)
Posterior tibial, left and right (1)
Checks capillary return, left and right (1)
Checks proximally for popliteal/femoral pulse (1)
Explanation: macrovascular complications of diabetes; poor circulation increases risk of infection etc. (1)

Neurological:
Checks vibration sense, left and right (1)
Uses tuning fork correctly (0.5)
Places tuning fork on bony prominence (0.5)
Gives patient clear instructions re ‘feeling the buzz’ rather than just feeling the tuning fork (0.5)
Moves proximally until patient can feel vibration (0.5)
Tests for ankle jerk correctly, left and right (2)
Explanation: knows that loss of vibration sense is an early sign of neuropathy; patients with neuropathy need to take extra care
with their feet as they may not be aware of damage (2)

TOTAL: 20
Setting up an OSCE
James A. McCoy, M.D. and Hollis W. Merrick, M.D., on behalf of the Committee on Testing and Evaluation of the Association for
Surgical Education, have made available a comprehensive guide to setting up OSCEs:
• Development of stations
• Organising the set-up and troubleshooting the event
• Training simulated patients
• Training the examiners.
See: http://www.facmed.unam.mx/sem/pdf/ObjectiveStructuredClinicalExam.pdf
Dr Kathy Boursicot and Professor Trudie Roberts have written a good guiding overview of how to set up OSCEs:
Boursicot, K, Roberts, T. (2005). How to set up an OSCE. Clin Teach; 2(1): 16–20.
http://rlillo.educsalud.cl/Capac_Docente_BecadosAPS/Evaluacion/How%20to%20set%20up%20an%20Osce.pdf
Preparation, set-up and running of a short station OSCE: What we do and what we have learned not to do
This is a practical guide to all aspects of running an OSCE examination and is an accompaniment to Chapter 3 of the
book.
By the UCL Clinical Skills Team:
(Michael Klingenberg RGN, MSc (Educ), FHEA; Deirdre Wallace BSc (Hons), RN, MA Clin Ed; Tina Nyazika BSc (Hons), RN;
Nicola Mathastein RGN, BSc (Hons), MA Clin Ed; Richard Say RN, PgDip; Catherine Phillips BA (Hons), RN, MA Clin Ed,
FHEA; Victoria Edwards BA (Hons), MA)
The clinical skills team at UCL Medical School has been running short station OSCEs for over 10 years. The skills team collaborates with a number of groups in the medical school to ensure that the event runs smoothly.
• The underlying aim for clinical skills tutors is to ensure that the performance of students is not affected by irregularities or
unanticipated problems.
This could include anything from an unclearly written candidate instruction to an examiner leaving the circuit to get a
coffee.
In a heightened state of anxiety, even the smallest disturbance can significantly affect the performance of a candidate.
From the refining of questions and the building of a patient database, to the examiner and candidate feedback received from
previous exams: it could be said that years of work contribute to the successful running of each short station OSCE. However,
this web resource focuses on the few months leading up to and the days of the assessment event.
It shares our approach to the preparation, setting up and running of this multi-faceted exam. The stages of preparation have
been broken down into four time frames:
• The months leading up to the OSCE
• The weeks leading up to the OSCE
• The days leading up to the OSCE
• The day of the OSCE.


With months to go ...
First planning meeting
Six weeks prior to the exams, meetings are organised with all people who will be involved with running the OSCE on the day.
This includes:
• The academic lead: usually an academic fellow at UCL Medical School.
• Clinical skills tutors.
• Administrative staff: play a pivotal role before, during and after the exam.
• Porters and cleaners: need to be made aware of the added work load.
This meeting is an important opportunity to go through the various stations and discuss the practicalities of preparing for, and
running, the exam.
For example, have enough of the following been recruited to run the OSCE:
• Actors
• Patients
• Helpers and examiners?
What is the progress on the writing of questions and when are they expected to be completed?
During the meeting, the academic lead and skills tutors ensure that they are clear about what each station involves.
If a station appears unworkable, this needs to be addressed as early as possible.
The preliminary meeting is also an important opportunity for all members of the team to align expectations of responsibilities
and assign roles for the preparation and running of the OSCE.
Helpers
Helpers play a key role in the running of UCL Medical School exams and are relatively easy to recruit if this is organised well
in advance. While helpers can be used to assist setting up and taking down an exam, their assistance during the exam is essential
for the smooth running of an OSCE. Among many other things, helpers are used to:
• Direct students around the circuit.
• Act as patients in OSCE stations which do not require professional acting skills (such as an examination of a joint).
• Time the circuit.
• Change linen.
• Offer tea and coffee to examiners and candidates.
The number of helpers required depends on the type of exam stations that are used in any specific OSCE, but a good rule of
thumb is the number of helpers who simulate patients plus three (for example, an OSCE with four helper-simulated patients would
need about seven helpers). We advise avoiding over-recruitment, as helpers can get bored and lose the necessary attention if they
are without a purpose for too long.
It is generally quite easy to find adequate numbers of helpers. Often our exam period coincides with times when pupils ask for
work experience opportunities. For young people considering a career in medicine, being involved in the setting up and running
of an examination can be an invaluable experience. They will see what lies ahead of them in terms of assessment and the experience positions them well for any future application process. Helpers also get an opportunity to discuss their potential future
career with senior clinicians and other experienced teaching staff. Other areas to recruit helpers from are friends, family and
affiliated nursing schools.
There are a few things to consider when recruiting helpers:
Payment: We do not pay work experience students. However, all other helpers are paid around £50 in vouchers, which is money
well spent for a reliable friend or family member – particularly given that an OSCE day runs from 7.30am to 6.30pm.
Reliability: While work experience students tend to be excellent, many of them have not worked before and some may not
appreciate the importance of punctuality, appropriate dress and other things that are second nature to school leavers. The
importance of the exam must be stressed to all helpers. Ideally, work experience students will help with the OSCE as part of a
larger work experience block. This avoids the need to orientate the student to the centre on exam day.
Confidentiality: In line with stressing the magnitude of this exam, all helpers should be very clear about the need for confidentiality. For example, they should be reminded not to discuss the exam on their way home.
Finally, it may also be necessary to book a number of healthcare assistants should your OSCE require the help of a larger number
of patients. This should be organised well in advance of the OSCE.
Consumables
All consumables required for the OSCE are ordered well in advance as delivery may be delayed or there may be shortages. This
includes equipment such as:
• General equipment: stationery, hand rub and blue roll.
• Specific equipment: such as cannulae or other single-use items needed for stations.
• Refreshments: milk, biscuits, coffee, tea and juice.
• Catering: lunch for examiners, staff and helpers.
A station list with required items per station is a useful tool to help decide what is needed (see the station list below).
Station list: requirements per station
For each of the 16 stations the list sets out whether a mannequin, patient, actor, helper and examiner are needed, the number of
couches, tables and chairs, and the specific equipment to lay out. The station titles listed are: Change in bowel habit (GI);
Referral letter, linked to ECG; Ethics; Cannulation of a nervous patient; Speech assessment; Prescribing based on PEFR; BP and
documenting obs; TATT; Knee exam; Daily Mail; Mental test score; Explaining a procedure (#); DRABCDE with BLS; Explaining
high/low platelets; Motor system exam. Specific equipment listed against the stations includes: handrub; patellar hammer; ‘Fred’
with oropharyngeal airways 3, 4, 5, nasopharyngeal airways 6, 7, 8, O2 reservoir bag and BVM; paper, pens and pencils; a body in
shorts/gown; sphygmomanometer, stethoscope and observation chart, with a laminated sheet of temperature, pulse and respiration
readings to be given to the student for documenting on the obs chart; BNF, peak flow meter, mouthpieces and mock-up drug
charts; an arm with name band and a tray with site-specific cannulae, bungs, dressings, sterets, 10 ml syringe with NaCl, tourniquet,
gloves, sharps bin and bin (equipment from all three sites to be available); an armband on a manikin, ECG electrode stickers, a bin
and two laminated ECG recordings; and a referral proforma, pens and a box.
Anticipating problems
It is important to spend time with others identifying any potential problems that could occur on the day of the OSCE. For
example, at one of the UCLMS sites, the fire alarms are tested every Tuesday. A re-scheduling of these alarms is always coordinated with the facilities manager weeks in advance.
With weeks to go ...
Volunteer patients
Recruitment of ‘real’ patients is arguably the trickiest aspect of preparing for an OSCE. Indeed, building and maintaining this
database is a considerable task that often takes years. It requires a significant amount of organisation and networking skills.
Clinicians need to be contacted on several occasions throughout the year to help recruit willing patients from their clinics and
patients have to be invited on the basis of their signs and symptoms.
Patients generally have agreed to participate months before the date of an OSCE. In the weeks leading up to the exam, arrangements for transport and payment need to be made. We use a voucher system to thank volunteer patients for their help. Volunteer
patients all have chronic illnesses and it is not uncommon for an exacerbation of their illness to preclude them from participation. Thus, it is essential to have regular contact with all volunteer patients in the two weeks leading up to the exams.
Finalisation of documentation
With two weeks to go, all documentation must be finalised. This includes:
• Mark sheets
• Candidate instructions
• Actor instructions
• Examiner instructions.
All instructions are proof read and standardised. This is especially important for candidate instructions as different fonts and
layouts may increase the time a student needs to take in the information required to start a station.
Laundry
The OSCEs are run in a clinical skills centre and bed linen is required above usual needs. To avoid overburdening the laundry
department with a sudden, ad hoc request we book linen from the laundry of the affiliated hospital.
With days to go ...
The stations are usually set up two or three days prior to the OSCE; this gives the faculty ample opportunity to troubleshoot,
modify the set up and check for any errors. If the OSCE is being duplicated on other sites, this time can be used to ensure that
all the stations look identical.
Preparation of the OSCE rooms
When planning the set up, it is important to take into consideration the following points:
The layout of the facility: How many rooms will you require for the OSCE and how will it flow through the facility in a circuit?
Space: You will require adequate spacing between each station to ensure fluidity of the exam. Each station should be easy for
the candidate to enter and quickly exit.
Space dividers: To divide the space, a mixture of sound boards and hospital screens are generally used; if a couch is needed for
an examination station it is common practice to use a hospital screen.
Hospital beds/couches: For examinations, students are taught to approach the patient from the right, therefore beds and
couches should be positioned accordingly.
Chairs and tables: Position the chairs and tables in each station for ease of access, ensuring that the chair on the outside is
reserved for the candidate thus making entry and exit simple. At UCL we usually sit the examiner to the left of the candidate
and the simulated patient to the right.
Briefing rooms: One for examiners and one for candidates.
Directions: Ensure it is easy for both candidates and examiners to find the facility by printing out signs and putting them up
in the appropriate places. You may also need directions within the OSCE to guarantee that the candidates follow the circuit
correctly.
Corridors: The corridors must be free of furniture and clutter so that the flow of the circuit is not obstructed in any way.
Numbers: All stations need to be numbered, with numbers placed in a position that is clear to the candidates; station numbers
therefore need to be visible from all positions within a room.
Hand gel: All stations should include a bottle of hand gel to encourage regular hand washing.
Power supply: Some stations require a power source for a laptop or for use of clinical equipment; early preparation and station
order planning will ensure that these stations are close to a power supply.
Equipment
Once the framework of the OSCE is complete, the fine-tuning begins. Place all relevant equipment in each station, ensuring
that it is all in good working order and easily accessible for each candidate.
Clearly label boxes of consumables if necessary and think about what other items in the station may require labelling (for
example, mock syringes full of 1mg of Lignocaine for suturing stations).
Make sure there are plenty of waste bins in the area so that there is not a rubbish build-up. It is important to note that if you
are duplicating this OSCE on other sites all the equipment needs to be identical across the board.
For written stations there needs to be a good supply of paper, pens/pencils and erasers.
Stations and instructions
Each station will have a set of instructions that need to be printed out before the initial walk around. These instructions will
include:
• Candidate instructions
• Examiner instructions
• Simulated patient instructions.
It is worth printing out at least two sets of each for back up purposes and also worth noting that these instructions may be
changed at the last minute if mistakes are identified.
Note: One set of candidate instructions should be placed on the outside of the station for the candidate to read before entering
and another on the inside for the candidate’s reference during the station.
• Putting these instructions in clear plastic wallets will keep them presentable for the day and attaching them to the table should
ensure that the candidate does not walk off with them.
• Both the examiner and the simulated patient should have a clipboard with their instructions attached. The examiner will also
have mark sheets and at least two pencils to mark the station; stationery needs to be easily accessible throughout
the OSCE.
You will also need a bell or an intercom system to run the OSCE along with a timing sheet to keep track of the circuit.
Refreshments
There will need to be a good supply of water for the OSCE, particularly for candidates on the circuit.
Positioning refreshments, such as tea and coffee, in a central location for simulated patients and examiners should minimise
delays during circuit breaks.
The walk around
Now that the OSCE is almost complete the members of the organising faculty must walk around the circuit with a copy of the
mark scheme for each station. This walk around will identify any errors such as typos, equipment issues and set up issues. The
faculty will also ensure that the station makes sense as a whole and that all the relevant components work together to make it
an understandable and viable station. Time must be taken to walk through each station as a candidate would to ensure it is
easily understood and workable. If there are any mistakes, these must be communicated to the other sites immediately to ensure
alignment.
On the day ...
Final walk around
The completed OSCE circuit should once again be checked by at least two members of the organising faculty for any last changes
that may have been overlooked. Despite careful checks in the weeks and days prior to the exam, it is not uncommon to pick up
inconsistencies on the morning of the exam.
If technology is being used on any of the stations (such as laptops depicting x-rays), this should be switched on and configured
to ensure that it does not time out whilst any candidates are in situ (for example energy saving settings may need to be changed).
This final walk around should also double as a general tidy-up, since overnight cleaning may have displaced screens
with numbers, candidate information and any equipment or furniture within stations.
Student checks and briefing
The students should have received information regarding their OSCE covering the site address, the room and person to report to,
what to bring with them, their candidate number, start station and the start time of the OSCE. This information must be sent at
least two weeks before the exam by the lead administrator. The students are expected to arrive at their briefing at
least 40 minutes before their OSCE start time, enabling them to be briefed and all the checks needed to prepare them for their
OSCE to be completed.
These student checks include:
Identification: Students should have their university photo identification to show to a designated member of staff. In the event
that this has been misplaced a passport or driver’s licence with photo ID is acceptable.
Candidate number: This is a number that will be included on a sticker label or badge that will be handed to the student to
visibly place on their person as they go round the OSCE. Other details on this label will include the student’s start station and
may also contain the circuit number. To avoid mishaps these labels are handed out one student at a time, checked against the
School register; they detail the candidate’s name, number and start station. The student should cross-reference the candidate number
they are given with the administrative details that were emailed to them to ensure the numbers correspond.
Disclaimer: This should be included in the administrative email sent to the student with all the details concerning the whereabouts, start time, candidate number and start station for the exam. The disclaimer should contain elements regarding cheating
and non-disclosure to other students they may meet after their examination. This form should be signed in the presence of,
and handed in to, the designated member of staff.
Hand-ins: If there are any items to be handed in this should be done before the start of the OSCE. The items may include log
books, portfolios or elements that the student may have been requested to bring to show before they can be permitted to partake
in the exam.
Hand-outs: These will include any cards that need to be handed out to those students who may have difficulties or impairments
(learning or otherwise), such as dyslexia, blindness, deafness, or even plasters on limbs reducing dexterity, to present to examiners as students go round the OSCE.
OSCE briefing
The OSCE briefing should include the following information:
Welcome and title of OSCE
There have been instances where students have found themselves at the wrong exam.
Choreography of the OSCE
• The number of stations being assessed.
• The number of rooms the OSCE is taking place in.
• Length of stations (usually 5 minutes in short station OSCEs) and identifying any longer than standard stations (usually 10
minutes in short station OSCEs).
• Type of stations being assessed, for example practical, prescribing, medical emergencies.
• Advice on following the OSCE in numerical order and on the presence of helpers should guidance be required.
It should be noted here that students must be informed that their start station is stated on their badge, and any early starters
must also be identified at this point. Early starters need some explanation.
Early starts are required in two situations, to avoid a ‘student jam’ at one or more stations. Two stations may be linked, that is, a
student has to complete one station first to be able to do the next (for example, if one station requires assessment of a patient
and the next station a structured handover to a senior clinician). In this case one student will have to start five minutes before the
main exam on the first of the two linked stations (if there are multiple linked stations, multiple early starts may be required).
If your OSCE consists of predominantly 5-minute stations but you would like to include a more intricate station, such as an ethics
and law station or catheterization with integrated communication skills, you may need to incorporate 10-minute stations. For
this to be possible you need to set up two identical stations for every 10-minute station you are running.
Label them station X-a, and station X-b. Let one student start 5 minutes before the main exam on station X-a. Another student
will start with the main exam on station X-b. This will ensure that only one student will leave station X every 5 minutes to go
to the following station.
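The staggering can be sanity-checked with a few lines of code. The sketch below is a minimal, hypothetical model in Python (the timings simply mirror the 5- and 10-minute figures described above; it is not part of any real exam plan): it lists the times at which the X-a/X-b pair releases a candidate and confirms that exactly one candidate leaves every 5 minutes.

# Minimal sketch (hypothetical timings): a 10-minute station duplicated as
# X-a and X-b, with X-a given a 5-minute early start, releases exactly one
# candidate at every 5-minute change bell.

STATION_LENGTH = 10   # minutes a candidate spends at the double station
SLOT = 5              # standard short-station length / change interval
BELLS = 8             # number of change bells to check

def release_times(first_start):
    """Minutes (relative to the main exam start) at which one copy of the
    station releases a candidate; the next candidate enters immediately."""
    times, start = [], first_start
    while start + STATION_LENGTH <= BELLS * SLOT:
        times.append(start + STATION_LENGTH)
        start += STATION_LENGTH
    return times

# X-a starts 5 minutes before the main exam (the early starter);
# X-b starts with the main exam.
releases = sorted(release_times(-SLOT) + release_times(0))
print("Candidates leave the X-a/X-b pair at minutes:", releases)
assert releases == [SLOT * i for i in range(1, BELLS + 1)]  # one per change bell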
Student tasks
• Only stethoscopes to be taken round (or any other proposed items).
• All other equipment is provided.
• Place all identification into their bags, none to be worn except for badge or sticker.
• All mobile phones to be switched off and placed in their bags.
Student tasks during the exam
• Ensure examiners check the candidate number on their badge corresponds with the mark sheet (this is important to avoid
the completion of the wrong mark sheet).
• Read and follow instructions at each station.
• Instructions are provided outside and inside the stations.
• Listen out for 1 minute remaining warnings as signposts.
• Wash hands between stations.
• Change stations when told to do so.
Dress code for all
• Professional dress code adhered to.
• Short sleeves or sleeves rolled up.
• Long hair tied back.
• No watches worn – bare below the elbow.
• Rings: one single metal band only.
• Earrings: small studs only.
• Necklaces: a simple chain only.
• No bracelets or charity wrist bands.
• Ties – according to local policy – no fraternity group ties.
‘House’ rules for students
• Be polite to all patients and examiners.
• Do not leave the station until instructed.
• Do exactly what the instructions tell you.
• Don’t panic.
• Do not discuss the examination with colleagues who are still to sit it.
• Examiners have been told not to write ‘cause for concern’ forms in front of you, so do not panic if an examiner is writing.
Setting the scene for the candidates
• Throughout the briefing, a calming mood should be created.
• Prior to the OSCE, students tend to be very anxious and require simple, clear instructions.
• Students must be informed that if an incident occurs this should be communicated to the site lead before leaving. (These
complaints can only be made at the time, so that they can be investigated, not once students have left the building or in
hindsight.)
• The students should be reminded that the organisation’s desire is for them to pass – and lastly, wish them all good luck!
Examiner briefing
This should be done by the site lead or lead clinician attached to the site of the exam. The site lead must remain surplus to requirements, so as to be available for any queries or troubleshooting that may arise once the circuit begins, and should therefore not
be examining a station. The examiners should arrive at least 40 minutes before the start of the exam for exam orientation. This
includes:
Background and choreography of the exam
• What is being tested, for example the candidates’ core clinical skills.
• Length of circuit and how many circuits they are expected to examine and if they are changing stations between circuits or
staying put.
• Number of total stations.
• Length of time of stations, if there are any linked stations, longer stations (double stations) or any early starters.
Examiner tasks
• Assess a station for the allocated number of circuits.
• Please put students at their ease.
• Confirm the student has read the task.
• Check the student/candidate number corresponds to mark sheet.
• Observe the activity.
• Complete the mark sheet.
• Ensure students remain in station until told to move on.
Examiner conduct
• Do not leave the station whilst the student is present.
• Please keep your mobile on silent – do not use it during the examination.
• Do not confirm the diagnosis to the student or give them feedback.
• Please interact with the student only where directed; this is not a viva or a chat.
• If a student asks questions related to the station, please re-direct them to the instructions for the station.
Form filling
• The site lead must explain the differences between the marking scheme grades, for example Clear Pass/Pass/Borderline/
Fail/Clear Fail, to enable examiners to mark correctly.
• Examiners must use clinical judgement or refer to the examiner instruction sheet for guidance about the borderline grading.
• The global score at the bottom determines the pass mark. It is not meant to be an ‘average’ of the individual marks – examiners
must use clinical judgement (see the standard-setting sketch below).
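The guidance above does not name a particular standard-setting method, but one common way of turning the examiners’ global ratings into a station pass mark is the borderline group method: the pass mark is the mean checklist score of the candidates whose global rating was ‘Borderline’. The following Python sketch assumes that method and uses invented example data; it is an illustration, not the scheme prescribed by any particular medical school.

# Minimal sketch of the borderline group method of standard setting.
# The choice of method and the data below are illustrative assumptions,
# consistent with "the global score determines the pass mark".

def borderline_group_pass_mark(results):
    """results: (checklist_score, global_rating) pairs for one station.
    Pass mark = mean checklist score of candidates rated 'Borderline'."""
    borderline = [score for score, rating in results if rating == "Borderline"]
    if not borderline:
        raise ValueError("No borderline candidates; set the standard another way.")
    return sum(borderline) / len(borderline)

# Hypothetical results for a 20-mark station.
station_results = [
    (18.0, "Clear Pass"),
    (15.5, "Pass"),
    (13.5, "Borderline"),
    (12.0, "Borderline"),
    (11.0, "Borderline"),
    (8.0, "Fail"),
    (5.5, "Clear Fail"),
]

print(f"Station pass mark: {borderline_group_pass_mark(station_results):.1f} / 20")
# -> Station pass mark: 12.2 / 20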
Mark sheets
• Must use pencil, eraser – not pen.
• Draw a line in the box [----] – do not circle.
• Check all parts of each score sheet have been filled in.
• The overall judgement is used to set the pass mark.
• Forms must be in order of student presentation – check candidate number.
Judgements
• Has the candidate done the task described to the required standard?
• Do you have any serious concerns? If so complete a ‘Cause for Concern’ form.
See also examples of OSCE Session Briefings and OSCE Examiners’ Guide:
http://www.cetl.org.uk/learning/index.php
Cause for concern
These should be colour-coded for ease of access during the OSCE and provide the examiner with the opportunity to express
any ‘cause for concern’ regarding:
• Attitude
• Dress
• Professionalism
• Dangerous practice – for example, sharps disposal
• Poor communication skills
• They are NOT for poor knowledge or performance – this should be reflected in the scoring for the station.
Feedback on the OSCE station
• An evaluation form with a few questions on:
What went well, what could have been improved, and any other comments.
The time constraints, for example was the station time too short, too long or appropriate.


End of exam
• Leave all paperwork at the station.
• Fill in an evaluation form about the station.
Note: Both sets of briefings should include fire alarm identification and actions to be taken.
Bell or timing sheet
• The person doing the timing is encouraged to tick the relevant time blocks accurately, so that everyone can see where the
OSCE is up to in case someone else needs to take over (especially in an emergency).
• Regarding equipment, there should be at least two or three timers or stop watches – one is a backup.
OSCE Bell Sheet (17 stations, including an early start and a 10-minute station)
The sheet sets out, for every station slot around the circuit (stations 1–15, with the 10-minute station split into 15a/b, plus the
early-start slots), the same repeated block of timings for the person making the calls to tick off:
Reading time begins: 30 seconds to read
After 30 seconds: Begin/Start
After 4 mins: One minute remaining
After 5 mins: Change station
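Because every standard slot follows the same fixed sequence of calls, a bell sheet of this kind can also be generated rather than written out by hand. The short Python sketch below is illustrative only (the number of slots and the wording of the calls are assumptions, not the UCL sheet); it prints the announcement times for the first few 5-minute slots.

# Minimal sketch of a bell/timing-sheet generator for 5-minute short stations.
# Each slot follows the sequence described above: 30 seconds' reading, start,
# a one-minute-remaining warning at 4 minutes, and change station at 5 minutes.

from datetime import timedelta

SLOT = timedelta(minutes=5)
CALLS = [  # (offset within the slot, announcement)
    (timedelta(seconds=0), "Reading time begins (30 seconds to read)"),
    (timedelta(seconds=30), "Begin / start"),
    (timedelta(minutes=4), "One minute remaining"),
    (timedelta(minutes=5), "Change station"),
]

def bell_sheet(n_slots):
    """Yield (time into the circuit, announcement) pairs for n_slots slots."""
    for slot in range(n_slots):
        for offset, text in CALLS:
            yield slot * SLOT + offset, f"Slot {slot + 1}: {text}"

for when, call in bell_sheet(n_slots=3):  # print the first three slots as a check
    minutes, seconds = divmod(int(when.total_seconds()), 60)
    print(f"{minutes:02d}:{seconds:02d}  {call}")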
Intercom system
An intercom system that can be heard in each OSCE room is desirable to make sure the OSCE runs smoothly and all students
get their timings or warnings simultaneously. This is normally the responsibility of the timekeeper. However, if there is no
system available then in every room being used there should be at least one helper designated the task of shouting the ‘1 minute
remaining’ warnings and ‘change station’ calls.
Troubleshooting emergencies: OSCE
How to deal with missed 1 minute remaining calls
As soon as it is discovered that the call has been missed, announce the 1 minute remaining whilst starting the backup timer,
allow a full minute to pass for consistency, and then give the change station call.
If the call was forgotten completely and the timer beeps, the person timing should announce that as the 1 minute remaining call and once again set the backup timer for a full minute, followed by the change station call.
Students missing a station
This normally presents as two students at the same station: if the stations the students should be in are identified quickly enough,
then continue with the exam; however, a helper needs to take the 3rd timer to the student who needs a full 5 minutes, allowing them
to start their station and giving them personalised warnings.
The rest of the candidates continue with the main timer, but at the end bell of the main timer the person controlling the intercom system should ask all students to stop their station and also to remain in their stations until further instructions are given.
(The 3rd timer will inform the student to continue with their station and ignore the stop request.) These further instructions
will come when the student who has lost time catches up and their end bell signals the change station request.
If this cannot be sorted out with minimal disruption as above, the alternative is to advise the student that they will be able to
undertake the missed station after the main exam. This will require a timer to be present whilst the student is being examined
to give the necessary instructions (‘start’, ‘one minute remaining’, and ‘change stations’ or ‘end of exam’).
It is not advisable to stop the whole OSCE at any point as this can cause widespread unrest/panic for candidates for the remainder of their OSCE.
Students who become ill during the OSCE
This needs to be discussed with the clinical skills lead, lead clinician and/or site lead and agreed with members of the organising
team present at the site.
If this is before the start of a circuit the student could be invited to present themselves to a later circuit. This could be later on
that day; or if it is a two-day OSCE it may be on a circuit on the following day and could be at a different site.
If there is only one circuit of the day left that they can be allocated to, and it has a full quota of candidates, a rest station may
need to be added to the circuit to accommodate the student.
In all these situations the candidate’s mark sheets would have to be removed from the examiners’ mark sheets folders at the
stations. If this is not possible, examiners must be informed of the name and their assigned candidate number that will be
missing to make certain they mark the correct paperwork for subsequent students.
If the illness occurs during the OSCE, the student may have to sit out the remainder of their stations and be looked after in the
centre. If they are able to continue they will have to be slotted into the next circuit but kept separate from the other students, only joining
them before the circuit begins, as an early starter at the station they would have gone to next before the episode of illness. A rest
station may have to be included in the circuit.
If the student is unable to continue, or there is no other circuit on that site or a different site into which they can be slotted,
then the case is referred to the board of examiners and medical school. If there are no scheduled re-sits, the student may have
to repeat the year and take the next year’s OSCE.
Overall, if there are any queries the site leads or clinical skills leads should be approached.
Epilogue
Of course the work does not end with the last ‘end of exam’ call. The equipment has to be taken down (it is advisable to do
this on the last day of your OSCE, while all the helpers are still present), confidential documents (such as candidate instructions)
need to be filed or shredded, and the marking process needs to be completed.
The clinical skills team always finds it helpful to have a debriefing session. A few weeks after the exam, an official debrief meeting
with feedback from external examiners and students will mark the beginning of a new cycle.
Final note: Other uses of the OSCE format
Admissions process: may also benefit from using an OSCE format
Selecting the right students for medical school
A presentation by C McManus:
http://www.ucl.ac.uk/medicalschool/postgraduate/events/mededconference280611/cmcmanus
Some Schools have used OSCEs as part of the admissions process
See the paper by Professor Kevin Eva et al.:
Eva KW, Rosenfeld J, Reiter HI, Norman GR. (2004). An admissions OSCE: the multiple mini-interview. Med Educ; 38(3):314–26.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2004.01776.x/pdf
Useful papers cited: assessing competence
Boursicot, K, Etheridge, L, Setna, Z, Sturrock, A, Ker, J, Smee, S, Sambandam, E. (2011). Performance in assessment: Consensus
statement and recommendations from the Ottawa conference 2011. Med Teach; 33(5): 370–83.
http://informahealthcare.com/doi/pdf/10.3109/0142159X.2011.565831
Crossley, J, Humphris G, Jolly B. (2002). Assessing health professionals, Med Educ; 36: 800–4.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2002.01294.x/pdf
De Champlain, AF. (2004). Ensuring that the competent are truly competent: an overview of common methods and procedures
used to set standards on high-stakes examinations. JVME; 31(1): 62–6.
http://medicina.udd.cl/ode/files/2010/07/DeChamplain_3105.pdf
Sales, D, Sturrock, A, Boursicot, K, Dacre, J. (2010). Blueprinting for clinical performance deficiencies – lessons and principles
from the General Medical Council’s fitness to practise procedures. Med Teach; 32: e111–e114.
http://informahealthcare.com/doi/pdf/10.3109/01421590903386781
Van der Vleuten, CPM, Schuwirth, LWT. (2005). Assessing professional competence: from methods to programmes. Med
Educ; 39: 309–17.
https://abp.org/abpwebsite/r3p/pre-read/vanderVleutenAssessProgrammes.pdf
Van der Vleuten CPM. (1996). The assessment of professional competence: developments, research and practical implications.
Adv Health Sci Educ; 1: 41–67.
http://link.springer.com/article/10.1007%2FBF00596229#page-1
Wass, V, Van der Vleuten, C, Shatzer, J, Jones R. (2001). Assessment of clinical competence, The Lancet; 357: 945–9.
http://acmd615.pbworks.com/f/Wass.pdf
Papers cited and weblinks: Structured clinical examinations
Cookson J, Crossley J, Fagan G, McKendree J, Mohsen A. (2011). A final clinical examination using a sequential design to improve
cost-effectiveness. Med Educ; 45 (7):741–7.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2010.03926.x/pdf
Cox K. (2000). Examining and recording clinical performance: a critique and some recommendations. Education for Health;13
(1): 45–52.
http://educationforhealth.net/EfHArticleArchive/1357-6283_v13n1s6_713664876.pdf
Gleeson F. (1997). Assessment of clinical competence using the objective structured long examination record (OSLER). Med
Teach; 19: 7–14.
www.medev.ac.uk/static/uploads/resources/amee.../AMEE9.doc
Hatala R, Marr S, Cuncic C, Bacchus CM. (2011). Modification of an OSCE format to enhance patient continuity in a high-stakes assessment of clinical performance. BMC Med Educ; 24 (11): 23.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3121725/pdf/1472-6920-11-23.pdf
Hodges, B., Herold McIlroy, J. (2003). Analytic global OSCE ratings are sensitive to level of training. Med Educ; 37: 1012–1016.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2003.01674.x/pdf
Hodges, B, Regehr, G, McNaughton, N, Tiberius, R, Hanson, M. (1999). OSCE checklists do not capture increasing levels of
expertise. Academic Medicine; 74: 1129–1134.
Newble, D. (2004). Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ; 38:
199–203.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2004.01755.x/pdf
Norcini, J. (2001). The validity of long cases. Med Educ; 35(8), 720–1.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2001.01006.x/pdf
Norcini, J.J. (2002). The death of the long case? Br Med J; 324: 408–9.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC65539/pdf/408.pdf
Wass, V, Jones, R, Van der Vleuten, C. (2001). Standardised or real patients to test clinical competence? The long case revisited.
Med Educ; 35: 321–5.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2001.00928.x/pdf
Workplace-based assessments
Workplace-based assessments are a test of what a practitioner actually does in clinical practice, and are a step on from the assessment
of competence in an OSCE, where the practitioner demonstrates what they can do – that is, the completion of a task to an
agreed standard, typically in staged in vitro settings (van der Vleuten, 1996).
Dr Gavin Johnson’s presentation describes workplace-based assessments and is an accompaniment to Chapter 4:
http://www.ucl.ac.uk/medicalschool/postgraduate/events/mededconference11062012/GJ-WPBA.pdf
Forms used for workplace-based assessments
The Joint Royal College of Physicians Training Board (JRCPTB) website has a wealth of information on these assessments,
together with examples of the assessment forms and guidance on using them to mark and give feedback to students and
trainees:
See: http://www.jrcptb.org.uk/assessment/Pages/WPBA-Documents.aspx
General guidance can be found in the JRCPTB on workplace-based assessment and includes the assessment forms and guidance
for assessors on:
• Acute Care Assessment Tool (ACAT) for core medical training
• Case-based discussions (CbD)
• Direct Observation of Procedures (DOPS)
• E-portfolios
• Evaluation forms for teaching and presentations
• Mini-Clinical Evaluation Exercise (Mini-CEX)
• Multisource Feedback (MSF)
• Patient survey
The Sheffield Peer Review Assessment tool (SPRAT) can be found at:
http://www.yorksandhumberdeanery.nhs.uk/paediatrics/documents/ESPRATForm.pdf
Direct observation of procedures
Seeing a DOPS in action is much clearer, so here are some YouTube examples:
https://www.youtube.com/watch?v=RWkpJ-K78XI
https://www.youtube.com/watch?v=hLbY-PjytmY
https://www.youtube.com/watch?v=3_DKx6EoYVo
Case-based discussion
Seeing case-based discussions in action is much clearer, so here are some YouTube examples:
https://www.youtube.com/watch?v=X3zVbmaPCis
https://www.youtube.com/watch?v=vVAfjR754XM
https://www.youtube.com/watch?v=mhTpBOV2kFU
Papers and weblinks: Workplace-based assessment tools
Archer, J, Norcini, J, Davies, HA. (2005) Use of SPRAT for peer review of paediatricians in training. Br Med J; 330(7502):
1251–3
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC558096/pdf/bmj33001251.pdf
Boulet, JR, Swanson, DB. Psychometric challenges of using simulations for high-stakes assessments. In: Simulation in Critical Care
and Beyond; pp. 119–30.
http://www.famecourse.org/pdf/bouletandswanson.pdf
Cook DA, Dupras DM, Beckman TJ, Thomas KG, Pankratz VS. (2009) Effect of rater training on reliability and accuracy of
miniCEX scores: a randomised, controlled trial. J Gen Int. Med; 24(1):74–9
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2607488/pdf/11606_2008_Article_842.pdf
Eraut, M. (1996). Non-formal learning and tacit knowledge in professional work. Br J Educ Psych; 70: 113–36.
http://onlinelibrary.wiley.com/doi/10.1348/000709900158001/pdf
Hassan, S (2001) Faculty development: Mini-CEX as workplace-based assessment. Education in Medicine Journal 3: (1)
e12–e21
http://saifulbahri.com/ejournal/eimj/2011/vol3issue1/e12-e21.pdf
Hays RB, Davies HA et al. (2002) Selecting performance assessment methods for experienced physicians. Med Educ; 36: 910–17
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2002.01307.x/pdf
Johnson GJ, Booth, J, Crossley, J, Wade W (2011) Assessing trainees in the workplace: results of a pilot study. Clinical Medicine;
11(1): 48–53
http://www.rcplondon.ac.uk/sites/default/files/clinical-medicine-11-1-pp48-53_0.pdf
Norcini JJ, Blank LL, Duffy FD et al. (2003). The mini-CEX: a method for assessing clinical skills. Ann Intern Med; 123: 795–9.
http://annals.org/article.aspx?articleid=716176
van der Vleuten CPM (1996) The assessment of professional competence: developments, research and practical implications.
Adv Health Sci Edu; 1(1): 41–67
http://link.springer.com/article/10.1007%2FBF00596229?LI=true#page-1
van der Vleuten CPM, Dolmans DHJM, Scherpbier AJJA (2000). The need for evidence in education. Med Teach; 22(3): 246–50
http://www.fdg.unimaas.nl/educ/cees/CV/Publications/2000/The%20need%20for%20evidence%20in%20education.PDF
Wilkinson JR, Crossley JGM, Wragg A et al. (2008) Implementing workplace-based assessment across the medical specialties in
the United Kingdom. Med Educ; 42: 364–73
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2008.03010.x/pdf
Simulation
Simulation is the reproduction of a real-world encounter which attempts to provide legitimate contextualisation for
assessment.
Simulation can be:
• Role play by simulated patients in an OSCE. Here simulation is harnessed to test history-taking skills or communication
skills.
• Use of manikins to support both training and assessment, for example breast examination or resuscitation skills.
• Use of computerised software and haptics, simulating the clinical setting, which can also be used to assess multidisciplinary cooperation, teamwork and leadership.
Simulation, particularly high fidelity simulated practice, has been introduced to teach and test in settings which do not compromise patient safety, and where errors and their correction can form part of the educational event. These sessions are an invaluable preliminary training ground, where scenarios and systems can be adjusted as part of the education process. These methods have proven so effective that 'high-tech' simulation training suites have been developed. However, such sessions require considerable financial resources for the equipment, technical expertise for their maintenance, and careful planning to deliver authentic learning sessions.
Stages to planning a simulation episode
The guiding principles in planning a simulation session are to identify the context of the learning experience and then the
resources for its delivery, which include the following.
Suppliers of simulator equipment
A number of companies supply simulators of varying fidelities; these include (among many others):
• Laerdal: http://www.laerdal.com/gb/
• CAE Healthcare: https://caehealthcare.com/home/#
• Gaumard Scientific: www.gaumard.com
• Limbs and things: http://limbsandthings.com/uk/home/
• Adam Rouilly: http://www.adam-rouilly.co.uk/
Fidelity of simulator
Simulators are supplied with a variety of fidelities.
1. Part task simulators are useful for teaching and assessing the psychomotor domain of a skill; remember that using part task
trainers alone may not enable students to contextualise and transfer their learning to the clinical practice setting.
Some examples include:
• Cannulation arms
• Nasogastric tube simulators
• Nursing / basic care manikins
• Resusci Anne’s.
2. Medium fidelity simulation involves more realism but without automatic cues, such as the rise of the chest to simulate
breathing. This type of simulation may involve the use of manikins or actors trained to demonstrate a condition.
Some examples include:
• Vital sims
Low and medium fidelity simulations are the most cost-effective, and usually focus on tasks and discrete situations. Most simulation learning takes place at these levels, using low-technology equipment.
3. High fidelity simulation provides the most realistic experience, primarily using computer-based manikins, and may use cadavers
or animal tissue. These techniques are needed for situations that cannot be replicated safely using living patients or lower fidelity
manikins. They are used to teach advanced clinical skills such as surgery and anaesthetics.
Some examples include:
• Noel obstetric trainer
• METiman / iSTAN
• SimMan
• Laparoscopic surgery simulators.
Contact each of the companies and they will be able to apprise you of the types of simulators available, for example adult,
neonate and child.
The learning environment for simulation sessions
• To ensure that your students fully engage with the learning activity, it is important that you create as authentic a learning
environment as possible. There is clear evidence that students immerse themselves in the learning experience more readily
when it is more realistic; this may include visual, auditory and olfactory cues.
• Do remember to risk assess all activities and notify security, especially if you are outside, as this avoids anxious security guards
thinking that an adverse incident has occurred.
• Check with your Health and Safety advisors for any specific risk assessments that may need to be completed.
• Advise students of the clothing they need to wear for simulation activities – again this should replicate the expectations of
practice and for outside activities may require the wearing of high visibility clothing and other Personal Protective Equipment
(PPE).
Using actors as simulated patients
Simulated patients (SPs) are role players who are used to train professionals effectively in communication and diagnostic skills. Medical
schools and deaneries throughout the UK and internationally use SPs and provide training for them so that the approach
to students is consistent (Nestel et al., 2011). The Association of Standardized Patient Educators has a wealth of resources that you can access
to enhance your simulation activities for your students. See: http://www.aspeducators.org/.
A simulated patient allows healthcare professionals and students to:
• Ensure the information asked of a patient is correct for making an accurate diagnosis.
• Make patients feel comfortable about talking about difficult issues.
• Explore the best way to break bad news.
• Make individuals from British ethnic minority groups feel comfortable communicating about health issues, outside of their
culture/religion.
• Deliver the best customer experience to patients and service users.
Simulated patients are drawn from a wide range of backgrounds. They may have been a patient or carer in the past, they may
be a professional actor or simply a person interested in making a difference to the care that service users receive.
Making the simulation realistic
What is Moulage?
Moulage (French: casting/moulding) is the art of applying mock injuries for the purpose of training Emergency Response Teams
and other medical and military personnel.
Moulage may be as simple as applying pre-made rubber or latex ‘wounds’ to a healthy ‘patient’s’ limbs, chest, or head; or as
complex as using complicated makeup and theatre techniques to provide elements of realism (such as blood, vomitus, open
fractures) to the training simulation. The practice dates to at least the Renaissance, when wax figures were utilised for this
purpose. It is also now common to use Moulage in order to train military personnel for shock desensitisation.
Should I be using Moulage in medical simulation?
Moulage is a useful accessory that can help the student look for the physical signs that support the stated diagnosis and
recognise new findings, and it supports the scenario's learning objectives.
When used appropriately in simulation, Moulage supports knowledge and performance in the following ways:
• Improved response time
• Enhanced evaluation clues
• Support for critical thinking
• Added realism
• Engagement of all the senses
• Suspension of disbelief.
Scenarios are designed to work with the physiology of the simulators; the addition of Moulage paints the sensory picture –
providing the remaining clues that enable educators to help students transfer their learning between a clinical case and a simulated scenario (Stephens and Jones, 2012).
All of the major simulator companies have spaces on their websites where simulation facilitators exchange recipes for Moulages
and scenarios. Some companies provide online and face to face training in simulation Moulage.
For example, Traumafx is a UK company that specialises in, and is the UK’s leading provider of, realistic casualty simulation.
See:
www.traumafx.co.uk/training-courses
http://www.trauma-sim.com/index.php/en/
There are various simple techniques that can be used to mimic clinical conditions,
for example:
• Rice Krispies in a sealed plastic bag placed under the simulator's skin can simulate surgical emphysema.
• A cotton wool ball placed under the simulator’s skin over the carotid pulse can be used to simulate enlarged neck veins.
For olfactory cues, smell boxes can be purchased from companies such as http://www.daleair.com/.
Making a scenario authentic to students and trainees
Clothing that makes the scenario more authentic is vital. Here are some examples:
• If you are doing a trauma scenario, cut the relevant item of clothing and then sew or stick Velcro (the sticky side) to
either side of the cut. Purchase some felt of the same or a similar colour to the clothing; the felt can then be replaced between
scenarios without having to purchase or find more clothing.
• A good source of clothing may be charity shops, or asking your colleagues for cast-offs. Remember the clothes need to be stretchy,
as simulators' arms and legs do not have the same range of movement as real humans, and it is often useful if the clothes are
a size larger than the simulator.
Documentation
To aid the authenticity of the practice experience you will need patient documentation. A good starting point is a set of notes
for each simulation scenario. These should be based on your local healthcare organisation’s proforma – very often they are
willing to share this but may require the details to be anonymised.
Here is a list of paperwork you may require in a set of case notes:
• Correspondence
• Consent forms
• Investigation/results
• ECG cardiac
• Ophthalmology
• Charts/nursing records
• Therapies
• Care pathways
• Supervision register
• Clinical notes.
Other documentation you may require is:
• Fluid chart
• Observation chart
• Risk assessments
• Pathology reports
• Radiological images.
Adjunct equipment
Regardless of the scenario you are using you will need to create a list of equipment including the paperwork required for the
simulation activity. It is a good idea if your students have had the opportunity to study the paperwork prior to the scenario
running. Loading information on the students’ Virtual learning environment as pre-scenario learning is very useful, but beware
not all students will have accessed this.
You may like to use the following as the basis for organising what you need.
Class Title:                            Year of Study:
Time Frame (# of hours):                Number of Students:

Learning Objectives
1.
2.
3.
4.
5.
6.

Setting for Scenarios
Adult ED / Peds ED / Adult ICU / Paediatric ICU / Adult Med/Surg / Paediatrics / OR / PACU / Perinatal / Neonatal / Transport / Pre-Hospital / Other:

Simulator(s) Needed
HPS / PediaSim / BabySim / ExamSim / SimMan / SimBaby / AirMan / Prompt Trainer

Equipment Needed*
Anaesthesia Machine/Cart / Crash Cart / Monophasic Defibrillator / Biphasic Defibrillator / IV Pump(s) / Arterial line setup / Central line setup / PA Catheter setup / ICP line setup / EtCO2 setup / EMR computer / IV start arm adult/paediatric / Adult intubation trainer / Paediatric intubation trainer / Neonatal intubation trainer / Cricothyrotomy trainer / Chest tube setup

Moulage required
Wounds – type:
Wound dressing: yes/no
Clothing/Wigs:
Make up: cyanosis / sweat / oedema / jaundice
Responsibilities of a facilitator
Carefully selected facilitators are essential for a successful training programme. An ideal facilitator should be a practising nurse,
midwife or physician who is competent and confident in identifying and managing pre-eclampsia and eclampsia, and who is also:
• Trained in competency-based training and participatory learning methods.
• Trained in conducting clinical training programmes.
• Able to use learning principles for an effective clinical training programme.
• Able to provide care for women with pre-eclampsia and eclampsia according to the checklist.
• Competent in care for women with pre-eclampsia and eclampsia.
Before the training session begins
Facilitators should meet before training activities begin to discuss and assign the following administrative responsibilities:
I. Assign facilitation of teaching sessions, demonstrations, return demonstrations, and clinical simulations. (Each facilitator
will be responsible for ensuring that all needed resources, equipment, supplies, and medications are available for any sessions assigned to him/her.)
II. Set the classroom up in a way that ensures interactive learning.
III. Purchase flipcharts, markers, pens/pencils, notebooks, etc.
IV. Read the Reference Manual thoroughly to be sure that it is in agreement with current policies and practice guidelines in
your country. The manual is based on globally accepted, evidence-based information that countries should strive to adopt
in their guidelines. However, if this has not yet occurred for your setting, revisions may need to be made.
V. Review the Facilitator’s Guide for other preparation details.
VI. Make a copy for each facilitator of the:
• Facilitator’s Guide
• Reference Manual.
VII. Make a copy for each participant of the:
• Participant’s Notebook
• Pre- and mid-course questionnaire forms (in the Facilitator’s Guide)
• Reference Manual.
Before each session
• Read the content of each session thoroughly.
• Review any learning activities (case studies, role plays, etc.) and skill learning checklists for the session.
• Review the materials and resources needed for the session and make sure they are available.
• Review the suggested lesson plan, learning objectives, and PowerPoint presentation for the session. The lesson plan builds on
the knowledge from the suggested reading in the module. Use those parts of the lesson plan that are relevant to your participants’ learning needs. This will depend on the experience, skill and knowledge level of the participants and how much time
is available.
• Plan how much time to devote to each learning activity; lesson plans are included for your guidance.
After each session
• Review what parts of the session went well and what parts require revision.
• Revise lesson plans, learning activities and PowerPoint presentations as needed.
• Investigate any topics that were brought up during the session that you were not able to adequately respond to.
Debriefing
What is debriefing?
‘Debriefing allows the student to critically think through the lived experience.’ (Sanford, 2010)
‘. . . the process whereby faculty and students re-examine the clinical encounter, fosters the development of clinical reasoning
and judgment skills through reflective learning processes.’ (Dreifuerst, 2009)
Stages of debriefing
Johnson (2004) suggests that there are four stages to the debriefing process:
• Introduction
• Personal reaction (psychological component)
• Discussion of events (What happened?)
• Summary (Synthesis of knowledge, meaning making).
Papers and weblinks for simulation
Papers
Brown, R, Rasmussen, R, Baldwin, I, & Wyeth, P (2012) Design and implementation of a virtual world training simulation of
ICU first hour handover processes. Australian Critical Care; 25(3): 178–87
http://ac.els-cdn.com/S1036731412000434/1-s2.0-S1036731412000434-main.pdf?_tid=792bb966-828e-11e2-a4b7-00000aacb3
62&acdnat=1362156100_0398f813263f21434303ddae43c779b4
Edler, AA, Fanning, RG, Chen, MI, Claure, R, Almazan, D, Struyk, B, Seiden, SC (2009) Patient simulation: a literary synthesis
of assessment tools in anesthesiology. Journal of Educational Evaluation for Health Professions 6(3).doi 10.3352/jeehp.2009.6.3
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2796725/pdf/jeehp-6-3.pdf
Elliott L, De Cristofaro C, Carpenter A (2012) Blending technology in teaching advanced health assessment in a family nurse
practitioner program: using personal digital assistants in a simulation laboratory. J Am Acad Nurse Pract; 24(9): 536–43
http://onlinelibrary.wiley.com/doi/10.1111/j.1745-7599.2012.00728.x/pdf
Hamstra, SJ (2012) Keynote address: the focus on competencies and individual learner assessment as emerging themes in
medical education research. Acad Emerg Med; 19(12):1336–43
http://onlinelibrary.wiley.com/doi/10.1111/acem.12021/pdf
McGaghie W, Issenberg B, Petrusa R, Scalese E (2010) A critical review of simulation-based medical education research:
2003–2009. Med Educ; 44: 50–63
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2009.03547.x/pdf
Nestel D, Tabak D, Tierney T, Layat-Burn C, Robb A, Clark A, Morrison T, Jones N, Ellis R, Smith C, McNaughton N, Knickle
K, Higham J Kneebone R (2011) Key challenges in simulated patient programs: An international comparative case study BMC
Med Educ; 11:69
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3189900/pdf/1472-6920-11-69.pdf
Patey, R, Flin, R, Cuthbertson, RH, MacDonald, L, Mearns, K, Cleland, J, Williams D (2007) Patient safety: helping medical
students understand error in healthcare Qual Saf Health Care; 16(4): 256–9. doi: 10.1136/qshc.2006.021014
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2464940/pdf/256.pdf
National Patient Safety Agency (2010) Patient safety and simulation: using learning from national review of serious incidents. London,
NPSA
http://www.nrls.npsa.nhs.uk/resources/type/guidance/?entryid45=74297
Rystedt H, Sjöblom B (2012) Realism, authenticity, and learning in healthcare simulations: rules of relevance and irrelevance
as interactive achievements. Instr Sci; 40:785–98
http://download.springer.com/static/pdf/777/art%253A10.1007%252Fs11251-012-9213-x.pdf?auth66=1363430771_2d291f2fe
9f88311aac125739da981f9&ext=.pdf
Stirling K, Hogg G, Ker J, Anderson F, Hanslip J, Byrne D (2012) Using simulation to support doctors in difficulty. Clin Teach;
9(5):285–9
http://onlinelibrary.wiley.com/doi/10.1111/j.1743-498X.2012.00541.x/pdf
Watson K, Wright A, Morris N, McMeeken J, Rivett D, Blackstock F, Jones A, Haines T, O’Connor V, Watson G, Peterson R, Jull
G (2012) Can simulation replace part of clinical time? Two parallel randomised controlled trials. Med Educ; 46(7):657–67
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2012.04295.x/pdf
Wilson, MS, Middlebrook, A, Sutton, C, Stone, R, and McCloy, RF (1997) MIST VR: a virtual reality trainer for laparoscopic
surgery assesses performance. Ann R Coll Surg Engl; 79(6): 403–404. PMCID: PMC2502952
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2502952/pdf/annrcse01610-0019.pdf
Wiseman A, Horton K (2011) Developing clinical scenarios from a European perspective: Successes and challenges. Nurse Educ
Today; 31 (7): 677–681
http://ac.els-cdn.com/S026069171100013X/1-s2.0-S026069171100013X-main.pdf?_tid=fca1e77e-8721-11e2-aff2-00000
aab0f26&acdnat=1362659261_a69385772730aabdbe38864182d35d5e
Additional weblinks
Association of American Medical Colleges Medical Simulation in Medical Education
https://www.aamc.org/download/259760/data/medicalsimulationinmedicaleducationanaamcsurvey.pdf
Guidance on debriefing with video interaction can be accessed at the following website:
http://simulation.londondeanery.ac.uk/educational-resources/salift-the-foundations-for-positive-debriefing
World Health Organisation WHO Patient Safety Curriculum Guide for Medical Schools. Geneva: World Health Organisation;
2009
http://whqlibdoc.who.int/publications/2009/9789241598316_eng.pdf
Contents
Professionalism
Papers cited and weblinks: assessing professionalism
Professionalism
Professionalism has become a central concept in medical and healthcare education. While there is no single definition, it is
important for educators because the discourse of professionalism, that is the way in which professionalism is understood and
described by the profession or learning organisation, will greatly affect what it is believed should be assessed, in what way and
for what purpose. This area is closely linked with Chapter 7 on Feedback, in particular the multi-source feedback used in
healthcare appraisal schemes.
Building on the recommendations from the 2010 Ottawa Conference Assessment of Professionalism expert group, in Chapter 5
Dr Deborah Gill discusses how the individual, interpersonal and societal dimensions of professionalism assessment might be
approached.
Papers cited and weblinks: Assessing professionalism
Cushing, AM, Abbott, S, Lothian, D, Hall, A, Westwood, OMR. (2011). Peer feedback as an aid to learning: What do we want? Feedback. When do we want it? Now! Med Teach; 33(2):e105–12.
http://informahealthcare.com/doi/pdf/10.3109/0142159X.2011.542522
Epstein, RM, Hundert, EM. (2002). Defining and assessing professional competence. JAMA; 287(2): 226–35.
http://jama.jamanetwork.com/article.aspx?articleid=194554
Fialkow, M, Mandel, L, Van Blaricom, A, Chinn, M, Lentz, G, Goff, B. (2007). A curriculum for Burch colposuspension and
diagnostic cystoscopy evaluated by an objective structured assessment of technical skills. Am J Obst Gyne; 197(5): 544 e1–6.
http://ac.els-cdn.com/S0002937807009064/1-s2.0-S0002937807009064-main.pdf?_tid=69017058-827a-11e2-bc50-00000aab0f
26&acdnat=1362147484_1642e5efd2bf65339ed257c36a9b6701
Goff, B, Mandel, L, Lentz, G, Vanblaricom, A, Oelschlager, A.M, Lee, D. (2005). Assessment of resident surgical skills: is testing
feasible? American Journal of Obstetrics and Gynecology; 192:1331–8.
http://ac.els-cdn.com/S000293780500044X/1-s2.0-S000293780500044X-main.pdf?_tid=da787858-827a-11e2-8945-00000aacb
362&acdnat=1362147673_142fd4106ebaf6a8616f8edb17d1d31e
Hodges, B.D, Ginsburg, S, Cruess, R, Cruess, S, Delport, R, Hafferty, F, Ho, M.J, Holmboe, E, Holtman, M, Ohbu, S, Rees, C,
Ten Cate, O, Tsugawa, Y, Van Mook, W. (2011). Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Med. Teach; 33: 354–63.
http://informahealthcare.com/doi/pdf/10.3109/0142159X.2011.577300
Kahol K, Vankipuram M, Smith ML. (2009). Cognitive simulators for medical education and training. J Biomed Inform; 42(4):
593–604.
http://ac.els-cdn.com/S1532046409000288/1-s2.0-S1532046409000288-main.pdf?_tid=ad6653ac-827b-11e2-b881-00000aab0f
6c&acdnat=1362148027_7c171c57b9b65ae81c3f38d346aae09a
Madden, J, Quick, D, Ross-Degnan, D, Kafle, K.K. (1997). Undercover care seekers: Simulated clients in the study of health
provider behavior in developing countries. Social Science & Medicine; 45(10):1465–82.
http://ac.els-cdn.com/S0277953697000762/1-s2.0-S0277953697000762-main.pdf?_tid=41697438-828d-11e2-b7e9-00000aacb3
5e&acdnat=1362155577_4e1a1de42afd74c6cef6569de33145ff
Passi, V, Manjo, D, Peile, E, Thistlethwaite, J, & Johnson, N. (2010). Developing medical professionalism in future doctors: a
systematic review. Int. J. Med. Educ; 1:19–29.
http://www.ijme.net/archive/1/developing-medical-professionalism-in-future-doctors.pdf
Royal College of Physicians (2005). Doctors in Society: Medical Professionalism in a Changing World. Report of a Working Party
of the Royal College of Physicians of London, London, RCP.
http://bookshop.rcplondon.ac.uk/contents/pub75-241bae2f-4b63-4ea9-8f63-99d67c573ca9.pdf
Schuwirth, L.W, Van der Vleuten, C.P. (2006). Challenges for educationalists. Br. Med. J; 333(7567): 544–6.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1562480/pdf/bmj33300544.pdf
Van Manen, M. (1995). On the epistemology of reflective practice. Teachers and teaching: Theory and Practice; 1(1):33–50.
http://www.maxvanmanen.com/files/2011/04/1995-EpistofReflective-Practice.pdf
Van Tartwijk, J, Driessen, E. (2009). Portfolios for assessment and learning: AMEE Guide no. 45. Med Teach; 31(9):790–801.
http://reseauconceptuel.umontreal.ca/rid=1L1F4DKB0-23M8BTQ-2SY5/BEME%2045%20assessment%20and%20learning.pdf
Wass, V, Wilkinson, T,Wade, W. (2011). Assessment of professionalism: recommendations from the Ottawa 2010 Conference.
Med Teach; 33(5):354–63.
http://informahealthcare.com/doi/pdf/10.3109/0142159X.2011.577300
Wohaibi EM, Earle DB, Ansanitis FE, Wait RB, Fernandez G, Seymour NE. (2007). A New web-based operative skills assessment
tool effectively tracks progression in surgical resident performance. J Surg Educ; 64(6):333–41.
http://ac.els-cdn.com/S1931720407001638/1-s2.0-S1931720407001638-main.pdf?_tid=6feac20e-827d-11e2-b480-00000aacb35
f&acdnat=1362148783_801248ae9b29ea45abcb1e65abcdf0b9
Contents
Ensuring rigour and high quality
Standard setting
1. Nedelsky method (Nedelsky 1954)
2. Angoff method (Angoff, 1971)
3. Ebel method (1979)
4. Cohen method
Useful papers and weblinks
Examiner stringency and external examining
Papers cited and weblinks
Some useful websites
Ensuring rigour and high quality
This section of the website has helpful advice and links to ensure that assessment processes are fair and robust. It contains a
practical run-through on some of the methods of standard setting, tips for external examiners and further reading. It is an
accompaniment to chapters by Steve Capey, Frank Hay and Katherine Woolf: Chapters 6 and 8.
Standard setting
This section is an accompaniment to Chapter 6 of the book and describes in detail four common ways to set the standards of
assessment.
1. Nedelsky Method
2. Angoff Method
3. Ebel Method
4. Cohen Method
1. Nedelsky method (Nedelsky 1954)
This intriguing early approach to standard setting is based on the opposite of what the students are asked in the question. While
the students are asked to identify the single best answer, the standard is set by identifying those answers that a borderline student
would know to be wrong and assuming that they would guess which one of the remaining answers was correct.
Steps
1. Expert judges meet and consider each MCQ in turn. For each 'wrong' option they must decide if a borderline student would
identify this option correctly, as a wrong answer.
2. These ‘wrong’ answers are then eliminated from the calculation.
3. In a question with 5 options, a judge may decide that 2 of the options would be clearly identified as wrong. This leaves 3
options, all of which would be equally likely to be chosen by guessing. Therefore the score for this question is the reciprocal
of 3 = 0.33.
4. The scores for all the questions are then added up to produce the passing score for the paper, for each judge.
5. Add up the passing scores for all the judges and take the mean, or the median, as the passing score.
Notes
4. At its simplest the judges could score their papers independently and simply send in their scores for calculation of the passing
score, but it is more usual for the judges to meet and discuss their scores. Particularly high or low scoring judges should be
asked for their reasoning. This may be done at the end of scoring, or after each MCQ. The judges are then asked to re-score
their papers in the light of the discussion.
5. In combining the scores from all the judges the simplest approach is to add them all up and take the mean. This has the
advantage of simplicity and takes account of all the judges’ opinions. It has the disadvantage of being unduly influenced by the
scores of especially lenient or hawkish judges. An alternative approach is to take the median score.
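To make the arithmetic concrete, here is a minimal sketch in Python of the Nedelsky calculation described in the steps above. The judge names, the number of options per question and the elimination counts are invented purely for illustration; they are not taken from the book.

# Nedelsky standard setting: a minimal sketch with hypothetical data.
# For each judge and each question, record how many 'wrong' options a
# borderline student would recognise and eliminate. A question's contribution
# to the pass mark is the reciprocal of the number of options that remain.

NUM_OPTIONS = 5  # assuming five-option single-best-answer MCQs

# eliminated[judge] = options a borderline student would rule out, per question
eliminated = {
    "Judge A": [2, 1, 3, 2, 0],
    "Judge B": [2, 2, 3, 1, 1],
    "Judge C": [1, 1, 2, 2, 0],
}

def passing_score(eliminations, num_options=NUM_OPTIONS):
    # Expected mark of a borderline student guessing among the remaining options.
    return sum(1 / (num_options - e) for e in eliminations)

per_judge = {judge: passing_score(elims) for judge, elims in eliminated.items()}
num_questions = len(next(iter(eliminated.values())))
mean_cut = sum(per_judge.values()) / len(per_judge)

for judge, score in per_judge.items():
    print(f"{judge}: passing score {score:.2f} out of {num_questions}")
print(f"Mean passing score: {mean_cut:.2f} ({100 * mean_cut / num_questions:.0f}%)")

The mean across judges is used here; substituting the median, as suggested in the notes, is a one-line change.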
2. Angoff method (Angoff, 1971)
Instead of looking at each question response in turn, the Angoff approach looks at a question in its entirety. This means that
the method can be applied to virtually any question format including practical questions such as objective structured clinical
examinations. Angoff only briefly outlined his approach in his original publication and this has led to many individual variations in different institutions.
Again the expert judges have to make a decision as to how a borderline, or minimally competent, student would perform on a
question. For each question the judges must make a decision on the probability that a borderline student would answer the
question correctly. Some people find it hard to estimate a probability, so an alternative approach is to ask the judges to imagine
100 borderline students answering the question, and to estimate how many of them would answer the question correctly.
Steps
1. Panel of expert judges looks at each question.
2. For each question, the judges estimate the proportion of borderline/minimally competent students that would answer that
question correctly.
3. The judges have a general discussion; it is particularly important that very high and very low scoring judges give their reasons
for their scores.
4. Judges are given the opportunity to adjust their scores.
5. Scores are added up and averaged to give the pass mark for that question.
6. Means for each question are summed and the mean calculated to give the passing score for the complete examination.
Items    Judge A    Judge B    Judge C    Judge D    Judge E    Mean
1        0.70       0.60       0.75       0.55       0.65       0.65
2        0.90       0.80       0.95       0.85       0.90       0.88
3        0.75       0.70       0.75       0.80       0.40       0.68
4        0.55       0.45       0.60       0.50       0.55       0.53
5        0.95       0.90       0.95       0.85       0.90       0.91
Overall mean                                                    0.73
Giving a passing score of 73%.
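The table above can be reproduced with a few lines of Python; this is a minimal sketch for checking the arithmetic rather than part of the original text. The ratings are the five judges' estimates from the worked example.

# Angoff standard setting: average the judges' estimates of the probability
# that a borderline student answers each item correctly, then average the
# item means to obtain the pass mark for the paper.
ratings = {
    1: [0.70, 0.60, 0.75, 0.55, 0.65],  # judges A to E
    2: [0.90, 0.80, 0.95, 0.85, 0.90],
    3: [0.75, 0.70, 0.75, 0.80, 0.40],
    4: [0.55, 0.45, 0.60, 0.50, 0.55],
    5: [0.95, 0.90, 0.95, 0.85, 0.90],
}

item_means = {item: sum(vals) / len(vals) for item, vals in ratings.items()}
overall = sum(item_means.values()) / len(item_means)

for item, mean in item_means.items():
    print(f"Item {item}: mean = {mean:.2f}")
print(f"Passing score for the paper: {overall:.2f} ({overall * 100:.0f}%)")

Running this prints the item means shown in the table and an overall passing score of 0.73, that is 73%.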
3. Ebel method (1979)
The Ebel approach has considerable appeal to many academics as it appears to be simple, just allocating questions to boxes,
and rates the importance of a question as well as its difficulty. In practice it takes time to do properly but it does have the
advantage of identifying unsuitable questions, such as difficult problems testing unimportant knowledge.
Expert judges first identify categories to rate the importance of the material being tested, such as Essential, Important and Acceptable. Ebel originally incorporated a further category, Questionable, but such questions should really have been eliminated when the
examination was set.
Categories are then constructed to rate how hard the questions are to answer, such as Easy, Moderate, Difficult.
              Easy        Moderate        Difficult
Essential
Important
Acceptable
This will usually be an accepted format regularly used in an institution, not requiring construction afresh each time.
Steps
1. Expert judges must examine each question and decide on its importance and difficulty categories, for example Important and Moderate.
2. After each question the judges should discuss their categorisations, particularly discussing outlying opinions.
3. Judges may then change their opinion.
4. The final categorisation for each question is recorded for each judge.
5. Judges now decide what proportion of borderline students would answer questions correctly within each category. This is
difficult; judges need to think about lots of borderline students answering many questions within each category.
6. Judges again need to discuss their reasoning for their probability decisions.
7. Judges are allowed to alter their decisions.
8. The mean probability for each category is then calculated from the individual judge’s decisions.
              Easy        Moderate        Difficult
Essential     90%         85%             70%
Important     60%         50%             45%
Acceptable    40%         30%             20%
9. Each judge’s question categorisations are inserted into their table, for example for 13 questions:
              Easy        Moderate        Difficult
Essential     90% |       85% ||          70% |
Important     60% ||      50% |           45% |||
Acceptable    40% ||      30% |           20%
10. The mean passing score for the exam for each judge is calculated by multiplying the number of questions in each category
by the probability of a borderline student answering questions correctly in that category, summing the products and dividing by the number of questions.
Calculated pass mark = [(1 × 0.90) + (2 × 0.85) + (1 × 0.70) + (2 × 0.60) + (1 × 0.50) + (3 × 0.45) + (2 × 0.40) + (1 × 0.30)] / 13 = 0.57
The passing score is 57%.
11. The scores for all the judges are averaged to achieve the overall pass mark for the paper.
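As a check on the arithmetic in steps 9 and 10, here is a minimal Python sketch of the Ebel calculation for the 13-question tally above; the cell probabilities and counts are those of the worked example.

# Ebel standard setting: weight each (importance, difficulty) cell by the
# agreed probability that a borderline student answers such questions correctly.

# Mean probabilities agreed by the judges for each cell (from the example).
probability = {
    ("Essential", "Easy"): 0.90, ("Essential", "Moderate"): 0.85, ("Essential", "Difficult"): 0.70,
    ("Important", "Easy"): 0.60, ("Important", "Moderate"): 0.50, ("Important", "Difficult"): 0.45,
    ("Acceptable", "Easy"): 0.40, ("Acceptable", "Moderate"): 0.30, ("Acceptable", "Difficult"): 0.20,
}

# One judge's tally of the 13 questions across the grid (from the example).
counts = {
    ("Essential", "Easy"): 1, ("Essential", "Moderate"): 2, ("Essential", "Difficult"): 1,
    ("Important", "Easy"): 2, ("Important", "Moderate"): 1, ("Important", "Difficult"): 3,
    ("Acceptable", "Easy"): 2, ("Acceptable", "Moderate"): 1, ("Acceptable", "Difficult"): 0,
}

total_questions = sum(counts.values())
expected_correct = sum(counts[cell] * probability[cell] for cell in counts)
pass_mark = expected_correct / total_questions

print(f"{total_questions} questions; expected borderline score {expected_correct:.2f}")
print(f"Pass mark for this judge: {pass_mark:.2f} ({pass_mark * 100:.0f}%)")

This reproduces the 57% pass mark for the judge in the example; step 11 then averages the per-judge marks.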
4. Cohen method
Panels of expert judges are time consuming and expensive. They are cost-effective with national exams, taken by many students,
but can be too expensive for single institution examinations. Frequently it becomes a matter of who is available rather than
who would be ideal to form the expert panel.
Cohen has developed a secondary technique which relies on the students themselves to set the standard. The reasoning is that the most
stable element from year to year is the performance of the best students. These students are always well prepared, so
differences in their performance from year to year are likely to reflect variations in test difficulty.
Therefore a fixed cut score of say 60% could be varied up or down each year depending on how the best students performed.
In the original description of the test method the performance of the 95th percentile was taken as the standard measure, but
other values such as the 90th percentile have since been used.
Rather than vary a fixed pass mark, another approach is to look historically at properly set cut scores, such as those derived from
Angoff or Ebel panels, and to calculate what percentage of the 95th percentile each year equals the previously set cut score.
The mean of several years' values then gives a factor (the Cohen factor) to use in future. In one of our institutions this
worked out at 65% of the 95th percentile.
Steps
1. Enter student marks data into spreadsheet.
2. Calculate percentile at 95% (or as required).
3. Multiply this percentile value by the ‘Cohen’ factor, for example 0.65, to give the passing score for this examination.
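A minimal sketch of these spreadsheet steps in Python, assuming a previously derived Cohen factor of 0.65; the marks are made up for illustration, and NumPy's percentile function stands in for the spreadsheet formula.

# Cohen method: the cut score is a fixed fraction (the 'Cohen factor') of the
# 95th-percentile mark in the current cohort. The marks below are invented.
import numpy as np

marks = np.array([42, 48, 51, 55, 58, 60, 63, 66, 68, 70, 72, 75, 78, 81, 85, 90])

COHEN_FACTOR = 0.65   # derived historically from Angoff/Ebel panels (example value)
PERCENTILE = 95       # as in the original description; the 90th is also used

reference_mark = np.percentile(marks, PERCENTILE)
pass_mark = COHEN_FACTOR * reference_mark

print(f"{PERCENTILE}th percentile mark: {reference_mark:.1f}%")
print(f"Pass mark: {COHEN_FACTOR} x {reference_mark:.1f} = {pass_mark:.1f}%")

In practice the same calculation can be done directly in the examination spreadsheet; the factor and percentile chosen should be reviewed periodically against a panel-based standard.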
Useful papers and weblinks
Papers: Standard setting methods
Bandaranayake, R. C. (2008). Setting and maintaining standards in multiple choice examinations: AMEE Guide No. 37. Med
Teach; 30(9): 836–845.
http://informahealthcare.com/doi/pdf/10.1080/01421590802402247
Boursicot, K.A, Roberts, T.E, Pell, G. (2007). Using borderline methods to compare passing standards for OSCEs at graduation
across three medical schools. Med Educ; 41(11): 1024–31.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2007.02857.x/pdf
Brennan RL, Lockwood RE. (1980). A comparison of the Nedelsky and Angoff cutting score procedures using generalisability
theory. Appl Psychol Measurement; 4:219–40.
http://conservancy.umn.edu/bitstream/100092/1/v04n2p219.pdf
An overview can be found at:
Standard Setting in the Post-Modern Era by Dale Griffee and Jeremy R. Gevara
http://studentorgs.utexas.edu/flesa/TPFLE_New/Issues/Summer%202011/2.%20Griffee%20and%20Gevera.pdf
The American Board of Pediatrics and The Association of Pediatric Program Directors. A Primer for Pediatric Program
Directors
https://www.abp.org/abpwebsite/publicat/primer.pdf
Hutchinson, L. Aitken P, Hayes T. (2002). Are medical postgraduate certification processes valid? A systematic review of the
published evidence. Med Educ;36(1):73–91. Review.
http://faculty.ksu.edu.sa/hisham/Documents/Medical%20Education/English/Medical%20Education/215.pdf
Impara, J. C, & Plake, B. S. (1997). Standard setting: An alternative approach. Journal of Educational Measurement; 34: 353–66.
http://onlinelibrary.wiley.com/doi/10.1111/j.1745-3984.1997.tb00523.x/pdf
Norcini JJ (2003). Setting standards on educational tests. Med Educ; 37:464–69.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2003.01495.x/pdf
Taylor, C. A. (2011). Development of a modified Cohen method of standard setting. Med Teach; 33:12:e678–e682.
http://share.eldoc.ub.rug.nl/FILES/root2/2010/stansemew/Cohen_Schotanus_2010_Med_Teacher.pdf
Papers: Psychometrics and assessments
Altman DG, Bland JM. (2003). Statistics notes interaction revisited: the difference between two estimates. Br Med J; 326: 219.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1125071/pdf/219.pdf
Baldwin DC Jr, Daugherty SR, Rowley BD, Schwarz MD. (1996). Cheating in medical school: a survey of second-year students
at 31 schools. Academic Medicine; 71(3):267–73.
http://journals.lww.com/academicmedicine/Abstract/1996/03000/Cheating_in_medical_school_a_survey_of.20.aspx
Beckman TJ, Ghosh AK, Cook DA, Erwin PJ, Mandrekar JN. (2004). How reliable are assessments of clinical teaching? a review
of the published instruments. J Gen Intern Med; 19:971–7.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1492515/pdf/jgi_40066.pdf
Crossley J, Humphris G, Jolly B. (2002). Assessing health professionals. Med Educ; 36(9): 800–4.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2002.01294.x/pdf
Crowne, D. P. and Marlowe, D. (1960). A new scale of social desirability independent of psychopathology. Journal of Consulting
Psychology; 24, 349–54.
Cited in: http://www.srl.uic.edu/publist/Conference/crownemarlowe.pdf
De Champlain AF. (2010). A primer on classical test theory and item response theory for assessments in medical education.
Med Educ; 44(1):109–17.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2009.03425.x/pdf
Downing SM. (2003). Item response theory: applications of modern test theory in medical education. Med Educ;
37(8):739–45.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2003.01587.x/pdf
Downing SM. (2004). Reliability: on the reproducibility of assessment data. Med Educ; 38(9):1006–12.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2004.01932.x/pdf
Downing SM, Haladyna TM. (2004). Validity threats: overcoming interference with proposed interpretations of assessment data.
Med Educ; 38(3):327–33.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2004.01777.x/pdf
Haladyna TM, Downing SM, Rodriguez, MC. (2002). Review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education; 15(3), 309–34.
http://umdrive.memphis.edu/lfrncsch/ICL7030/haladyna.pdf
Hays, R, Sen Gupta, T, Veitch, J. (2008). The practical value of the standard error of measurement in borderline pass/fail decisions. Med Educ; 42(8):810–15.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2008.03103.x/pdf
Norman G, Bordage G, Page G, Keane D. (2006). How specific is case specificity? Med Educ; 40(7):618–23.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2006.02511.x/pdf
Schuwirth L, van der Vleuten C. (2004). Merging views on assessment. Med Educ; 38(12): 1208–10.
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2004.02055.x/pdf
Tavakol, M, Dennick, R. (2011). Making sense of Cronbach’s alpha International Journal of Med Educ; 2:53–55.
http://www.ijme.net/archive/2/cronbachs-alpha.pdf
Tighe J, McManus IC, Dewhurst NG, Chis L, Mucklow J. (2010). The standard error of measurement is a more appropriate
measure of quality for postgraduate medical assessments than is reliability: an analysis of MRCP (UK) examinations. BMC Med
Educ; 10:40.
http://www.biomedcentral.com/content/pdf/1472-6920-10-40.pdf
Traub, RE. (1997). Classical test theory in historical perspective. Educational Measurement: Issues and Practice, 8–14.
http://onlinelibrary.wiley.com/doi/10.1111/j.1745-3992.1997.tb00603.x/pdf
Useful weblinks
Academy of Medical Royal Colleges (2009). Improving Assessment. AoMRC Press, London
http://www.aomrc.org.uk/publications/statements/doc_view/49-improving-assessment.html
Angoff method: Canadian Association of Medical Radiation Technologists. The Angoff Method of Standard Setting for Licensure and Certification Examinations
A useful presentation – by James B Olsen, a Senior Psychometrician with the Computer Education Management Association:
James B Olsen, Setting Performance Standards & Cut Scores:
http://www.cedma-europe.org/newsletter%20articles/Webinars/Setting%20Performance%20Standards%20and%20Cut%20
Scores%20%28Apr%2011%29.pdf
Webb, NM, Shavelson RJ. Haertel, EH. (2006). Reliability coefficients and generalizability theory. Handbook of Statistics, 26
1–44.
http://www.stanford.edu/dept/SUSE/SEAL/Reports_Papers/ReliabCoefsGTheoryHdbk.pdf
Examiner stringency and external examining
The marks awarded denote the examiners' judgement on competence, so candidates must be assured of the examiners' expertise
and complete integrity. However, the interpretation of assessment criteria is not always clear-cut, and procedures are therefore
needed for the standardisation of examiner practice. Without them it is entirely feasible that candidates demonstrating the same
competence in an equivalent performance, for a specific task, might receive very different marks. This section of the website
covers the various issues around the characteristics of an examination board, the examiners and the candidates, together with
practical approaches for identifying where errors might occur, the reasons why students may fail, and how to support students
to reach their full potential in assessments. Following on from Professor Olwyn Westwood's Chapter 9, there are some useful
resources on examiner behaviours, external examining and support for students.
Papers cited and weblinks
Further advice on examiner stringency and external examining
Chew-Graham CA, Rogers A, Yassin N. (2003). 'I wouldn't want it on my CV or their records': medical students' experiences
of help-seeking for mental health problems. Med Educ; 37: 873–80.
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2003.01627.x/pdf
Farrokhi, F, Esfandiari, R, Schaefer, E. (2012). A many-facet Rasch measurement of differential rater severity/leniency in three
types of assessment. JALT Journal; 34(1): 79–102.
http://www.google.co.uk/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&ved=0CEYQFjAC&url=http%3A%2F%2Fjaltpublications.org%2Ffiles%2Fpdf-article%2Fjj2012a-art4.pdf&ei= mfk1UbjJJoLWPdj7gcAL&usg=AFQjCNFcyj65Cnt5fb9J1Lr
DhWoDmqkKhQ&bvm=bv.43148975,d.ZWU
Harasym, PH, Woloschuk, W, Cunning, L. (2008). Undesired variance due to examiner stringency/leniency effect in
communication skill scores assessed in OSCEs. Adv Health Sci Educ Theory Pract; 13(5):617–32.
http://download.springer.com/static/pdf/616/art%253A10.1007%252Fs10459-007-9068-0.pdf?auth66=1363455886_4904aa4fd
459f345f8f15089e92987a0&ext=.pdf
Norcini, JJ, Blank, LL, Arnold, GK, Kimball, HR. (1997). Examiner differences in the mini-CEX. Advances in Health Sciences
Education; 2: 27–33.
http://download.springer.com/static/pdf/353/art%253A10.1023%252FA%253A1009734723651.pdf?auth66=1363787772_4623
ecf2607177213c98530c9eca7162&ext=.pdf
Pelgrim EA, Kramer AW, Mokkink HG, van den Elsen L, Grol RP, van der Vleuten CP. (2011). In-training assessment using
direct observation of single-patient encounters: a literature review. Adv Health Sci Educ Theory Pract; 16(1): 131–42.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3074070/pdf/10459_2010_Article_9235.pdf
McManus IC, Thompson M and Mollon J. (2006). Assessment of examiner leniency and stringency ('hawk-dove effect') in the
MRCP(UK) clinical examination (PACES) using multi-facet Rasch modelling. BMC Med Educ; 6:42 doi:10.1186/1472-6920-6-42.
http://www.biomedcentral.com/content/pdf/1472-6920-6-42.pdf
Some useful websites
Quality assurance/compliance with a professional and statutory regulatory body
Accreditation Council for Graduate Medical Education 2009
http://www.acgme.org/acgmeweb/
GMC: Good Medical Practice – Guidance for Doctors 2009
http://www.gmc-uk.org/static/documents/content/GMP_0910.pdf
GMC: The State of Medical Education and Practice in the UK 2011
http://www.gmc-uk.org/State_of_medicine_Final_web.pdf_44213427.pdf
GMC: The Patel Review: Future Regulation of Medical Education and Training 2010
http://www.gmc-uk.org/Recommendations_and_Options_for_the_Future_Regulation_of_Education_and_Training_FINAL.
pdf_31524263.pdf_34560875.pdf
GMC: Ready for Revalidation 2012: The Good Medical Practice Framework for Appraisal and Revalidation
http://www.gmc-uk.org/static/documents/content/GMC_Revalidation_A4_Guidance_GMP_Framework_04.pdf
Quality Assurance Agency 2006: Code of practice for the assurance of academic quality and standards in higher education
Section 6: Assessment of students
http://www.qaa.ac.uk/Publications/InformationAndGuidance/Documents/COP_AOS.pdf
Quality Assurance Agency 2012: Understanding assessment: its role in safeguarding academic standards and quality in higher
education: A guide for early career staff. Second Edition
http://www.qaa.ac.uk/Publications/InformationAndGuidance/Pages/understanding-assessment.aspx
Weblinks: External examiners
This section contains some useful information on the expectations of external examiners and accompanies Chapter 9.
Evaluation questionnaire about the approach to assessment – Assessment Experience Questionnaire (AEQ)
http://www.heacademy.ac.uk/resources/detail/assessment/AEQ_Resourceform
The Higher Education Academy: A handbook for external examining:
http://www.heacademy.ac.uk/assets/documents/externalexamining/External_Examiners_Handbook_2012.pdf
Contents
Feedback
Weblinks
Web resources
YouTube resources
Papers and weblinks: self-reflection and feedback
Future developments: assessment and feedback
Papers cited and weblinks in Chapter 10
Support for students
Papers and weblinks: further advice student support
Feedback
The information here serves as a supplement to Chapters 7 and 10 of the book.
• Feedback is an essential element in the facilitation of effective teaching and learning.
• Timely and well-crafted feedback enhances the performance of learners by highlighting to them the abilities they already
possess, as well as giving constructive advice about how to develop those skills which have scope for improvement.
• Feedback is a key method which can promote reflection within the learner and foster practitioners who are more critically
aware of their own performance.
• Giving feedback is no longer seen as the exclusive domain of the educator, but rather as an expected responsibility shared
by the whole healthcare team and by patients.
• There is now a range of new tools, for example 360 degree multi-source feedback, e-portfolios and workplace-based assessments (see additional web material).
Weblinks
Using IT to enhance feedback
http://evidencenet.pbworks.com/f/guide+for+academic+staff+FINAL.pdf
This is a useful guide for people who want to know how to use IT to improve feedback; it will be of most use to programme
and module leads.
Formative feedback
By Charles Juwah, Debra Macfarlane-Dick, Bob Matthew, David Nicol, David Ross and Brenda Smith
An excellent resource may be found on the Higher Education Academy website: Enhancing student learning through effective
formative feedback.
http://www.heacademy.ac.uk/assets/documents/resources/resourcedatabase/id353_senlef_guide.pdf
This is a key reference which details a variety of ways in which feedback can be embedded in practice. It expands on the theory
and purpose of feedback before examining some illustrative case studies. It includes useful advice on portfolios and self-assessment and has ideas for staff development workshops.
Focus on feedback
Dr Lorraine Noble
This is a presentation of PowerPoint slides from a workshop on giving feedback.
http://www.ucl.ac.uk/medicalschool/postgraduate/events/mededconference11062012/LN-Focus-on-feedback.pdf
Enhancing feedback on workplace-based assessments
Dr Alison Sturrock, Alex Nesbitt, Freya Baird, Andrew Pitcher, Lyndon James
This is a presentation of PowerPoint slides from a student-led workshop about their research and thoughts on workplace-based
assessments. It is an adjunct to Chapter 7 on feedback.
http://www.ucl.ac.uk/medicalschool/postgraduate/events/mededconference11062012/AN-workshop-slides-final.pdf
The Calgary-Cambridge approach in communication skills teaching
Silverman, J D, Draper, J, and Kurtz, SM
http://www.gp-training.net/training/communication_skills/calgary/index.htm
Web resources
Giving and receiving feedback: A Guide to the use of Peers in giving Feedback
http://www.iml.uts.edu.au/assessment-futures/glossary/Giving-and-Receiving-Feedback.pdf
A good resource on providing feedback for student learning.
http://evaluate.curtin.edu.au/local/docs/5providing-feedback-for-student-learning.pdf
YouTube resources
Feedback on workplace based assessment SGUL
https://www.youtube.com/watch?v=szbsSkLp_Vg
Giving feedback – by Dr. Paula ONeill of the Academy of Academic Leadership
https://www.youtube.com/watch?v=L1CjetPDEww
Giving feedback from St George's University of London on a truncated mini-CEX
https://www.youtube.com/watch?v=ubQ7KH7lxLU
How not to give feedback: St George’s University of London
https://www.youtube.com/watch?v=PRIlnUAKwDY
Multisource Feedback
https://www.youtube.com/watch?v=wLL22CwNjao
Papers and weblinks: Self-reflection and feedback
Papers with e-links
Berk, R (2009) Using the 360° multisource feedback model to evaluate teaching and professionalism. Med Teach; 31: 1073–80
http://www.ronberk.com/articles/2009_multisource.pdf
Crossley J, Eiser C, Davies HA (2005) Children and their parents assessing the doctor-patient interaction: a rating system for
doctors’ communication skills. Med Educ; 39:757–9
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2005.02230.x/pdf
Davies, H, Archer, J. Bateman, A, Dewar, S, Crossley, J, Grant, J, Southgate L. (2008) Specialty-specific multi-source feedback:
assuring validity, informing training. Med Educ; 42: 1014–1020
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2008.03162.x/pdf
Davies, H, Archer, J (2005) Multi source feedback: developmental and practical aspects. Clin Teach; 2(2): 77–81
https://www.abp.org/abpwebsite/r3p/preread/Davies.Mutisource%20feedback%20review.2005.pdf
Driessen E, van Tartwijk J, Dornan T (2008) The self-critical doctor: helping students become more reflective. Br Med J;
336(7648):827–30.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2292362/pdf/bmj-336-7648-prac-00827.pdf
Ende, J. (1983) Feedback in clinical medical education. JAMA; 250(6): 771–81
http://www.lumen.lumc.edu/lumen/meded/ipm/IPM1/EndeArticle.pdf
Eraut, M, (2006) Feedback. Learning in Health and Social Care; 5(3): 111–18 10.1111/j.1473-6861.2006.00129.x
http://onlinelibrary.wiley.com/doi/10.1111/j.1473-6861.2006.00129.x/pdf
Mercer SW, McConnachie A, Maxwell M et al (2005) Relevance and performance of the Consultation and Relational Empathy
(CARE) measure in general practice. Family Practice; 22(3): 328–34
http://fampra.oxfordjournals.org/content/22/3/328.full.pdf+html
Nestel D, Tierney T (2007) Role-play for medical students learning about communication: guidelines for maximising benefits.
BMC Med Educ; 7: 3–12
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1828731/pdf/1472-6920-7-3.pdf
Nicol, D and Macfarlane-Dick, D (2006) Formative assessment and self-regulated learning: a model and seven principles of
good feedback practice. Studies in Higher Education; 31(2): 199–218
http://www.reap.ac.uk/reap/public/papers//DN_SHE_Final.pdf
Ramsey, P, Wenrich, M, Carline, J, Inui, T, Larson, E, LoGerfo, J (1993) Use of peer ratings to evaluate physician performance. JAMA; 269: 1655–60
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3244317/pdf/i1949-8357-3-4-511.pdf
Veloski J, Boex J R, Grasberger M J, Evans A and Wolfson D B (2006). Systematic review of the literature on assessment, feedback
and physicians’ clinical performance: BEME Guide no 7. Med Teach; 28 (2): 117–28
http://informahealthcare.com/doi/pdf/10.1080/01421590600622665
Wall, D. McAleer, S (2000) Teaching the consultant teachers – identifying the core content. Med Educ; 34 (2): 131–8
http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2923.2000.00464.x/pdf
Ward, M, Gruppen, L, Regehr, G (2002) Measuring self-assessment: current state of the art. Advances in Health Sciences Education;
7: 63–80
http://deepblue.lib.umich.edu/bitstream/handle/2027.42/41768/10459_2004_Article_397832.pdf;jsessionid=B6D90814EC620E
84F23D55EAE46F1C94?sequence=1
Weaver, M. (2006) Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in
Higher Education; 31 (3): 379–94
http://irep.ntu.ac.uk:1801/view/action/singleViewer.do?dvs=1362489176573~442&locale=en_GB&DELIVERY_RULE_ID=
12&search_terms=SYS%20=%20000005183&adjacency=N&application=DIGITOOL-3&frameId=1&usePid1=true&usePid2=true
Wood, L, Hassell, A, Whitehouse, A, Bullock, A and Wall, D (2006) A Literature review of multi-source feedback systems within and
without health services, leading to 10 tips for their successful design. Med Teach; 28(7): e185–e191
http://informahealthcare.com/doi/pdf/10.1080/01421590600834286
Useful weblinks
Academy of Medical Royal Colleges: The Effectiveness of Continuing Professional Development 2010
http://www.aomrc.org.uk/publications/statements/doc_view/213-effectiveness-of-cpd-final-report.html
The Royal Australian and New Zealand College of Radiologists® (2010) Learning Portfolio, Section 3: Assessment, MSF. Multi-source feedback: instructions for assessors and trainees
http://www.google.co.uk/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&ved=0CDwQFjAC&url=http%3A%2F%2Fwww.
ranzcr.edu.au%2Fcomponent%2Fdocman%2Fdoc_download%2F341-multi-source-feedback-msf&ei=qeEwUYSeJKek4ASwjY
HYDw&usg=AFQjCNGEepnwglPXgz-t9nIN6e58sCeBfw&bvm=bv.43148975,d.bGE
Overview of Pendleton Rules
http://www.gp-training.net/training/educational_theory/feedback/pendleton.htm
Future developments: Assessment and feedback
Papers cited and weblinks in Chapter 10
Elston M (2009) Women and Medicine: The Future. Summary of findings from Royal College of Physicians research. (London)
http://www.rcplondon.ac.uk/sites/default/files/documents/women-and-medicine-summary.pdf
Goldacre, M, Taylor, K, Lambert, T (2010) Views of junior doctors on whether their medical school prepared them well for
work: Questionnaire surveys. BMC Med Educ, 10:78 doi:10.1186/1472-6920-10-78
http://www.biomedcentral.com/content/pdf/1472-6920-10-78.pdf
McManus, IC, Elder, AT; de Champlain, A, Dacre, JE, Mollon J, Chis, L (2008) Graduates of different UK medical schools show
substantial differences in performance on MRCP Part 1, Part 2 and PACES examinations. BMC Medicine; 6 (5) doi:10.1186/
1741-7015-6-5
http://www.biomedcentral.com/content/pdf/1741-7015-6-5.pdf
McManus, IC, Ludka, K (2012) Resitting a high-stakes postgraduate medical examination on multiple occasions: nonlinear
multilevel modelling of performance in the MRCP(UK) examinations, BMC Medicine;10:60 (doi:10.1186/1741-7015-10-60)
http://www.biomedcentral.com/content/pdf/1741-7015-10-60.pdf
Wakeford R, Foulkes J, McManus IC, Southgate L (1993) MRCGP pass rate by medical school and region of postgraduate training.
Br Med J; 307:542–3
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1679422/pdf/bmj00048-0069e.pdf
Support for students
Papers and weblinks: Further advice student support
Cowan, M (2010) Dyslexia, dyspraxia and dyscalculia: a toolkit for nursing staff. (London: Royal College of Nursing)
http://www.tcd.ie/disability/services/AST/Leaflets/Academic/Subject%20specific/nursing/Nursing_tool_kitf.pdf
Garrett J, Alman M, Gardner S, Born C (2007) Assessing students' metacognitive skills. Am J Pharm Educ; 71(1):14
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1847545/pdf/ajpe14.pdf
Gibson S, Leinster S (2011) How do students with dyslexia perform in extended matching questions, short answer questions
and observed structured clinical examinations? Adv Health Sci Educ Theory Pract;16(3):395–404
http://download.springer.com/static/pdf/805/art%253A10.1007%252Fs10459-011-9273-8.pdf?auth66=1363455846_d9eeee239
8d864c11e8418a0872716c0&ext=.pdf
Grant, J (2002) Learning needs assessment: assessing the need. Br Med J; 324(7330):156–59
http://www.bmj.com/highwire/filestream/318437/field_highwire_article_pdf/0/156.full.pdf
Tweed M, Ingham C (2010) Observed consultation: confidence and accuracy of assessors. Adv Health Sci Educ Theory
Pract;15(1):31–43
http://link.springer.com/article/10.1007/s10459-009-9163-5?LI=true#page-1
Williamson GR, Callaghan L, Whittlesea E, Heath V (2011) Improving student support using placement development teams:
staff and student perceptions. Journal of Clinical Nursing; 20: 828–36
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2702.2010.03287.x/pdf
Woolf K, Haq I, McManus IC, Higham J, Dacre J (2008) Exploring the underperformance of male and minority ethnic medical
students in first year clinical examinations. Adv Health Sci Educ Theory Pract; 13(5):607–16
http://download.springer.com/static/pdf/576/art%253A10.1007%252Fs10459-007-9067-1.pdf?auth66=1363456295_43c6287df
a9933c8dff899db052395a4&ext=.pdf
Contents
General references
General references
A website that discusses the different elements of Bloom’s taxonomy
http://www.nwlink.com/~donclark/hrd/bloom.html
Bartram, D. and Hambleton, R. (eds) (2006) Computer-Based Testing and the Internet: Issues and Advances. (Chichester: John Wiley & Sons)
http://eu.wiley.com/WileyCDA/WileyTitle/productCd-047001721X.html
BEME Systematic Reviews
http://www.bemecollaboration.org/Published+BEME+Reviews/
Cantillon, P., Hutchinson, L. and Wood, D. (eds) (2003) ABC of Learning and Teaching in Medicine (London: BMJ)
http://edc.tbzmed.ac.ir/uploads/39/CMS/user/file/56/scholarship/ABC-LTM.pdf
Coaley, K. (2010) An Introduction to Psychological Assessment and Psychometrics (London: Sage) DOI: 10.4135/9781446221556
http://knowledge.sagepub.com/view/an-introduction-to-psychological-assessment-and-psychometrics/SAGE.xml
Epstein, R. M. (2007) Assessment in medical education. New England Journal of Medicine; 356(4):387–96
http://www.nejm.org/doi/pdf/10.1056/NEJMra054784
Furr, R.M., Bacharach, V.R. (2008) Psychometrics: An Introduction. (Sage)
Kline, T.J.B. (2005) Psychological Testing: A Practical Approach To Design and Evaluation. (Sage)
Pendleton D., Schofield T., Tate P., Havelock P. (1984) The Consultation: An Approach to Learning and Teaching. (Oxford: Oxford University
Press)
Swanwick, T. (2010) Understanding Medical Education: Evidence, Theory and Practice (Oxford: Wiley-Blackwell)
Attali, Y., Burstein, J. (2006) Automated Essay Scoring With e-rater® V.2. Journal of Technology, Learning and Assessment; 4(3)
http://escholarship.bc.edu/ojs/index.php/jtla/article/viewFile/1650/1492