Assessment Advisory Board papers - 10 May 2016
Minutes of the Meeting on 5 February 2016*
Members present
Val Wass, Chair
Jonathan Beard
Jan Illing
Neil Johnson
John Richardson
Ian Ritchie
Alison Sturrock
Celia Taylor
Judith Way
Others present
Andrea Callender, Head of Diversity, Strategy & Communication
Jane Durkin, Assistant Director, Registration & Revalidation
Ben Griffith, Policy Manager, Education & Standards
Richard Hankins, Head of Registration Enquiries & Testing, Registration & Revalidation
Michael Harriman, Board Secretary
Martin Hart, Assistant Director, Education & Standards
Sophie Holland, PA to Una Lane
Judith Hulf, Senior Medical Advisor & Responsible Officer
Una Lane, Director, Registration & Revalidation
Christopher Pratt, Policy Officer, Registration & Revalidation
Terence Stephenson, Chair
*These Minutes should be read in conjunction with the papers for this meeting, which are available on our website at http://www.gmc-uk.org
Assessment Advisory Board meeting, 10 May 2016
Agenda item 2 – Minutes of the meeting on 5 February 2016
Chair’s business
1 The Chair welcomed the Chair of the GMC to the meeting.
2 Apologies for absence were noted from Jean Ker.
Minutes of the meeting on 15 October 2015
3 The minutes of the meeting on 15 October 2015 were agreed as a true record, with the following amendments:
Paragraph 3 to read: ‘…she is the academic lead at the UCL Academic Centre of Medical Education…’
Paragraph 5b to read: ‘A suitably-modified PLAB test could form a good foundation…’
Paragraph 5d to read: ‘…had highlighted the importance of a national standard and discussed the common assessment of clinical skills.’
Paragraph 8b: deleted
New paragraph 8b (formerly paragraph 8c) to read: ‘It would be difficult to use the UKMLA as a ranking tool if it is designed primarily to be a pass/fail instrument.’
Paragraph 11b to read: ‘Other factors that influence educational progress include…’
Matters arising
4 There were no matters arising.
Options for the Medical Licensing Assessment
5 During the presentation about the progress of the introduction of the Medical Licensing Assessment (‘MLA’), the Board advised that the objective of the MLA, which was stated as:

‘To create a single, objective demonstration that those who successfully apply to hold a licence to practise in the UK meet an equivalent minimum standard set by the UK regulatory body.’

would be better stated as:
‘To create a single, objective demonstration that those who successfully apply to hold
a licence to practise in the UK meet a minimum standard set by the UK regulatory
body.’
6 In giving this advice, members noted that it was not clear to what the ‘minimum standard’ would be equivalent, so the word ‘equivalent’ would be better omitted.
Questions for the Board
7 Eight questions were put to the Board and discussed in two groups:

• How should we approach developing the blueprint? Is the PLAB blueprint a sensible starting point?
• Should we use the blueprint to influence curricula and address concerns in key areas eg caring for older people with comorbidities, generic professional capabilities, GP-related competencies, professionalism, patient safety and prescribing?
• Does the three part model seem appropriate for the MLA? What alternatives should be considered?
• How should the MLA relate to medical school Finals? Is a ‘core and options’ model feasible and desirable? How could it be achieved?
• How should standards be set? Modified Angoff or Item Response Theory for Part 1? Borderline Regression for Part 2?
• How can we ensure that the standard to pass is appropriate for both IMGs and UK graduates?
• Is the tentative timeline for development and implementation appropriate? Is sufficient time set aside eg for blueprinting, IT development and piloting?
• Should we have different timelines for the three Parts?
8 The Board’s advice is at Annex A.
Any other business and date of next meeting
9 The Board noted that the next meeting would be on the afternoon of 10 May 2016, starting with lunch at 13:00.
Confirmed:
Val Wass, Chair
10 May 2016
Minutes of the meeting on 5 February 2016 – Annex A
Blueprinting

• How should we approach developing the blueprint? Is the PLAB blueprint a sensible starting point?
• Should we use the blueprint to influence curricula and address concerns in key areas eg caring for older people with comorbidities, generic professional capabilities, GP-related competencies, professionalism, patient safety and prescribing?
The Board’s advice
1 The PLAB blueprint is a good starting point.
2 The agreed outcomes should be the primary determinant of the blueprint. The blueprint should set the balance between the competency areas in the assessment.
3 The medical schools already work to a single set of outcomes determined by the GMC, so they should not need to change their curricula to satisfy the MLA. The MLA should provide the core and the medical schools could assess other aspects of their curricula.
4 The proposed abolition of provisional registration would not change the types of skill set out in the blueprint, but the required skill depth might be affected.
5 The undergraduate curricula, outcomes for graduates, core competencies and generic professional capabilities should drive the assessment. There is a question about whether it is possible to assess all the generic professional capabilities. The MLA need not drive out diversity in curricula.
6 Assessment drives learning, so it will be important to be open and transparent.
Formats

• Does the three part model seem appropriate for the MLA? What alternatives should be considered?
• How should the MLA relate to medical school Finals? Is a ‘core and options’ model feasible and desirable? How could it be achieved?
The Board’s advice
7 The three part model and the ‘core and options’ model are both appropriate. An MCQ-based Part 1 would be straightforward to deliver. The MLA would be the ‘core’, and ‘finals’ (MLA plus ‘options’ set by the medical schools) would reflect curricular diversity. There could be concerns about cost-effectiveness and practical difficulties if, for example, two sets of OSCEs are required.
8 Part 2 could in principle be delivered on a ‘core and options’ basis, but compared with an MCQ-based Part 1 there is much more variability in how OSCEs are delivered, though not in the range of skills assessed.
9 A pragmatic approach will be necessary. Perhaps clusters of medical schools could deliver an MLA core for OSCEs.
10 Another approach could be to embed MLA items in medical school OSCEs and the MSC
Assessment Alliance could look into how to do that. Moving candidates around the
country would be difficult.
11 The GMC should learn from the enhancement of PLAB Part 2 that is currently
underway as well as from the medical schools.
12 Part 3 could be challenging. When would it be held? How could it be reliable and valid?
What would be the impact on the service? We could perhaps consider a screening
approach with more in-depth assessment of individuals when appropriate.
13 Part 3, looking at practice in the workplace and patient safety, will be critical, but the approach must be robust. 360 degree Multiple Source Feedback has the best evidence base among the workplace based assessment methods.
Standard setting

• How should standards be set? Modified Angoff or Item Response Theory for Part 1? Borderline Regression for Part 2?
• How can we ensure that the standard to pass is appropriate for both IMGs and UK graduates?
The Board’s advice
14 The more developed MCQ-based assessment approaches around the world use the modified Angoff approach to set the initial standard and then move to IRT.
15 There are currently debates about whether MCQs should include five options or
whether three might be better. The consensus is between three and five options.
16 IRT could be used for Part 2 but would be challenging. Borderline regression assumes a linear relationship, the suitability of which needs to be considered further.
17 There will be political drivers but the standard for the MLA needs to be based on
clinical competency to ensure patient safety.
18 A common standard for all candidates is essential. There is a potential risk of a high
failure rate.
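For context on the borderline regression method discussed above: it derives an OSCE station pass mark by regressing candidates’ checklist scores on examiners’ global ratings and reading off the predicted score at the ‘borderline’ grade, which is why the linearity assumption matters. The sketch below is illustrative only, using invented data and a hand-rolled least-squares fit; it does not represent GMC code.

```python
# Illustrative borderline regression for one OSCE station.
# Global ratings: 0=fail, 1=borderline, 2=pass, 3=good, 4=excellent.
# The station pass mark is the checklist score predicted at the
# 'borderline' grade (rating = 1).

def borderline_regression(ratings, scores, borderline=1.0):
    n = len(ratings)
    mean_x = sum(ratings) / n
    mean_y = sum(scores) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
    sxx = sum((x - mean_x) ** 2 for x in ratings)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # Predicted checklist score at the borderline rating.
    return intercept + slope * borderline

# Invented example data for twelve candidates on one station.
ratings = [0, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4]
scores  = [8, 10, 11, 13, 14, 14, 16, 17, 17, 19, 20, 20]
pass_mark = borderline_regression(ratings, scores)
print(round(pass_mark, 1))  # prints 10.7
```

Because the pass mark is read from a fitted straight line, any marked non-linearity between global ratings and checklist scores (the concern in paragraph 16) would distort the cut score.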
Timeline for development

• Is the tentative timeline for development and implementation appropriate? Is sufficient time set aside eg for blueprinting, IT development and piloting?
• Should we have different timelines for the three Parts?
The Board’s advice
19 The timeline is deliverable, but there is a degree of urgency because, among other things, it would be desirable to give students notice, although the Board noted that the urgency might make this difficult. It is important not to let the timeline slip; it is better to set a longer timeline and stick to it.
20 It will be important to pilot the MLA initially on UK students, as this will help in assuring the standard. It would not be appropriate to roll it out initially to the IMG group.
21 It would be desirable for all three parts to be introduced together, as it is one assessment; otherwise, different cohorts would be treated differently. It was acknowledged that it may be easier to introduce each part separately.
Assessment Advisory Board meeting, 10 May 2016

Agenda item: 4
Report title: Update on piloting of the new format PLAB part 2 stations
Report by: Richard Hankins, Head of Registration Enquiries and Testing, [email protected], 0161 250 6967
Action: To note
Executive summary
From September 2016 part 2 of the PLAB test will have a fully revised format. The new
assessment is broader in scope and longer in duration, consisting of 18 live stations of ten
minutes duration (including two minutes reading time).
To pilot the new stations prior to launch, a series of pilot days has been held. Final year medical students, refugee doctors and junior doctors have undertaken the stations, and feedback on all aspects of station design and performance has been gathered from candidates, role players and examiners. This feedback has been used to amend or edit stations as required. A small number of stations have been discontinued.
Recommendation
The Assessment Advisory Board is asked to note the progress being made with piloting the
new format stations for part 2 of the PLAB test.
Assessment Advisory Board meeting, 10 May 2016
Agenda item 4 – Update on piloting of the new format PLAB part 2 stations
Background
1 In the PLAB review, published in September 2014, the following recommendation was made:

• The GMC should seek to increase the reliability of the part 2 examination. Options include increasing the number and/or length of the OSCE stations, introducing a feedback session after each OSCE for examiners to discuss candidates’ performance, and using two examiners to assess performance. However, the GMC will need to take into account the results of the generalizability study as well as feasibility when determining the most appropriate and proportionate way of increasing the part 2 reliability.
2 An associated generalizability study was undertaken by Dr David Swanson in late 2014. This study made a range of recommendations, including revising the standard setting methodology and increasing the duration of stations and the total number of live stations.
3 In order to meet this recommendation and to respond appropriately to the generalizability study, the PLAB implementation project is seeking to increase the reliability of the part 2 examination by increasing the length of stations to 10 minutes (including 2 minutes preparation time), increasing the number of live stations in a circuit from 14 to 18 and increasing the number of skills assessed in each station. This has resulted in a complete re-writing of PLAB 2 stations, and we are piloting the new stations in advance of their introduction to ensure they operate effectively.
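The link between the number of stations and reliability can be illustrated with the Spearman-Brown prophecy formula, a standard psychometric approximation for the effect of lengthening a test. The baseline reliability of 0.70 below is an invented figure for illustration, not a result from the generalizability study.

```python
# Spearman-Brown prophecy formula: predicted reliability when a test
# is lengthened by factor k (here, stations increase 14 -> 18, so k = 18/14).

def spearman_brown(reliability, k):
    return k * reliability / (1 + (k - 1) * reliability)

# Illustrative only: assume the 14-station circuit had reliability 0.70.
k = 18 / 14
print(round(spearman_brown(0.70, k), 2))  # prints 0.75
```

The formula assumes the added stations behave like the existing ones, so it is an upper-bound style estimate rather than a substitute for the generalizability study itself.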
4 The purpose of piloting is to ensure that the station construct works effectively, that guidance for examiners, candidates and role players is appropriate, that station timing is sufficient and that the room set-up and equipment are fit for purpose. Piloting is not being used to set the standard, which will be set on the day using borderline regression.
5 Piloting of stations is being undertaken principally using fifth year medical students from a range of medical schools. We have also invited refugee doctors from Salford Royal Hospital and junior doctors in the Manchester area to participate.
Format of the day
6 We had originally thought that we would run two different circuits a day, allowing us to pilot 36 stations in a day. At the first pilot on 19 January it quickly became evident that this was not the best format to use, and by the second day we had settled on the routine outlined in Table 2 of Annex A.
7 As the main focus of the day is gathering feedback, an approach was developed
where the circuit is stopped after six stations and feedback gathered from the
examiners. Volunteer candidates provide feedback between each station as do role
players.
Volunteers
8 We gave the volunteers three minutes outside the station to give them time to fill out their feedback from the previous station and read the instructions for the next one. Group feedback from the volunteers was also taken at the end of the day.
Examiner feedback
9 Examiner feedback was generally positive, with the majority of issues concerning the detail of individual stations. Two issues ran throughout:
a The examiners felt that the candidate instructions did not contain enough
information and should be expanded. They felt that in many cases the instruction
to “assess and manage” the patient was not directive enough. There is a balance
to be struck here: we want candidates to behave as they would during a real
consultation, but equally they need to have enough information to situate
themselves in the station.
b At the end of the stations, when the volunteers were explaining management to
the patient, they would often say, for example, that they would do blood tests,
without elaborating on what the blood tests were. This occurred in a number of
forms during the stations and examiners wanted either the role player to ask more
specific questions when the management was explained in vague terms, or to be
able to step in themselves.
Number of stations
10 To date, we have piloted 70 stations. The majority have worked well and will require only relatively minor changes before going live. A few stations have not worked and will need to be reviewed and run again at a further pilot. We expect to pilot another 36 stations in April and May.
11 We are running six station editing days between March and May 2016 to provide new
material and to rewrite piloted stations in need of change. Forming the nucleus of
these days are new stations written by members of the part 2 panel working group,
new stations from examiners and existing stations rewritten by members of the
working group in the new format.
12 The full piloting timetable is shown in Table 1 of Annex A. We anticipate having a
bank of around 100 piloted stations when the new format part 2 goes live in
September 2016, expanding to about 130 by the November 2016 pilot. Piloting will
continue throughout 2017 as we continue to expand the question bank.
Update on piloting of the new format PLAB part 2 stations – Annex A
Table 1, Piloting dates 2016

Date               University   Number of volunteers
19 January 2016    Lancaster    10
20 January 2016    Liverpool    19*
16 February 2016   Leeds        25
02 March 2016      Keele        33
19 April 2016      Manchester
18 May 2016        Sheffield
05 October 2016    Sheffield
29 November 2016   Manchester

*Including one refugee doctor who had passed PLAB
Table 2, Outline Schedule

Task                                          Time
Examiner calibration                          1 hour
Stage 1-6                                     1 hour 6 minutes
Examiner debrief – stations 18-13             30 minutes
Stage 7-12                                    1 hour 6 minutes
Lunch and examiner debrief – stations 12-7    30 minutes
Stage 13-18                                   1 hour 6 minutes
Examiner debrief – stations 6-1               30 minutes
Feedback to students as a group               20-30 minutes
Assessment Advisory Board meeting, 10 May 2016

Agenda item: 5
Report title: Developing joint working between PLAB and the MSC AA
Report by: Richard Hankins, Head of Registration Enquiries and Testing, [email protected], 0161 250 6967
Action: To consider
Executive summary
The GMC has recently begun to develop a formal relationship between the Medical Schools Council Assessment Alliance (MSC AA) and PLAB. The purpose of developing a more formal relationship is to facilitate the sharing of assessment expertise and to enable the PLAB assessment programme to be more closely aligned with UK medical school finals. The MSC AA now has a representative who sits on the PLAB Part 1 panel and, in return, the PLAB Part 1 Chair is attending the MSC AA Question Review Group and Reference Group. This should help support the sharing of information and expertise across assessments, and we hope eventually to be able to share questions between the assessments and to participate in the Common Content project.
This paper covers work to date. We are particularly interested in advice from Board
members on the implications of the technical differences in standard between PLAB and
medical school finals. Julian Hancock (Chair, PLAB Part 1 panel) and Mark Gurnell (MSC
AA representative, PLAB Part 1 Panel) will join the meeting for this paper.
Recommendations
The Assessment Advisory Board is asked:
a To note the work to date in formalising a working relationship with the MSC AA
b To advise on the opportunities and risks involved in participating in the Common
Content project
c To advise on what implications there may be in sharing multiple choice questions between finals, set at the level of graduation, and PLAB, set at the level of the first day of foundation year 2.
Assessment Advisory Board meeting, 10 May 2016
Agenda item 5 – Developing joint working between PLAB and the MSC AA
Background
1 PLAB is a two part assessment taken by most international medical graduate (IMG) applicants for registration. It consists of a 200 item single best answer multiple choice (MCQ) exam followed by an objective structured clinical examination (OSCE), which has recently been revised. PLAB is set at the standard of day one of foundation year 2 practice. It is mapped principally to the curriculum of the Foundation Programme.
2 PLAB content development, item selection and standard setting are overseen by two panels. The part 1 panel is chaired by Dr Julian Hancock, who is a GP and a senior lecturer at Oxford University Medical School. The part 2 panel is chaired by Professor Vikram Jha, who is an obstetrician and Head of the Undergraduate Programme at Liverpool University. Each panel is made up of consultants (or equivalent) with recent experience of supervising Foundation Programme doctors from a selected range of medical specialties. Many of the members of the panels are also engaged in undergraduate assessments or hold appointments as examiners at their Royal College.
3 The Medical Schools Council (MSC) is an affiliation of the UK’s publicly funded medical schools. The Medical Schools Council Assessment Alliance (MSC AA) is a sub-group that works exclusively on assessment policy and projects. In recent years, the MSC AA has developed the Common Content project. This has involved developing a selection of MCQ items of similar style and content to PLAB part 1 questions. A selection of these is shared across the finals papers of UK medical schools.
4 MSC AA common content questions are currently provided to each school without a standard set mark. Each school then uses its preferred methodology (principally modified Angoff) to set scores for the questions, and this data is returned to the MSC AA. A psychometric report is then prepared which allows the MSC AA to evaluate how similar the standards are between medical schools. The results of this pilot will shortly be submitted for publication. The MSC AA Board is currently examining the feasibility of establishing a national standard setting group and, in parallel, exploring the potential for using Item Response Theory (IRT).
5 The MSC AA has also initiated a project and undertaken a survey of practices in clinical exams in UK universities. This work is being led by Dr Amir Sam and is at a relatively early stage. We have not asked to be part of this work stream at this time, although Dr Amir Sam did attend PLAB piloting as an observer.
Opportunities and challenges
6 The largest cost of running any MCQ assessment is the development of sufficient high quality questions. Although these may be re-used a number of times, they require regular editing and review by an expert panel. The MSC AA has a large database of questions that have been used by UK medical schools that could be re-used in PLAB at relatively little expense. Similarly, some of the best performing PLAB questions could be added to this database for use by medical schools.
7 There are some challenges: for example, PLAB is mapped principally to the Foundation Programme curriculum. It has a blueprint and a sampling grid that ensure there is consistency in topics between papers. Undergraduate assessment is mapped to medical school outcomes. Although there is substantial overlap, questions may need to be considered on a case by case basis to ensure that they are suitable for PLAB. They will also have to be sampled in a way that ensures the sampling grid requirements are met.
8 Differences in the expected standards for PLAB and medical school finals candidates will also need to be considered. For PLAB, the pass standard is set, using modified Angoff, at that expected of a foundation year 2 doctor on his or her first day. Medical schools set their standards at the point of graduation using a number of different methods, but most commonly via modified Angoff. However, as foundation year 1 includes very few taught elements, and is instead focused on developing clinical skills, it is likely that any true difference in standard is small. To investigate this further, there may be an opportunity for PLAB to join the MSC AA’s planned national standard setting initiative.
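For readers unfamiliar with the modified Angoff method referred to above: judges estimate the probability that a borderline, ‘just-passing’ candidate would answer each item correctly, and the cut score is the sum of the averaged estimates across items. The sketch below uses invented judge data and is purely illustrative; it does not represent the PLAB or MSC AA procedure in detail.

```python
# Illustrative modified Angoff cut-score calculation.
# judgements[j][i] = judge j's estimated probability that a borderline
# candidate answers item i correctly.

def angoff_cut_score(judgements):
    n_items = len(judgements[0])
    # Average the judges' estimates per item, then sum over items.
    item_means = [
        sum(judge[i] for judge in judgements) / len(judgements)
        for i in range(n_items)
    ]
    return sum(item_means)

# Invented estimates from three judges over five items.
judgements = [
    [0.6, 0.8, 0.5, 0.9, 0.7],
    [0.5, 0.7, 0.6, 0.8, 0.6],
    [0.7, 0.9, 0.4, 0.7, 0.8],
]
print(round(angoff_cut_score(judgements), 2))  # prints 3.4 (out of 5 items)
```

Because both PLAB and most schools aggregate judge estimates this way, the technical difference between the two standards lies mainly in the reference candidate the judges imagine (first day of foundation year 2 versus point of graduation), not in the arithmetic.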
9 A further confounding factor is that PLAB 1 runs four times a year, whereas medical school finals are typically annual. This may impact on the practicalities of working with the Common Content project. However, it also increases the number of opportunities to utilise shared questions without changing the nature of any one PLAB paper.
10 As we move towards the development of a single Medical Licensing Assessment
(MLA), closer working between the GMC and medical schools on assessment will be
essential. We hope that developing closer working relationships between PLAB and
the MSC AA will provide a useful framework and starting point for the joint work that
will be necessary to support the introduction of a single MLA.
Assessment Advisory Board meeting, 10 May 2016

Agenda item: 6
Report title: Update on Medical Licensing Assessment
Report by: Ben Griffith, Policy Manager, Education and Standards Directorate, [email protected], 020 7159 5283
Action: To note
Executive summary
We have been heavily engaged in obtaining advice on the plans for a medical licensing
assessment (MLA), particularly from medical schools, since the very useful discussion with
the Assessment Advisory Board on 5 February 2016. We have also been engaging other
key stakeholders and assessment experts and developing materials for the GMC website.
Informed by advice from the Board, medical schools and others, we are preparing a
consultation document. We plan to take a draft to the GMC Council in September 2016 and
to launch the formal consultation by the year end.
Recommendation
The Assessment Advisory Board is asked to note the update.
Assessment Advisory Board meeting, 10 May 2016
Agenda item 6 – Update on Medical Licensing Assessment
Engagement visits to medical schools
1 Since the very helpful input from the Assessment Advisory Board on 5 February 2016, we have scheduled visits to virtually all medical schools to discuss the medical licensing assessment. The programme of visits will be complete by the end of June 2016.
2 Agendas for the visits vary slightly depending on the needs and interests of each school. However, a meeting with the senior school team for an in-depth and open conversation about the MLA is included in each visit. This is usually followed by a larger meeting which includes a presentation and has an audience of teachers, trainers (including NHS supervisors) and students.
3 We have been welcomed by all schools and, although we have met some people who are opposed to the introduction of the MLA, they have been in the minority and have generally taken a constructive approach to the discussion. Medical schools have been keen to discuss the MLA in some detail and have asked questions about standard setting, assessment formats, assessment policies (such as the number of re-sits allowed and provision for exceptional circumstances) and quality assurance of the assessment.
4 Some emerging themes from early discussions include:
a It is widely accepted that the MLA knowledge assessment could be integrated into
medical school written finals and would be a development of the Medical Schools
Council Assessment Alliance’s common content bank of questions
b There is emerging discussion on possible models for integrating the MLA clinical
assessment into medical school clinical finals
c Medical schools would like the MLA to incorporate assessment of prescribing
d Most medical schools are interested in a discussion about how an MLA could best
assess areas such as professionalism, ethics, resilience and human factors
e Some medical schools are concerned about the impact that the MLA will have on
the way students learn and that students could focus on preparing for the MLA
rather than improving their clinical practice
f Many medical schools have commented that students should not pay fees to sit the MLA
g Several medical schools have raised the point that we will need to work with
University Registrars to get the MLA established as an element of final examinations
h Most schools would like the assessment to be a pass/fail rather than a scored
assessment.
5 On conclusion of the visits we will prepare a summary report for circulation to the medical schools, which we will share with Board members.
Wider advice and engagement
6 In addition to the visits to the medical schools, we have an extensive programme to obtain advice on the MLA through engagement with key stakeholders and the wider community of assessment experts:
a We have continuing discussions with the four UK governments. There is widespread
acceptance of the case for the MLA and a range of views on whether provisional
registration should be abolished
b We have had constructive discussions with the Medical Schools Council (MSC) and
with the MSC Assessment Alliance, which relate to the PLAB exam as well as the
development of the MLA
c We are establishing an Expert Advisory Group specifically to help us develop
detailed arrangements for the MLA. Membership is being determined and Professor
Neil Johnson, chair of the MSC Assessment Alliance, has agreed to chair the group.
The EAG will supplement the more strategic advice that we will obtain from the
Assessment Advisory Board and the Education and Training Advisory Board. We
envisage that the EAG will act as an umbrella for workshops and working groups in
which Board members will be invited to participate
d We meet many other key players on a regular basis, to discuss the MLA and other
matters of mutual interest
e We have recently published a dedicated MLA webpage, drawing on material
developed for the visits to the medical schools
f We are also building an international Reference Community of experts whose views will be sought as we develop proposals for the MLA.
Developing the model for consultation
7 Building on the advice and engagement to date, we are developing proposals for consultation on the MLA.
8 We intend to take a draft consultation document to the GMC Council in September 2016 with a view to launching a formal consultation by the end of the year.
9 We will be seeking advice from Board members in preparing for the consultation as well as during the formal process.
10 We will be consulting on a provisional timeframe for development and
implementation of the MLA. However, we currently envisage that during 2017 we will:
a Complete the formal consultation and reach strategic conclusions having reflected
on the response.
b Develop the MLA assessment blueprint alongside reviewing our Outcomes for
graduates.
c Develop assessment formats and agree examination policies and procedures.
d Start to put in place the necessary infrastructure.
11 We expect to start extensive piloting in 2018 with a view to full implementation of the
MLA from 2022.
Assessment Advisory Board meeting, 10 May 2016

Agenda item: 7
Report title: Acceptable alternatives to the revalidation assessment – update
Report by: Clare Barton, Assistant Director, Registration and Revalidation, [email protected], 0161 923 6589
Action: To consider
Executive summary
In addition to submitting annual returns containing evidence of their practice and participation in annual appraisal, doctors without a Responsible Officer (RO) or a GMC-approved Suitable Person (SP) may be requested to sit the GMC’s revalidation assessment (RA). As an alternative to the RA, we can accept other assessments that provide acceptable information to assist our revalidation decision.
This paper seeks the Board’s advice on the assessments we might accept as an alternative
to sitting the RA and on the steps we could take when a doctor does not reach the
required standard in the RA. Since the Board’s advice in October 2015, we have modified
our policy approach to alternative assessments. We now accept a fitness to practise
performance assessment (as originally proposed), or completion of the UK Royal College or
Faculty assessments leading to membership or fellowship of the relevant College or Faculty
(rather than assessments that form part of the CCT curriculum, as originally proposed).
Where a doctor does not reach the required standard in the RA, we propose to introduce a
second assessment in the form of an ‘expert review’. This will provide further objective
evidence of a doctor’s fitness to practise to enable us to make a revalidation decision.
Recommendations
The Assessment Advisory Board is asked to advise on the:
a Revised policy approach for the assessments we will accept as an alternative to a doctor without a RO or SP undergoing the RA.
b Issues to be considered as we plan for conducting ‘expert reviews’ for any doctor
who does not reach the required standard in the RA.
Agenda item 7 – Acceptable alternatives to the revalidation assessment – update
The GMC’s revalidation assessment
1 If it appears reasonable to us to make such a request, licensed doctors who don’t
have a RO or SP will be asked to undertake an assessment to support their
revalidation. Revalidation decisions about doctors without a connection will generally
be based on a package of assurances that includes the results of the revalidation
assessment together with an annual return containing information about a doctor’s
practice including evidence of appraisals and reflection on their supporting
information.
2 We require completion of an assessment due to the absence of assurance that would
be gained if a doctor had a RO or SP. Doctors with prescribed connections to
designated bodies work in environments that are subject to legislative safeguards in
relation to clinical governance and appraisal set out principally in the RO Regulations.
We seek assurance that doctors with a prescribed connection are up to date and fit to
practise from their RO or SP, a registered and licensed medical practitioner whose
role is to ensure those safeguards are in place.
3 In January 2016 we launched the revalidation assessment and began asking doctors
with no RO or SP to book their assessment. Since January, doctors have been able to
book one of 12 generic tests focused on broad specialities, or a broader, more
generalist test for doctors who have no recent practice or haven’t undertaken or
completed specialty training. These will all be written (multiple choice) knowledge
tests, provided by UCL, and doctors will choose the test most appropriate to their
scope of practice.
4 We have published information and guidance for doctors who need to book a
revalidation assessment. As at 18 April 2016, 25 doctors had booked an assessment
(see Annex A). The first assessments will take place on 20 May 2016: three doctors
have booked to sit the GP test and one doctor has booked to sit the emergency
medicine test.
Acceptable alternative assessments
5 Our powers to request doctors without a RO or SP to undertake an assessment for
revalidation are provided in regulation 6(8) of the Licence to Practise and
Revalidation Regulations 2012 (the Regulations). The Regulations state that we have
discretion to request these doctors to undertake an assessment designed to evaluate
their fitness to practise, which is conducted by us, or ‘accepted by the Registrar as
suitable for the purpose’. It is therefore at the discretion of the Registrar to determine
what is acceptable.
6 We devised a set of criteria to evaluate what would be an acceptable alternative
assessment for these purposes. To be acceptable the assessment had to:
a Demonstrate compliance with professional standards comparable to those set out
in Good medical practice.
b Demonstrate the ability to practise at a level comparable to, or higher than, that
required at point of full registration.
c Be related to medical practice.
d Demonstrate knowledge and/or practical skills.
e Be subject to regular, robust quality assurance and governance procedures.
f Remain valid according to the criteria for that assessment, or validity of the GMC
assessment* (whichever is shorter).
7 Having evaluated a number of potential alternative assessments against these
criteria, we will accept a successful outcome in the following alternative assessments:
a A GMC fitness to practise performance assessment, or
b A UK Royal College or Faculty assessment leading to membership or fellowship of
the relevant College or Faculty. We have published a list on the website.
8 We consider the fitness to practise performance assessment acceptable, because in
addition to meeting the above criteria, it is conducted with the specific purpose of
determining a doctor’s fitness to practise. Additionally the material for the revalidation
assessment will come from the existing bank of questions we use for the knowledge
test component of our test of competence (one of the tools used within the GMC’s
performance assessments).
9 The UK Royal College and Faculty assessments, while not conducted by the GMC, are
subject to quality assurance by the GMC, and therefore we can be assured that they
meet the criteria indicated at paragraph 6.
10 When we asked the Assessment Advisory Board for advice in October 2015, our
original policy approach was to accept College and Faculty assessments that formed
part of the CCT curriculum. We have since refined this approach to successful
completion of the formal written and skills examinations of the UK Royal Colleges and
Faculties which lead to the award of membership or fellowship. This will ensure
consistency and fairness across all training programmes. We will not accept
completion of only some parts of these examinations, because we have been advised
that doing so would undermine the validity of the assessments.

* The revalidation assessment will be valid within the doctor’s current revalidation
cycle, although doctors will usually be invited to sit this in the year in which we
intend to make a revalidation decision.
11 We have developed this approach in order to be fair and proportionate, and to reflect
the fact that a doctor without a RO or SP will only be required to sit an assessment if
it appears reasonable to the GMC for them to do so.
12 As at 20 April 2016, we have revalidated three doctors without a prescribed
connection on the basis of an acceptable alternative assessment (all Royal College of
Surgeons).
Expert review
13 We have no power to withdraw a doctor’s licence to practise solely on the basis of
performance in the revalidation assessment – although a doctor may relinquish his or
her licence to practise at any point. If a doctor sits the GMC’s revalidation assessment
but does not reach the required standard, we will require further objective evidence
of his or her fitness to practise before we can make a revalidation decision.
14 We have reviewed a number of options for further evidence, including asking the
doctor to sit the assessment again or to submit to other assessments/evaluations
that are designed to further investigate their fitness to practise. As with the
revalidation assessment, the doctor would sit/submit to the assessment at their own
expense.
15 We propose to introduce a second assessment in the form of an ‘expert review’. A
GMC-appointed assessor who works in the same field as the doctor will review a fair
sample of the doctor’s work (for example, recent patient records or written reports)
to identify if there are any concerns about the doctor’s fitness to practise that might
warrant further investigation under our fitness to practise procedures. The doctor will
be asked to complete a detailed portfolio so that we can design the second
assessment around their scope of practice.
16 The number of doctors likely to require a second assessment is expected to be very
small as we estimate only 100 doctors might sit the first assessment in 2016. Details
about the doctors that have booked to sit the revalidation assessment as at 20 April
2016 are set out at Annex A.
17 We know it will be difficult to design an expert review for some doctors in the
absence of available patient records. For example, doctors who:
a Solely provide medico-legal reports or advice.
b Undertake research or teaching.
c Work in industry/pharmaceuticals or managerial roles.
d Work wholly overseas.
e Have no recent medical practice.
18 We would welcome the Board’s advice on how we might develop a second
assessment for these doctors if they wish to retain their licence to practise and
cannot connect to a designated body or find a SP.
7 – Annex A
Acceptable alternatives to the revalidation assessment – update
Revalidation assessment bookings
1 The table below sets out the number of doctors who have booked a revalidation
assessment.
Assessment                      Number of doctors
Anaesthetics                    1
Emergency medicine              1
Foundation¹                     2
General practice                4
Histopathology                  1
Medicine                        3
Obstetrics and gynaecology      3
Paediatrics                     3
Psychiatry                      2
Surgery                         5
Total                           25

¹ The generalist test for doctors with no recent practice, or for those who have not
undertaken or completed specialist training.
2 We should treat the information we hold about these doctors with caution as the
numbers are so small. Of the 25 doctors who have booked a revalidation assessment,
we know that:
a 19 have told us (in their most recent annual return) they are based in the UK. The
remainder are based abroad.
b Of the UK-based doctors:
i 10 are doing clinical work which requires a licence to practise.
ii 3 have no current practice (two are on a career break and one is undertaking
non-medical research).
iii 6 are working in roles that may not require a licence, most commonly medico-legal
reporting or teaching.
c 13 have a UK primary medical qualification, and 12 qualified outside the UK.
d 15 are men, and 10 are women.
e 12 are under 50, 9 are aged between 51 and 70, and 4 are over 70.
f 11 are white, 3 are Asian or Asian British, 3 are from other ethnic groups, and we
hold no ethnicity data about the remainder.