Oral and Maxillofacial Surgery Curriculum
5 Programme of Assessment
5.1 Purpose of assessment
Assessment of learning is an essential component of any curriculum. This section describes the assessment system and the purpose of its individual components which are blueprinted to the curriculum as shown in appendix 9. The focus is on good practice, based on fair and robust assessment principles and processes in order to ensure a positive educational impact on learners and to support assessors in making valid and reliable judgements. The programme of assessment comprises an integrated framework of examinations, assessments in the workplace and judgements made about a learner during their approved programme of training. Its purpose is to robustly evidence, ensure and clearly communicate the expected levels of performance at critical progression points in, and to demonstrate satisfactory completion of, training as required by the curriculum. The assessment programme is shown in Figure 3 below.
Assessments can be described as helping learning or testing learning - referred to as formative and summative respectively. There is a link between the two; some assessments are purely formative (shown in green in figure 3), others are explicitly summative with a feedback element (shown in blue) while others provide formative feedback while contributing to summative assessment (shown in orange).
The purposes of formative assessment are to:
- assess trainees’ actual performance in the workplace.
- enhance learning by enabling trainees to receive immediate feedback, understand their own performance and identify areas for development.
- drive learning and enhance the training process by making it clear what is required of trainees and motivating them to ensure they receive suitable training and experience.
- enable supervisors to reflect on trainee needs in order to tailor their approach accordingly.
The purposes of summative assessment are to:
- provide robust, summative evidence that trainees are meeting the curriculum requirements during the training programme.
- ensure that trainees possess the essential underlying knowledge required for their specialty, including the GPCs to meet the requirements of GMP.
- inform the ARCP, identifying any requirements for targeted or additional training where necessary and facilitating decisions regarding progression through the training programme.
- identify trainees who should be advised to consider changes of career direction.
- provide information for the quality assurance of the curriculum.
Figure 3: Assessment framework.
5.2 Delivery of the programme of assessment
The programme of assessment comprises several different types of assessment needed to meet the requirements of the curriculum. Together these generate the evidence required for global judgements to be made about satisfactory trainee performance, progression in, and completion of, training. They include the ISB examination and WBAs. The primary assessment in the workplace is the MCR, which, together with other portfolio evidence, contributes to the AES report for the ARCP. Central to the assessment framework is professional judgement. Assessors are responsible and accountable for these judgements, which are supported by structured feedback to trainees. Assessment takes place throughout the training programme to allow trainees to continually gather evidence of learning and to provide formative feedback to the trainee to aid progression.
Reflection and feedback are also integral components of all WBAs. In order for trainees to maximise the benefit of WBA, reflection and feedback should take place as soon as possible after the event. Feedback should be of high quality and should include a verbal dialogue between trainee and assessor, reflection on the learning episode, attention to the trainee’s specific questions, learning needs and achievements, and an action plan for the trainee’s future development. Both trainees and trainers should recognise and respect cultural differences when giving and receiving feedback9. The assessment framework is also designed to identify where trainees may be running into difficulties. Where possible, these are resolved through targeted training, practice and assessment with specific trainers and, if necessary, with the involvement of the AES and TPD to provide specific remedial placements, additional time and additional resources.
5.3 Assessment framework components
Each of the components of the assessment framework is described below.
5.3.1 The sequence of assessment
Training and assessment take place within placements of six to twelve months’ duration throughout each phase of training (figure 4). Assessments are carried out by relevant qualified members of the trainee’s multi-professional team whose roles and responsibilities are described in appendix 6. Trainee progress is monitored primarily by the trainee’s AES through learning agreement meetings with the trainee. Throughout the placement trainees must undertake WBAs, while specialty examinations are undertaken towards the higher end of the programme, after satisfactory completion of phase 2. The trainee’s CSs must assess the trainee on the five CiPs and nine GPC domains using the MCR. This must be undertaken towards the mid-point of each placement in a formative way and at the end of the placement, when the formative assessment will contribute to the AES’s summative assessment at the final review meeting of the learning agreement. The placement culminates with the AES report of the trainee’s progress for the ARCP. The ARCP makes the final decision about whether a trainee can progress to the next level or phase of training. It bases its decision on the evidence gathered in the trainee’s learning portfolio during the period between ARCP reviews, particularly the AES report from each training placement.
Figure 4: The sequence of assessment through a placement.
5.3.2 The learning agreement
The learning agreement is a formal process of goal setting and review meetings that underpin training and is formulated through discussion. The process ensures adequate supervision during training, provides continuity between different placements and supervisors and is one of the main ways of providing feedback to trainees. There are three learning agreement meetings in each placement and these are recorded in the trainee’s learning portfolio. Any significant concerns arising from the meetings should be fed back to the TPD at each point in the learning agreement.
Objective-setting meeting
At the start of each placement the AES and trainee must meet to review the trainee’s progress so far, agree learning objectives for the placement ahead and identify the learning opportunities presented by the placement. The learning agreement is constructively aligned towards achievement of the high-level outcomes (the CiPs and GPCs) and, therefore, the CiPs and GPCs are the primary reference point for planning how trainees will be assessed and whether they have attained the learning required. The learning agreement is also tailored to the trainee’s progress, phase of training and learning needs. The summative MCR from the previous placement will be reviewed alongside the most recent trainee self-assessment and the action plan for training. Any specific targeted training objectives from the previous ARCP should also be considered and addressed through this meeting and form part of the learning agreement.
Mid-point review meeting
A meeting between the AES and the trainee must take place at the mid-point of a placement (or every three months within a placement that is longer than six months). The learning agreement must be reviewed, along with other portfolio evidence of training such as WBAs, the logbook and the formative mid-point MCR, including the trainee’s self-assessment. This meeting ensures that training opportunities appropriate to the trainee’s own needs are being presented in the placement and are adjusted if necessary in response to the areas for development identified through the MCR. Particular attention must be paid to progress against targeted training objectives, and a specific plan made for the remaining part of the placement if these are not yet achieved. There should be a dialogue between the AES and CSs if adequate opportunities have not been presented to the trainee, and the TPD informed if there has been no resolution. Discussion should also take place about whether the scope and nature of opportunities should change in the remaining portion of the placement.
Final review meeting
Shortly before the end of each placement trainees should meet with their AES to review portfolio evidence including the final MCR. The dialogue between the trainee and AES should cover the overall progress made in the placement and the AES’s view of the placement outcome.
AES report
The AES must write an end of placement report which informs the ARCP. The report includes details of any significant concerns and provides the AES’s view about whether the trainee is on track in the phase of training for completion within the indicative time. If necessary, the AES must also explain any gaps and resolve any differences in supervision levels which came to light through the MCR.
5.3.3 The Multiple Consultant Report
The assessment of the CiPs and GPCs (high-level outcomes of the curriculum) involves a global professional judgement of a range of different skills and behaviours to make decisions about a learner’s suitability to take on particular responsibilities or tasks that are essential to consultant practice at the standard of certification. The MCR assessment must be carried out by the consultant CSs involved with a trainee, with the AES contributing as necessary to some domains (e.g. Quality Improvement, Research and Scholarship). The number of CSs taking part reflects the size of the specialty unit and is expected to be no fewer than two. The exercise reflects what many consultant trainers do regularly as part of a faculty group.
The MCR includes a global rating in order to indicate how the trainee is progressing in each of the five CiPs. This global rating is expressed as a supervision level recommendation described in table 2 below. Supervision levels are behaviourally anchored ordinal scales based on progression to competence and reflect a judgement that has clinical meaning for assessors. Using the scale, CSs must make an overall, holistic judgement of a trainee’s performance on each CiP. Levels IV and V equate to the level required for certification and the level of practice expected of a day-one consultant in the Health Service (level IV) or beyond (level V). Figures 5 and 6 show how the MCR examines performance from the perspective of the outcome of the curriculum, the day-one consultant surgeon, in the GPCs and CiPs. If not at the level required for certification, the MCR can identify areas for improvement by using the CiP or GPC descriptors or, if further detail is required, through free text. The assessment of the GPCs can be performed by CSs, whilst GPC domains 6-9 might be more relevant to assessment by the AES in some placements.
CSs are best placed to recommend supervision levels because they observe the performance of the trainee in person on a day-to-day basis. The CS group, led by a Lead CS, should meet at the mid-point and towards the end of a placement to conduct a formative MCR. Through the MCR, they agree which supervision level best describes the performance of a trainee at that time in each of the five CiPs and also identify any areas of the nine GPC domains that require development. Those who cannot attend the group meeting, or who disagree with the report of the group as a whole, may add their own section (anonymously) to the MCR for consideration by the AES. The AES will provide an overview at the end of the process, adding comments and signing off the MCR.
The MCR uses the principle of highlight reporting, where CSs do not need to comment on every descriptor within each CiP but use them to highlight areas that are above or below the expected level of performance. The MCR can describe areas where the trainee might need to focus development or areas of particular excellence. Feedback must be given for any CiP that is not rated as level IV and in any GPC domain where development is required. Feedback must be given to the trainee in person after each MCR and, therefore, includes a specific feedback meeting with the trainee using the highlighted descriptors within the MCR and/or free text comments.
The mid-point MCR feeds into the mid-point learning agreement meeting, allowing goals to be agreed for the second half of the placement, with an opportunity to specifically address areas where development is required. Towards the end of the placement, the MCR feeds into the final review learning agreement meeting, helping to inform the AES report (figure 4). It also feeds into the objective-setting meeting of the next placement to facilitate discussion between the trainee and the next AES.
The MCR, therefore, gives valuable insight into how well the trainee is performing, highlighting areas of excellence, areas of support required and concerns. It forms an important part of detailed, structured feedback to the trainee at the mid-point and before the end of the placement and can trigger any appropriate modifications for the focus of training as required. The final formative MCR, together with other portfolio evidence, feeds into the AES report which in turn feeds into the ARCP. The ARCP uses all presented evidence to make the definitive decision on progression.
Table 2: MCR anchor statements and guide to recommendation of appropriate supervision level in each CiP.
The four right-hand columns describe the trainer input required at each supervision level.

| MCR Rating Scale (CiPs) | Anchor statements | Does the trainee perform part or all of the task? | Is guidance required? | Is it necessary for a trainer to be present for the task? | Is the trainee performing at a level beyond that expected of a day-one consultant? (c) |
|---|---|---|---|---|---|
| Supervision Level I | Able to observe only: no execution. | no | n/a | n/a | n/a |
| Supervision Level IIa | Able and trusted to act with direct supervision: the supervisor needs to be physically present throughout the activity to provide direct supervision. | yes | all aspects | throughout | n/a |
| Supervision Level IIb | Able and trusted to act with direct supervision: the supervisor will need to be physically present for part of the activity. The supervisor needs to guide all aspects of the activity; this guidance may partly be given from another setting. | yes | all aspects | will be necessary for part | n/a |
| Supervision Level III | Able and trusted to act with indirect supervision: the supervisor may be required to be physically present on occasion. The supervisor does not need to guide all aspects of the activity; for those aspects which do need guidance, this may be given from another setting. | yes | some aspects | may be necessary for part | n/a |
| Supervision Level IV | Able and trusted to act at the level of a day-one consultant. | yes | none (a, b) | none (a, b) | n/a |
| Supervision Level V | Able and trusted to act at a level beyond that expected of a day-one consultant. | yes | none (a) | none (a) | yes |
a. This equates to the level of practice expected of a day-one consultant in the Health Service. It is recognised that advice from senior colleagues within an MDT is an important part of consultant practice. Achievement of supervision level IV indicates that a trainee is able to work at this level, with advice from their trainer at this level being equivalent to a consultant receiving advice from senior colleagues within an MDT. It is recognised that, within the context of a training system, trainees are always under the educational and clinical governance structures of the Health Service.
b. Where the PBA level required by the syllabus is less than level 4 for an operative procedure, it would be expected that mentorship is sought for such procedures and this would fall within the scope of being able to carry out this activity without supervision (level IV), i.e. be a level commensurate with that of a day-one consultant.
c. Achievement of this level across the entirety of an activity would be rare, although free text could describe aspects of an activity where this level has been reached.
In making a supervision level recommendation, CSs should take into account their experience of working with the trainee and the degree of autonomy they were prepared to give the trainee during the placement. They should also take into account all the descriptors of the activities, knowledge, and skills listed in the detailed descriptions of the CiPs. If, after taking all this into account, the CSs feel the trainee is able to carry out the activity without supervision (level IV) then no further detail of this assessment is required, unless any points of excellence are noted. If the trainee requires a degree of supervision to carry out the activity then the CSs should indicate which of the descriptors of the activities, knowledge and skills require further development (to a limit of five items per CiP, so as to allow targets set at feedback to be timely, relevant and achievable). Similarly, if a trainee excels in one or more areas, the relevant descriptors should be indicated. Examples of how the online MCR will look are shown in figures 5 and 6. Figure 7 describes the MCR as an iterative process involving the trainee, CSs, the AES and the development of specific, relevant, timely and achievable action plans.
Multiple Consultant Report – assessment of the GPCs
Figure 5: An example of how the GPCs are assessed through the MCR. CSs would consider whether there are areas for development in any of the nine GPC domains. If not, then nothing further need be recorded. If there are areas for development identified, then CSs are obliged to provide feedback through the MCR. This feedback can be recorded as free text in the comments box indicated. The Descriptors box expands to reveal descriptors taken from the GPC framework. These can be used as prompts for free text feedback or verbatim as standardised language used to describe professional capabilities.
Multiple Consultant Report – assessment of the CiPs
Figure 6: An example of how the CiPs are assessed through the MCR. The CSs would decide what supervision level to recommend for each of the CiPs and record this for each through the Supervision Level box. If the level recommended is IV or V then no further comment need be recorded, unless CSs wished to capture areas of particular excellence for feedback. If levels I to III are recommended then the CSs are obliged to provide feedback through the MCR. This feedback can be recorded as free text in the comments box indicated. The Descriptors box expands to reveal CiP descriptors. These can be used as prompts for free text feedback or verbatim as standardised language to describe the clinical capabilities.
5.3.4 Trainee self-assessment
Trainees should complete the self-assessment of CiPs in the same way as CSs complete the MCR, using the same form and describing self-identified areas for development with free text or using CiP or GPC descriptors. Reflection for insight on performance is an important development tool, and self-recognition of the level of supervision needed at any point in training enhances patient safety. Self-assessments are part of the evidence reviewed when meeting the AES at the mid-point and end of a placement. A wide discrepancy between the self-assessment and the recommendation by CSs in the MCR allows over- or under-confidence to be identified and support to be given accordingly.
Figure 7: The iterative process of the MCR, showing the involvement of CSs, self-assessment by trainees, face to face meetings between trainees and supervisors and the development of an action plan focused on identified learning needs over the next three to six months of training. Progress against these action plans is reviewed by the AES and at the subsequent MCRs.
5.3.5 Workplace-based assessment (WBA)
Each individual WBA is designed to assess a range of important aspects of performance in different training situations. Taken together, the WBAs can assess the breadth of knowledge, skills and performance described in the curriculum. They also constructively align with the clinical CiPs and GPCs, as shown in appendix 9, and will be used to underpin assessment in those areas of the syllabus central to the specialty, i.e. the critical conditions and index procedures. They are also available for other conditions and operations as determined by the trainee and supervisors, and in particular may be used in the assessment of a remediation package to evidence progress in areas of training targeted by a non-standard ARCP outcome. The WBAs described in this curriculum have been in use for over ten years and are now an established component of training.
The WBA methodology is designed to meet the following criteria:
- Validity – the assessment tests what is intended, the methods are relevant to actual clinical practice, and performance in increasingly complex tasks is reflected in the assessment outcome
- Reliability – multiple measures of performance using different assessors in different training situations produce a consistent picture of performance over time
- Feasibility – methods are designed to be practical by fitting into the training and working environment
- Cost-effectiveness – the only significant additional costs should be in the training of trainers and the time investment needed for feedback and regular appraisal; this should be factored into trainer job plans
- Opportunities for feedback – structured feedback is a fundamental component
- Impact on learning – educational feedback from trainers should lead to trainees’ reflection on practice in order to address learning needs.
WBAs use different trainers’ direct observations of trainees to assess the actual performance of trainees as they manage different clinical situations in different clinical settings and provide more granular formative assessment in the crucial areas of the curriculum than does the more global assessment of CiPs in the MCR. WBAs are primarily aimed at providing constructive feedback to trainees in important areas of the syllabus throughout each placement in all phases of training. Trainees undertake each task according to their training phase and ability level and the assessor must intervene if patient safety is at risk. It would be normal for trainees to have some assessments which identify areas for development because their performance is not yet at the standard for the completion of that training.
Each WBA is recorded on a structured form to help assessors distinguish between levels of performance and prompt areas for their verbal developmental feedback to trainees immediately after the observation. Each WBA includes the trainee’s and assessor’s individual comments, ratings of individual competencies (e.g. Satisfactory, Needs Development or Outstanding) and global rating (using anchor statements mapped to phases of training). Rating scales support the drive towards excellence in practice, enabling learners to be recognised for achievements above the level expected for a level or phase of training. They may also be used to target areas of underperformance. As they accumulate, the WBAs for the critical conditions and index procedures also contribute to the AES report for the ARCP.
WBAs are formative and may be used to assess and provide feedback on all clinical activity. Trainees can use any of the assessments described below to gather feedback or provide evidence of their progression in a particular area. WBAs are only mandatory for the assessment of the critical conditions and index procedures (see appendices 3 and 4). They may also be useful to evidence progress in targeted training where this is required e.g. for any areas of concern.
WBAs for index procedures and critical conditions will inform the AES report along with a range of other evidence to aid the decision about the trainee’s progress. All trainees are required to use WBAs to evidence that they have achieved the learning in the index procedures or critical conditions by certification. However, it is recognised that trainees will develop at different rates, and failure to attain a specific level at a given point will not necessarily prevent progression if other evidence shows satisfactory progress.
The assessment blueprint (appendix 9) indicates how the assessment programme provides coverage of the CiPs, the GPC framework and the syllabus. It is not expected that the assessment methods will be used for each competency, and additional evidence may be used to help make a supervision level recommendation. The principle of assessment is holistic; individual GPC and CiP descriptors and syllabus items should not be assessed separately, other than for the critical conditions and index procedures or if an area of concern is identified. The programme of assessment provides a variety of tools for giving feedback to and assessing the trainee.
Case Based Discussion (CBD)
The CBD assesses the performance of a trainee in their management of a patient case to provide an indication of competence in areas such as clinical judgement, decision-making and application of medical knowledge in relation to patient care. The CBD process is a structured, in-depth discussion between the trainee and a consultant supervisor. The method is particularly designed to test higher order thinking and synthesis as it allows the assessor to explore deeper understanding of how trainees compile, prioritise and apply knowledge. By using clinical cases that offer a challenge to trainees, rather than routine cases, trainees are able to explain the complexities involved and the reasoning behind the choices they made. It also enables discussion of the ethical and legal framework of practice. It uses patient records as the basis for dialogue, for systematic assessment and structured feedback. As the actual record is the focus for the discussion, the assessor can also evaluate the quality of record keeping and the presentation of cases. The CBD is important for assessing the critical conditions (appendix 3). Trainees are assessed against the standard for the completion of their phase of training.
Clinical Evaluation Exercise (CEX) / CEX for Consent (CEX(C))
The CEX or CEX(C) assesses a clinical encounter with a patient to provide an indication of competence in skills essential for good clinical care such as communication, history taking, examination and clinical reasoning. These can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available. The CEX or CEX(C) is important for assessing the critical conditions (appendix 3). Trainees are assessed against the standard for the completion of their phase of training.
Direct Observation of Procedural Skills (DOPS)
The DOPS assesses the trainee’s technical, operative and professional skills in a range of basic diagnostic and interventional procedures during routine surgical practice in wards, out-patient clinics and operating theatres. The procedures reflect the common and important procedures. Trainees are assessed against the standard for the completion of core surgical training.
Multi-source Feedback (MSF)
The MSF assesses professional competence within a team working environment. It comprises a self-assessment and assessments of the trainee’s performance from a range of colleagues covering different grades and environments (e.g. ward, theatre, out-patients), including the AES. The competencies map to the standards of GMP and enable serious concerns, such as those about a trainee’s probity and health, to be highlighted in confidence to the AES, enabling appropriate action to be taken. Feedback is in the form of a peer assessment chart, enabling comparison of the self-assessment with the collated views received from the team, and includes their anonymised but verbatim written comments. The AES should meet with the trainee to discuss the feedback on performance in the MSF. Trainees are assessed against the standard for the completion of their training level.
Procedure Based Assessment (PBA)
The PBA assesses advanced technical, operative and professional skills in a range of specialty procedures or parts of procedures during routine surgical practice in which trainees are usually scrubbed in theatre. The assessment covers pre-operative planning and preparation; exposure and closure; intra-operative elements specific to each procedure and post-operative management. The procedures reflect the routine or index procedures relevant to the specialty. The PBA is used particularly to assess the index procedures (appendix 4). Trainees are assessed against the standard for certification.
Surgical logbook
The logbook is tailored to each specialty and allows the trainee’s competence as assessed by the DOPS and PBA to be placed in context. It is not a formal assessment in its own right, but trainees are required to keep a log of all operative procedures they have undertaken including the level of supervision required on each occasion using the key below. The logbook demonstrates breadth of experience which can be compared with procedural competence using the DOPS and the PBA and will be compared with the indicative numbers of index procedures defined in the curriculum.
- Observed (O)
- Assisted (A)
- Supervised - trainer scrubbed (S-TS)
- Supervised - trainer unscrubbed (S-TU)
- Performed (P)
- Training more junior trainee (T)
The following WBAs may also be used to further collect evidence of achievement, particularly in the GPC domains of Quality improvement, Education and training and Leadership and team working:
Assessment of Audit (AoA)
The AoA reviews a trainee’s competence in completing an audit or quality improvement project. It can be based on documentation or a presentation of a project. Trainees are assessed against the standard for the completion of their phase of training.
Observation of Teaching (OoT)
The OoT assesses the trainee’s ability to provide formal teaching. It can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. Trainees are assessed against the standard for the completion of their phase of training.
The forms and guidance for each WBA method can be found on the ISCP website (see section 7).
5.3.6 Intercollegiate Specialty Board Examination
The ISB examination is governed by the Joint Committee on Intercollegiate Examinations (JCIE, www.jcie.org.uk) on behalf of the four surgical Royal Colleges. The JCIE is served by an Intercollegiate Specialty Board in each specialty. The examination is a powerful driver for knowledge and clinical skill acquisition. It has been in existence for over twenty years and is accepted as an important, necessary and proportionate test of knowledge, clinical skill and the ability to demonstrate the behaviours required by the curriculum. The examination is taken after successful completion of phase 2. The standard is set at the knowledge, clinical and professional skills expected of a day-one consultant in the generality of the specialty, and the examination must be passed in order to complete the curriculum. The examination components have been chosen to test the application of knowledge, clinical skills, interpretation of findings, clinical judgement, decision-making, professionalism, and communication skills described within the curriculum. The examination also assesses components of the CiPs and GPCs (as shown in appendix 9) and feeds into the same process as WBA for review by the AES and ARCP.
There are two sections to the exam:
- Section 1 is a computer-based assessment comprising two papers taken on the same day. These are both Single Best Answer (SBA) papers designed to test the application of knowledge and clinical reasoning.
- Section 2 comprises the clinical component of the examination. It consists of a series of carefully designed and structured interviews on clinical topics – some scenario-based and others patient-based. The construct of section 2 allows assessment of the application of knowledge, clinical interpretation, decision-making, clinical judgement and professionalism.
Standard setting:
- Section 1 is standard set by the modified Angoff method, with one standard error of measurement (SEM) added to the Angoff cut score to generate the eligibility to proceed mark. Section 1 is computer marked. Any questions identified as anomalous (possible wrong answers, negative discriminators etc.) are discussed at the standard setting meeting prior to the Angoff and, if necessary, removed.
- The Section 2 clinical and oral components are calibrated prior to the start of each diet. They are independently marked by examiners working in pairs, with reference to the marking descriptors and the standard agreed at the calibration meeting.
Feedback:
Following section 1, candidates will receive a formal letter from the Board Chair confirming the result and a Final Performance Report which shows:
- Paper 1 (Single Best Answer) Score %
- Paper 2 (Single Best Answer) Score %
- Combined Score %
Following section 2, candidates will receive a formal letter from the Board Chair confirming the result. Unsuccessful candidates will also receive a Final Performance Report showing the name of each station and its pass mark, and the mark achieved by a candidate in each of the stations.
Attempts:
Trainees have a maximum of four attempts at each section of the examination with no re-entry. A pass in section 1 is required to proceed to section 2 and must be achieved within two years of the first attempt. The time limit for completion of the entire examination process is seven years. Pro-rata adjustments to these timescales are permissible for less than full time (LTFT) trainees. Trainees become eligible to sit section 1 following an ARCP outcome 1 at the end of phase 2 of specialty training.
Further details can be found at https://www.jcie.org.uk/content/content.aspx?ID=12
5.3.7 Annual Review of Competence Progression (ARCP)
The ARCP is a formal Deanery/HEE Local Office process overseen and led by the TPD. It scrutinises the trainee’s suitability to progress through the training programme. It bases its decisions on the evidence that has been gathered in the trainee’s learning portfolio during the period between ARCP reviews, particularly the AES report from each training placement. The ARCP would normally be undertaken on an annual basis for all trainees in surgical training. A panel may be convened more frequently for an interim review or to deal with progression issues (either accelerated or delayed) outside the normal schedule. The ARCP panel makes the final summative decision on whether trainees are making appropriate progress and are able to move to the next level or phase of training or to achieve certification.
5.4 Completion of training in OMFS
The following requirements apply to all trainees completing the curriculum and applying for certification and entry to the specialist register.
All trainees seeking certification in OMFS must:
a) be fully registered with the GMC and have a licence to practise (UK trainees) or be registered with the Medical Council in Ireland (Republic of Ireland trainees)
b) be fully registered with the General Dental Council (GDC) or hold a dental qualification recognised by the GDC as fully registrable (UK trainees only)
c) have successfully passed the ISB examination
d) have achieved level IV or V in all the CiPs
e) have achieved the competencies described in the nine domains of the GPC framework
f) have been awarded an outcome 6 at a final ARCP (if applying for specialist registration through certification).
In order to be awarded an outcome 6 at the final ARCP, trainees must be able to satisfy the following specialty-specific guidelines:
a) Generic requirements shared between surgical specialties
Research - Trainees must provide evidence of having met the relevant requirements for research and scholarship. For UK trainees, this can be found in the GMC’s GPC framework. Broadly, this includes capabilities in four areas:
- The demonstration of evidence-based practice.
- Understanding how to critically appraise literature and conduct literature searches and reviews.
- Understanding and applying basic research principles.
- Understanding the basic principles of research governance and how to apply relevant ethical guidelines to research activities.
Quality Improvement - evidence of an understanding of, and participation in, audit or service improvement as defined in the curriculum
Trainees must complete or supervise an indicative number of three audit or quality improvement projects during specialty training. In one or more of these, the cycle should be completed.
Medical Education and training - evidence of an understanding of, and participation in, medical education and training as defined in the curriculum
Trainees must provide evidence of being trained in the training of others and present written structured feedback on their teaching, uploaded to the ISCP portfolio.
Management and leadership - evidence of an understanding of management structures and challenges of the health service in the training jurisdiction
Trainees must provide evidence of training in health service management and leadership and of having taken part in a management-related activity (e.g. rota administration, trainee representative, membership of a working party) or of having shadowed a management role within the hospital.
b) Requirements specific to OMFS
Table 3: Requirements for completion of training in OMFS: a) generic requirements shared between all surgical specialties and b) requirements specific to OMFS. Attainment of these requirements contributes to evidence that the outcomes of training have been met.
Additional courses / qualifications - evidence of having attended specific courses/gained specific qualifications as defined in the curriculum
The Advanced Trauma Life Support® (ATLS®), European Trauma Course, Definitive Surgical Trauma Skills course or equivalent locally provided course(s) meeting the outcomes described.
Specialist conferences - evidence of having attended conferences and meetings as defined in the curriculum appropriate to the specialty
It is recommended that trainees attend national or international meetings during training (e.g. annual meetings of specialty associations or major international equivalents).
Clinical experience - evidence of the breadth of clinical experience defined in the specialty syllabus
Trainees must have participated in on-call rotas and managed emergency cases during their training. Trainees should provide evidence of experience in the breadth of the specialty as defined by the specialty-specific modules. The majority of this experience will be obtained during rotations through recognised OMFS units within a training region. Some elements of the curriculum can only be provided in certain units or regions. Trainees will, therefore, be expected to obtain this experience through formal arrangements with other training regions; the recommended indicative timeframes are:
- 2 weeks craniofacial surgery
- 6 weeks cleft lip and palate surgery
Evidence of experience in aesthetic surgery will be obtained in a number of ways:
- Evidence of the management of patients with craniofacial, facial and reconstructive requirements in everyday OMFS practice
- Evidence of experience in private health care facilities where the JCST standards have been met (JCST Principles for Training in the Private Sector_Nov2018)
- Evidence of the assessment and management of patients with facial/head & neck aesthetic concerns
Operative experience - consolidated logbook evidence of the breadth of operative experience defined in the specialty syllabus
Indicative numbers of procedures are outlined in appendix 4a.
Index Procedures - index procedures are of significant importance for patient safety and to demonstrate a safe breadth of practice.
By certification there should be documented evidence of performance at the level of a day-one consultant by means of the PBA (to level 4 as shown in appendix 4a).
- Surgical removal of impacted and buried teeth
- Drainage of tissue space infection
- Surgical access to airway (tracheostomy/cricothyroidotomy)
- Repair of facial lacerations
- Reduction and fixation of fractures of the mandible (including open reduction of condyle)
- Reduction and fixation of fractures of the midface including nose
- Repair and grafting of fractures of the orbital floor
- Excision & reconstruction of facial skin defects
- TMJ arthrocentesis
- Bone graft
- Ramus osteotomy of the mandible
- Le Fort 1 maxillary osteotomy
- Removal of a parotid lump
- Removal of neck lump including submandibular gland
- Neck dissection
- Raising and insetting of free flap
- Oral resection (Level 3)
- Microvascular anastomosis (Level 3)
(simulated operations are not accepted for this requirement but can be part of teaching and learning)
Critical Conditions - to ensure that trainees have the necessary skills to manage the defined critical conditions.
There should be documented evidence of performance at the level of a day-one consultant by means of the CEX or CBD as appropriate (to level 4 as shown in appendix 3).
- Life-threatening airway compromise
- Sepsis of the head and neck
- Sight-threatening trauma
- Haemorrhage arising from the face, mouth, jaws and neck
- Malignancy of the head and neck
Once these requirements have been met, the ARCP panel may consider the award of outcome 6 having reviewed the portfolio and AES report. Award of outcome 6 allows the trainee to seek recommendation for certification and entry onto the specialist register.
9. Cultural awareness course