Checklists for assessment and certification of clinical procedural skills omit essential competencies: a systematic review
Robert K McKinley,1 Janice Strand,2 Linda Ward,2 Tracey Gray,2 Tom Alun-Jones2 & Helen Miller2

OBJECTIVE To develop generic criteria for the global assessment of clinical procedural competence and to quantify the extent to which existing checklists allow for holistic assessment of procedural competencies.

METHODS We carried out a systematic review and qualitative analysis of published clinical procedural skills assessment checklists and enumerated the contents of each. Source materials included all English-language papers published from 1990 to June 2005, identified from 18 databases, which described or referred to an assessment document for any clinical procedural skill. A pair of reviewers identified key generic themes and sub-themes through in-depth analysis of a subset of 20 checklists, with iterative agreement and independent retesting of a coding framework. The resulting framework was independently applied to all checklists by pairs of reviewers checking for the emergence of new themes and sub-themes. Main outcome measures were identification of generic clinical procedural skills and the frequency of occurrence of each in the identified checklists.

RESULTS We identified 7 themes ('Procedural competence', represented in 85 [97%] checklists; 'Preparation', 65 [74%]; 'Safety', 45 [51%]; 'Communication and working with the patient', 32 [36%]; 'Infection control', 28 [32%]; 'Post-procedural care', 24 [27%]; 'Team working', 13 [15%]) and 37 sub-themes, which encapsulated all identified checklists.

Of the sub-themes, 2 were identified after the initial coding framework had been finalised.

CONCLUSIONS It is possible to develop generic criteria for the global assessment of clinical procedural skills. Two-thirds and one half of checklists, respectively, do not enable explicit assessment of the key competencies 'Infection control' and 'Safety'. Assessment of these competencies may therefore be inconsistent when such checklists are used.

KEYWORDS review [publication type]; clinical competence/*standards; health personnel/*standards; teaching/*methods; teaching materials.

Medical Education 2008: 42: 338–349
doi:10.1111/j.1365-2923.2007.02970.x

INTRODUCTION

Assessment and certification of health care workers' competence is a major priority for all health care educators, regulators, providers and users. All parties need to be assured that new and existing health care workers are competent in the tasks appropriate for their roles. Such assessment and certification is a high-stakes activity which must accurately identify those who are and those who are not competent. The scope of competence certification is changing. It is widening as new groups of health care workers undertake clinical procedures which were traditionally the preserve of other health care professionals. It is also deepening as entrants to health professions are assessed in individual skills to an extent undreamt of in the not-so-distant past, when, at the completion of prescribed training, practitioners were assumed to be competent within the limits of their traditional roles.1–3

The search to improve the quality of assessment has catalysed the development and adoption of the

medical education in review

1 Keele University School of Medicine, Keele University, Keele, UK
2 Directorate of Clinical Education, University Hospitals of Leicester, Leicester, UK

Correspondence: Robert K McKinley, Professor of Academic General Practice, Keele University School of Medicine, Keele University, Keele, Staffordshire ST5 5BG, UK. Tel: 00 44 1782 584667; Fax: 00 44 1782 584637; E-mail: [email protected]

© Blackwell Publishing Ltd 2008. MEDICAL EDUCATION 2008; 42: 338–349

objective structured clinical examination (OSCE), in which the candidate is assessed against explicit, objective and, usually, procedure- and context-specific criteria.4–6 The strengths and weaknesses of the OSCE have been discussed elsewhere,7–9 but we doubt whether the future volume of assessment in health care can be managed using OSCE-type assessments, for two reasons. The first is the reliance on procedure- and context-specific criteria. The list of procedures for which such criteria are required is huge and constantly changing as new procedures are developed and others dropped. Consequently, assessment criteria need constant maintenance to reflect changing practice, indications, equipment and disposables. The second reason for doubt concerns the logistical challenge of sustaining the required volume of traditional OSCE examinations. We doubt that either is sustainable in any resource-limited environment. Furthermore, OSCEs can never faithfully duplicate the clinical environment and their validity is consequently limited.

Nevertheless, alongside the development of the OSCE, our understanding of the contribution of global (Fig. 1) ratings of competence has advanced, and an evidence base for the reliability and validity of assessments based upon them is emerging.10–12 An alternative to developing and maintaining a comprehensive bank of procedure-specific checklists is to develop generic (Fig. 1) global rating scales which are applicable to a wide range of clinical procedures in both simulated and, potentially, workplace assessments. For example, surgical skills are arguably the most technically complex and demanding of clinical procedural skills (Fig. 1), yet a single global rating scale, the objective structured assessment of technical skills (OSATS), enables the reliable assessment of surgical skills across a wide range of surgical procedures.13–15 Furthermore, workplace assessment can achieve the twin goals of reliability and validity.16–18

Competence requires more than technical skill alone: health care workers cannot be considered competent unless they also possess a broader range of humanistic and team competencies. This is recognised explicitly in criteria adopted by the Accreditation Council for Graduate Medical Education (ACGME) in the USA,19 the General Medical Council (GMC), Nursing and Midwifery Council (NMC) and Department of Health in the UK20–22 and other regulatory bodies worldwide.23,24 Any generic criteria for the global assessment of clinical procedural skills must therefore enable holistic (Fig. 1) assessment of competence.

This paper reports the first stage of a research programme to develop a global, holistic, generic assessment tool for competence in clinical procedural skills. We conducted a systematic review to determine whether it was possible to identify generic criteria which would encapsulate the criteria in existing checklists. A secondary research question which arose during the review required us to quantify the extent to which existing checklists allow assessment of humanistic and team competencies (i.e. to examine the validity of current checklists for holistic assessment of clinical procedural skills).

METHODS

Finding relevant studies

We used a sensitive search strategy to retrieve potentially relevant papers from the sources listed in Table 1. These included key health databases as well as education and specialist medical and nurse education sources. The search date was June 2005.

Overview

What is already known on this subject

Rigorous assessment of competence across an increasingly wide range of clinical procedures is required. Development of a comprehensive bank of validated procedure-specific checklists will be demanding. Generic global assessment tools can enable satisfactory assessment.

What this study adds

It was possible to encapsulate published procedural skills checklists in 7 categories and 37 component competencies. Many existing checklists omit safety, infection prevention and teamwork competencies. This must raise concerns about the validity of assessment based on these tools.

Suggestions for further research

Are these omissions reflected in checklists used for everyday assessment of competence, and does this distort learning?

Can a generic tool enable satisfactoryassessment of competence?


Assessment and certification of clinical procedural skills

We combined subject headings representing concepts of clinical competence or educational measurement with subject headings or keywords for procedural skills or procedural competencies or clinical skills or clinical procedures or clinical competencies or practical skills or practical procedures. From these, we identified references containing the terms checklists or toolkits or global rating or global score or global assessment or rating scale or criteria or instrument or tool, using variant word endings as appropriate.

In addition, we searched MEDLINE and EMBASE for key authors to ensure that important references were not missed. JS screened the reference lists of eligible articles and both reviewers reviewed all potentially eligible papers. Finally, we e-mailed all authors of papers published after 1 January 2000 which referred to but did not include checklists in order to request copies.

Papers eligible for the review

RKM (a doctor and medical educator) and JS (an advanced practice research nurse and an educator) screened the titles and abstracts of all papers identified by the electronic search and search of reference lists to identify all English-language papers which matched the criteria in Table 2a. We obtained all papers so identified by either reviewer for detailed examination and included those which met our inclusion criteria (Table 2b). We recorded the agreement between reviewers about including the first batch of titles and abstracts.

Data extraction and analysis

We performed data extraction in 2 phases. First, we developed a coding framework; then we tested the framework against the remaining checklists.

In the first phase, RKM and JS independently reviewed the first 20 papers and extracted data using a semi-structured form. As we each reviewed the papers and checklists, we identified key generic themes which emerged from and within the data. This close inspection and in-depth analysis of the data enabled us to generate themes and sub-themes. Once we had completed our individual data extraction of these first 20 papers, we met to analyse the themes and sub-themes. We agreed a single set of themes and sub-themes which was systematically checked against the data, confirmed and re-confirmed. The second stage of analysis and re-analysis of the themes and the data was carried out by pairs of authors (2 of RKM, JS or

Figure 1 Definitions of terms used


R K McKinley et al

TG, an advanced practice nurse and a nurse educator), who independently analysed the checklists from the remaining papers using a structured recording form which reflected the coding framework. When each pair had finished their analyses, they met to resolve differences by discussion and review of the original checklists. This allowed us to check for the possible emergence of new themes or sub-themes that would necessitate further investigation of previously analysed data. We repeated this process until we agreed that we had developed a framework of themes and sub-themes which encapsulated the initial checklists. Finally, RKM and JS separately reviewed all the checklists to determine and record which themes and sub-themes were represented in them; we resolved disagreements by consensus.

Statistical analysis

The degree of reviewer agreement was assessed using Cohen's kappa.25 We considered a theme was represented in a checklist if any of its constituent sub-themes was addressed by any checklist item. We then determined the number of themes represented in each checklist and the frequency of representation of each theme across all checklists. Differences in the representation of themes in checklists over time were examined using chi-square tests. We used SPSS Version 12.0.1 for statistical analyses (SPSS Inc., Chicago, IL, USA).
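The counting rule described above (a theme counts as represented if any of its sub-themes is addressed) can be sketched as follows. This is an illustrative reconstruction only: the checklist names and sub-theme codes are invented, and the authors' analysis was performed in SPSS, not in code like this.

```python
# Sketch of the theme-representation rule from the Methods section.
# Hypothetical data; codes follow the numbering of Table 3 (e.g. '5.4'
# is a sub-theme of theme 5, 'Safety').
from collections import Counter
from statistics import median

# Each checklist is reduced to the set of sub-theme codes its items address.
checklists = {
    "checklist_A": {"1.1", "1.6", "6.1", "6.3"},
    "checklist_B": {"6.1"},
    "checklist_C": {"2.1", "2.4", "5.4", "6.1", "7.1"},
}

def themes_represented(subthemes):
    # A theme is represented if ANY of its constituent sub-themes
    # (codes sharing the prefix before the dot) is addressed.
    return {code.split(".")[0] for code in subthemes}

# Number of themes represented in each checklist
per_checklist = {name: len(themes_represented(s)) for name, s in checklists.items()}

# Frequency of representation of each theme across all checklists
theme_freq = Counter()
for s in checklists.values():
    theme_freq.update(themes_represented(s))

print(per_checklist)    # e.g. {'checklist_A': 2, 'checklist_B': 1, ...}
print(theme_freq["6"])  # theme 6 ('Procedural competence') in all 3 here
print(median(per_checklist.values()))
```

The same per-checklist counts, summarised as a median and interquartile range, give the figures reported in the Results.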

RESULTS

Finding and selecting studies and checklists

We identified 2994 papers from the literature search and, after review of titles and abstracts, 447 full papers were requested. Cohen's kappa for agreement between the 2 reviewers on obtaining the full papers was 0.79 for the first 216 titles and abstracts reviewed. A total of 77 checklists were found in 67 papers. We e-mailed the authors of 107 papers and received responses from 58, yielding a further 11 eligible checklists from 8 papers (Fig. 2).
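Cohen's kappa, the agreement statistic quoted above, corrects the raw proportion of agreement for the agreement expected by chance from each rater's marginal frequencies. A minimal standard-library sketch (the include/exclude decisions below are invented for illustration, not the study's data):

```python
# Cohen's kappa for two reviewers making include/exclude decisions.
# Illustrative only: the decision lists are invented, not the study data.

def cohens_kappa(rater1, rater2):
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal proportions
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

r1 = ["include", "include", "exclude", "exclude", "include", "exclude"]
r2 = ["include", "exclude", "exclude", "exclude", "include", "exclude"]
print(round(cohens_kappa(r1, r2), 2))  # → 0.67
```

On the conventional interpretation cited in the paper,25 values around 0.6–0.8 indicate 'good' agreement, consistent with the 0.79 reported.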

Table 2 Inclusion and exclusion criteria

(a) For obtaining papers from title and abstract review
Inclusion criteria:
- Published in English, and
- Any mention of assessment, and
- Any mention of results from any group of health care workers

(b) For including papers from review of full papers
Inclusion criteria:
- Containing or referring to a checklist or assessment document, and
- Concerning assessment of any clinical procedural competence or testing the reliability or validity of existing checklists, and
- Reporting results from any group of health care workers
Exclusion criteria:
- Checklists or assessment documents already identified from another paper, or
- Checklist or assessment documents not included or referred to, or
- Not concerning assessment of clinical procedural skills (for example, 'professionalism' or 'communication')

Table 1 Databases searched (from 1990 to June 2005)

- MEDLINE
- EMBASE
- Cumulative Index to Nursing and Allied Health Literature (CINAHL)
- British Nursing Index
- Allied and Complementary Medicine Database
- PsycINFO
- DHData
- King's Fund
- Science Citation Index
- ERIC / British Education Index / Australian Education Index
- Campbell Collaboration (http://www.campbellcollaboration.org/)
- Best Evidence in Medical Education (BEME) (http://www.bemecollaboration.org/)
- Resource Base in Continuing Medical Education (http://www.cme.utoronto.ca/rdrb/)
- Cochrane Library 2005 Issue 2 (http://www.nelh.nhs.uk/cochrane.asp)
- National Health Service Centre for Reviews and Dissemination Databases (http://www.york.ac.uk/inst/crd/)
- Turning Research into Practice (http://www.tripdatabase.com)
- BIOMED Central Online Publishing (http://www.nelh.nhs.uk/core_publishing.asp)
- Research Findings Register (http://www.refer.nhs.uk/ViewWebPage.asp?Page=Home)


Description of the papers

The 75 papers included 88 unique checklists. There were papers from 10 countries: 31 from the USA; 24 from the UK; 9 from Canada; 5 from Australia, and 1 from each of Germany, India, Israel, Norway, Pakistan and Sweden. Some checklists were specific to professional groups (or their respective students): 44 (50%) checklists had been developed for doctors; 28 (32%) for nurses; 5 (6%) for dentists; 3 (3%) for allied health professionals, and 8 (9%) for multi-professional groups. Checklists were also specific to grade: 28 (32%) had been developed for staff grades, 27 (31%) for interns or residents, 22 (25%) for students and 11 (12%) for a combination of health grades. In all, 23 papers reported data from 152 faculty members and 54 from 2627 people to whom the checklists had been applied. We found evidence of reliability in 27 (36%) papers (30 [34%] checklists) and of validity in 23 (31%) papers (27 [31%] checklists) (see Supplementary Material, Table S1, for a full summary of published checklists).

Themes

After 4 iterations, we had identified an initial coding framework with 7 themes and 35 constituent sub-themes. During the second phase we identified 2 additional sub-themes: 'Respect for tissue' and 'Offers appropriate post-procedure care to the patient'. All other items in the checklists were encompassed by the other sub-themes. The final list of themes and sub-themes is shown in Table 3.

The median number (interquartile range) of themes represented in each checklist was 3 (2–5), with 5 themes represented in 19 checklists, 6 themes in 8 and 7 themes in none. Sixteen checklists contained a single theme.

Figure 2 QUOROM statement flow diagram showing papers identified, removed and retained at each stage of the review. Key counts from the diagram: 2994 papers were identified from the electronic search, of which 447 potentially relevant papers were screened for retrieval (294 published before 2000; 153 on or after 1 January 2000) and 275 papers contained no checklist. Of the screened papers, 46 pre-2000 papers (51 checklists) and 21 post-2000 papers (26 checklists) were included directly. The authors of 107 post-2000 papers which referred to but did not include checklists were e-mailed: 49 did not reply; of 58 replies, 21 provided no further information and 29 were not eligible (12 with no checklist, 17 with ineligible checklists), yielding 8 further papers (11 checklists). In total, 75 papers containing 88 checklists were included.

342 ª Blackwell Publishing Ltd 2008. MEDICAL EDUCATION 2008; 42: 338–349

R K McKinley et al

Table 3 Themes and sub-themes identified from checklists and their frequency of representation

Category and component competencies (no. of checklists)

1 Preparation (65)
  1.1 Ensures environment is appropriate (45)
  1.2 Assesses the patient appropriately (30)
  1.3 Appropriately assesses the indications for and against the proposed procedure (10)
  1.4 Plans procedure with respect to patient factors (7)
  1.5 Prepares patient appropriately (19)
  1.6 Selects equipment, disposables and consumables (44)
2 Infection control (28)
  2.1 Washes and/or decontaminates hands (10)
  2.2 Prepares patient's skin appropriately (11)
  2.3 Uses barrier as required (12)
  2.4 Displays appropriate practice of aseptic technique (18)
  2.5 Disposes of waste appropriately (8)
3 Communication and working with the patient (32)
  3.1 Introduces self to patient (11)
  3.2 Shares information about procedure appropriately (26)
  3.3 Listens attentively (3)
  3.4 Answers questions honestly (1)
  3.5 Seeks and obtains consent for procedure (4)
  3.6 Obtains and sustains co-operation throughout (10)
  3.7 Use of communication skills (10)
  3.8 Behaviour towards patient (13)
4 Team working (13)
  4.1 Displays understanding and respect for the roles of team members (7)
  4.2 Communicates appropriately with team (10)
5 Safety (45)
  5.1 Checks patient's identity correctly (5)
  5.2 Checks/completes/requests documentation correctly (7)
  5.3 Labels samples/printouts at the bedside correctly (2)
  5.4 Applies procedure-specific safety measures correctly (37)
  5.5 Aware of limitations of personal competence and role and acts appropriately (6)
  5.6 Ensures own and others' safety (5)
6 Procedural competence (85)
  6.1 Performance of procedure (83)
  6.2 Displays familiarity with equipment (22)
  6.3 Displays knowledge of the procedure (20)
  6.4 Uses assistance appropriately (8)
  6.5 Handles samples or ensures quality control of outputs correctly (4)
  6.6 Displays clinical problem-solving skills (25)
  6.7 Demonstrates respect for tissue (7)
  6.8 Offers appropriate post-procedural advice to patient (8)
7 Post-procedure (24)
  7.1 Completes documentation accurately (23)
  7.2 Cleans and stores equipment (5)


The frequency of representation of the themes in the checklists is shown in Table 4. The most commonly represented theme was 'Procedural competence' (85 [97%] checklists). 'Team working' was least likely to be represented (13 [15%]); 'Infection control' was represented in 28 (32%) and 'Safety' in 45 (51%). To examine whether this had changed with time, we examined the frequency of inclusion of themes in the older half (published during 1990–2001, n = 35) and the newer half (published during 2002–05, n = 30) of the papers. There was no difference in representation of themes between older and newer checklists (Table 4). Table S1, available online, shows the themes represented in each checklist.

DISCUSSION

We identified 7 themes and 37 sub-themes which encompassed all criteria from these 88 checklists, which were used to assess a wide range of professionals and clinical procedures. 'Procedural competence' was represented in 97% of checklists, and 'Communication and working with patients' and 'Preparation', both of which contain important humanistic skills, were represented in 36% and 74%, respectively. 'Infection control' and 'Safety' were represented in only 32% and 51% of checklists, respectively. Thus we conclude, firstly, that it is possible to develop generic criteria for the global assessment of clinical procedural skills which encompass all of the criteria from existing checklists and, furthermore, that existing checklists frequently do not enable assessment of humanistic and teamwork competencies.

Strengths and weaknesses of this review

Identification and inclusiveness of data

As we were concerned to include all relevant data, we utilised a sensitive search strategy across a wide range of relevant databases to identify published literature. We also included papers which either reviewer considered eligible. Nevertheless, the agreement between the reviewers was 'good' to 'very good' as assessed by Cohen's kappa.25 Even with these measures, we identified few eligible checklists, as many of the checklists focused on communication, clinical examination skills and broad professional competencies. Our approach and these data are more analogous to qualitative research and the search for saturation than to the quantitative methods usually used in systematic reviews and meta-analyses. That we found no additional themes and only 2 additional sub-themes in the last 55 papers reviewed strongly suggests that we did reach saturation and that further checklists are unlikely to contribute additional themes or sub-themes.

Methodology

Our methodological approach shares affinities with the main principles of grounded theory, a qualitative

Table 4 Representation of themes in checklists

Theme                                        | 1990–2005 n (%) | 1990–2001 n (%) | 2002–2005 n (%) | χ² (d.f. = 1) | P
Preparation                                  | 65 (74)         | 35 (76)         | 30 (71)         | 0.247         | 0.6
Infection control                            | 28 (32)         | 16 (36)         | 12 (29)         | 0.390         | 0.5
Communication and working with the patient   | 32 (36)         | 19 (41)         | 13 (31)         | 1.017         | 0.3
Team working                                 | 13 (15)         | 9 (20)          | 4 (10)          | 1.758         | 0.2
Safety                                       | 45 (51)         | 24 (52)         | 21 (50)         | 0.420         | 0.8
Procedural competence                        | 85 (97)         | 43 (93)         | 42 (100)        | 2.836         | 0.2*
Post-procedural care                         | 24 (26)         | 16 (35)         | 8 (19)          | 2.741         | 0.1
Total                                        | 88 (100)        | 46 (100)        | 42 (100)        |               |

* Fisher's exact test
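The per-theme comparisons in Table 4 are standard 2 × 2 chi-square tests with 1 degree of freedom. As an illustration, the 'Preparation' row (the theme represented in 35 of 46 older and 30 of 42 newer checklists) can be reproduced with a standard-library sketch; the helper below is our reconstruction, not the authors' SPSS analysis.

```python
# Pearson chi-square for a 2 x 2 table (no continuity correction),
# reproducing the 'Preparation' row of Table 4.
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic and p-value for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Survival function of the chi-square distribution with 1 d.f.
    p = erfc(sqrt(chi2 / 2))
    return chi2, p

# Older checklists: 35 represent the theme, 11 do not.
# Newer checklists: 30 represent the theme, 12 do not.
chi2, p = chi_square_2x2(35, 11, 30, 12)
print(round(chi2, 3), round(p, 1))  # → 0.247 0.6
```

The output matches the tabulated values (χ² = 0.247, P = 0.6), confirming that no continuity correction was applied in the reported tests.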


methodology.26 This involved systematically collecting, analysing, comparing and coding data which, in this instance, were the checklists we identified. The purpose of this was to generate themes and sub-themes as emergent properties of the data which encapsulated all of the items on all checklists; our only a priori assumption was that humanistic and team competencies were important aspects of clinical procedural skills. To increase methodological rigour, a pair of reviewers (a doctor and an advanced practice research nurse) undertook an iterative analysis until the data no longer revealed new themes or sub-themes. The realisation that many checklists did not address important skills domains prompted us to a further quantitative examination of the data.

Quality of papers and checklists

Although the Consolidated Standards of Reporting Trials (CONSORT)27 and Quality of Reporting of Meta-analyses (QUOROM)28 statements reflect consensus about methodological quality markers for randomised controlled trials and systematic reviews, respectively, there are no corresponding criteria with which to judge the quality of these source checklists. We chose to determine firstly whether any evidence for the reliability or validity of the checklists was offered. Only 46% satisfied either of these criteria; those which did not were reported either as outcome measures in educational evaluations or as parts of descriptions of curricula or assessment procedures. Secondly, we examined the content validity of all checklists by assessing whether they included any aspect of each of the 7 themes of competence, and found that 48% included 4 or more themes. Measured by these criteria, the quality of the checklists was generally not high. Nevertheless, it is arguable whether such criteria are appropriate in a study where the aim was to identify and summarise all the aspects of clinical procedural competencies which have previously been assessed by seeking saturation of themes, rather than by a bias-prone quantitative estimate of effect size, as is the case in a meta-analysis of randomised controlled trials.

Validity of approach

We deliberately incorporated a diverse literature in this review as we wanted to identify, inclusively, potential domains for a new assessment tool. The analysis was performed by experienced doctor and nurse educators to help ensure that the perspectives of the 2 major health care professions were represented, but we will include other doctors, nurses and allied health professionals in the further development of the tool. This cross-professional approach to competence assessment is novel, but we argue that when professionals from different groups perform the same task, they should be assessed by the same criteria, irrespective of whether they are doctors, nurses, dentists or allied health professionals. We recognise that different staff groups have contrasting and often complementary roles in procedural skills, such as when one person performs a clinical procedure and another assists. We envisage that the eventual tool could be used to assess these distinct roles and skills, as other global assessment tools are used to assess quite different but related tasks.13–15

An obvious objective for future research is to determine whether this is actually possible.

Representation of themes

RKM and JS judged that a theme was represented if any item in a checklist addressed any of its sub-themes. We do not claim that such representation means that a theme is comprehensively assessed by a checklist. Conversely, however, when a theme was not represented in a checklist, we could discern no evidence that it would be assessed in an assessment based on that checklist. Although we acknowledge that not all of these themes are applicable to all clinical procedures, we argue that 'Safety' and 'Infection control' are ubiquitous concerns in all health care settings. They were completely missing from, and therefore could not be assessed by, a large proportion of published checklists; many others enabled only partial assessment of these competencies. We argue, therefore, that many clinical procedural checklists do not have content validity for the global holistic assessment of skills.

Implications of this review

We have successfully identified a set of 7 themes and 37 sub-themes from this systematic literature review. We believe that they have potential as the basis of a generic clinical procedural skills assessment tool. We are currently refining them into such a tool, designated the Leicester Clinical Procedure Assessment Tool.

This review vividly highlights the fact that many checklists assess a limited range of competencies. They may allow assessment of technical skills but tell us little about whether the health worker performs in a manner which minimises the risk of health care-acquired infection, maximises safety, yet simultaneously delivers humane, team-based health care.


No-one would argue that our patients deserve less. Safety and infection prevention skills are as fundamental to high-quality health care as technical skills, and teamwork is vital because no-one works in isolation. Assessment is a powerful driver for learning as 'learners respect what we inspect',29,30 but, given the absence of safety and infection prevention skills from so many assessment documents, can we expect our learners to respect them? The rationale for and the need to re-integrate the teaching and assessment of humanistic, team-working and technical skills have recently been highlighted,31,32 but these data dramatically demonstrate that key skills domains are not assessed in the context in which they are practised, and this review offers no evidence that this is improving. The laudable aims of those who set our professional standards are ill served by this assessment gap, a fact which should concern all who teach, employ and consult health care professionals.

Contributors: the hypothesis that generic criteria for the assessment of clinical procedural skills could be developed was developed by RKM in conjunction with TA-J and HM. TA-J and HM obtained the funding for the study. The methodology was developed by RKM and JS. The search strategy was developed by LW in collaboration with RKM and JS. LW performed the electronic searches. RKM, JS and TG performed the review. RKM wrote the first draft of the paper. All authors contributed to subsequent drafts and approved the final manuscript. RKM is the guarantor for the paper.

Acknowledgements: the authors thank Dr Angela Lennox and Professor Stewart Peterson, Department of Medical and Social Care Education, University of Leicester, who stimulated the development of the hypothesis; Dr Gerald Murtagh, Department of Health Sciences, University of Leicester, who commented on methodological aspects of the study, and Dr Adrian Hastings, Department of Medical and Social Care Education, University of Leicester, who commented on educational aspects of the paper.

Funding: this paper reports a study which is part of a programme funded by the Leicestershire, Northamptonshire and Rutland Workforce Development Confederation (LNRWDC), the National Health Service University (NHSU) and University Hospitals of Leicester (UHL). The LNRWDC and UHL are, and the NHSU was until its dissolution on 31 April 2005, represented on the programme steering group, but did not contribute to the design of this study, the literature search, selection of papers, data extraction, analysis or interpretation, or the writing of the paper.

Conflicts of interest: JS, LW, TG, TA-J and HM are employed by University Hospitals of Leicester, which is a provider of training for health care personnel. Much of this training is informed by the research of which this study is part. RKM has a longstanding research and teaching interest in the assessment and enhancement of skills.

Ethical approval: this study was approved by the Leicestershire Research Ethics Committee.

REFERENCES

1 Mason WT, Strike PW. See one, do one, teach one – is this still how it works? A comparison of the medical and nursing professions in the teaching of practical procedures. Med Teach 2003;25(6):664–6.

2 American Nurses Association. Code of Ethics for Nurses with Interpretive Statements. http://www.nursingworld.org/ethics/code/protected_nwcoe303.htm#prov4. 2005. [Accessed 2 February 2006.]

3 Rutherford J, Leigh J, Monk J, Murray C. Creating an organisational infrastructure to develop and support new nursing roles – a framework for debate. J Nurs Manag 2005;13(2):97–105.

4 Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13(1):41–54.

5 Harden RM. What is an OSCE? Med Teach 1988;10(1):19–22.

6 Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Med Educ 1988;22(4):325–34.

7 Wass V, Jones R, van der Vleuten CPM. Standardised or real patients to test clinical competence? The long case revisited. Med Educ 2001;35(4):321–5.

8 Wass V, van der Vleuten CPM. The long case. Med Educ 2004;38(11):1176–80.

9 Norman G. The long case versus objective structured clinical examinations. BMJ 2002;324(7340):748–9.

10 van Luijk SJ, van der Vleuten CPM. A comparison of checklists and rating scales in performance-based testing. In: Hart IR, Harden RM, Des Marchais J, eds. Current Developments in Assessing Clinical Competence. Ottawa: IRH Medical Ltd 1992;357–62.

11 Keynan A, Friedman M, Benbassat J. Reliability of global rating scales in the assessment of clinical competence of medical students. Med Educ 1987;21(6):477–81.

12 Regehr G, Freeman R, Robb A, Missiha N, Heisey R. OSCE performance evaluations made by standardised patients: comparing checklist and global rating scores. Acad Med 1999;10(Suppl):135–7.

13 Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997;84(2):273–8.

14 Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative 'bench station' examination. Am J Surg 1997;173(3):226–30.

15 Winckel CP, Reznick RK, Cohen R, Taylor B. Reliability and construct validity of a structured technical skills assessment form. Am J Surg 1994;167(4):423–7.


16 Fraser RC, Sarkhou ME, McKinley RK, van der Vleuten CPM. Regulatory end-point assessment of the consultation competence of family practice trainees in Kuwait. Eur J Public Health 2006;12:100–7.

17 Fraser RC, Lee RSY, Yui YK, Lam CLK, McKinley RK, van der Vleuten CPM. Regulatory assessment of the consultation competence of family physicians in Hong Kong. The Hong Kong Practitioner 2004;26:5–15.

18 McKinley RK, Fraser RC, van der Vleuten C, Hastings AM. Formative assessment of the consultation performance of medical students in the setting of general practice using a modified version of the Leicester Assessment Package. Med Educ 2000;34:573–9.

19 Accreditation Council for Graduate Medical Education. General Competencies: ACGME Outcome Project 1999. http://www.acgme.org/outcome/comp/compFull.asp. [Accessed 1 June 2005.]

20 General Medical Council. Maintaining Good Medical Practice. London: GMC 1998.

21 Nursing and Midwifery Council. The Code of Professional Conduct: Standards for Performance, Conduct and Ethics. London: NMC 2004.

22 Department of Health. The Essence of Care: Patient-focused Benchmarking for Health Care Practitioners. http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4005475. 2001. [Accessed 11 June 2007.]

23 New South Wales Medical Board. Good Medical Practice. Duties of a Doctor Registered with the New South Wales Medical Board. http://www.medeserv.com.au/nswmb/publications/Code_-_Duties_of_a_Doctor_ams.pdf. 2004. [Accessed 12 December 2005.]

24 Medical Council of Canada. Objectives for the Qualifying Examination. http://www.mcc.ca/Objectives_online/. 2004. [Accessed 1 December 2005.]

25 Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics 1977;33(1):159–74.

26 Glaser B, Strauss A. The Discovery of Grounded Theory. Chicago: Aldine 1967.

27 Altman DG, Schulz KF, Moher D, Egger M, Davidoff F, Elbourne D, Gotzsche PC, Lang T, CONSORT Group (Consolidated Standards of Reporting Trials). The revised CONSORT statement for reporting randomised trials: explanation and elaboration. Ann Intern Med 2001;134(8):663–94.

28 Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of Reporting of Meta-analyses. Lancet 1999;354(9193):1896–1900.

29 Swanson DB, Norman GR, Linn RL. Performance-based assessment: lessons from the health professions. Educ Res 1995;25(5):5–11.

30 Norcini J. The Importance of Workplace-based Assessment in the Foundation Programme. http://www.mmc.nhs.uk/download/The_importance_of_workplace_based_assessment_in_the_Foundation_Programme.ppt. London: Modernising Medical Careers 2006. [Accessed 5 April 2006.]

31 Moorthy K, Vincent C, Darzi A. Simulation-based training. BMJ 2005;330(7490):493–4.

32 Kidd J, Patel V, Peile E, Carter Y. Clinical and communication skills. BMJ 2005;330(7488):374–5.

33 Bousdras V, Aghabeigi B, Petrie A, Evans A. Assessment of surgical skills in implant dentistry. Int J Oral Maxillofac Implants 2004;19(4):542–8.

34 Evans AW. Assessing competence in surgical dentistry. Br Dent J 2001;190(7):343–6.

35 Macluskey M, Hanson C, Kershaw A, Wight AJ, Oghden GR. Development of a structured clinical operative test (SCOT) in the assessment of practical ability in the oral surgery undergraduate curriculum. Br Dent J 2004;196(4):225–8.

36 Mossey PA, Newton JP. The structured clinical operative test (SCOT) in dental competency assessment. Br Dent J 2001;190(7):387–90.

37 Scott BJ, Evans DJ, Drummond JR, Mossey PA, Stirrups DR. An investigation into the use of a structured clinical operative test for the assessment of a clinical skill. Eur J Dent Educ 2001;5(1):31–7.

38 Barsuk D, Ziv A, Lin G, Blumenfeld A, Rubin O, Keidan I, Munz Y, Berkenstadt H. Using advanced simulation for recognition and correction of gaps in airway and breathing management skills in pre-hospital trauma care. Anesth Analg 2005;100(3):803–9.

39 Bjork I, Kirkevold M. From simplicity to complexity: developing a model of practical skill performance in nursing. J Clin Nurs 2000;9(4):620–31.

40 Day T, Farnell S, Haynes S, Wainwright S, Wilson B. Tracheal suctioning: an exploration of nurses' knowledge and competence in acute and high-dependency ward areas. J Adv Nurs 2002;39(1):35–45.

41 Maylor ME. Compression therapy award. Teaching bandaging and Doppler usage for venous ulcer treatment. Br J Nurs 2002;11(20):20.

42 Morse JM, Penrod J, Kassab C, Dellasega C. Evaluating the efficiency and effectiveness of approaches to nasogastric tube insertion during trauma care. Am J Crit Care 2000;9(5):325–33.

43 Norris R, Ngo J, Nolan K, Hooker G. Physicians and lay people are unable to apply pressure immobilisation properly in a simulated snakebite scenario. Wilderness Environ Med 2005;16(1):16–21.

44 Welch SK. Certification of staff nurses to insert enteral feeding tubes using a research-based procedure. Nutr Clin Pract 1996;11(1):21–7.

45 Abraham S. Gynaecological examination: a teaching package integrating assessment with learning. Med Educ 1998;32(1):76–81.

46 Brehmer M, Tolley D. Validation of a bench model for endoscopic surgery in the upper urinary tract. Eur Urol 2002;42(2):175–9.

47 Brotzman GL, Apgar BS. Assessing colposcopic skills: the instructor's handbook. Fam Med 1998;30(5):350–5.

48 Cullen L, Fraser D, Symonds I. Strategies for interprofessional education: the Interprofessional Team Objective Structured Clinical Examination for midwifery and medical students. Nurse Educ Today 2003;23(6):427–33.

49 Kelly MH, Campbell LM, Murray TS. Clinical skills assessment. Br J Gen Pract 1999;49(443):447–50.

50 Nielsen P, Foglia L, Mandel LS, Chow G. Objective structured assessment of technical skills for episiotomy repair. Am J Obstet Gynecol 2003;189(5):1257–60.

51 Agarwal M, Williamson SN. Objective structured practical examination (OSPE). Nurs J India 1999;90(12):276–8.

52 Alinier G. Nursing students' and lecturers' perspectives of objective structured clinical examination incorporating simulation. Nurse Educ Today 2003;23(6):419–26.

53 Barnsley L, Lyon P, Ralston S, Hibbert E, Cunningham I, Gordon F, Field M. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ 2004;38(4):358–67.

54 Drevenhorn E, Hakansson A, Petersson K. Blood pressure measurement – an observational study of 21 public health nurses. J Clin Nurs 2001;10(2):189–94.

55 Gurvis JP, Grey MT. The anatomy of a competency. J Nurs Staff Dev 1995;11(5):247–52.

56 McKane C. Clinical objectives: a method to evaluate clinical performance in critical care orientation. J Nurses Staff Dev 2004;20(3):134–9.

57 Nicol M, Freeth D. Assessment of clinical skills: a new approach to an old problem. Nurse Educ Today 1998;18(8):601–9.

58 Brennan RT, Braslow A, Batcheller AM, Kaye W. A reliable and valid method for evaluating cardiopulmonary resuscitation training outcomes. Resuscitation 1996;32(2):85–93.

59 Cronin C, Cheang S, Hlynka D, Adair E, Roberts S. Videoconferencing can be used to assess neonatal resuscitation skills. Med Educ 2001;35(11):1013–23.

60 Davies N, Gould D. Updating cardiopulmonary resuscitation skills: a study to examine the efficacy of self-instruction on nurses' competence. J Clin Nurs 2000;9(3):400–10.

61 Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavioural ratings. Anesthesiology 1998;89(1):8–18.

62 Hohenhaus SM. Nurse educator. Assessing competency: the Broselow–Luten resuscitation tape. J Emerg Nurs 2002;28(1):70–2.

63 Hollis S, Gillespie N. An audit of basic life support skills amongst general practitioner principals: is there a need for regular training? Resuscitation 2000;44(3):171–5.

64 Kovacs G, Bullock G, Ackroyd S, Cain E, Petrie D. A randomised controlled trial on the effect of educational interventions in promoting airway management skill maintenance. Ann Emerg Med 2000;36(4):301–9.

65 Kuhnigk H, Sefrin P, Paulus T. Skills and self-assessment in cardio-pulmonary resuscitation of hospital nursing staff. Eur J Emerg Med 1994;1(4):193–8.

66 Naik VN, Matsumoto ED, Houston PL, Hamstra SJ, Yeung RY, Mallon J, Martire TM. Fibreoptic orotracheal intubation on anaesthetised patients: do manipulation skills learned on a simple model transfer into the operating room? Anesthesiology 2001;95(2):343–8.

67 O'Connor HM, McGraw RC. Clinical skills training: developing objective assessment instruments. Med Educ 1997;31(5):359–63.

68 Quan L, Shugerman RP, Kunkel NC, Brownlee CJ. Evaluation of resuscitation skills in new residents before and after pediatric advanced life support course. Pediatrics 2001;108(6):110.

69 Whitfield R, Newcombe R, Woollard M. Reliability of the Cardiff Test of basic life support and automated external defibrillation version 3.1. Resuscitation 2003;59(3):291–314.

70 Birnbach DJ, Santos AC, Bourlier RA, Meadows WE, Datta S, Stein DJ, Kuroda MM, Thys DM. The effectiveness of video technology as an adjunct to teach and evaluate epidural anaesthesia performance skills. Anesthesiology 2002;96(1):5–9.

71 Camp-Sorrell D, Fernandez K, Reardon MB. Teaching oncology nurses about epidural catheters. Oncol Nurs Forum 1990;17(5):683–9.

72 Dugger B. Intravenous nursing competency: why is it important? J Intraven Nurs 1997;20(6):287–97.

73 Engum SA, Jeffries P, Fisher L. Intravenous catheter training system: computer-based education versus traditional learning methods. Am J Surg 2003;186(1):67–74.

74 Fearon M. Assessment and measurement of competence in practice. Nurs Stand 1998;12(22):43–7.

75 Hill R, Hooper C, Wahl S. Look, learn, and be satisfied: video playback as a learning strategy to improve clinical skills performance. J Nurses Staff Dev 2000;16(5):232–9.

76 Liddell MJ, Davidson SK, Taub H, Whitecross LE. Evaluation of procedural skills training in an undergraduate curriculum. Med Educ 2002;36(11):1035–41.

77 Mason S, Fletcher A, McCormick S, Perrin J, Rigby A. Developing assessment of emergency nurse practitioner competence – a pilot study. J Adv Nurs 2005;50(4):425–32.

78 Rudzik J. Establishing and maintaining competency. J Intraven Nurs 1999;22(2):69–73.

79 Snelling PC, Duffy L. Developing self-directed training for intravenous cannulation. Prof Nurse 2002;18(3):137–42.

80 St John RE, Malen JF. Contemporary issues in adult tracheostomy management. Crit Care Nurs Clin North Am 2004;16(3):413–30.

81 Szumlas GA. Development of an office-based curriculum of common pediatric primary care skills for residents. Acad Med 2002;77(7):749.

82 Thorburn J, Dean M, Finn T, King J, Wilkinson M. Student learning through video assessment. Contemp Nurse 2001;10(1):39–45.

83 Velmahos G, Toutouzas K, Sillin L, Chan L, Clark R, Theodorou D, Maupin F. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory. Am J Surg 2004;187(1):114–9.

84 Wang T, Schwartz J, Karimipour D, Orringer J, Hamilton T, Johnson T. An education theory-based method to teach a procedural skill. Arch Dermatol 2004;140(11):1357–61.

85 Adrales GL, Donnelly MB, Chu UB, Witzke DB, Hoskins JD, Mastrangelo MJ Jr, Gandsas A, Park AE. Determinants of competency judgements by experienced laparoscopic surgeons. Surg Endosc 2004;18(2):323–7.

86 Backstein D, Agnidis Z, Regehr G, Reznick R. The effectiveness of video feedback in the acquisition of orthopaedic technical skills. Am J Surg 2004;187(3):427–32.

87 Cauraugh JH, Martin M, Martin KK. Modelling surgical expertise for motor skill acquisition. Am J Surg 1999;177(4):331–6.

88 Dayal R, Faries P, Lin S et al. Computer simulation as a component of catheter-based training. J Vasc Surg 2004;40(6):1112–7.

89 Eubanks TR, Clements RH, Pohl D, Williams N, Schaad DC, Horgan S, Pellegrini C. An objective scoring system for laparoscopic cholecystectomy. J Am Coll Surg 1999;189(6):566–74.

90 Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA. Laboratory-based training in urological microsurgery with bench model simulators: a randomised controlled trial evaluating the durability of technical skill. J Urol 2004;172(1):378–81.

91 Jackson MJ, Pandey V, Wolfe JHN. Training for infrainguinal bypass surgery. Eur J Vasc Endovasc Surg 2003;26(5):457–66.

92 Khan NUZ. Plastibell technique of circumcision: an easy and a practical procedure. J Med Sci (Faisalabad, Pakistan) 2004;20(3):277.

93 Matsumoto ED, Hamstra SJ, Radomski SB, Cusimano MD. A novel approach to endourological training: training at the surgical skills centre. J Urol 2001;166(4):1261–6.

94 Mayooran Z, Pearce S, Tsaltas J, Rombauts L, Brown T Jr, Lawrence A, Fraser K, Healy D. Ignorance of electrosurgery among obstetricians and gynaecologists. BJOG 2004;111(12):1413–8.

95 Montague M, Lee M, Hussain SSM. Human error identification: an analysis of myringotomy and ventilation tube insertion. Arch Otolaryngol Head Neck Surg 2004;130(10):1153–7.

96 Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. BMJ 2003;327(7422):1032–7.

97 Rosen J, MacFarlane M, Richards C, Hannaford B, Sinanan M. Surgeon–tool force/torque signatures – evaluation of surgical skills in minimally invasive surgery. Stud Health Technol Inform 1999;62:290–6.

98 Shah S, Thomas G, Brooker J, Suzuki N, Williams C, Thapar C, Saunders B. Use of video and magnetic endoscope imaging for rating competence at colonoscopy: validation of a measurement tool. Gastrointest Endosc 2002;56(4):568–73.

99 Wilhelm D, Ogan K, Roehrborn C, Cadeddu J, Pearle M. Assessment of basic endoscopic performance using a virtual reality simulator. J Am Coll Surg 2002;195(5):675–81.

100 Zhang H, Payandeh S, Dill J, Lomax A. Acquiring laparoscopic manipulative skills: a virtual tissue dissection training module. Stud Health Technol Inform 2004;98:419–21.

101 Boulet JR, Gimpel JR, Dowling DJ, Finley M. Assessing the ability of medical students to perform osteopathic manipulative treatment techniques. J Am Osteopath Assoc 2004;104(5):203–11.

102 Cross V. Approaching consensus in clinical competence assessment: third round of Delphi study of academics' and clinicians' perceptions of physiotherapy undergraduates. Physiotherapy 2001;87(7):341–50.

103 Cross V, Hicks C, Barwell F. Comparing the importance of clinical competence criteria across specialties: impact on undergraduate assessment. Physiotherapy 2001;87(7):351–67.

104 Marshall G, Harris P. A study of the role of an objective structured clinical examination (OSCE) in assessing clinical competence in third year student radiographers. Radiography 2000;6(2):117–22.

105 Olympio MA, Goldstein MM, Mathes DD. Instructional review improves performance of anaesthesia apparatus checkout procedures. Anesth Analg 1996;83(3):618–22.

106 Roberts LW, Mines J, Voss C, Koinis C, Mitchell S, Obenshain SS, McCarty T. Assessing medical students' competence in obtaining informed consent. Am J Surg 1999;178(4):351–5.

SUPPLEMENTARY MATERIAL

The following supplementary material is available for this article:

Table S1. Summary of published checklists.

This material is available as part of the online article from: http://www.blackwell-synergy.com/doi/abs/10.1111/j.1365-2923.2008.02970.x.

(This link will take you to the article abstract.)

Please note: Blackwell Publishing is not responsible for the content or functionality of any supplementary materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.

Received 25 June 2006; editorial comments to authors 16 October 2006, 23 May 2007; accepted for publication 2 October 2007
