
EVIDENCE UTILIZATION

Effectiveness of education in evidence-based healthcare: the current state of outcome assessments and a framework for future evaluations

Mona Nabulsi, Janet Harris, Luz Letelier, Kathleen Ramos, Kevork Hopayian, Claire Parkin, Franz Porzsolt, Piersante Sestini, Mary Slavin, William Summerskill

Abstract

Background A discipline which critically looks at the evidence for practice should itself be critically examined. Credible evidence for the effectiveness of training in evidence-based healthcare (EBHC) is essential. We attempted to summarise the current knowledge on evaluating the effectiveness of training in EBHC while identifying the gaps.

Methods A working group of EBHC teachers developed a conceptual framework of key areas of EBHC teaching and practice in need of evidence mapped to appropriate methods and outcomes. A literature search was conducted to review the current state of research in these key areas. Studies of training interventions that evaluated effectiveness by considering learner, patient or health system outcomes in terms of knowledge, skills, attitude, judgement, competence, decision-making, patient satisfaction, quality of life, clinical indicators or cost were included. There was no language restriction.

Results Of 55 articles reviewed, 15 met the inclusion criteria: six systematic reviews, three randomised controlled trials and six before–after studies. We found weak indications that undergraduate training in EBHC improves knowledge but not skills, and that clinically integrated postgraduate teaching improves both knowledge and skills. Two randomised controlled trials reported no impact on attitudes or behaviour. One before–after study found a positive impact on decision-making, while another suggested change in learners' behaviour and improved patient outcome. We found no studies assessing the impact of EBHC training on patient satisfaction, health-related quality of life, cost or population-level indicators of health.

Conclusions Literature evaluating the effectiveness of training in EBHC has focused on short-term acquisition of knowledge and skills. Evaluation designs were methodologically weak, controlled trials appeared inadequately powered and systematic reviews could not provide conclusive evidence owing to weakness of study designs.

Key words: education, effectiveness, outcome assessment.

Background

Although the number of training programs in evidence-based healthcare (EBHC) is increasing rapidly, most of these programs have not demonstrated the effectiveness of interventions to teach EBHC.1 The popularity of EBHC results from the belief that trainees become skilled at using current best research to provide the most effective and efficient healthcare to their patients.2 However, there is little scientific evidence that links training in evidence-based practice (EBP) to improved learner, patient or health system outcomes.3

In September 2003, delegates attending the Second International Conference for Teachers and Developers of Evidence-Based Healthcare ('Signposting the future of EBHC') suggested a list of issues in EBHC that are in need of further evaluation.4 One outcome was the Sicily Statement, a consensus statement of what EBP means, a description of the skills required to practise in an EB manner and a curriculum that outlines the minimum requirements for training health professionals in EBP.5 Another task set by the conference was the production of a framework for the comprehensive evaluation of training in EBHC.

Correspondence: Dr Kevork Hopayian, Seahills Leiston Road, Aldeburgh, Suffolk IP15 5PL, UK. Email: [email protected]

doi:10.1111/j.1479-6988.2007.00084.x Int J Evid Based Healthc 2007; 5: 437–445

© 2007 The Authors. Journal compilation © Blackwell Publishing Asia Pty Ltd


The task was delegated to a working group at the conference. A second meeting of the group was convened in October 2004 in Ulm, Germany, to carry this forward. Our goal was to summarise the current state of the art on the evaluation of training in EBHC in order to identify the gaps. This paper describes the position arrived at by the group.

Methods

Working group

The first stage of the study was a working group meeting without formal consensus methods to identify key areas of EBHC training for evaluation of effectiveness. The group spanned four continents (Europe, Asia and North and South America), five disciplines (nursing, public health, occupational health, physical therapy and medicine), and included both full-time academics and practising clinicians with academic commitments. The results of this stage of enquiry form the methods of the subsequent stage and are presented here.

We classified what we considered relevant outcomes into three areas: learners, patients and health systems (Table 1). Outcomes were identified for each group. Learner outcomes were further divided into three different domains: affective, cognitive and behavioural. The affective domain includes attitudes, beliefs and intentions. The cognitive domain includes knowledge acquisition and skills development. The behavioural domain includes use of evidence in clinical practice. Patient outcomes were further divided into patient satisfaction, health-related quality of life and improved patient health. System outcomes focused on cost-effectiveness.

We applied the structured approach of EBHC to develop a conceptual framework to search and to evaluate the literature for the second stage. We used the PIO format (population, intervention, outcome) to formulate focused questions for each domain. For learner outcomes, we asked 'Does teaching evidence-based practice to health professionals produce a change in knowledge, attitude, or skills?' For patient outcomes, 'Does teaching evidence-based practice to health professionals lead to better patient outcomes in terms of improved satisfaction, health-related quality of life, or improved health?' For systems outcomes, 'Does teaching evidence-based practice to health professionals produce more cost-effective services?'

Search strategy and study selection

Two authors (MN and JH) independently searched MEDLINE (1980–March 2006), EMBASE (1980–March 2006), CINAHL (1982–March 2006) and COCHRANE (1998–March 2006). Both thesaurus and free text terms were used to identify evaluations of learner, patient and health system outcomes. We searched both as MeSH and free text terms:
• Health professionals AND
• ((Evidence based health care OR Evidence based medicine) AND (Teaching OR Education OR Training)) AND
• ((Knowledge OR Attitudes OR Skills) OR (Patient satisfaction OR Decision making OR Quality of life) OR (Practice patterns OR Judgement OR Competence OR Clinical indicators OR Cost effectiveness))

Papers in any language were considered and reference lists of selected articles were searched for additional relevant studies. No attempt was made to identify unpublished material.
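For illustration only, the Boolean structure above can be assembled programmatically. The sketch below is our reconstruction of the free-text query string, not the authors' actual database syntax, and it does not reproduce the corresponding MeSH headings.

```python
# Illustrative reconstruction of the Boolean search structure described above.
# Free-text terms only; the review also searched the equivalent MeSH headings.

def any_of(*terms: str) -> str:
    """Combine alternative terms with OR."""
    return "(" + " OR ".join(terms) + ")"

def all_of(*groups: str) -> str:
    """Require every group to match by combining with AND."""
    return "(" + " AND ".join(groups) + ")"

query = all_of(
    "Health professionals",
    all_of(
        any_of("Evidence based health care", "Evidence based medicine"),
        any_of("Teaching", "Education", "Training"),
    ),
    any_of(
        any_of("Knowledge", "Attitudes", "Skills"),
        any_of("Patient satisfaction", "Decision making", "Quality of life"),
        any_of("Practice patterns", "Judgement", "Competence",
               "Clinical indicators", "Cost effectiveness"),
    ),
)

print(query)  # paste into the advanced-search interface of each database
```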

We selected papers if they described a teaching or training intervention for EBHC, offered in healthcare settings, and evaluated impact in terms of learner, patient or system outcomes. The study type was left deliberately broad in order to obtain an overview of the various approaches to evaluation, including randomised controlled trials (RCTs), systematic reviews, before–after studies and other designs that assessed the effects of any method of training in EBP. Papers were excluded if they failed to meet the previously mentioned selection criteria, used a non-validated assessment tool or if they focused on training for guidelines. Guidelines were excluded from consideration because their focus is not EBP training, even though their production and dissemination are partly educational. Studies identified in the search that were already discussed in one of the systematic reviews included in our overview were not evaluated separately.

Articles were categorised by study design into systematic reviews, RCTs, before–after studies (with or without a control), and other designs that assessed the effects of any method of training in EBHC as an intervention, on any healthcare learner.

Table 1 Classification of possible outcomes of evidence-based healthcare training

Learner outcomes
  Affective domains:
  • Satisfaction with training
  • Attitudes toward evidence-based practice
  • Intentions to use an evidence-based practice approach
  Cognitive domains:
  • Increased objective knowledge
  • Improved skills
  Behavioural domains:
  • Transfer of knowledge and skills to the workplace

Patient outcomes
  • Increased patient satisfaction
  • Health-related quality of life
  • Improved patient health

System outcomes
  • Population health
  • Cost


Data were extracted by (i) study type; (ii) learner population; (iii) content of teaching; (iv) mode of delivery; (v) duration of teaching; and (vi) outcome. Sampling, confounding, definition of outcomes and measurement tools were reviewed with the aim of assessing the rigour of educational evaluation methods.
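As a concrete illustration only, the extracted items map naturally onto a structured record; the field names below are ours, not a published extraction form.

```python
# Hypothetical data-extraction record mirroring items (i)-(vi) above;
# field names are illustrative, not the authors' actual form.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractedStudy:
    study_type: str            # (i)   e.g. "RCT", "systematic review", "before-after"
    learner_population: str    # (ii)  e.g. "paediatric residents"
    teaching_content: str      # (iii) e.g. "critical appraisal"
    mode_of_delivery: str      # (iv)  e.g. "seminars", "problem-based learning"
    duration_of_teaching: str  # (v)   e.g. "four 2-h sessions"
    outcomes: List[str] = field(default_factory=list)  # (vi) e.g. ["knowledge", "skills"]
```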

Results

Search results

The search identified 3016 articles, of which 2961 were excluded based on the title and/or abstract (Fig. 1). A detailed review was conducted for 55 articles. Fifteen studies met our selection criteria and were included in this overview: six systematic reviews,6–11 three RCTs12–14 and six before–after studies15–20 (Tables 2–4). The population of learners varied among studies and included medical students, residents, practising physicians, nurses and physiotherapists. Training interventions were heterogeneous in terms of the background and skills of the teacher (resident, peer or tutor teaching); mode of delivery (lectures, seminars and problem-based learning); content of teaching (focusing on elements of question formulation, searching skills, critical appraisal and use of evidence for decision-making); duration of teaching (number of seminars and length of time); and provision of supporting materials or resources such as computer access to critically appraised topics or electronic databases. All six systematic reviews, one controlled trial14 and four before–after studies15,16,19,20 assessed knowledge and skills as the main outcome. Change in behaviour, defined as the use of research evidence in clinical practice, was the primary outcome in one RCT,12 while change in attitude was the focus of one cluster RCT.13 Three uncontrolled before–after studies assessed the impact of EBHC training on change in practice behaviour and decision-making,17,18,20 with only one reporting the impact on patients' outcomes.18 We found no studies assessing the effects of training in EBHC on patient satisfaction, health-related quality of life, cost or population-level indicators of health.

Learner outcomes

Affective domain: attitudes, beliefs and intentions

Attitudes were variously defined as attitudes toward medical literature,7 attitudes toward participating in an elective,15 perceived importance for clinical practice,16 attitudes toward the use of research information,12 perceived skills and confidence20 and attitudes toward facilitating integration of EB practice into the workplace.18 We found one cluster-randomised trial that assessed learners' attitudes as a primary outcome of EBHC training.13 In this trial, musculoskeletal physiotherapists of mixed experience were allocated to receive either an EB package (critical appraisal, literature searching and training in low-back pain management, with opinion leaders taking part in training sessions) or in-service training on management of knee dysfunction. A self-administered questionnaire used to assess attitude toward EBP failed to detect any significant change. Participants reported that they depended more upon opinion leaders and textbooks than on primary resources or literature searching to inform treatment decisions.13 However, this study was inadequately powered, with 16 subjects in the intervention group and 11 in the control group. In addition, the groups were different at baseline, and there were too few clusters to use multilevel analysis.
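To make the power problem concrete, the sketch below (our illustration; the effect size is assumed, not taken from the trial) estimates the approximate power of a simple two-arm comparison with 16 and 11 participants, before even allowing for clustering, which would reduce the effective sample size further.

```python
# Rough post-hoc power check for a two-arm comparison of 16 vs. 11 participants,
# assuming a moderate standardised effect (Cohen's d = 0.5) and alpha = 0.05.
# Clustering is ignored here; accounting for it would lower the power further.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().power(effect_size=0.5, nobs1=16, ratio=11 / 16, alpha=0.05)
print(f"Approximate power: {power:.2f}")  # roughly 0.2, well below the usual 0.8 target
```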

Cognitive domain: assessment of knowledge and skills

Five systematic reviews,6–9,11 one controlled trial14 and four before–after studies15,16,19,20 assessed knowledge of EBP as the primary outcome of training in EBP. It is important to note that the definitions of knowledge or skills varied among individual studies. Knowledge was defined variously as knowledge of information sources,7,12,20 information retrieval skills,7 skills in critical appraisal,6,7,14 knowledge about concepts in critical appraisal,12 knowledge of statistics,6,7,19,20 knowledge of epidemiological concepts,7 development of problem-solving skills and development of clinical skills.14,16,18 The level of knowledge was further defined in one study as 'superficial' versus 'deep'.15 Superficial knowledge was assessed via ability to reproduce facts by correctly answering questions about basic principles of EBP, while deep knowledge was explored via clinical scenarios that were designed to assess ability to use evidence concepts in proxy decision-making situations.14–16 Two reviews9,10 acknowledged the heterogeneity of outcome definitions but contained no information on the respective definitions.

The measurement tools used in the different studies were heterogeneous. Some studies used a published validated instrument such as the Fresno test19,21 or Berlin questionnaire,15,22 or modified the validated Fresno test,20 and others used a locally validated tool but lacked detail on the validation process.12,16

Figure 1 Flow diagram of studies: 3016 potentially relevant articles identified and screened for retrieval → 2961 excluded (no evaluation of learner, patient or system outcomes; focus on guidelines training) → 55 selected for full text review → 40 excluded (training intervention not described; non-validated assessment tool) → 15 included (6 systematic reviews; 3 controlled trials; 6 before–after studies).


Table 2 Summary of studies assessing effectiveness of evidence-based healthcare teaching and learning: systematic review findings

Norman & Shannon (1998)6
• Study design (n): CT (4); RCT (3)
• Learner: Medical students; residents
• Intervention: Critical appraisal skills
• Outcome: Knowledge; decision-making
• Validity: Limited search strategy; methodological score: 50–83%; small sample sizes; knowledge as defined in various studies not defined
• Conclusion: Undergraduate: improved knowledge; small effect sizes. Postgraduate: minimal change in knowledge; no change in decision-making

Taylor et al. (2000)7
• Study design (n): CT (10)
• Learner: Medical students; residents
• Intervention: Critical appraisal skills
• Outcome: Perceived competence in ability to critically appraise; evidence-seeking behaviour; knowledge of biostatistics, epidemiology and searching; attitudes toward evidence and use of literature
• Validity: Median quality score of studies 3/10 (weak); instruments not sensitive enough to measure true value of CAS programs
• Conclusion: Improved knowledge; inconclusive results for other outcomes

Parkes et al. (2001)8
• Study design (n): RCT (1)
• Learner: Health professionals
• Intervention: Critical appraisal skills
• Outcome: Knowledge; process of care; patient outcomes
• Validity: Only one study included; small sample size
• Conclusion: Modest improvement in knowledge; no evidence found for other outcomes

Coomarasamy et al. (2003)9
• Study design (n): RCTs (4); controlled, non-randomised (6); before–after (9)
• Learner: Postgraduates
• Intervention: EBM; critical appraisal
• Outcome: Knowledge; skills; attitude; behaviour
• Validity: Heterogeneous study features and methodological quality
• Conclusion: Improved knowledge; no comment on effect sizes; no change in other outcomes

Brettle (2003)10
• Study design (n): RCTs, cohort, qualitative (24)
• Learner: Healthcare professions
• Intervention: Information skills
• Outcome: Skills; patient care
• Validity: Marked heterogeneity of study designs
• Conclusion: Limited evidence on improved skills or change in patient care

Coomarasamy & Khan (2004)11
• Study design (n): RCTs (4); controlled non-randomised (7); before–after (12)
• Learner: Postgraduates
• Intervention: Stand-alone EBM teaching vs. clinically integrated EBM teaching
• Outcome: Knowledge; skills; attitude; behaviour
• Validity: Heterogeneous study features and methodological quality
• Conclusion: Stand-alone: improved knowledge only. Clinically integrated: improved knowledge, skills, attitude & behaviour; no comment on effect sizes

Number in parenthesis, number of studies included in the review. CT, non-randomised controlled trial; EBM, evidence-based medicine; RCT, randomised controlled trial.


Table 3 Summary of studies assessing effectiveness of evidence-based healthcare (EBHC) teaching and learning: controlled trials

Forsetlund et al. (2003)12
• Study design: RCT
• Learner: Norwegian physicians
• Intervention: EBHC workshop; newsletter; information service; electronic database; electronic discussion list
• Outcome: Behaviour: use of EB research; attitudes; knowledge of information sources and concepts; self-efficacy; decision to adopt
• Validity: Adequate randomisation and blinding; validated instruments; adjusted analyses; <80% response; no ITT analyses
• Study conclusion: No change in behaviour; increase in knowledge

Stevenson et al. (2004)13
• Study design: Cluster RCT
• Learner: Physiotherapists
• Intervention: EB program; low-back pain management; opinion leaders taking part in training sessions. Control: in-service training on management of knee dysfunction
• Outcome: Attitude toward EBP
• Validity: Small sample size (E = 16; C = 11); blinding?; groups different at baseline; self-administered validated instrument
• Study conclusion: No change in attitude; getting evidence into practice via literature searching was a low priority and not a highly rated factor in choice of treatment

Major-Kincade et al. (2001)14
• Study design: Quasi-randomised controlled trial
• Learner: Paediatric house staff; NICU rotation
• Intervention: 45–60 min of EB ethics, including critical appraisal
• Outcome: Knowledge, appraisal skills, decision-making in ethical issues
• Validity: Non-validated questionnaire; clinical scenarios; appraisal of prognosis studies
• Study conclusion: Improved knowledge of short-term outcomes; no impact on long-term outcomes; no impact on critical appraisal skills or ethical decision-making

EB, evidence-based; EBP, evidence-based practice; ITT, intention to treat; NICU, neonatal intensive care unit; RCT, randomised controlled trial.


Table 4 Summary of studies assessing effectiveness of evidence-based healthcare teaching and learning: before and after studies

Akl et al. (2004)15
• Study design: Controlled (two groups)
• Learner: Postgraduates
• Intervention: Two EB lectures; resident teaching: resident searched, appraised and fed back on questions of ward team
• Outcome: Knowledge; attitudes regarding acceptability
• Validity: Randomised but imbalanced baseline characteristics; control blinded; validated instrument (adapted Berlin)
• Study conclusion: Improved knowledge (NS); positive attitudes toward EBP

Weberschock et al. (2005)16
• Study design: Uncontrolled; validated instrument
• Learner: Medical students
• Intervention: Four EBM seminars; peer teaching
• Outcome: Self-assessed knowledge; skills defined as peer evaluation of clinical scenarios; acceptability defined as perceived importance for clinical practice
• Validity: Instrument validated by assessing tutors' perception of difficulty with proportion of correct responses
• Study conclusion: Improved knowledge and skills: question formulation, searching, critical appraisal and application to clinical scenario

Straus et al. (2005)17
• Study design: Before–after; no control
• Learner: Attending physicians and residents
• Intervention: Complex intervention: seven 1-h sessions on EBM skills; electronic EB resources on hospital network and ward
• Outcome: Decision-making: provision of EB therapies to patients assessed through pre-post review of discharge summaries of common admitting diagnoses
• Validity: Blinding of outcome assessors; objective outcome assessment; highly relevant clinical diagnoses
• Study conclusion: Increase in high-quality EB interventions

Jeffery et al. (2004)18
• Study design: Before–after; no control
• Learner: Macedonian doctors and nurses
• Intervention: Complex intervention: EBM education delivered in tandem with infrastructural and organisational support
• Outcome: Health system: perinatal mortality rate. Learner: satisfaction with training, clinical competence and problem solving
• Validity: Objective outcome assessment: questionnaire, observation, clinical audit data, use of computer-based protocols
• Study conclusion: EB education delivered within a program targeting specific service improvements and changing management skills reduced perinatal mortality rate by 21.5% in 2 years

Dinkevich et al. (2006)19
• Study design: Uncontrolled
• Learner: Paediatric residents
• Intervention: Four 2-h EBM sessions
• Outcome: Knowledge and skills
• Validity: Validated tool (Fresno)
• Study conclusion: Improved knowledge & skills

McCluskey & Lovarini (2005)20
• Study design: Uncontrolled
• Learner: Occupational therapists
• Intervention: Two-day EBP workshop; multifaceted teaching approach; CATs generation, follow-up outreach support
• Outcome: Knowledge; skills; attitudes; behaviour
• Validity: Validated instrument (adapted Fresno); questionnaire; activity diary of EBP activities; loss to follow up >80%
• Study conclusion: Improved knowledge and skills maintained for 8 months; no change in behaviour

CAT, critically appraised topics; EB, evidence-based; EBM, evidence-based medicine; EBP, evidence-based practice.


Variations in outcome definitions and in assessment tools contributed significantly to heterogeneity among studies, which limited meaningful comparison.

Two systematic reviews6,7 and one uncontrolled before–after study16 assessing the effect of EBHC training on knowledge and skills in undergraduate medical students reported improvement in knowledge but not skills. For postgraduates, the same two reviews6,7 and one controlled trial14 suggested minimal improvement in knowledge only. Two subsequent systematic reviews9,11 concluded that clinically integrated EBHC teaching improved knowledge and skills of residents, compared with stand-alone teaching, which improved knowledge only. Both these reviews included randomised and non-randomised controlled trials, as well as before–after comparison studies, and were heterogeneous with respect to study design and methodological quality. In addition, effect sizes of individual studies were not specified, with outcomes reported as change in knowledge and skills. Findings of two other systematic reviews8,10 evaluating the effects of EBHC training on health professionals in general were consistent in terms of modest knowledge improvement and limited evidence on improved skills. A controlled before–after study15 involving postgraduates found a non-significant improvement in knowledge using the validated Berlin questionnaire, while a more recent uncontrolled before–after study reported improved knowledge and skills of paediatric residents when assessed with the Fresno test.19

Behavioural domain

One before–after study assessed change in the frequency of question formulation, evidence retrieval and critical appraisal.20 This study followed participants over time, finding that although learners sustained an increase in knowledge of terms and resources for EBP, there was little increase in frequency of searching or critical appraisal. We found one RCT12 and one before–after study17 that assessed the impact of EBHC training on physicians' behaviours, using different methods. While the RCT12 reported that a multifaceted training intervention did not lead to greater use of evidence, the before–after study17 reported change in use of evidence in decisions about treatment following EBHC training of a group of residents and attending physicians in a community hospital.

Patient outcomes

Only one study18 reported on patients' health outcomes following EB training of health professionals. Doctors and nurses received a complex intervention consisting of education in EBHC that was delivered in tandem with infrastructural and organisational support, and aimed at reducing the perinatal mortality rate (PMR) in Macedonia, which has one of the highest PMRs in Europe. In addition to the health system outcome, learner outcomes such as satisfaction with training, clinical competence and problem solving were assessed through questionnaires, observation, clinical audit data and use of computer-based protocols. This complex intervention was successful in reducing PMR by 21.5% over 2 years. Although it is difficult to quantify the contribution of EB education to this success, the study is a good example of the importance of contextual factors in enhancing the impact of teaching and learning EBP.

System outcomes

Training in EBHC can affect outcomes at the health service level with respect to population health and quality of care. This was demonstrated by Jeffery et al.,18 who reported that a multistage project that began with training in EBP, when combined with support in the implementation of evidence, brought about system-wide changes in practice. No other studies were found, however, that assessed the contribution of training in EBP to improvements in healthcare systems.

Cost-effectiveness

It is not known whether or not the systematic application of EBHC is cost-effective. We found no study that assessed the impact of EBHC training on the cost of healthcare.

Discussion

Strengths and limitations of the study

Enrolment in the original workshop and invitation to the Ulm workshop were unrestricted, so that the decisions taken are those of freely participating and independent individuals. It could be argued that the self-selected nature of the group raises the possibility that its priorities may not be representative. However, the broad span of its membership lends credibility to its deliberations.

The current state of assessment

Most of the literature evaluating the effectiveness of EBHC training has focused on short-term acquisition of knowledge and skills, with follow-up not exceeding 9 months. Knowledge has been assessed through both validated and unvalidated tools. Although there are validated instruments for evaluating EBP, most instruments have assessed learners' self-reported perceptions of knowledge, skill or performance. A recent systematic review found that retrospective self-assessments of skills are not accurate indicators of skills acquisition.23

There are few assessment tools that capture changes in attitudes, behaviours or skills, particularly in relation to the impact on patient outcomes.24 Systematic reviews could not provide conclusive evidence of significant improvement in knowledge or skills following training in EBHC because of the weakness of study designs and their heterogeneity. Sources of heterogeneity included: variations in learner populations in the different studies, with potential variation in learners' needs; combinations of multiple simultaneous interventions; differences in training interventions in terms of the background and skills of the teacher, mode of delivery, duration of the teaching and learning, and availability of supporting materials and resources; different definitions of knowledge or skills in the various studies; and variations in assessment tools and time frames. The degree of heterogeneity is not surprising, given the wide range of teaching interventions, participants and study designs, but now that this variation has been documented, future reviews should focus on exploring in greater depth how these factors influence outcomes.


Evidence from RCTs was weak because of either poor quality or threats to internal validity, such as small sample size or differences in baseline characteristics between experimental and control groups. Studies that assessed skills often used a proxy measurement such as clinical scenarios. Proxy measures may not capture the barriers in everyday practice that prevent implementation of EB skills, and better observational measures or indicators of changes in practice, such as those used by Straus et al.,17 need to be used.

There is a paucity of research evaluating the link between training in EBHC and longer-term effects. As acknowledged by Forsetlund et al.,12 evaluation tools may capture part of the effect (short-term improvement in knowledge and skills), but not the long-term sustainability of knowledge, skills and attitudes or their translation into changed behaviour of health professionals and improved outcomes for patients. In order to answer whether EBHC training makes a difference, complex interventions are needed, which may represent combinations of multiple simultaneous interventions such as those found in the Forsetlund study. The challenges of establishing a causal link between exposure to an intervention, changes in behaviour and improved outcomes have been extensively discussed elsewhere in evaluation research,25 and methods for establishing causation must be tailored to particular behaviours and outcomes.26 More studies need to be conducted that attempt to establish a relationship between teaching, adoption of EBP and changes in the process of care. The two studies by Straus et al.17 and Jeffery et al.18 are important because they attempted to establish such a link, nested the intervention in the practitioner context, and attempted to describe other contextual factors that may have influenced the impact of training in EBHC.

Future evaluations

In evaluating the impact of training in EBHC, there is a need to shift the focus from educational interventions to evaluation of the teaching and learning pathway. Assessment of the 'big picture' of increased knowledge and skills, durability of attained skills, change in practice and patient outcomes should be a priority for interested researchers. RCTs are the ideal study design; however, they may be difficult to conduct, and it may be unethical to deprive the control group of training in EBHC, especially if the outcome to be assessed is long-term retention of acquired skills. Given the widespread popularity of EBHC concepts, it cannot be guaranteed that the control group remains unexposed to training in EBHC for the duration of the study. Therefore, multi-site studies, perhaps with cluster randomisation, may be needed to overcome the problems of contamination, inadequate power and generalisability. In addition, complex intervention designs, such as the use of mixed qualitative and quantitative methods or the selection of multiple measurement tools, may be helpful in capturing the various factors influenced by training in EBHC. Countries that have attempted to improve their quality of care through re-organisation of their health services, similar to the study by Jeffery et al.,18 are 'natural experiments' where observational research could be conducted.
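As a back-of-the-envelope illustration of the sample-size penalty that cluster randomisation carries (all numbers below are assumptions chosen for illustration, not recommendations), the usual individually randomised sample size can be inflated by the design effect 1 + (m − 1) × ICC, where m is the average cluster size and ICC is the intracluster correlation:

```python
# Illustrative sample-size inflation for a cluster-randomised trial.
# Assumed values: standardised effect d = 0.5, 80% power, alpha = 0.05,
# average cluster size m = 10, intracluster correlation ICC = 0.05.
from math import ceil
from statsmodels.stats.power import TTestIndPower

d, icc, m = 0.5, 0.05, 10

n_per_arm = TTestIndPower().solve_power(effect_size=d, power=0.8, alpha=0.05)
design_effect = 1 + (m - 1) * icc
n_per_arm_clustered = ceil(n_per_arm * design_effect)

print(f"Per-arm n, individual randomisation: {ceil(n_per_arm)}")      # about 64
print(f"Design effect:                       {design_effect:.2f}")    # 1.45
print(f"Per-arm n, cluster randomisation:    {n_per_arm_clustered}")  # about 93
```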

The primary goal of healthcare education is to produce health professionals who deliver high-quality healthcare. However, there has been remarkably little investment in the conceptualisation and study of the association between the process of medical education and quality of care.27

Longer-term behaviour change, which includes consistent consideration of evidence in clinical practice, is much more difficult to attribute to a teaching intervention. Behaviour change among health professionals does not automatically follow from increased knowledge and skills, as it is a product of accumulated experiences rather than a single educational intervention. Although patient outcomes have not been extensively examined, they are the most relevant product of training in EBHC and therefore a priority for evaluation research. Hence, it is essential to identify appropriate and measurable patient health outcomes when examining the effects of EBHC training. Implementation of EBP is very likely to be affected by the extent of institutional support or resistance faced by the health professional. Barriers to practice will certainly limit the observed effect on behaviour change and consequently on patient health outcomes. It is imperative, therefore, that when evaluating the impact of training in EBHC on health systems, the methods include a description of the context in which the intervention takes place. Consideration needs to be given to contextual factors that support either the buttressing or the deterioration of an effect over time. Future research evaluating training in EBHC should also focus on providing evidence for the cost-effectiveness of the interventions. Information produced from this sort of study would help to convince managers, department heads and policymakers to support the transfer of EB knowledge and skills to the workplace.

Conclusions

Most of the literature evaluating the effectiveness of training in EBHC has focused on short-term acquisition of knowledge and skills. There is a paucity of research evaluating the link between training in EBHC and knowledge transfer. Providing credible evidence on the effectiveness of training in EBHC requires concerted efforts on the part of many educators and researchers. We hope that this overview is useful to those interested in evaluation research, and that some years from now we will be able to provide more conclusive answers about the effectiveness of training programs for EBHC.

References

1. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA 2002; 288: 1110–2.
2. Sackett DL, Straus SE, Richardson WS, Rosenberg W, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. London: Churchill Livingstone, 2000.
3. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ 2000; 163: 837–41.
4. Second International Conference of Evidence-Based Health Care Teachers and Developers. Reports from Working Groups, 2003. Accessed 11 Jun 2006. Available from: http://www.ebhc.org/2003/13-9.htm


5. Dawes MG, Summerskill W, Glasziou P et al. Sicily statement on evidence-based practice. BMC Med Educ 2005; 5: 1. Accessed 30 October 2006. Available from: http://www.biomedcentral.com/1472-6920/5/1
6. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. CMAJ 1998; 158: 177–81.
7. Taylor R, Reeves B, Ewings P, Binns S, Keast J, Mears R. A systematic review of the effectiveness of critical appraisal skills training for clinicians. Med Educ 2000; 34: 120–5.
8. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database Syst Rev 2001; 3: CD001270.
9. Coomarasamy A, Taylor R, Khan KS. A systematic review of postgraduate teaching in evidence-based medicine and critical appraisal. Med Teach 2003; 25: 77–81.
10. Brettle A. Information skills training: a systematic review of the literature. Health Info Libr J 2003; 20: 3–9.
11. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ 2004; 329: 1017–22.
12. Forsetlund L, Bradley P, Forsen L, Nordheim L, Jamtvedt G, Bjørndal A. Randomised controlled trial of a theoretically grounded tailored intervention to diffuse evidence-based public health practice. BMC Med Educ 2003; 3: 2.
13. Stevenson K, Dip G, Lewis M, Hay E. Do physiotherapists' attitudes towards evidence-based practice change as a result of an evidence-based educational programme? J Eval Clin Pract 2004; 10: 207–17.
14. Major-Kincade TL, Tyson JE, Kennedy KA. Training pediatric house staff in evidence-based ethics: an exploratory controlled trial. J Perinatol 2001; 21: 161–6.
15. Akl EA, Izuchukwu IS, El-Dika S, Fritsche L, Kunz R, Schünemann HJ. Integrating an evidence-based medicine rotation into an internal medicine residency program. Acad Med 2004; 79: 897–904.
16. Weberschock TB, Ginn TC, Reinhold J et al. Change in knowledge and skills of year 3 undergraduates in evidence-based seminars. Med Educ 2005; 39: 665–71.
17. Straus SE, Ball C, Balcombe N, Sheldon J, McAlister FA. Teaching evidence-based medicine skills can change practice in a community hospital. J Gen Intern Med 2005; 20: 340–3.
18. Jeffery HE, Kocova M, Tozija F et al. The impact of evidence-based education on a perinatal capacity-building initiative in Macedonia. Med Educ 2004; 38: 435–47.
19. Dinkevich E, Markinson A, Ahsan S, Lawrence B. Effect of a brief intervention on evidence-based medicine skills of pediatric residents. BMC Med Educ 2006; 6: 1.
20. McCluskey A, Lovarini M. Providing education on evidence-based practice improved knowledge but did not change behaviour: a before and after study. BMC Med Educ 2005; 5: 40.
21. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ 2003; 326: 319–21.
22. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer HH, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ 2002; 325: 1338–41.
23. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA 2006; 296: 1094–102.
24. Shaneyfelt T, Baum KD, Bell D et al. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA 2006; 296: 1116–27.
25. Pawson R, Tilley N. Realistic Evaluation. London: Sage, 1998.
26. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet 2003; 362: 1225–30.
27. Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med 2004; 79: 955–60.
