Advances in School Mental Health Promotion, Volume 1, Issue 3, July 2008 © The Clifford Beers Foundation & University of Maryland

FEATURE – Special Section: Enhancing Implementation Quality

Maximizing the Implementation Quality of Evidence-Based Preventive Interventions in Schools: A Conceptual Framework

Celene E. Domitrovich (1), Catherine P. Bradshaw (2), Jeanne M. Poduska (3), Kimberly Hoagwood (4), Jacquelyn A. Buckley (5), Serene Olin (4), Lisa Hunter Romanelli (6), Philip J. Leaf (2), Mark T. Greenberg (1) and Nicholas S. Ialongo (5)

(1) Pennsylvania State University Prevention Research Center
(2) Johns Hopkins Center for the Prevention of Youth Violence
(3) American Institutes for Research
(4) Columbia University
(5) Johns Hopkins Center for Prevention and Early Intervention
(6) Resource for Advancing Children's Health (REACH) Institute

ABSTRACT

Increased availability of research-supported, school-based prevention programs, coupled with the growing national policy emphasis on the use of evidence-based practices, has contributed to a shift in research priorities from efficacy to implementation and dissemination. A critical issue in moving research to practice is ensuring high-quality implementation of both the intervention model and the support system for sustaining it. This paper describes a three-level framework for considering the implementation quality of school-based interventions. Future directions for research on implementation are discussed.

Key words: implementation quality; prevention; schools; conceptual framework

Introduction

Moving evidence-based practices into real-world settings is both a high priority and a challenge for researchers, practitioners, and policymakers. A large number of interventions delivered in schools have been shown to be effective in preventing problem behavior in children and adolescents (for reviews see Berryhill & Prinz, 2003; Burns & Symington, 2002; Catalano et al, 2002; Greenberg et al, 2001; Hahn et al, 2007), and an increasing number of federal policies encourage the use of evidence-based practices and programs in schools. Although schools offer an enormous opportunity for prevention of behavioral and mental health problems, unique contextual factors at play in schools influence the quality of implementation of preventive interventions (Elliott & Mihalic, 2004; Schoenwald & Hoagwood, 2001).

This paper aims to promote and improve research on the quality of implementation of preventive interventions in schools. We first establish a definition of implementation that includes characteristics of the intervention itself and characteristics of the intervention's system of support. We then present a multilevel framework of factors that both theory and empirical research suggest may influence the quality with which preventive interventions are implemented in schools. We conclude with a discussion of areas for future research on implementation and diffusion in school settings.

Implementation of evidence-based programs in schools

Children and adolescents spend a considerable amount of time at school, making it an ideal setting for prevention efforts (Kaftarian et al, 2004). Schools are the context in which children with mental health needs are most likely to receive some type of service, particularly children whose problems do not meet diagnostic criteria (Farmer et al, 2003; Leaf et al, 1996; Poduska et al, 2008). With federal funds available to support implementation of evidence-based programs, schools are increasingly able to adopt more comprehensive, public health approaches to providing student support services. The public health approach to school-based prevention is based on an epidemiological understanding of population risk in which preventive interventions followed by treatment and maintenance are delivered to distinct populations and subpopulations on the basis of levels of risk (Hoagwood & Johnson, 2003). Mrazek and Haggerty (1994) defined three levels of preventive interventions. Universal preventive interventions are delivered to an entire population and often aim at strengthening some aspect of the environment. Examples are social-emotional curricula, classroom behavior management strategies, and school-wide behavior support. Universal interventions are backed up by selective interventions and indicated interventions, which are delivered to the segment of the population that is at increased risk and has not responded to interventions earlier in the continuum. However, it is rare that there is cross-level coordination between universal and more targeted prevention efforts.

Despite interest in the use of evidence-based prevention programs by school districts, theory and research remain limited on how to move these programs into general practice with high-quality implementation (Elliott & Mihalic, 2004; Domitrovich & Greenberg, 2000; Schoenwald & Hoagwood, 2001). In fact, although a considerable body of research links student outcomes with quality of program implementation (Derzon et al, 2005; Durlak & DuPre, 2008; Durlak & Weissberg, 2005), there is general consensus in the field that prevention programs implemented in schools outside of efficacy trials or highly controlled research studies are typically not implemented with high quality (Dusenbury et al, 2005; Gottfredson & Gottfredson, 2002; Ringwalt et al, 2004). A critical yet often overlooked aspect of the dissemination process in schools is the influence of contextual factors at multiple levels on the quality of program implementation (Adelman & Taylor, 2003; Burns & Hoagwood, 2005; Dusenbury et al, 2003; Fixsen et al, 2005; Greenhalgh et al, 2005; Rohrbach et al, 2006; Wandersman et al, 2008).

Consistent with a social-ecological framework (Atkins et al, 1998; Bronfenbrenner, 1979) and supported by extant research, we present a multilevel model for considering contextual factors that may affect, either directly or indirectly, the implementation quality of school-based interventions (see Figure 1). The multilevel framework takes into consideration the influences of macro-level factors (for example federal, state, and district policies), school-level factors, and individual-level factors. These contextual factors may have more or less importance depending on the stage of implementation or diffusion (program adoption, implementation, or institutionalization) (Fixsen et al, 2005). In this paper, we focus on the stage of implementation that occurs after a school or a district has decided to adopt a program, but before it is sustained or formally integrated into a system.

Given the focus on implementation quality as the outcome of interest, it is positioned at the center of the proposed model. Implementation quality is the discrepancy between what is planned and what is actually delivered when an intervention is conducted, so it is necessary to specify the model against which actual practice will be measured. As will be described in the next section, the model for both the intervention and its associated support system requires specification.

Defining the intervention and support system model

Two conceptually distinct components must be considered in regard to quality of implementation: the intervention itself and the support system for the intervention (Chen, 1998, 2003; Klein & Sorra, 1996). Interventions are strategies or innovations linked by a causal mechanism to specified, intended outcomes (Chen, 1998, 2003). They can include programs, policies, processes, or principles (Saul et al, 2008). There is tremendous variation between interventions in terms of the risk factors at which they are aimed, the targets of the interventions (individuals, systems, the environment, for example), and the methods through which they operate. The purpose of a support system is to reduce variability in the quality of program implementation by providing the means and establishing the context for the delivery of interventions through elements such as training implementers and providing the infrastructure necessary to coordinate the deployment of the intervention (Greenberg et al, 2001). The intervention and the corresponding support system are independent, though interrelated, components of a whole.

This conceptualization of implementation extends traditional perspectives of program implementation, which have tended to focus solely on fidelity to the intervention components (Moncher & Prinz, 1991). The planned model for each of these components – the intervention and its corresponding support system – should be specified and monitored to ensure replication with high-quality implementation (Greenberg et al, 2001). Given this dual focus, both components are represented as layers in the center of the conceptual model (Figure 1). As described in the next section, ideally this model is standardized and specified for the intervention and the support system in terms of core elements and the delivery model. These three elements are part of the figure representing implementation quality.

The implementation model

Core elements of the intervention model

Well-developed programs have a specified set of features or practices that are directly related to the underlying theory of the intervention and describe the mechanisms of change (Collaborative for Academic, Social, and Emotional Learning, 2005; Mrazek & Haggerty, 1994). These features or practices are often called the core components or core elements of the intervention. For example, meta-analytic studies of school-based drug prevention programs have found larger effects when programs were interactive and focused on developing drug refusal skills, changing normative beliefs, and promoting social competence (Tobler et al, 2000; Wilson et al, 2001; Wilson et al, 2003). This was in comparison with information-only programs that involved didactic delivery methods, or programs that consisted primarily of recreational activities, tutoring, or mentoring.

FIGURE 1 Factors that Can Affect Implementation Quality: A Multi-Level Model

Researchers have used randomized trials comparing intervention components, as well as mediation analyses of proximal and distal outcomes, to validate the theoretical models that underlie interventions (McNeal et al, 2004; Skara et al, 2005). Unfortunately, the majority of interventions have not been subjected to component analyses of these kinds. In the meantime, descriptive studies of implementation quality under natural conditions can be used to provide initial support or suggest refinements to program theory when the implementation measures are specifically aligned with the hypothesized core elements. For example, Stevens, van Oost, and de Bourdeaudhuij (2001) studied a bullying prevention program and found that only two of the hypothesized six core components were related to program outcomes.

Absence of core components, poorly delivered core components, or negative adaptations all have the potential to reduce the intervention impact. Assessments of the implementation quality of core elements should therefore be used as process measures that guide continuing quality improvement and professional development strategies to improve practice. These measures can also be used to understand the implications of program adaptations. Once the critical content or process elements of an intervention are specified, it is possible to assess the degree to which an adaptation deviates from the model.

Standardization of the intervention model

Another characteristic of interventions found to be positively related to implementation quality is program standardization (Payne et al, 2006). The specification or documentation of the core components of school-based interventions often includes detailed instructional manuals and lesson plans; however, it is important to remember that interventions include a variety of approaches and may target systems rather than individuals in order to improve student outcomes (Berryhill & Prinz, 2003). Lesson plans are not relevant for preventive interventions that apply practices or procedures designed to cause systemic change, such as the school-wide Positive Behavior Supports model (Sugai & Horner, 2006) or the Good Behavior Game (Kellam et al, 2008). However, even for a systemic model which represents a confluence of effective practices and systems of support, it is critical that the implementation quality be monitored to ensure standardization across sites (Bradshaw, Reinke et al, 2008).

Delivery of the intervention model

The delivery strategy of an intervention can be defined in terms of the frequency, duration, timing, and mode of delivering the core components, as well as the individuals actually responsible for implementing the intervention. Research focused on the relationship of delivery strategies to program implementation and outcomes is in its infancy. With regard to duration, although it seems intuitive that this aspect of delivery would be related to program outcomes, Gottfredson and Wilson (2003) failed to find a significant relationship between the duration of school-based substance abuse interventions (measured in months) and student outcomes. However, the literature on social and emotional learning suggests that two-year programs produce more substantial effects than one-year or shorter-term programs (Weissberg & Greenberg, 1998b). For systemic interventions, it often takes schools three to five years to reach high implementation quality and to achieve the desired student outcomes (Rimm-Kauffman et al, 2007; Sugai & Horner, 2006). With regard to the mode of delivery, Vicary and colleagues created a modified version of Life Skills Training to compare two methods of delivery: standard delivery versus an 'infused' approach that incorporated the intervention's content into the existing grade-level subject curricula. Neither delivery model resulted in positive main effects, but both showed some short-term impact on substance use reported by girls (Vicary et al, 2006).

Research comparing different individuals as delivery agents of interventions has not shown a consistent pattern. Meta-analytic studies suggest that some deliverers of school-based substance-abuse interventions produce larger effects than others (Gottfredson & Wilson, 2003). For example, several recent studies have reported that better implementation quality was obtained when delivery was conducted by teachers rather than by program specialists from outside agencies (Spoth et al, 2007; McNeal et al, 2004). Cameron and colleagues (1999) reported similar results, but found that the differences were only significant in high-risk educational settings. It is important to note that positive outcomes for school-based substance abuse interventions have been achieved using implementers from outside the educational setting (Ellickson & Bell, 1990; Ellickson et al, 1993). A comparison of two types of implementer with Project Towards No Drug Abuse found no differences between teachers and specialists in student outcomes or implementation quality (Rohrbach et al, 2005).


Support system

Core elements of the support system

Regardless of a program's specific content or delivery mode, most programs require some form of support system for effective implementation (Chen, 1990). The most common support system is pre-intervention training that gives implementers the knowledge and skills they need to conduct or use the intervention (Fixsen et al, 2005). Studies comparing teachers who received training with those who did not have shown that training is an important element for effective implementation (Parcel et al, 1991; Perry et al, 1990; Ross et al, 1991). The presence or absence of training is examined more often than specific core components, since the latter vary with the intervention being used. However, a large body of literature exists on adult learning and professional development for educators that can inform the development of support system models for school-based and classroom-based interventions, particularly those for which the teacher is the delivery agent (Fishman et al, 2000; Garet et al, 2001; Joyce & Showers, 2002; Putnam & Borko, 2000). This work highlights the importance of providing opportunities for active learning through observation, meaningful discussion, practice, and reflection.

Professional development is often conceptualized as an ongoing process that is aligned and integrated into professional work, rather than as a single event. Often it includes the use of a more knowledgeable coach to mentor implementers, or the use of peer support (Gingiss, 1992; Joyce & Showers, 2002). Coaching or mentoring is one form of technical assistance, and facilitates the implementation process by helping users understand the intervention, the mechanics of program delivery, and appropriate ways to apply and adapt the intervention to existing practices (Dusenbury et al, 2007). Mentoring that includes in-classroom coaching and out-of-classroom consultation is emerging as a promising professional development strategy that improves the behavioral and educational outcomes of school-based interventions beyond those achieved through traditional instructional workshops (Aber et al, 1998; Barrett et al, 2008; Haskins & Loeb, 2007; Gorman-Smith et al, 2003).

Little is known about the mechanisms through which mentors or coaches influence behavior change in the individuals with whom they work. However, some descriptive studies suggest that the support and encouragement they provide are essential ingredients (Brooks, 1996; McCormick & Brennan, 2001). Empirical research suggests that the performance feedback provided by coaches may be a critical element contributing to the success of this professional development strategy (Leach & Conto, 1999; Noell et al, 2005; Rose & Church, 1998).

Standardization

While standardization has been identified as an important intervention factor related to high-quality implementation, it has rarely been applied to the intervention support system. Most of the research on evidence-based interventions has been conducted by program developers who also provide training and, in some cases, on-going support as interventions are disseminated (Elliot, 1998). There may be less need to verify the implementation quality of this component when control of the support system remains with the developers. However, as the demand for replication of preventive interventions by independent researchers grows, this aspect of implementation monitoring will become more relevant. Some intervention developers may already be aware of this need as they conduct 'train the trainer' models that allow communities to develop the internal capacity to sustain interventions over time (for example Committee for Children, n.d.).

Delivery

A few studies have empirically tested different models of support system delivery in schools. One such study found that live training and video training both resulted in similar levels of implementation of the Life Skills Training program (Botvin et al, 1995). Others have found that live training of teachers resulted in greater fidelity of implementation (Basen-Enquist et al, 1994). Similarly, one of the few implementation studies comparing levels of training intensity found no difference in the quality of initial implementation between teachers who received an intensive training and those who received a brief training in the Adolescent Alcohol Prevention model (Rohrbach et al, 1993). Cameron and colleagues (1999) found similar reductions in substance use by students when teachers participated in a self-preparation training rather than a workshop. In some cases, the lack of difference in outcomes between training methods may reflect the level of complexity associated with implementing particular programs. To date, there has been very little research comparing coaching or mentoring support system delivery models. Most is descriptive, and does not provide guidance on the intensity or frequency of mentoring that is needed to achieve high levels of implementation quality and subsequent changes in student outcomes.

Measuring the quality of the intervention and the support system

Despite the existing research linking quality of program implementation with student outcomes, the process of monitoring the quality of implementation is often overlooked, or given lower priority than measuring outcomes. Comprehensive measurement of the implementation quality of both the intervention and the support system should include assessments of adherence in terms of fidelity (the degree to which an intervention and its support system are conducted as planned), dosage (specific units of an intervention and support system), and quality of delivery (Dane & Schneider, 1998; Dusenbury et al, 2005). Ideally, multiple indicators (teacher self-report and coach rating, for example) of each dimension are also collected.

Fidelity to the intervention is commonly assessed by using implementer self-report, observation, or having participants report on the occurrence of core elements (whether specific content was delivered or techniques used) (Hansen, 1996). There is considerable evidence that teachers routinely adapt programs (Bumbarger & Perkins, 2008; Datnow & Castellano, 2000; Ringwalt et al, 2003), yet to date little research has focused on the impact of the adaptations on program outcomes. As models of intervention support systems are increasingly specified and empirically validated, the fidelity of this component will also need to be assessed.

Dosage is one of the easiest measures of implementation quality to quantify. It is often presented in terms of specific units of an intervention (such as number of lessons delivered) or the amount of time that a participant is exposed to an intervention (for example hours of contact). Similar ratings can be made of the implementation support model, such as quantifying the number of hours of training or the number of contacts from a coach or supervisor.

Quality of delivery should be monitored as it relates to both the specific intervention components and generalization of the core concepts. Affective engagement, sensitivity, and responsiveness are among the general skills that might apply to most school-based interventions, but sometimes more specific techniques (for example role plays) are needed to deliver an intervention effectively. Dusenbury and colleagues (2005) used the term 'quality of process' to highlight the importance of engaging participants in an intervention and the reciprocal nature of the interactions that are necessary for learning and behavior change. Generalization of an intervention is the application of an intervention's core components beyond its given framework. This process requires 'deep structure' knowledge of the intervention model, and skillful application, to contribute to student improvement (Han & Weiss, 2005). Generalization is more difficult to capture through self-report measures, and may require more frequent observation, as the behavior is spontaneous and dependent on specific conditions.
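To make these three dimensions concrete, the following sketch shows one way a monitoring record combining them might be structured. It is a minimal illustration of our own, not an instrument described in this paper or in the studies cited above: all field names (lessons_planned, coach_quality_rating, and so on) are hypothetical, and the simple ratios stand in for measures that would require psychometric validation in practice.

```python
# Minimal sketch (hypothetical field names): recording the three adherence
# dimensions -- fidelity, dosage, and quality of delivery -- with multiple
# indicators (teacher self-report and coach observation) kept separate so
# that discrepancies between informants remain visible.
from dataclasses import dataclass


@dataclass
class ImplementationRecord:
    core_elements_planned: int       # core elements specified by the model
    core_elements_observed: int      # elements a coach observed being delivered
    core_elements_self_report: int   # elements the teacher reports delivering
    lessons_planned: int             # dosage: planned units of the intervention
    lessons_delivered: int           # dosage: units actually delivered
    teacher_quality_rating: float    # quality of delivery, self-report (1-5)
    coach_quality_rating: float      # quality of delivery, observer rating (1-5)

    def fidelity(self) -> dict:
        """Adherence to core elements, reported separately by informant."""
        return {
            "observed": self.core_elements_observed / self.core_elements_planned,
            "self_report": self.core_elements_self_report / self.core_elements_planned,
        }

    def dosage(self) -> float:
        """Proportion of planned intervention units actually delivered."""
        return self.lessons_delivered / self.lessons_planned

    def delivery_quality(self) -> dict:
        return {"teacher": self.teacher_quality_rating,
                "coach": self.coach_quality_rating}


# Example: 18 of 24 lessons delivered; 4 of 6 core elements observed by a coach
record = ImplementationRecord(
    core_elements_planned=6, core_elements_observed=4, core_elements_self_report=5,
    lessons_planned=24, lessons_delivered=18,
    teacher_quality_rating=4.2, coach_quality_rating=3.5,
)
print(record.fidelity())          # approx. {'observed': 0.67, 'self_report': 0.83}
print(record.dosage())            # 0.75
print(record.delivery_quality())  # {'teacher': 4.2, 'coach': 3.5}
```

Keeping the indicators separate, rather than collapsing them into a single score, preserves exactly the discrepancy information between informants that the following paragraphs describe as useful for detecting 'drift'.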

Interpersonal skills are not only necessary for intervention delivery but are also an essential part of delivering training and on-going support – the common elements of an intervention support system. In a national study of delinquency prevention in schools, Payne and colleagues (2006) combined a measurement of the quality of training with a measurement of the amount of training to form a multi-item construct referred to as 'local program process'. They found that this construct correlated positively with the level of intervention use as reported by the school personnel responsible for coordinating activities of these types.

In summary, two types of 'drift' commonly occur when evidence-based interventions are implemented in school settings: deviation from the intervention model and deviation from the corresponding support system. Multiple indicators of program adherence allow a strong assessment of the degree of discrepancy between the 'model' version of the intervention and the support system as conceived by the developers, and the way it is implemented in real-world settings by school system personnel. Data on implementation quality serves multiple purposes. In practice, it is a process tool that informs professional development and guides quality improvement efforts. In research, information on implementation quality provides the data necessary for testing theories of intervention and determining which core elements of the intervention and/or support system are associated with student outcomes (Gottfredson et al, 2008).

The multilevel implementation quality framework

Implementation of evidence-based practices in schools does not occur in a vacuum; it is influenced by a broad array of school, district, state, and federal policies and practices. Figure 1 depicts a multilevel conceptual framework that organizes factors that influence implementation quality into three levels: macro level, school level, and individual level. As stated previously, inclusion of factors is based on both theory and empirical research, which is cited wherever possible. Factors at all three levels of the model are interdependent, and have the potential to influence the quality with which interventions are implemented, as well as student outcomes (Brint, 2006). For example, macro state-level policy has the potential to have a direct impact on the quality of program delivery by teachers, or to affect it indirectly, through organizational factors such as the funding or time allocated at building level to these types of preventive intervention.

Organizational influences can also have direct and indirect effects on staff attitudes and efficacy for implementing interventions. For example, strong administrative support or an organizationally healthy school environment can positively influence staff members' willingness to try innovative interventions, their attendance at training events, and their openness about challenges faced when implementing the program. Teachers working in this type of environment may become more empowered and have greater efficacy, which in turn can affect the quality with which they implement innovations. However, if a new program is adopted without a consideration of how it will fit into the school's instructional day, teachers may experience burden and stress, which can negatively affect program implementation. These influences may also operate bi-directionally across or within levels; for example, teacher characteristics (for example attitudes to mental health) may influence quality of implementation, or quality of implementation may have a direct and substantial effect on teacher characteristics, such as burnout and stress. A detailed description of the factors in each of the three levels of the conceptual framework is provided below.
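To illustrate how cross-level hypotheses of this kind might eventually be tested, the sketch below fits a mixed-effects (multilevel) regression in which teachers (level 1) are nested within schools (level 2). This is a hypothetical analysis of our own, not one reported in this paper: the simulated data, the variable names (teacher_efficacy, admin_support, implementation_quality), and the effect sizes are all assumptions made for demonstration.

```python
# Hypothetical sketch: predicting implementation quality from an
# individual-level factor (teacher efficacy) and a school-level factor
# (administrative support), with a random intercept for school capturing
# the clustering of teachers within schools. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, teachers_per_school = 30, 12
school = np.repeat(np.arange(n_schools), teachers_per_school)

admin_support = rng.normal(size=n_schools)[school]             # school-level predictor
school_effect = rng.normal(scale=0.5, size=n_schools)[school]  # unobserved school variation
teacher_efficacy = rng.normal(size=len(school))                # individual-level predictor

# Simulated outcome: quality depends on factors at both levels, plus noise
implementation_quality = (0.4 * teacher_efficacy + 0.3 * admin_support
                          + school_effect + rng.normal(size=len(school)))

df = pd.DataFrame({"school": school,
                   "teacher_efficacy": teacher_efficacy,
                   "admin_support": admin_support,
                   "implementation_quality": implementation_quality})

# Random intercept for school; fixed effects at both levels
model = smf.mixedlm("implementation_quality ~ teacher_efficacy + admin_support",
                    data=df, groups=df["school"])
print(model.fit().summary())
```

In a design like this, the school-level random intercept is what allows organizational and individual factors to be examined simultaneously rather than confounded, which is the analytic counterpart of the interdependence among levels described above.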

Macro-level factors

The first level of the model (Figure 1) is the broadest, and includes community factors that have the potential to influence the quality of implementation within schools. These sources are not limited to the educational system, but include government and community entities as well.

Policies and financing

The first level of the framework represents macro-systemic sources of influence, such as policies and practices at federal, state, and district level that have the potential to influence the implementation of evidence-based programs in schools, primarily by fiscal, regulatory, and administrative means. Title I of the Elementary and Secondary Education Act, last reauthorized by Congress as the No Child Left Behind Act (NCLB), and the special education law, also known as the Individuals with Disabilities Education Act (IDEA), are highly visible federal education policies with significant impact on state and district practices and policies through the influence of procedural mandates tied to receipt of federal funding (Kataoka et al, 2008). Although NCLB emphasizes standards-based reforms focused on improving the academic achievement of students, it contains several provisions that support a broader public health mission of schools. These include Title I, Part D (programs for children who are neglected, delinquent, or at risk), Part H (dropout prevention), Title IV, Part A (Safe and Drug Free Schools), Title V, Part D, Subpart 2 (Elementary and Secondary School Counseling Programs), Subpart 3 (Partnerships in Character Education), and Subpart 14 (Grants to Improve the Mental Health of Children).

Both policy and legislative action can have a strong influence on implementation processes. For example, both Illinois and New York have passed legislation requiring that schools develop plans for social and emotional development (Katulak et al, 2008). In addition, Illinois has now developed standards and benchmarks for this domain at each grade level (Illinois State Board of Education, 2004). Researchers and program developers who understand the mission and vision of policymakers and administrators at local, state, and federal levels will be better prepared to establish mutual interests with school district personnel, from superintendents to teachers.

Several district-level factors can enhance or impede implementation. Many districts face fiscal challenges in financing their programs. Often the money available in a district cannot be easily blended: monies from federal grants, foundations, and partnerships with businesses and research institutions are often earmarked for specific activities. The stability of district leadership can influence implementation through its effects on mission articulation, staffing decisions, and programmatic choices. Establishing partnerships with the broader community and institutional base can help ensure stability of mission and focus across times of leadership transition and across institutions (Adelman & Taylor, 2003; National Research Council, 2004).

Leadership and human capital

Concepts such as community capacity and empowerment, common in community psychology and participatory research, have not always been given adequate attention in the field of prevention (Wandersman, 2003; Weissberg & Greenberg, 1998a), but they represent macro-level factors that may influence the implementation process within schools. Some researchers apply theories of community science to create community-level interventions that target student outcomes such as youth violence and substance use (Wandersman & Florin, 2003). Many of these interventions, such as Communities that Care (Hawkins & Catalano, 1992), include the formation of a community coalition as a mobilization strategy. Research on the functioning of these groups has focused more on the adoption of evidence-based interventions – rather than implementation – and has produced mixed results regarding their impact (Wandersman, 2003). Less attention has been paid to their potential impact on quality of implementation, although theoretical models of how building community capacity assists in reaching this goal are emerging (Chinman et al, 2005). Yet community coalitions have the potential to have a positive influence on aspects of the implementation support system, such as providing training and technical assistance (Feinberg et al, 2008; Spoth & Greenberg, 2005).

Other macro-level factors that have an impact on implementation include the availability of qualified professionals in a community to implement programs, the availability of trainers or coaches to support implementation in schools, and, within the educational system, the allocation of professional development days across the school year. A well-respected champion of a program and district-level administrative support also appear to be important for high-quality, sustainable implementation (Adelman & Taylor, 2003; Barrett et al, 2008; Elliott & Mihalic, 2004).

Community–university partnerships

There is increasing awareness of the importance of community–university partnerships in promoting the use and implementation of evidence-based interventions. The extension service of land grant universities is one mechanism that has been leveraged to disseminate evidence-based interventions in schools (Spoth & Greenberg, 2005). For example, the PROSPER partnership model creates local teams with personnel representing university extension, schools, and community agencies. In order to foster implementation of school-based and family interventions, the PROSPER model focuses on assessing local needs, monitoring implementation, and evaluating outcomes. Research by the developers of this model documents the benefits of community–university partnerships for achieving positive implementation outcomes (Spoth et al, 2007). In another example, Wandersman and his colleagues (Wandersman et al, 1999) developed the Getting to Outcomes process to help practitioners plan, implement, and evaluate interventions in order to maximize results. The Collaborative for Academic, Social, and Emotional Learning (CASEL), at the University of Illinois at Chicago, has created a similar developmental model that supports schools or districts through a multi-year process which attempts to ensure high-quality implementation and sustainability of evidence-based programming (Devaney et al, 2006). This model helps participants build a vision, identify community needs, select appropriate intervention strategies, and create a support system for their training and use.

School-level factors

The second level of the framework represents the school as an organizational entity that has an influence on program implementation. Understanding the organizational context of schools is critical for the implementation and sustainability of interventions because children, teachers, and other school staff are all embedded in this shared environment (Ringeisen et al, 2003). Included in this level of the framework are factors that relate to the school's organizational functioning, such as the structure or policies within the building, the resources available to support interventions, and the school climate. School climate is reflected by the staff's perceptions, either of their relationships or of the workplace environment, as well as by characteristics of the student body (Hoagwood & Johnson, 2003; Owens, 2004; Ringeisen et al, 2003; Tanyu et al, 2005). Also included in this level are characteristics of the school and the classroom that have been found to affect implementation, or that theory would suggest might be important to consider.


Mission–policy alignment

Teaching and learning are central to any school's mission. Interventions that align directly with a school's mission or are easily integrated into the school's policy and practices are more likely to be prioritized, implemented with quality, and sustained over time (Datnow et al, 2002; Kallestad & Olweus, 2003; Payne et al, 2006). Understanding that the primary mission of schools is academic achievement requires that researchers and program developers highlight the link between prevention of social-emotional and behavioral problems and academic achievement, to illustrate how an intervention helps the school meet its mission (Durlak et al, 2008).

Decision structure

An important characteristic of the school's organizational structure is the extent to which power is centralized and roles are formalized and rigid (Hoagwood & Johnson, 2003; Owens, 2004). In schools, structure describes the amount of discretion exercised by teachers in solving problems they encounter in the classroom, their contribution to the development of school policies, and the flexibility they have in how they teach. Involvement of an organization's members in decision-making decreases resistance to change and increases members' perceptions of successful program adoption. For curriculum- or classroom-based preventive interventions, teachers who have an active role in deciding what intervention to adopt, or in determining how a new program fits into the context of the existing educational program, are more motivated and committed to high-quality program implementation (Ringwalt et al, 2003).

Resources

The amount and type of resources available to deliver evidence-based services in schools – including funds, materials, knowledge, skills, and equipment available to provide the intervention – are important organizational-level factors to consider (Owens, 2004; Ringeisen et al, 2003). Tangible forms of support, such as monetary incentives (for example stipends for training), dedicated staff time for prevention activities, space, equipment, and other necessary program resources, are part of a school's capacity to implement an intervention. Although, in many districts, monetary resources are controlled at the district level rather than the school level, building administrators do have influence over the allocation of staff time and space.

Personnel expertise

Another factor related to a school's capacity to implement programs with high quality is the level of prevention expertise in the building. One model for building capacity within a school is to provide enhanced training and technical assistance in classroom-based interventions to a teacher who is a key opinion leader in the school. This person, in turn, can recommend evidence-based strategies to other teachers in the building and serve as an internal prevention specialist (Atkins et al, in press). With the involvement of school personnel, this model can increase the use of ecologically appropriate interventions. As noted above, the availability of qualified staff, such as master teachers, coaches, and school psychologists, throughout the district and within schools can have a significant impact on the quality of intervention implementation and the support system.

Administrative leadership

School administrators can help transform schools into places that are committed to using innovative programs and practices (Datnow et al, 2002; Hallinger & Heck, 1996). A strong leader who advocates using evidence-based practices within a school can have a significant impact on the successful implementation of interventions (Gottfredson & Gottfredson, 2002; Kam et al, 2003; Payne et al, 2006). In addition to endorsing the intervention, effective administrators provide the oversight and accountability that are necessary to maintain focus and ensure follow-through by implementers in the schools. For example, strong administrative support for an intervention occurs when leaders within the school actively participate in the planning, training, and implementation of the program. They can also increase implementation by specifying program participation in staff job descriptions (Barrett et al, 2008) and by requiring that staff allocate class time to implement the program. Formally committing staff and administrators to the prevention activity increases accountability for quality implementation (Rohrbach et al, 2005). In contrast, poor administrative leadership – both in general and for the program – can have a negative impact on implementation quality by contributing to low staff morale and limiting the time allocated for activities perceived as outside the academic mission of the school (Kam et al, 2003).

School culture

In the mental health literature, organizational culture has been distinguished from organizational climate and shown to influence the implementation of services (quality of services) in human service agencies (Aarons, 2005; Glisson & Hemmelgarn, 1998). Distinctions between culture and climate have been proposed as important to consider in schools as well (Hoy et al, 1998; Owens, 2004; Van Houtte, 2005). Culture influences the way things are routinely done in an organization, and reflects the norms, values, and shared beliefs or assumptions of the membership. In contrast, climate reflects an individual's perceptions (Glisson & Hemmelgarn, 1998; Reichers & Schneider, 1990).

Culture is often assessed by surveying the members of an organization and aggregating their responses on items referring to group norms and shared expectations. School culture is an important factor to consider because the introduction of evidence-based interventions may require expansion of the academic mission and a new focus on monitoring practices with precision to inform professional development. Past experiences may also influence how preventive interventions are incorporated into a school's culture, and the quality with which interventions are implemented. Similarly, research on mental health providers indicates that working environments that are less bureaucratic and have written practice policies regarding the use of evidence-based practices tend to have staff who are more supportive of using evidence-based practices (Aarons, 2005; Aarons & Sawitzky, 2006). We anticipate that a similar relationship exists in school settings, and may in turn influence adoption of evidence-based programs as well as implementation quality.

School climate and organizational health

School climate is the organizational personality of a school (Halpin & Croft, 1963). It is relatively stable over time, and influences the behavior of individuals in a building. Research on school climate often focuses on the social or psychological aspect of the construct, and includes measuring student, staff, and/or parent perceptions of interpersonal exchanges (for example open, trusting, respectful) between members of the school community (Bryk & Schneider, 2002). School climate can also include an individual's perceptions of other structural or philosophical characteristics of the educational setting.

Organizational health, or an organization's ability to adapt to challenges over time, is an important indicator of school climate (Bevans et al, 2007; Hoy et al, 1998). Studies have linked positive school climate with student achievement and behavioral adjustment (Bryk & Schneider, 2002; Esposito, 1999). Battistich and Hom (1997) found that elementary schools with higher 'sense of community' scores had significantly lower drug use and delinquency among fifth- and sixth-grade students. Similarly, staff reports of schools' overall organizational health have been linked with greater efficacy, commitment, and job satisfaction, as well as with positive outcomes for students (Hoy et al, 1998). Constructs such as openness in communication, orientation to change (Kallestad & Olweus, 2003), and an open and supportive environment (Parcel et al, 2003) have been positively related to measures of implementation quality, whereas poor staff morale, a sense of resignation, and a history of failed intervention attempts have been associated with difficulty in implementing and sustaining innovations (Gottfredson & Gottfredson, 2002).

Although the organizational context of schools is considered highly relevant, little is known about the underlying mechanisms linking climate or health to an institution's ability to implement evidence-based practices successfully. Schools that are organizationally healthy and provide a positive, supportive, and safe environment for staff may contribute to staff members' efficacy and willingness to commit to the intervention (Bradshaw, Koth et al, 2008). Similarly, the staff's collective self-efficacy (the perception that the faculty's efforts as a whole can have a positive impact on students) has been found to be positively associated with student achievement (Goddard & Goddard, 2001). Consequently, the quality with which interventions are implemented is likely to be influenced by staff members' collective self-efficacy.

Characteristics of the school

Although the mechanisms are not well understood, school-based researchers typically acknowledge that school-level characteristics, such as school size and student mobility, can influence both the outcomes achieved through interventions and the implementation quality. Characteristics of the school building or of the student body aggregated at building level can be used to examine potential moderators of student outcomes, and may predict the quality with which interventions are implemented (Bradshaw, Koth et al, in press). For example, some school-based preventive interventions may be easier to implement in small schools than in large schools, or in rural or suburban communities than in urban settings. Schools that are disorganized or have a large number of at-risk students may encounter more problems in implementing interventions with high fidelity (Gottfredson et al, 2002; Tolan et al, 2004). This may be due in part to high mobility or absenteeism, which results in reduced exposure of participants to critical components of the intervention (low dosage because of high student absenteeism). Schools in disadvantaged neighborhoods may also experience greater staff turnover, which undermines the ability to sustain a workforce trained to implement preventive interventions.

Classroom climate

Although no single factor defines the climate of a classroom, this construct typically refers to the array of social and psychological aspects of the classroom environment, including the sense of belonging, the level of cooperation and mutual respect among classroom members, and the relationships between teacher and students (Wang et al, 1997). It can also reflect educational aspects of the environment (such as teaching practices and rule clarity). Because many preventive interventions in schools are classroom-based (universal drug prevention and character education programs, for example), classroom climate can have a significant impact on the quality of implementation and the outcomes of evidence-based practices (Dunn & Harris, 1998). A classroom climate characterized by high levels of peer or teacher–student conflict may negatively influence program implementation and program effectiveness. High levels of student misconduct in the classroom may result in a teacher spending more time on management than on instruction, and may have a negative impact on the classroom environment (Koth et al, 2008; Kellam et al, 1998).

Individual-level factors

The third level of our framework represents individual-level factors that can promote or undermine the quality of intervention implementation in schools. Although some theoretical work highlights the potential significance of individual factors in relation to high-quality implementation and sustainability of interventions (Han & Weiss, 2005; Jennings & Greenberg, 2008; Klein & Sorra, 1996), empirical research on how implementer characteristics influence implementation quality has been relatively limited.

Professional characteristics

Few teacher training programs include training on classroom management or prevention programs. Similarly, training programs for counselors and school psychologists typically focus on one-on-one or group counseling, rather than on the implementation of classroom-based prevention programs. Consequently, school staff vary widely in their education, skills, and experience, which can influence attitudes to the implementation of prevention programs. Studies of the relationship between the professional characteristics of teachers and program implementation attitudes and behaviors offer mixed findings. Aarons (2004) found that, among mental health clinicians, having a higher level of education or being an intern was associated with more positive attitudes to evidence-based practices. In the substance-abuse prevention literature, fewer years of teaching experience and greater teaching skills were associated with higher levels of implementation (Ringwalt et al, 2002; Rohrbach et al, 1993). Other research has shown that years of experience are not related to program fidelity or to the likelihood of using an evidence-based curriculum (Ringwalt et al, 2002, 2003).

Psychological characteristics

Lack of experience in implementing preventive interventions, or level of comfort with certain methods (such as interactive teaching), may increase implementers' anxiety when they are called upon to implement such interventions (Ennett et al, 2003). Although psychological mindedness can vary considerably between individuals, such awareness can be helpful in understanding both negative reactions (such as anxiety, reluctance, and anger) and positive reactions (like enthusiasm or confidence) in oneself and in intervention participants. Some researchers have examined the personal characteristics of implementers as potential predictors of positive intervention effects. Findings suggest that traits such as sociability, extroversion, agreeableness, conscientiousness, and individualization are associated with positive implementation outcomes (Lochman et al, 2008; St. Pierre et al, 2007). Cynicism, on the other hand, was inversely related to implementation quality (Lochman et al, 2008).

Another important characteristic of the implementer is psychological functioning. Stress, depression, and professional burnout are aspects of psychological functioning that can reduce productivity and the quality of performance in the workplace. Professional burnout, typically defined as emotional exhaustion, depersonalization (such as indifferent or negative feelings displayed toward students), and lack of work-related accomplishment (educators feeling that they are no longer contributing to students' development, for example), is a significant problem among teachers (Borg et al, 1991). Although few studies have examined how these psychological constructs relate to implementation of academic or preventive interventions (Evers et al, 2002), psychological functioning probably has an impact on implementation quality, particularly when program innovations are perceived as causing additional burden or competing with other priorities.

Self-efficacy is an indicator of psychological functioning that drives effort and persistence in the face of challenges. In the education literature, efficacy describes teachers' perceived ability to conduct instructional practices, manage the classroom environment, and effect change in student behaviors (Tschannen-Moran & Woolfolk Hoy, 2001). Research on self-efficacy related to behavioral interventions or instructional strategies has generally concluded that greater efficacy is associated with higher-quality program implementation (Kallestad & Olweus, 2003; Rohrbach et al, 1993). Intervention-specific efficacy (for example knowledge of program theory and components, proficiency in delivering activities) has been shown to relate to higher implementation quality, as has comfort with using interactive methods in intervention delivery (Ennett et al, 2003; Rohrbach et al, 1993). Ransford (2008) found that teacher efficacy interacted with burnout in predicting the quality of implementation of a social and emotional learning program. Teachers who reported high burnout and high efficacy reported high-quality implementation, whereas teachers who reported high burnout and low efficacy showed substantially poorer implementation. In addition, both principal leadership and teachers' perceptions of the quality of the coaching independently predicted implementation quality.

Perceptions of and attitudes to the intervention

A variety of program attributes that are reflected in implementers' perceptions and attitudes appear to enhance implementation quality. One of the most important is acceptance of the intervention, which varies with the individual's needs and priorities (Rohrbach et al, 1993). In some cases it is the perception that the intervention is a useful strategy for addressing a local problem. Related to this is the perception that the program is better than current practice (Elias et al, 2003; Pankratz et al, 2002; Parcel et al, 1991; Ringwalt et al, 2003) and that the program is compatible with the values, needs, mission, and experiences of the institution (Pankratz et al, 2002; Rogers, 2003). Research suggests that programs requiring special skill and coordination among many people (programs high in complexity) are less likely to be perceived by the implementing staff as effective, and are less likely to be maintained over time (Dusenbury et al, 2003).

Other perceptions or attitudes evolve as implementers gain experience with the intervention. Several researchers have found that perceived effectiveness is associated with higher implementation quality (Datnow & Castellano, 2000; Kealey et al, 2000; Ringwalt et al, 2003). Han and Weiss (2005) provide a model of how implementers' experience of success with an intervention, and their attribution of student improvements to the intervention, influence motivation and skill over time, which in turn promote high-quality implementation and sustainability.

As implementers become more familiar with an intervention, their ability to understand it has implications for how it will be delivered (Dusenbury et al, 2003; Pankratz et al, 2002; Rogers, 2003). Related to this, the perceived value of the program is an important factor: if teachers do not see the value of fostering a specific skill or conducting lessons on particular topics, they may be more likely to skip those activities, even those that are core elements of the program. In a similar vein, teachers may be less committed to implementing interventions aimed at depression than those aimed at classroom behavior management, because the symptoms of depression are typically less disruptive to the classroom environment than are externalizing behavior problems (Bradshaw, Buckley et al, in press).

As discussed above in the section on support systems and professional development, the acceptability and implementation of an intervention can be increased when implementers are exposed to model implementers or exemplars of quality implementation in contextually similar schools. Further, the quality of engagement during trainings, and satisfaction with the content and how it is delivered, are likely to be important predictors of the quality with which implementers deliver the intervention.

Future directions

The field of prevention science is at a critical juncture, as the focus expands from intervention efficacy and effectiveness to include sustainability, transportability, and dissemination (Brounstein et al, 2006; Racine, 2006; Rohrbach et al, 2006; Schoenwald & Hoagwood, 2001). However, the research base on implementation quality is not keeping pace with the growing emphasis on adoption of empirically derived interventions (Greenberg, 2004, 2007; Kaftarian et al, 2004). The multilevel contextual framework presented in this paper can help guide the next stage of research, focused on the implementation quality of school-based preventive interventions. We conclude by identifying specific areas for future research.

Theory-driven research on implementation quality

Interest in assessing implementation quality in school-based intervention research is increasing, but it is often treated as a secondary or tertiary aim rather than as the focus of a study. Although several of the contextual factors identified above are malleable, the majority of existing implementation studies in schools have been cross-sectional, limiting their ability to support conclusions about cause-and-effect relationships. A natural next step for the field is to carefully develop and test theory-based interventions aimed at specific individual and contextual factors in the proposed model (Pentz, 2004). Given the multilevel structure of school systems, the next stage of research requires a focus on theories of both individual change and organizational change (Glasgow et al, 2003; Klein & Sorra, 1996). Systematic testing of different aspects of the proposed model in experimental trials will allow us to determine which factors are most influential in promoting high-quality implementation. Although we might predict that more proximal factors, such as teacher qualifications and attitudes, would have the strongest influence on implementation quality, organizational factors, such as administrative leadership, school culture, and school climate, should not be underestimated. Future studies should include multiple indicators of the implementation quality of both the intervention model and the support system, along with an assessment of the contextual factors that the theoretical model suggests would predict implementation quality.

One such study examined the impact of enhanced principal leadership on implementation quality, using a brief, 30-minute intervention for principals that emphasized the importance of the intervention and of administrative leadership in facilitating high-quality implementation (Rohrbach et al, 1993). Although the effect of the principal intervention on teachers' quality of implementation was positive compared with controls, no differences were found between the two conditions on principals' self-report ratings or teachers' ratings of their principal's encouragement. Other studies could examine the impact of 'pre-implementation' training designed to improve schools' and implementers' readiness for implementation, for example by focusing on strategies for gaining buy-in and enhancing implementers' skill and efficacy in delivering the intervention with integrity. Another possible staff-focused study could examine the impact of pre-implementation trainings that incorporate principles of mindfulness to reduce stress and promote emotional insight.

Specifying the core elements of the intervention and the support system is critical to understanding implementation and to identifying which core elements are related to outcomes.

The importance of the relationships between the core elements of the intervention, the core elements of the support system, and implementation quality is illustrated by a five-year group-randomized trial of universal school-wide Positive Behavior Support. The initial two-day training, coupled with ongoing support and coaching and annual booster training sessions, resulted in high-quality implementation in all 21 trained elementary schools, and in significant improvements in the schools' organizational health. The 16 comparison schools, which did not receive formal training or coaching in Positive Behavior Support, adopted some elements of the school-wide model (Bradshaw, Reinke et al, 2008), but this did not result in significant improvements in the schools' organizational climate (Bradshaw, Koth et al, 2008). These findings suggest that, even though comparison schools implemented some aspects of the model, their lack of formal training and coaching probably prevented them from 'self-implementing' the model with high quality or achieving the intended outcomes. The findings also highlight the importance of monitoring implementation in both intervention and comparison conditions in randomized trials.

At a broader level, there is a need not only to assess programs and the factors influencing their implementation, but also to consider broader structural models of school-wide decision-making. For example, does engaging schools in an intentional, multi-stage planning and implementation process over multiple years (as with the CASEL Sustainability Tool Kit; Devaney et al, 2006) lead to higher-quality implementation and greater sustainability?

Determining the effect of adaptations on implementation quality and outcomes

Considerable evidence indicates that teachers routinely adapt programs (Datnow & Castellano, 2000; Ringwalt et al, 2004). Some intervention developers and researchers believe that adaptations or modifications may be necessary for successful implementation, because they increase buy-in and improve the fit between an intervention, its consumers, and the context (Castro et al, 2004; Dusenbury & Hansen, 2004; Kumpfer et al, 2002; Wandersman, 2003). An in-depth study of the Success for All program in two schools found that, although all teachers adapted the program, their level of support for the program was not related to their fidelity to practice (Datnow & Castellano, 2000). Many community psychologists, moreover, see adaptation and tailoring of interventions to community needs as essential ingredients for successful dissemination of evidence-based interventions, ingredients that have been missing from prevention science, and are calling for systematic strategies to guide this process so that essential components are retained while communities are empowered (Wandersman, 2003).

A study that coded teacher adaptations to a drug prevention curriculum as detractions or enhancements found that the latter were positively associated with adherence to the intervention, defined as the number of objectives reached and major points covered by teachers during delivery (Dusenbury et al, 2005). This suggests that adaptation is complicated, and that not all adaptations are the same. Further research on adaptations is necessary before assuming that all changes at the local level are acceptable and do not weaken program impact (Elliott & Mihalic, 2004).

The fidelity/adaptation debate is particularly relevant when interventions are used by communities whose cultures differ from that of the original trial population. Both implementers and trainers often add to or omit parts of interventions during adaptation for a new context, especially when working with ethnic minority populations, who are typically under-represented in efficacy trials. Although few preventive interventions have been adapted for these populations, a recent meta-analysis of rigorously tested cultural adaptations suggests that carefully articulated adaptations can be effective for minority populations (Griner & Smith, 2006). However, the majority of cultural adaptations have yet to undergo the rigorous evaluation needed to establish their efficacy (Castro et al, 2004).

Measuring implementation quality

Research is needed in several areas of measurement related to implementation quality (Dusenbury et al, 2005). The field needs well-validated, cost-effective measures of implementation quality. Currently, the data used to assess implementation quality range in both mode (self-report, participant report, live observation, coding of recorded sessions) and depth (random selection of one session vs. assessment of all sessions). Independent ratings of observed quality by an expert are more reliable and valid than self-report ratings from implementers (Lillehoj et al, 2004), although the latter may be easier to obtain and more comprehensive (Hansen & McNeal, 1999); schools therefore need ways to measure implementation reliably and validly without depending entirely on external researchers. It is also important to determine the thresholds of implementation quality necessary for producing the intended outcomes; few programs have precise, research-supported criteria for 'high' or 'low' implementation quality.
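
To make the observer/self-report comparison concrete: where a school collects implementer self-ratings alongside independent observer ratings for a sample of sessions, even a simple agreement check can flag systematic over-reporting. The sketch below (Python; the data, scale, and column names are hypothetical, not drawn from the studies cited) illustrates the idea:

    import pandas as pd

    # Hypothetical implementation-quality ratings for six observed sessions,
    # each rated on a 0-100 scale by the implementer and by an independent
    # observer. Real instruments and scales vary by program.
    sessions = pd.DataFrame({
        "self_report": [88, 92, 75, 90, 85, 95],
        "observer":    [70, 85, 60, 78, 72, 80],
    })

    # Do the two sources rank sessions similarly?
    r = sessions["self_report"].corr(sessions["observer"])

    # Are self-reports systematically inflated relative to observers?
    bias = (sessions["self_report"] - sessions["observer"]).mean()

    print(f"self-observer correlation: {r:.2f}; mean inflation: {bias:.1f} points")

A high correlation combined with a large positive mean difference would suggest that self-reports preserve the rank order of sessions but overstate absolute quality, which matters whenever absolute thresholds for 'high' implementation are of interest.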

More studies are needed that move beyond traditional measures of fidelity and dosage to assess process variables and more complicated constructs, such as intervention generalization or adaptation (Bumbarger & Perkins, 2008; Dusenbury et al, 2005). Measuring intervention and support system implementation in both intervention and comparison settings is critical, and should be typical practice in school-based research; most schools already have some interventions in place when new ones are introduced and evaluated (Bradshaw, Reinke et al, 2008; Gottfredson et al, 2008).

Measuring multilevel constructs

Attention needs to be paid to the measurement of school-level and other multilevel constructs. Characteristics of the school building, the student body, and the staff are often aggregated at the building level. However, several of these school-level indicators co-occur (for example, free and reduced-price meals rate, mobility, urbanicity, absenteeism) (Gottfredson et al, 2005), making it statistically difficult to adjust for these factors while avoiding problems with multicollinearity. Another troubling issue is that individual-level variables are often aggregated into group-level variables without careful consideration of the underlying theory or constructs (Klein & Kozlowski, 2000). If individual-level data show a high degree of within-group agreement, a case can be made for aggregating the data to represent a group-level construct, but only after a full examination of the data. For example, individuals' perceptions of a school tend to be highly correlated, showing a high degree of within-group agreement, yet aggregating at the school level can mask systematic variation in perceptions (Raudenbush & Bryk, 2002).

For example, research suggests that among elementary school staff, ethnic minorities and males tend to perceive the school environment less favorably, as do staff who work primarily with students in special education (Bevans et al, 2007). With regard to student perceptions of climate, Koth and colleagues (2008) found that both classroom characteristics (such as a preponderance of students with behavior problems) and individual characteristics (for example, ethnicity and gender) had a significant influence on students' perceptions of the school environment. Aggregating group-referent items is often a better strategy than aggregating individual-referent items when measuring attributes of the group (Klein & Kozlowski, 2000). An example is the construct of collective efficacy, which is conceptualized as a shared property of the group, not the average of team members' self-efficacy, and is therefore measured by items referencing the group rather than the individual (Goddard et al, 2001).
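
As one concrete screen for the aggregation decision discussed above, researchers often compute the one-way intraclass correlation, ICC(1), which estimates the share of variance in individual ratings that lies between groups. The following sketch (Python; hypothetical data, using the standard one-way ANOVA estimator rather than any formula from the studies cited, and assuming roughly equal group sizes) illustrates the computation:

    import numpy as np
    import pandas as pd

    def icc1(df, group_col, value_col):
        # One-way ICC(1): proportion of rating variance lying between groups.
        # High values support aggregating individual ratings to the group level.
        groups = [g[value_col].to_numpy() for _, g in df.groupby(group_col)]
        k = np.mean([len(g) for g in groups])  # (roughly equal) group size
        ms_between = k * np.var([g.mean() for g in groups], ddof=1)
        ms_within = np.mean([np.var(g, ddof=1) for g in groups])
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    # Hypothetical staff climate ratings nested within three schools.
    data = pd.DataFrame({
        "school": ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
        "climate": [4.1, 4.3, 4.0, 4.2,
                    2.9, 3.1, 3.0, 2.8,
                    3.6, 3.4, 3.7, 3.5],
    })
    print(f"ICC(1) = {icc1(data, 'school', 'climate'):.2f}")

Values near zero argue against treating the school mean as a meaningful school-level construct, whatever its conceptual appeal; the toy data here yield a high ICC(1), so aggregation would be defensible.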

Analytic strategies

Although the use of multilevel methods to address the clustering that is common in school-based trials has increased significantly (Murray, 1998; Shinn, 2003; West & Aiken, 1997), strategies are needed to account for the clustering over time that occurs in longitudinal school-based research. This includes more advanced modeling techniques for examining mediators and moderators of student outcomes within hierarchical structures. Further complicating this process is the finding that implementation quality can vary considerably even within a single school, which suggests that contextual influences at multiple levels may be at play (Choi, 2003; Kallestad & Olweus, 2003). The classroom, rather than the school, may be the appropriate level of analysis for some research questions.

Designs to support implementation research are needed. Obtaining the statistical power to study the relationship between variation in implementation quality and outcomes will require studies with multiple groups (for example, schools or classrooms) and longitudinal designs in which groups vary in their level of implementation quality. The treatment of contextual variables in analytic modeling also influences the statistical power to detect intervention effects in group-randomized trials. Sometimes researchers have no option but to include these variables as school-level factors, as in the case of school size or urbanicity. However, when individual responses are available (as with student or staff reports of climate or context), entering them in the model as individual-level covariates does not reduce the power to detect an effect, whereas loading the model with school-level covariates can attenuate statistical power (Murray, 1998). Power is further diminished when group- or school-level interactions are examined. Additional work is needed to better understand the most appropriate treatment of contextual factors, from both conceptual and methodological perspectives.
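
To illustrate the covariate-handling point, the sketch below (Python with statsmodels; simulated data with arbitrary effect sizes, not a reanalysis of any trial discussed here) fits a mixed-effects model in which a random intercept absorbs school-level clustering while a student-reported covariate enters at the individual level:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)

    # Simulated group-randomized trial: 20 schools randomized to condition,
    # 25 students per school, one student-reported covariate (e.g. climate).
    n_schools, per_school = 20, 25
    school = np.repeat(np.arange(n_schools), per_school)
    treat = (school % 2).astype(float)  # assignment varies at the school level
    climate = rng.normal(size=n_schools * per_school)
    school_effect = rng.normal(scale=0.3, size=n_schools)  # clustering
    outcome = (0.4 * treat + 0.5 * climate
               + school_effect[school]
               + rng.normal(size=n_schools * per_school))

    df = pd.DataFrame({"school": school, "treat": treat,
                       "climate": climate, "outcome": outcome})

    # The random intercept for school models the clustering; the individual-
    # level covariate sharpens the treatment estimate without consuming
    # school-level degrees of freedom.
    fit = smf.mixedlm("outcome ~ treat + climate", df, groups=df["school"]).fit()
    print(fit.summary())

By contrast, each additional school-level covariate spends information at the level of the 20 randomized units, which is where the power constraint binds (Murray, 1998).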

Longitudinal research on the implementation process

Although the model presented in this paper examines implementation quality at the point when schools have committed to using a specific evidence-based intervention, it is important to recognize that implementation is part of a non-linear diffusion process that evolves over time (Adelman & Taylor, 2003; Dusenbury & Hansen, 2004). The process begins with an adoption phase, when interventions are being considered, and progresses through an implementation phase until interventions are either abandoned or institutionalized and sustained over time (Rogers, 2003). Whereas some researchers, such as Han and Weiss (2005), have acknowledged that time is a critical factor in sustainability, we contend that making the process cyclical, by including feedback loops across phases, is an important issue to consider in future research on implementation quality (Devaney et al, 2006).

Longitudinal evaluations of interventions focused on clarifying the 'process' of quality implementation would be useful, particularly if the studies prospectively measured factors that exist prior to the adoption or implementation phase. One such study used the Bridge-It survey tool to assess factors that predicted the implementation of school-based tobacco prevention programs (Gingiss et al, 2006). The findings support the use of an overall measure consisting of eight dimensions, including characteristics of the intervention, the implementer, the school building, and the broader context. The study of Positive Behavior Support cited above also found that schools with lower levels of organizational health at baseline tended to take longer to implement the model with high quality, but also tended to benefit the most from the program (Bradshaw, Koth et al, 2008). These findings highlight the importance of longitudinal assessment of both implementation quality and the contextual factors that may moderate and mediate it.

Supporting the next stage of research on implementation quality

Studying the process of moving evidence-based school programs into widespread practice with fidelity will require partnerships with multiple school systems (Glasgow et al, 2003). The challenges of multi-site trials, such as maintaining consistency in design, measurement, and intervention protocols, are present when working with multiple schools in any single school district. Funding large-scale implementation studies is another challenge, especially when schools are the unit of randomization. Research grants cannot cover the entire cost of services for schools included in designs that manipulate intervention content or aspects of the support system model (such as method of training, or coaching versus no coaching). The scientific community needs to develop models of researcher-community partnerships. Such partnerships can ensure that the programs developed are applicable and relevant to school systems, and that the systems are ready to adopt and scale up these programs as they are proven effective. Partnership networks could provide opportunities to link funds from multiple sources, including district funds, local funds, grants from national foundations, service grants, and research grants. The concept of 'braided funding' is advocated by professional organizations that specialize in prevention research (Society for Prevention Research, 2005). Even more important, such partnerships offer the opportunity for districts and researchers to learn from and support each other in their work toward the shared goal of moving evidence-based programs into general practice with high-quality implementation.

Conclusion

Federal and state policies place schools under tremendous scrutiny, and many schools are turning to evidence-based preventive interventions in an effort to improve student outcomes. Despite the large number of interventions that have been shown to be effective in preventing problem behavior in children and adolescents, and the federal policies that support their use, theory and research on how best to implement these programs and practices with high fidelity in schools are limited (Racine, 2006; Schoenwald & Hoagwood, 2001). The existing research suggests that contextual factors at multiple levels (macro, school, and individual) have a significant impact on the quality with which evidence-based interventions are implemented in schools. This paper addresses the research-to-practice gap by presenting a multilevel model that identifies factors influencing implementation quality in school settings. A primary aim of the paper is to promote a comprehensive research agenda for moving evidence-based programs into general practice in schools with high-quality implementation.

At present, there are more questions than answers about how to integrate preventive interventions in schools so that they are implemented with high quality and sustained over time. Even so, the possibility of building an empirically based science of implementation is timely and opportune. It is promising that federal agencies, such as the National Institute of Mental Health, the Centers for Disease Control and Prevention, and the Institute of Education Sciences, are increasingly funding research that aims to inform all stages of the process of moving research into practice. It is imperative that the next stage of work be informed by comprehensive and well-thought-out theories of implementation and dissemination that can guide empirical research and inform practice, thereby maximizing positive intervention impact.

Acknowledgement

Preparation of this manuscript was supported by a grant jointly funded by the National Institute of Mental Health and the National Institute on Drug Abuse (P30 MH06624) and by grants from the Centers for Disease Control and Prevention (1U49CE000728 and K01CE001333-01) and the National Institute on Drug Abuse (R01 DA019984).


Address for correspondence

Celene Domitrovich, The Pennsylvania State University, Prevention Research Center, 109 South Henderson Building, University Park, PA, 16801; e-mail: [email protected].

References

Aarons GA (2005) Measuring provider attitudes toward adoption of evidence-based practice: considerations of organizational context and individual differences. Child and Adolescent Psychiatric Clinics of North America 14 255–71.

Aarons GA & Sawitzky AC (2006) Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services 3 61–72.

Aber JL, Jones SM, Brown JL, Chaudry N & Samples F (1998) Resolving conflict creatively: evaluating the developmental effects of a school-based violence prevention program in neighborhood and classroom context. Development and Psychopathology 10 187–213.

Adelman HS & Taylor L (2003) On sustainability of project innovations as systemic change. Journal of Educational and Psychological Consultation 14 1–25.

Atkins MS, Frazier SL, Leathers SJ et al (in press) Teacher key opinion leaders and mental health consultation in urban low-income schools. Journal of Consulting and Clinical Psychology.

Atkins MS, McKay MM, Arvanitis P et al (1998) An ecological model for school-based mental health services for urban low-income aggressive children. Journal of Behavioral Health Services & Research 25 64–75.

Barrett S, Bradshaw CP & Lewis-Palmer T (2008) Maryland state-wide PBIS initiative: systems, evaluation, and next steps. Journal of Positive Behavior Interventions 10 105–14.

Basen-Engquist K, O'Hara-Tompkins N, Lovato CY et al (1994) The effect of two types of teacher training on implementation of smart choices: a tobacco prevention curriculum. Journal of School Health 64 334–9.

Battistich V & Hom A (1997) The relationship between students' sense of their school as community and their involvement in problem behaviors. American Journal of Public Health 87 1997–2001.

Borg MG, Riding RJ & Falzon JM (1991) Stress in teaching: a study of occupational stress and its determinants, job satisfaction and career commitment among primary schoolteachers. Educational Psychology 11 59–75.

Botvin GJ, Baker E, Dusenbury L, Botvin EM & Diaz T (1995) Long-term follow-up results of a randomized drug abuse prevention trial in a white middle-class population. Journal of the American Medical Association 273 1106–12.

Bradshaw CP, Buckley J & Ialongo N (in press) School-based service utilization among urban children with early-onset educational and mental health problems: the squeaky wheel phenomenon. School Psychology Quarterly.

Bradshaw CP, Koth CW, Bevans KB, Ialongo N & Leaf PJ (in press) The impact of school-wide positive behavioral interventions and supports (PBIS) on the organizational health of elementary schools. School Psychology Quarterly.

Bradshaw CP, Koth CW, Thornton L & Leaf PJ (2008) 'Altering school context through school-wide positive behavioral interventions and supports: findings from a group-randomized effectiveness trial.' Manuscript submitted for publication.

Bradshaw CP, Reinke WM, Brown LD, Bevans KB & Leaf PJ (2008) Implementation of school-wide positive behavioral interventions and supports (PBIS) in elementary schools: observations from a randomized trial. Education & Treatment of Children 31 1–26.

Brint S (2006) Schools as social institutions. Schools and Society. Stanford University Press.

Bronfenbrenner U (1979) The Ecology of Human Development: Experiments by nature and design. Cambridge, MA: Harvard University Press.

Brooks V (1996) Mentoring: the interpersonal dimension. Teacher Development 5–10.

Brounstein PJ, Gardner SE & Backer TE (2006) Research to practice: efforts to bring effective prevention to every community. Journal of Primary Prevention 27 91–109.

Bryk AS & Schneider B (2002) Trust in Schools: A core resource for improvement. New York: Russell Sage.

Bumbarger B & Perkins D (2008) 'After randomized trials: issues related to dissemination of evidence-based interventions.' Unpublished manuscript.

Burns BJ & Hoagwood K (Eds) (2005) Evidence-based practices Part II: Effecting change. Child and Adolescent Psychiatric Clinics of North America 14 (2) xv–xvii.

Burns MK & Symington T (2002) A meta-analysis of prereferral intervention teams: student and systemic outcomes. Journal of School Psychology 40 437–47.

Cameron R, Brown KS, Best JA et al (1999) Effectiveness of a social influences smoking prevention program as a function of provider type, training method, and school risk. American Journal of Public Health 89 1827–31.

Castro FG, Barrera M & Martinez CR (2004) The cultural adaptation of prevention interventions: resolving tensions between fidelity and fit. Prevention Science 5 41–5.

Catalano RF, Berglund ML, Ryan JAM, Lonczak HS & Hawkins JD (2002) Positive youth development in the United States: research findings on evaluations of positive youth development programs. Prevention & Treatment 5 Article 15.

Chen HT (1990) Theory-Driven Evaluations. Newbury Park, CA: Sage Publications.

Chen HT (1998) Theory-driven evaluations. Advances in Educational Productivity 7 15–34.

Chen HT (2003) Theory-driven approach for facilitation of planning health promotion or other programs. Canadian Journal of Program Evaluation 18 91–113.

Chinman M, Hannah G, Wandersman A et al (2005) Developing a community science research agenda for building community capacity for effective preventive interventions. American Journal of Community Psychology 35 143–57.

Choi JN (2003) How does context influence individual behavior? Multilevel assessment of the implementation of social innovations. Prevention & Treatment 6 1–8.

Collaborative for Academic, Social, and Emotional Learning (2005) The Illinois Edition of Safe and Sound: An educational leader's guide to evidence-based social and emotional learning programs. Chicago, IL: Author.

Committee for Children (n.d.) 'Second Step: steps for successful implementation in schools [Handout].' Seattle, WA: Author.

Dane AV & Schneider BH (1998) Program integrity in primary and early secondary prevention: are implementation effects out of control? Clinical Psychology Review 18 23–45.

Datnow A & Castellano M (2000) Teachers' responses to Success for All: how beliefs, experiences, and adaptations shape implementation. American Educational Research Journal 37 775–99.

Datnow A, Hubbard L & Mehan H (2002) Extending Educational Reform: From One School to Many. London: Routledge/Falmer.

Derzon JH, Sale E, Springer JF & Brounstein P (2005) Estimating intervention effectiveness: synthetic projection of field evaluation results. Journal of Primary Prevention 26 321–43.

Devaney E, O'Brien MU, Resnik H, Keister S & Weissberg RP (2006) Sustainable School-Wide Social and Emotional Learning: Implementation guide and toolkit. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning.

Domitrovich CE & Greenberg MT (2000) The study of implementation: current findings from effective programs for school aged children. Journal of Educational and Psychological Consultation 11 193–222.

Dunn RJ & Harris LG (1998) Organizational dimensions of climate and the impact on school achievement. Journal of Instructional Psychology 25 100–14.

Durlak JA (1998) Why program implementation is important. Journal of Prevention and Intervention in the Community 17 5–18.

Durlak JA & DuPre EP (2008) Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology 41 327–50.

Durlak JA & Weissberg RP (2005, August) 'A major meta-analysis of positive youth development programs.' Invited presentation at the Annual Meeting of the American Psychological Association, Washington, DC.

Durlak JA, Weissberg RP, Dymnicki AB, Taylor RD & Schellinger K (2008) The Effects of Social and Emotional Learning on the Behavior and Academic Performance of School Children. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning.

Dusenbury L, Brannigan R, Falco M & Hansen WB (2003) A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Education Research 18 237–56.

Dusenbury L, Brannigan R, Hansen WB, Walsh J & Falco M (2005) Quality of implementation: developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research 20 308–13.

Dusenbury L & Hansen WB (2004) Pursuing the course from research to practice. Prevention Science 5 55–9.

Dusenbury L, Hansen W, Jackson-Newson J et al (2007) 'Coaching to implementation fidelity.' Paper presented at the 15th annual meeting of the Society for Prevention Research, May. Washington, DC.

Elias MJ, Zins JE, Graczyk PA & Weissberg RP (2003) Implementation, sustainability, and scaling up of social-emotional and academic innovations in public schools. School Psychology Review 32 303–19.


Ellickson PL & Bell RM (1990) Drug prevention in junior high: a multi-site longitudinal test. Science 247 1299–305.

Ellickson PL, Bell RM & McGuigan K (1993) Preventing adolescent drug use: long-term results of a junior high program. American Journal of Public Health 83 856–61.

Elliott D (1998) Blueprints for Violence Prevention. Golden, CO: Venture.

Elliott DS & Mihalic S (2004) Issues in disseminating and replicating effective prevention programs. Prevention Science 5 47–53.

Ennett ST, Ringwalt CL, Thorne J et al (2003) A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science 4 1–14.

Esposito C (1999) Learning in urban blight: school climate and its effect on the school performance of urban, minority, low-income children. School Psychology Review 28 365–77.

Evers WJ, Brouwers A & Tomic W (2002) Burnout and self-efficacy: a study on teachers' beliefs when implementing an innovative educational system in the Netherlands. British Journal of Educational Psychology 72 227–44.

Fagan AA & Mihalic S (2003) Strategies for enhancing the adoption of school-based prevention programs: lessons learned from the Blueprints for Violence Prevention replications of The Life Skills Training Program. Journal of Community Psychology 31 235–53.

Farmer EMZ, Burns BJ, Phillips SD, Angold A & Costello EJ (2003) Pathways into and through mental health services for children and adolescents. Psychiatric Services 54 60–6.

Feinberg ME, Gomez BJ, Puddy RW & Greenberg MT (2008) Evaluation and community prevention coalitions: validation of an integrated web-based/technical assistance consultant model. Health Education and Behavior 35 9–21.

Fishman BJ, Marx RW, Best S & Tal RT (2003) Linking teacher and student learning to improve professional development in systemic reform. Teaching and Teacher Education 19 643–58.

Fishman BJ, Best S, Foster J & Marx R (2000) 'Fostering teacher learning in systemic reform: a design proposal for developing professional development.' Paper presented at the annual meeting of the National Association of Research in Science Teaching, New Orleans, LA, April 28–May 1.

Fixsen DL, Naoom SF, Blasé KA, Friedman RM & Wallace F (2005) Implementation Research: A Synthesis of the Literature. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).

Flay BR, Biglan A, Boruch RF et al (2005) Standards of evidence: criteria for efficacy, effectiveness and dissemination. Prevention Science 6 151–75.

Garet M, Porter A, Desimone L, Birman B & Yoon KS (2001) What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal 38 915–45.

Gingiss PL (1992) Enhancing program implementation and maintenance through a multiphase approach to peer-based staff development. Journal of School Health 62 161–6.

Gingiss PM, Roberts-Gray C & Boerm M (2006) Bridge-It: a system for predicting implementation fidelity for school-based tobacco prevention programs. Prevention Science 7 197–207.

Glasgow RE, Lichtenstein E & Marcus AC (2003) Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health 93 1261–7.

Glisson C & Hemmelgarn A (1998) The effects of organizational climate and interorganizational coordination on the quality of outcomes of children's service systems. Child Abuse & Neglect 22 401–21.

Goddard RD & Goddard YL (2001) A multilevel analysis of the relationship between teacher and collective efficacy in urban schools. Teaching and Teacher Education 17 807–18.

Gorman-Smith D, Beidel D, Brown TA, Lochman J & Haaga AF (2003) Effects of teacher training and consultation on teacher behavior towards students at high risk for aggression. Behavior Therapy 34 437–52.

Gottfredson DC & Gottfredson GD (2002) Quality of school-based prevention programs: results from a national survey. Journal of Research in Crime and Delinquency 39 3–35.

Gottfredson DC & Wilson DB (2003) Characteristics of effective school-based substance abuse prevention. Prevention Science 4 27–38.

Gottfredson G, Nese J, Nebbergall A & Shaw F (2008) 'Alternative measures of implementation in an experimental study of elementary school social skills instruction.' Paper presented at the 16th annual meeting of the Society for Prevention Research, San Francisco, CA, May.

Gottfredson GD, Gottfredson DC, Payne AA & Gottfredson NC (2005) School climate predictors of school disorder: results from a national study of delinquency prevention in schools. Journal of Research in Crime and Delinquency 42 412–44.

Gottfredson GD, Jones EM & Gore TW (2002) Implementation and evaluation of a cognitive-behavioral intervention to prevent problem behavior in a disorganized school. Prevention Science 3 27–38.

Greenberg MT (2004) Current and future challenges in school-based prevention: the researcher perspective. Prevention Science 5 5–13.

Greenberg MT (2007) Community and team member factors that influence the early phase functioning of community prevention teams: the PROSPER project. Journal of Primary Prevention 28 485–504.

Greenberg M, Domitrovich CE, Graczyk P & Zins J (2001) The Study of Implementation in School-Based Preventive Interventions: Theory, research, & practice. Report to Center for Mental Health Services (CMHS), Substance Abuse and Mental Health Services Administration, U.S. Department of Health and Human Services.

Greenberg MT, Domitrovich C & Bumbarger B (2001) The prevention of mental disorders in school-aged children: current state of the field. Prevention & Treatment 4 Article 1.

Greenhalgh T, Robert G, Macfarlane F et al (2005) Diffusion of Innovations in Health Service Organizations: A systematic literature review. Oxford: Blackwell.

Griner D & Smith TB (2006) Culturally adapted mental health intervention: a meta-analytic review. Psychotherapy: Theory, Research, Practice, Training 43 531–48.

Hahn EJ, Noland MP, Rayens MK & Christie DM (2002) Efficacy of training and fidelity of implementation of the life skills training program. Journal of School Health 72 282–7.

Hahn R, Fuqua-Whitley D, Wethington H, Lowy J et al (2007) Effectiveness of universal school-based programs to prevent violent and aggressive behavior: a systematic review. American Journal of Preventive Medicine 33 (Suppl 2) S114–29.

Hall GE & Hord SM (1987) Change in School: Facilitating the process. Albany, NY: State University of New York Press.

Hallinger P & Heck RH (1996) Reassessing the principal's role in school effectiveness: a review of empirical research, 1980–1995. Educational Administration Quarterly 32 5–44.

Halpin A & Croft D (1963) The Organizational Climate of Schools. Chicago: Midwest Administrative Center, University of Chicago.

Han SS & Weiss B (2005) Sustainability of teacher implementation of school-based mental health programs. Journal of Abnormal Child Psychology 33 665–79.

Hansen WB (1996) Pilot test results comparing the All Stars program with seventh grade D.A.R.E.: program integrity and mediating variable analysis. Substance Use & Misuse 31 1359–77.

Hansen WB & McNeal RB (1999) Drug education practice: results of an observational study. Health Education Research 14 85–97.

Haskins R & Loeb S (2007) A Plan to Improve the Quality of Teaching in American Schools. The Future of Children (Policy Brief). Princeton, NJ: Woodrow Wilson School of Public and International Affairs.

Hawkins JD & Catalano RF (1992) Communities that Care: Action for drug abuse prevention. San Francisco, CA: Jossey-Bass Inc.

Hoagwood KE & Johnson J (2003) School psychology: a public health framework. I. From evidence-based practices to evidence-based policies. Journal of School Psychology 41 3–21.

Hoy WK, Hannum J & Tschannen-Moran M (1998) Organizational climate and student achievement: a parsimonious and longitudinal view. Journal of School Leadership 8 336–59.

Illinois State Board of Education: Division of Early Childhood Education (2004) The Illinois Learning Standards.

Jennings PA & Greenberg MT (2008) 'The prosocial classroom: teacher social and emotional competence in relation to child and classroom outcomes.' Manuscript submitted for publication.

Joyce B & Showers B (2002) Student Achievement through Staff Development (3rd edition). Alexandria, VA: Association for Supervision and Curriculum Development.

Kaftarian S, Robinson E, Compton W, Davis BW & Volkow N (2004) Blending prevention research and practice in schools: critical issues and suggestions. Prevention Science 5 1–3.

Kallestad JH & Olweus D (2003) Predicting teachers' and schools' implementation of the Olweus bullying prevention program: a multilevel study. Prevention & Treatment 6.

Kam C, Greenberg MT & Walls CT (2003) Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Prevention Science 4 55–63.

Kataoka S, Rowan B & Hoagwood K (2008) 'Mental health and education policy: issues and future directions.' Paper presented at the Fundamental Policy – Spotlight on Mental Health Conference, Washington, DC.

Katulak NA, Brackett MA & Weissberg RP (2008) School-based social and emotional learning (SEL) programming: current perspectives. In: A Lieberman, M Fullan, A Hargreaves & D Hopkins (Eds) International Handbook of Educational Change (2nd edition). New York: Springer.

Kealey KA, Peterson AV, Gaul MA & Dinh KT (2000) Teacher training as a behavior change process: principles and results from a longitudinal study. Health Education & Behavior 27 64–81.

Kellam SG, Brown CH, Poduska J et al (2008) Effects of a universal classroom behavior management program in first and second grades on young adult behavioral, psychiatric, and social outcomes. Drug and Alcohol Dependence 95 S5–S28.

Kellam SG, Ling X, Merisca R, Brown CH & Ialongo N (1998) The effect of the level of aggression in the first grade classroom on the course and malleability of aggressive behavior into middle school. Development and Psychopathology 10 165–85.

Klein KJ & Kozlowski SWJ (2000) From micro to meso: critical steps in conceptualizing and conducting multilevel research. Organizational Research Methods 3 211–36.

Klein KJ & Sorra JS (1996) The challenge of innovation implementation. Academy of Management Review 21 1055–80.

Koth CW, Bradshaw CP & Leaf PJ (2008) Examining the relationship between classroom-level factors and students' perception of school climate. Journal of Educational Psychology 100 96–104.

Kumpfer KL, Alvarado R, Smith P & Bellamy N (2002) Cultural sensitivity and adaptation in family-based prevention interventions. Prevention Science 3 241–6.

Leach DJ & Conto H (1999) The additional effects of process and outcome feedback following brief in-service training. Educational Psychology 19 441–62.

Leaf PJ, Alegria M, Cohen P et al (1996) Mental health service use in the community and schools: results from the four-community MECA study. Journal of the American Academy of Child & Adolescent Psychiatry 35 889–97.

Lillehoj CJ, Griffin KW & Spoth R (2004) Program provider and observer ratings of school-based preventive intervention implementation: agreement and relation to youth outcomes. Health Education & Behavior 31 242–57.

Lochman J, Powell N, Boxmeyer C et al (2008) 'The effect of school and counselor characteristics on implementation of a preventive intervention.' Paper presented at the 16th annual meeting of the Society for Prevention Research, May. San Francisco, CA.

Lytle LA, Ward J, Nader PR, Pederson S & Williston BJ (2003) Maintenance of a health promotion program in elementary schools: results from the CATCH-ON study key informant interviews. Health Education and Behavior 30 503–18.

McCormick KM & Brennan S (2001) Mentoring the new professional in interdisciplinary early childhood education. Topics in Early Childhood Special Education 21 (3) 131–49.

McNeal RB, Hansen WB, Harrington NG & Giles SM (2004) How All Stars works: an examination of program effects on mediating variables. Health Education & Behavior 31 165–78.

Moncher FJ & Prinz RJ (1991) Treatment fidelity in outcome studies. Clinical Psychology Review 11 247–66.

Mrazek PJ & Haggerty RJ (Eds) (1994) Reducing Risks for Mental Disorders: Frontiers for Preventive Intervention Research. Washington, DC: National Academy Press.

Murray DM (1998) Design and Analysis of Group-Randomized Trials. New York: Oxford University Press.

National Research Council (2004) Implementing Randomized Field Trials in Education: Report of a Workshop. L Towne & M Hilton (Eds) Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Noell GH, Witt JC, Slider NJ et al (2005) Treatment implementation following behavioral consultation in schools: a comparison of three follow-up strategies. School Psychology Review 34 87–106.

Owens RG (2004) Organizational Behavior in Education: Adaptive Leadership and School Reform (8th edition). Boston: Pearson Allyn and Bacon.

Pankratz M, Hallfors D & Cho H (2002) Measuring perceptions of innovation adoption: the diffusion of a federal drug prevention policy. Health Education Research 17 315–26.


Parcel GS, Perry CL, Kelder SH et al (2003) School climate and the institutionalization of the CATCH program. Health Education & Behavior 30 489–502.

Parcel GS, Ross JG, Lavin AT et al (1991) Enhancing implementation of the teenage health teaching modules. Journal of School Health 61 35–8.

Payne AA, Gottfredson DC & Gottfredson GD (2006) School predictors of the intensity of implementation of school-based prevention programs. Prevention Science 7 225–37.

Pentz MA (2004) Form follows function: designs for prevention effectiveness and diffusion research. Prevention Science 5 23–9.

Perry CL, Murray DM & Griffin G (1990) Evaluating the statewide dissemination of smoking prevention curricula: factors in teacher compliance. Journal of School Health 60 501–4.

Poduska J, Kellam S, Wang W et al (2008) Impact of a universal classroom behavior intervention in first grade on young adult service use for problems with emotions, behavior, or drugs or alcohol. Drug and Alcohol Dependence 95 S29–S44.

Putnam RT & Borko H (2000) What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher 29 4–15.

Racine DP (2006) Reliable effectiveness: a theory on sustaining and replicating worthwhile innovations. Administration and Policy in Mental Health and Mental Health Services Research 33 356–87.

Ransford C (2008) 'The Role of School and Teacher Characteristics on Teacher Burnout and Implementation Quality of SEL Curriculum.' Unpublished doctoral dissertation, Pennsylvania State University.

Raudenbush SW & Bryk AS (2002) Hierarchical Linear Models: Applications and Data Analysis Methods (2nd edition). Thousand Oaks, CA: Sage.

Reichers AE & Schneider B (1990) Climate and culture: an evolution of constructs. In: B Schneider (Ed) Organizational Climate and Culture. San Francisco: Jossey-Bass.

Rimm-Kaufman SE, Fan X, Chiu Y & You W (2007) The contribution of the Responsive Classroom approach on children's academic achievement: results from a three-year longitudinal study. Journal of School Psychology 45 401–21.

Ringeisen H, Henderson K & Hoagwood K (2003) Context matters: schools and the 'research to practice gap' in children's mental health. School Psychology Review 32 153–68.

Ringwalt CL, Ennett S, Johnson R et al (2003) Factors associated with fidelity to substance use prevention curriculum guides in the nation's middle schools. Health Education and Behavior 30 375–91.

Ringwalt CL, Ennett S, Vincus A et al (2002) The prevalence of effective substance use prevention curricula in U.S. middle schools. Prevention Science 3 257–65.

Ringwalt CL, Vincus A, Ennett S, Johnson R & Rohrbach LA (2004) Reasons for teachers' adaptation of substance use prevention curricula in schools with non-White student populations. Prevention Science 5 61–7.

Rogers EM (2003) Diffusion of Innovations (5th edition). New York: The Free Press.

Rohrbach LA, Graham JW & Hansen WB (1993) Diffusion of a school-based substance abuse prevention program: predictors of program implementation. Preventive Medicine 22 237–60.

Rohrbach LA, Grana R, Sussman S & Valente TW (2006) Type II translation: transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions 29 302–33.

Rohrbach LA, Ringwalt CL, Ennett ST & Vincus AA (2005) Factors associated with adoption of evidence-based substance use prevention curricula in US school districts. Health Education Research 20 514–26.

Rose DJ & Church RJ (1998) Learning to teach: the acquisition and maintenance of teaching skills. Journal of Behavioral Education 8 5–35.

Ross JG, Luepker RV, Nelson GD, Saavedra P & Hubbard BM (1991) Teenage health teaching modules: impact of teacher training on implementation and student outcomes. Journal of School Health 61 31–18.

Saul J, Duffy J, Noonan R, Lubell K et al (2008) Bridging science and practice in violence prevention: addressing ten key challenges. American Journal of Community Psychology 41 197–205.

Schoenwald SK & Hoagwood K (2001) Effectiveness, transportability, and dissemination of interventions: what matters when? Psychiatric Services 52 1190–7.

Scott TM & Martinek G (2006) Coaching Positive Behavior Support in school settings. Journal of Positive Behavior Interventions 8 165–73.

Shinn M (2003) Understanding implementation of programs in multilevel systems. Prevention & Treatment 6 (1).

Skara S, Rohrbach LA, Sun P & Sussman S (2005) An evaluation of the fidelity of implementation of a school-based drug abuse prevention program: Project Toward No Drug Abuse (TND). Journal of Drug Education 35 305–29.

Society for Prevention Research (2005) A Case for Braided Prevention Research and Service Funding. Fairfax, VA: Author.

Spoth RL & Greenberg MT (2005) Toward a comprehensive strategy for effective practitioner-scientist partnerships and large-scale community health and well-being. American Journal of Community Psychology 35 107–26.

Spoth R, Guyll M, Lillehoj CJ, Redmond C & Greenberg MT (2007) PROSPER study of evidence-based intervention implementation quality by community-university partnerships. Journal of Community Psychology 35 981–99.

Stevens V, van Oost P & de Bourdeaudhuij I (2001) Implementation process of the Flemish antibullying intervention and relation with program effectiveness. Journal of School Psychology 39 303–17.

St Pierre TL, Osgood DW, Siennick SE, Kauh TJ & Burdon FF (2007) Project ALERT with outside leaders: what leader characteristics are important for success? Prevention Science 8 51–64.

Sugai G & Horner R (2006) A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review 35 245–59.

Tanyu MF, Payton J, Weissberg RP & O'Brien MU (2005) Social and Emotional Learning Practices in Illinois Schools: A survey of district-level and school-building administrators. A report prepared for the Illinois State Board of Education and the Illinois Children's Mental Health Partnership. Chicago, IL.

Tobler NS, Roona MR, Ochshorn P et al (2000) School-based adolescent drug prevention programs: 1998 meta-analysis. Journal of Primary Prevention 20 275–337.

Tolan PH, Gorman-Smith D & Henry DB (2004) Supporting families in high risk settings: proximal effects of the SAFE children prevention program. Journal of Consulting and Clinical Psychology 72 855–69.

Tschannen-Moran M & Woolfolk Hoy A (2001) Teacher efficacy: capturing an elusive construct. Teaching and Teacher Education 17 783–805.

Van Houtte M (2005) Climate or culture? A plea for conceptual clarity in school effectiveness research. School Effectiveness and School Improvement 16 71–89.

Vicary JR, Smith EA, Swisher JD et al (2006) Results of a 3-year study of two methods of delivery of life skills training. Health Education & Behavior 33 325–39.

Wandersman A (2003) Community science: bridging the gap between science and practice with community-centered models. American Journal of Community Psychology 31 227–42.

Wandersman A, Duffy J, Flaspohler P et al (2008) Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. American Journal of Community Psychology 41 171–81.

Wandersman A & Florin P (2003) Community intervention and effective prevention. American Psychologist 58 441–8.

Wandersman A, Imm P, Chinman M & Kaftarian S (1999) Getting to Outcomes: Methods and tools for planning, evaluation and accountability. Rockville, MD: Center for Substance Abuse Prevention.

Wang MC, Haertel GD & Walberg HJ (1997) Learning influences. In: HJ Walberg & GD Haertel (Eds) Psychology and Educational Practice. Berkeley, CA: McCutchan.

Weissberg RP & Greenberg MT (1998a) Prevention science and community collaborative action research: combining the best from both perspectives. Journal of Mental Health 7 477–90.

Weissberg RP & Greenberg MT (1998b) School and community competence-enhancement and prevention programs. In: W Damon (Series Ed), IE Sigel & KA Renninger (Vol. Eds) Handbook of Child Psychology: Vol 4. Child psychology in practice (5th edition). New York: John Wiley & Sons.

West SG & Aiken LS (1997) Toward understanding individual effects in multicomponent prevention programs: design and analysis strategies. In: KJ Bryant, M Windle & SG West (Eds) The Science of Prevention: Methodological Advances from Alcohol and Substance Abuse Research. Washington, DC: American Psychological Association.

Wilson DB, Gottfredson DC & Najaka SS (2001) School-based prevention of problem behaviors: a meta-analysis. Journal of Quantitative Criminology 17 247–72.

Wilson SJ, Lipsey MW & Derzon JH (2003) The effects of school-based intervention programs on aggressive behavior: a meta-analysis. Journal of Consulting and Clinical Psychology 71 136–49.