Integrating Triple P into Existing Family Support Services: A Case Study on Program Implementation

Rhonda Breitkreuz & David McConnell & Amber Savage & Alec Hamilton

Published online: 13 July 2011. © Society for Prevention Research 2011

Abstract The purpose of this paper is to present a case study of “evidence-based” program uptake and implementation. The process of integrating Triple P (levels 2 and 3) into existing family support centers in Alberta, Canada, was examined. We conducted ten individual interviews with directors, and ten group interviews, involving a total of 62 practitioners across ten Triple P pilot sites. Key findings show that there was variability in the approach and extent to which Triple P was integrated into family support centers. Five key factors impacting the integration process emerged from the interviews. These were: (1) the level of development of pre-existing support services; (2) the degree of “fit” between the Triple P program approach and existing agency practice, including the perceived suitability/unsuitability for some client groups; (3) practitioner perceptions of the adaptability of the program; (4) rules about who can and who cannot use Triple P resources; and (5) training and sustainability issues. In addition to identifying specific factors, this study was able to provide some insight as to why and how these factors were significant, thereby adding to the literature on knowledge/program dissemination processes.

Keywords Triple P · Program implementation · Knowledge dissemination · Evidence-based programs · Parenting program

Introduction

The Triple P Positive Parenting Program, a behavior-based parent training and support program, was developed in Australia and has been widely implemented in many countries including Canada, the United States, New Zealand, the Netherlands, and Germany. Triple P comprises five levels of intervention, ranging from multi-media strategies designed to improve parent access to high-quality parenting information, through to multi-modal parent training with enhancements for high-risk families. Triple P has a well-structured and systematic strategy of program dissemination that includes practitioner training, accreditation, and use of proprietary resources. The dissemination of Triple P is based on an ecological model informed by self-regulatory and systems-contextual approaches (Sanders and Turner 2005). It has been identified as an effective, relatively inexpensive, research-supported program with a substantial and increasing number of randomized control trials to provide evidence of its effectiveness (Sanders et al. 2002, 2009; Sanders and Turner 2005; Seng et al. 2006).

One particularly interesting aspect of Triple P is that it is being introduced into existing organizations with diverse organizational cultures, staff of varying educational backgrounds, and wide-ranging geographic and cultural settings. As such, the melding of this program into existing organizations provides an interesting window into the uptake of evidence-based programs in naturalistic settings with pre-established workplace cultures, programs, and staff. Yet, to date, this aspect of Triple P has received little scholarly attention. Situated within this context, the purpose of this paper is to contribute to the literature on knowledge dissemination by presenting a case study of one Triple P implementation process in the Province of Alberta, Canada.

R. Breitkreuz (*)
Human Ecology, University of Alberta, Edmonton, AB T6G 2N1, Canada
e-mail: [email protected]

D. McConnell · A. Savage · A. Hamilton
Family and Disability Studies Initiative, University of Alberta, 11487 89 Ave, Edmonton, AB T6G 2M7, Canada

Prev Sci (2011) 12:411–422. DOI 10.1007/s11121-011-0233-6

We investigate the process of integrating levels 2 and 3 of the Triple P system (hereafter referred to simply as Triple P) into existing family support centers in Alberta, including influences on program uptake and utilization.

Background

A growing body of literature that examines the dissemination of evidence-based programs, as well as other knowledge dissemination strategies within the social, health, and behavioral sciences, suggests that there are multiple influences on the uptake of research-supported programs. This research suggests that key influences on the extent to which uptake of new programs occurs include: organizational support (Seng et al. 2006), including support from front-line staff (Rapp et al. 2010); compatibility (Addis 2002); adaptability (Addis and Krasnow 2000); practitioner confidence (Aarons and Palinkas 2007); opportunity to trial the program (Rogers 1995); and evidence that the new program will meet existing needs (Berwick 2008). In short, evidence of efficacy is typically not enough to ensure adoption.

Critical to uptake is support from top levels of the organization (Seng et al. 2006). Management has to be open to the innovation and willing to invest in the change process, including practitioner training and program resources (Addis 2002). For this to occur, there is typically some perceived or demonstrated cost-advantage (Linney 1990). However, evidence of efficacy and “top-down” support are still not sufficient conditions for the successful dissemination of research products and utilization of evidence-supported interventions (Schinke et al. 1991).

One key determinant of successful dissemination activity, in terms of program uptake, is the compatibility or fit of the program with currently felt needs and the beliefs and values of the potential adopter, where the potential adopter could be an organization and/or individual practitioner (Berwick 2008; Landry et al. 2006). When a good fit exists, programs are more likely to be accepted and integrated into practice. In contrast, when the values of the organization are seen to be put at risk by the proposed program, integration is less likely. “Mis-fit” occurs, for example, when the implementation of pre-packaged manualized programs is perceived, rightly or wrongly, as detracting from the therapeutic relationship or as antithetical to client-centered practice: turning professionals into technicians rather than caring human beings (Addis 2002; Addis and Krasnow 2000).

Another key influence on the uptake of evidence-supported programs is the perceived simplicity (ease of adoption) and adaptability of the program. Dissemination is more likely to succeed when the program is simple, flexible, and adaptable to different adoption settings. This includes, but is not limited to, the perceived adaptability of the program for different client groups and particular client needs. Local adaptation of evidence-supported programs is, however, controversial. Proponents of strict program fidelity point to evidence suggesting that tailoring may reduce program efficacy (e.g., Kumpfer et al. 2002). Diffusion research shows, however, that any insistence on rigid adherence may be a barrier to successful dissemination. Programs (and other innovations) that are successfully disseminated are almost always adapted in some way (Berwick 2003). Skillful competence appears to be a more realistic goal than rigid, technical adherence (Addis and Krasnow 2000).

The training experience for a new evidence-supported program and, in turn, practitioner confidence in his or her newly developed implementation skills have also been identified as important determinants of program uptake and sustained use. In the child welfare context, Aarons and Palinkas (2007) found that practitioners were more likely to “buy in” to a new program if the rationale for implementation was clear; if the trainers demonstrated respect for the practitioners’ experiences and were responsive to their concerns; and if the trainer was perceived by practitioners to have expertise. In addition, diffusion research suggests that practitioners may need to trial the program and observe the benefits for themselves before an evidence-supported program or innovation is fully integrated into their helping repertoire (Rogers 1995). Addis (2002) also notes that learning a new program often requires practitioners to step out of their comfort zone, so opportunities to try out new interventions and to receive support from colleagues are often vital for practitioners to develop confidence in their implementation skills. Moreover, positive client feedback may be the single most important determinant of whether a program is fully adopted and sustained (Sanders et al. 2009).

Informed by this previous research, we conducted individual and group interviews with the directors and staff at ten Triple P sites in Alberta to identify facilitators of and barriers to implementation. Importantly, in addition to identifying specific factors, this study was also able to provide some insight as to why and how these factors were significant, thereby adding to the literature on knowledge dissemination processes.

Triple P

The primary objective of Triple P is to improve the overall health, resourcefulness, and independence of families through enhancing parental knowledge, skills, and confidence. A key assumption of Triple P is that enhanced parenting will lead to healthy child development and reduced incidences of child abuse, mental illness, and behavioral problems. The program is based on a multi-level approach that includes five intervention levels. In the Triple P approach, level 1 includes a media campaign intended for all parents interested in learning about parenting and child development. Levels 2 and 3 (detailed below) offer interventions by primary care practitioners for specific behavior problems. Level 4 is for parents of children with more severe behavioral problems and entails eight to ten sessions. Finally, level 5 offers intensive parent training programs addressing broader family issues such as relationship conflict and parental depression, anger, and stress. It is usually provided to parents who have already taken, or are currently in, a level 4 program (Triple P 2010).

Triple P (Levels 2 and 3)

Levels 2 and 3 of Triple P, currently being implemented in Alberta, are designed for use in primary care settings with parents who seek professional guidance and support to deal with common, discrete child behavior problems (e.g., tantrums, whining) and challenging (but typical) child developmental transitions (e.g., toilet training). Selected Triple P (i.e., level 2) is available in two formats. The first is a brief, one- to two-session intervention that provides early anticipatory developmental guidance to parents of children with mild behavioral difficulties or developmental issues, with the aid of tip sheets and videotapes that demonstrate specific parenting skills. Additionally, Selected Triple P can be offered as a seminar series, including three specific positive parenting topics. The seminars are used both to promote awareness of Triple P and to provide information to parents. Each seminar includes a presentation, a question and answer period, distribution of parenting tip sheets, and an opportunity for parents to consult with practitioners to make individual inquiries and request further assistance. Primary Care Triple P (i.e., level 3) is a four-session intervention designed for children with mild to moderate behavior problems and includes active skills training for parents.

Implementing Triple P in Alberta

In 2007, Alberta Children and Youth Services (ACYS) implemented a pilot of levels 2 and 3 of the Triple P program in 19 family support centers in three regions of the Province of Alberta. The pilot sites included urban and rural locations. ACYS limited training in the pilot to levels 2 and 3 of the Triple P system, determining that these would provide the most appropriate levels of intervention in the non-targeted settings of family support centers. The Triple P pilot sites were expected to integrate Triple P programming into current parent education services, replacing programs that addressed similar issues but were identified by ACYS as lacking an evidence base.

Triple P International was contracted to provide training and accreditation for 60 family support center staff in level 2 (provision of parenting advice through seminars and brief consultations with parents) and level 3 (narrow-focus parent skills training) in 2007–2008. Staff from the agencies participating in this evaluation received Triple P training and accreditation in two waves. The first cohort was trained in fall 2007, and the second cohort was trained in fall 2008. Staff participated in four consecutive days of training in Triple P levels 2 and 3, followed by an accreditation session 6 weeks after training.

Methods

During the period of May through July 2009, interviews were conducted with ten directors of family support agencies piloting Triple P, and group interviews were conducted with 62 practitioners (including Triple P accredited and non-accredited staff and directors) at each of the ten Triple P “pilot” sites in both rural and urban areas throughout the Province of Alberta. All of the family support centers piloting Triple P had five universal pre-existing core services: parent education, early learning and care (e.g., drop-in playgroup activities), developmental screening, family support (e.g., community kitchen, clothing exchange), and information and referral. All interviews were conducted at the participating agencies by a doctoral student who was also an experienced practicing psychologist with demonstrated interviewing and group facilitation skills. Participation was voluntary and written informed consent was obtained.

Each group interview consisted of three to nine practitioners, two or three of whom were Triple P trained. There was an average of six practitioners per group interview, and these interviews took between 60 and 90 min to complete. The participating practitioners were all women, and most of them had college diplomas or undergraduate degrees in early childhood development or social work. The work experience of the practitioners ranged from 3 years to over 20 years of employment in family support services. The participants also had a range of ethnic and racial backgrounds, including those of Aboriginal origin, reflecting the population of Alberta.

A semi-structured interview format was employed. Semi-structured interviews are designed to seek information about a particular topic, covering various domains of knowledge, while still maintaining the flexibility of an unstructured interview (Richards and Morse 2007). The areas of interest for the purposes of these interviews included: information about the local community and the parents accessing the service; the organization, its mission, and the range of services it provided; the experience of implementing Triple P within the organization; and the strengths and limitations of Triple P from the perspective of the practitioners. Using this approach, an interview guide was developed to shape the course of the interview and ensure that particular areas of interest were considered, but it was used more as an aide memoire than a rigid interview protocol. The aide memoire was adapted over the course of the interviews as the concurrent data analysis revealed data collection needs (e.g., divergent findings or emerging themes that required further exploration). Issues and emerging insights garnered from earlier interviews were also brought up in later interviews for verification. Using this approach, the interviewer was free to probe at certain points to elicit more in-depth information and to ask questions in a responsive manner (Bernard 2000). This style allowed consistent data to be collected, while leaving room for important and enriching data to emerge (Mayan 2009).

Detailed field notes were made by the interviewer following each group and individual interview. The field notes included summaries of the interview as well as brief documentation of compelling points that seemed particularly prevalent. With participant consent, each interview was digitally recorded and then transcribed verbatim. Transcripts were checked for accuracy. The interviewer then completed the preliminary analysis, identifying recurring themes using the constant-comparison method (Strauss and Corbin 1998). This method involves comparing two or more descriptions of experiences or key events and looking for similarities and differences across participants (Miles and Huberman 1994). These descriptive themes were summarized into a preliminary report that detailed specific information about the implementation process of Triple P, how the program was being utilized, what practitioners noticed about the program, how it had helped or hindered practitioners’ ability to fulfill the mandate of the organization, and their perceptions of what had worked well or not worked well in the implementation and delivery of Triple P programming.

The first author then conducted a secondary thematic analysis of the interview data to ensure the rigor of the preliminary analysis; to refine, develop, and expand on recurring themes; and to search for and analyze “negative cases” (i.e., any inconsistencies) (Simons et al. 2008). This was done through an iterative process of reading the transcripts and writing analytical notes throughout, searching for commonalities and noting any points of divergence. By comparing chunks of coded data to look for commonalities, a process called “axial coding” (Strauss and Corbin 1998), interrelationships between codes were discovered, and these codes were merged to create comprehensive themes. Through this in-depth analysis, two overarching themes were identified in the data: 1) Triple P adds value to existing services; and 2) there were facilitators and barriers to the integration of Primary Care Triple P into existing services. Under these comprehensive themes, key factors were delineated, as described in the findings below.

Findings

Primary Care Triple P: Adding Value to Existing Services

Overall, staff at the Triple P pilot sites indicated that although it was still “early days,” the experience of implementing Triple P into their organizations had been positive. The consensus was that Triple P is enabling staff to “do what they do” more efficiently and more effectively. One practitioner describes her success with the program: “I have had a few really great experiences with it—very, very positive.” Interview participants generally indicated that although Triple P was not a radical departure from existing services, there were still a number of ways that Triple P was enhancing agency services. These included: 1) the high quality of Triple P resources; 2) a change in how services were offered; 3) enhanced credibility; and 4) improved linkages with other agencies.

First, practitioners described how the high-quality Triple P resources, and the structured and systematic nature of the program, were optimizing teaching time and effectiveness. One staff member captures this sentiment:

Triple P has [packaged] good parenting well… Once upon a time we had this filing cabinet full of resources. And so I’d have a client come and then I’d have to go back to my filing cabinet, and I knew something on toilet training was in there and I’d get it out. And there’s five things on toilet training that would work for it. And I would write up that sheet and then I’d give it to them. Whereas now, I can go straight to Triple P toilet training.

Another practitioner describes the efficiency of Triple P, portraying the “no nonsense” approach of the program:

With Triple P you’re the expert coming in saying, okay…this is what you need to do. We don’t have a lot of time for small talk… With home visitation you can be messing around for a very long time to get that same solution. It’s efficient. It’s effective. It’s fast, but it’s not that warm and fuzzy…


In this quote, the practitioner hints that there was somewhat of a trade-off between efficiency and rapport-building with the client, yet still describes the effectiveness of this approach.

Second, although Triple P had not necessarily changed what family support practitioners did, for at least some practitioners it had transformed how they did it. Triple P was enhancing efficiency through a systematized process of support and service delivery. One participant explains: “There was nothing new, nothing I had not seen before, nothing I hadn’t come across before… I think it’s in how the Triple P provider approaches as systematically as you do with the forms and the tracking—that makes that program unique.” Triple P was also described as well structured and simple to implement. Another participant summed up Triple P as “a great little package that is easy to deliver.”

Third, practitioners perceived that the accreditation process, and the Triple P emphasis on evidence-based practices, gave them more credibility. Having a “structured,” “defined,” “research-based” program meant that staff could draw on a larger body of evidence to demonstrate that these techniques worked. As one participant said, “this isn’t just something airy-fairy…the key is to stay evidence-based.” The structure of the program, perhaps ironically, facilitated a more individualized case plan for the client:

What’s different about this program, and I appreciate this a lot, is that I am not spending an hour talking at the parent. I am spending an hour working with the parent…So it’s much more interactive. It’s much more focused on the parent. It’s their program. It’s about them, where they’re at, and meeting them where they’re at. I just guide them through it.

Triple P also provided a rationale for presenting and sticking with a particular solution to a problem. One practitioner describes this advantage: “It was excellent because you could say, we have been trained in this. This is an evidence-based program and these are the things that if we follow with, it will work. You know, we have to stick to it.”

In short, offering an evidence-based program was valued and appreciated by organization staff.

Fourth, directors and practitioners reported that Triple P was enhancing linkages with other agencies. One interview participant indicated that they were receiving referrals because “word of mouth is we are doing a good job.” Similarly, another indicated that the health unit had been a very positive source of referrals “because they heard that we are making a difference in those clients’ lives.” Evidence of enhanced inter-agency relationships and increased referrals was described by one agency director:

“We seem to be getting a lot of parents that are being referred from Child Welfare to do Triple P as well.” Others also observed that their relationship with child and youth protection services had never been more positive. As one participant indicated: “We haven’t had, I don’t believe, as much of a relationship with Social Services as we have now.”

In summary, there were numerous aspects of Triple P that added value for existing organizations offering family support and parenting advice: efficiency, a systematic approach, more credibility, and enhanced relationships with other service organizations.

Facilitators and Barriers to Integrating Primary Care Triple P into Existing Agencies

There was variability in the way and extent to which Triple P was integrated into existing family support services. Five key factors impacting the integration process emerged from the group interviews. These were: (1) the level of development of pre-existing services; (2) the degree of “fit” between the Triple P program approach and existing agency practice, including philosophical approach, methods of delivery, and the perceived suitability/unsuitability for some client groups; (3) practitioner perceptions of how free they were to adapt the program; (4) rules about who could and could not use Triple P resources; and (5) training and sustainability issues.

Level of Development of Agency Services

One factor that seemed particularly salient in the implementation of Triple P was the level of service/program development in the agency before Triple P was introduced. If the agency was already running a variety of quality programs, Triple P seemed to be more readily accepted and implemented as another “tool in the toolbox” of resources. This is eloquently described by one practitioner:

You know how they say how you build something…you put the big rocks in first and then the smaller rocks and then the gravel and then the sand and then the water. So we already had the big rocks and probably some of the smaller rocks. I think we are probably at the point where Triple P adds the gravel.

Notably, a few group interview participants pointed out that the existing skills of agency staff contributed greatly to the success of Triple P implementation. One staff member stated: “without the skills and the experience level that the staff bring to Triple P, we would not have the success that we have.” With a well-established agency, staff already had pre-existing relationships with parents, and were subsequently able to “market” Triple P to parents more effectively. This point is described by a practitioner: “So that relationship is important and then it makes them kind of able to participate…so we are getting the parents who have already built a relationship with [staff] and love them.”

Practitioners saw the approach they took to their routine agency programming overall as key to legitimizing Triple P. Because the programs they offered were not stigmatizing or threatening, parents would be open to additional programs such as Triple P. One staff member described this as “normalizing getting help.” Staff believed that it was important that relationships between parents and staff developed within a destigmatizing context in order for their agency mandates to be fulfilled. One staff member indicated that problems were normalized through relationships between staff and families in programs such as drop-in playgroups. Another staff member echoes this sentiment:

It’s through the relationships that we have established with our families. You know, you are not coming here because you’re having parenting difficulties or because you’re isolated or because you’re in a difficult relationship. You are coming here to play with your child. And because they have a relationship with all of us, if these things come up, they are more willing to talk to us. You know, there is not a lot of stigma coming here.

Programs such as playgroups served as a “foot in the door” so that parenting concerns could then be addressed in a non-threatening manner.

If, on the other hand, the agency was still in the process of “getting traction” in its overall programming, it appeared that Triple P was more difficult to implement. There were a number of reasons for this: lack of adequate infrastructure, insufficient staffing, and the inability to coordinate yet another program in an already struggling organization. Not surprisingly, implementing Triple P in an organization that was struggling to survive was difficult at best. This appeared to be the biggest struggle in agencies where there was a hub with various satellite sites in remote locations. One practitioner describes some of the challenges: “We have never established our center really closely out there. We hired one staff, she stayed for almost a year before she went on—and then we’ve had staff short, you know…” In short, having a well-established family support program with pre-existing rapport with parents enhanced the likelihood of successful implementation and uptake of Triple P. In contrast, an agency struggling to staff its programs, particularly in rural and remote areas, had more difficulty implementing a new program.

Degree of “Fit” Between Triple P and Current Agency Practices

There were three key issues with Triple P that some practitioners identified as not fitting well with existing agency approaches and needs: the behavioral approach of Triple P, the lecture style of Triple P seminars, and the “mis-fit” of Triple P for some client groups.

Theoretical Approach Some agency practitioners expressed discomfort with the underlying theory of Triple P: Behavioral Family Intervention (BFI). They indicated that this behavior modification approach ran counter to their training in early childhood development and attachment theory. One participant said that she was “still struggling with how attachment [fits] in Triple P.” Other participants indicated discomfort with the use of “time out” and “cry out” sessions. For instance, one practitioner said: “I am not in love with some of the cry time and time out stuff.” Others indicated that these techniques were far too prevalent in the Triple P materials, stating that “on every single tip sheet it gets back to the time outs and the quiet times.” These concerns with the approach were expressed by practitioners in half of the centers in our study, as indicated in the following quote:

I found the behavior modification stuff, the rewards and time outs, that kind of stuff was too prevalent in it for me—I did not feel comfortable with it. And I finished the training and I went back to my employer and said, I don’t really like this program. I can’t see myself using it—I will use parts of it. There is a lot of stuff I will use but the few things in it that I didn’t like I really don’t like and I feel strongly about.

In addition to being concerned with the philosophy of behavioral approaches, participants indicated that this blanket approach did not seem to recognize the individuality of different children and families:

I struggle with time outs…I don’t necessarily believe in them. And for me it was a bit of a hard—it’s a hard sell… and I also don’t think it works with every child although Triple P would absolutely disagree with me. I think that’s pigeon-holing people and I think you need to find out what works…so that’s my struggle. And I had a hard time presenting that.

Because the content was contrary to their theoretical approach, some practitioners chose not to use some of the Triple P resources. One participant describes how she dealt with this: “And there is one piece in that video, and it seems to me it’s the crying it out [part]. And then you leave and you just let them cry….But I remember saying ‘ladies, I’m not even going to really play this for you…’ I cannot promote something I am completely against.”


Seminar Style Most staff perceived Triple P seminars to be very useful for parents. One staff member indicated that the Triple P seminars had been “hugely, hugely attended, more successful…than any of the [other] parenting courses.” However, some also found the lecture style difficult. This was due to two key issues: their own personal fears of public speaking, and their general disagreement with holding a lecture-type seminar as compared to a more process-oriented workshop style of group work. Concerns were raised about the clinical nature of the seminar approach versus a more process-oriented workshop style of facilitation. This perspective was described by one participant: “I found the validity of doing workshop based programs compared to seminar based, it appeared to me that the participants got more out of it compared to a seminar.” Part of the concern with seminars seemed to be not knowing the impact on parents. One participant summed this up, stating: “Because you spend the whole time just giving information…you’d never have a chance to see how it works.”

The Suitability/Unsuitability of Triple P for Some Client Groups Practitioners reported that Triple P did not work well for English as a second language (ESL) families and was not appropriate for their clients with multiple or more complex needs. One participant who worked in an agency that served many immigrant families explained that because ESL families are struggling with language, the Triple P material, although good, “needed to be simplified.” Another participant indicated that it was a “big challenge” to get through the seminar material with an ESL group. Participants also noted that Triple P was not suitable for their clients who had more complex needs. They indicated that if they screened a family and found that they had more than one or two issues, Triple P (levels 2 and 3) would not be appropriate. One practitioner indicated: “So they need to be at a place where they feel they can focus…whereas if there is too much other stuff going on in their life, they likely don’t have time to track things and make a chart and—you know.”

In summary, the theoretical orientation of Triple P, the approach to seminar delivery, and the unsuitability of Triple P (levels 2 and 3) for some client groups were identified by some practitioners as potential areas of concern for Triple P implementation in their agencies.

“Permission” to Adapt Triple P

The way in which Triple P staff were trained influenced the extent to which Triple P was implemented “by the book” or adapted to meet the specific needs of the parents at a particular site. Practitioners who had participated in one wave of Triple P training used words like “rigid” and “inflexible” to describe the program. Conversely, practitioners who had participated in another wave of training viewed Triple P as a flexible and adaptable program. Although some practitioners described adapting some of the actual material, most adaptations pertained to how the material was delivered. Practitioners sometimes changed the wording of materials or provided additional examples to ensure that the information was clear to clients. Yet another adaptive approach was to combine Triple P resources with other pre-existing program resources, as explained below.

So I just gathered some information when I was here, put a package together for them and fired it off. And there was some Triple P and there was some Active Parenting in it, put it all together. Because if I have information that they are looking for, I am doing them a disservice not to pass it on. So I turned a blind eye…

Concerns related to the perceived adaptability of the program were closely related to understandings of who could or could not utilize Triple P resources.

Rules About Who Can Use Triple P Resources

A number of practitioners described frustration with “the rules” about who could use the tip sheets and other Triple P resources. This sentiment is well captured by one participant: “Thou shalt not give out a tip sheet unless you are an accredited Triple P facilitator….” Another participant suggested that it would be helpful to post the tip sheets on their wall instead of keeping them locked away, where they can only be accessed by Triple P accredited staff. Another staff member indicated that the resources were fabulous and beneficial to parents but inaccessible to staff who weren’t accredited. She voiced her frustration with what she called the “Triple P police.” Yet another used hyperbole to express her viewpoint about how the guidelines for usage of Triple P resources created a barrier:

If your tip sheets are all hidden in this metal cupboard—and you need a swipe card and a key to get into it right, and then show your accreditation pass and put your thumb print in, and it opens—then it makes it seem like scary and something that’s unapproachable…whereas if it’s up on our wall and parents are reading it and they’re interested and are asking questions, it will get utilized, and it will just become a normal part of what we are, what we do, what we offer.

Above, this practitioner is suggesting that in addition to issues with access to resources, there is a kind of prohibition around the resources that is not constructive.


Although practitioners were concerned about breaking the rules of Triple P, it was sometimes difficult to resist the temptation to use the high-quality Triple P resources. One non-accredited practitioner described how she “cheated” by using a video from Triple P, although as she explained, she didn’t call it Triple P because she would “get into trouble”:

I cheated a little with the video. There was a group that already exists that does a parent topic once a month. And you know, they wanted positive parenting. So I just brought the “Every Parent’s Survival Guide” video. And we just played it. And I paused it at good spots and we discussed it, and played a little bit more and we discussed it. And it was actually really successful.

Similarly, one director indicated that in her agency, staff creatively incorporated tip sheets into their general programming in addition to giving them out to select parents. This way, the information could be more broadly utilized.

Training and Sustainability

Participants described the training as “interesting” and “worthwhile” while at the same time indicating that it was “intense,” “stressful,” “overwhelming,” and “difficult.” While some said it was the process of training that was difficult, others described the anticipation of the accreditation process as the pressure point. Staff were unequivocal in their statements, however, that despite how challenging the processes of training and accreditation were, the end result was beneficial. One participant summed up the sentiment well in her comparison of the training experience to childbirth: “It’s like being pregnant, right? You give birth to the baby. You really don’t want to do that again but you like the end result.”

Perhaps a more salient issue around training was concern about the sustainability of Triple P due to staff turnover. A number of organizations had lost a Triple P trained worker, and these workers had not been replaced. There were concerns about how they would be able to continue offering Triple P because training was not offered very frequently, and participants were aware that training was a costly process. Staff suggested that a train-the-trainer model would help to ensure the continuity of the program.

But you know, like for example, if [staff member] was to leave, or if [staff member] was to leave…then we have lost that piece of the program, because there is nobody else…again, it’s just that whole turnover…I think there should be a training trainer. So that even if there was one or two people from each [family support centre] that were trained as trainers…there might be somebody in another [agency] that could still come in and train the staff.

Concerns were also raised about the expense of the program and about how much time and energy it took from staff to implement Triple P. When asked whether or not Triple P had added to the agency, one director said:

It absolutely added. I wouldn’t argue that. But at the same time, you know, you are using the staff you have. And so if you are adding programs to their list then you have to subtract programs somewhere else, right? So you know, in that sense, it’s a bit of a balancing act to just weave it in with what we do and make sure everybody has a balanced piece of the program…I think we have actually been working far beyond our capacity.

Because the programs the organizations offered were delivered by their pre-existing staff, the strain of adding yet another program to staff workloads could be worrisome, and staff shortages were a concern. Clearly, the sustainability of the Triple P program was a prominent concern for staff.

Discussion

From the perspective of Triple P International (the proprietor), Alberta Children and Youth Services (the customer), and many, though not all, of the participating family support agencies and practitioners, the dissemination of Triple P in the Province of Alberta could be viewed as a success. The proprietor of Triple P has a major share in the international market for parenting training programs, and was successful in engaging the interest of ACYS in piloting the program in Alberta. ACYS was also successful in engaging the participation of non-government family support agencies and practitioners, although it is unclear whether or to what extent their participation was truly voluntary. And many family support practitioners reported success in engaging parent-clients in the Triple P program. This case study identified several factors that were key to this success and that may be applicable or transferable to other programs and knowledge dissemination projects. These are: Triple P as evidence-based practice; the organizational or workplace context; and high-quality resources.

Key Success Factors

The branding of Triple P as evidence-based practice appears to be one of the keys to its successful dissemination and implementation in Alberta. In popular evidence-based practice discourse, evidence generated by “hard science” (i.e., randomized controlled trials) is privileged, and the legitimacy of other sources and forms of evidence, such as practice-based experience, is reduced or discounted (Clegg 2005; La Caze 2009). The influence of this popular discourse in preparing the groundwork for the dissemination and implementation of Triple P in Alberta was apparent at two levels of the dissemination chain. First, it appears that ACYS instigated the pilot of Triple P in Alberta despite there being little or no evidence that existing programs were ineffective. Rather, it seems that the branding of Triple P as evidence-based practice, and the fact that existing services were not branded as such (despite having many of the same basic ingredients as Triple P), was sufficient justification for the pilot. At the next level, a number of interview participants explained that although the Triple P program was not radically different from pre-existing programs, the fact that it was “evidence-based” gave them more credibility in the eyes of clients and other service providers. The reported result was increased inter-agency cooperation, client referrals, and client engagement. These findings are similar to those reported by Dean et al. (2003), who found that linkages with external agencies were strengthened through Triple P implementation.

The next key success factor identified in this study was the organizational or workplace context. Most of the participating agencies were established (e.g., in terms of programming and community presence) and stable (e.g., in terms of staffing), and these agencies found it easier to integrate Triple P into the services they offered. Sanders et al. (2009) found that lack of confidence in parent consultation work, lack of time due to after-hours appointments, and lack of knowledge or skill in behavioral family intervention were barriers to use of Triple P. It may be that more established family support centers in our study were able to provide additional time and support to staff implementing Triple P, thereby making the transition to the new program less taxing. A related point is that stable agencies were able to capitalize on parent trust earned through other “non-stigmatizing” programs to engage parents in Triple P. Programs such as drop-in playgroups were an effective outreach to parents in the community and an effective medium for promoting Triple P and bringing parents into the Triple P fold.

The third key success factor was the high-quality Triple P resources. These resources effectively translated knowledge from research into user-friendly resources for practitioners and their clients. Practitioners in this study highlighted “efficiency gains” related to the quick access to high-quality material available through Triple P. Having these educational resources in hand was time-saving, and the systematic nature of the intervention ensured that time was used effectively. The way that Triple P was able to consolidate and distribute research-based information for practitioners shows the importance of this aspect of evidence-based practice. Relevant here is the discipline of health informatics (Spring 2007). This term refers, in part, to how information is stored and managed so that it is readily available when it is needed. Informatics goes beyond resource management, however, to also consider electronic record keeping of client information, professional practice guidelines, and systematic reviews of the evidence. It is, in short, about “the technological systems infrastructure that provides decision support” (Spring 2007, p. 616). Findings from our study suggest that exploring ways to facilitate accessibility to high-quality parenting materials could be helpful for family support centers. Health informatics could be one way to enhance access in a sustainable way. The advantage of programs like Triple P is that they provide consistent access to relevant, high-quality, and up-to-date information for practitioners, facilitating better use of time and enhancing the quality of services that organizations in various locales can provide.

Ongoing Tensions and Threats to the Sustainability of “Evidence-Based” Programs

Although the dissemination and implementation of Triple P in Alberta may be described as a success from at least some vantage points, this case study also sheds light on some of the central tensions, conflicts, and potential barriers inherent in the dissemination of Triple P. In this study, central tensions existed between the interests of the proprietor and the interests of end-users, particularly in regard to training and sustainability; between the requirement of program fidelity and the practical need for adaptation or tailoring to best meet the needs of local populations and clients; between the theoretical underpinnings of the program and the theoretical positions and practice experience of practitioners; and between program scope and client needs. Insights gained from better understanding these tensions could potentially be useful in the implementation of similar standardized programs.

Proprietor and End-User Interests One point of tension surrounded the interests of proprietors and end-users in regard to practitioner training. On the one hand, it was clear that practitioners in our study perceived the training they received from Triple P as worthwhile. This is not surprising, as training in a naturalistic setting offers hands-on skills that didactic approaches cannot teach as effectively (Spring 2007). Onsite training provided group skill development with peers to enhance common understanding. Furthermore, the requirement that practitioners demonstrate their skills through initial testing and repeated practice through an accreditation process, although challenging for practitioners, was perceived as a way to better ensure program credibility. In combination, these aspects of the training process appeared to be key components of successfully preparing practitioners for the implementation of a new program into an existing organization.

On the other hand, the Triple P training and accreditation model, which involved “bringing a trainer in from the outside,” was viewed as a significant threat. Because it was so costly, contracting agencies required the best possible and long-lasting training for their substantial investments. It is here that we see a considerable pressure point in our study. Program sustainability was a significant concern for agency directors and practitioners generally, and more so for those working in environments in which there was high staff turnover. Our findings suggest that there are inherent tensions between the interests of agencies in their efforts to sustain a pre-packaged manualized program in the long term, and the interests of the corporation attempting to build a business that guarantees program integrity and consistency backed by research evidence. There is no easy resolution of this tension between the needs of the individual agencies and those of the program owners. However, this study suggests that innovative solutions will be required in order to find ways to reconcile competing needs.

One such way may be through a train-the-trainer approach. Indeed, several participants suggested that a train-the-trainer model would be more responsive to agency needs and would promote program sustainability. Findings from other implementation studies suggest that train-the-trainer models can be an effective way to increase the likelihood of successful program uptake and sustainability (Corelli et al. 2007; McLellan et al. 2009). Introducing a model such as this inevitably brings new challenges related to program consistency and quality, and trade-offs for standardized programs are inevitable. However, in order to keep programs running over the long term—an interest shared by both the corporations disseminating them and the agencies adopting them—alternative approaches to the implementation of evidence-based programs will need to be considered to enhance sustainability.

Fidelity vs. Adaptation Another tension related to the issue of sustainability was the perceived adaptability of the Triple P manualized program. In our study, some practitioners perceived Triple P to be rigid while others perceived it to be flexible and adaptable. This appeared to be an effect of training, depending on the way the material was presented. The extent to which a program is perceived as adaptable is important. Maintaining program adherence is a legitimate concern for program developers; yet, it is also critical to recognize the importance of adaptations to a particular program in a particular context that will enhance rather than diminish the integrity of the program. Recognizing the difference between competent adaptation and rigid adherence to a program is critical to successful program implementation, but the distinction is not easy to discern (Addis 2002). However, Berwick (2003) indicates that adaptations to programs are imperative to successful implementation. He posits that “innovations are more robust to modification than their inventors think, and local adaptation, which often involves simplification, is nearly a universal property of successful dissemination” (Berwick 2003, p. 1971). The results of this study suggest that such innovations may be an important component of program implementation; however, we also posit that these innovations should be scientifically evaluated in order to more clearly understand the implications of these adaptations for program outcomes.

Theory-Fit The fit or mis-fit between practitioners’ theoretical orientation or preferred approach and the theory and approach of Triple P was another factor that appeared to create some tension for the successful implementation of Triple P. Specifically, some practitioners preferred a more relationship-based approach over the seminar style of Triple P, and some felt that Triple P was “too behavioral” and/or “too problem-focused.” Ogden et al. (2005) had similar findings in their study of a parenting program implemented in Norway. They found that, in addition to having to change their theoretical orientation, practitioners were somewhat resistant to the new approach because the change in direction implied a critique of previous practice. Here, we observe a tension between the valorization of evidence-based practice (Clegg 2005) and practitioners’ experiences of “tried and true” methods of program delivery, known to them to be effective through substantial experiential learning. It seems that introducing “new wine into old wine skins” can create bumps in an implementation process and warrants careful consideration.

Program Scope and Client Needs Another component of “fit” that appeared to affect Triple P implementation was the extent to which the program met some clients’ needs. In particular, agencies serving families with multiple or complex needs or ESL clients raised concerns about the suitability of Triple P for their particular client groups. This concern may in part reflect the levels of Triple P offered in this particular implementation process. Triple P levels 2 and 3 are designed for discrete and/or minor child behavioral problems and are not designed for more complex behavioral problems or families with considerable dysfunction. However, concerns raised about the suitability for ESL families require further investigation. Diversity in clients served, particularly in immigrant-receiving nations such as Canada and the U.S., must be carefully considered when implementing a manualized program in a particular context.

Limitations

There are several limitations to this study. First, when using interviews as a primary research tool, researcher effect must be considered (Kvale 1996): participants may shape their answers to reflect what they think the interviewer wants to hear. Because this study was a case study of the implementation process of a pilot program in family support centers throughout Alberta, it is possible that staff believed that the interviewer wanted to hear positive things about Triple P and framed their responses to questions accordingly.

Using group interviews also has its limitations. Although group interviews can enhance discussion by enriching the details of an experience or event from multiple standpoints, there is also a risk that power dynamics within the group can curtail some points of view while exaggerating others. In these group interviews, practitioners and directors were present, as were Triple P accredited and non-accredited staff. It is possible that some practitioners felt pressure to censor their comments because their boss was there, or because Triple P trained staff whom they did not wish to offend were present. Alternatively, Triple P staff may have felt it necessary to over-emphasize the contributions of non-accredited staff to the successful implementation of Triple P at the sites.

Finally, it must be acknowledged that this is a relatively small case study of a program implemented in many countries throughout the world. The extent to which these findings can be generalized must be carefully assessed, recognizing that experiences of implementation will vary in different national, geographic, and organizational contexts.

Conclusion

Sanders and Turner (2005) attribute the success of Triple P to a variety of factors, including but not limited to the following: quality of the intervention; flexibility of the Triple P system; strategic alliances with organizations, including the identification and support of an internal advocate to ensure that program adoption is supported by management; a “just right” (i.e., not too onerous) approach to practitioner training that includes active skills training; development of peer support and supervision networks; and built-in evaluation mechanisms or “feedback loops” to reinforce success and foster continuous quality improvement.

The findings from our study support some of Sanders and Turner’s (2005) assertions. In particular, the quality of the program, the value of partnership with existing organizations, and the value of high-quality resources were evident in the interviews we conducted with directors and staff of Triple P sites in Alberta. However, our study findings also suggest that there are remaining issues to be resolved in the successful implementation of Triple P into existing organizational structures. The sustainability of the program, given high training costs and the requirement that the trainer be from Triple P International, combined with high staff turnover, is an issue that merits further consideration. The inability to sustain the program due to staff attrition could result in what Schinke et al. (1991) call “maintenance failure” (cited in Sanders and Turner 2005). In addition, the perceived flexibility and adaptability of the program were ongoing concerns for staff. Finally, the behavioral approach was somewhat disconcerting for practitioners educated in different theoretical schools such as attachment approaches, as was the suitability of the program for some client groups. Developers of evidence-based programs must be cognizant of this concern, to avoid an unintended outcome of “one size fits none” in program delivery. Future studies of the implementation process of Triple P and other research-based programs will continue to shed light on how to facilitate successful adoption of a new program into an existing organization. Like Addis (2002), we suggest that careful consideration of voices from the trenches will provide valuable insight into program implementation that will ultimately serve to strengthen evidence-based practice and best assist the clients these programs aim to serve.

Acknowledgement The work was supported by a grant from the Alberta Centre for Child, Family and Community Research.

We would like to thank Laura Hedlin for her research assistance on this manuscript. We would also like to thank two anonymous reviewers who provided valuable feedback on this paper.

References

Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 411–419.

Addis, M. E. (2002). Methods for disseminating research products and increasing evidence-based practice: Promises, obstacles, and future directions. Clinical Psychology: Science and Practice, 9, 367–378.

Addis, M. E., & Krasnow, A. D. (2000). A national survey of practicing psychologists’ attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology, 68, 331–339.

Bernard, H. R. (2000). Social research methods: Qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

Berwick, D. M. (2003). Disseminating innovations in health care. Journal of the American Medical Association, 289, 1969–1975.

Berwick, D. M. (2008). The science of improvement. Journal of the American Medical Association, 299, 1182–1184.

Clegg, S. (2005). Evidence-based practice in educational research: A critical realist critique of systematic review. British Journal of Sociology of Education, 26, 415–428.

Corelli, R., Fenlon, C., Kroon, L., Prokhorov, A., & Hudmon, K. (2007). Evaluation of a train-the-trainer program for tobacco cessation. American Journal of Pharmaceutical Education, 71, 1–9.

Dean, C., Myors, K., & Evans, E. (2003). Community-wide implementation of a parenting program: The South East Sydney Positive Parenting Project. Australian e-Journal for the Advancement of Mental Health, 2. Retrieved December 1, 2010, from http://www.reachoflouisville.com/meath/meath/Community-wide%20Implementation%20of%20a%20Parenting%20Program%20The%20South%20East%20Sydney%20Positive%20Parenting%20Project.pdf

Kumpfer, K. L., Alvarado, R., Smith, P., & Bellamy, N. (2002). Cultural sensitivity and adaptation in family-based prevention interventions. Prevention Science, 3, 241–246.

Kvale, S. (1996). Interviews: An introduction to qualitative research interviewing. Thousand Oaks, CA: Sage.

La Caze, A. (2009). Evidence-based medicine must be…. The Journal of Medicine and Philosophy, 34, 509–527.

Landry, R., Amara, N., Pablos-Mendes, A., Shademani, R., & Gold, I. (2006). The knowledge-value chain: A conceptual framework for knowledge translation in health. Bulletin of the World Health Organization, 84, 597–601.

Linney, J. A. (1990). Community psychology into the 1990s: Capitalizing opportunity and promoting innovation. American Journal of Community Psychology, 18, 1–17.

Mayan, M. (2009). Essentials of qualitative inquiry. Walnut Creek, CA: Left Coast Press.

McLellan, J., Leon, T., Haffey, S., & Barker, L. (2009). Exporting a Canadian parenting education program to the Dominican Republic. Public Health Nursing, 26, 183–191.

Miles, M., & Huberman, A. (1994). Qualitative data analysis: A sourcebook of new methods (2nd ed.). Thousand Oaks, CA: Sage.

Ogden, T., Forgatch, M. S., Askeland, E., Patterson, G. R., & Bullock, B. M. (2005). Implementation of parent management training at the national level: The case of Norway. Journal of Social Work Practice, 19, 317–329.

Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., et al. (2010). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal, 46, 112–118.

Richards, L., & Morse, J. M. (2007). Read me first for a user’s guide to qualitative methods (2nd ed.). Thousand Oaks, CA: Sage.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.

Sanders, M. R., & Turner, K. M. (2005). Reflections on the challenges of effective dissemination of behavioral family intervention: Our experience with the Triple P—Positive Parenting Program. Child and Adolescent Mental Health, 10, 158–169.

Sanders, M. R., Turner, K. M., & Markie-Dadds, C. (2002). The development and dissemination of the Triple P—Positive Parenting Program: A multi-level, evidence-based system of parenting and family support. Prevention Science, 3, 173–189.

Sanders, M. R., Prinz, R. J., & Shapiro, C. J. (2009). Predicting utilization of evidence-based parenting interventions with organizational, service-provider and client variables. Administration and Policy in Mental Health and Mental Health Services Research, 36, 133–143.

Schinke, S. P., Botvin, G. J., & Orlandi, M. A. (1991). Substance abuse in children and adolescents: Evaluation and intervention (Vol. 22). Thousand Oaks, CA: Sage.

Seng, A., Prinz, R., & Sanders, M. (2006). The role of training variables in effective dissemination of evidence-based parenting interventions. International Journal of Mental Health Promotion, 8, 19–27.

Simons, L., Lathlean, J., & Squire, C. (2008). Shifting the focus: Sequential methods of analysis with qualitative data. Qualitative Health Research, 18, 120–132.

Spring, B. (2007). Evidence-based practice in clinical psychology: What it is, why it matters; what you need to know. Journal of Clinical Psychology, 64, 611–631.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Triple P. (2010). What is Triple P? Retrieved December 4, 2010, from http://www27.triplep.net/?pid=29.
