
International Journal of Medical Informatics 83 (2014) e12–e22

Journal homepage: www.ijmijournal.com

Complexity and the science of implementation in health IT—Knowledge gaps and future visions

Patricia A. Abbott a,b,∗, Joanne Foster c, Heimar de Fatima Marin d, Patricia C. Dykes e,f,g

a Division of Nursing Business & Health Systems, University of Michigan School of Nursing, Ann Arbor, MI, USA
b Office of Global Outreach, University of Michigan School of Nursing, Ann Arbor, MI, USA
c School of Nursing, Queensland University of Technology, Brisbane, Australia
d School of Nursing, Universidade Federal de São Paulo, Brazil
e Center for Patient Safety, Research & Practice, Brigham and Women’s Hospital, MA, USA
f Center for Nursing Excellence, Brigham and Women’s Hospital, MA, USA
g Harvard Medical School, Boston, MA, USA

Article info

Article history:
Received 1 July 2013
Accepted 24 October 2013

Keywords:
Implementation science
Complexity
Implementation of health information technology
Consolidated framework for implementation research

Abstract

Objectives: The intent of this paper is the examination of health IT implementation processes – the barriers to and facilitators of successful implementation, identification of a beginning set of implementation best practices, the identification of gaps in the health IT implementation body of knowledge, and recommendations for future study and application.

Methods: A literature review resulted in the identification of six health IT related implementation best practices, which were subsequently debated and clarified by participants attending the NI2012 Research Post Conference held in Montreal in the summer of 2012. Using the consolidated framework for implementation research (CFIR) to guide their application, the six best practices were applied to two distinct health IT implementation studies to assess their applicability.

Results: Assessing the implementation processes from two markedly diverse settings illustrated both the challenges and potentials of using standardized implementation processes. In support of what was discovered in the review of the literature, “one size fits all” in health IT implementation is a fallacy, particularly when global diversity is added into the mix. At the same time, several frameworks show promise for use as “scaffolding” to begin to assess best practices, their distinct dimensions, and their applicability for use.

Conclusions: Health IT innovations, regardless of the implementation setting, require a close assessment of many dimensions. While there is no “one size fits all”, there are commonalities and best practices that can be blended, adapted, and utilized to improve the process of implementation. This paper examines health IT implementation processes and identifies a beginning set of implementation best practices, which could begin to address gaps in the health IT implementation body of knowledge.

© 2013 Elsevier Ireland Ltd. All rights reserved.

∗ Corresponding author at: Division of Nursing Business & Health Systems, University of Michigan School of Nursing, Ann Arbor, MI, USA. E-mail address: [email protected] (P.A. Abbott).

1386-5056/$ – see front matter © 2013 Elsevier Ireland Ltd. All rights reserved. http://dx.doi.org/10.1016/j.ijmedinf.2013.10.009

1. Introduction and review of the literature

Deploying new technology and practice innovations in complex healthcare environments is challenging, particularly when the innovation is disruptive to established structures and workflow. Electronic health record systems (EHRS) and/or other types of health information technology (IT) are considered disruptive technologies, and their integration into practice has been slow and problematic. Even in light of numerous reports of the benefits of EHRS, when evidence meets the realities of practice [1], successful deployment and adoption can be threatened. A plethora of studies point to systems that work well in controlled situations or in the lab, but fail miserably upon implementation in a naturalistic or alternative environment [2–5].

The seeming disconnect between the evidence and the reality post-implementation raises the question of where the problem lies. Is the actual technology so faulty that its deployment and use result in harm – or is it the manner of implementation that gives rise to untoward effects? In reality, it is probably a combination of the two, which opens numerous avenues for focused inquiry.

A growing number of researchers are convinced that achieving the levels of success anticipated from EHRS is dependent not only upon the features and functions of the system itself – but also upon the manner in which the system is implemented [6,23]. The challenge is that implementing systems into complex and chaotic healthcare environments is difficult and disruptive – and the evidence on how to improve the process is scant. We suggest that complexity theory and the methods and models from implementation science can (and should) be used to reframe the thinking about health IT implementation.

Numerous studies are found in the healthcare literature regarding the application of implementation science techniques in general; however, there is a paucity of literature related to the use of these approaches specifically in health IT implementations. This has resulted in a poor understanding of the impact that the process of implementation may have on the outcomes of a health IT intervention. Consequently, many studies regarding EHRs result in reports of failure to achieve significance or effect meaningful change or, more worrisome, show declines in quality, safety, and efficiency after the implementation of an EHRS. Failure to pay attention to the threats related to the process of implementation results in findings that do not tell the whole story. The infamous article by Han et al. [2] in 2005 is a case in point. While there was an increase in infant mortality after an EHR implementation and there were distinct issues with the technology, much of the causation for the negative outcomes fell squarely on an extremely poor implementation process [19].

Eight years after the Han et al. article, however, there is still scant attention being paid to this critical aspect of implementation science. The inability to capture, represent, and reuse lessons learned in the implementation process dooms us to repeat them. The issue is further compounded when taken into the international context. In addition to understanding the complex processes present in an organization, the impact of cultural differences can markedly affect outcomes. Cross-cultural study, workflow inquiry, and adaptation to situation are required. The stakes are high; the use of EHRs will continue to expand as global resources continue to contract. The impetus to understand, deploy and capitalize upon effective products and processes is marked – particularly as related to health IT.

The focus of this paper is the examination of health IT implementation processes – the barriers to and facilitators of successful implementation, identification of a beginning set of implementation best practices, the identification of gaps in the health IT implementation body of knowledge, and recommendations for future study and application.

1.1. Implementation science, complexity science, and intersections with health IT

Health IT implementation is strongly influenced by the context in which it occurs. Clinical environments are complex, unpredictable, and replete with convoluted and highly interdependent relationships. Variation is the norm, and nonlinear and novel responses are commonplace. Hospitals and clinics therefore fall into the category of a complex and adaptive system (CAS).

As posited by Leykum and colleagues [7], implementations of any sort in a CAS require creative and critical thinking, acceptance that each system is unique, complex, and continually changing, and understanding that methods that work in one organization or location may fail in another. In addition, changes over time and the influence of the intervention itself on the environment will require continual adaptation of the methods and models used to study the impact of the intervention [23]. In short, applying traditional approaches to the evaluation of HIT implementation is insufficient for gaining the level of understanding required in such complex environments [7]. As advocated by many [7–10,23], a deeper appreciation for the tenets of complexity science and application of implementation science principles is advised to truly understand the processes and maximize the efficiency of HIT implementations in a CAS.

Complexity science approaches are focused upon the identification of interacting elements, an understanding of their unpredictability, interdependencies and interactions, and the potential impact that these elements may have on an outcome of interest [11]. Implementation science is the “study of methods that promote the integration of research findings and evidence into health care policy and practice. It seeks to understand the behavior of healthcare professionals and other stakeholders as a key variable in the sustainable uptake, adoption, and implementation of evidence-based interventions” [12]. Combining complexity and implementation science principles and applying them to the study of the implementation of health IT would therefore require the knowledge that the unique and continually shifting characteristics of the healthcare environment will impact the methods and models used in a study. Moreover, it would require the understanding that adaptation of a set approach, based on context and the evolution of the environment over time, is the norm. We believe that applying complexity and implementation sciences to health IT implementation will facilitate a deeper understanding of process barriers and facilitators, enabling the creation of a knowledge base of transferrable best practices for successful health IT implementation.

An additional challenge to gaining the needed understanding and converting knowledge into replicable implementation methods, however, lies in the comprehensive yet non-redundant identification of the constructs that represent characteristics of interest, and in the use of a scaffold that facilitates their application. Numerous models, frameworks, taxonomies, and methods exist [13], but the standardization of terms and the description of approaches are poor. Trying to determine the meaning of ill-defined constructs, discern their appropriateness for a given situation at varying levels (patient, group, organizational, and policy [14]), and then apply them in a systematic way is difficult. The use of a framework to guide implementation studies is another critical consideration as a knowledge base for studies of health IT implementation is conceptualized.

A variety of frameworks for implementation research exist, such as the Dynamic Adaptation Process (DAP) [24], the Exploration, Preparation, Implementation, and Sustainment Model (EPIS) [25], variants of Social-Technical Frameworks [26–29], the Veterans Affairs QUERI [21], the Reach, Effectiveness, Adoption, Implementation, and Maintenance Framework (RE-AIM) [22], the Institute for Healthcare Improvement’s Framework for Spread [30], Participatory Design [32], and Normalization Process Theory (NPT) based frameworks [20]. The consolidated framework for implementation research (CFIR), one of several additional frameworks evident in the literature, is one that seems to be practical and well suited for application in health IT implementation studies [10]. A recent and intriguing study that utilized the CFIR specifically discusses its use in bridging informatics and implementation science [10]. The recency and EHRS focus of the CFIR-based work by Richardson et al. influenced our decision to devote additional attention to it in this article. A brief overview of the CFIR follows, accompanied by a suggested amendment to its terminology for application in implementation of health IT.

The CFIR was derived from a meta-analysis of existing implementation theories, and was designed not only to clarify, but also to create a framework for “creating an implementation knowledge base across multiple studies and settings” [14]. The CFIR is composed of five domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation [15]. Each of the five domains contains defining constructs and associated methods for measuring the constructs. Readers are referred to the work of Damschroder et al. [14] for detailed descriptions of the original CFIR and its associated constructs.
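To make the idea of a cross-study knowledge base more concrete, the minimal sketch below (Python, not from the paper) encodes the five CFIR domains with a few representative constructs and tags a single study finding against them. The construct lists are an illustrative, paraphrased subset of Damschroder et al. [14], and the study identifier and note are hypothetical.

```python
# A minimal sketch (assumption, not the authors' method): encode the five CFIR
# domains with a few representative constructs so that findings from multiple
# health IT implementation studies can be coded consistently and pooled.
from dataclasses import dataclass

CFIR_DOMAINS = {
    "intervention_characteristics": ["relative advantage", "adaptability", "complexity", "cost"],
    "outer_setting": ["patient needs and resources", "external policy and incentives"],
    "inner_setting": ["culture", "implementation climate", "readiness for implementation"],
    "characteristics_of_individuals": ["knowledge and beliefs", "self-efficacy"],
    "process": ["planning", "engaging", "executing", "reflecting and evaluating"],
}

@dataclass
class Finding:
    """A single coded observation from a health IT implementation study."""
    study_id: str
    domain: str
    construct: str
    note: str

def code_finding(study_id: str, domain: str, construct: str, note: str) -> Finding:
    """Validate that a finding is tagged with a known CFIR domain/construct pair."""
    if construct not in CFIR_DOMAINS.get(domain, []):
        raise ValueError(f"{construct!r} is not listed under domain {domain!r}")
    return Finding(study_id, domain, construct, note)

# Hypothetical example: tag one observation from an EHR roll-out.
f = code_finding("case_study_2", "inner_setting", "implementation climate",
                 "Immediate cut-over strained buy-in from traditional staff.")
print(f.domain, "/", f.construct, "->", f.note)
```

A shared, validated coding scheme of this kind is one way findings from markedly different settings could be compared, which is the gap the CFIR is intended to fill.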

To address the need for further refinement of the original CFIR, Chaudoir, Dugan and Barr [16] in 2013 added an additional defining characteristic to the fourth domain of “characteristics of individual involved” [15]. The authors cited a concern that the original CFIR (and its association with the individual only within the context of an organizational structure) did not allow for the consideration of patient-level variables critical to the success of patient-centric interventions [16]. Chaudoir et al. also identify a beneficial set of 62 available measures that “can be utilized to assess constructs representing structural-, organizational-, provider-, patient-, and innovation-level factors – factors that are each hypothesized to affect implementation outcomes.” These 62 measures may be of value to individuals interested in implementation science applied to HIT. This model is represented in Fig. 1.

Fig. 1 – Chaudoir et al. multilevel framework for implementation research [16].

The enhanced CFIR suggested by Chaudoir et al. [16], with the addition of the patient-level variable consideration, is valuable and particularly apropos for use with health IT implementation studies as patient-facing health IT interventions accelerate. However, in applying the adapted CFIR to the evaluation of health IT implementation, we believe that a slight change in the Chaudoir et al. terminology from the term patient-level to user-level is warranted. While the term user-level that we suggest may appear to return to Damschroder et al.’s [14] initial CFIR terminology of “individual characteristics”, the original and important differentiation raised by Chaudoir et al. remains. The term “individual characteristics” was originally used to describe the interrelationships and impact of individual characteristics at the organizational level, which is an unnecessary constraint.

As an overarching term, “users” would more holistically represent a myriad of individuals, outside of the organizational dimension, whose characteristics can exert influence on the implementation and adoption of health IT, such as families, communities and social norms. For example, the use of telehealth applications implemented in the home may be influenced by family dynamics, cultural beliefs, political leanings, or perception – regardless of a patient’s opinion. The sphere of influence is larger than at the patient level. This point is further supported by a recent study of habituation and implementation research [17], which makes the point that humans are humans – regardless of whether they are clinicians, families, communities, or patients. Internal beliefs, habituation, and motivations, external to the organizational context, will influence intentions, impacting implementation at the level of the user (who may or may not be a patient), and these influences carry forth into behaviors regardless of setting. It would seem, therefore, that expanding the terminology suggested by Chaudoir et al. [16] from “patient” to “user” is prudent for use of the CFIR in health IT implementation studies. Because this adaptation of the terminology is suggestive in nature and has yet to be validated, we advise the use of the Chaudoir conceptual framework as published and represented in Fig. 1. Finally, while the focus has been upon the CFIR in this discussion, it should not imply that other models and frameworks are not applicable. As implementation science and its application to EHRS interventions continue to develop, additional uses, modifications, and new models are expected and encouraged.

1.2. Summing up the literature

The fundamentals that have been provided above are intended to encourage a reframing of the manner in which health IT implementation studies are conceptualized and undertaken, and the authors are cognizant of the fact that this area of inquiry requires much more work. Space constraints and paper intent restrict the in-depth review and debate necessary. It is hoped, however, that informaticians and others interested in health IT implementation and adoption will begin to debate and consider the use of complexity theory, implementation science, and implementation frameworks in efforts of this nature. In this paper, we begin to consider such work, and provide the results and discussion of our efforts to date below.

2. Methods and approach: Best practices in HIT implementation – beginning steps

Since implementation science in general has been considered by many to be in its infancy, the lack of literature found that discusses implementation science within the context of health IT is not surprising, yet it presents a challenge for the identification and application of best practices in HIT implementation. As implementation science continues to evolve and its application in health IT interventions increases, there will be a concomitant growth of the knowledge base of proven models, methods and terminologies. In the interim, however, several aspects of complexity theory and implementation science can be borrowed and used as a starting point for considering evidence-based approaches to health IT implementation.

Table 1 includes six implementation best practices reported in the literature that are relevant to health IT implementation and were agreed upon as a starting point during a recent working group meeting held as part of the NI2012 Post-Conference on Implementation Science in Informatics Research. Explanations of the practices and their relationships to the multi-level CFIR framework are provided in Table 1 as well.

Two health IT implementation case studies are first described and are then evaluated relative to the health IT implementation best practices identified in Table 1. The third column in Table 1 reflects the multi-level constructs of the modified CFIR framework suggested by Chaudoir et al. that are causal or predictive in regards to implementation outcomes. The case studies were selected because both focus on implementation of a complex health IT intervention in health care settings. These case studies are used to highlight the multiple challenges faced in health IT implementation and how these challenges can be addressed using a beginning set of literature-supported implementation best practices.

3. Results

In the sections below, two HIT implementation case studies are first described and are then evaluated relative to the best practices identified in Table 1 as a demonstration. In the discussion section, barriers and facilitators to successful HIT implementation are described, and recommendations are made for overcoming barriers, strengthening facilitators and increasing the probability of implementation success. The case studies highlight the multiple challenges faced in HIT implementation in very diverse environments, and how these challenges can be addressed using a beginning set of literature-supported and transferrable implementation best practices.

3.1. Case Study 1: the Fall TIPS intervention

3.1.1. Background

Fall TIPS (Tailoring Interventions for Patient Safety) is an interactive tailored patient fall risk assessment tool designed to help nurses conduct a fall risk assessment using a valid and reliable instrument, the Morse Fall Scale [35]. Fall TIPS is also designed to assist in the development and communication of a personalized plan based on patient-specific determinants of risk to all members of the healthcare team, including patients and family [32]. Using the web-based Fall TIPS application, nurses document a patient’s fall risk factors relative to the following areas of risk: (1) history of falls, (2) presence of a co-morbid diagnosis, (3) presence of an intravenous catheter, (4) use of an ambulatory aid, (5) presence of a gait disturbance, and (6) confusion or unwillingness to call for help. Fall TIPS displays the tailored or personalized plan that is most likely to prevent a fall [33] based on the patient’s fall risk profile. Fall TIPS effectiveness in reducing physiological falls was demonstrated in a randomized control trial in four acute care hospitals with over 10,000 patients [33]. Use of Fall TIPS resulted in a 22% reduction in patient falls and was particularly effective with patients over age 64 [33].
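To make the risk-to-plan logic concrete, the minimal sketch below (Python) illustrates the kind of mapping described above: documented risk factors drive a tailored set of prevention interventions. The factor keys and intervention wording are illustrative assumptions, not the actual Fall TIPS decision rules.

```python
# Illustrative sketch only: map documented fall risk factors to tailored
# interventions, in the spirit of the Fall TIPS workflow described above.
# Factor keys and intervention texts are assumptions, not the real rules.
RISK_TO_INTERVENTION = {
    "history_of_falls": "Flag chart and apply fall-risk wristband",
    "comorbid_diagnosis": "Review medications that contribute to fall risk",
    "iv_catheter": "Assist with toileting; secure IV tubing before ambulation",
    "ambulatory_aid": "Keep walker or cane within reach; assist with transfers",
    "gait_disturbance": "Supervised ambulation; physical therapy referral",
    "confusion_or_wont_call": "Frequent rounding; bed alarm; move closer to station",
}

def tailored_plan(documented_factors: set[str]) -> list[str]:
    """Return the personalized prevention plan for a patient's documented risks."""
    return [RISK_TO_INTERVENTION[f] for f in documented_factors if f in RISK_TO_INTERVENTION]

if __name__ == "__main__":
    for step in tailored_plan({"history_of_falls", "gait_disturbance"}):
        print("-", step)
```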

3.1.2. Best Practice 1 – identify multiple implementation methods and models

The Fall TIPS project involved both development and implementation of a health IT application. This required the use of multiple methods to engage stakeholders (professional and paraprofessional providers, patients and family) in the development and implementation processes. Two frameworks guided the study: participatory design [31] and the Institute for Health Care Improvement’s Framework for Spread (FFS) [30]. Together, these frameworks helped to overcome the challenges related to involving end users in iterative development of the intervention, engaging stakeholders in identifying ways to overcome the inertia of previous practice, and integrating a new innovation into existing workflows [32].

The FFS is based on Rogers’ Diffusion of Innovations theory [34] and supports the sustained spread of messages to stakeholders needed to support widespread adoption and use of a new innovation. In our earlier work, we found that the FFS was easily integrated into existing quality management programs and was straightforward to replicate across sites [32]. Also, the FFS provided a structure that facilitated Fall TIPS adoption on units and promoted sustained communication, assuring that all stakeholders were aware and involved in adoption, spread, and proper execution.

Table 1 – Health IT implementation best practices and linkage to multilevel framework.

1. Identify multiple implementation methods and models.
Explanation: Methods and models are not perfect, and there is no “one size fits all” approach to implementation. A standardized application of a single implementation model is not valid because structural, organizational, provider, patient and innovation-level factors predict implementation outcomes [16]. Complex studies in complex settings often require the use of multiple methods and models to support implementation. Methods may require adaptation mid-intervention.
Relative aspects of CFIR framework (Fig. 1): Structural, Organizational, Provider, Patient, Innovation (all levels can have predictive or causal impact on implementation outcomes).

2. Collect data about variation.
Explanation: Variation across settings can deeply affect results – therefore it is important not only to be aware of this, but to collect data about variation. As noted above, multi-level factors predict implementation outcomes. Collecting data on these factors can facilitate understanding variation. In addition, Leykum et al. [6] assert that understanding the variation will allow it to be “exploited in a way that will lead to maximal results” (p. 62).
Relative aspects of CFIR framework (Fig. 1): Structural, Organizational, Provider, Patient, Innovation (all levels can have predictive or causal impact on implementation outcomes).

3. Identifying local champions.
Explanation: Local champions are critical for engendering adoption and use in the setting. Leykum et al. [6] and others advocate for participatory action research or cooperative inquiry.
Relative aspects of CFIR framework (Fig. 1): Organizational, Provider, Patient (levels of predictive or causal impact on implementation outcomes).

4. Understand how the multiple levels of complex interventions intersect and how they relate to the intervention.
Explanation: With the knowledge that complex interventions have multiple levels and are multifaceted, implementation studies require awareness of how the levels integrate/intersect and how those facets relate to one another to produce the full intervention [16].
Relative aspects of CFIR framework (Fig. 1): Structural, Organizational, Provider, Patient, Innovation (all levels can have predictive or causal impact on implementation outcomes).

5. Relate fidelity of intervention to content and process.
Explanation: In health IT implementation, the focus is upon the process of the implementation, NOT the outcome of the intervention – the variation inherent in CAS will require considerations of fidelity. It is very difficult to build a measure that yields valid scores for multiple implementations or contexts; therefore a balance must be struck between efficient measurement and effective measurement [17,39–41].
Relative aspects of CFIR framework (Fig. 1): Fidelity (focal point is upon implementation outcome, not a predictive or causal construct).

6. Address penetration and sustainability as part of the implementation process.
Explanation: The integration of the innovation within busy health care workflows (penetration) and the process for operationalizing and maintaining an innovation within a service setting (sustainability) should be part of the implementation process of any new health IT innovation [16].
Relative aspects of CFIR framework (Fig. 1): Implementation Cost, Penetration, Sustainability (focal point is upon implementation outcome, not a predictive or causal construct).

3.1.3. Best Practice 2 – collect data about variation

Data were collected on provider self-efficacy related to fall prevention [33] and on patient characteristics (collected at the unit level) that could potentially affect the outcome, such as age, insurance status, and length of stay. These data provided information about the effectiveness of the Fall TIPS intervention on younger versus older patients [33]. No site-specific data related to staffing, workload, or patient safety culture were collected at each site, and patient characteristics were measured at the unit level.

3.1.4. Best Practice 3 – identifying local champions

Local champions were identified and involved in development, implementation, adoption, and spread of the Fall TIPS intervention [32]. Unit-based staff nurse champions assisted with pilot testing and with communicating and demonstrating the advantages of using Fall TIPS in the context of typical patient care workflows. Therefore, knowledge of better ideas, practices and associated benefits of the intervention was communicated by peers rather than by research staff. Moreover, “just-in-time” in-service education provided by unit-based champions demonstrated the advantages of Fall TIPS by peers who understand patient care and workflow challenges.

3.1.5. Best Practice 4 – understand how the multiple levels of complex interventions intersect and how they relate to the intervention

The Fall TIPS intervention is complex and includes components to facilitate the three steps of the fall prevention process: (1) fall risk assessment, (2) creating a tailored fall prevention plan, and (3) executing the plan. Each component of the intervention is important, but unless all three components are consistently completed, the intervention will not prevent a patient from falling. Focus group interviews and observational studies were completed to inform the process for integrating the Fall TIPS intervention into existing workflows. The workflows were then pilot tested and iteratively refined with end-users before the intervention was implemented. While data were collected on adherence with the Fall TIPS intervention at the unit level (the denominator was the number of patients on the unit and the numerator was the number of patients who had the Fall TIPS toolkit components in place), no data were collected at the individual patient level.

3.1.6. Best Practice 5 – relate fidelity of intervention to content and process

Fidelity of the intervention (e.g., adherence with the protocol) and the use of Fall TIPS components were tracked via computer log files and unannounced monthly visits.

Data on whether each patient’s fall risk assessment was completed on admission and the presence of tailored Fall TIPS information at the bedside were collected to track adherence with the protocol. A Fall TIPS Message of the Week was sent to end users to let them know the adherence rate for their unit and to convey useful tips that were communicated by end users to the study team. Adherence with the Fall TIPS intervention over the 6-month intervention period was >90%. Qualitative methods (focus group and individual interviews) were used to evaluate satisfaction and recommendations for improvement at each site. The qualitative data were used to support enhancements to the Fall TIPS software.

3.1.7. Best Practice 6 – address penetration and sustainability as part of the implementation process

Based on the evidence for Fall TIPS efficacy, the four hospitals that participated in the study requested to use Fall TIPS in routine patient care. While use of Fall TIPS was supported by the nursing and medical leadership, the complexity of different electronic record systems at each hospital created a barrier to rapid deployment. The cost of operationalizing Fall TIPS in four different hospitals with four different electronic systems that were being phased out was not seen as a viable option. A decision was made to integrate the Fall TIPS application into the common vendor system that is scheduled to be implemented across the enterprise over the next four years. Since the study results were obtained under the controlled conditions of an RCT and over a period of six months, long-term sustainability in routine care is unknown.

3.2. Case Study 2

3.2.1. Background

Adoption of electronic health records (EHR) in the Ngaanyatjarra lands located in remote Western Australia highlights the complexity of distance and a health system controlled by an Indigenous population. The Aboriginal Health Service (AHS) in the Ngaanyatjarra lands is controlled by tribal elders and receives state and federal government funding from various agencies. Approximately 1500 patients receive care from the AHS, but this varies as many of the indigenous populations that it serves are mobile and frequently cross land boundaries. AHS employs 65 staff members and has 7 clinics across the lands. The geographic location is in the western desert area of outback Western Australia, approximately 1000 km from Alice Springs (Central Australia), which is the main administration center. It is nearly 1500 km from Perth (the Western Australian capital city) and is one of the most remote locations in Australia [36].

The Aboriginal Health Service delivers primary and preventive care to communities which are dispersed across these remote areas. There are no hospitals or health providers based in the central outback, and the majority of the communities are at least 1000 km from any regional center. The primary method of record keeping was paper based, with records being kept in the home communities. With a mobile indigenous population, keeping up-to-date and readily accessible records was nearly impossible. The move to electronic health records was related to increasing government reporting requirements and regulations to secure future funding of the health service. In addition, access to digital radiology and pathology results was on the increase, making accessibility a reality [36].

The implementation of EHRs in the remote areas of Western Australia was undertaken by the AHS Chief Information Officer (CIO), Chief Executive Officer (CEO), and Chief Information Technology Officer (CIT), with clinician input where appropriate. The implementation was immediate rather than phased in over time. The immediate cut-over from paper to electronic record keeping resulted in some issues with buy-in from more traditional staff members. The CIO and CIT worked tirelessly to encourage and support clinical staff, with the intent of achieving a practical, but not necessarily perfect, workable system. Their understanding of the remote environment and diverse employee types ensured that the implementation process was tailored to the user base. The case study reported that the implementation process experienced in the remote areas appeared to surpass the experiences reported in metropolitan areas. When one considers that the remote areas had much more complex environments, complicated by distance and remoteness, extreme weather, transient populations, a lack of regular and vested employees in the health services, and less than ideal technology, it suggests that the process of implementation may have been a defining success factor [36].

3.2.2. Best Practice 1 – identify multiple implementation methods and models

This case study of the implementation of an electronic health record (EHR) in this remote area of Western Australia began with an identification of the drivers, facilitators and barriers to adoption of an EHR. Waldman’s Framework of Organisational Innovation Adoption [39] was used to evaluate the outcome of the implementation and to guide the implementation process. Barriers, facilitators, and drivers were identified as the impact of the multi-tiered health system in Australia; “widespread uncertainty surrounding implementation of EHR’s from political, policy, administrative, clinical and health consumer perspectives; inconsistent approaches from government, public and private health providers primary and tertiary healthcare systems” [39]; and the pressures caused by the geography, the culture, and poverty. Additionally, overlap between these and other systemic issues was identified, including the diversity of stakeholders, the cost of implementation, and privacy laws which prevented some aspects of information exchange and complicated security [39].

3.2.3. Best Practice 2 – collect data about variation

Post-implementation interviews were conducted with 18 staff members from the Ngaanyatjarra lands as well as staff from the main administrative center at Alice Springs. These interviews were transcribed and analyzed for major themes. Additionally, other variables from the literature that have previously been identified with adoption of innovation were captured and assessed in an attempt to more holistically consider influential factors. The researchers were reasonably thorough in collecting a variety of data and observations from the implementation process.

3.2.4. Best Practice 3 – identifying local champions

The CEO, the CIO, and the Chief IT Officer led the implementation process, with each having a different but complementary focus. The CEO had the vision of integration of services for mobile and vulnerable populations, the CIO was the leader in relations with the staff, and the Chief IT Officer was the operational guru who kept equipment and networks up and running in very unforgiving environments. There was no mention made of local champions at the individual institution or ward levels; all local champions were from the “C-Suite” (executive level).

3.2.5. Best Practice 4 – understand how the multiple levels of complex interventions intersect and how they relate to the intervention

The implementers were aware, from inception, of the complexities that would be faced in an implementation of this nature. The multiple levels and the interactions between them were considered – for example, the CEO aimed for a practical system that did not require perfection, there was a deep understanding of the local culture and the continual migration, the workforce and the environment were accommodated (lower technological competencies and connectivity barriers), and the external forces from the government were acknowledged. Each of these factors (and others) was understood, from inception, to have bearing on the outcome of the implementation.

3.2.6. Best Practice 5 – relate fidelity of intervention to content and process

This case study did not directly address fidelity measures; however, there were some examples that could be said to reflect some of these principles, for example:

• purposeful sampling and the selection of multidisciplinary participants who had direct knowledge and use of the system that was implemented
• inclusion of implementation best practices
• clear outcomes required of the implementation
• understanding of clinical requirements.

3.2.7. Best Practice 6 – address penetration and sustainability as part of the implementation process

The implementation process described in the case study began in 2004 and was extended across a large geographic area of extreme conditions. Sustainability was not directly addressed in the case study, but the references to the necessity of the effort to ensure continual governmental funding of health services, and a deep sense of respect and compassion for the native peoples, were major drivers. Additionally, the commitment to a pragmatic and patient-centered approach was repeatedly broadcast by the local champions, and evidenced by actions undertaken during the process.

4. Discussion

The two case studies presented originate from two markedly different situations. This illustrates the challenges highlighted in the literature regarding the use of standardized implementation processes. “One size fits all” in health IT implementation is a fallacy, particularly when global diversity is added into the mix. However, several frameworks have been presented that can be used as scaffolding to begin to assess best practices, their distinct dimensions, and their applicability for use. In this fashion, a knowledge base of health IT implementation techniques may emerge. In this regard, we use the six previously identified best practices (Table 1) as a schema for discussing the processes of these two markedly diverse implementations, and we use the CFIR [16] to assure consideration of the multi-level constructs from each situation.

4.1. Identify multiple implementation methods and models

As discussed earlier in this paper, clinical environments are complex and variation is the norm, requiring creative and critical thinking and acceptance that a single model or method may not be sufficient for implementation studies. Adaptation is crucial in development, implementation and evaluation of implementation processes in CAS such as these [7]. The two case studies presented used different implementation models. However, in both cases, the implementation models supported a systematic approach to implementation that addressed barriers and determined the data needed for evaluation.

Two models guided the first case: participatory design [31] and the Institute for Health Care Improvement’s Framework for Spread (FFS) [30]. The participatory design approach used in Case Study 1 engaged end users (patients and providers) in iterative development of the Fall TIPS toolkit innovation and in pilot testing of the toolkit components to ensure that they fit within the workflow. In addition, the three phases of the Framework for Spread – (1) planning and set-up, (2) spread within the target population, and (3) continuous monitoring and feedback of the spread process – provided a formal mechanism to directly address causal factors from the multiple organizational levels, such as aligning organizational communication and support for the innovation from the organizational leadership to the bedside. Identifying metrics to measure adoption as part of this process ensured that evaluation occurred as an ongoing process and that adaptations were made as needed to support adoption.

In Case Study 2, the use of Walden’s framework [39] enabled a focused examination of the systemic barriers to adoption of an EHR. This framework also suggests the use of principles that can be applied to complex EHR implementations to improve the potential for successful adoption [37]. The authors of the case study used the framework to identify the specific features that were present during the implementation and to elicit from the subjects how certain issues were overcome during the implementation process. Using the framework was perceived as beneficial, since it allowed various dimensions to be assessed and measured. While the dimensionality of the Walden framework allowed a relatively comprehensive assessment, the researchers also incorporated knowledge and findings from prior research regarding aboriginal health and behavior patterns, and existing secondary data from government sources. The incorporation of multiple methods and data from a variety of sources supports the best practice of identifying multiple implementation methods and models.

4.2. Collect data about variation

Because variation in how a health IT innovation is implemented and differences in the causal factors that exist at each site can affect results, data on contextual variables are needed. In the first case study, contextual data were collected related to use of the system at the provider level, but data on certain demographics at the patient level were not collected. This was a limitation because eventually Fall TIPS was found to be most effective with patients over age 64. Additionally, because adherence data were collected at the unit level and not at the individual level, it is not known whether adherence was simply better with older patients or whether the toolkit intervention is simply more effective with this demographic. Finally, no site-specific data related to staffing, workload, or patient safety culture were collected at each site. These data could have potentially provided insight into the relative effectiveness of the intervention at each hospital [23].

In the second case study, the collection of a variety of data resulted in the gain of valuable information regarding the implementation of an EHRS in remote, low-resource communities. The structure of the EHR and then its implementation and use created some issues around cultural and lifestyle aspects for the indigenous communities. Aboriginals have an intuitive, personal and flexible concept of time, and are often unconcerned with schedules and order. Family structures and the use of surnames in the Aboriginal culture are not compatible with standard methods of patient identification in an EHRS, and deep linkages between cultural beliefs and symptoms conflicted with structured and standardized HIT. Other identified issues related to the remoteness of the region, lack of technical expertise of staff, staff turnover, lack of ability (or need) to interact with other health information systems outside the lands, and issues related to technical support. The overall point here is that the researchers cast a wide net, and did a reasonable job of collecting data about variations that impacted implementation in this unique setting. Reiterating a point made at the beginning of this paper, these findings may not be generalizable across other distinct racial and ethnic groups (or even other Australian Aboriginal communities), but the concepts themselves may be transferrable. These lessons are an important step in developing an implementation process knowledge base.

4.3. Identifying local champions

Identifying and engaging local champions to assist with implementation is, and has always been, a best practice in HIT implementation. In Case Study 1 the local champions were unit based, reflecting the important involvement of those who were most impacted by the implementation. These local champions participated in an iterative design process and communicated the benefits and feasibility of using Fall TIPS in the context of busy patient care workflows to their peers.

The second case study was a top-down process, where the local champions were high-level decision makers (the so-called “C-Suite” executives). The CEO and CIO from the health service were the champions for this case study, as they drove the implementation and adoption of the system. They had the knowledge, skills and understanding of the complexities of the healthcare system, the remote environment, and employee and employer requirements, as well as the passion for innovation and better health outcomes for the local communities. From the case study it is evident that their passion, motivation and professionalism were a major reason for the successful implementation and adoption of the EHR system. The balance of finishing up with a workable system – as opposed to a perfect system – was also very valid and pragmatic. Finally, focusing on measures of success and maintaining the patient-centric mission engendered buy-in from the initially hesitant traditionalists. It is interesting to note, however, that there was no mention in the report of end-user involvement. This may, in and of itself, be an important piece of knowledge relevant to HIT implementation in low-resource areas. HIT deployment may be dictated by Ministries of Health or by funding agencies, reporting structures are often rigid and hierarchical, long-term sustainability and commitment can be uncertain, and political whims may dictate implementation approaches. It may be that a top-down implementation of EHRS in low-resource areas is a best practice in and of itself. Further study is warranted.

Ultimately, these two case studies were successful due in part to strong local champions. Does the identification of a best practice require dictating who those local champions are? These two case studies would seem to imply that the local champions can be top-down or bottom-up, or somewhere in the middle. Much depends on the situation, context, and culture, particularly in the case of country variation. The complexity and variation of the situation influence who the local champions are – again illustrating the necessity of flexibility and adaptation in EHRS implementation interventions.

4.4. Understand how the multiple levels of complex interventions intersect and how they relate to the intervention

As discussed earlier, health IT innovations are frequently comprised of multiple levels or components. This aspect heavily influenced Case Study 1 and presented a challenge for fully understanding how the different components make up a “dose” of the fall prevention intervention. One way to address this challenge was to identify a way to measure “dose” within the study. This was a major limitation in the Fall TIPS study, where adherence data were collected at the unit level, but no data related to dose (e.g., the extent to which each individual patient received the three components of the Fall TIPS intervention) were collected at the patient level. There is a high probability that the influence of different units and different institutions also played a role in the implementation process; however, the lack of data delineating the specific location precluded a deeper understanding of potential barriers and facilitators to implementation outcomes.
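To make the measurement distinction concrete, the brief sketch below contrasts the unit-level adherence rate the study did collect with the patient-level “dose” it did not. The field names, example records, and record structure are illustrative assumptions, not the study’s actual data model.

```python
# Illustrative sketch: unit-level adherence vs. patient-level "dose" for a
# three-component intervention such as Fall TIPS. Field names and the example
# records are assumptions for demonstration, not the study's actual data.
COMPONENTS = ("risk_assessment", "tailored_plan", "plan_executed")

# Hypothetical per-patient records for one unit: which components were in place.
unit_patients = [
    {"risk_assessment": True, "tailored_plan": True, "plan_executed": True},
    {"risk_assessment": True, "tailored_plan": True, "plan_executed": False},
    {"risk_assessment": True, "tailored_plan": False, "plan_executed": False},
]

def patient_dose(record: dict) -> int:
    """Patient-level 'dose': how many of the three components this patient received."""
    return sum(record[c] for c in COMPONENTS)

def unit_adherence(patients: list[dict]) -> float:
    """Unit-level adherence: patients with ALL components in place / patients on the unit."""
    complete = sum(1 for p in patients if patient_dose(p) == len(COMPONENTS))
    return complete / len(patients)

print(f"Unit adherence: {unit_adherence(unit_patients):.0%}")   # 33% for this example
print("Per-patient doses:", [patient_dose(p) for p in unit_patients])  # [3, 2, 1]
```

As the example shows, a unit-level rate collapses very different patient-level exposures into a single number, which is precisely why dose data at the patient level would have strengthened the evaluation.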

In Case Study 2, the systemic issues identified within Walden’s framework [38] revealed the complexity in implementing a large information technology project such as this. The additional challenges of geography, dealing with vulnerable and mobile populations, and the complexities of culture added to the difficulty. The results of the study, however, suggested that similar implementations in complex metropolitan areas (where interfacing disparate systems massively increases the complexity) were more difficult than those in the region served by the Aboriginal Health Service. While this may be an artifact of isolation, the introspection of the researchers is an important one. Regarding the region as a discrete system (i.e., totally isolated) enabled many of the complex issues such as multi-system or multi-vendor interoperability to be bypassed [36], which in turn eased certain aspects of the implementation. Eventually these issues will require attention as EHRS roll out across Australia, but the lessons learned in this particular implementation process generated valuable knowledge for implementations of EHRS in low-resource and remote communities. In addition, the findings related to working with patients from markedly non-Western cultures provide important knowledge for EHRS implementation personnel.

4.5. Relate fidelity of intervention to content and process

To encourage adoption and sustained use of implementation models in EHRS interventions, the focus must shift from rigid adherence to a “one size fits all” model to an adaptive approach that takes into account situational context while maintaining fidelity of the intervention. Strategies that consider how EHRS intervention processes can be adapted to meet unique situations, while still maintaining enough rigor to ensure that the findings are transferable, are desperately needed. This dimension was not a focal point for either case study; however, evidence of fidelity measurements is present in both.

In the first case study, standard operating procedures helped to maintain fidelity and adherence to the study protocols. Adaptation of the implementation process was controlled in a way that enabled flexibility in the core process without jeopardizing fidelity. Multiple methods, including computer log files and site visits, were used to measure fidelity to the intervention. In addition, interviews were conducted to provide context for the quantitative findings.

The second case study failed to formally address measures of fidelity, although, as noted earlier, the researchers did report the study as “successful” and did conduct a series of interviews, secondary data analysis, and a review of literature related to aboriginal health care. One would assume that they maintained fidelity, but without mention of it, this aspect is hard to judge. It would have strengthened the study had the researchers used an evaluation tool that included the underlying fidelity principles of purpose and theory, adherence, data collection methodology, quality of delivery, participant responsiveness and program differentiation [7,39] to assure readers of fidelity. While the point is made in the literature that when high fidelity is not achieved the impacts of an implementation cannot be identified, rectified or improved [39,41], adaptive mixed-model approaches are required to gain the level of understanding required in EHRS interventions.

4.6. Address penetration and sustainability as part of the implementation process

Penetration and sustainability were important themes in both of the case studies. For the first case study, the adherence level with the intervention of >90% during the clinical trial suggested good penetration. However, sustainability was not achieved due to the cost of implementing the intervention in electronic systems that were being phased out. This is not to imply that the lessons learned during the implementation were all for naught. However, the Fall TIPS effort was tied to a distinct system that was eventually replaced in the organization.

The second case exemplifies the importance of executive-level buy-in to the project, and illustrates how a project of this type, driven by governmental requirements coupled with personal commitment, can impact penetration and sustainability. The outcome was positive, the penetration was extensive – spreading out across the Ngaanyatjarra lands of Western Australia – and the system is still functioning, a testimony to its sustainability. Part of the success, however, is related to the isolation of the outback region and the relatively slow uptake of widespread and integrated EHRS across Australia. This has not pressured the Western Region to become interoperable with the remainder of the nation. Eventually the isolation that was such a benefit for implementing EHRS in Western Australia will begin to take its toll. The effort that it will take to link into other health information systems outside the region will be considerable, and the adaptations to a more Westernized model of care will impose new challenges for the Aboriginal Health Service.

5. Summation

The presentation, analysis and discussion illustrated that the nascent six implementation best practices and the CFIR were acceptable structures for assessing certain aspects of the case studies. There is much work left to be done, not only in gaining a deeper understanding of implementation processes, but also in developing a scaffold or method for beginning to analyze EHRS implementations as we have begun. Interesting findings emerged from both case studies, and using the scaffold (Table 1) assisted us in deconstructing, examining, organizing and representing two very different implementation processes.

i n t e r n a t i o n a l j o u r n a l o f m e d i c a l i n

Summary pointsWhat is already known?

• More than evidence is needed for the successful adoption, implementation, and sustainability of health IT.

• There is a pressing need for better understanding of what works, what does not, and in what context.

• Mixed models and mixed methods may work best for implementation interventions.

What does this study add?

• A beginning set of literature-supported and transferrable implementation best practices is put forward for use and evaluation.

• CFIR is suggested as a good framework for implementation research.

6. Conclusion

Adoption of electronic systems and other health IT innovations into complex healthcare environments has been slow and problematic. Recent work in implementation science suggests that both the technology and the implementation process must be addressed when implementing health IT innovations. This paper examined health IT implementation processes and identified a beginning set of implementation best practices which, if used consistently, could begin to address gaps in the health IT implementation body of knowledge.

Author contributions

Patricia Abbott is the lead author who organized submission, managed the project, wrote significant portions of the paper, participated in editing, participated in development of the paper framework/grid and identification of the six best practices, and submitted the paper. Jo Foster, Patricia Dykes and Heimar de Fatima Marin contributed in writing/summarizing one of the two case studies, editing, participating in identification of the six best practices, participating in development of the paper framework/grid, and conducting portions of the literature review.

Conflict of interest

There are no conflicts of interest noted from any of the authors on this paper.

References

[1] A. Kitson, G. Harvey, B. McCormack, Enabling the implementation of evidence based practice: a conceptual framework, Quality in Health Care 7 (1998) 149–158.

[2] Y. Han, J. Carcillo, S. Venkataraman, R. Clark, R. Watson, T. Nguyen, H. Bayir, R. Orr, Unexpected mortality after implementation of a commercially sold computerized physician order entry system, Pediatrics 116 (2005) 1506.


[3] T. Yackel, P. Embi, Unintended errors with EHR-based result management: a case series, Journal of the American Medical Informatics Association 17 (2010) 104–107.

[4] R. Koppel, J.P. Metlay, A. Cohen, B. Abaluck, A.R. Localio, S.E. Kimmel, et al., Role of computerized physician order entry systems in facilitating medication errors, Journal of the American Medical Association 293 (10) (2005) 1197–1203.

[5] A. Chapman, C. Lehmann, P. Donohue, S. Aucott, Implementation of computerized provider order entry in a neonatal intensive care unit: impact on admission workflow, International Journal of Medical Informatics 81 (5) (2012) 291–295.

[6] A. Kellerman, S. Jones, Analysis and commentary: what it will take to achieve the as-yet-unfulfilled promises of health information technology? Health Affairs 32 (January) (2013) 63–68.

[7] L. Leykum, J. Pugh, H. Lanham, J. Harmon, R. McDaniel, Implementation research design: integrating participatory action research into randomized clinical trials, Implementation Science 4 (2009) 69.

[8] K. Unertl, L. Novak, C. Gadd, N. Lorenzi, The science behind health information technology implementation: understanding failures and building on success, AMIA Proceedings (2012).

[9] F. Mair, C. May, C. O'Donnell, T. Finch, F. Sullivan, E. Murray, Factors that promote or inhibit the implementation of eHealth systems: an explanatory systematic review, Bulletin of the World Health Organization 90 (5) (2012), http://www.who.int/bulletin/volumes/90/5/11-099424/en/ (accessed 21.04.13).

[10] J.E. Richardson, E.L. Abramson, E.R. Pfoh, R. Kaushal, HITEC Investigators, Bridging informatics and implementation science: evaluating a framework to assess electronic health record implementations in community settings, AMIA Annual Symposium Proceedings 2012 (2012) 770–778.

[11] K. Keshavjee, C. Kuziemsky, K. Vassanji, A. Ghany, A complex adaptive systems perspective of health information technology implementation, in: K.L. Courtney, et al. (Eds.), Enabling Health and Healthcare Through ICT, IOS Press, Amsterdam, Netherlands, 2013.

[12] L.A. Lipsitz, Understanding health care as a complex system: the foundation for unintended consequences, Journal of the American Medical Association 308 (3) (2012) 243–244, http://dx.doi.org/10.1001/jama.2012.7551.

[13] National Information Center on Health Services Research and Health Care Technology (NICHSR), http://www.nlm.nih.gov/hsrinfo/implementation_science.htm

[14] L. Damschroder, D. Aron, R. Keith, S. Kirsh, J. Alexander, J. Lowery, Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science, Implementation Science 4 (2009) 50.

[15] E. Ferlie, S. Shortell, Improving the quality of health care in the United Kingdom and the United States: a framework for change, Milbank Quarterly 79 (2) (2001) 281–315.

[16] S. Chaudoir, A. Dugan, C. Barr, Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures, Implementation Science 8 (2013) 22.

[17] P. Nielsen, K. Roback, A. Boström, P. Ellström, Creatures of habit: accounting for the role of habit in implementation research on clinical behavior change, Implementation Science 7 (2012) 53.

[18] S.K. Schoenwald, A.F. Garland, J.E. Chapman, S.L. Frazier, A.J. Sheidow, M.A. Southam-Gerow, Toward the effective and efficient measurement of implementation fidelity, Administration and Policy in Mental Health 38 (1) (2011) 32–43.

[19] C. Phibbs, A. Milstein, S. Delbanco, D. Bates, Published letters to the editor: no proven link between CPOE and mortality, Pediatrics 116 (6) (2005) 1512–1513.

[20] C.R. May, F.S. Mair, T. Finch, A. Macfarlane, C. Dowrick, S. Treweek, et al., Development of a theory of implementation and integration: normalization process theory, Implementation Science 4 (2009) 29, http://dx.doi.org/10.1186/1748-5908-4-29, PMID: 19460163.

[21] L. McQueen, B.S. Mittman, J.G. Demakis, Overview of the Veterans Health Administration (VHA) quality enhancement research initiative (QUERI), Journal of the American Medical Informatics Association 11 (5) (2004) 339–343.

[22] R.E. Glasgow, L.M. Klesges, D.A. Dzewaltowski, P.A. Estabrooks, T.M. Vogt, Evaluating the impact of health promotion programs: using the RE-AIM framework to form summary measures for decision making involving complex issues, Health Education Research 21 (5) (2006) 688–694.

[23] S. Bakken, C. Ruland, Translating clinical informatics interventions into routine clinical care: how can the RE-AIM framework help? Journal of the American Medical Informatics Association 16 (6) (2009) 889–897.

[24] G. Aarons, A. Green, L. Palinkas, S. Self-Brown, D. Whitaker, J. Lutzker, J. Silovsky, B. Hecht, M. Chaffin, Dynamic adaptation process to implement an evidence-based child maltreatment intervention, Implementation Science 7 (2012) 32.

[25] G.A. Aarons, M. Hurlburt, S.M. Horwitz, Advancing a conceptual model of evidence-based practice implementation in child welfare, Administration and Policy in Mental Health 38 (2011) 4–23.

[26] L.L. Novak, R.J. Holden, S.H. Anders, J.Y. Hong, B.T. Karsh, Using a sociotechnical framework to understand adaptations in health IT implementation, International Journal of Medical Informatics 82 (12) (2013) e331–e344.

[27] M.I. Harrison, R. Koppel, S. Bar-Lev, Unintended consequences of information technologies in health care: an interactive sociotechnical analysis, Journal of the American Medical Informatics Association 14 (2007) 542–549.

[28] M. Berg, J. Aarts, J. van der Lei, ICT in health care: sociotechnical approaches, Methods of Information in Medicine 42 (4) (2003) 297–301.

[29] C.M. Ruland, R.M. Maffei, E. Børøsund, A. Krahn, T. Andersen, G.H. Grimsbø, Evaluation of different features of an eHealth application for personalized illness management support: cancer patients' use and appraisal of usefulness, International Journal of Medical Informatics 82 (7) (2013) 593–603.


[30] M.R. Massoud, G.A. Nielsen, K. Nolan, M.W. Schall, C. Sevin, A Framework for Spread: From Local Improvements to System-Wide Change, IHI Innovation Series White Paper, Institute for Healthcare Improvement, Cambridge, MA, 2006, www.IHI.org

[31] C. Ruland, J. Starren, T. Vatne, Participatory design with children in the development of a support system for patient-centered care in pediatric oncology, Journal of Biomedical Informatics 41 (4) (2008) 624–635.

[32] P.C. Dykes, D.L. Carroll, A.C. Hurley, A. Benoit, B. Middleton, Why do patients in acute care hospitals fall? Can falls be prevented? Journal of Nursing Administration 39 (6) (2009) 299–304.

[33] P.C. Dykes, D.L. Carroll, A. Hurley, S. Lipsitz, A. Benoit, F. Chang, S. Meltzer, R. Tsurikova, L. Zuyov, B. Middleton, Fall prevention in acute care hospitals: a randomized trial, Journal of the American Medical Association 304 (17) (2010) 1912–1918.

[34] E.M. Rogers, Lessons for guidelines from the diffusion of innovations, Joint Commission Journal on Quality Improvement 21 (7) (1995) 324–328.

[35] J. Morse, Preventing Patient Falls, Sage Publishing Co., CA, 1997.

[36] H. Cripps, C. Standing, The implementation of electronic health records: a case study of bush computing the Ngaanyatjarra Lands, International Journal of Medical Informatics 80 (12) (2011) 841–848.

[37] S. Standing, C. Standing, Mobile technology and healthcare: the adoption issues and systemic problems, International Journal of Electronic Healthcare 4 (3–4) (2008) 221–235.

[38] J. Waldman, Thinking systems need systems thinking, Systems Research and Behavioral Science 24 (3) (2007) 271–284, http://dx.doi.org/10.1002/sres.828.

[39] C. Carroll, M. Patterson, E. Wood, S. Booth, R. Balain, A conceptual framework for implementation fidelity, Implementation Science 2 (2007) 40, http://dx.doi.org/10.1186/1748-5908-2-40, http://www.implementationscience.com/content/2/1/40

[40] J. Century, M. Rudnick, C. Freeman, A framework for measuring fidelity of implementation: a foundation for shared language and accumulation of knowledge, American Journal of Evaluation 31 (2) (2010) 199–218, http://dx.doi.org/10.1177/1098214010366173.

[41] C. Mowbray, M. Holter, G. Teague, D. Bybee, Fidelity criteria: development, measurement and validation, American Journal of Evaluation 24 (3) (2003) 315–340, http://dx.doi.org/10.1177/109821400302400303.