www.elsevier.com/locate/jss

The Journal of Systems and Software 78 (2005) 204–222

A framework for assisting the design of effective software process improvement implementation strategies

Mahmood Niazi a,b,*, David Wilson b, Didar Zowghi b

a National ICT Australia, Empirical Software Engineering, Bay 15 Locomotive Workshop, ATP, Eveleigh NSW 1430, Australia
b Faculty of Information Technology, University of Technology Sydney, NSW 2007, Australia

Received 14 May 2004; received in revised form 6 September 2004; accepted 12 September 2004

Abstract

A number of advances have been made in the development of software process improvement (SPI) standards and models, e.g. the Capability Maturity Model (CMM), more recently CMMI, and ISO's SPICE. However, these advances have not been matched by equal advances in the adoption of these standards and models in software development, which has resulted in limited success for many SPI efforts. The current problem with SPI is not a lack of standards or models, but rather a lack of an effective strategy to successfully implement these standards or models.

In this paper we have focused on SPI implementation issues and designed three individual components in order to assist SPI practitioners in the design of effective SPI implementation initiatives. We have pulled together the individual components under one SPI implementation framework (SPI-IF) using a bottom-up approach already familiar to many practitioners and researchers. The framework is based on results drawn from the SPI literature and an empirical study we have carried out. In the design of SPI-IF, the concept of critical success factors (CSFs) was used and extended. Thirty-four CSF interviews were conducted with Australian practitioners. In addition, 50 research articles (published experience reports and case studies) were selected and analysed in order to identify factors that play positive or negative roles in SPI implementation. The SPI-IF provides a very practical structure with which to assess and implement SPI implementation initiatives.

In order to evaluate SPI-IF, a practical evaluation scheme was undertaken. The evaluation results show that SPI-IF has the potential to assist SPI practitioners in the design of effective SPI implementation initiatives. Thus, we recommend that organizations use SPI-IF in order to effectively design SPI implementation initiatives.

© 2004 Elsevier Inc. All rights reserved.

Keywords: Software process improvement implementation; Practitioners; CMMI; Critical success factors

0164-1212/$ - see front matter © 2004 Elsevier Inc. All rights reserved. doi:10.1016/j.jss.2004.09.001

* Corresponding author. Tel.: +61 2 8374 5492; fax: +61 2 8374 5520. E-mail addresses: [email protected], [email protected] (M. Niazi), [email protected] (D. Wilson), [email protected] (D. Zowghi).

1. Introduction

Despite the extensive development of standards and models for SPI (e.g. the Capability Maturity Model (CMM), more recently CMMI, and ISO's SPICE), these advances have not been matched by equal advances in the adoption of these standards and models in software development. There has been limited success for many SPI efforts, and the failure rate for SPI programmes is high, with recent estimates suggesting a rate of failure around 70% (SEI, 2002b). A defined SPI implementation process is essential to the success of any SPI initiative, as a chaotic implementation process is one of the most common causes of SPI implementation failure (Zahran, 1998). However, and curiously, a well-defined SPI implementation process is not included in most SPI implementation initiatives (Goldenson and Herbsleb, 1995). Most existing SPI research concentrates on "what" activities to implement rather than "how" to implement these activities. The importance of SPI implementation demands that it be recognised as a complex process in its own right. Attention to "how" to implement is crucial for the successful implementation of SPI initiatives, and organizations should determine their SPI implementation maturity through an organized set of activities.

The aim of this research is to report the results of an empirical study of the viewpoints and experiences of practitioners regarding SPI implementation and, based on the findings, to present a framework to assist practitioners in the design of effective SPI implementation initiatives. In order to design this SPI implementation framework we have extended the concept of critical success factors (CSFs) (Rockart, 1979). We have analysed the experiences, opinions and views of practitioners through the literature (i.e. case studies, technical reports and journal articles, as shown in Appendix A). Furthermore, we have carried out an empirical study to find factors that have positive or negative impacts on the implementation of an SPI program. Our results provide valuable advice to SPI practitioners in designing appropriate SPI implementation strategies.

Generally, the SPI approach has four stages: assessment, selection of a model/standard, design of implementation initiatives and implementation of initiatives (Paulish and Carleton, 1994; Zahran, 1998). All the stages are very important for the successful implementation of SPI programmes. However, performing this research has convinced us that, of these stages, the 'design of implementation initiatives' is a very critical stage. This is because no matter how good the SPI model or standard is, an ineffective SPI implementation initiative can significantly affect the success of SPI efforts (Niazi et al., 2004).

In this paper we have focused on SPI implementation issues and designed three individual components (the SPI implementation factor component, the SPI assessment component and the SPI implementation component) in order to assist SPI practitioners in the design of effective SPI implementation initiatives. Each component can play a vital role in the design of effective SPI implementation initiatives: the evaluation results showed that SPI-IF has the potential to assist SPI practitioners in this design.

The objective of the SPI factors component is to summarise the factors that play a positive or negative role in the implementation of SPI programmes. We suggest that SPI practitioners should focus on these CSFs and CBs in the design of effective SPI implementation initiatives. The second component, SPI assessment, guides organizations in assessing and improving their SPI implementation maturity. SPI implementation maturity indicates whether or not an organization is ready to commence an SPI initiative and, if not, identifies implementation issues that should be tackled first. The objective of the third component, SPI implementation, is to assist practitioners in effectively implementing SPI initiatives. We suggest using a phase-by-phase approach in order to reduce implementation failure risks, and we identify six phases that can be used to effectively implement SPI initiatives.

Various parts of our SPI-IF have been previously published (Niazi et al., 2003a; Niazi et al., 2003b; Niazi et al., 2004; Niazi et al., in press; Niazi et al., 2004d). The major contribution of this paper is to combine all the components and to present a complete picture of the SPI-IF. In this paper we also present the results of three case studies in which SPI-IF was implemented in real-world environments.

This paper is organised as follows. Section 2 presents the motivation for this work. Section 3 describes the research design. In Section 4 SPI-IF is discussed. Section 5 covers our evaluation process through three case studies. Section 6 provides the conclusion.

2. Motivation

A number of empirical studies have investigated factors that positively or negatively impact SPI, e.g. (El-Emam et al., 1999; Goldenson and Herbsleb, 1995; Rainer and Hall, 2002; Stelzer and Werner, 1999). The importance of these empirical studies is accepted, but due to the nature of CSFs it is possible that these CSFs may differ from manager to manager according to the individual's place in the organization's hierarchy, and they may also differ across geographical regions (Khandelwal and Ferguson, 1999; Rockart, 1979). Much of the existing literature provides anecdotal evidence of CSFs, and little empirical work appears to have been conducted in this area. Many of the studies mentioned above have adopted the questionnaire survey method for the identification of factors. A disadvantage of the questionnaire survey method is that respondents are provided with a list of possible factors and asked to select from that list. This tends to pre-empt the factors investigated and to limit them to those reported in existing studies: respondents only focus on the factors provided in the list. In order to provide more confidence in the study it is important that practitioners' experiences and perceptions be explored independently and without any suggestion from the researcher.

In other studies, different researchers have described their experiences of SPI implementation (Butler, 1997; Diaz and Sligo, 1997; Florence, 2001; Kaltio and Kinnula, 2000; Kautz and Nielsen, 2000). A review of the experience reports revealed that no standard approach has been adopted for the implementation of SPI initiatives. Different organizations adopted different approaches, based on their own individual experiences, in order to implement SPI initiatives rather than following a standard SPI implementation approach. This can lead to a chaotic situation with no standard for SPI implementation practices (Zahran, 1998). The Software Engineering Institute has developed the IDEAL model (Jennifer and Chuck, 1997) for initiating, planning and guiding improvement action. However, this model is explicitly linked to the CMM and is not generic enough to be useful for designing SPI implementation programmes using other SPI roadmaps or initiatives. This model also does not assess the readiness of organizations for SPI implementation, and little research evidence is available to judge its effectiveness. So far we have not identified any standard approach that has been designed specifically for the implementation of SPI initiatives. This gap needs to be investigated in order to assist SPI practitioners in designing effective SPI implementation strategies.

The focus of this paper is based on the following research questions:

RQ1. What factors, as identified in the literature, have a positive impact on implementing SPI?
RQ2. What factors, as identified in real practice, have a positive impact on implementing SPI?
RQ3. What factors, as identified in the literature, have a negative impact on implementing SPI?
RQ4. What factors, as identified in real practice, have a negative impact on implementing SPI?
RQ5. What are the necessary and sufficient phases/steps for the implementation of SPI programmes?

To answer these questions we have conducted an empirical investigation that has led us to the development of the SPI-IF, which is the main contribution of this paper.

3. Research design

3.1. Sample profile

From November 2002 to December 2003, twenty-nine companies in Sydney, Australia were visited and 34 interviews were conducted. Of the 34 interviews, one interview was conducted with each of 24 companies and 2 interviews were conducted with each of 5 larger companies. We sent out 224 letters of request to participants and only 29 companies (13%) responded. The sample profile is shown in Appendix B. The target population in this research was those software-producing companies that have initiated SPI programmes. Although we do not claim this is a statistically representative sample, Appendix B does show that the companies in the study range from a very small software house to very large multinational companies and cover a wide range of application areas. It is further important to acknowledge that the data was collected from companies that were tackling real SPI implementation issues on a daily basis; therefore we have high confidence in the accuracy and validity of the data.

It is further important to consider whether the practitioners sampled within companies are representative of practitioners in organizations as a whole. A truly representative sample is impossible to attain, and the researcher should try to remove as much of the sample bias as possible (Coolican, 1999). In this research, in order to make the sample fairly representative of the SPI practitioners in a particular organization, different groups of practitioners from each organisation self-selected to participate. The sample of practitioners involved in this research includes developers, business analysts, methodology analysts, technical directors, project managers and senior management.

In addition to our empirical study, we have also analysed 50 published experience reports, case studies and articles. The studies we have analysed are of well-known organizations. Appendix A summarises the published experience reports, case studies and papers, organized according to the respondent companies. We consider these to be important publications because the 34 organizations include all five organizations that have been awarded the IEEE Computer Society Award for Process Achievement.

3.2. Data collection methods

3.2.1. SPI literature

We undertook an objective reading and identified a number of SPI implementation factors. The SPI literature consists of case studies, experience reports and high-level software process texts. Most of the studies describe real-life experiences of SPI implementation and report different factors that play a positive or negative role in SPI implementation. This literature analysis was entirely objective, and only one researcher was involved. According to Leedy and Ormrod (2001), if the judgement of the literature is entirely objective then one person is sufficient.

In order to reduce researcher bias we conducted an inter-rater reliability check on this process. Three research papers were selected at random, and a colleague who was not familiar with the issues being discussed was asked to identify the CSFs and critical barriers (CBs) that appeared in each paper. The results were compared with our previous results and no great disagreements were found.
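The comparison between the two raters is reported only informally here. As an illustration of how such agreement could be quantified, the sketch below computes Cohen's kappa for two raters making binary present/absent judgements about a factor across a set of papers; the judgement vectors are hypothetical, not our actual data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters giving the same kind of categorical
    judgement (here 1 = factor present, 0 = absent) over the same items."""
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum((pa[k] / n) * (pb[k] / n) for k in set(pa) | set(pb))
    return (observed - expected) / (1 - expected)

# Hypothetical judgements over eight papers for one candidate factor.
a = [1, 1, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))  # 0.47
```

Values near 1 indicate strong agreement; kappa corrects raw percentage agreement for the agreement the two raters would reach by chance alone.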

3.2.2. CSFs interviews

The CSF interview (Rockart, 1979) is a unique opportunity to assist the managers in better understanding their information needs. "The CSF interview often presents the initial occasion to interact with the manager on the types of information support that might be useful to her" (Bullen and Rockart, 1981).

CSF interviews were conducted with three groups of practitioners:

• The first group was made up of designers/testers/programmers/analysts, referred to as "developers".

• The second group was made up of team leaders/project managers, referred to as "managers".

• The third group was made up of senior managers/directors, referred to as "senior managers".

Questioning was both open-ended and closed-ended, with frequent probing to elaborate and clarify meaning. The negotiated interview duration was half an hour; however, the researcher and interviewee would determine the pace of the interview.

3.3. Data analysis methods

3.3.1. Content analysis

This research seeks to identify the perceptions and experiences of practitioners regarding SPI implementation. In order to identify common themes for the implementation of SPI programmes, the following process was adopted in this research (Baddoo, 2001; Burnard, 1991):

• Identifying themes for SPI implementation from transcripts: All the interview transcripts were read to identify the major themes for SPI implementation. These themes were noted down and compared to the notes made during the CSF interviews in order to ensure that the transcripts being analysed were indeed a true reflection of the discussion in the CSF interviews. These two process steps also verify that the transcription process did not change the original data generated in the CSF interviews.

• Generating categories: All the CSF interview transcripts were read again to generate categories for the responses. Different themes were grouped together under three categories, i.e. CSFs, CBs and phases/steps needed for SPI implementation. For example, budget, funds, etc. were grouped together under the CSF category "resources". Poor response, users unwilling to be involved, etc. were grouped together under the CB category "lack of support". Each category represents a CSF, a CB or a step/phase needed for the implementation of an SPI programme.
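The grouping step above amounts to a lookup from raw interview themes to agreed categories. A minimal sketch, using only the example themes mentioned in the text (the keyword sets are illustrative, not the full coding scheme):

```python
# Illustrative coding scheme built from the examples in the text:
# raw interview themes on the right map to a CSF or CB category on the left.
CODING_SCHEME = {
    "resources": {"budget", "funds"},                                       # CSF category
    "lack of support": {"poor response", "user unwilling to be involved"},  # CB category
}

def code_theme(theme):
    """Return the category a raw theme was grouped under, or None if uncoded."""
    t = theme.lower().strip()
    for category, raw_themes in CODING_SCHEME.items():
        if t in raw_themes:
            return category
    return None

print(code_theme("Budget"))         # resources
print(code_theme("Poor response"))  # lack of support
```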

In order to reduce researcher bias we conducted an inter-rater reliability check on this process. Three interview recordings were selected at random, and a colleague who was not familiar with the issues being discussed was asked to identify the CSFs that appeared in the interviews. The results were compared with our previous results and no great disagreements were found.

3.3.2. Frequency analysis

According to Seaman (1999), coding in empirical research is one method of extracting quantitative data from qualitative data in order to perform some statistical analysis. In this research, data from the literature and the CSF interviews was categorised and coded in order to perform frequency analysis and also some comparative analysis of SPI implementation factors. We have used frequency analysis at two levels. Firstly, we measured the occurrence of key factors in a survey of the literature. We recorded the occurrence of each key factor in each article. By comparing the occurrences of a key factor across a number of articles against the occurrences of other key factors in the same articles, we calculated the relative importance of each factor. For example, if a factor is mentioned in 10 out of 20 articles, it has an importance of 50% for comparison purposes. In this way we compared and ranked the factors. Secondly, we measured the occurrence of key factors in the empirical study. In order to analyse the CSF interview transcripts we recorded the occurrence of each key factor in each CSF interview transcript. By comparing the occurrences of a key factor across the CSF interview transcripts against the occurrences of other key factors in the same transcripts, we calculated the relative importance of each factor. Finally, conclusions were drawn regarding the factors that are critical in the literature and in the empirical study.
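The relative-importance calculation described above divides a factor's occurrence count by the number of sources examined. A small sketch of this computation follows; the factor names and counts are hypothetical, chosen to reproduce the 10-out-of-20 = 50% example:

```python
def rank_factors(occurrences, n_sources):
    """occurrences maps each factor to the number of sources citing it.
    Returns (factor, percentage) pairs, most frequently cited first."""
    ranked = {f: 100.0 * count / n_sources for f, count in occurrences.items()}
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical counts over a survey of 20 articles.
counts = {"senior management commitment": 14,
          "training and mentoring": 10,   # 10 of 20 articles -> 50%
          "reviews": 4}
for factor, pct in rank_factors(counts, 20):
    print(f"{factor}: {pct:.0f}%")
```

The same routine serves both levels of the analysis: it can be run once over the article counts and once over the interview-transcript counts, and the two rankings compared.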

3.4. Limitations of the research design

There are a number of limitations in this study. This study explored the experiences and perceptions of practitioners regarding SPI implementation. These perceptions and experiences have not been verified directly. This may mean that what practitioners say are critical factors may not necessarily be the critical factors for SPI implementation. Furthermore, practitioners' perceptions may not be accurate.

We did not categorise the research papers, experience reports and case studies. Similarly, we did not categorise the companies in the empirical study. This is because we wanted to identify aggregate factors that have been frequently cited in the literature and the empirical study.

However, we have high confidence in this research's results based on opinion data (Baddoo and Hall, 2003; El-Emam et al., 1999; Hall and Wilson, 1998; Rainer and Hall, 2002; Stelzer and Werner, 1999) because:

• Data was collected from different practitioners who were dealing with SPI implementation issues on a daily basis.

• Practitioners' experiences and perceptions were explored independently and without any suggestion from the researcher.


• At the end of each CSF interview, we asked practitioners to rank their knowledge of SPI implementation from 1 to 5. 77% of practitioners chose 4 or above; only 23% chose 3 or below.

• More than 50% of the companies have been involved in SPI programmes over the last 5 years.

• Practitioners cited those factors that had been used within their own companies.

• Though the software development industry is characterised by global competition and participation, this study is limited to Australian-based software development organizations. This helps eliminate the problem of interpersonal communication and tends to promote a more standardized group of organizations with respect to basic cultural differences.

Among Australian-based software development organizations, only those that have attempted to improve their software processes were studied. Because this study is limited to the software industry, the gathered data reflects the perceptions of those individuals employed in this industry, and generalizations to other industries should be undertaken with extreme caution.

4. SPI implementation framework

During the empirical study and literature review, two important issues of SPI implementation were discovered, i.e. "what" and "how". The first issue concerns what is critical in SPI implementation. The second issue concerns how to implement SPI initiatives in organizations.

The principal contribution of this research is the development of the SPI implementation framework (SPI-IF) (as shown in Fig. 1) in order to assist practitioners in the design of effective SPI implementation initiatives. In this framework, adequate attention has been paid to both "what" and "how" to implement activities. The framework is based on the SPI literature and an empirical study we have carried out. The framework has three components, i.e. the SPI implementation factor component, the SPI assessment component and the SPI implementation component.

Fig. 1. SPI implementation framework. [The factors component (CSFs and CBs), informed by the empirical study and literature review, is used by the assessment component (IMM) and the implementation component (SPI-IM), which in turn assist the SPI implementation initiative.]

In order to identify the "what", it was important to identify a list of factors that are critical in SPI implementation. This was the starting point of our research. We have divided critical SPI implementation factors into two categories, i.e. CSFs and CBs. In order to identify these factors, we carried out an empirical study of factors that have a positive or negative impact on the implementation of an SPI program (Niazi et al., 2004). Furthermore, we have analysed the experiences, opinions and views of practitioners through the literature (i.e. case studies, technical reports and journal articles) (Niazi et al., 2003a). The empirical analysis of CSFs and CBs is presented in Section 4.1.

Humphrey (1989) stressed the need for assessment: "if you do not know where you are, a map will not help". Zahran (1998) described the risk of not following a defined assessment method: "different assessment teams could be adopting different approaches based on their own individual experiences, rather than following a standard approach to the assessment" (Zahran, 1998, p. 77). This risk can lead organizations into a chaotic situation with no standard for SPI implementation practices. Zahran (1998) also described the requirements of a software process assessment method. In the appraisal of SPI standards and models, e.g. CMMI and ISO 9001, the software process maturity of the organization is assessed. No attention has been paid to assessing the SPI implementation maturity of the organization. The assessment of SPI implementation maturity can help organizations to successfully implement SPI initiatives, because the readiness of an organization for successfully implementing SPI initiatives can be judged through its SPI implementation maturity. We have focused on these issues and developed an implementation maturity model (IMM) (i.e. the assessment component) in order to assess the SPI implementation maturity of organizations (Niazi and Wilson, 2003; Niazi et al., in press). The CMMI perspective (SEI, 2002a) and the findings from the current empirical study were used in the design of the IMM. The IMM has four SPI implementation maturity levels. These maturity levels contain different CSFs and CBs identified through the literature and the CSF interviews. Under each factor, different practices have been designed that guide how to assess and implement each factor. The IMM is described in Section 4.2.

Findings from the literature and the empirical study have also led to the second issue of SPI implementation, i.e. how to implement. The SPI implementation model (SPI-IM) (i.e. the implementation component) was developed in order to assist organizations in successfully implementing SPI initiatives (Niazi et al., 2003c; Niazi et al., 2004d). The findings from the empirical study led us to the design of different phases for SPI implementation; the phases most frequently cited by practitioners were selected for the SPI-IM. The CSFs and CBs were divided among the different phases of the SPI-IM. In order to have more fine-grained activities within each phase of the SPI-IM, a list of practices (Niazi et al., in press) was also designed for each CSF and CB. The SPI-IM is described in Section 4.3.

We have pulled together the individual components under one framework using a bottom-up approach already familiar to many practitioners and researchers. The SPI-IF is a specialised, cohesive and comprehensive framework that represents a new process view of SPI implementation. The SPI-IF aims to present complex SPI implementation activities in a way that can be easily understood.

4.1. SPI implementation factor component

This section describes the CSFs and critical barriers (CBs) that have been included in the SPI implementation factor component of the SPI-IF.

CSFs represent a few key areas where management should focus their attention in order to successfully achieve the desired results (Rockart, 1979). In order to decide the criticality of a factor, we have used the following criteria (Niazi et al., 2004; Niazi et al., 2004d):

• If a factor is cited in the literature with a frequency percentage of >=30%, then we treat that factor as a critical factor in the literature.

• If a factor is cited by the respondents in the interviews with a frequency percentage of >=30%, then we treat that factor as a critical factor in this empirical study.
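Both criteria apply the same 30% threshold to a factor's frequency percentage, whether that percentage comes from the literature survey or from the interviews. A minimal sketch (the factor names and percentages below are hypothetical):

```python
CRITICALITY_THRESHOLD = 30.0  # per cent, as defined by the two criteria above

def critical_factors(frequency_pct):
    """Keep the factors cited with a frequency percentage of >=30%."""
    return [factor for factor, pct in frequency_pct.items()
            if pct >= CRITICALITY_THRESHOLD]

# Hypothetical frequency percentages from a literature survey.
lit = {"senior management commitment": 62.0,
       "reviews": 30.0,       # exactly on the threshold: still critical
       "tool support": 12.0}
print(critical_factors(lit))  # ['senior management commitment', 'reviews']
```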

The analysis of CSFs and CBs is fully reported in Niazi et al. (2004) and Niazi et al. (2004d). The key CSFs included in the SPI-IF are:

• Senior management commitment;
• SPI awareness;
• Staff involvement;
• Experienced staff;
• Defined SPI implementation methodology;
• Reviews;
• Training and mentoring;
• Staff time and resources;
• Creating process action teams/external agents.

The key CBs included in the SPI-IF are:

• Organizational politics;
• Lack of support;
• Lack of resources;
• Inexperienced staff/lack of knowledge;
• Time pressure.

4.2. SPI assessment component

We have adapted the CMMI (SEI, 2002a) perspective and developed a maturity model for SPI implementation (as shown in Fig. 2) in order to guide SPI practitioners in assessing and improving their SPI implementation maturity. The structure of this implementation maturity model (IMM) is built upon the following three dimensions:

• Maturity stage dimension;
• CSFs and CBs dimension;
• Assessment dimension.

The categorisation of CSFs and CBs has led us to design different maturity levels for the implementation of SPI. These maturity levels contain different CSFs and CBs identified through the literature and the CSF interviews. The maturity model in Fig. 2 shows that organizations should address each factor in order to achieve a certain maturity level. Under each factor, different practices have been designed that guide how to assess and implement each factor.

By "maturity" we mean the extent to which an implementation process is explicitly defined, managed and measured (SEI, 2002a). A maturity level is defined as a well-defined stage towards achieving a mature implementation process (SEI, 2002a).
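The structure just described, maturity levels containing factors, with each factor carrying practices for its assessment and implementation, can be pictured as a small data model. The sketch below is illustrative only: the level names, the practice wording and the achievement rule are our assumptions, since this section describes only level 1 and the "aware" level 2 in detail.

```python
from dataclasses import dataclass, field

@dataclass
class Factor:
    name: str                      # a CSF or a CB
    kind: str                      # "CSF" or "CB"
    practices: list = field(default_factory=list)  # how to assess/implement it

@dataclass
class MaturityLevel:
    number: int
    name: str
    factors: list = field(default_factory=list)

# Illustrative IMM fragment: level 1 is chaotic with no required factors;
# level 2 ("aware") requires the SPI awareness factor. Practice wording is ours.
imm = [
    MaturityLevel(1, "initial"),
    MaturityLevel(2, "aware", factors=[
        Factor("SPI awareness", "CSF", practices=[
            "hold awareness sessions on the benefits of SPI",
            "run working sessions to define activities, goals and strategy",
        ]),
    ]),
]

def level_achieved(level, addressed):
    """Assumed rule: a level is achieved once every factor in it is addressed."""
    return all(f.name in addressed for f in level.factors)

print(level_achieved(imm[1], {"SPI awareness"}))  # True
```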


Fig. 2. SPI implementation maturity model structure (Adapted from (Paulk et al., 1993; SEI, 2002a)).

210 M. Niazi et al. / The Journal of Systems and Software 78 (2005) 204–222

4.2.1. Maturity stage dimension

The staged representation of CMMI is structured into five maturity levels ranging from level 1 to 5. For the IMM, several adjustments to this staged structure are necessary to take account of SPI implementation characteristics:

• Level 1 has been adopted directly from CMMI. This is the level where the SPI implementation process is chaotic and few processes are defined.

• Awareness has emerged in the empirical study as an important factor for SPI implementation, i.e. it was cited in 59% of CSF interviews. SPI implementation is the process of adopting new practices in the organization. It is therefore very important to promote awareness of SPI and to share knowledge among different practitioners. Awareness can be promoted through awareness sessions that help practitioners fully understand the benefits of SPI, plus a series of working sessions in which practitioners define activities, goals and organisational strategy. Therefore, level 2 of the IMM is called 'aware'. We have defined level 2 as aware because SPI is an expensive, long-term approach and it takes a long time to realise its real benefits. Hence, in order to get the support of management and practitioners and to successfully continue SPI initiatives, it is very important to provide sufficient awareness at the very beginning of SPI implementation programmes. SPI implementation is far less beneficial without sufficient awareness of its benefits. Moitra (1998) has emphasised explaining and sharing "how the improved processes will help the individuals in terms of their efficiency, productivity and performance". The necessary investment of time and money and the need to overcome staff resistance are potential barriers to SPI implementation (Stelzer and Werner, 1999). These obstacles cannot be overcome without sufficient SPI awareness within the organization.

• Levels 3 and 4 of the IMM are adapted from CMMI level '3-defined' and level '5-optimising' respectively. In the IMM, level '3-defined' is the level where SPI implementation processes are documented, standardized, and integrated into a standard implementation process for the organization. Level '4-optimising' is the level where organizations establish structures for continuous improvement.

The maturity levels of the SPI maturity model are shown in Fig. 3.

Although informed by CMMI, the IMM does not replicate level-2 "Managed" and level-4 "Quantitatively Managed" of CMMI. This is because:

• In level-2 "Managed" of CMMI, the focus is on project management. We identified a 'managing the SPI project' factor in this study that relates to project management, but this factor was cited infrequently in the literature (i.e. 15% of articles) and in the empirical study (i.e. 18% of interviews). We did not identify any other factor in this study that directly relates to project management.

• In level-4 "Quantitatively Managed" of CMMI, the focus is on establishing quantitative measures of the software process. Again, in this study we did not find any factor that directly relates to this maturity level.

We have high confidence that these four maturity levels are sufficient for SPI implementation because they are generated from factors collected from companies that were tackling real issues on a daily basis. Furthermore, other researchers have designed similar maturity levels in the areas of requirements process improvement and the personal software process (Humphrey, 1995; Sommerville et al., 1997; Sommerville et al., 1998). Sommerville et al. (1997, 1998) have published the requirements engineering process maturity model, which is derived from the CMM and has three levels, i.e.

[Figure: CMMI levels (Initial, Managed, Defined, Quantitatively Managed, Optimizing) alongside the implementation maturity model levels: Initial (chaotic SPI implementation process), Aware (awareness of SPI process), Defined (systematic structure and definition of SPI implementation process), Optimizing (structures for continuous improvement).]

Fig. 3. Maturity stage dimension (SEI, 2002a).


level 1 (Initial), level 2 (Repeatable) and level 3 (Defined). This model can be used to assess the current RE process and it provides a template for requirements engineering practice assessment. Similarly, Humphrey (1995) has designed a personal software process which has four process levels, i.e. PSP 0–PSP 3.

4.2.2. CSFs and CBs dimension

The CMMI consists of 25 process areas (PAs) categorized across the five maturity levels. We believe that a successful SPI implementation process should be viewed in terms of CSFs rather than PAs. This is because:

• The implementation of SPI programmes requires real-life experience, where one learns from mistakes and continuously improves the implementation process. CSFs are often identified after the successful completion of certain activities; hence these factors are close to real-life experiences.

• Fitzgerald and O'Kane (1999) argued the importance of the CSF approach in SPI and emphasised the use of the CSF approach rather than the key process area approach: "the CMM assessment team at Motorola noted this richness, and therefore successful process improvement can be viewed in terms of CSFs rather than key process areas". Somers and Nelson (2001) have also described the importance of the CSF approach in enterprise resource planning implementation: "Critical success factors can be viewed as situated exemplars that help extend the boundaries of process improvement, and whose effect is much richer if viewed within the context of their importance in each stage of the implementation process".

• Different studies have confirmed the value of the CSF approach in the field of information technology (Huotari and Wilson, 2001; Khandelwal and Ferguson, 1999; Khandelwal and Natarajan, 2002; Pellow and Wilson, 1993; Somers and Nelson, 2001; Tyran and George, 1993). Therefore, we believe that the CSF approach can also be effective in the implementation of SPI.

The 25 PAs of CMMI can be split into four categories, i.e. process management, project management, engineering and support (SEI, 2002a). We have adopted this approach and categorised the CSFs and CBs. This categorisation of CSFs and CBs has led us to design three categories, i.e. 'awareness', 'organizational' and 'support'. The three categories with the corresponding CSFs and CBs are shown in Table 1. The basis of this categorisation is the perceived coherence between the CSFs and CBs identified. It should also be pointed out that these factors and barriers are not necessarily mutually exclusive and there may be a certain degree of overlap among them.

Table 1
Categories of CSFs and CBs

Category        CSFs                                                    CBs
Awareness       Senior management commitment, training and mentoring,   Lack of awareness, lack of support
                staff involvement, awareness of SPI
Organizational  Creating process action teams, experienced staff,       Lack of resources, time pressure, inexperienced
                staff time and resources, formal methodology            staff, org. politics, lack of formal methodology
Support         Reviews

Table 2CSFs dimension

Quality

Risk

Maturity Stage Front-end category Back-end category

4 – Optimising Support Awareness, Organizational

3– Defined Organizational Awareness

2 – Aware Awareness

1 – Initial


The awareness category contains CSFs and CBs that support SPI awareness activities in organizations, e.g. senior management commitment, training, support and SPI awareness. The organizational category contains CSFs and CBs relating to defining, planning, resourcing, deploying and improving the SPI implementation process. The support category covers activities that remove defects from the SPI implementation process.

In order to have more confidence in this categorisation process, another researcher was asked to categorise the CSFs and CBs. The results were compared with the previous results and no great disagreements were found. Furthermore, this categorisation process was evaluated in three case studies.

In order to divide these categories of CSFs and CBs among the different levels of the IMM, we have used the perception of the PA division among the different maturity levels of CMMI. The awareness category can be directly linked to maturity level-2 "Aware" of the IMM. This is because the factors in this category can help to develop SPI awareness. For example, one needs senior management commitment in order to initiate SPI awareness activities in the organization. Similarly, one also needs SPI training, support and staff involvement in order to realise the real benefits of SPI initiatives. The organizational category can be linked to maturity level-3 "Defined", because the focus at this level is on the systematic structure and definition of the SPI implementation process. The factors in the organizational category can be used to design systematic structures for SPI implementation. For example, a defined SPI implementation methodology, the allocation of resources and the creation of process teams can be used to define the SPI implementation process. The focus at level-4 of the IMM is on continuous improvement; therefore the support category is linked with this level. We also believe that these factor categories may overlap and one should continuously monitor a previously implemented category. Thus, we named the current category the "front-end category" and the previously implemented category the "back-end category". The final allocation of factor categories among the four maturity levels of the IMM is shown in Table 2. There is no category for level-1 "Initial" because this level is characterised as chaotic and ad hoc and does not have to be achieved as such. Similarly, CMMI does not have any PA for level-1 Initial.
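The allocation just described can be read as a lookup from maturity level to the categories an organization should focus on and keep monitoring. A minimal sketch (this dictionary encoding is ours, not part of the SPI-IF):

```python
# Hypothetical encoding of the category allocation: at each IMM level,
# the "front-end" category is the current focus and the "back-end"
# categories are previously implemented ones that should still be monitored.
IMM_CATEGORIES = {
    1: {"front_end": [], "back_end": []},                             # Initial
    2: {"front_end": ["awareness"], "back_end": []},                  # Aware
    3: {"front_end": ["organizational"], "back_end": ["awareness"]},  # Defined
    4: {"front_end": ["support"],
        "back_end": ["awareness", "organizational"]},                 # Optimising
}

def categories_to_monitor(level):
    """All factor categories an organization at `level` should attend to."""
    entry = IMM_CATEGORIES[level]
    return entry["front_end"] + entry["back_end"]
```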

4.2.3. Assessment dimension

In this dimension, each of the CSFs and CBs is measured to assess how well the factor has been implemented in practice. In order to measure the maturity of the SPI implementation process, we have adapted an assessment instrument (as shown in Appendix) that has been developed, tried and tested at Motorola (Daskalantonakis, 1994). This instrument is used to assess the organization's current status relative to CMM and to identify weak areas that need attention and improvement (Daskalantonakis, 1994). Diaz and Sligo (1997) describe the use of this instrument: "at Motorola Government Electronics Divisions, each project performs a quarterly SEI self-assessment. The project evaluates each KPA activity as a score between 1 and 10, which is then rolled into an average score for each KPA. Any KPA average score that falls below 7 is considered a weakness". Motorola's assessment instrument has the following three evaluation dimensions (Daskalantonakis, 1994):

• Approach: Criteria here are the organization's commitment and management support for the practice, as well as the organization's ability to implement the practice.

• Deployment: The breadth and consistency of practice implementation across project areas are the key criteria here.

• Results: Criteria here are the breadth and consistency of positive results over time and across project areas.

For each CSF and CB, we have designed a list of practices (Niazi et al., in press) using our empirical study and the literature (El-Emam et al., 1999; Goldenson and Herbsleb, 1995; Johnson, 1994; Rainer and Hall, 2002; Stelzer and Werner, 1999; Zubrow et al., 1994). The following steps have been adapted for SPI implementation assessment (Beecham and Hall, 2003; Daskalantonakis, 1994):

• Step 1: For each practice of a CSF or CB, a key stakeholder who is involved in the SPI implementation effort calculates the three-dimensional scores of the assessment instrument.

• Step 2: The three-dimensional scores for each practice are added together, divided by 3 and rounded up. The score for each practice is ticked in the evaluation sheet (one example is shown in Table 3).

• Step 3: Repeat this procedure for each practice. Add together the scores of all practices and average them to obtain an overall score for each CSF and CB.

• Step 4: Relate the evaluation scores to SPI implementation: a score of 7 or higher for a CSF or CB indicates that the factor has been successfully implemented (Daskalantonakis, 1994). Any CSF or CB with an average score below 7 is considered a weakness (Daskalantonakis, 1994).

• Step 5: In order to achieve any maturity level, all those CSFs and CBs that belong to that maturity level must have an average score of 7 or above (Daskalantonakis, 1994). For example, in order to achieve SPI implementation maturity level 2, all those CSFs and CBs that belong to level 2 (e.g. senior management commitment, training and mentoring, staff involvement, awareness, lack of awareness, lack of support) must have an average score of 7 or above.

Table 3
Factor evaluation sheet ((5 + 8 + 8 + 7 + 7)/no. of practices = 35/5; average score: 7)

Training (each practice scored 0–10)
P1. Training is provided for developing the skills and knowledge needed to perform SPI implementation: 5
P2. Sufficient resources and additional time to participate in SPI training will be provided to staff members: 8
P3. Training program activities are reviewed on a periodic basis: 8
P4. Organization has developed a written training policy for SPI to meet its training needs: 7
P5. All future group or individual trainings of SPI are planned: 7
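Steps 1–5 amount to a small scoring procedure. A minimal sketch (the helper names are ours; in the paper the assessment is carried out manually on evaluation sheets):

```python
import math

def practice_score(approach, deployment, results):
    """Step 2: average the three dimension scores (each 0-10), rounded up."""
    return math.ceil((approach + deployment + results) / 3)

def factor_score(practice_scores):
    """Step 3: average all practice scores into an overall factor score."""
    return sum(practice_scores) / len(practice_scores)

def is_weak(score, threshold=7):
    """Step 4: a factor averaging below 7 is considered a weakness."""
    return score < threshold

def level_achieved(factor_scores, threshold=7):
    """Step 5: a maturity level is achieved only if every CSF and CB
    belonging to it scores 7 or above."""
    return all(s >= threshold for s in factor_scores)

# The 'Training' factor of Table 3 (practice scores 5, 8, 8, 7, 7):
training = factor_score([5, 8, 8, 7, 7])   # 35/5 = 7.0, so not a weakness
```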

4.3. SPI implementation component

In this component, an SPI implementation model (SPI-IM) is described. The objective of this component is to empirically explore the viewpoints and experiences of practitioners regarding SPI implementation and to develop a model to assist practitioners in effectively implementing SPI initiatives. We have conducted an empirical study with software practitioners in different organizations with the specific aims of:

• Establishing what their typical SPI implementation experiences are;

[Figure: the SPI implementation phase dimension contains implementation phases and the SPI implementation CSFs and CBs dimension contains CSFs and CBs; both are organized into practices that describe activities. Each element is informed by the literature review and the empirical study.]

Fig. 4. SPI implementation model.

• Identifying their major concerns about SPI implementation;

• Exploring the different phases/steps necessary for the implementation of SPI programmes.

The findings from the literature and the empirical study were used in the design of the SPI-IM (as shown in Fig. 4). The structure of the SPI-IM is built upon the following dimensions:

• SPI implementation phase dimension;
• SPI implementation CSFs and CBs dimension.

The empirical study has led us to design different phases for SPI implementation. These phases contain different CSFs and CBs identified through the literature and the CSF interviews. The SPI implementation model in Fig. 4 shows that organizations should address each factor in order to successfully implement each phase of the model.

4.3.1. SPI implementation phase dimension

Using the content analysis of the recorded interviews, we have identified six phases for the implementation of SPI programmes. In this section we briefly describe each phase in turn and in the appropriate sequence. Fig. 5 shows the SPI implementation phase dimension.

Awareness. Practitioners felt the need for awareness of SPI programmes in order to fully understand the benefits of SPI. Practitioners said that as SPI implementation


[Figure: the six SPI implementation phases: awareness, learning, pilot implementation, SPI implementation action plan, SPI implementation across the organization, and maintenance.]

Fig. 5. SPI implementation phase dimension.


is the process of adopting new practices in the organization, it is very important to promote SPI awareness activities and to share knowledge among different practitioners. Practitioners suggested involving all staff members in these awareness programmes.

Awareness has been selected as an ongoing phase for the implementation of SPI programmes. This is because SPI is an expensive and long-term approach. Hence, in order to get the support of management and practitioners and to successfully continue SPI initiatives, it is very important to continuously provide sufficient SPI awareness during SPI implementation programmes.

Learning. Learning appears as an important factor for SPI implementation success. For learning, practitioners emphasized training in SPI skills in order to achieve mastery of their use. This phase involves equipping practitioners with knowledge of the critical technologies required for SPI.

Different studies have confirmed training as an important source of learning for the implementation of SPI programmes (Butler, 1997; Fitzgerald and O'Kane, 1999; Fowler et al., 1999; Kaltio and Kinnula, 2000; Paulk, 1999; Pitterman, 2000; Willis et al., 1998). Learning comprises acquiring and transferring knowledge of SPI activities. Managers and employees usually have a general idea of the software process, but they may not have a complete understanding of the necessary details and they may not understand how their work adds to the organization's mission and vision. SPI can only be successfully implemented if staff members have enough understanding, guidance and knowledge of all the SPI activities (Hall and Wilson, 1997; Stelzer and Werner, 1999; Wilson and Hall, 1998).

Pilot implementation. Practitioners advised first implementing SPI programmes at a low level and seeing how successful they are within a particular department. A pilot implementation is important for practitioners in order to judge their existing SPI skills and readiness. This is the phase where practitioners can decide how many resources and how much training and commitment are required in order to implement SPI practices across the organization.

Our results confirm those of Hall and Wilson (1997) and Wilson and Hall (1998), who recommend "start small and slow" for real quality improvement.

SPI implementation action plan. Practitioners stressed the need for proper planning and management. They said that after the pilot implementation, a proper plan with implementation activities, schedule, allocated resources, responsibilities, budget and milestones should be designed. This plan should be based on the results and experiences of the pilot implementation. The practitioners also suggested designing a defined SPI implementation methodology, of which this implementation plan should be a part. This defined SPI implementation methodology should contain the implementation plan, activities, practices, and procedures to be used during the SPI implementation process.

SPI implementation without planning, a defined SPI implementation methodology and project management leads to chaotic practices. Different studies have emphasised managing the SPI project (Butler, 1997; Fitzgerald and O'Kane, 1999; Macfarlane, 1996; Paulk, 1999; Quenn, 1997; Stelzer and Werner, 1999). Often, improvement projects have no specified requirements, project plan, or schedule (Stelzer and Werner, 1999). The practitioners recommended treating SPI as a real project that must be managed just like any other project.

Implementation across the organization. After proper planning, and using the pilot implementation experience, practitioners suggested implementing SPI practices in other areas/departments of the organization in order to have a uniform development approach and maturity across the organization. It is also important to illustrate the results of the pilot implementation to different departments in order to gain support and confidence. In order to avoid risks and to implement SPI programmes more effectively, practitioners suggested project-by-project implementation. This is because each project experience can be reviewed to determine what was accomplished and how the organization can implement SPI programmes more effectively for future projects. Practitioners emphasised that senior management commitment plays a very important role in this phase. They also suggested providing sufficient resources for SPI implementation in this phase.

Maintenance. The important theme in maintenance is to continuously monitor and support the previously implemented SPI activities. This maintenance will also help to refine the SPI implementation methodology. Practitioners suggested that ongoing awareness and training programmes be incorporated into maintenance activities, as practitioners often switch jobs.


SPI efforts do not have long-lasting effects because practitioners often slide back into their old habits (Stelzer and Werner, 1999). It is therefore very important to continuously provide them with feedback, guidance, motivation and reinforcement to stay involved in the improvement effort (Paulish and Carleton, 1994; Stelzer and Werner, 1999; Wohlwend and Rosenbaum, 1994).

4.3.2. SPI implementation CSFs and CBs dimension

In the SPI implementation factor component (Section 4.1), we identified different CSFs and CBs through the literature and an empirical study. We used a frequency analysis technique and calculated the relative importance of each factor. As CSFs are a small number of important issues on which management should focus their attention (Rockart, 1979), we have only considered those factors that are critical (i.e. with a frequency percentage of >=30%) in at least one of the two data sets. We have divided the CSFs and CBs among the different phases of the implementation model, as shown in Table 4.

In order to reduce researcher bias, an inter-rater reliability check was conducted on this distribution of CSFs and CBs to the different phases of the SPI-IM. Two researchers were asked to distribute the CSFs and CBs among the different phases of the SPI-IM. The results were compared with the previous results and an agreed list of factor distribution was generated, as shown in Table 4.

Table 4
SPI implementation CSFs dimension

Phase: Awareness
  CSFs: Senior management commitment; Staff involvement; SPI awareness
  CBs: Organizational politics; Lack of awareness; Lack of support

Phase: Learning
  CSFs: Senior management commitment; Training and mentoring; SPI awareness
  CBs: Time pressure; Lack of awareness; Lack of support; Lack of resources

Phase: Pilot implementation
  CSFs: Senior management commitment; Creating process action teams; Experienced staff; Defined SPI implementation methodology
  CBs: Inexperienced staff; Lack of defined SPI implementation methodology; Lack of support; Lack of resources

Phase: SPI implementation action plan
  CSFs: Senior management commitment; Experienced staff; Defined SPI implementation methodology; Reviews
  CBs: Inexperienced staff; Time pressure

Phase: SPI implementation across the organization
  CSFs: Senior management commitment; Staff time and resources; Staff involvement; Experienced staff; SPI awareness; Defined SPI implementation methodology
  CBs: Organizational politics; Time pressure; Inexperienced staff; Lack of defined SPI implementation methodology; Lack of support; Lack of resources; Lack of SPI awareness

Phase: Maintenance
  CSFs: Senior management commitment; Reviews; Training and mentoring
  CBs: Inexperienced staff

Table 4 suggests that practitioners should consider these CSFs and CBs in order to successfully implement each phase of the SPI implementation model. For example, in the 'learning' phase it is important to have higher management support, SPI training and SPI awareness activities in order to provide sufficient knowledge about SPI and its benefits to the organization. Similarly, in the 'learning' phase adequate attention should be paid to avoiding barriers (time pressure, lack of awareness, lack of support, lack of resources) in order to prevent the undermining of the SPI implementation process.

5. Evaluation of SPI implementation framework

In order to evaluate the SPI-IF, the following criterion was decided upon.

Usability of the SPI-IF. Usability is formally defined as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO/


DIS-9241-11, 1994). The goal of a usability analysis is to pinpoint areas of confusion and ambiguity for the user which, when improved, will increase the efficiency and quality of the user's experience (ISO/DIS-9241-11, 1994).

The usability of the SPI-IF can be linked with 'ease of use' and 'user satisfaction'. If one can successfully use the product and achieve specified goals according to user needs and expectations, without any confusion or ambiguity, then this fulfils the usability criterion for the SPI-IF (ISO/DIS-9241-11, 1994).

The SPI-IF has three components, i.e. the SPI implementation factor component, the SPI assessment component and the SPI implementation component. It was not possible to evaluate the whole SPI-IF in these two evaluation methods because of the length of the PhD research period. However, two components of the SPI-IF (i.e. the SPI implementation factors component and the SPI assessment component) were fully evaluated in these two evaluation exercises.

We did not fully evaluate/implement the third component, i.e. the SPI implementation component, because one needs to evaluate this component by implementing it in an entire SPI project in an organization. SPI is a long-term approach and it takes a long time to fully implement an SPI initiative. A recent report of the Software Engineering Institute shows the number of months required to move from one maturity level of CMM to the next (SEI, 2004):

• Maturity level 1 to 2: 22 months
• Maturity level 2 to 3: 19 months
• Maturity level 3 to 4: 25 months
• Maturity level 4 to 5: 13 months

Therefore, it was not possible to implement/evaluate each phase of the implementation component (SPI-IM) due to this time constraint. However, the SPI implementation factors component is used in the implementation component (Fig. 1) and has been fully evaluated. Therefore, we can say that some parts of the third component (i.e. the implementation component) were also evaluated through these evaluations.

In order to evaluate the SPI-IF, a practical evaluation was undertaken, i.e. a case study. The case study method was used because this method is said to be powerful for evaluation and can provide sufficient information in a real software industry environment (Yin, 1993). A case study also provides valuable insights for problem solving, evaluation and strategy (Cooper and Schindler, 2001). Since the SPI-IF is most applicable to a real software industry environment, the case study research method is believed to be the most appropriate method for this situation.

To provide more confidence in this evaluation, three separate case studies were conducted at three different companies. Companies were selected for the case studies because they provided especially rich descriptions of their SPI efforts and because they agreed to release this information.

5.1. Case studies at companies A, B and C

Initially, the first author talked to each participant, explained what the case study was about and handed out a hard copy of the SPI-IF. The participants also queried the author through emails to solicit more information about the use of the SPI-IF. One participant from each company, who was a key member of the SPI team, was involved in each case study. The key participant communicated with the author through email and face-to-face discussion for one month in order to gain a thorough understanding of the SPI-IF. Before commencing these studies, SPI-IF training was also provided to the participants nominated for these case studies. In this training, the different components of the SPI-IF were explained and participants were encouraged to use the model independently.

In each case study, the participant used the SPI-IF and assessed the SPI implementation maturity of his/her company independently, without any suggestion or help from the researchers. At the end of each case study, a feedback session was conducted with the participant in order to gather feedback about the SPI-IF. Each feedback session was an informal discussion and feedback was collected in the form of a questionnaire.

5.1.1. Demographic information

Company A is an international company that provides consultancy and information technology services to both the private and public sectors, employing 10,000 professionals in Asia Pacific, Canada, Europe and the United States. The main purpose of the company is to enhance the efficiency and effectiveness of the information systems prevailing in the public and private sectors by applying relevant state-of-the-art technologies related to computer software, hardware and data communication. This company provides services in: e-business, enterprise consulting, technology consulting, solution delivery, application portfolio management/outsourcing and project management. The SPI programme was initiated five years ago in Company A by its research division. This research division has developed a standard methodology for software development. During the development of this methodology, special attention was given to the requirements of the ISO 9001 standard and the CMM model. Company A is ISO 9001 certified and is currently at CMM level 3.

Company B is an international company that provides consultancy and information technology services to both the private and public sectors, employing more than 2000 professionals in Australia and worldwide.


The core business of the company is to provide services in software development, system integration, business innovation and business process improvement. The SPI programme was initiated five years ago in Company B. The main reasons for initiating the SPI programme were to reduce development cost and time to market. The other reasons were to increase productivity and the quality of the product. According to self-assessment results, the organization's process maturity was found to be at level 1. The process teams undertook different SPI actions in order to achieve level 2, i.e. working on requirements management, software project planning and software quality assurance etc. Using a CMM-based assessment in 2001, the process maturity was found to be at CMM level 2 with traces of level 3. Company B is now working to achieve level 3.

Company C provides telecommunications services, employing more than 2000 professionals in Australia and worldwide. The core business of the company is to provide cutting-edge communications, information and entertainment services. The company provides a broad range of communications services including mobile, national and long distance services, local telephony, international telephony, business network services, Internet and satellite services and subscription television. The SPI programme was initiated three years ago in Company C. The reasons for initiating the SPI programme were to reduce software development cost and time and to increase the quality of the software developed. In 2002, Company C was assessed at CMM level 2.

5.1.2. Implementation of SPI-IF

The stakeholders of companies A, B and C carried out SPI implementation assessments using the SPI-IF. We have summarised the assessment results in Table 5.

It is clear from Table 5 that Company A stands at level-1 'Initial' of IMM because two factors of level-2 'Aware' are not fully implemented (i.e., their score is <7) in Company A. In order to achieve any maturity level, all the CSFs and CBs that belong to that maturity level must have an average score of 7 or above. Table 5 shows that in order to achieve level-2 'Aware' of IMM, Company A needs to improve two factors, i.e. senior management commitment and staff involvement. It is very surprising that Company A has not successfully implemented implementation factors such as 'senior management commitment', 'staff involvement', 'creating process action teams', 'staff time and resources' and 'time pressure'.

Table 5. Summary of results of companies A, B and C — Company A (CMM level-3)
• Weak implementation factors in IMM level-2 'Aware': senior management commitment; staff involvement
• Weak implementation factors in IMM level-3 'Defined': creating process action teams; staff time and resources; time pressure
• Weak implementation factors in IMM level-4 'Optimising': nil
• Total weak implementation factors: 5
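The threshold rule described here — a maturity level is attained only when every CSF and CB assigned to it has an average score of 7 or above — can be sketched in code. This is a minimal illustration only; the function name, data layout and example scores are our assumptions, not part of the IMM.

```python
# Hypothetical sketch of the IMM threshold rule: a level is attained only
# when every CSF/CB assigned to it averages 7 or above (scores run 0-10).
THRESHOLD = 7

def attained_levels(factor_scores):
    """factor_scores maps IMM level -> {factor name: average score}.
    Returns the highest level reached and the weak factors that block
    the next level (the first level whose factors are not all >= 7)."""
    level = 1  # level-1 'Initial' has no factor requirements
    for lvl in sorted(factor_scores):
        weak = [f for f, s in factor_scores[lvl].items() if s < THRESHOLD]
        if weak:
            return level, weak
        level = lvl
    return level, []

# Illustrative scores only (not Company A's actual data):
scores = {2: {"senior management commitment": 5.2,
              "staff involvement": 6.0,
              "training and mentoring": 8.1}}
level, weak = attained_levels(scores)
print(level, weak)  # stays at level 1; two weak factors block level-2 'Aware'
```

With two level-2 factors below 7, the company remains at level-1 'Initial', matching the assessment logic applied to Company A above.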

It is also clear from Table 5 that Company B stands at level-1 'Initial' of IMM because four factors of level-2 'Aware' are not fully implemented in Company B.

Table 5 shows that Company C also stands at level-1 'Initial' of IMM because four factors of level-2 'Aware' are not fully implemented in Company C. To achieve level-2 'Aware' of IMM, Company C needs to improve four factors, i.e. senior management commitment, training and mentoring, awareness of SPI and lack of support.

5.1.3. Discussion

In order to evaluate the SPI-IF, it is first necessary to discuss a few points that frame the evaluation.

The CMM is structured into five maturity levels, ranging from level 1 to level 5. Each maturity level expresses a different state of software development maturity in an organization: level-1 corresponds to the lowest state and level-5 to the highest. In order to clarify the evaluation points, it is important to discuss the following:

• Higher levels of CMM (level 3 and above) indicate that the company has well defined processes for the implementation of SPI initiatives. This is because the company has successfully implemented CMM.

Table 5 (continued)

Company B (CMM level-2)
• Weak implementation factors in IMM level-2 'Aware': awareness of SPI; lack of support; staff involvement; training and mentoring
• Weak implementation factors in IMM level-3 'Defined': creating process action teams; experienced staff; staff time and resources; time pressure
• Weak implementation factors in IMM level-4 'Optimising': reviews
• Total weak implementation factors: 9

Company C (CMM level-2)
• Weak implementation factors in IMM level-2 'Aware': awareness of SPI; lack of support; senior management commitment; training and mentoring
• Weak implementation factors in IMM level-3 'Defined': experienced staff; staff time and resources; time pressure; organizational politics
• Weak implementation factors in IMM level-4 'Optimising': reviews
• Total weak implementation factors: 9


• Lower levels of CMM (level 2 and below) indicate that the company does not have well defined processes for the implementation of SPI initiatives. This is because the company is struggling to successfully implement CMM.

Keeping these two points in view, companies at higher CMM levels should have fewer implementation issues than companies at lower CMM levels. In the SPI-IF, the implementation maturity of a company is measured by assessing different implementation factors. The evaluation point is therefore:

• Companies at a higher CMM level should have successfully implemented more implementation factors than companies at a lower CMM level.

In order to address this evaluation point, it is important to compare the results of the three case studies. As discussed earlier, Company A is at CMM level-3 and Companies B and C are at CMM level-2. According to the evaluation point, Company A should have successfully implemented more implementation factors than Companies B and C. The results of the three case studies are summarised in Table 5.

It is clear from Table 5 that Company A has only two weak factors in IMM level-2, while Companies B and C have four weak factors each in IMM level-2. For IMM level-3, Company A has three weak factors while Companies B and C have four weak factors each. Table 5 shows that Company A has successfully implemented more implementation factors than Companies B and C. It also shows that Company A has fewer weak implementation factors (i.e. five) than Companies B and C (i.e. nine each).

As discussed earlier, Companies B and C are both at CMM level-2. In order to compare the results of companies at the same maturity level, it is important to compare the results of the case studies conducted at Companies B and C. This provides an opportunity to identify common weak factors of companies at the same maturity level. The results of the two case studies are summarised in Table 5, which shows that both companies have nine weak factors and that 78% of the weak factors (i.e. seven factors) are common to the two companies.
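The overlap figure quoted here (seven of nine weak factors, roughly 78%) is a simple set intersection over the factor lists in Table 5. A small sketch, using the factor names from the table as plain strings:

```python
# Weak implementation factors of Companies B and C, taken from Table 5.
weak_b = {"awareness of SPI", "lack of support", "staff involvement",
          "training and mentoring", "creating process action teams",
          "experienced staff", "staff time and resources", "time pressure",
          "reviews"}
weak_c = {"awareness of SPI", "lack of support",
          "senior management commitment", "training and mentoring",
          "experienced staff", "staff time and resources", "time pressure",
          "organizational politics", "reviews"}

# Factors common to both companies, and their share of each nine-factor list.
common = weak_b & weak_c
print(len(common), round(100 * len(common) / len(weak_b)))  # 7 78
```

Seven factors are shared, and 7/9 rounds to the 78% reported above.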

5.1.4. Case studies lessons learned

Three case studies were conducted in order to test and evaluate the SPI-IF. The lessons learned from these three case studies are summarised as follows:

• The IMM can be used effectively to identify SPI implementation issues with a goal of increasing implementation maturity.

• All the participants agreed that the IMM is clear, easy to use and specifically geared to assessing an organization's SPI implementation maturity. However, some training needs to be provided for the assessment method of the IMM.

• Despite all the differences, i.e. company type, application domain and CMM maturity level, each of the three companies was able to use the IMM successfully, without any confusion or ambiguity, in order to assess its SPI implementation maturity.

• All the participants who used the IMM were fully satisfied with the assessment results and the overall performance of the model.

• All the participants recognised the SPI implementation issues that the IMM identified for their companies, and they agreed with those issues.

• All the participants expressed an interest in using the IMM in order to provide solutions for the identified SPI implementation issues.

• The five practices designed for each CSF and CB are easy to use and unambiguous.

• The assessment method provides an entry point through which a participant can effectively judge the weak and strong implementation factors.

• All the participants agreed that the IMM is general enough to be applied to most companies.

• All the participants agreed that the use of the IMM would have a positive impact in improving their SPI implementation process.

• The IMM is capable of determining the current state of the SPI implementation process that is practised regularly in an organization.

• The participants were willing to use the IMM on a regular basis in the future.

• The IMM is significant not only in theoretical work but also in the real world environment.

• The participants emphasised the importance of having automated tool support available in order to help SPI practitioners assess an organization's SPI implementation maturity.

• Although the participants wanted a tool to support all the activities of the IMM, they were still willing to use the IMM without tool support, which shows the significance of the IMM.

Overall, the participants were very satisfied with the use of the IMM. In particular, one aspect that the participants considered important is the development of a complete tool that can be used to perform the different activities of the SPI-IF. The participants said that this tool would help SPI practitioners assess an organization's SPI implementation maturity. The tool should be capable of: recording the results of the assessment of each CSF and CB, identifying weak and strong factors, guiding SPI practitioners in successfully assessing an organization's SPI implementation maturity, and generating different assessment documents.

Automated tool support is a productive way to enhance the visibility of processes, to identify process weaknesses and to better understand the processes. A tool can also be used to observe the behaviour of different activities and their interactions. The participants suggested that such a tool would speed up the process of SPI implementation assessment.
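The capabilities the participants asked for — recording CSF/CB scores, separating weak from strong factors, and generating an assessment document — can be sketched as a small class. This is a hypothetical illustration of the requested tool, not an implementation the paper provides; the class and method names are our assumptions.

```python
# Hypothetical sketch of the assessment tool requested by participants:
# record CSF/CB scores per IMM level, split weak from strong factors at
# the threshold of 7, and generate a simple textual assessment document.
from collections import defaultdict

class AssessmentTool:
    def __init__(self, threshold=7):
        self.threshold = threshold
        self.scores = defaultdict(dict)   # IMM level -> {factor: score}

    def record(self, level, factor, score):
        """Record the assessed score (0-10) of one CSF or CB."""
        self.scores[level][factor] = score

    def weak_factors(self, level):
        return [f for f, s in self.scores[level].items() if s < self.threshold]

    def report(self):
        """Generate a minimal assessment document listing weak factors."""
        lines = []
        for level in sorted(self.scores):
            weak = self.weak_factors(level)
            lines.append(f"IMM level-{level}: "
                         + (", ".join(weak) if weak else "no weak factors"))
        return "\n".join(lines)

tool = AssessmentTool()
tool.record(2, "senior management commitment", 5)
tool.record(2, "staff involvement", 6)
tool.record(3, "creating process action teams", 4)
print(tool.report())
```

A fuller tool would add persistence and document export, but even this skeleton covers the three capabilities named above.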

6. Conclusion and future work

This paper presented the SPI-IF, which has the potential to assist SPI practitioners in the design of effective SPI implementation initiatives. This framework has three components, i.e. the SPI implementation factors' component, the SPI assessment component and the SPI implementation component, and identifies an SPI approach in four stages (Paulish and Carleton, 1994; Zahran, 1998).

To summarize how the SPI-IF can assist practitioners in the design of effective SPI implementation initiatives, each component of the SPI-IF has been reviewed with respect to its role in that design. Fig. 6 shows each component of the SPI-IF with its input to SPI implementation initiatives. The objective of the SPI implementation factors' component is to provide practitioners with sufficient knowledge about the nature of the issues that play a positive or negative role in the implementation of SPI programmes and thereby to assist them in effectively planning SPI implementation strategies. The assessment component's objective is to provide an implementation maturity model for SPI implementation in order to guide practitioners in assessing and improving their SPI implementation maturity. In the implementation component, the viewpoints and experiences of practitioners regarding SPI implementation are empirically explored and a model is developed in order to assist practitioners in effectively implementing SPI initiatives.

Fig. 6. SPI-IF assisting the design of SPI implementation initiatives. [The figure shows the three SPI-IF components feeding into the design of SPI implementation initiatives: the factors component (what factors play positive or negative roles in SPI implementation, i.e. CSFs and CBs), the assessment component (readiness for SPI implementation; software process assessment; selection of an improvement model/standard) and the implementation component (implementation phases; implementation of an improvement model/standard).]

In order to evaluate the application of the SPI-IF, a practical evaluation was undertaken, i.e. a case study. The case study method was used because the SPI-IF is most applicable to the real software industry environment. Three separate case studies were conducted at three different companies. The results of the case studies showed that the SPI-IF is significant not only in theoretical work but also in the real world environment. Despite all the differences, i.e. company type, application domain and CMM maturity levels, each of the three companies was able to successfully use the SPI-IF to assess its SPI implementation maturity. The participants recognised the SPI implementation issues that the SPI-IF identified for their companies, and they agreed with those issues. It was encouraging that all the participants wanted to use the SPI-IF in order to provide solutions for the identified SPI implementation issues. All the participants who used the SPI-IF were fully satisfied with the assessment results and the overall performance of the model. The three case studies successfully demonstrate the validity of the SPI-IF in the real world environment. The prime claim of this research is therefore that the SPI-IF provides a practical and effective way to design effective SPI implementation initiatives. However, one aspect that the participants considered important is the development of a complete tool which can be used to help SPI practitioners assess an organization's SPI implementation maturity. It was suggested that this tool should be capable of recording the results of the assessment, identifying weak and strong factors and generating different assessment documents. Although the participants wanted a tool to support all the activities of the SPI-IF, they were willing to use the SPI-IF without tool support, which shows the significance of the SPI-IF.

The evaluation results show that the SPI-IF has the potential to assist SPI practitioners in the design of effective SPI implementation initiatives. Thus, we recommend that organizations use the SPI-IF in order to effectively implement SPI initiatives.



Appendix A. Organizations covered in our literature review

• Advanced information services
• AVX Ltd.
• Boeing's Space Transportation Systems
• Bull HN
• Corning Information Services
• Eastman Kodak Comp.
• Fastrak Training Inc.
• High-Tech Measurement
• Hughes
• Lucent Technologies
• MITRE Corporation
• Master Systems
• NASA SEL
• Network Products
• Nokia
• Oerlikon Aerospace
• Ogden Air Logistics Centre
• Oklahoma City Air Logistics Centre
• Raytheon

Company  Scope           Age (yrs)
1        Australian      3
2        Multi-national  21–50
3        Multi-national  >50
4        Multi-national  11–20
5        Australian      6–10
6        Australian      21–50
7        Multi-national  21–50
8        Multi-national  >50
9        Multi-national  >50
10       Australian      >50
11       Multi-national  >50
12       Australian      <5
13       Multi-national  >50
14       Multi-national  11–20
15       Australian      21–50
16       Multi-national  21–50
17       Multi-national  11–20
18       Multi-national  >50
19       Australian      11–20
20       Australian      21–50
21       Multi-national  <5
22       Australian      11–20
23       Multi-national  6–10
24       Australian      <5
25       Australian      6–10
26       Australian      6–10
27       Australian      >50
28       Multi-national  >50
29       Multi-national  >50

Appendix A (continued)

• Rolls-Royce
• Sacramento Air Logistics Centre
• Schlumberger
• SEI
• Siemens
• SINTEF Telecom and Informatics
• Space Shuttle Software Project
• Sybase
• Tata Consulting Services
• Texas Instruments
• Telcordia Technologies
• Trident Data Systems
• University of Hertfordshire
• Xerox

Note: References to these organizations are available from the authors.

Appendix B. Participant company information

Company  Size      Software size  SPI in operation (yrs)
1        38        14             <1
2        >2000     DK             >5
3        >2000     101–500        >5
4        >2000     501–2000       1–2
5        <10       <10            >5
6        11–100    30             3–5
7        >2000     DK             >5
8        501–2000  26–100         >5
9        >2000     >2000          >5
10       101–500   11–25          3–5
11       >2000     >2000          3–5
12       <10       <10            1–2
13       >2000     DK             >5
14       >2000     >2000          3–5
15       >2000     101–500        1–2
16       >2000     >2000          >5
17       >2000     11–25          >5
18       >2000     101–500        >5
19       11–100    11–25          1–2
20       >2000     DK             >5
21       11–100    11–25          1–2
22       11–100    11–25          3–5
23       101–500   26–100         3–5
24       <10       <10            3–5
25       >2000     101–500        >5
26       11–100    26–100         >5
27       101–500   <10            1–2
28       >2000     11–25          >5
29       >2000     501–2000       >5

Appendix C. Assessment instrument (source: Daskalantonakis, 1994)

Each key activity is evaluated along three dimensions — approach, deployment and results — against the following score keys:

Poor (0)
• Approach: no management recognition of need; no organizational ability; no organizational commitment; practice not evident.
• Deployment: no part of the organization uses the practice; no part of the organization shows interest.
• Results: ineffective.

Weak (2)
• Approach: management begins to recognize need; support items for the practice start to be created; a few parts of the organization are able to implement the practice.
• Deployment: fragmented use; inconsistent use; deployed in some parts of the organization; limited to monitoring/verification of use.
• Results: spotty results; inconsistent results; some evidence of effectiveness for some parts of the organization.

Fair (4)
• Approach: wide but not complete commitment by management; road map for practice implementation defined; several supporting items for the practice in place.
• Deployment: less fragmented use; some consistency in use; deployed in some major parts of the organization; monitoring/verification of use for several parts of the organization.
• Results: consistent and positive results for several parts of the organization; inconsistent results for other parts of the organization.

Marginally qualified (6)
• Approach: some management commitment; some management becomes proactive; practice implementation well under way across parts of the organization; supporting items in place.
• Deployment: deployed in some parts of the organization; mostly consistent use across many parts of the organization; monitoring/verification of use for many parts of the organization.
• Results: positive measurable results in most parts of the organization; consistently positive results over time across many parts of the organization.

Qualified (8)
• Approach: total management commitment; majority of management is proactive; practice established as an integral part of the process; supporting items encourage and facilitate the use of the practice.
• Deployment: deployed in almost all parts of the organization; consistent use across almost all parts of the organization; monitoring/verification of use for almost all parts of the organization.
• Results: positive measurable results in almost all parts of the organization; consistently positive results over time across almost all parts of the organization.

Outstanding (10)
• Approach: management provides zealous leadership and commitment; organizational excellence in the practice recognized even outside the company.
• Deployment: pervasive and consistent deployment across all parts of the organization; consistent use over time across all parts of the organization; monitoring/verification for all parts of the organization.
• Results: requirements exceeded; consistently world-class results; counsel sought by others.
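Applying this instrument can be sketched in a few lines: each key activity is rated on the approach, deployment and results dimensions using the six score keys above. Averaging the three dimension scores into one activity score is our assumption for illustration, not a rule stated by the instrument itself.

```python
# The Daskalantonakis (1994) score keys, 0-10 in steps of 2.
SCORE_KEYS = {"poor": 0, "weak": 2, "fair": 4,
              "marginally qualified": 6, "qualified": 8, "outstanding": 10}

def activity_score(approach, deployment, results):
    """Combine the three dimension ratings of one key activity into a
    single 0-10 score (simple average -- an illustrative assumption)."""
    dims = (SCORE_KEYS[approach], SCORE_KEYS[deployment], SCORE_KEYS[results])
    return sum(dims) / len(dims)

score = activity_score("qualified", "fair", "marginally qualified")
print(score)  # (8 + 4 + 6) / 3 = 6.0
```

Under this reading, an activity rated qualified/fair/marginally qualified scores 6.0 — below the average-of-7 threshold the IMM assessment uses for attaining a maturity level.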


References

Baddoo, N., 2001. Motivators and de-motivators in software process improvement: an empirical study. PhD thesis, University of Hertfordshire, UK.

Baddoo, N., Hall, T., 2003. De-motivators of software process improvement: an analysis of practitioners' views. Journal of Systems and Software 66 (1), 23–33.

Beecham, S., Hall, T., 2003. Expert panel questionnaire: validating a requirements process improvement model. Available from <http://homepages.feis.herts.ac.uk/~pppgroup/requirements_cmm.htm>, site visited May 2003.

Bullen, C.V., Rockart, J.F., 1981. A primer on critical success factors. Centre for Information Systems Research, Sloan School of Management, Working Paper No. 69.

Burnard, P., 1991. A method of analysing interview transcripts in qualitative research. Nurse Education Today (11), 461–466.

Butler, K., 1997. Process lessons learned while reaching Level 4. CrossTalk (May), 1–4.

Coolican, H., 1999. Research Methods and Statistics in Psychology. Hodder and Stoughton, London.

Cooper, D., Schindler, P., 2001. Business Research Methods, seventh ed. McGraw-Hill.

Daskalantonakis, M.K., 1994. Achieving higher SEI levels. IEEE Software 11 (4), 17–24.

Diaz, M., Sligo, J., 1997. How software process improvement helped Motorola. IEEE Software 14 (5), 75–81.

El-Emam, K., Fusaro, P., Smith, B., 1999. Success factors and barriers for software process improvement. In: Better Software Practice for Business Benefit: Principles and Experience. IEEE Computer Society.

Fitzgerald, B., O'Kane, T., 1999. A longitudinal study of software process improvement. IEEE Software (May/June), 37–45.

Florence, A., 2001. Lessons learned in attempting to achieve software CMM Level 4. CrossTalk (August), 29–30.

Fowler, P., Middlecoat, B., Yo, S., 1999. Lessons learned collaborating on a process for SPI at Xerox. Technical report, CMU/SEI-99-TR-006.

Goldenson, D.R., Herbsleb, J.D., 1995. After the appraisal: a systematic survey of process improvement, its benefits, and factors that influence success. SEI, CMU/SEI-95-TR-009.

Hall, T., Wilson, D., 1997. Views of software quality: a field report. IEEE Proceedings on Software Engineering 144 (2).


Hall, T., Wilson, D., 1998. Perceptions of software quality: a pilot study. Software Quality Journal (7), 67–75.

Humphrey, W., 1989. Managing the Software Process. Addison-Wesley.

Humphrey, W., 1995. A Discipline for Software Engineering. Addison-Wesley.

Huotari, M.L., Wilson, T.D., 2001. Determining organizational information needs: the critical success factors approach. Information Research 6 (3).

ISO/DIS-9241-11, 1994. International Standards Organization, ISO DIS 9241-11: Guidance on Usability.

Jennifer, G., Chuck, M., 1997. The IDEAL model: a practical guide for improvement. Available from <http://www.sei.cmu.edu/ideal/ideal.bridge.html>, site visited 22-5-2003.

Johnson, A., 1994. Software process improvement experience in the DP/MIS function: experience report. In: IEEE International Conference on Software Engineering, ICSE, pp. 323–329.

Kaltio, T., Kinnula, A., 2000. Deploying the defined software process. Software Process—Improvement and Practice (5), 65–83.

Kautz, K., Nielsen, P.A., 2000. Implementing software process improvement: two cases of technology transfer. In: Proceedings of the 33rd Hawaii Conference on System Sciences.

Khandelwal, V., Ferguson, J., 1999. Critical success factors and the growth of IT in selected geographic regions. In: 32nd Hawaii International Conference on System Sciences.

Khandelwal, V., Natarajan, R., 2002. Quality IT management in Australia: critical success factors for 2002. Technical report No. CIT/1/2002, University of Western Sydney.

Leedy, P., Ormrod, J., 2001. Practical Research: Planning and Design. Prentice Hall, New Jersey.

Macfarlane, M., 1996. Eating the elephant one bite at a time: effective implementation of ISO 9001/TickIT. Executive Digest—The ISO 9000 Quality Management System (August).

Moitra, D., 1998. Managing change for SPI initiatives: a practical experience-based approach. Software Process Improvement and Practice (4), 199–207.

Niazi, M., Wilson, D., 2003. A maturity model for the implementation of software process improvement. In: International Conference on Software Engineering Research and Practice (SERP03), pp. 650–655.

Niazi, M., Wilson, D., Zowghi, D., 2003. Critical success factors and critical barriers for software process improvement: an analysis of literature. In: Proceedings of the Australasian Conference on Information Systems (ACIS03), Australia.

Niazi, M., Wilson, D., Zowghi, D., 2003. A framework for guiding the design of effective implementation strategies for software process improvement. In: International Conference on Knowledge Engineering and Software Engineering (SEKE 03), USA, pp. 366–371.

Niazi, M., Wilson, D., Zowghi, D., 2003. A model for the implementation of software process improvement: a pilot study. In: Proceedings of the International Conference on Software Quality (QSIC03), pp. 196–203.

Niazi, M., Wilson, D., Zowghi, D., 2004. Critical barriers for SPI implementation: an empirical study. In: IASTED International Conference on Software Engineering (SE 2004), Austria, pp. 389–395.

Niazi, M., Wilson, D., Zowghi, D., in press. A maturity model for the implementation of software process improvement: an empirical study. Journal of Systems and Software.

Niazi, M., Wilson, D., Zowghi, D., Wong, B., 2004. A model for the implementation of software process improvement: an empirical study. In: Product Focused Software Process Improvement (Profes 2004), Japan, pp. 1–16.

Paulish, D., Carleton, A., 1994. Case studies of software process improvement measurement. IEEE Computer 27 (9), 50–59.

Paulk, M., 1999. Practices of high maturity organizations. SEPG Conference, 8–11.

Paulk, M., Curtis, B., Chrissis, M., Weber, C., 1993. Capability Maturity Model for Software, Version 1.1. CMU/SEI-93-TR-24, Software Engineering Institute, USA.

Pellow, A., Wilson, T.D., 1993. The management information requirements of heads of university departments: a critical success factors approach. Journal of Information Science (19), 425–437.

Pitterman, B., 2000. Telcordia Technologies: the journey to high maturity. IEEE Software (July/August), 89–96.

Quenn, E., 1997. My boss needs to hear this: how management can support SPI. CrossTalk (May).

Rainer, A., Hall, T., 2002. Key success factors for implementing software process improvement: a maturity-based analysis. Journal of Systems and Software (62), 71–84.

Rockart, J.F., 1979. Chief executives define their own data needs. Harvard Business Review (2), 81–93.

Seaman, C., 1999. Qualitative methods in empirical studies of software engineering. IEEE Transactions on Software Engineering 25 (4), 557–572.

SEI, 2002. Capability Maturity Model Integration (CMMI), Version 1.1. SEI, CMU/SEI-2002-TR-029.

SEI, 2002. Process maturity profile of the software community. Software Engineering Institute, Carnegie Mellon University.

SEI, 2004. Process Maturity Profile. Software Engineering Institute, Carnegie Mellon University.

Somers, T., Nelson, K., 2001. The impact of critical success factors across the stages of Enterprise Resource Planning implementations. In: 34th Hawaii International Conference on System Sciences.

Sommerville, I., Sawyer, P., Viller, S., 1997. Requirements process improvement through the phased introduction of good practice. Software Process—Improvement and Practice (3), 19–34.

Sommerville, I., Sawyer, P., Viller, S., 1998. Improving the requirements process. In: Fourth International Workshop on Requirements Engineering: Foundation of Software Quality, pp. 71–84.

Stelzer, D., Werner, M., 1999. Success factors of organizational change in software process improvement. Software Process Improvement and Practice 4 (4).

Tyran, C., George, J., 1993. The implementation of expert systems: a survey of successful implementation. Database (Winter), 5–15.

Willis, R.R., Rova, R.M., Scott, M.D., Johnson, M.I., Ryskowski, J.F., Moon, J.A., Shumate, K.C., Winfield, T.O., 1998. Hughes Aircraft's widespread deployment of a continuously improving software process. Technical report, CMU/SEI-98-TR-006.

Wilson, D., Hall, T., 1998. Perceptions of software quality: a pilot study. Software Quality Journal (7), 67–75.

Wohlwend, H., Rosenbaum, S., 1994. Schlumberger's Software Improvement Program. IEEE Transactions on Software Engineering 20 (11), 833–839.

Yin, R.K., 1993. Applications of Case Study Research. Sage Publications.

Zahran, S., 1998. Software Process Improvement—Practical Guidelines for Business Success. Addison-Wesley.

Zubrow, D., Hayes, W., Siegel, J., Goldenson, D., 1994. Maturity Questionnaire. CMU/SEI-94-SR-7.