
Submitted for publication

The Design of Focus Area Maturity Models

Marlies van Steenbergen1, Rik Bos2, Sjaak Brinkkemper2, Inge van de Weerd2, Willem Bekkers2

1 Sogeti Netherlands, Wildenborch 3, 1112 XB Diemen, The Netherlands
[email protected]

2 Department of Information and Computing Sciences, Utrecht University, Padualaan 14, 3584 CH Utrecht, The Netherlands
{R.Bos, S.Brinkkemper, I.vandeWeerd, Bekkers}@cs.uu.nl

Abstract. Maturity models are a well-known instrument to support the improvement of functional domains in IS, like software development or testing. While maturity models may share a common structure, they have to be developed anew for each functional domain. Focus area maturity models are distinguished from fixed-level maturity models, like CMM, in that they are especially suited to the incremental improvement of functional domains. In this paper we present a generic method for developing focus area maturity models based on both extensive industrial experience and scientific investigation. In doing so, we show two examples of focus area maturity models, one for enterprise architecture and one for software product management. We used a design science research process to develop the method presented.

Keywords: Design Research Methodology, Design Science, Enterprise Architecture, Software Product Management, Maturity Model, Maturity Matrix, Method Engineering.

1 Introduction

Capability development in functional domains in IS, like enterprise architecture or software product management, is a complex issue. Decisions have to be made with regard to how to develop new processes, deliverables and competences. As it is not possible to implement a fully mature function from scratch, functional domains are developed incrementally, improving them step by step. Maturity models are a means to support such incremental development, as they distinguish different maturity levels that an organization successively progresses through. As such they can be used as a guideline for balanced incremental improvement of a functional domain.

Numerous maturity models for various functional domains have been developed over the past years [1, 2]. Most of these maturity models are so-called fixed-level models, like CMM [3]. Fixed-level maturity models distinguish a fixed number, usually around five, of generic maturity levels. Each maturity level is associated with a number of processes that have to be implemented. Fixed-level models are well-suited to benchmarking, placing an organization at a maturity level by assessing the extent to which the associated processes are implemented, but they are less suited to incremental improvement, as they cannot express interdependencies between the processes making up the maturity levels [1, 4]. There is a need for models that provide more guidance in incrementally improving a domain.

More suited to incremental improvement of functional domains are the so-called focus area maturity models [5]. Focus area maturity models are based on the concept of a number of focus areas that have to be developed to achieve maturity in a functional domain. Examples of focus areas are the development and maintenance of certain processes or deliverables, alignment with other disciplines, and training of certain competences. The identification of the exact focus areas depends on the functional domain. A focus area maturity model defines for each of its focus areas a series of development steps in the form of progressively mature capabilities. These capabilities are specific to the focus areas identified. This is a departure from the fixed number of generic maturity levels that the fixed-level maturity models are based on. The variation in levels that can be noticed between different fixed-level maturity models [4] suggests that the assumption of the existence of generic maturity levels is an oversimplification. We share the view that different dimensions have different maturity levels [2], taking it even one step further and claiming that each focus area has its own number and type of maturity levels. By juxtaposing all capabilities of all focus areas relative to each other, a balanced, incremental development path, taking all focus areas into account, is defined. This juxtaposition of capabilities is done by positioning the capabilities in a matrix as shown in figure 1, which gives an example of a focus area maturity model in the functional domain of enterprise architecture.

Fig. 1. A focus area maturity model for the functional domain of enterprise architecture.

[Matrix not reproduced. The matrix plots 18 focus areas (rows) against maturity scales 0-13 (columns); the position of each capability letter in a row indicates the scale at which that capability should be achieved. Focus areas and their capabilities: Development of architecture (A-C), Use of architecture (A-C), Alignment with business (A-C), Alignment with the development process (A-C), Alignment with operations (A-C), Relationship to the as-is state (A-B), Roles and responsibilities (A-C), Coordination of developments (A-B), Monitoring (A-D), Quality management (A-C), Maintenance of the architectural process (A-C), Maintenance of architectural deliverables (A-C), Commitment and motivation (A-C), Architectural roles and training (A-D), Use of an architectural method (A-C), Consultation (A-C), Architectural tools (A-C), Budgeting and planning (A-C).]
The focus areas are given in the left column; the capabilities per focus area are depicted by the letters A to D, which stand for progressively mature capabilities. The actual maturity of a specific organization can be depicted by coloring, per row, the cells up to the first capability that has not yet been achieved. The rightmost column that is completely colored indicates the maturity scale of the organization assessed. Thus, the organization in figure 1 is at maturity scale 1, as capability A of Use of architecture in column 2 has not been achieved. We will explain the model in more detail in section 3.

The focus area maturity model makes it possible to distinguish more than five overall stages of maturity. This results in smaller steps between the stages, providing more detailed guidance to setting priorities in capability development. This makes this kind of model better suited to expressing the sometimes complex combinations of different factors that determine the effectiveness of a function.

The focus area maturity model originated in the domain of software testing [6]. Subsequently, focus area maturity models were developed for the domains of enterprise architecture and software product management. From these applications we derived a generic development method for focus area maturity models for other functional domains in IS. We did so by applying the design science research methodology for information systems research introduced by Peffers et al. [7].

The research contribution of this paper is the presentation of a development method for focus area maturity models. In terms of the research contributions guideline of Hevner et al. this is a contribution to the design foundations [8]. The practical relevance lies in the fact that we provide practitioners and researchers with a method to develop new focus area maturity models that may support practitioners in developing IS functional domains more effectively.

In section 2 we present our research approach. Section 3 discusses the application of the focus area maturity model in two fields: enterprise architecture and software product management. In section 4 the maturity matrix is mathematically formalized. A development method for focus area models is presented in section 5. In section 6, we discuss conclusions and suggestions for further research.

2 Research Approach

The objective of our research is to define a development method for focus area maturity models that aids researchers and practitioners in developing a maturity model for incremental improvement of a specific IS functional domain. Peffers et al. distinguish four different entry points to the design science research process: problem-centered, objective-centered, design and development-centered, and client/context-initiated approaches [7]. As our purpose is to contribute to the improvement of the IS function, we applied an objective-centered approach: "an objective-centered solution (...) could be triggered by an industry or research need that can be addressed by developing an artifact". We use the design science process to present our research approach:

1. Problem identification and motivation

The problem motivating our research is how to develop capabilities in a given functional domain in an incremental, balanced manner. In their quests for continuous improvement, practitioners and researchers are looking for well-founded development paths [2, 9]. As argued in the introduction, focus area maturity models provide such development paths.

2. Define the objectives for a solution

The objective of our solution, a focus area maturity model development method, is to provide a means to develop a step-by-step improvement approach for a specific functional domain. This development method must be well-founded and enable practitioners and scientists to design an optimal and feasible improvement path to a fully mature function.

3. Design and development

The focus area maturity model development method is derived from both literature review and practical experience. From the literature review we defined a number of generic phases in developing maturity models. As we only found development methods for fixed-level maturity models, however, we cannot solely build on previous research for the detailing of these phases. We therefore also draw on the lessons learned from the development of focus area maturity models in the fields of enterprise architecture and software product management.

4. Demonstration

The use of the development method is initially demonstrated by retrospectively applying it to two cases. Further demonstration must take place by applying it to a new field. This has yet to be done.

5. Evaluation

The development method is evaluated by applying the requirements for the development of maturity models defined by Becker et al. [9], which were derived from the seven guidelines presented by Hevner et al. [8].

6. Communication

Besides communication of the development method in the scientific community by publication in conferences and journals, the method will be published in practitioners' forums.

3 Focus Area Maturity Models

The core of the focus area maturity model consists of the focus areas. Each focus area can be divided into a number of capabilities. By positioning these capabilities against each other in a matrix, as shown in figure 1, the model presents the order in which the different aspects of a functional domain should be addressed and implemented. A functional domain is the whole of activities, responsibilities and actors involved in the fulfillment of a well-defined function within an organization. We define a focus area as an aspect that has to be implemented to a certain extent for a functional domain to be effective. The collection of focus areas provides a complete and mutually disjoint coverage of the functional domain. With each focus area a number of capabilities are associated, depicted in the matrix by capital letters. A capability is here defined as an ability to achieve a predefined goal that is associated with a certain maturity level. For example, in the enterprise architecture matrix in figure 1, the focus area Use of architecture has three capabilities: A, architecture used informatively; B, architecture used to steer content; and C, architecture integrated into the organization, representing a progression in maturity. The position of the letters in the matrix indicates the order in which the capabilities of the different focus areas must be addressed and implemented to build an architecture practice in a balanced manner. With the matrix we can define both intra-process dependencies between capabilities, where one capability must be implemented after another capability in the same focus area, and inter-process dependencies, where a capability must be implemented after a capability in another focus area.

The fourteen columns in the enterprise architecture matrix of figure 1 define progressive overall maturity scales, scale 0 being the lowest and scale 13 the highest scale achievable. An organization is said to be at the maturity scale represented by the rightmost column for which the organization has achieved all focus area capabilities positioned in that column and in all columns to its left. The organization depicted in figure 1 has already implemented some capabilities, indicated by the colored cells. It shows an imbalance, however, in that some focus areas, like Alignment with the development process, are quite advanced, while others, like Use of architecture, are not yet developed at all. Thus, despite the development of some of the focus areas, on the whole the organization in figure 1 is still only at scale 1. To achieve a balanced enterprise architecture function, its first step should be to develop the focus area Use of architecture to its first capability (the A in column 2), followed by the first capability of Monitoring (the A in column 3). By implementing these capabilities the organization will progress from maturity scale 1 to scale 3.
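As an illustration of these mechanics, the following sketch computes the maturity scale and the next capabilities to address from a matrix profile. The data structure, column numbers and function names are illustrative assumptions for a small excerpt of such a matrix, not the authors' implementation or tooling.

```python
# Illustrative sketch (not from the paper): computing the maturity scale and the
# next capabilities to address from a matrix profile. Column numbers are assumed
# for a small excerpt of a DyAMM-style matrix.

# (focus area, capability letter) -> column in which the capability is positioned
positions = {
    ("Development of architecture", "A"): 1,
    ("Use of architecture", "A"): 2,
    ("Monitoring", "A"): 3,
    ("Use of architecture", "B"): 4,
}

def maturity_scale(achieved):
    """Rightmost column such that it and all columns to its left are fully achieved."""
    scale = 0
    for col in range(1, max(positions.values()) + 1):
        required = {cap for cap, c in positions.items() if c == col}
        if required <= achieved:
            scale = col
        else:
            break
    return scale

def next_capabilities(achieved):
    """Unachieved capabilities in the leftmost column that is not yet complete."""
    missing = {cap: col for cap, col in positions.items() if cap not in achieved}
    if not missing:
        return set()
    first_gap = min(missing.values())
    return {cap for cap, col in missing.items() if col == first_gap}

achieved = {("Development of architecture", "A")}
print(maturity_scale(achieved))     # 1: capability A of Use of architecture (column 2) is missing
print(next_capabilities(achieved))  # {('Use of architecture', 'A')}
```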

While the focus area oriented model originates from the field of testing [6], over the last seven years we have applied it to the IS fields of enterprise architecture and software product management. We discuss these applications in the next sections.

3.1 The DyA Architecture Maturity Matrix

The DyA architecture maturity matrix (DyAMM) is the application of the focus area maturity model in the field of enterprise architecture. The DyAMM was developed as part of the DyA program, in which an approach to enterprise architecture called Dynamic Architecture (DyA) was developed that focuses on goal-oriented, evolutionary development of the architectural function [10, 11]. The DyAMM was developed in 2002 and has been applied to over 50 organizations since. In 2004 it was slightly adjusted based on the first few applications. The resulting version was qualitatively validated in a case study [5]. A number of organizations use the DyAMM to give direction to an improvement program spanning several years, performing a yearly assessment to monitor progress. In 2009 a quantitative analysis of the DyAMM was performed with a dataset of 56 cases [12].

The core of the DyAMM is the matrix depicted in figure 1, with each capability associated with one to four yes/no assessment questions to assess its implementation and with one or more improvement actions that may support achieving it. Maturity assessment is performed by answering the yes/no questions. Only if all questions associated with a capability can be answered affirmatively can that capability be said to be achieved. Table 1 shows as an example the questions associated with capability A of the focus area Use of architecture.

Table 1. Questions to measure maturity level A of focus area Use of architecture.

Nr. | Question
9 | Is there an architecture that management recognizes as such?
10 | Does the architecture give a clear indication of what the organization wants?
11 | Is the architecture accessible to all employees?

In all, there are 137 assessment questions associated with the DyAMM. The primary use of the DyAMM is as an assessment instrument applied by independent assessors. The assessors fill in the matrix by answering all 137 questions, basing their answers on interviews with relevant stakeholders and on studying documentation. In addition, the DyAMM is used as a self-assessment to be completed by individuals for their own organization. Architects can answer the 137 questions for themselves, which leads to a matrix profile like the example in figure 1.
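The following sketch illustrates this assessment logic: a capability counts as achieved only if every associated yes/no question is answered with yes. The question numbers, answers and data structures are hypothetical and only loosely modeled on Table 1; they are not the DyAMM instrument itself.

```python
# Sketch of the assessment logic: a capability is achieved only if all of its
# associated yes/no questions are answered affirmatively. Data are hypothetical.

questions_per_capability = {
    ("Use of architecture", "A"): [9, 10, 11],     # cf. Table 1
    ("Development of architecture", "A"): [1, 2],  # assumed question numbers
}
answers = {1: True, 2: True, 9: True, 10: True, 11: False}  # question nr -> yes/no

def achieved_capabilities(questions_per_capability, answers):
    return {
        cap
        for cap, nrs in questions_per_capability.items()
        if all(answers.get(nr, False) for nr in nrs)
    }

print(achieved_capabilities(questions_per_capability, answers))
# {('Development of architecture', 'A')} -- the 'no' on question 11 blocks Use of architecture A
```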

3.2 The SPM Maturity Matrix

Many software companies have made a shift from developing custom-made software to developing product software [13]. This means that many internal processes in these companies need to be adapted. Instead of developing a customized product for one customer, a standard product is developed for a whole range of customers. To cope with this, product software companies need to introduce the right software product management (SPM) processes.

In [14], the reference framework for SPM was proposed. This framework presents 14 SPM processes divided over four business functions: Portfolio management, Product roadmapping, Release planning and Requirements management. In addition, the SPM maturity matrix was developed in order to support local analysis and incremental improvement of SPM processes [15, 16]. The SPM matrix has been validated in approximately 15 case studies in Dutch companies of varying sizes. In addition, a survey was conducted to validate the positioning of the capabilities.

Similar to the DyAMM, the SPM maturity matrix consists of focus areas and capabilities. The focus areas correspond directly to the SPM processes in the reference framework published earlier. In addition, the focus areas are divided into four groups, corresponding to the four business functions mentioned in the preceding paragraph. In Figure 2, the SPM maturity matrix is presented.

[Matrix not reproduced. The matrix plots 16 focus areas (rows), grouped by business function, against maturity scales 0-12 (columns); the position of each capability letter in a row indicates the scale at which that capability should be achieved. Requirements management: Requirements gathering (A-F), Requirements identification (A-E), Requirements organizing (A-C). Release planning: Requirements prioritization (A-E), Requirements selection (A-D), Release definition (A-E), Release validation (A-D), Launch preparation (A-D), Scope change management (A-D). Product roadmapping: Theme identification (A-B), Core asset identification (A-C), Roadmap construction (A-F). Portfolio management: Market trend identification (A-C), Partnering & contracting (A-D), Product lifecycle management (A-C), Product line identification (A-B).]

Fig. 2. The maturity matrix for Software Product Management

The letters A to F represent the capabilities. Each focus area has its own unique capabilities and the number of capabilities within a focus area varies from two (A-B) to six (A-F). Each capability has five attributes:

1. Name. A name describing the capability in a few words.
2. Goal. The goal describes what purpose the capability serves and indicates the advantage of executing the capability.
3. Action. The action describes what must be done in order to meet the capability.
4. Prerequisite(s). Some capabilities require that one or more other capabilities be achieved first. This relation is described by listing all the capabilities that have to be implemented first.
5. Reference(s). This optional attribute describes related literature which can aid in understanding and implementing the capability, thus having a supporting role.

In Table 2, we elaborate on the capability attributes by providing an example: capability C of the focus area Requirements organizing.

Table 2. Capability C of the focus area Requirements Organizing (RO:C)

Attribute | Content
Name | Requirement dependency linking
Goal | The existence of requirements interdependencies means that requirements interact with and affect each other. Requirement dependency linking prevents problems that result from these interdependencies, and therewith enables better planning of the development process.
Action | Dependencies between market and product requirements are determined and registered. A dependency exists when a requirement demands a specific action of another requirement. E.g. a requirement demands that another requirement be implemented too, or that another requirement is not implemented in case of conflicting requirements. The linkage can be supported by using advanced techniques, such as linguistic engineering.
Prerequisite(s) | Requirements Gathering: A (RG:A)
Reference(s) | Dahlstedt & Persson (2003)

For a capability to be achieved it must be institutionalized and documented.
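The five attributes lend themselves to a simple record structure. The sketch below encodes the example of Table 2 in such a structure; the class and field names are our own illustration and are not taken from the SPM maturity matrix documentation [15, 16].

```python
# Illustrative record type for the five capability attributes (names assumed).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Capability:
    name: str                                                # short description
    goal: str                                                # purpose and advantage
    action: str                                              # what must be done to meet the capability
    prerequisites: List[str] = field(default_factory=list)   # capabilities to implement first
    references: List[str] = field(default_factory=list)      # optional supporting literature

# Capability C of the focus area Requirements organizing (RO:C), following Table 2.
ro_c = Capability(
    name="Requirement dependency linking",
    goal="Prevent problems resulting from requirements interdependencies and "
         "enable better planning of the development process.",
    action="Determine and register dependencies between market and product requirements.",
    prerequisites=["RG:A"],
    references=["Dahlstedt & Persson (2003)"],
)
```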

4 Mathematical Formalization

In order to provide rigorous fundamentals for focus area maturity models, we abstract the commonalities of the two cases into a mathematical model. To introduce this model, we first define the fundamental concepts underlying the maturity matrices. For convenience we will refer to the two types of matrices as follows: the EA-matrix refers to the DyA architecture maturity matrix, and the SPM-matrix refers to the software product management maturity matrix.

Both types of matrices use the concept of focus area (the rows of the matrices) for which we introduce the set F of focus areas. The number of focus areas within each matrix differs slightly: 18 for the EA-matrix and 16 for the SPM-matrix.

Another fundamental concept comes from the assessments organizations have to pass in order to reach a certain level for a specific focus area. We therefore introduce a totally ordered set $(L, \leq_L)$ of levels and, since an assessment is specific to a pair consisting of a focus area and a level, we are interested in the Cartesian product $F \times L$. We abstract away from the 'assessment' and concentrate on the set $F \times L$. Since not every element of $F$ needs to have the same number of levels, this Cartesian product is in general a little too large. For the general definition of a maturity matrix we allow subsets $C$ of $F \times L$. In the two example matrices, $C$ denotes the set of capabilities and the pairs $(f, l) \in C$ correspond to the cells in the matrix that are filled with a capital letter. The columns in the example matrices are the final concept we need; they are formally described by a specific mapping $S$ from $C$ to the natural numbers. This puts us in a position to give the following

Definition

A maturity matrix consists of:
1. A triple $(F, (L, \leq_L), (C, \leq_C))$, where $F$ is a set, $(L, \leq_L)$ is a totally ordered set, and $(C, \leq_C)$ is a partially ordered set with $C \subseteq F \times L$. Moreover, the ordering on $C$ respects the ordering on $L$ in the sense that if $c_1 = (f, l_1), c_2 = (f, l_2) \in C$ and $l_1 \leq_L l_2$, then $c_1 \leq_C c_2$.
2. An order-preserving mapping $S\colon C \rightarrow \mathbb{N}$ with $\mathrm{Im}(S) = \{1, \ldots, m\}$ for some $m \in \mathbb{N}$.

As an example take the SPM-matrix, where $F$ is the set of 16 focus areas, $L = \{A, B, \ldots\}$ is the set of 6 levels (so $F \times L$ consists of 96 elements), and $L$ is totally ordered in the obvious way ($A < B < \ldots$). Furthermore, $C$ is the set of 63 capabilities, consisting of specific pairs $(f, l)$ with $f \in F$, $l \in L$, and $C$ is partially ordered by the intra- and inter-process capability dependencies, e.g. relations of the form $(f, A) < (f, B)$ (intra-process) and relations of the form $(f_1, l_1) < (f_2, l_2)$ where $f_1 \neq f_2$ (inter-process). Finally, the mapping $S$ assigns every capability to one of the numbers 1 through 12 while preserving the order (so if $c_1 \leq_C c_2$, then $S(c_1) \leq S(c_2)$).

The maturity scale of an organization can now be defined. Since an organization that has just started the development of a functional domain could very well have none of the capabilities defined for this domain, it makes sense to allow a zero scale. Even if it has acquired some capabilities of scale 1, but not all of them, we still define its scale as zero. Only if it has acquired all capabilities of scale 1 (i.e. all capabilities of the set $S^{-1}(1)$) will its scale be 1 or higher.

In general, if the set of capabilities acquired by the organization is denoted by $C_A$ (a subset of $C$), then the scale of that organization is the maximum value $s$ for which $S^{-1}(\{1, \ldots, s\}) \subseteq C_A$. Note that if we substitute $s = 0$, the set $S^{-1}(\{1, \ldots, s\}) = S^{-1}(\emptyset) = \emptyset$ is a subset of $C_A$, so this definition also holds if $C_A$ is empty or if $C_A$ does not contain all capabilities with scale 1 (in both cases the maturity scale of the organization will be 0).
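For concreteness, this definition of the maturity scale can be transcribed directly into code. The sketch below is our own illustration, not part of the formalization; the variable names mirror the notation above.

```python
# Sketch: maturity scale as the maximum s with S^{-1}({1,...,s}) a subset of C_A.
def maturity_scale(S, C_A, m):
    """S maps each capability to its column in {1,...,m}; C_A is the set of acquired capabilities."""
    scale = 0  # s = 0 always satisfies the condition, since S^{-1}(empty set) is empty
    for s in range(1, m + 1):
        if {c for c, col in S.items() if col <= s} <= C_A:
            scale = s
        else:
            break
    return scale

# Tiny example: two capabilities in column 1, one in column 2.
S = {("f1", "A"): 1, ("f2", "A"): 1, ("f1", "B"): 2}
print(maturity_scale(S, {("f1", "A")}, m=2))               # 0
print(maturity_scale(S, {("f1", "A"), ("f2", "A")}, m=2))  # 1
```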

5 Developing a Focus Area Maturity Model

In this section we define a method for the development of focus area maturity models. We do so by drawing on literature review and by generalizing the lessons learned from the applications to the fields of enterprise architecture and software product management. First, we provide an overview of existing approaches on developing maturity models. From this we derive four generic phases in developing maturity models. As the existing approaches all deal with fixed-level maturity models, we need to elaborate the generic phases for developing focus area maturity models.

5.1 Existing Maturity Model Development Approaches

Several researchers have described methods or procedures on how to create maturity models. For example, De Bruin et al. have investigated several maturity models in different domains [1]. Based on their literature research, they identified six general phases that are part of the process of developing a maturity model: 1) Scope: deciding on the focus and the stakeholders; 2) Design: deciding on the architecture of the model, the type of audience and respondents of the model, and the method and driver of application; 3) Populate: identifying “what needs to be measured” and “how this can be measured”. Domain components are identified and defined, and the method of measurement is determined; 4) Test: the construct of the maturity model and its instruments are tested for relevance and rigor; 5) Deploy: the model is being made available for usage and the model’s generalizability is verified; 6) Maintain: some sort of repository is kept in order to support model evolution and development.

A more recent method is developed by Mettler and Rohner [2]. They propose “a first design proposition for situational maturity models”. The design exemplar they describe in order to explain this design proposition consists of the following steps: 1) problem identification and motivation; 2) objectives of the solution; 3) design and development. This last phase is the most elaborated. First, a basic design for the maturity model is developed. Then, the maturity levels are identified and specified and the configuration parameters determined. Finally a proof of concept is delivered.

Another method is published by Becker et al. [9]. By proposing a number of requirements concerning the development of maturity models and comparing these to existing maturity models, they deduce a "generic and consolidated procedure model for the design of maturity models". The procedure model consists of the steps: 1) problem definition; 2) comparison of existing maturity models; 3) determination of development strategy; 4) iterative maturity model development; 5) conception of transfer and evaluation; 6) implementation of transfer data; and 7) evaluation.

Finally, Maier et al. [4] propose a "practitioner guidance" that supports developing and applying "maturity grids to assess organizational capabilities" [4]. They propose the following phases: 1) Planning: the aim, purpose, requirements, scope and target audience of the maturity model are identified; 2) Development: the different parts of the maturity model are defined, which are the process areas, the maturity levels, the cell descriptions, and the administration mechanism. In addition, the role of the facilitator is elaborated here; 3) Evaluation: the model is validated, verified and, if necessary, iteratively refined; 4) Maintenance: changes to process areas and cell descriptions must be properly evaluated and documented.

We can identify four common phases in these development methods (Table 3): a scoping phase in which the purpose and scope of the maturity model are defined; the design of the model; the development of the assessment instrument; and an implementation and exploitation phase in which the model is put to use and subsequently exploited. Evaluation is not included as a common process phase, as it is considered an integral part of each of the other phases.

Table 3. Maturity model development methods compared

Common process phase | De Bruin et al. | Mettler and Rohner | Becker et al. | Maier et al.
Scope | Scope | Problem identification and motivation; Objectives of the solution | Problem definition; Comparison of existing maturity models | Planning
Design model | Design; Populate - components | Design and development | Determination of development strategy; Iterative maturity model development | Development
Develop instrument | Populate - measurements; Test | | Conception of transfer and evaluation | Evaluation
Implement & Exploit | Deploy; Maintain | | Implementation of transfer data; Evaluation | Maintenance

We recognize these phases also in developing a focus area maturity model. However, the detailing of the phases is specific to focus area models and cannot be derived from the existing methods.

5.2 Development Method for Focus Area Maturity Models

In this section we elaborate the common process phases into steps for developing focus area maturity models. The method is depicted graphically in figure 3. We use the notation presented by [17], which is based on standard UML conventions, with some minor adjustments.

Scope

Step 1: Identify and scope the functional domain. A focus area maturity model can be developed for any functional domain. However, in order to develop a useful model, the domain must be scoped properly. This means deciding on what to include and exclude. In this phase it is also important to identify existing maturity models for the same or similar domains that may be used as a starting point for further development [9].

In DyAMM the functional domain is scoped to include all activities, responsibilities and actors involved in the development and application of enterprise architecture within the organization, where enterprise architecture is defined as a consistent set of rules and models that guide the design and implementation of processes, organizational structures, information flows, and technical infrastructure within an organization [18].

Fig. 3. The development method for focus area maturity models.

Design model

Step 2: Determine focus areas. Within the chosen domain, the focus areas must be identified. In a relatively new field, literature review will provide a theoretical starting point, which has to be followed by exploratory research methods like expert groups or case studies [1, 2, 4, 9]. A useful source for identifying focus areas is the set of critical success factors found in previous research [1]. According to [4], around 20 focus areas is on average a good number. It is important for validation purposes to make explicit the underpinning conceptual framework used in defining the focus areas. Grouping the focus areas into a small number of categories may add to the accessibility of the model and is also a means of achieving completeness.

In the SPM matrix, the focus areas are derived from the previously developed Reference Framework for Software Product Management [14]. This framework consists of the main internal and external stakeholders in the product management domain and of the main activities that are carried out by a product manager. These activities are directly transformed into focus areas for the maturity matrix. The focus areas are grouped into Requirements Management, Release Planning, Product Roadmapping and Portfolio Management. The number of focus areas defined is 16.

Step 3: Determine capabilities. Each focus area consists of a number of different capabilities representing progressive maturity levels. The definition of these capabilities depends on the underlying rationale of how the focus area can be incrementally developed in an evolutionary way [4]. Per focus area, the evolutionary path of capabilities is defined. The definition of these capabilities is again based on literature review complemented with expert discussions. There are two ways of defining maturity levels: top-down and bottom-up [1]. In a relatively new field, the top-down approach is more suitable. This implies first identifying the capabilities and then detailing them into descriptions of how these capabilities present themselves in practice. The information sources for this exercise are experts and practices.

For the SPM matrix, the capabilities are identified per focus area. In a brainstorming session with four SPM experts, cards were written, each containing one capability. After the brainstorming session, the results were compared with the existing SPM literature and, if necessary, refined or redefined. Finally, the capabilities were iteratively refined in expert interviews until agreement was reached.

Step 4: Determine dependencies. In this step, dependencies between capabilities are identified. As the capabilities represent progressive maturity levels, they possess an inherent order of preferred implementation, starting with the first capability. Sometimes this order is inevitable: for instance, it is not possible to use measurements for improvement if there is no measurement mechanism in place. In other cases the order is a preferred one: for instance, it is advisable to set clear goals before putting a measurement mechanism in place, but it is not impossible, though unwise, to skip the step of goal-setting.

The dependencies of the capabilities in the SPM matrix were identified by stating the prerequisite(s) per capability. Some capabilities required that one or more other capabilities, either of the same focus area or of another focus area, be implemented first. An example is the Requirements prioritization capability: in order to be able to prioritize requirements, they need to be gathered first. The prerequisite for this capability is thus the capability Gather requirements. Consequently, there exists a dependency between Prioritize requirements and Gather requirements.

Step 5: Position capabilities in matrix. Based partly on the previously determined dependencies, and partly on concerns of practicality, the capabilities are positioned in the maturity matrix. Capabilities that are dependent on other capabilities are always positioned further to the right. This gives a partial ordering. This ordering can be further refined based on experience and practices. By this positioning the number of scales of the matrix is revealed.
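One way to derive such an initial ordering automatically is a longest-path layering of the dependency graph, placing each capability one column to the right of its rightmost prerequisite. The sketch below illustrates this under assumed data; the actual positioning in the two matrices was refined further by expert judgment and surveys, as described next.

```python
# Sketch (assumed data): an initial left-to-right positioning derived from the
# prerequisite relation by longest-path layering. Capabilities without
# prerequisites go to column 1; others go one column past their rightmost prerequisite.
from functools import lru_cache

prerequisites = {
    ("Requirements gathering", "A"): [],
    ("Requirements prioritization", "A"): [("Requirements gathering", "A")],
    ("Requirements prioritization", "B"): [("Requirements prioritization", "A")],
}

@lru_cache(maxsize=None)
def column(cap):
    deps = prerequisites[cap]
    return 1 if not deps else 1 + max(column(d) for d in deps)

initial_positions = {cap: column(cap) for cap in prerequisites}
print(initial_positions)
# {('Requirements gathering', 'A'): 1, ('Requirements prioritization', 'A'): 2,
#  ('Requirements prioritization', 'B'): 3}
```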

To position the SPM capabilities in the matrix, an initial positioning was first made based on the dependencies determined in the previous steps and on the experience of the researchers. Subsequently, the maturity matrix was validated through expert reviews and a survey among 45 product managers and product management experts [16]. In this survey, participants were asked to position the different capabilities in the order in which they would implement them in their own organization. The result was a validated maturity matrix.

Develop instrument

Step 6: Develop assessment instrument. To be able to use a focus area maturity model as an instrument to assess the current maturity of a functional domain, measures must be defined for each of the capabilities. This can be done by formulating control questions for each capability. These questions can be combined in a questionnaire that can be used in assessments. Formulation of the questions is usually based on the descriptions of the capabilities and on experience and practices.

In the DyAMM each capability relates to 2 to 4 yes/no assessment questions. All questions associated with a capability must be answered with yes in order to claim achievement of that capability. The questions were based on the descriptions of the capabilities and on practices. The list of questions has been reviewed by experts.

Step 7: Define improvement actions. For each of the capabilities improvement actions can be defined to support practitioners in moving to that capability. These too, will usually be based on experience and practices. Improvement actions will in general be rather situation specific [4]. Therefore it is advisable to present them as suggestions, rather than as prescriptions. The improvement actions can also be used to provide situation specific application of the maturity model.

In the DyAMM improvement actions were identified for each capability by suggesting practices that may be implemented to realize the specific capability. These improvement actions are presented as examples meant to inspire rather than as prerequisites. An example is the improvement action to implement some form of account management within the architectural team to initiate a dialogue with business management, in order to achieve capability B, architectural processes geared to business goals, of focus area Alignment with business.

Implement & exploit

Step 8: Implement maturity model. Implementation can be done in various ways. A questionnaire can be distributed by electronic means which allows for collecting many assessments in a relatively short timeframe [1]. The assessment questions can also be answered by discussion in workshops or by holding interviews. This is especially appropriate when raising awareness is the aim [4]. The very first applications of the model can be used to evaluate the model.

The DyAMM was validated first by applying it in a few cases. This led to an adjustment of the model at a very early stage. After this adjustment the model was validated in a number of new cases [5]. This did not lead to further adjustments.

Step 9: Improve matrix iteratively. Once enough assessments have been collected, quantitative evaluation becomes possible. To evaluate how the model assists in incremental improvement, interventions must be tracked longitudinally [1]. A repository must be kept to collect assessment results.

The DyAMM was quantitatively validated after a repository of 56 assessments had been collected. This led to a few adjustments in the assessment questions [12]. The effectiveness of the DyAMM is further illustrated by companies that have used the model over the years to evolve their architecture practice and have consequently established a more effective practice.

Step 10: Communicate results. To further the field, the results of the design should be communicated to practitioners as well as to the scientific community.

The DyAMM has been communicated to the practitioner community by way of books, articles in professional journals and presentations at seminars. It has been communicated to the scientific community by presenting it at scientific conferences.

5.3 Evaluation

In developing the focus area maturity model development method we made use of previous research on developing fixed-level maturity models. From this research we derived four generic phases which we next elaborated for focus area maturity models on the basis of experience in developing focus area maturity models for the functional domains of enterprise architecture and software product management. We found that we could apply the generic phases retrospectively to these two cases and that we could describe both applications in terms of our development method. Further validation of the development method can be done by applying it to a new functional domain. This is yet to be done.

To provide a further initial evaluation of the development method presented here, we apply the requirements on the development of maturity models formulated by Becker et al. as presented in table 4 [9].

Table 4. Evaluation of the development method against the requirements by Becker et al.

Requirement | Evaluation
R1 – Comparison with existing maturity models | This is included as part of step 1.
R2 – Iterative procedure | The determination of the focus areas and of the capabilities, as well as the positioning of the capabilities in the matrix, is done iteratively, starting from literature, followed by rounds of expert interviews and, possibly, surveys, until agreement is reached.
R3 – Evaluation | Evaluation is described as an integral part of each of the steps. The primary type of evaluation is by expert review and case study.
R4 – Multi-methodological procedure | Literature review is combined with exploratory research methods.
R5 – Identification of problem relevance | The development of a focus area maturity model is especially relevant to functional domains that are still in the development stage in the majority of organizations.
R6 – Problem definition | The problem definition is part of the identification and scoping of the functional domain in step 1.
R7 – Targeted presentation of results | Step 10 explicitly addresses the communication of results, both to practitioners and to the scientific community.
R8 – Scientific documentation | Step 10 explicitly addresses the communication of results, both to practitioners and to the scientific community.

We found that applying the design science research process of Peffers et al. helped us to fulfill most of the requirements, as they can be recognized in the process step descriptions of the design science research process [7].

6 Conclusions and Further Research

In this paper we present a method for developing focus area maturity models. Focus area maturity models are especially suited to relatively new IS fields that require incremental, evolutionary capability development. The few focus area maturity models that have been in use up till now show definite value in supporting organizations in incrementally improving their practices. Though many maturity models have been developed in the past few years, which is an indication of the need for maturity models, most of these are fixed-level models and therefore less suited to incremental capability development. With our development method for focus area maturity models we hope to contribute to the design foundations and to further the research and practice of gradual improvement of functional domains.

The development method presented is based on both a review of the literature on maturity model development and practical experience in applying the focus area maturity model concept to two distinct functional domains. The concept of the maturity matrix is refined by building a mathematical formalization of the matrix. This formalization made the underlying concept of the matrix explicit and helped us to better understand the rationale behind the positioning of the capabilities in the matrix. It provided us with a solid foundation for the focus area maturity model development method.

The research approach taken is that of objective-centered design research, where the development of an artifact is initiated by an industry or research need [7]. The resulting development method is evaluated against the requirements formulated by Becker et al. [9]. Further evaluation by applying the method to other IS domains is still to be done.

An avenue for further research is to elaborate on how situationality can be brought into focus area maturity model development, enabling model developers to tune a focus area maturity model to a specific organization.

Acknowledgments. The authors wish to thank the experts that participated in the expert groups as well as the many organizations that have participated in projects for capability development using the DyAMM or SPM matrix.

References

1. Bruin, T. de, Freeze, R., Kulkarni, U., Rosemann, M.: Understanding the Main Phases of Developing a Maturity Assessment Model. In: Proceedings of the 16th Australasian Conference on Information Systems. Sydney (2005)

2. Mettler, T., Rohner, P.: Situational Maturity Models as Instrumental Artifacts for Organizational Design. In: Proceedings of the 4th international Conference on Design Science Research in information Systems and Technology. Philadelphia (2009)

3. CMMI: CMMISM for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing; (CMMI-SE/SW/IPPD/SS, V1.1) Staged Representation; CMU/SEI-2002-TR-012 ; ESC-TR-2002-012 (2002)

4. Maier, A.M., Moultrie, J., Clarkson, P.J.: Developing Maturity Grids for Assessing Organisational Capabilities: Practitioner Guidance. In: 4th International Conference on Management Consulting, Academy of Management (MCD'09). Vienna (2009)

5. Steenbergen, M. van, Berg, M. van den, Brinkkemper, S.: A Balanced Approach to Developing the Enterprise Architecture Practice. In: Filipe, J., Cordeiro, J., Cardoso, J. (eds.) Enterprise Information Systems. LNBIP 12, pp. 240–253 (2007)

6. Koomen, T., Pol, M.: Test Process Improvement, a Practical Step-by-step Guide to Structured Testing. Addison-Wesley, Boston (1999)

7. Peffers, K., Tuunanen, T., Rothenberger, M.A., Chatterjee, S.: A Design Science Research Methodology for Information Systems Research. Journal of Management Information Systems, vol.24 (3), pp. 45-78 (2008)

8. Hevner, A.R., March, S.T., Park, J., Ram, S.: Design Research in Information Systems Research. MIS Quarterly, vol.28, no.1, pp.75-105 (2004)

9. Becker, J., Knackstedt, R., Pöppelbuß, J.: Developing Maturity Models for IT Management - A Procedure Model and its Application. Business & Information Systems Engineering 1(3), pp. 213-222 (2009)

10. Berg, M. van den, Steenbergen, M. van: Building an Enterprise Architecture Practice. Springer, Dordrecht (2006)

11. Wagter, R., Berg, M. van den, Luijpers, L., Steenbergen, M. van: Dynamic Enterprise Architecture: How to Make it Work. Wiley, Hoboken (2005)

12. Steenbergen, M. van, Schipper, J., Bos, R., Brinkkemper, S.: The Dynamic Architecture Maturity Matrix: Instrument Analysis and Refinement. To appear in the proceedings of the 4th Workshop on Trends in Enterprise Architecture Research. Stockholm (2009)

13. Xu, L., Brinkkemper, S.: Concepts of Product Software. European Journal of Information Systems 16(5) pp. 531-541 (2007)

14. Weerd, I. van de, Brinkkemper, S., Nieuwenhuis, R., Versendaal, J., Bijlsma, L.: Towards a Reference Framework for Software Product Management. In: Proceedings of the 14th International Requirements Engineering Conference, Minneapolis/St. Paul, Minnesota, USA, pp. 319-322 (2006)

15. Bekkers, W., Spruit, M., Weerd, I. van de, Brinkkemper, S.: A Situational Assessment Method for Software Product Management. To appear in ECIS 2010 proceedings (2010)

16. Weerd, I. van de, Bekkers, W., Brinkkemper, S.: Developing a Maturity Matrix for Software Product Management. Technical report UU-CS-2009-015. Department of Information and Computing Sciences, Utrecht University, The Netherlands (2009)

17. Weerd, I. van de, Brinkkemper, S.: Meta-modeling for Situational Analysis and Design Methods. In Syed, M.R., Syed, S.N. (Eds.) Handbook of Research on Modern Systems Analysis and Design Technologies and Applications, pp. 38-58. Hershey: Idea Group Publishing (2008)

18. Steenbergen, M. van, Brinkkemper, S.: Modeling the Contribution of Enterprise Architecture Practice to the Achievement of Business Goals. In: Papadopoulos, G. A., Wojtkowski, W., Wojtkowski, W. G., Wrycza, S., Zupancic, J. (eds) Information Systems Development: Towards a Service Provision Society, Springer-Verlag: New York (2008)