
Florida State University Libraries

Electronic Theses, Treatises and Dissertations The Graduate School

2012

Examining the Use of First Principles of Instruction by Instructional Designers in a Short-Term, High Volume, Rapid Production of Online K-12 Teacher Professional Development Modules

Anne M. Mendenhall

Follow this and additional works at the FSU Digital Library. For more information, please contact [email protected]

THE FLORIDA STATE UNIVERSITY

COLLEGE OF EDUCATION

EXAMINING THE USE OF FIRST PRINCIPLES OF INSTRUCTION BY

INSTRUCTIONAL DESIGNERS IN A SHORT-TERM, HIGH VOLUME, RAPID

PRODUCTION OF ONLINE K-12 TEACHER PROFESSIONAL DEVELOPMENT

MODULES

By

ANNE M. MENDENHALL

A Dissertation submitted to the Department of Educational Psychology and Learning Systems

in partial fulfillment of the requirements for the degree of

Doctor of Philosophy

Degree Awarded: Fall Semester, 2012


Anne Mendenhall defended this dissertation on August 1, 2012.

The members of the supervisory committee were:

Tristan E. Johnson

Professor Co-Directing Dissertation

James D. Klein

Professor Co-Directing Dissertation

Jonathan Adams

University Representative

Vanessa P. Dennen

Committee Member

The Graduate School has verified and approved the above-named committee members,

and certifies that the dissertation has been approved in accordance with university

requirements.


For my Mom and Dad for their never-ending support and unconditional love.

For Brayden, Lynzy, Makayla, Kylee, and Sidney. Here’s hoping you find the same joy

and satisfaction I did while pursuing your own dreams.


ACKNOWLEDGEMENTS

Reflecting on my many experiences through the PhD process, I’d have to say that

working with Dr. Tristan Johnson and Dr. Jim Klein has been the most remarkable.

Both Tristan and Jim have gone above and beyond to see me through the dissertation

process. It is with my deepest gratitude that I thank them both for the commitment,

sacrifice, expertise, counsel and advice, support, and, not least, their sense of humor.

I am so grateful for the phone call I received many years ago from Tristan

encouraging me to apply to the Instructional Systems program at Florida State

University. The experiences gained through attending FSU and working with Tristan for

nearly 5 years have been incredible and a treasured blessing. His consistent support,

encouragement, and positive attitude have carried me through. I can honestly say that the

road to PhD-hood has been a remarkable journey and I have truly “enjoy[ed] the journey”

(Oaks & Oaks, 2009, p. 31), because of Tristan.

I am so thankful for Dr. Jim Klein and his willingness to step in and provide an

incredible amount of support and expertise. His invaluable feedback, kind demeanor,

encouragement, and advice became my lifeline as I wrapped up my journey at FSU. I’ve

really enjoyed our conversations and meetings. I wish I had more time to learn from him.

I am so grateful for Jim introducing me to my new love – Design and Development

Research. I remember telling my coworkers and friends, after the first meeting I had with

Jim, about how he changed my life by introducing me to this type of research. I’d like to

thank my committee members Dr. Vanessa Dennen and Dr. Jonathan Adams for their

expertise and counsel. Their insight and perspectives were very valuable and contributed

greatly to my success in the PhD program. Their expertise on qualitative research has

helped me see things at different angles.

My parents James (Jim) and Evelyn (Evie) Mendenhall have been by far my

biggest supporters and sources of unconditional love (along with Hermione the dog).

They have instilled in me the ability to work hard and to recognize the value of work.

They also taught me to recognize my divine worth and to know God and to seek His


counsel. Words cannot express my love and gratitude for the two best parents a girl could

have. Thank you for sharing this journey with me.

A special Mahalo Nui Loa goes out to Dave and Kate Merrill and Bob Hayden.

All three played an instrumental part in this journey. They, too, provided lots of counsel,

advice, and support. I’m so grateful to have them as part of my ohana. Another special

thank you goes out to the many friends who have provided encouragement and support; to name a few (I wish I could mention them all by name): ChanMin Kim, Gordon and

Jennifer Mills & Family, Kylia and Brian Barabash, the FSU PhD ABD group, my

brothers Rob and Scott and their families, and to the cohort of PhD and masters students

who have made this journey fun and meaningful. A special thanks goes to Alison Moore,

Kayla Wenting Jiang, Faiza Al-Jabry, and my Habitat Tracker coworkers for their

assistance and feedback. Last but not least, thank you to those who worked countless

hours above and beyond the call of duty designing and developing the professional

development modules used for this study.


TABLE OF CONTENTS

List of Tables ................................................................................................................................. ix

List of Figures ..................................................................................................................................x

Abstract .......................................................................................................................................... xi

1. CHAPTER ONE INTRODUCTION ......................................................................................1

Purpose of Study ............................................................................................................3

Research Questions .......................................................................................................4

Significance of the Study ...............................................................................................4

2. CHAPTER TWO LITERATURE REVIEW ..........................................................................7

Differentiating Theories, Models, and Principles ..........................................................7

Instructional Systems Design Models ............................................................................8

Benefits of ISD Models ...................................................................................10

Challenges and Criticisms of ISD Models .......................................................10

Theoretical Foundations of ISD Models ..........................................................12

First Principles of Instruction .......................................................................................16

Activation .........................................................................................................19

Demonstration ..................................................................................................20

Application .......................................................................................................21

Integration ........................................................................................................22

Problem or Task-Centered ...............................................................................21

Use of First Principles of Instruction ...............................................................23

Research on First Principles of Instruction ......................................................25

Instructional Designer Decision-Making .....................................................................27

Design and Development Research .............................................................................29

3. CHAPTER THREE METHODOLOGY ..............................................................................35

Research Design ...........................................................................................................35

Participants ...................................................................................................................36

Setting and Materials ...................................................................................................37


Data Sources ................................................................................................................42

Instrumentation ............................................................................................................44

Procedures ....................................................................................................................45

Data Analysis ...............................................................................................................46

Trustworthiness ............................................................................................................49

4. CHAPTER FOUR RESULTS ..............................................................................................52

Conditions Under Which First Principles Were Used .................................................52

Instructional Design Setting .............................................................................52

Decisions Regarding First Principles ...........................................................................60

Decision-Making Power ..................................................................................60

Types of Design Decisions ..............................................................................61

Instructional Design Decisions ........................................................................63

Factors Affecting Decisions .............................................................................69

Level of Understanding First Principles ......................................................................75

Frequency of First Principles Incorporated in Modules ..............................................77

Summary .......................................................................................................79

5. CHAPTER FIVE DISCUSSION ..........................................................................................81

General Research Question ..........................................................................................81

Supporting Research Question 1 ..................................................................................83

Supporting Research Question 2 ..................................................................................87

Supporting Research Question 3 ..................................................................................94

Supporting Research Question 4 ..................................................................................95

Limitations ........................................................................................................98

Future Research ........................................................................................................99

Conclusion ........................................................................................................99

APPENDIX A SCIENCE AND MATH STANDARDS INSTRUCTIONAL MODULES .......101

APPENDIX B DEMOGRAPHICS AND DESIGN KNOWLEDGE SURVEY .........................103

APPENDIX C INTERVIEW PROTOCOL AND QUESTIONS ................................................107

APPENDIX D MODULES RANDOMLY SELECTED FOR EVALUATION .........................109

APPENDIX E FIRST PRINCIPLES OF INSTRUCTION KNOWLEDGE SURVEY ..............110


APPENDIX F MODULE EVALUATION SHEET ...................................................................113

APPENDIX G RECRUITMENT E-MAIL .................................................................................114

APPENDIX H CONSENT FORM ..............................................................................................115

APPENDIX I SCORING PROTOCOL AND RUBRIC FOR FPI SURVEY .............................118

APPENDIX J SAMPLE PROGRAM LOGIC AND STORYBOARD TEMPLATES ..............126

APPENDIX K HUMAN SUBJECTS APPROVAL MEMORANDUM ....................................131

APPENDIX L PRINCIPAL INVESTIGATOR APPROVAL MEMORANDUM .....................132

APPENDIX M PERMISSION TO USE FIGURES ....................................................................134

REFERENCES ......................................................................................................................136

BIOGRAPHICAL SKETCH .......................................................................................................145


LIST OF TABLES

Table 2.1 Gagné’s Nine Events of Instruction ...............................................................................16

Table 3.1 Topics and Sub-topics ...................................................................................................47

Table 3.2 Data Collection and Analysis ........................................................................................49

Table 4.1 Instructional Designers Working Hours ........................................................................54

Table 4.2 Instructional Designers Demographics .........................................................................56

Table 4.3 Means and Standard Deviations of Years of Experience ..............................................57

Table 4.4 Training Materials Use and Level of Understanding Results .......................................59

Table 4.5 First Principles of Instruction Knowledge Survey Scores .............................................76

Table 4.6 First Principles of Instruction Knowledge Survey Scores by Roles ..............................77

Table 4.7 Module Evaluation Frequency Counts ..........................................................................78

Table 4.8 Percentage and Instances Ranges of the Use of First Principles ..................................79

Table 5.1 Possible Strategy Sequence for Teaching Components .................................................82

Table 5.2 Means and Standard Deviations of Years of Experience ..............................................84

Table 5.3 Percentage and Instances Ranges of the Use of First Principles ..................................88

Table 5.4 Comparisons of Gardner’s (2011) Module with 6-8 Grade Science and H.S. Earth and Space Science Modules .................................................................................................96


LIST OF FIGURES

Figure 2.1 ADDIE model is a systematic approach to instruction ................................................14

Figure 2.2 Dick and Carey Systems Approach Model ...................................................................15

Figure 2.3 Pebble-in-the-Pond Model ...........................................................................................15

Figure 2.4 First Principles of Instruction ......................................................................................17

Figure 2.5 Framework of Merrill’s (2002a, 2008) First Principles of Instruction .......................19

Figure 2.6 General information is located directly next to the demonstration/specific portrayal, which guides the learner from the concept being taught to the demonstration of that concept (Mendenhall et al., 2006b) ........................................................................................21

Figure 3.1 The text is represented as bullet-points and is located on the left, while the specific instance is shown on the right (Johnson, Mendenhall, et al., 2011) ................................38

Figure 3.2 In this example, which illustrates unpacking a benchmark, the steps and specific portrayal are on the right (Johnson, Mendenhall, et al., 2011) ........................................38

Figure 3.3 Videos are sometimes used for practice activities .......................................................39

Figure 3.4 Organizational Hierarchy of Participants ...................................................................40


ABSTRACT

Merrill (2002a) created a set of fundamental principles of instruction that can lead

to effective, efficient, and engaging (e3) instruction. The First Principles of Instruction

(Merrill, 2002a) are a prescriptive set of interrelated instructional design practices that

consist of activating prior knowledge, using specific portrayals to demonstrate

component skills, application of newly acquired knowledge and skills, and integrating the

new knowledge and skills into the learner’s world. The central underlying principle is

contextualizing instruction based on real-world tasks. Merrill (in press) hypothesizes that

if one or more of the First Principles are not implemented, then a diminution of learning

and performance will occur. Only a few studies indicate the efficacy of the First Principles of Instruction. However, most claims of efficacy in the application and use of the principles are anecdotal and empirically unsubstantiated. This

phenomenon is not isolated to the First Principles of Instruction.

Claims of effectiveness made by ISD model users have taken precedence over

empirically validating ISD models. This phenomenon can be attributed to a lack of

comprehensive model validation procedures as well as time constraints and other limited

resources (Richey, 2005). Richey (2005) posits that theorists and model developers tend

to postulate the validity of a model due to its logicality and its support in the literature,

as is the case with the First Principles of Instruction. Likewise, designers tend to equate

the validity of a model with an appropriate fit within their environment; that is, if using

the model is easy, addresses client needs, works within workplace constraints, and the resulting product satisfies the client, then the model is viewed as being valid (Gustafson & Branch,

2002; Richey, 2005).

Richey and Klein (2007) emphasize the importance of conducting design and

development research in order to validate the use of instructional design models, which

includes the fundamental principles (e.g., First Principles of Instruction) that underlie

instructional design models. These principles and models require research that is rigorous

and assesses the model’s applicability instead of relying on unsubstantiated testimonials


of usefulness and effectiveness (Gustafson & Branch, 2002). In order to validate the use

of principles and models, researchers need to explore and describe the usage of the

principles and models to determine the degree of implementation in different settings

(Richey & Klein, 2007).

The purpose of this study was to examine the use of the First Principles of

Instruction (Merrill, 2002a) and the decisions made by instructional designers —

including project leads, team leads, and designers-by-assignment. The investigation of

the use of the First Principles was part of an effort to determine if these principles were

conducive to being implemented during a fast-paced project that required the design and

development of a large number of online modules. The predominant research question for

this study was: How were the First Principles of Instruction used by instructional

designers in a short-term, high volume, rapid production of online K-12 teacher

professional development modules? Four supporting questions were also addressed: 1)

What were the conditions under which the First Principles of Instruction were used? 2)

What design decisions were made during the project? 3) What is the level of

understanding of the First Principles by instructional designers? 4) How frequently do the

modules incorporate the First Principles of Instruction?

This case study involved 15 participants who were all instructional designers and

designers-by-assignment that worked on 49 science and math professional development

modules for K-12 teachers within a short 11-week time period. The data collected in this design and development research study consisted of participant interviews; extant data such as project management documents, e-mail communications, personal observations, and recordings of meetings; participant surveys; and the evaluation of nine online modules. The results indicated the First Principles of Instruction were not used at the level

expected by the lead designer and may not be conducive to being applied as described by

Merrill (2002a, 2007a, 2009a, 2009b) in this case. The frequency of use of the First

Principles in the modules showed an overuse of the Activation/Tell principle relative to the number of Demonstration/Show and Application/Ask instances.

Results also indicated that the project requirements, personnel, designer experience, the

physical setting, and training and meetings contributed to decision-making and ultimately

to the use and misuse of the First Principles of Instruction.


CHAPTER ONE

INTRODUCTION

One of the key tenets of Instructional Systems Design (ISD) is to create

instruction that is effective, efficient, and engaging (e3) in order to promote learning,

improve performance, and motivate learners (Merrill, in press). ISD consists of

systematic processes with interrelated components that move toward a common goal. The

components include learners, instructional materials, learning environments, instructors

and facilitators (Dick, Carey, & Carey, 2005). In addition, instructional designers and

developers are integral components within the system. Instructional designers and

developers have a common goal of producing e3 instruction. However, even with the

intent of producing e3 instruction, there are many cases where the instruction did not meet the criteria to be effective, efficient, and/or engaging (Merrill, 2009b). Merrill (in press)

asserts that one of the greatest hindrances to e3 instruction is that too often the only

requirement for instructional designers is content knowledge and not an understanding of

the principles of ISD. Likewise, the lack of e3 instruction is also blamed on instructional

designers’ decisions (Rowland, 1993), which can lead to uncontrolled or non-systematic

approaches to designing instruction (Visscher-Voerman, 1999). According to van den

Akker, Boersma, and Nies (1990; as cited in Visscher-Voerman, 1999) there is evidence

that the design processes could be improved when instruction doesn’t fully meet e3

standards.

One design practice that can improve the impact of instruction is the appropriate

use of ISD models. ISD models can provide structure and order and are used to create a

good standard in designing instruction (Richey, 2005). ISD models can provide

immediate value (Dick, Carey, & Carey, 2005) by regulating the instructional design

process and guiding the instructional designer into creating e3 instruction (Gustafson &

Branch, 2002). There have been a myriad of ISD models created since the 1970s

(Gustafson & Branch, 2002) and most of these models encompass a fundamental set of


principles including principles of analysis, design, development, implementation, and

evaluation (Gustafson & Branch, 2002; Branch & Merrill, 2012). This basic set of

principles can be situated in multiple ways within an ISD model and the degree and

method of embodiment within a model can determine how effective, efficient, and

engaging the instructional intervention will be (Merrill, in press). In addition to

implementation of ISD principles, a model needs to be grounded in theory. Gustafson and

Branch (2002) claim that the “greater the compatibility between an [ISD] model and its

contextual, theoretical, philosophical, and phenomenological origins, the greater the

potential for success in constructing effective learning environments" (p. 16). Even

though many ISD models are representative of effectual theories of learning, instruction,

and design, there are challenges associated with ISD models and their use.

The challenges begin with model selection. Due to the diversity of instructional

design projects, performance problems, and learning environments it can be difficult to

choose an appropriate model to solve all of the design problems in a project (Visscher-

Voerman, 1999). Some ISD models have been characterized as restrictive, stifling,

passive, inflexible, lacking adaptability, or too simple (Branch, 1997; Wedman &

Tessmer, 1993). Other professionals complain that some ISD models are “clumsy” and

they take too long to implement in a “speed-maddened” world of ISD (Gordon & Zemke,

2000). Difficulty implementing a model during a fast-paced design project can be

especially challenging with a team of novice designers (Richey, 1995). Some training

professionals assert that rigidly following ISD models hinders instructional designers’ creativity and that the models do not address attitudinal or motivational elements (Gordon & Zemke, 2000), which results in ineffective, inefficient, and disengaging instruction. Other

ISD professionals concede some criticisms; however, they assert that most criticisms

are based upon a few poor examples of inappropriate model choice and application. In

particular, the focus during the application of the models was activity-driven instead of

outcome or goal-driven (Zemke & Rossett, 2002). Furthermore, Merrill, Barclay, & Van

Schaack (2008) posit that it is the failure to implement fundamental underlying principles

of instruction, within a model, that is the cause of ineffective, inefficient, and disengaging

instruction.


Merrill (2002a) created a set of fundamental principles of instruction that are

believed to lead to e3 instruction. The First Principles of Instruction (see Merrill, 2002a,

2007a, 2007b, 2009a, 2009b) are a prescriptive set of interrelated instructional design

principles that consist of activating prior knowledge, using specific portrayals to

demonstrate component skills, application of newly acquired knowledge and skills, and

integrating the new knowledge and skills into the learner’s world. The central underlying

principle is contextualizing the instruction based on real-world tasks. Merrill (in press)

hypothesizes that if one or more of the First Principles are not implemented then a

diminution of learning and performance will occur. However, there are only a few studies

that indicate the efficaciousness of the First Principles of Instruction (see Frick, Chadha,

Watson, Wang, & Green, 2009; Gardner, 2011; Rosenburg-Kima, 2012; Thomson,

2002). Most claims of efficacy of ISD models as well as the First Principles of Instruction

are anecdotal and empirically unsubstantiated.

Claims of effectiveness made by ISD model users have taken precedence over

empirically validating ISD models. This phenomenon can be attributed to a lack of

comprehensive model validation procedures as well as time constraints and other limited

resources (Richey, 2005). Richey (2005) posits that theorists and model developers tend

to postulate the validity of a model due to its logicality and its support in the literature,

as is the case with the First Principles of Instruction. Likewise, designers tend to equate

the validity of a model with an appropriate fit within their environment; that is, if using

the model is easy, addresses client needs, works within workplace constraints, and the resulting product satisfies the client, then the model is viewed as being valid (Gustafson & Branch,

2002; Richey, 2005). Richey and Klein (2007) suggest that design and development

research, specifically model research, could validate the effectiveness of instructional

design principles, models, and processes.

Purpose of Study

The purpose of this study was to investigate the use of First Principles of

Instruction and the design and development decisions made by instructional designers to

determine if these principles are conducive to being implemented in a short-term, high

volume, rapid production of teacher professional development modules.


The short-term, high volume nature of the project refers to the project’s short 11-

week timeline and the creation of 49 online modules within that timeframe. The rapid

production of the modules refers to the processes taken to complete the modules in a

systematic way in order to meet the deadline. The instructional design project, used as the

context for this research study, employed nearly 30 instructional designers and designers-

by-assignment (i.e., designers who have not had formal training or education in

instructional design) to create a set of online professional development modules for K-12

teachers. The modules instructed teachers on the newly adopted and revised state science

and math standards and benchmarks. In addition, the modules contain instructional

strategies teachers can use to fulfill the math and science standards and benchmarks in

their classrooms. The First Principles of Instruction were used as a framework to create

these modules because these principles were centered on real-world tasks that seemed to

be applicable to the content and context of this instructional design project.

Research Questions

The primary research question that was addressed in this study is: How were the

First Principles of Instruction used by instructional designers in a short-term, high

volume, rapid production of online K-12 teacher professional development modules?

Supporting Research Question 1: What are the conditions (i.e. client

restrictions, resource limitations, instructional design setting) under which the First

Principles of Instruction were used?

Supporting Research Question 2: What design decisions regarding the First

Principles of Instruction were made during the project?

Supporting Research Question 3: What is the level of understanding of the First

Principles of Instruction by instructional designers?

Supporting Research Question 4: How frequently do the modules incorporate

the First Principles of Instruction?

Significance of the Study

Most previous studies that are associated with the First Principles of Instruction

were experimental or quasi-experimental designs where the First Principles of Instruction

were used as the treatment condition and compared with a topic-centered or control


condition (Francom, 2011; Rosenberg-Kima, 2012). Other studies explored the use of

First Principles of Instruction as a framework for active learning (Gardner, 2011b) or

examined the relationship between novice and expert instructional designers and their use

of the First Principles of Instruction (Rauchfuss, 2010). While these studies contribute to

the instructional systems design field and to the understanding of the First Principles of

Instruction, more research should be conducted in order to validate the use of the First

Principles of Instruction within different situations. In order to support or refute the claim

that the principles can be implemented “in any delivery system or using any instructional

architecture” (Clark, 2003, as cited in Merrill, Barclay, & Van Schaack, 2008), instructional

design and development research should be conducted.

Van den Akker and Kuiper (2008) posit the need to conduct more of this type of

research in order to encompass the expanding view of instructional design, which

includes educational design. Educational design can incorporate additional teaching and

learning components, such as the role of the teacher. They also claim the need for more “interactive and developmental” approaches that support the development and

refinement of instructional design models (see pp. 745-746). Richey and Klein (2007)

concur with the importance of conducting design and development research in order to

validate the use of instructional design models, which includes the fundamental principles

(i.e. First Principles of Instruction) that underlie such models.

These principles and models require research that is rigorous and assesses their

applicability instead of relying on unsubstantiated testimonials of usefulness and

effectiveness (Gustafson & Branch, 2002). In order to validate the use of principles and

models, researchers should explore and describe the use of the principles and models to

determine the degree of implementation in different settings (Richey & Klein, 2007).

As stated previously, this study aims to explore the application of the First

Principles of Instruction and design decisions made by instructional designers of varying skill and experience levels in the production of online teacher professional development modules. The rapid pace, high volume of modules, and the very short timeline are

characteristics of the instructional design setting. This study is significant because of the

need to substantiate claims of efficacy made by model developers and model users

(Richey, 2005). The answers to the research questions may provide some insight to ISD


practitioners and researchers about how novice and expert instructional designers apply

the First Principles of Instruction in a fast-paced environment.


CHAPTER TWO

LITERATURE REVIEW

This chapter provides an overview of Instructional Systems Design (ISD) models

and their theoretical foundations. In addition, the benefits and challenges of these models will be discussed, as well as the need to conduct research on the development and use of models

so as to provide substantiation and validation. Also, the First Principles of Instruction will

be discussed in detail. While the categorization of the First Principles of Instruction is

questionable (i.e., are they a theory, a model, or simply principles?), the assertion is made that these

prescriptive principles can also fall under the category of a model or even possibly a

theory. Next, there is a review of literature regarding expert and novice decision-making

skills and choices. The chapter concludes with a description of design and

development research and the need to conduct model use and validation research.

Differentiating Theories, Models, and Principles

It should be noted that, throughout the literature, the terms theory, model, and principle are used interchangeably. The fine line between these terms can, understandably, cause confusion. In the paragraphs that follow, theories, models, and principles are briefly defined in an effort to support the claim that the First Principles of Instruction can be viewed as an ISD conceptual framework that includes both model and principle characteristics. These brief definitions are followed by a lengthier definition of models, along with a few significant theories that have influenced the development of ISD models.

Theory. Reigeluth (1983) defines a theory as a “set of principles that are

systematically integrated and are a means to explain and predict instructional

phenomena” (p. 21). Andrews and Goodson (1980) explain that a model can incorporate

multiple theories and that theories help us to more fully understand the learning environment.

Hersey, Blanchard, and Dewey (2001) state that a theory “attempts to explain why things

happen as they do…and is not designed to recreate events” (p. 172). In the ISD field,


theories have been developed to explain how learning occurs. These theories have been

developed as a result of observing behavioral changes and the processes and triggers that

brought about that change (Driscoll, 2005). Theories are used to predict the outcome of a

series of events (Richey, Klein, & Tracey, 2011).

Models. Hersey, Blanchard, and Dewey (2001) affirm that a model “is a pattern

of already existing events that can be learned and therefore repeated” (p. 172). A model is

used to describe the application of a theory and, as stated previously, a model can

encompass many theories. Models are used and adapted by practitioners (Reigeluth,

1983), whereas scholars generally conduct theory development.

Principles. Principles are described as being a relationship that is “always true

under appropriate conditions regardless of program or practice” (Merrill, Barclay, & Van

Schaack, 2008, p. 175). Reigeluth (1983) defines principles as “a relationship between

two actions or changes” (p. 14). He categorizes the relationships as correlational, causal,

deterministic, or probabilistic. A relationship may be correlational when there is no

indication of which action is affected by another action and causal when there is an

indication of which action is influenced by another action (Reigeluth, 1983).

Deterministic relationships are those in which the cause “always has the stated effect” and probabilistic relationships are those in which the cause often or sometimes “has the stated effect” (p. 14).

Instructional Systems Design Models

Models are used in most disciplines as communication tools that represent ideas,

patterns, processes, and cycles. Models may help to visualize things that are difficult to

see, reveal gaps in our knowledge, and help make predictions (Ryder, n.d.; Severin &

Tankard, 2001). Models are often exclusive to particular situations (Gustafson & Branch,

2002; Rothwell & Kazanas, 2008) and not generalizable across domains or environments.

Richey, Klein, & Tracey (2011) define models as “representations of reality

presented with a degree of structure and order, and… are typically idealized” (p. 8).

Deutsch (1952) characterizes models as being “structured symbols of operating rules

which is supposed to match a set of relevant points in an existing structure or process” (p.

357). He adds that models are necessary for understanding complex systems and

processes (Deutsch, 1952). Others define models as graphical representations (Andrews


& Goodson, 1980) of phenomena (physical phenomena, complex forms, systematic

functions & processes) that occur in the real world (Gustafson & Branch, 2002; Severin

& Tankard, 2001).

Most ISD models encompass a related set of tasks that involve some type of

analysis, selection of pedagogical strategies, learning activities and assessments,

developing teaching and learning materials, execution of the instruction, and evaluating

for instructional effectiveness and learning (Gustafson & Branch, 2002; Branch &

Merrill, 2012). Andrews and Goodson (1980) state that ISD models contain descriptive,

prescriptive, predictive, and explanatory components. Some ISD models use verbal

descriptions of pedagogical criteria and selection processes; other models use graphical

analogies to show a set of prescribed steps and verbal descriptions of procedures.

Descriptive models illustrate a specific learning environment and how its related

components will be affected (Edmonds, Branch, & Mukherjee, 1994). Prescriptive

models, on the other hand, provide a framework for how the learning environment can be

created or adapted to ensure the outcomes are brought forth (Edmonds, Branch, &

Mukherjee, 1994; Reigeluth, 1983).

Reigeluth (1983) defines one type of ISD model, the instructional model, as “an

integrated set of strategy components” (p. 21) like sequencing of content, use of

examples, practice, and motivation elements, which differ from instructional

development or process models like ADDIE. Further, he states that instructional models

may be fixed (descriptive) or adaptive (prescriptive). When an instructional model is

fixed, the description stays the same regardless of the learner’s role, whereas an adaptive model prescribes variations that take into account the learner’s role and responses during

instruction (Reigeluth, 1983).

Some scholars believe that ISD models have an embedded predictive power: when a model is applied appropriately, it can predict that the instruction will

be effective (Andrews & Goodson, 1980; Gagné, Wager, Golas, & Keller, 2005). On the

contrary, Edmonds, Branch, and Mukherjee (1994) claim that one of the main criticisms

of ISD models is that they do not have predictive power and lack methods that predict

success in specific situations (see p. 55). Gustafson and Branch (2002) addressed several

assumptions about ISD models. Among those assumptions they assert that there is not a


single ISD model that is perfectly suited to fit the majority of design and development

environments (Gustafson & Branch, 2002; Zemke & Rossett, 2002). Consequently,

instructional designers should be knowledgeable and skilled enough to apply and adapt

the models to fit specific project requirements and environments.

Benefits of ISD Models

The ultimate goal of instruction is to improve performance. Benefits of using ISD

models include “facilitat[ing] intentional learning” (Gagné et al., 2005, p. 1) and

providing standardization that supports good instructional design practices (Richey,

2005). ISD models are used to assist instructional designers in the planning, design and development, and implementation of instruction. As stated previously, ISD models

can be beneficial in communicating complex ideas and processes (Richey, 2005; Ryder,

n.d.). Being able to communicate with stakeholders while developing instruction may

prevent unnecessary challenges in the future. Using ISD models can provide “immediate

value” (Gagné et al., 2005, p. 2) by offering instructional design practitioners necessary guidance through detailed prescriptive steps and descriptions (Reigeluth & Carr-Chellman, 2009; Richey, Klein, & Tracey, 2011) and can

“inspire” instructional designers as they solve the complex problems of ISD (Kirschner,

Carr, van Merriënboer, & Sloep, 2002). The use of ISD models can contribute to the

refinement of the model and improvement of the theory it was based upon (Andrews &

Goodson, 1980), thus contributing to the improvement of teaching and learning and the

advancement of the instructional design knowledge base.

Challenges and Criticisms of ISD Models

Some ISD model authors and theorists claim their models are universal and can

be applied in many types of environments and under various conditions. However, in

reality, most models are situation specific (Gustafson & Branch, 2002; Visscher-

Voerman & Gustafson, 2004). Visscher-Voerman (1999) reported on several studies

about the instructional design activities designers participated in, and her findings

indicated that instructional designers did not follow all of the steps as prescribed in ISD

models. Not following all of the prescribed steps can be a detriment to the quality of

instruction, since it has been stated that an ISD model can “predict” that the instruction will be


effective (Andrews & Goodson, 1980); however, that holds true only when the model is

applied appropriately (Merrill, in press). The application of a model during a fast-paced

instructional design project can be particularly taxing on novice instructional designers

(Richey, 2005), thus affecting the quality of the instruction.

Since the 1970s there has been a proliferation of ISD models, causing some

difficulty in the selection of a model that can help solve the instructional design problem

appositely (Edmonds, Branch, & Mukherjee, 1994; Gustafson & Branch, 2002; Visscher-

Voerman, 1999). Furthermore, most ISD models have never been validated for efficacy

and usefulness (Andrews & Goodson, 1980; Gustafson & Branch, 2002; Richey, 2005)

causing designers to be reluctant to adopt and adapt the model in fear of risking the

success of a project (Andrews & Goodson, 1980). In addition, some designers may have

a strong affinity for one learning theory or ISD model and will try to use and adapt that

model in most design projects they are involved with (Andrews & Goodson, 1980)

without taking into consideration the specificity of the design project. Other critics find

that the use of ISD models can thwart instructional designers’ creativity. Andrews and

Goodson (1980) assert that instructional designers should understand how and why the

model was developed in order to determine the model’s appropriateness for the situation.

One study that supports these criticisms was conducted by Branch (1997)

examining the graphic elements of instructional design models. Participants for this study

included 31 graduate students, half of whom were majoring in Instructional Technology; nearly all of the participants were unfamiliar with many of the details relating to ISD.

Branch’s participants were randomly assigned to one of three groups. Each group

reviewed the same diagrams but each in a different order. The diagrams were boxes [Dick

and Carey Model (1996)], ovals [Edmonds, Branch, & Mukherjee (1994)], or a mix of

both boxes and ovals [adapted from Edmonds, Branch, & Mukherjee (1994)]. The

participants were asked to provide descriptive words for each of the diagrams. The most

common descriptive words were confusing, flowing, and linear. Branch (1997) came to

the conclusion that many of the ISD model diagrams were “interpreted as stifling,

passive, lock-step and simple” (p. 429).


As illustrated previously, there are many challenges and criticisms of ISD models.

However, the benefits of providing guidance to instructional designers, especially novice

or designers-by-assignment, may outweigh the challenges of using ISD models.

Theoretical Foundations of ISD Models

Most ISD models are grounded in theory including behavioral learning theory,

cognitive learning theory, general systems theory, and instructional theory. ISD models

are often influenced by multiple theories. For example, a particular ISD model may have

steps that include all of the following: (1) the teacher’s role in the classroom, specifically

management and disciplining students (behaviorism), (2) the student’s role in

understanding their own knowledge levels and deficiencies (cognitive learning theory),

(3) the school’s role in the cycle of evaluation (general systems theory), and (4) the peer’s

role in facilitating learning by providing feedback to their classmates (instructional theory).

Behavioral Learning Theory. Seel and Dijkstra (1997) state that ISD models are

generally based on planning and evaluation that is characterized in the stimulus-response

theory and stimulus control, which is reminiscent of behaviorism. The central theme

behind behavioral learning theory, simply put, is B. F. Skinner’s belief that learning can

be understood through observing cues of a learner within his or her environment

(Driscoll, 2005, 2012). Characteristics of a model based upon behavioral learning theory

can include conducting a skill analysis and determining the component skills necessary to

change a behavior and improve performance (Gropper, 1983). An element of the 4C/ID

model (van Merriënboer, Clark, and de Croock, 2002) that was influenced by

behaviorism is the emphasis on the “integration and coordinated performance of task-

specific constituent skills” (p. 39).

Cognitive Learning Theory. Cognitive approaches to teaching and learning

foster the acquisition of knowledge and attainment of higher-order thinking skills

(Tennyson & Rasch, 1988). Cognitive psychologists and theorists asservate the mental

processes are what explain how learning occurs (Richey, Klein, & Tracey, 2011). Jerome

Bruner, a cognitive psychologist, suggested that one factor for human development (i.e.

knowing when a child has developed; the endpoint) is thinking and a well-developed and

intelligent mind that can think at higher levels and make predictions (Driscoll, 2005).


Sink (2008) states that cognitive learning theory provides instructional designers with the

“conditions that make it more likely learners will acquire the thinking strategies” (p. 205)

necessary to achieve in the workplace and in other learning environments.

Robert Gagné created a taxonomy of learning outcomes and learning 

conditions in addition to the Nine Events of Instruction (Driscoll, 2005), all of which

have a foundation in Cognitive Learning Theory. The learning outcomes consist of  

(1) Verbal information 

(2) Intellectual skills 

(3) Psychomotor skills 

(4) Attitudes, and  

(5) Cognitive strategies (Reiser, 2007).  

In particular, verbal information, intellectual skills, and cognitive strategies 

emphasize cognitive development. Verbal information strategies include 

memorization and recall (Driscoll, 2005; Gagné et al., 2005), mnemonics, and 

rehearsals (Richey, Klein, & Tracey, 2011). Intellectual skills, as described by Gagné 

et al. (2005), are the basis for formal education, and the skills to develop can range

from skills appropriate for early childhood (e.g. vocabulary development) to higher 

education (e.g. advanced mathematical calculations for engineers, educational 

research techniques). Cognitive strategies are the “capabilities that govern the 

individual’s own learning, remembering, and thinking behavior” (p. 50). Cognitive 

strategies are usually domain specific and are developed through experience. One 

strategy to promote the cognitive strategies outcome is to use real‐world cases that 

foster critical thinking and strengthen problem‐solving skills (Gagné et al., 2005; 

Tennyson & Rasch, 1988).  

General Systems Theory. Most ISD models describe systematic processes for

designing instruction and are based upon general systems theory (Edmonds, Branch, &

Mukherjee, 1994). A system consists of interdependent groups of things that interact

regularly and perform functions consistently toward a common goal. In a system each

component is critical to the successful functioning of the system (Dick, Carey, & Carey,

2005; Edmonds, Branch, & Mukherjee, 1994; Richey, Klein, & Tracey, 2011).


General systems theory is also known as a systems approach (Richey, Klein, &

Tracey, 2011). A systems approach to instructional design generally consists of various

analyses, defining learning and performance objectives, designing and developing

interventions, implementation of the intervention, and formative and summative

evaluations (Dick, Carey, & Carey, 2005; Richey, Klein, & Tracey, 2011). The best-known models based on general systems theory are the ADDIE model (Figure 2.1) and

the Dick and Carey model (Figure 2.2). A lesser-known instructional design model is

Merrill’s (2002b) Pebble-in-the-Pond model (Figure 2.3), which describes a systematic

approach to applying the First Principles of Instruction.

Figure 2.1. The ADDIE model is a systematic approach to instruction. Diagram from Gustafson and Branch (2002, p. 3).

Figure 2.2. Dick and Carey Systems Approach Model. Diagram from Dick, Carey, and Carey (2005).

Figure 2.3. Merrill’s (2002b) Pebble-in-the-Pond Model.

Instructional Theory. Instructional theory was the predecessor of instructional

systems design theories and models (Richey, Klein, & Tracey, 2011). Instructional theory

explains the principles of curriculum design and student learning including the

identification and alignment of learning objectives with instructional strategies, content

selection, sequencing of content, assessments, and feedback (Richey, Klein, & Tracey, 2011). Instructional theory “offers explicit guidance on how to better help people learn

and develop… kinds of learning and development may include cognitive, emotional,

social, physical, and spiritual” (Reigeluth, 1983, p. 5). Gagné’s Nine Events of

Instruction (see Table 2.1) is one example of an instructional model that has a foundation

in instructional theory because of its emphasis on student learning and the alignment of

instructional strategies with learning outcomes and the conditions of learning. Some

specific elements in the Nine Events that directly correlate with instructional theory

include determining and informing learners of the learning objectives, determining

appropriate instructional sequencing of the content and presenting the content, eliciting

performance, and providing feedback to the learner.

Table 2.1

Gagné’s Nine Events of Instruction

1 Gaining Attention

2 Informing Learners of the Objective

3 Stimulating Prior Recall

4 Presenting the Content

5 Providing Learning Guidance

6 Eliciting Performance

7 Providing Feedback

8 Assessing Performance

9 Enhancing Retention and Transfer

(Driscoll, 2005, p. 373)

First Principles of Instruction

Below a detailed description of the First Principles of Instruction, developed by

Merrill (2002a), is presented. Merrill asserts that this set of prescriptive principles is just

that: principles, not a model. The literature review challenges that assertion, and for

the purposes of this research the First Principles of Instruction will be viewed as a model.


The literature written by Merrill about the First Principles of Instruction has evolved to

include more descriptions, prescribed sequencing, and graphical analogies, all of which

are characteristics of models. In a later publication Merrill (2009d) expanded his original

graphical representation of the First Principles of Instruction to include arrows that

illustrate (see Figure 2.4) a “four-phase cycle of instruction” (p. 52), providing further evidence that this set of prescriptive principles can also be viewed as a model.

Figure 2.4. The First Principles of Instruction, an illustration of the four-phase cycle. Diagram from Merrill (2009d).

In an open dialog with Dr. M. David Merrill, a leader in the Instructional Systems

Design field and author of the First Principles of Instruction, at the 2003 Association for

Educational Communication and Technology (AECT) International Convention, an

audience member asked Dr. Merrill about his concerns for the ISD field and the practice

of instructional design and development (Spector, Ohrazda, Van Schaack, & Wiley,

2005). Merrill’s response was two-fold. First, he expressed a concern that most

instruction was being designed and developed by “designers-by-assignment.” Designers-

by-assignment are individuals who are creating instruction and doing instructional design

tasks without being formally trained in instructional design (Merrill, 2007a). Further,

Merrill asserts that graduates of instructional design programs were not actually designing instruction and developing instructional design expertise but working as project

managers and supervisors of “designers-by-assignment” (Merrill & Wilson, 2007;

Spector, et al., 2005). Merrill (2007a; in Spector, et al., 2005) claims that 95% of all

instructional design work is created by designers-by-assignment, which may be the cause

of so much instruction being ineffective, inefficient, and disengaging. Second, Merrill

states that as a field, ISD’s “real value proposition is not training developers; it’s studying

the process of instruction… [The] value is making instruction more effective and more

efficient no matter how we deliver it or what instructional architecture we use. We ought

to be studying the underlying process of instruction” (Spector et al., 2005, p. 309).

Recognizing the need to determine what the underlying processes and

fundamental truths were in ISD, Merrill sought to systematically review the abundance of

ISD theories and models, research on learning and instruction, and common instructional

design practices (Merrill, 2002a, 2009a, 2009b) with the intent to discover the basic

truths of instruction and learning. Merrill assimilated the literature and identified a set of

basic principles that theorists, model authors, ISD leaders, as well as researchers and

practitioners could agree upon (Merrill, 2009c). A principle is a proposition or

relationship that is true under “appropriate conditions regardless of the methods or

models which implement” the principles (Merrill, 2009d, p. 43). The main criterion for

the inclusion of a principle was that it had to support e3 learning: effectiveness and efficiency in learning as well as the promotion of learner engagement (Merrill, 2009d).

Subsequent criteria included the general applicability of the principle in common

instructional design methods, programs and environments (Merrill, 2002a).

As a result of this lengthy review, five fundamental principles of teaching and learning were identified and compiled to create the First Principles of Instruction (see Figure 2.5). The five principles encompassed in the First Principles of Instruction

include: (1) problem or task-centered, (2) activation, (3) demonstration, (4) application,

and (5) integration. These principles are defined as follows (Merrill, 2002a, pp. 45-50), and a structured sketch follows the list:

(1) Problem or task-centered– Learning is promoted when learners are

engaged in solving real-world problems

(2) Activation Phase– Learning is promoted when relevant previous

experience is activated


(3) Demonstration Phase– Learning is promoted when instruction

demonstrates what is to be learned rather than merely telling

information about what is to be learned

(4) Application Phase– Learning is promoted when learners are

required to use their new knowledge or skill to solve problems

(5) Integration Phase– Learning is promoted when learners are

encouraged to integrate (transfer) the new knowledge or skill into

their everyday life
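The five principles above lend themselves to a structured summary. As an illustration only, the following Python sketch paraphrases Merrill’s (2002a) definitions as a checklist; the audit helper and the lesson feature names are hypothetical, not part of Merrill’s work:

    # Illustrative sketch: the five First Principles of Instruction as a checklist
    # (paraphrasing Merrill, 2002a). The audit() helper is hypothetical.
    FIRST_PRINCIPLES = {
        "problem/task-centered": "learners solve real-world problems or tasks",
        "activation": "relevant previous experience is activated",
        "demonstration": "instruction shows what is to be learned rather than merely telling",
        "application": "learners use new knowledge or skill to solve problems",
        "integration": "learners transfer the new knowledge or skill into everyday life",
    }

    def audit(lesson_features):
        """Return the principles a lesson does not yet address."""
        return [p for p in FIRST_PRINCIPLES if p not in lesson_features]

    # Example: a lesson that demonstrates and applies new skills but never
    # activates prior knowledge and includes no integration activity.
    print(audit({"problem/task-centered", "demonstration", "application"}))
    # -> ['activation', 'integration']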

Activation

A learner’s prior knowledge is said to be one of the most robust factors that contribute to the acquisition of new knowledge and skill development, consequently leading to higher levels of achievement (Lazarowitz & Lieb, 2005; Todorova & Mills,

2011). Merely having a learner recall information and previous experiences is not

sufficient to stimulate a pertinent mental model that is necessary to construct new knowledge (Merrill, in press). Using inappropriate strategies to activate prior knowledge can have an adverse effect on a learner’s ability to achieve by allowing the learner to recall a mental model that is not relevant (Merrill, in press; Todorova & Mills, 2011).

Figure 2.5. Framework of Merrill’s (2002a, 2008) First Principles of Instruction. *Merrill initially used the term problem-centered and later added the term task-centered.

Todorova and Mills (2011) posit that effective instructional strategies to activate prior knowledge should “build positive and consistent knowledge” and lessen or eliminate the damaging influence of misconceptions (p. 23). Lazarowitz and Lieb (2005) suggest using a formative assessment to determine precisely what learners’ prior knowledge is and then developing strategies to build upon the varying levels of learners’ prior knowledge. Merrill (2009d) asserts that having learners share prior experiences with their peers enhances the activation of prior knowledge. In addition, a key strategy is to provide some type of facilitation to ensure that appropriate mental models are being activated (Merrill, 2009d).

Demonstration

Merrill selected the demonstration or “show me” principle in order to emphasize

the great importance of showing learners how to apply the component skills instead of

just telling the learners what to do (Merrill, in press). Demonstrations can provide a

meaningful context to general information, help learners develop causal explanations

(Straits & Wilke, 2006), and augment a learner’s imagination (Driscoll, 2005). The use of

demonstrations can be used to attract the learner’s attention by arousing perceptual

curiosity (Keller & Deimann, 2012) and sustaining curiosity by coupling demonstrations

with problem-solving activities (Driscoll, 2005).

Examples and non-examples can be used to demonstrate concepts; step-by-step processes should be shown to demonstrate procedures; modeling is a technique used to demonstrate behaviors; and graphic organizers, charts, and models can be used to portray processes (Merrill, 2002a). The proximity of the information and the demonstration, whether in time or location, is as important as the demonstration

itself. Mendenhall, Buhanan, Suhaka, Mills, Gibson, and Merrill (2006a, 2006b) (see

Figure 2.6) designed the interface of an online entrepreneurship course to guide learners

in “processing the [general] information and for attending to the critical aspects of the

demonstration in a specific [portrayal]” (Merrill, in press, p. 11). The presentation of general information is located on the left and the demonstration/portrayal is on the right side, allowing the learners to see the direct relationship between the concepts and the demonstration of the concepts.

Figure 2.6. General information is located directly next to the demonstration/specific portrayal, which guides the learner from the concept being taught to the demonstration of that concept (Mendenhall et al., 2006b).

Application

Merrill (2002a, 2007b) uses the term application to denote instructional

interactions or practice of knowledge and skills that are being taught during instruction.

After a component skill is taught and demonstrated, the learner should be provided with multiple opportunities to apply the new knowledge. During the application phase, learners should be given guidance (Merrill, 2002a). Guidance should be diminished as learners become more proficient during practice and should be withdrawn after the learner demonstrates the ability to complete the tasks independently (Driscoll, 2005).

Part of guiding the learner is to provide valuable feedback along the way. Feedback

should be corrective, specific, and result in improved performance (Merrill, 2007a).


Integration

In order for the transfer of knowledge and skills to occur, a learner must be

provided with an opportunity to apply the newly acquired knowledge and skills in a novel

situation. “Learning from integration is enhanced when learners create, invent, or explore

personal ways to use their new knowledge or skill” (Merrill, 2009d, p. 53). In addition,

learning from integration is promoted when learners are given opportunities to go public

with their new knowledge and skills by demonstrating their new skills, pondering and

reflecting on experiences, discussing the things they learned, and defending their

knowledge and skills (Merrill, 2009d; in press).

Problem or Task-Centered

A problem-centered or task-centered approach engages the learner in solving

authentic real-world problems or completing real-world tasks. Merrill (2002a, 2007a,

2007b, 2009, in press) states that knowledge acquisition and skill development occur

when the learner is actively engaged in solving real-world problems or tasks. When

learners are solving real-world problems, they are more motivated to learn because they find relevance within the authentic environment (Mendenhall et al., 2006a; Merrill, 2009b; Keller, 2010). An authentic real-world problem is one that can be ill-structured (Jonassen, 1997), does not usually have a specific outcome or single solution (Merrill, 2007b), requires the same cognitive demands as if the learner were in the “real world” (Savery & Duffy, 1995), and is something the learner can expect to confront later (Merrill, 2007b). Ideally, instruction should contain a progression of problems from simple to complex, with guidance occurring most heavily at the beginning of the instruction and gradually diminishing until the learner completes a problem on their own (Mendenhall et al., 2006a; Merrill, 2009b).

Merrill prescribes several steps to assist instructional designers in the appropriate

selection of real-world problems and tasks as well as the component skills necessary to

complete the real-world problems and tasks (see Merrill, 2007b; Merrill, in press).

Reigeluth and Carr-Chellman (2009) assert that instructional designers, and especially designers-by-assignment, require guidance when trying to apply these principles in various situations in order to obtain e3 learning. Collins and Margaryan (2005) state that while the First Principles of Instruction are beneficial criteria when designing instruction, they may not be as universal as Merrill (2002a) claimed and may need to be adapted to fit specific needs in various situations. In the following section, the use of the First Principles of Instruction by researchers and practitioners will be discussed.

Use of First Principles of Instruction

Gardner (2009, 2010, 2011a, 2011b) has conducted considerable research and

development using the First Principles of Instruction. Gardner (2011a) recognized the

difficulty in applying these principles in real instructional design settings thus he created

a worksheet to assist instructors, who are often untrained in instructional design (i.e.

designers-by-assignment). The worksheet consists of a series of questions and

subsequent strategies on how to apply the principles. Gardner (2010) takes the

instructor/designer-by-assignment through each of the principles asking various

questions. The worksheet contains questions like, “What real-world, relevant problem or

task will the learners be able to perform when they finish this lesson or unit?” “How will

your students preview what they learn?” “How will you show the learners how to

perform real-world problems or tasks?” (p. 22).

Gardner and Jeon (2009) discuss the design and development decisions they made

while creating online training on using a suite of administrative tools (e.g. financial aid,

registration, etc.) for a large university. They describe the conditions (i.e. environment,

client requirements, obstacles) under which they were to apply the First Principles of

Instruction and the decisions they made in order to work around those conditions.

Gardner (2011a) conducted a study on how award-winning professors apply the

First Principles of Instruction in face-to-face courses. The participants of this study

included one professor from each of the following departments: (a) Family, Consumer,

and Human Development, (b) Marketing, Nutrition and Food Science, and (c)

Economics. For the activation phase, the professors applied the following strategies to activate prior knowledge: (1) identifying outcomes from prerequisite courses and using them as the foundation on which to build the new knowledge; (2) reviewing in class the course content presented in prior class sessions; and (3) beginning each class by asking students questions about concepts taught previously and then proceeding to more abstract and complex

questions. For the demonstration phase, some professors used worked examples to show how to perform complex calculations, while another professor, from the Family, Consumer, and Human Development department, had her students demonstrate the lesson plans they had developed by teaching a class at a local pre-school. Each student had an opportunity to teach and then to observe and evaluate the others. During the application phase, these same students were given real-world case studies and discussed the implications of the cases. The professor used reflection for the integration phase, having students openly reflect on their experiences and share those experiences with their peers.

Mendenhall et al. (2006a, 2006b) developed an online entrepreneurship course using the First Principles of Instruction. They describe their use of the First Principles of Instruction, emphasizing the progression of problems used in the instruction. Working closely with their subject matter experts (SMEs), the instructional designers and SMEs determined to use real-world cases to help learners create business plans and eventually start their own businesses. The progression of whole tasks began with a simple business and business plan (i.e. a pig farm), moved to slightly more complex business plans (i.e. a service business and a retail business), and culminated in a very complex business plan (i.e. a restaurant business). Mendenhall et al. also emphasize how the demonstrations are used and the practice (i.e. the application phase) the learners engage in during the instruction.

A pilot study of the entrepreneurship course was conducted among undergraduate students who were enrolled in a core of business classes that taught the

same concepts (i.e. finances, marketing, business plan writing) as the online

entrepreneurship course. Some participants (module group) were asked to go through the

modules and take a post-test while others (control group) were just given the post-test

without going through the online modules. Seven out of 12 participants in the module

group received a score of 80% or above on the post-test, with six of the module group

participants having received a 90% or above. All eight of the control group participants

received 80% or above with only three having received a 90% or more. The results

indicate that the module using the First Principles of Instruction may be just as effective

as the business core classes.


Kim, Mendenhall, and Johnson (2010) described a conceptual framework of how

to apply the First Principles of Instruction in an online English writing course. They

identified a series of problems/whole-tasks that are scaffolded from simple writing tasks

to complex writing tasks. Using a “content-first” approach, the learners would see a completed example of the whole-task before beginning the modules. They applied the activation principle by choosing a problem for learners to solve from something they use every day: e-mail. The learners activated their prior knowledge by writing a procedural essay on how to open an e-mail. The knowledge and skills gained from the first whole task are taken into account and used as the foundation for the second whole task, thus building upon prior knowledge each time a learner begins a new whole task. This team of instructional designers chose to use examples and non-examples as the demonstration technique. For the application phase, the instructional designers chose to have the learners evaluate their peers’ writing assignments and provide feedback as well as complete a writing assignment of their own. Finally, for the integration phase, students were to complete a new writing task using their newly acquired skills.

One common theme among most of the above descriptions was working closely

with SMEs to determine an appropriate set of whole tasks and the component skills

associated with the whole tasks. Also, it is important to note that the SMEs were not working as the instructional designers; rather, their role was to provide content to the instructional designers so the designers could make pedagogical decisions and apply the First Principles of Instruction. Another noteworthy observation is that not all of the literature mentioned previously about the use of the First Principles of Instruction describes each phase of the First Principles of Instruction or the instructional designers’ decisions in full detail.

Research on First Principles of Instruction

Rauchfuss (2010) conducted an exploratory study that examined the correlation

between years of formal instructional design training, experience, and the use of the First

Principles of Instruction. The sample for this study included instructional designers who had designed and/or developed a course within one year before the study. The designers for this study represented military, corporate, and higher education settings. Rauchfuss evaluated the courses submitted by the instructional designers using Merrill’s (2009b) e3 evaluation rubric. Participants were given a questionnaire about their years of experience

and formal instructional design training. The scores from the course evaluations and

questionnaire were correlated. The results indicated no significant correlation between years of experience and years of formal training. Yet, there was a

significant correlation between years of experience and the use of First Principles of

Instruction (i.e. course evaluation scores). Upon further examination, Rauchfuss (2010)

discovered that novice and expert instructional designers applied the demonstration

principle equally but expert instructional designers were more likely to use the other

principles (i.e. activation, application, integration, problem-centered).

Collins and Margaryan (2005) used the First Principles of Instruction as the basis

for creating a model for designing and evaluating courses developed and used in their

organization. They expanded the First Principles of Instruction to include workplace

specific elements (e.g. collaboration, supervisory and stakeholder involvement,

technology, accommodation of individual learner needs). There were 68 workplace-related courses evaluated using what Collins and Margaryan (2005) called the Merrill+ evaluation criteria. Results indicated that, on average, the courses scored acceptable or higher (on a scale of 1 to 5, acceptable is 3 and above, and a score of 4 and above indicates an advanced level of application of the principle). Specifically, the problem-centered, application, and integration phases scored the highest, while the activation and demonstration phases scored 2.7 and 2.6, respectively.

Most of the research relating to the First Principles of Instruction is quantitative, using experimental, quasi-experimental, or exploratory methods that looked at various learning outcomes like self-direction, motivation levels, and improved performance (see Gardner, 2011b; Francom, 2011; Rosenberg-Kima, 2012; Thomson, 2002). Very little research has been conducted on how instructional designers use the First Principles of Instruction, their design decisions, or the ecological validity of the application of these principles. While the research mentioned previously is important and necessary in the validation of the First Principles of Instruction, significantly more research needs to be conducted. In order to validate the universality and feasibility of applying the principles, research needs to be conducted under conditions that are not controlled and experimental but natural and dynamic.


Instructional Designer Decision-Making

Instructional Systems Design (ISD) is a complex, ill-structured, problem-solving

activity (Jonassen, 1997) that involves decision-making procedures (Winn, 1990). The

instructional design process is dependent upon the decisions that instructional designers

make (Rowland, 1993). Decisions made by instructional designers vary significantly

between novice and expert instructional designers; furthermore, variations among expert

designers are also apparent (Rowland, 1993). Research has indicated that expertise in other domains (and presumably in ISD) does not equate to good decision-making 100% of the time and that experts sometimes make inadequate decisions that are “inaccurate and unreliable” (Shanteau, 1992, p. 11). Shanteau (1992) posits that previous research indicating that most experts consistently make poor decisions is deficient. He states that decision-making is situation specific and dependent on the skills and abilities of the individual (Shanteau, 1992), and it can be assumed that previous research did not take these variables into account. Decision-making in ISD is influenced by a myriad of variables, some of which include the following (Carliner, 1998; Le Maistre, 1998;

Rowland, 1993):

• Knowledge and understanding of ISD

• Ability to apply knowledge in real-world settings

• Skills

• Experience levels

• Attitudes and beliefs

• Working environment and conditions (e.g. management, team members)

• Conditions of the ISD project (e.g. client restrictions, available resources,

project requirements, etc.)

• Complexity, scope, and goals of the ISD project 

Extensive research has been conducted on novice-expert differences and their

decision-making processes and abilities. Shanteau (1992) identified, through literature

reviews, some common characteristics that differentiate an expert from a novice and how

that affects decision-making. Experts are believed to:

• Have extensive and current content knowledge

• Have an acute awareness that helps them to synthesize information

• See patterns that novices cannot see

• Differentiate between relevance and irrelevance when a decision needs to be made

• Simplify complex problems

• Communicate ideas, problems, and solutions more effectively than novices

• Adapt decision strategies based on situational conditions (see Le Maistre, 1998, p. 23; Shanteau, 1992, pp. 14-16)

Novices, on the other hand, may have a good knowledge base but lack the

experience and ability to apply the knowledge and solve problems efficiently and

effectively (Ertmer, York, & Gedik, 2009). When trying to identify a problem, novices

tend to summarize and list superficial problems instead of synthesizing and looking

deeper at the relationships between the superficial problems (Ertmer & Stepich, 2005).

Novice instructional designers often focus on the tasks and precisely apply prescriptions

of models instead of understanding the underlying principles (Ertmer & Stepich, 2005;

Ertmer, York, & Gedik, 2009; Reiser, 2004). Experts in ISD do not always follow the

prescribed principles when designing instruction instead they frequently make

adaptations to fit the context of the instructional design problem (Christensen &

Osguthorpe, 2004; Ertmer, York, & Gedik, 2009; Wedman & Tessmer, 1993).

To obtain expertise in ISD, it is said that a designer must have 10 years of consistent hands-on experience (Perez & Emery, 1995). One criticism of the ISD field is that many graduates of ISD programs go on to be managers of organizations or project managers and do very little actual design work, thus inhibiting the development of expertise (Merrill in Spector et al., 2005). The identifying characteristics of instructional design

expertise are difficult to determine because expert knowledge is tacit (Winn, 1990) and

solving complex ISD problems is context dependent (Jonassen in Ertmer & Stepich,

2005).

Ertmer, York, and Gedik (2009) conducted a qualitative study aimed at understanding how expert instructional designers put ISD principles into practice. Many of the experts concurred that they began the ISD process with the end in mind instead of doing a thorough task and target population analysis, citing that constraints, resource restrictions, and client needs have a “strong influence on what can be accomplished” (p. 24). Being sensitive to the context and knowing how to create quality instruction within the given constraints is believed by some experts to be a predictor of success. The expert participants also indicated that they did not use the procedures as described in textbooks and often taught in the classroom. The experts used many of the ISD principles, just not as prescribed.

Design and Development Research

A call has been made to conduct design and development research in order to

advance the field of instructional systems design (ISD) and add to its knowledge base.

Design and development research can promote the development of theory and provide

empirical evidence of validation (Richey & Klein, 2007). Richey and Klein (2007) define

design and development research as the “systematic study of design, development, and

evaluation processes with the aim of establishing an empirical basis for the creation of

instructional and non-instructional products and tools and new or enhanced models that

govern their development” (p. 1). Reeves (2000) claims that design and development research will help solve persistent problems occurring within the current realm of ISD research, like the poor quality of published research and literature reviews that are

confusing and insufficient. Furthermore, he asserts that ISD professionals, and educators

in general, have a narrow and simplistic view of research, and some ISD professionals

gravitate toward basic research “regardless of whether it has any practical value” (p. 2).

Expanding the view of ISD research to include design and development research will

provide additional rigor necessary to solve the poor quality issues.

The poor quality can stem from the lack of rigor in basic ISD research. One reason for the poor quality may be that most treatments used in experimental or quasi-experimental studies are generally completed in less than one hour (Clark, 1983; Reeves, 2000). Design and development research, on the other hand, requires a much longer time commitment because data collection typically lasts many weeks, months, or more (Reeves, 2000) and often requires a mixed methods approach

(Richey & Klein, 2007) in order to capture the rich information afforded through this


type of research. The methods of design and development research are similar to other

types of research. Design and development researchers use qualitative methods like

structured and semi-structured interviews, focus groups, observations, and document

analysis. Quantitative methods can include surveys and evaluations. The specific methods

are dependent on the research questions and the goal of the research.

The goals of design and development research vary depending on the category of research. There are two major categories of research encompassed within design and development research: product and tool research and model research (Richey & Klein, 2007). In an effort to distinguish design and development research from other types of design research (e.g. design-based research), a description of each of the major categories of design and development research follows.

Product and Tool Research. Product and tool research is conducted while a

product (e.g. online course or training program) or a tool (e.g. electronic performance

support system, knowledge object repository, or automated assessment system) is being

designed and developed. One goal of product and tool research is to provide empirical

support on the identification and resolution of instructional design problems (Hung,

Smith, Harris, & Lockard, 2007). This type of research is considered formative because

the research is conducted throughout the design and development process (Richey &

Klein, 2009; van den Akker, 1999). Product and tool development research is usually

reported as a case study with significant detail about how the product or tool was

developed and the decisions instructional designers made. A description of the

environment or situation under which the tool was developed is often included in the case

study. Research activities like pilot testing, expert reviews, evaluations, and assessments

are essential to establish validity in product and tool development (Richey & Klein,

2007).

Hung, Smith, Harris, & Lockard (2010) conducted research using the product and

tool research design framework. They investigated the process and application of a

design framework (Ausubel’s advance organizers) in a teacher’s performance support system (TPSS). Hung et al. (2010) used a six-phase approach to designing, developing, and collecting data. First, they identified the theories and models to guide them in the design and development of the TPSS, using previous research and literature to help determine the theories and models to use. Second, they determined the users’ (e.g. teachers’) skill sets and knowledge about classroom management, how the teachers develop and decide to use instructional strategies, and what is needed in a performance support system. To collect this information, researchers conducted a focus group of 13 teachers. In addition, researchers had participants fill out a user profile survey. During Phase 3, the developers converted system requirements into interactive storyboards that modeled a semi-functioning system. Data were collected from storyboards and design requirements created by five information architects and developers. Phase 4 required experts to review and evaluate the system; using the Delphi method, experts appraised the TPSS and provided recommendations. In Phase 5, the recommendations were taken into account and a functioning prototype was developed using a rapid prototyping method. During this phase, usability data were collected for two iterations of the product. Finally, in Phase 6, a full implementation of the system took place in a real-world setting. The initial 13 participants went through the TPSS while researchers assessed the system for effectiveness in guiding the users in making appropriate decisions in their classrooms.

Data were collected from surveys, interviews, and document analysis (e.g. review of

activity logs).

Model Research. Model research consists of three distinct methodologies: (1)

model development, (2) model validation, and (3) model use (Richey & Klein, 2007).

Each of these types of studies often employs similar methodologies that include both

qualitative and quantitative elements and are generally exploratory in nature (Richey &

Klein, 2008).

Model Development. Model development studies explore the theoretical

foundations and processes taken by model developers and researchers. Significant review

and synthesis of the literature is required in model development research. Furthermore,

data for model development research can also include data from the developer and users

of the newly constructed model (Richey & Klein, 2007). Jones and Richey (2000)

described the development of a rapid prototyping model in a naturalistic real-world

setting. The research described the different phases (i.e. they used an ADDIE approach)

in the development process. Two experienced senior instructional designers and one

customer were the subjects of this study. Data collection methods included instructional


designer interviews, survey data, task logs, and content analysis of extant data (see p. 71).

In addition, customer data included a semi-structured telephone interview. This study

resulted in a revised rapid prototyping model.

Model Validation. Model validation research stems from the need to challenge

the quality and rigor of ISD models and to reduce the gap between theory and practice.

There is a need to provide evidence and empirical support of the model’s effectiveness

instead of relying on unsubstantiated claims and testimonials of effectiveness (Gustafson

& Branch, 2002; Richey, 2005; Richey & Klein, 2007). Richey (2005) describes model

validation as “a carefully planned process of collecting and analyzing empirical data to

demonstrate the effectiveness of a model’s use” (p. 174). Model validation processes

occur either internally or externally. Internal model validation relates to the “integrity” of

the components and processes of the model and how the model is applied in instructional

design situations (Richey, 2005). Internal validation looks at the individual components

of the model and how they function, aiming to answer questions like “Are the steps

manageable in the prescribed sequence? To what extent does the model address relevant

environmental factors? To what extent is the model usable for a wide range of design

projects and settings?” (Richey & Klein, 2007, p. 23).

External validation refers to how the model impacts the products that employ the

model and the end users (Richey, 2005; Richey & Klein, 2007). The goal of external

validation is to identify the product characteristics and determine the effect the model has

on teaching and learning. It aims to answer questions like “To what extent does the

resulting instruction meet learner needs, client needs, and client requirements? To what

extent do changes occur in learners’ knowledge, attitudes, and/or behaviors after

instruction?” (Richey, 2005, p. 175).

Wilson (2011) conducted an external validation of an instructional design model

that assists designers in designing, implementing, and evaluating simulations used for instruction. The specific design characteristics that were being validated in this study included the use of objectives, problem solving, fidelity, feedback, and debriefing. Wilson

(2011) examined the processes used by the instructional designer during the designing of

the simulation, how the simulation was implemented by nursing faculty, and course

evaluations from participating students and faculty. The study employed both qualitative


and quantitative research methods. Qualitative data collection methods included

document analysis of designer logs and faculty preparation logs and semi-structured

interviews. Quantitative methods included scores from pre- and post-tests administered to

students. Results indicated that the simulation model worked fairly well for the problem

solving and fidelity characteristics however, there were some weaknesses in the model

and characteristics that were not even addressed in the model (i.e. role of observers).

Wilson (2011) recommended ways the model could be improved including addressing

characteristics that were not described in the original framework.

Tracey (2009) conducted a design and development study focusing on the

construction and application of the Multiple Intelligence (MI) Design Model in hopes of

providing validation for this model. She used two different types of model validation approaches: a designer usability study and a product impact study. The researcher had two teams of two instructional designers design and develop the same instructional modules; however, only one team used the MI Design Model. During the usability phase of the study, instructional designers’ reactions, “program tryout” data, and evaluation data were analyzed and used to revise the MI model. Results indicated that the instructional designers using the MI Design Model responded positively; specifically, they found two of the components particularly favorable.

Model Use. Model use studies concentrate on the “conditions” or factors (e.g.

development environment, availability of resources, constraints, client requirements, etc.)

that affect how a model is applied in an instructional design project. This type of research

can employ both qualitative and quantitative measures. Model use research can include

the following research methods (see Richey, 2005; Richey & Klein, 2009):

• Document analysis of instructional design and communication documents

• In-depth interviews with instructional designers, model users, and model

developers

• Surveys

• Focus groups

• Evaluations of the product

• Expert reviews


Moreover, the research is generally exploratory or descriptive and is represented

in case studies (Richey & Klein, 2007). When conducting exploratory research, the

researcher, focuses on the prescribed processes of the model as they occur whereas, the

detailed use of the model and instructional designer’s decisions regarding the use of the

model, is the emphasis of a descriptive study (Richey & Klein, 2007).

As part of a mixed-methods exploratory study, Gardner (2011b) described how he used the First Principles of Instruction to redesign a biology course. The purpose of his study was to test the effectiveness of the First Principles of Instruction in an effort to validate these principles. Reflections regarding decision-making strategies were outside the scope of this study. Gardner evaluated the modules that used the First Principles of Instruction but mainly tested efficacy through quantitative measures: pre- and post-test scores from students using the modules.

This research is considered model research that focused on the use and validation

of the First Principles of Instruction in a specific context. This research employed both

qualitative and quantitative methods to investigate how instructional designers used the

First Principles of Instruction. Their decisions and the conditions surrounding those

decisions were also examined. Moreover, multiple methods of data collection were used

to triangulate the data for more conclusive results and to help prevent bias. The research culminated in the present case study.


CHAPTER THREE

METHODOLOGY

The purpose of this study was to investigate the use of First Principles of

Instruction (Merrill, 2002a) and the design and development decisions made by

instructional designers to determine if these principles are conducive to being

implemented in a short-term, high volume, rapid production of teacher professional

development modules. The primary research question that was addressed in this study

was: How were the First Principles of Instruction used by instructional designers in a short-term, high volume, rapid production of online K-12 teacher professional development modules? In an effort to answer the main research

question, four additional supporting research questions were addressed: What were the

conditions under which the First Principles of Instruction were used? What design

decisions regarding the First Principles of Instruction were made during the project?

What is the level of understanding of the First Principles of Instruction by instructional

designers? Lastly, how frequently do the modules incorporate the First Principles of

Instruction?

Research Design

This design and development case study aimed to describe the use of Merrill’s

First Principles of Instruction (2002a, 2007a, 2007b) and validate the use of these

principles within a specific context (i.e. short-term, high volume, rapid production). This

research, described by Richey and Klein (2007) as a model use (i.e. the use of a set of

prescriptive principles) and validation study, explored the decisions made by designers

and the conditions (i.e. client restrictions, resource limitations, instructional design

setting) surrounding the use of First Principles of Instruction. Qualitative research

methods including interviews, surveys, and document analysis were employed while

conducting this case study. A case study is defined as a strategy of inquiry where a

researcher explores a phenomenon in depth (Creswell, 2009) and holistically describes


and analyzes the information rich data (Merriam, 1988). For this study, the researcher

took an emic approach and retrospectively described the case. The term “emic

perspective” means to take an insider’s perspective (Merriam, 1998; Patton, 2002). This

perspective is necessary because the researcher for this study was an “insider” working as

the lead instructional designer. The researcher was an instructional designer and

supervised the other instructional designers. Since the development of the modules concluded before this research study began, data were collected retrospectively.

Participants

Participants for this study included 15 instructional designers and “designers-by-assignment”. “Designers-by-assignment” refers to individuals who are doing instructional design work but have not had formal instructional design training (Merrill, 2007a). Participants represented five countries, with the majority (7) from the United States, 5 from Turkey, and 1 each from South Korea, Malaysia, and Thailand. Participants were

graduate students, recent graduates, faculty, and visiting scholars employed at a

multidisciplinary research and development organization at a large research university

located in the southeast region of the United States. There were eight male and seven female participants, and their average age was 33.7 years (SD = 6.0). There were five participants who had PhDs (four with instructional design related PhD degrees), seven with master’s degrees (four with instructional design related master’s degrees), and three with bachelor’s degrees. Eleven participants were currently working towards either a master’s degree or a PhD (nine pursuing instructional design related degrees).

Twelve of the participants indicated they had previous instructional design experience.

The average number of years of previous instructional design experience was 3.6 (SD = 5.6).

Participants were purposefully selected based on their involvement with a short-

term, high volume, rapid instructional design and development project that used the First

Principles of Instruction as a model for the modules they were creating. Specifically, the

participants needed to have contributed to the instructional design and development of at

least one professional development module. Some members of the instructional design

team participated in other design tasks (e.g. evaluation of modules, media selection and

creation) but did not actually design any portion of a module. Those participants were not


included in this study. An additional selection criterion included the length of time the

participant worked on the project. Participants had been employed on the project from the

beginning and worked for at least five of the ten weeks. An incentive of $30 was offered

to all participants but not all accepted the incentive.

Setting & Materials

The context for this study was an instructional design project that was federally

funded through a southeastern state’s Department of Education. The project timeline was extremely short, requiring the creation of 49 online modules within a very strict 11-week timeframe. The major task was to use existing face-to-face

professional development materials and convert them to an online, independent study

environment. The goal of the new modules was to familiarize K-12 teachers with the

language of the new standards/benchmarks as well as to enable the teachers to incorporate appropriate instructional strategies into their lessons as they fulfilled the

standard/benchmark requirements. A standard is a state-driven expectation of what the

student is to do and uses broader terminology than a benchmark. A benchmark is similar to a learning or behavioral objective; it is a specific outcome.

Existing Materials. The existing materials were provided by the Florida

Department of Education and were located on a professional development website. These

materials consisted of PDF content guides and slide presentations to be used for face-to-

face professional development training. Much of the content in the existing materials

included trainer pacing guides, subject matter notes, presentation guides, participant

resources, and activity sheets. The existing materials focused heavily on the rationale for

the new standards and differences between the previous standards/benchmarks and the

new standards/benchmarks. In addition, the materials also focused on the statistics of

where the U.S. stands in science and math education compared to other countries. The

existing materials relied on the trainer to encourage audience participation and

discussion. Consequently, much of the subject matter content (i.e. science and math) and

the specifics regarding the instructional strategy (i.e. how-to do it, demonstrations) being

taught were not included in the existing materials.

New Modules. The newly created online modules were designed to be

independent study and flexible in order to accommodate different school districts’


existing professional development training. The modules could be used independently or

clustered with other modules from the same grade-band (see Appendix A). School

districts could also incorporate the online modules into existing in-service teacher

training and other professional development programs.

The focus of the new modules was on the instructional strategy used to teach a

standard/benchmark and on the subject matter that was required based on the

standard/benchmark. The content of the modules included background information used

to provide context to teaching about the standard/benchmark. The background

information included subject matter like light energy, biotechnology, and quadrilaterals.

The new modules (see Figure 3.1) included on-screen bullet points and narrated text, still

images representing the textual information, videos either demonstrating a concept or

used for a practice activity, and audio narration. Typically the general information (i.e.

the steps or skills being taught) was on the left and the specific information (i.e.

demonstration of steps or skills using specific examples) was on the right side of the

screen (see Figure 3.2). Occasionally videos were used as a demonstration and to assist in

a practice activity. Figure 3.3 shows an example of using a video for a practice activity.

Some general information and instructions were provided in the audio narration as well

as in the captioning (the button to turn on captioning was located in the lower left corner).

Figure 3.1. The text is represented as bullet-points and is located on the left, while the specific instance is shown on the right (Johnson, Mendenhall, et al., 2011).


The modules incorporated reflection questions and tasks (e.g. creating lesson

plans and activities to use in the classroom) for learners to do on their own. The feedback and assessment of the lesson plans were left to each school district because the expectations and requirements varied between the schools’ administrations.

Figure 3.2. In this example, which illustrates unpacking a benchmark, the steps and the specific portrayal are shown on the right (Johnson, Mendenhall, et al., 2011).

Figure 3.3. Videos were sometimes used for practice activities. In this case the learner is watching a video and identifying tools used to make observations (Johnson, Mendenhall, et al., 2011).

Project Description. The project employed 28 instructional designers and designers-by-assignment, three principal investigators (i.e. project directors), and six subject matter experts (SMEs). Instructional designers were hired to work between 10 and 40 hours per week. The project began with 14 SMEs, but 6 were more frequent contributors. The SMEs were public school teachers or administrators who had been public school teachers previously. Each SME was proficient in a subject area (i.e. elementary science,

physics, biology, algebra, geometry, etc.). The SMEs were paid to provide subject matter

expertise and to review materials for accuracy. All but one SME worked remotely and

communicated via e-mail and Skype with the instructional design teams. SMEs worked

10 to 40 hours per week on the project.

The project team was divided into three major task-teams (see Figure 3.4): (1) Project Leads (i.e. project lead, lead instructional designer, project administrator), (2) Team Leads (i.e. science and math team leaders), and (3) Instructional Designers (i.e. those who did not have additional administrative duties). To clarify, the participants of this research included the three task-teams mentioned above, and each participant conducted instructional design duties and tasks; however, throughout this research they are referred to by their role (i.e. project lead, lead designer, team lead, instructional designer, or designer-by-assignment). The purpose of differentiating the three task-teams in this section is to help illustrate the context of the instructional design project used for this research. The focus of this research is on the instructional designers and their decisions and not on the project management or administrative tasks.

Figure 3.4. Organizational Hierarchy of Participants


The project leads and team leads played a dual role: they were charged with administrative and management tasks as well as instructional design tasks. Instructional designers were divided into two teams, the science team and the math team. The math team had two team leads and the science team had four team leads, and the teams were assigned SMEs to assist in content selection and approval. During the initiation phase of the project, the project leads determined the instructional approach to be used for the

project. They chose to use the First Principles of Instruction because of their belief that a

real-world, problem-centered approach would be most appropriate to teach this type of

subject matter.

The First Principles of Instruction framework emphasized demonstrations of

concepts, applying new knowledge and skills while creating relevant artifacts, and

reflecting on new knowledge and skills. Before the instructional design phases began, the project leads provided the instructional designers with journal articles and a model (i.e. an online course) that demonstrated how the First Principles of Instruction are used. A three-hour training session kicked off the project. Instructional designers were

trained on, among other things, how the First Principles of Instruction would be used to

develop these online modules. Additional training sessions and meetings that demonstrated the process and modeled the use of the First Principles of Instruction occurred

often throughout the project.

Once the first training session was complete, the major tasks included content

analysis of existing materials, establishing the real-world tasks to be completed by the

teachers, and determining the goals of each of the modules. Then instructional designers

worked with SMEs to write the appropriate instructional materials and to create

demonstrations of the content. Storyboards were created, scripts were written, and

narration was recorded. The instructional designers developed the modules in PowerPoint, which was later converted into interactive modules, using Articulate, by an outside web design and programming company. Instructional designers spent a

considerable amount of time reviewing and providing quality control for the modules

before they were housed in an online content management system where they became

available to K-12 teachers.

Data Sources

Designer Data. A demographic survey (see Appendix B) was administered to

participants online using a secure survey tool; the demographic data included age, gender,

role in the project, education level, length of time working on the project, and design

experience. In addition to demographic data, the survey contained questions asking about

their various roles in previous instructional design projects, comfort level with various

instructional design concepts and tasks, perceptions of First Principles of Instruction, and

how they gained an understanding of the First Principles of Instruction. The perception

and comfort level data were used to triangulate interview and document analysis data as

well as support claims that were made regarding the instructional designers and their

decision-making.

To capture in-depth information about how the participants made instructional

design decisions and the conditions under which those decisions were made, a 60-minute

semi-structured interview (see Appendix C) was conducted with each participant. During

the interviews, participants were asked to describe the conditions under which they made

instructional design decisions. These conditions included the work environment, the

client requirements, and project constraints. Due to the relocation of the participants, most of the interviews were conducted via Skype; each interview was audio recorded, with permission, and transcribed. During the interview,

participants were asked about how they made design decisions, what factors contributed

to making those decisions, and how they used the First Principles of Instruction.

Furthermore, they were asked which tasks were difficult to apply the First Principles of Instruction to; which tasks were easy to apply the First Principles of Instruction to; what top three things they would do differently regarding the use of the First Principles of Instruction; and what top three things regarding the First Principles of Instruction they would do the same if given the chance.

K-12 Teacher Professional Development Modules. The materials created during this project consisted of 49 modules (see Appendix A) that instruct K-12 teachers on newly adopted and updated state science and math standards and benchmarks. In addition,

instructional strategies (e.g. Inquiry, 5E model, Backward Design, Manipulative

Materials) were also taught within the context of the subject matter (e.g. Nature of

Science, Earth Structures, Polynomials, Euclidean Constructions). Each module was

designed to take the learner approximately 15-30 minutes to complete. There were nine

programs (e.g. Science Grades 3-5; High School Geometry) that contained five to seven

modules each. The modules contained audio narration, text, pictorial representations, and

minimal amounts of animation and videos. The real-world task the learner was asked to

complete was to create or select a lesson plan using the instructional strategies that were

taught in the modules to teach concepts that fulfill math or science related standards and

benchmarks. After the instructional designers created design documents and media, the

modules were then programmed by outside developers specializing in programming and web development. After development, the modules were housed in a free online

portal for Florida educators.

For this case study, one module from each program/grade band (Appendix D) was

randomly selected for evaluation. A total of 9 modules were selected, using a stratified

random sampling procedure, for evaluation of the use of First Principles of Instruction. A

stratified random sample is a type of probability sampling where the population is

divided based upon a characteristic and then a sample is randomly selected from each

group (Creswell, 2008). In this case the stratum was the grade band (e.g. Science Grades

3-5, High School Algebra, etc.).
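To illustrate this procedure, the sketch below draws one module at random from each grade-band stratum. It is a minimal Python sketch; the catalog of module titles shown is hypothetical and stands in for the actual program listing in Appendix A:

    import random

    # Hypothetical catalog: each grade band (the stratum) maps to its modules.
    catalog = {
        "Science Grades 3-5": ["Nature of Science", "Earth Structures", "Light Energy"],
        "High School Algebra": ["Polynomials", "Linear Functions"],
        "High School Geometry": ["Euclidean Constructions", "Quadrilaterals"],
    }

    def stratified_sample(strata, per_stratum=1, seed=None):
        """Randomly select per_stratum items from each stratum."""
        rng = random.Random(seed)
        return {band: rng.sample(modules, per_stratum)
                for band, modules in strata.items()}

    # One randomly selected module per grade band, as in this study's sampling.
    for band, modules in stratified_sample(catalog, seed=42).items():
        print(band, "->", modules[0])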


Extant Data. Project management documents including timelines, instructional

designer assignments, quality control documents, instructional design templates and

models, recorded WebEx meetings, and email communications were used to triangulate

designer data. These data provided insight into the conditions (i.e. work environment,

client requirements, available resources, obstacles, and restrictions) that contributed to

the instructional designers’ decisions on using the First Principles of Instruction.

Recordings of team meetings provided data about the instructional designers’ level of

understanding of the First Principles of Instruction and their decisions on how to apply

the First Principles of Instruction.

Instrumentation

Demographic and Design Knowledge Survey. To determine the participants’ (a)

instructional design expertise levels, (b) how they learned about the First Principles of

Instruction, and (c) their perceived level of understanding of the First Principles of

Instruction, participants filled out a 21-item online demographic and design knowledge

survey (see Appendix B). The demographic portion of the survey contained questions

regarding participants’ backgrounds (e.g. age, gender, highest degree completed, etc.). The

design knowledge section of the survey asked participants about their comfort level when

using various ISD models, applying learning theories, designing a module from scratch,

selecting appropriate technologies, and developing instructional media assets. In addition,

participants were asked to rate their level of understanding of the First Principles of

Instruction and how they came to know the First Principles of Instruction.

First Principles of Instruction Knowledge Survey. Participants completed a

First Principles of Instruction Knowledge Survey (Appendix E). The survey included four

tasks. First, participants were given a short scenario to provide a real-world context for

the tasks and to activate the prior knowledge of the participants. Then they were given a

blank First Principles of Instruction diagram and asked to apply their knowledge and fill

in the model with the appropriate principles. Next, participants defined and described

how each of the principles could promote learning. Lastly, the participants integrated

their knowledge of the First Principles of Instruction. Participants were given the scenario

again and described the strategies they would use and how they would apply the

First Principles of Instruction to create a module.


Module Evaluations. The modules were evaluated using a modified version of

Gardner’s (2011b) evaluation sheet (Appendix F). The sheet was based upon Merrill’s

(2007b) e3 rating scale. Each module was evaluated for the fundamental strategies

constructed from the First Principles of Instruction. The strategies included Tell

(activation), Show (demonstration), Ask (application), and Do (integration). Tell is the

general information or component skill being taught. Show is the specific portrayal and

demonstration of the component skill. Ask is where the learners practice and/or apply the

new knowledge or skills just learned. Do is where the learners integrate their new knowledge by creating an artifact or completing a real-world task.

Procedures

Participants were purposefully selected because they were members of an instructional design team that created online instructional modules teaching K-12 teachers about the Next Generation Sunshine State Standards and instructional strategies. The project had been completed by the time of this research, and many of the participants had returned to their home countries or moved elsewhere. To recruit participants, the researcher sent an e-mail requesting their participation (Appendix G); an informed consent form (Appendix H) was attached. When an individual agreed to participate, the researcher sent a link to the online demographic and design knowledge survey and the First Principles of Instruction Knowledge Survey. After each participant completed the surveys, the researcher conducted an interview. When the interviews were completed, the digital audio files were transcribed. Once the transcriptions were returned, the researcher reviewed the interview text to determine whether any additional questions or clarifications were needed. The researcher then sent the transcripts to each participant to check for errors and to give participants an opportunity to provide clarification. Member checking is a technique that helps establish the credibility and trustworthiness of the data (Seale, 1999).

Three independent reviewers were selected to score participants' First Principles of Instruction Knowledge Surveys. The reviewers were trained by the researcher on how to score the surveys; after the three-hour training session, they scored the surveys independently. To determine how frequently the modules incorporated the First Principles of Instruction, the same three independent reviewers conducted an evaluation of the modules. After the interviews with the participants were conducted, the researcher trained the three module evaluators on how to score the modules. There were four sessions totaling nine hours of training and evaluation of the modules. The modules were evaluated independently during the training; then the reviewers came to a consensus on the application of each of the First Principles in the modules.

Data Analysis

Demographic and Design Knowledge Survey. Basic descriptive and frequency

statistics were used to analyze the demographic and design knowledge data.

Interviews and Extant Data. The interview and extant data were analyzed using basic qualitative analytical steps, as outlined by Creswell (2009), and a comparative analysis method (Glaser & Strauss, 1967). In a comparative analysis method, the data are coded and analyzed concurrently. Coding is an iterative and interpretive process (Creswell, 2008) that involves organizing the materials into segments and labeling the segments into categories.

In this study, an online qualitative and mixed-methods application, Dedoose (http://www.dedoose.com/), was used to organize and securely store the data online. The tool provided the flexibility to code and analyze the data concurrently. Dedoose allowed the researcher to organize interview text data and web-conferencing recordings (video and audio), and it linked the qualitative data to participants' demographic data to identify patterns and recurring topics among participants. The application also quantified the codes by providing frequency counts, which assisted in the identification of the broader categories. The researcher analyzed each interview three times. First, during the initial interviews, the researcher wrote memos identifying prominent topics brought up by the participants. Second, after participants checked the transcriptions for errors, the researcher reviewed the transcripts, compared them with the original audio recordings, and corrected any transcription errors. During this process more prominent topics were identified and the data were analyzed again. Lastly, a final coding and analysis took place. Once the interviews had been through the first and second coding passes, the researcher used a lean coding technique to aggregate similar codes and eliminate redundant codes in order to reduce the codes to topics (Creswell, 2008). After all of the data had been analyzed, 237 codes had been identified. These codes were reviewed for redundancies and aggregated into broader categories; for example, specific codes such as Work Closely With SMEs, Too Few SMEs, and SMEs Virtual were aggregated into the broader category Subject Matter Experts. After several passes of reviewing and aggregating codes, four main topics and 16 sub-topics were identified (see Table 3.1).

Table 3.1

Topics and Sub-topics

Main Topics                   Sub-Topics
Instructional Design Setting  Project Requirements; Personnel; Designer Experience;
                              Physical Setting; Training and Meetings
Decision Making Power         No sub-topics
Types of Design Decisions     Strategic/Program-Planning; General Decisions;
                              Application Decisions (Activation/Tell,
                              Demonstration/Show, Application/Ask, Integration/Do)
Factors Affecting Decisions   Time; Knowledge/Experience Level; Existing Materials;
                              Online Environment
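The aggregation step described above, in which the frequency counts of specific codes are combined under a broader category, can be illustrated with a short sketch. The code below is a hypothetical example, not the study's actual codes or counts; in the study itself, Dedoose performed the frequency counting.

    from collections import Counter

    # Hypothetical per-code frequency counts from a coding pass.
    code_counts = Counter({
        "Work Closely With SMEs": 12,
        "Too Few SMEs": 7,
        "SMEs Virtual": 5,
        "Not Enough Time": 31,
        "Tight Deadline": 14,
    })

    # Illustrative mapping of specific codes to broader categories.
    code_to_category = {
        "Work Closely With SMEs": "Subject Matter Experts",
        "Too Few SMEs": "Subject Matter Experts",
        "SMEs Virtual": "Subject Matter Experts",
        "Not Enough Time": "Time",
        "Tight Deadline": "Time",
    }

    # Aggregate code frequencies into category frequencies.
    category_counts = Counter()
    for code, n in code_counts.items():
        category_counts[code_to_category[code]] += n

    print(category_counts)  # Counter({'Time': 45, 'Subject Matter Experts': 24})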

First Principles of Instruction Knowledge Survey. A scoring rubric (see Appendix I), developed by the researcher, was used to score the participants' knowledge of the First Principles of Instruction. Three instructional designers served as evaluators: two with advanced degrees in instructional design-related fields, one with an advanced degree in nursing education, and all three pursuing a PhD in an instructional design-related field. All of the evaluators had studied or had prior experience with the First Principles of Instruction. The evaluators were given the same articles that the participants in this study had been given, to read and use as a guideline. They participated in a three-hour training session, led by the researcher, to learn how to score the surveys and to discuss any discrepancies in how the surveys were being scored. After the training, the evaluators scored the surveys on their own. Descriptive statistics were used to report scores. The intraclass correlation coefficient was used to measure the amount of agreement among the three evaluators.
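The agreement statistic can be computed from a targets-by-raters matrix of scores. The sketch below implements the two-way random-effects, absolute-agreement, single-rater form, ICC(2,1) in Shrout and Fleiss's notation; the dissertation does not specify which ICC form was used, so this choice, and the 5 x 3 ratings matrix, are assumptions for illustration.

    import numpy as np

    def icc_2_1(ratings):
        """ICC(2,1): two-way random effects, absolute agreement, single rater.

        `ratings` is an (n targets x k raters) array of scores.
        """
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)  # per-target means
        col_means = x.mean(axis=0)  # per-rater means

        # Mean squares from the two-way ANOVA decomposition.
        ss_rows = k * np.sum((row_means - grand) ** 2)
        ss_cols = n * np.sum((col_means - grand) ** 2)
        ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))

        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
        )

    # Hypothetical scores from three evaluators on five surveys.
    scores = [[4, 4, 5],
              [2, 3, 2],
              [5, 5, 5],
              [3, 3, 4],
              [1, 2, 2]]
    print(round(icc_2_1(scores), 3))  # 0.872 for this made-up data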

Module Evaluations. The same three individuals who scored the First Principles of Instruction Knowledge Surveys also evaluated the nine modules using a scoring sheet (see Appendix F). The evaluators indicated whether each strategy was present or not present. After the evaluators came to a consensus, the totals for each First Principle were calculated, providing a frequency count of how often each principle was used in a module.
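Because each evaluation reduces to a present/not-present judgment per strategy, the frequency count is a simple tally over the consensus ratings. A minimal sketch with hypothetical consensus data (three of the nine modules shown):

    # Hypothetical consensus ratings: which strategies were judged present per module.
    consensus = {
        "Module 1": {"Tell": True, "Show": True,  "Ask": True,  "Do": True},
        "Module 2": {"Tell": True, "Show": False, "Ask": True,  "Do": True},
        "Module 3": {"Tell": True, "Show": True,  "Ask": False, "Do": True},
    }

    # Tally how often each principle was present across the modules.
    frequency = {p: sum(ratings[p] for ratings in consensus.values())
                 for p in ("Tell", "Show", "Ask", "Do")}
    print(frequency)  # {'Tell': 3, 'Show': 2, 'Ask': 2, 'Do': 3}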


Table 3.2

Data Collection and Analysis

Source                      Collection Method                             Analysis
Instructional Designers     Demographic and Design Knowledge Survey       • Descriptive statistics
                            First Principles of Instruction Knowledge     • Scoring by rubric
                            Survey                                        • Descriptive statistics
                                                                          • Inter-rater reliability (intraclass
                                                                            correlation coefficient)
                            Interviews                                    • Content analysis
                                                                          • Multi-step lean coding scheme
Extant Data                 Project management documents:                 • Content analysis
                            • Instructional designer assignments          • Multi-step lean coding scheme
                            • Quality control documents
                            • Instructional design templates and models
                            • E-mail communications
                            • Recordings of team meetings
K-12 Teacher Professional   Evaluation rubric                             • Consensus
Development Modules                                                       • Descriptive statistics

Trustworthiness

Guba (1981) created a set of trustworthiness criteria that parallel what are referred to, in traditional scientific terms, as internal and external validity, reliability, generalizability, and objectivity. These criteria, which more closely describe issues of validity and reliability within naturalistic inquiry, are credibility, transferability, dependability, and confirmability.


Credibility. Credibility refers to the accuracy in reporting the phenomena or case being studied (Shenton, 2004). The techniques this study employed to ensure credibility included (1) using well-established research methods, (2) familiarity with the culture of the instructional design team being studied, (3) triangulation of data sources, (4) consulting with research advisors, and (5) peer review and feedback (Shenton, 2004). Specifically, this research employed sound qualitative research methods including interviews with participants, analysis of extant data (i.e., documents and recordings), surveys, and evaluations. The researcher was a participant observer, meaning the researcher was an active participant in the instructional design project and had developed a good rapport with the participants in this study. The multiple methods of data collection and multiple participants can compensate for "individual limitations" (Shenton, 2004, p. 65). Furthermore, in an effort to establish credibility and trustworthiness, the researcher consulted with her advisors often throughout the research project to "discuss alternative approaches" (p. 67). The research advisors are experts in design and development research and general qualitative research methods and could identify flaws and provide feedback on how to fix them. Finally, the researcher elicited feedback from colleagues and peers in order to provide a "fresh perspective…that challenge assumptions made by the investigator, whose closeness to the project frequently inhibits his or her ability to view it with real detachment" (Shenton, 2004, p. 67).

Transferability. Transferability refers to the extent to which the findings of a study can be applied to other situations (Merriam, 1998). A thick description of the research methods, purposeful sampling, data collection, the multi-step lean coding scheme used for data analysis, and the results is used to help ensure transferability. A rich description of the case, including the context of the study, may help readers appraise the case and find similarities to their own situations, thereby enhancing transferability (Guba, 1981; Merriam, 1998; Shenton, 2004).

Dependability. Guba (1981) refers to dependability as being concerned with the "stability of the data" (p. 86). He suggests using overlapping methods in order to triangulate the data and provide stability. As mentioned previously, this study employed multiple methods of data collection (i.e., surveys, interviews, extant data) and multiple participants (experienced instructional designers and inexperienced designers-by-assignment) to compensate for the weaknesses of any one method or single individual.

Confirmability. Shenton (2004) and Guba (1981) described confirmability as the naturalist's form of researcher objectivity. Techniques used in this study to help ensure a feasible level of objectivity included data triangulation and a reflexive practice called bracketing. Bracketing is a qualitative research method used to mitigate potential biases arising from the closeness of the researcher to the phenomena or case being studied (Tufford & Newman, 2010). In this research, the researcher and one of the researcher's advisors were participants in the study. Both completed bracketing interviews conducted by an objective interviewer not involved with the study. The bracketing interviews were reflective in nature and revealed their assumptions, interests, values, impressions, understandings, and points of view regarding the case being studied.

Naturalistic inquiry (i.e., qualitative research) has its own set of criteria to help ensure reliability and validity (i.e., trustworthiness). As mentioned previously, the naturalistic criteria include credibility, transferability, dependability, and confirmability. This study employed multiple methods of data collection and analysis, multiple participants, and a reflexive strategy (bracketing interviews) to foster trustworthiness.


CHAPTER FOUR

RESULTS

This study examined the use of First Principles of Instruction and the design decisions made by instructional designers during an intensive instructional design project. The primary research question for this study was: How were the First Principles of Instruction used by instructional designers in a short-term, high volume, rapid production of online K-12 teacher professional development modules? The results of four supporting questions are addressed in this chapter: (1) What were the conditions under which the First Principles of Instruction were used? (2) What design decisions regarding the First Principles of Instruction were made during the project? (3) What was the level of understanding of the First Principles of Instruction by instructional designers? (4) How frequently did the modules incorporate the First Principles of Instruction?

Conditions Under Which First Principles Were Used

The first research question focused on the conditions under which the First Principles of Instruction were used. Analyses of interview and extant data suggested that the instructional design setting was the main topic, with sub-topics including project requirements, personnel, designer experience, physical setting, and training and meetings.

Instructional Design Setting

Project Requirements. The project requirements included a) converting existing face-to-face materials to an online environment, b) making the modules versatile so they could be adapted into existing professional development training programs, c) embedding the modules within an existing online portal and repository, and d) completing the work within 11 weeks.

The project requirements stemmed from the client’s request to convert existing

face-to-face teacher professional development training materials to an online format.

Determining other client requirements was difficult, according to the project lead, who interfaced with the client about these requirements. He indicated, "Part of the challenge

was trying to figure out what the client really wanted and narrowing that down. So, that

was actually a little bit tricky because they did not come out and say ‘this is what we

want’.” The project lead determined additional requirements along with the co-directors,

the lead instructional designer, and the science and math team leads. “We had to really

kind of think of what’s the best and we would propose it to [the client]. But honestly,

when we had our strategy, they [the client] didn’t have a qualm with it. So they were

happy.” These requirements included making the online modules versatile so they could

be incorporated into school districts’ existing professional development programs and

could be completed independently. “They needed to be embedded within [the online

portal]. We decided to make them into these modules that can be used independently or

as a set [of modules].” Housing the modules within [the online portal] was a requirement

determined by the co-directors of the project. The online portal was an existing repository

and course management system that contained all the standards and benchmarks for the

state’s K-12 school system as well as lesson plans, activities, and other resources for K-

12 teachers.

The most influential project requirement was to design and develop the modules within an 11-week timeframe. This requirement was determined because the $1.2 million grant funding the project would be discontinued after a certain date (which ended up being 11 weeks from the start date).

Personnel. The personnel consisted of project directors (not part of this study), two project leads, two math team leads, four science team leads, and 20 instructional designers and designers-by-assignment (participants in this study included eight of the instructional designers and designers-by-assignment). The project leads and team leads had multiple roles during the project (i.e., administrative tasks and instructional design tasks). Participants indicated that prior obligations, variability in schedules, overscheduling of part-time workers, and excessive working hours were significant conditions under which the First Principles of Instruction were used during the project.

One of the math team leads also had administrative responsibilities in the project.

He led the recruitment effort to hire enough instructional designers to complete the

project on time. He recruited instructional designers and designers-by-assignment


through his associations at the university where this study took place. In order to hire the number of people needed for the project, allowances had to be made in the instructional designers' schedules. All of the instructional designers hired had prior obligations, including second jobs, additional projects, family commitments, college classes, and prior travel arrangements. The project lead said,

We wanted to accommodate otherwise they would say no to the project.

And so, some of the students we only got for two weeks and somebody for

four, and some came in after four weeks, so it was too fluid.

Table 4.1 shows a sampling of a project management document that illustrates the variability of instructional designers' schedules. However, the table does not reflect that some instructional designers quit the project early, other unexpected changes in working hours, or the actual hours instructional designers worked; instructional designers generally worked more hours than illustrated here.

Table 4.1

Instructional Designers Working Hours

            W1     W2     W3    W4     W5     W6     W7    W8    W9    W10   W11
            7/18-  7/25-  8/1-  8/8-   8/15-  8/22-  8/29- 9/5-  9/12- 9/19- 9/26-
            7/23   7/30   8/6   8/13   8/20   8/27   9/3   9/10  9/17  9/24  9/30
Team Lead   20     20     10    5      5      0      0     0     0     0     0
Designer    40     40     40    0      40     40     20    20    20    20    20
Designer    20     20     20    Travel (10 or less)        10    10    10    10    10
Team Lead   20     20     30    30     30     30     30    20    20    20    20
Designer    15     15     15    40     40     40     20    20    20    20    20
Designer    0      0      40    40     40     0      0     0     0     0     0

The project lead and lead designer explained that they worked 10- to 14-hour days, six and sometimes seven days per week. On occasion they would work 18-hour days in order to maintain the momentum of the project. A team lead asserted that the leads were asking more of the instructional designers than they had time to complete within their designated working hours. He said, "We were requesting [instructional designers] to do things like they are working the whole time, but they were working 10 hours or 20 hours, but we were… expecting them to do things like working 40 hours every week."


Designer Experience. Participants reported previous instructional design roles and tasks (see Table 4.2). With the exception of two designers-by-assignment, each designer had some previous instructional design experience, ranging from providing support in creating instruction (e.g., multimedia development, proofreading, research, gathering content) to designing, developing, and evaluating instructional design materials and courses. One designer-by-assignment reported zero instructional design experience; however, he indicated that he had previously developed course materials. A large gap existed in years of experience. Three designers-by-assignment reported zero prior instructional design experience; six designers reported 11 months to two years of experience; five designers reported three to six years of experience; and two designers reported 13 to 20 years of experience. No instructional designers reported six to 12 years of experience, illustrating a large unfilled gap of proficient and expert instructional designers. In addition, two team leads indicated having only one year of instructional design experience. Most instructional designers, 10 out of 15, were pursuing higher degrees in measurement and statistics, learning and cognition, or instructional design.

Table 4.3 presents the means and standard deviations of years of experience based on the instructional designers' roles. Designers-by-assignment were not factored into the calculation for instructional designers so as not to skew the data with their zero years of experience.


Table 4.2

Instructional Designers Demographics

Gender  Highest    ID           Previous ID Roles/Tasks
        Degree     Experience
Male    Masters    1 year       • Assistant instructional designer
Female  Bachelors  2 years      • Designed and developed instructor-led training
                                • Designed multimedia supplements for two management courses
                                • Designed content and assessments; proofed and edited book
                                  chapters/papers
Female  Bachelors  0            • No prior experience
Male    Doctorate  5 years      • Worked on small-scale course projects
Male    Masters    11 months    • As part of an instructional design internship, helped design
                                  and create instructional modules on interviewing, networking,
                                  and finding jobs for business students
Male    Masters    1 year       • Gathered content from open sources and subject matter experts
                                • Designed course layout, including visual design, graphics,
                                  user interactivity, and audio narration
Male    Doctorate  6 years      • Designed online courses
                                • Created instructional material for online courses (audio,
                                  video, animation)
Male    Doctorate  3 years      • Prepared courses for learning management systems (instructor)
                                • Developed e-learning content for some courses (developer)
                                • Checked the instructional content created for e-learning
                                  environments (researcher)
Female  Bachelors  1 year       • Objective and assessment writing, content analysis, developing
                                  instructional modules in PowerPoint, writing instructional
                                  content
Male    Doctorate  20 years     • Project manager; lead instructional designer; senior
                                  instructional designer; courseware developer; instructional
                                  designer; evaluation specialist
Female  Masters    0            • No prior experience
Male    Masters    0            • Managed a national website of class materials for junior high
                                  school and high school teachers; ran regular meetings to update
                                  and develop those materials
Female  Masters    5 years      • Designer on "I am Learning Numbers" project for 6-year-old
                                  children
                                • Designer on "I am Learning Concepts on Probability" project
                                  for 10-year-old children
                                • Assisted the Instructional Design course for bachelor students
                                • Assisted the Project Development and Management I and II
                                  courses for bachelor students
Female  Masters    13 years     • Project manager of instructional design projects; multimedia
                                  developer; instructional design and distance learning
                                  consultant; designed storyboards, instruction, distance
                                  learning; faculty consultant
Female  Doctorate  3 years      • Designed and developed online training and instructor-led
                                  training


Table 4.3

Means and Standard Deviations of Years of Experience

                           N    M      SD     Range (years)
Project Leads              2    16.5   4.95   13 - 20
Team Leads                 6    3.5    2.17   1 - 6
Instructional Designers    4    1.73   0.98   11 mo. - 3
Designers-by-Assignment    3    0      0      0
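The table's figures are ordinary sample statistics. As a check, the two project leads' reported years of experience (13 and 20, the endpoints of the range in Table 4.3) reproduce M = 16.5 and SD = 4.95; the other groups' individual values are not reported, so only the project leads are shown here.

    import statistics

    # Years of experience reported by the two project leads (Table 4.3).
    project_leads = [13, 20]

    print(statistics.mean(project_leads))             # 16.5
    print(round(statistics.stdev(project_leads), 2))  # 4.95 (sample SD)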

Physical Setting. During this project, the instructional designers were physically housed in two offices: a main on-campus location and a secondary location at an off-campus research facility. The project lead and lead designer were located at different locations. Review of e-mail communications, analysis of project management documents, and researcher observations revealed that several instructional designers and all but one subject matter expert telecommuted for the duration of the project. Many of the instructional designers would often come into the on-campus location for staff meetings and training, while others would meet via web conferencing. While the telecommuting arrangements substantiate the assertion that flexibility and accommodation were necessary to hire and keep designers working on this project, this arrangement was also very challenging for some of the team leads and instructional designers.

Interviews indicated that team leads and instructional designers felt that having all

of the designers and subject matter experts in the same face-to-face location would have

resulted in a more efficient work environment. The project lead said, “There was a core

group [on-campus]. It was much easier for the lead designer and I to go through some

[things] face-to-face…decision-making is facilitated face-to-face. Overall, it was really

helpful to be face-to-face.” The lead designer said,

My office was over at the [research facility] where I could do both of my

jobs at the same time, but it really worked better if I was [on-campus]

where I had direct access to the project lead and the team.


Moreover, a team lead suggested that productivity could have been improved if other instructional designers had been working together face-to-face: "It ought to be like working together and brainstorm together. I think that because what needs to be done and deciding together, it will be better…instead of studying or working separately."

Training and Team Meetings. A project kick-off meeting was conducted once the project funding was awarded and after the majority of instructional designers were hired. During the three-hour kick-off meeting, instructional designers were given an overview of the project, timelines, responsibilities, and expectations. In addition, an overview and training were provided on the First Principles of Instruction, and designers were directed to use the First Principles of Instruction as a framework for the online modules. About a month before the project began, designers received an e-mail with journal articles about the First Principles of Instruction and a website that was used as a model for how the First Principles of Instruction could be applied (Mendenhall et al., 2006a, 2006b; Merrill, 2007b, 2009d). These articles and the website were sent again the day before the kick-off meeting. Eleven designers reported reading the article First Principles of Instruction (Merrill, 2009d); 13 read A Task-Centered Instructional Strategy (Merrill, 2007b); 12 read A Task-Centered Approach to Entrepreneurship (Mendenhall et al., 2006a); and 11 reviewed the Entrepreneurship website for at least 10 to 30 minutes. One designer did not read any of the articles or review the website. One designer-by-assignment reported a poor understanding of the content of each article and the website; the majority reported having a good or excellent understanding of the content of each article and the website (see Table 4.4).


Table 4.4

Training Materials Use and Perceived Level of Understanding Results

                    First Principles    Task-Centered        Task-Centered         Entrepreneurship
                    of Instruction(a)   Instructional        Approach to           Website(d)
                                        Strategy(b)          Entrepreneurship(c)
Read                Yes = 11            Yes = 13             Yes = 12              Yes = 11
                    No = 2              No = 2               No = 3                No = 3
                    I don't                                                        I don't
                    remember = 2                                                   remember = 1
Understanding       Excellent = 3       Excellent = 3        Excellent = 5         Excellent = 4
of content          Good = 5            Good = 8             Good = 6              Good = 4
                    Neutral = 2         Neutral = 2          Poor = 1              Neutral = 2
                    Fair = 1            Poor = 1                                   Fair = 1
                    Poor = 1                                                       Poor = 1

Note. (a) Merrill, 2009d; (b) Merrill, 2007b; (c) Mendenhall et al., 2006a; (d) Mendenhall et al., 2006b

Ten instructional designers from this study attended the kick-off meeting. Some thought the training was somewhat helpful, but the majority felt the training did not help their understanding of how to apply the First Principles. One designer said, "I felt coming away from those articles and from the training that…even though it was brief, we had a pretty good overview of Tell-Show-Ask-Do." Conversely, another designer asserted that she,

didn’t get a whole lot out of the training. I went home, sat down with the

articles – the ones we were provided and I hashed through it that way. I’m

not saying that the training that was provided wasn’t good but what I’m

saying is that there was so much going … I have a difficult time honing in

my attention… so for me it was not effective at all.

A designer-by-assignment suggested,

It would have been really a better idea to have received the Merrill articles

and possibly even more of the instructional design references so we could

have investigated before the kickoff party so that we could then discuss

them in advance…I’m pretty sure that we did receive the (Merrill’s

articles) in advance…but it was still pretty rough and to just jump into

Merrill kind of cold turkey was a bit hard.


A second designer-by-assignment believed that “the instructional designers can

understand the principles but if you taught it in detail to the instructional designers [it will

be] more helpful.”

A team lead recalled the initial training as being fun, but said the training was "spray and pray," a term used by Merrill (2009b) to describe lecture-based teaching: spray information at the students and pray they will remember what is said. He added, "I think what we failed to do was not getting them to practice." Other designers indicated that the training would have been more helpful if the project lead and lead designer had helped them apply the First Principles of Instruction. However, lack of time was cited as an inhibitor to conducting more thorough training. A team lead affirmed, "There's no way we had any time do that" (i.e., to practice applying the First Principles of Instruction during training and receive feedback from the leads).

After the initial training took place, the instructional designers were assigned to one of two teams: the math team or the science team. Each team conducted meetings one or more times per week. Instructional designers generally reflected positively on the individual team meetings because they were more intimate and provided a time to get specific questions answered. One team lead said, "I think the meetings were really helpful because we asked all questions that we were dealing with; these were the problems and we try to find solutions for them or try to answer them." Another team lead reflected, "There were times that we had just team leader meetings when we would go over the model (i.e. First Principles framework)." A designer-by-assignment affirmed, "I think that having everyone around you where you can just say 'Hey, does this look right?' definitely it was helpful more than stopping and sending it to someone [via e-mail]."

Three weekend working retreats took place, where as many instructional designers as could attend met all day for two days to work on designing the modules. These retreats allowed instructional designers constant access to the project lead, lead designer, and team leads, and allowed instructional designers to team up with one another in an effort to quickly and efficiently produce the modules. A designer-by-assignment affirmed the usefulness of the working retreats: "I definitely think the meetings that happened in physical space – when we went [to the off-campus research facility for the retreats]… I think those weekend meetings were really helpful."


Summary. The conditions of the instructional design project described by instructional designers included the physical environment, designers' experience levels, and training/coaching. The environment (i.e., designers not all together in one space, and SMEs virtual and not easily accessible) affected the use of the principles because experienced and novice designers were not together. Had the experienced designers been in close proximity to the novice designers, more coaching, mentoring, and immediate feedback could have taken place. It is likely that if experienced designers had had to coach novices individually, the amount of time the experienced designers had to design their own modules would have been reduced. Similarly, the working hours and schedules of the designers varied so much that group coaching may have been difficult to coordinate. Moreover, the trainings that took place were an attempt to provide the necessary coaching; however, as instructional designers indicated, the time available and the structure of the training affected its quality.

Decisions Regarding First Principles

The second research question addressed in this study was: What design decisions regarding the First Principles of Instruction were made during the project? Three primary topics were identified during the analyses of the case study data: (1) Decision-Making Power, (2) Types of Design Decisions, and (3) Factors Affecting Decisions. Several sub-topics were also identified, and the results are addressed below along with the main topics.

Decision Making Power

Most instructional designers indicated that their decision-making power was limited; however, the designers-by-assignment and one team lead felt they had sufficient decision-making power, though they indicated time and lack of subject matter knowledge as factors that limited their decision-making. Moreover, one team lead felt his decision-making power developed over time. The reasons given for limited decision-making power included a) the project environment was not set up for decision-making, b) the instructional strategy and framework were already chosen, c) the content in existing materials was previously determined, d) lack of knowledge and experience with the First Principles and lack of subject matter knowledge, and e) limited time.


Project Environment. A team lead claimed that the environment was not set up

for decision-making. “There [was] no environment to decide something because the

things [that] need to be done were already applied on the [existing materials] and there

was a framework.” Further, he affirmed that the leaders made the higher-level decisions,

which left instructional designers with only minor or lower-level instructional design

decisions to make.

Framework and Instructional Design Strategy. An instructional designer stated

that she felt she had to “stick with the format that many people had agreed on and also

they wanted to follow the First Principles as much as possible… so, we did not have a lot

of freedom to explore our design as we wanted.” Similarly, another designer stated that

they did not have control over the instructional design framework and strategy because

the leaders had already made those decisions. A team lead asserted that his decision-making was limited because the framework and materials were provided for him. He added that in the beginning of the project, the team he led was not making instructional design decisions because they did not have enough knowledge to decide, but over time "they got enough knowledge to decide; to make easy decisions."

Existing Materials and Content. An instructional designer affirmed that

designers were “limited by the content because we were using what has already been

created; we were just modifying previous content.”

Lack of Knowledge and Experience. An instructional designer stated that she

did not have the ability to make some instructional design decisions because she had not

had a lot of exposure to the First Principles of Instruction prior to the project. She said,

“My biggest challenge was figuring out the Tell-Show-Ask-Do framework and how it

related to the First Principles of Instruction." Likewise, a second team lead felt that during the project he did not have instructional design decision-making power, mainly because he felt confused: "I did not grasp what we were trying to do, I kind of understood half way." In the beginning, he delegated the decision-making to the team members he

led. He confessed, “Most of the decisions, I did not make those decisions in the

beginning. Most of the decisions were made by… the instructional designers (on his

team).” After the project started to move forward, this team lead indicated that he did

make several decisions. Analysis of extant data revealed these decisions were more


managerial or strategic (i.e., when and where meetings were held, determining who would work on which modules) and less oriented toward instructional design.

A designer-by-assignment felt he “had enough decision making” power but he

said, “in my case that is very hard. I felt I had enough decision-making but… that could

be a problem at the same time, regarding content, because we are not [subject matter]

experts.” He indicated he was uncomfortable making certain decisions because of his

lack of subject matter knowledge.

Time. A team lead said, “We already have this power (i.e. decision-making

power) for the project. But we have limited time.”

Types of Design Decisions

While most instructional designers indicated being limited in their decision-making power, interviews, module evaluations, and extant data revealed that, in fact, many design decisions were made. Results indicated that project directors and project leads made a few strategic and program-planning decisions, while instructional designers made numerous general instructional design decisions and design decisions regarding the application of the First Principles of Instruction.

Strategic/Program-planning. Some decisions that emerged during this study were not specifically related to the application of the First Principles of Instruction but were related to instructional design tasks. These decisions, made by the project directors, project lead, and lead designer, were more strategic in nature and indirectly affected the use of the First Principles of Instruction and the conditions under which the First Principles were applied. Strategic/program-planning decisions are decisions that affect how the entire project or program functions. These decisions included hiring instructional designers, determining how learners would be assessed after completing the modules, and simplifying the First Principles of Instruction by creating a storyboard template that uses a Tell-Show-Ask-Do framework.

The strategic/program-planning decisions included the recruitment and hiring of 28 instructional designers and designers-by-assignment. The decision to include designers-by-assignment and inexperienced instructional designers was due to 1) the need for enough personnel to complete the project on time, 2) the unavailability of experienced instructional designers during the summer, and 3) bureaucratic procedures that delayed the start of the project, so that the full-time instructional design contractors could not continue to wait for processing to be completed and had to accept other work. The lead designer

was not in favor of hiring a large number of part-time instructional designers and

designers-by-assignment. She reflected on a conversation with the project lead about

hiring many part-time instructional designers.

I approached the project lead about this other type of organizational

hierarchy… about having fewer people but having them full-time… he

wasn’t opposed to the idea but I think he knew more than I did at that

time, that these contractors… couldn’t come on board full-time. We had to

change our plan and try to get as many (instructional design) students to

make up a 40-hour work week.

Determining the assessment was a strategic decision that needed to be made up front, before the design of the modules. According to the project lead, determining the

assessment was “really tricky because we were really getting strong…internal push from

other project team members (i.e. co-director and internal consultants). They wanted to

test on domain knowledge… The purpose was not to teach them the content (they already

have domain knowledge)…we were trying to convince our peers and our partners here

and trying to say ‘but your assessment doesn’t align with your objective.’” The decision

was to assess teachers on the new science and math standards and benchmarks and their

use of the instructional strategy as the learner described it in a lesson plan.

The third strategic decision, which could also be considered an instructional design decision, was to create a storyboard template using a simplified version of the First Principles of Instruction. The simplified version (i.e., the Tell-Show-Ask-Do framework) is not arbitrary but is referenced in much of Merrill's work (see Merrill, 2002, 2007b, 2009d). While determining the instructional strategy and framework is an instructional design decision, this was an important strategic decision because there was a need to quickly familiarize inexperienced instructional designers and designers-by-assignment and to provide them a guideline to use as they designed the instruction. One caveat worth noting is that the storyboard template (see Appendix J) was created a couple of weeks after the start of the project and was not available during the initial kick-off and training meeting.

start of the project and was not available during the initial kick-off and training meeting.

Instructional Design Decisions


General Decisions. The general instructional design decisions made by designers included the selection of media (pictures, videos, and illustrations) and determining which content to include and exclude. A designer-by-assignment said, "We had all decision-making power in the world about graphics and examples to include." During a team meeting, one team leader told his team they had the choice of which pictures and media to use in their modules. He emphasized that the designers would also decide which textual key points to put on the screen. Conversely, he told the members not to worry about the placement of pictures because an instructional designer would be designated to work on the layout of the module screens. Another team lead explained, "We can easily decide [which content to select] based on our experience or based on the [First] Principles or some suggestions from the content expert." His technique was to decide which content to include first and then have the content expert review it for appropriateness and accuracy.

Likewise, the lead designer stated that she was “extremely comfortable with researching

and choosing new content.” An instructional designer pointed out that she also researched

for additional content to supplement the existing materials:

So, when we had the opportunity to create like something about the

inquiry [strategy] and build something around that for [the learners]… I

could go to [the online portal] and find lessons they would actually draw

upon and create. That’s when I thought I was being most effective. That’s

where as an instructional designer, I have an option to say ‘Okay, I would

like the teacher to look at this scenario and how he/she can use inquiry in

order to meet this standard.

Application Decisions. Instructional designers were provided a storyboard template (see Appendix J) that guided them in the application of the First Principles of Instruction (the Tell-Show-Ask-Do framework). Instructional designers did not believe the modules fully incorporated the First Principles of Instruction. One instructional designer asserted,

I don’t think we actually try really hard to follow [the First Principles]. At

the end we don’t stick to the model really well. And from my

understanding it does not have to follow the Tell-Show-Ask-Do. We can


switch this around at some point, but then we kind of follow that up at the

end and we didn’t really follow that really well.

The lead designer agreed that the modules did not follow the First Principles of

Instruction as she had envisioned. She reflected on when she first received a module to

review, “I received some of the modules just thinking ‘oh my gosh, what did we do

wrong’, like in training the instructional designers…why is this so off? I think a lot of the

instruction was just Tell, Tell, and Ask.” However, the project leaders and instructional

designers felt that they did the best they could, given the constraints of the project.

Moreover, instructional designers felt the modules were a great improvement compared

to the existing materials.

Activation/Tell. The screens at the beginning of the modules were standardized with the goals of the modules and the science or math standards/benchmarks addressed in the modules, followed by some type of background knowledge slides. For example, in the elementary science modules the general background information consisted of the cognitive development of children at the different grade levels. In the math modules, there was very little, if any, general background information beyond the goals and the math standards and benchmarks addressed in the modules.

Instructional designers indicated that the Activation/Tell principle was very easy to apply in the modules, as most of the content from the original materials was general information, or Tell, only. A team lead said, "The first two steps are easily adaptable…the beginning part (Tell-Show) but the last two parts are not easy." Another team lead and an instructional designer agreed that "there were no difficulties in the Tell part" and that "the easiest part [to apply] would be the Tell part to instruct [the learner]." While the instructional designers agreed that the Activation/Tell principle was easy to apply, some felt that Telling was not conducive to good instruction, even though it was necessary to provide general information. An instructional designer stated,

The word Tell sounds kind of like an information dump to me… I think that's a little boring for a learner. But at the same time, sometimes there really is no better way to disseminate information and put some things such as a benchmark. I can't think about a more creative way than telling them the benchmark, if that's what they need to know.


The project lead acknowledged that he struggled a little with the Activation/Tell principle because he did not know the audience very well. He said, "I'm not one of them. So, this notion of giving them a couple of slides of content, I wondered if that was really doing it…The activation of the strategy I got, the activation of the content I wasn't sure."

A designer-by-assignment indicated that, for the math modules they generally

provided definitions for the Activation/Tell principle. A team lead said that when

developing a math module his team would activate prior knowledge by “questioning or

asking them to reflect.” A science team lead and his team would “provide the information

to the learner first… and you have to explain what’s the core of the subject to the

learners.” Moreover, a second science team lead took a similar approach when applying

the Activation/Tell principle. He stated, “At the beginning we give some information or

we give some task to the students based on [the existing materials]”.

Demonstration/Show. Instructional designers tried to apply the Demonstration/Show principle, but every lead and designer indicated time as a major factor in how they chose to demonstrate concepts. Team leads and instructional designers believed this principle was easy to apply; however, with the time constraint they felt restricted in their efforts to provide quality demonstrations. At the beginning of the project, the project directors strongly suggested strictly limiting the number of videos created for demonstrations because of the time and resources it would take to create a quality video. Many instructional designers agreed that if there had been more time they would have added more demonstrations. The lead designer reflected, "If I had to make a decision based on time, I would always try to put in demonstrations, you know, they really need to demonstrate and show these concepts." An instructional designer said she would also "add more videos and…create ways to demonstrate." She continued by saying that the Demonstration/Show principle "was a little more difficult" because the modules couldn't "show" how a teacher uses the inquiry strategy. There were no videos of teachers actually demonstrating the instructional strategies for the learners; instead, the demonstrations were written descriptively, with some specific information used as a demonstration. For example, the backward design and standards-based instruction modules used a specific science standard and broke the strategy down step-by-step.


For some modules, on the other hand, it was easier to incorporate video. The instructional designers working on the physics and chemistry modules were able to incorporate video easily because the conditions were favorable; they were able to find a laboratory in close proximity that had the appropriate materials on hand and lab assistants who were willing and able to meet on short notice to videotape the demonstrations. An instructional designer working on a chemistry module said,

My module actually uses the video…. of course I kind of look at the content and I'm thinking well this is an experiment that they want to do in their physics class and if they want the very similar amount of quality, we need a video.

In other science modules, a team lead said, they decided to demonstrate using specific examples. For example, in a biology module an instructional designer and the lead designer chose to use real-world examples to demonstrate the steps of the inquiry instructional strategy, using pictures to help portray the real-world examples.

Application/Ask. For all of the modules there was a uniform screen asking the learners to review the standards/benchmarks from the modules and to reflect on the following questions:

• How would you implement these ideas into your classroom?
• What challenges do you anticipate encountering?
• How will you handle each of these challenges when they arise?
• Are there activities you currently use in your classroom that support teaching and learning of the benchmarks?
• How will you incorporate the [instructional strategy] in your teaching?

The standardization of the Application/Ask principle was to help “resolve the practice

component which wasn’t part of the module” according to the project lead. The project

lead, lead designer, and an instructional designer all mentioned the desire to have the

application embedded within the [online portal] in order to assess the learner

appropriately and provide feedback. Instructional designers indicated that there was a

need for more practice within the modules. One instructional designer reflected that in

one of the science modules she was working on there was an application activity she

69

wanted to incorporate but “due to the framework of the design [the activity] didn’t fit,

and we were running out of time, so I changed it to a guided activity.”

Some designers incorporated practice activities within the modules. For example, in a couple of science modules the instructional designers included screens asking questions or asking the learner to practice writing observations and making inferences. The practice activities were not assessed, and the learners' answers were not recorded; the instructional designers provided feedback in the form of possible correct answers on the subsequent screens.

Instructional designers and team leads contended that the Application/Ask principle required more instructional design expertise to apply appropriately. A team lead said that the "first two steps (Tell-Show) are easily adaptable but third and fourth (Ask-Do) are not easily understandable and…I think [require] some experience to adapt or to apply." A designer-by-assignment, who had a degree in math and measurement and statistics, felt the Application/Ask principle was especially difficult to apply in the math modules. She said, "application is also hard for math. So, maybe it can be improved, application parts can be improved and how can we apply this to the real-world, because we don't use functions in the real world, not [these] kind of functions."

Integration/Do. For all of the modules there was one screen with an integration activity. The Integration/Do activity included two parts. First, it asked the learner to take a posttest. The posttest was not designed or developed as part of this project; assessment experts hired by the client created it. Because the posttest was not part of this study, the researcher is unaware of the specific assessment items on it. During initial meetings, the project lead and lead designer tried to convince other project directors and consultants, who are not part of this study, to create an assessment that provided the learners with a real-world task so they could apply their new knowledge. The project directors and consultants felt the learners should be tested on the subject matter domain (i.e., science and math concepts) and not on the objective of the modules; the outcome of this discussion is unknown to the researcher. The second part of the Integration/Do activity asked the learners to apply their new knowledge by creating a lesson plan. They were asked to use a lesson planning tool embedded within the online portal to create and


submit a lesson plan for the science or math standard/benchmark and to plan the lesson

using the instructional strategy they learned in the modules.

The project lead decided to standardize the Integration/Do screen in the modules. The modules were for independent study and were designed to be integrated into an existing professional development program, leaving school principals and school district administrators the option to assess the learners' lesson plans based on their own guidelines. Even though the instructional designers did not make any decisions regarding the Integration/Do principle, some recognized the difficulty in applying this principle. A designer-by-assignment stated,

I think we had difficulty, most difficulty on deciding what to do in the Do part, because if we, if as an instructor you gave some assignment…you need to give feedback to them. So, I think the most difficult decision was that part.

A team lead said there were "difficulties in the third (Ask) and fourth (Do) steps especially the Do part they had some difficulties how to apply the do part, how to prepare the do steps while designing."

Factors Affecting Decisions

There were several factors indicated by instructional designers that affected their decisions regarding the First Principles of Instruction. These factors included:

• Time

• Knowledge/Experience Level

• Existing Materials

• Online Environment

Time. Time was the primary factor affecting how the First Principles were applied in the modules. The majority of participants considered time when making decisions regarding the application of the First Principles. Due to insufficient time to complete all of the modules, the scope was reduced; the leads, with client approval, decided not to convert the elementary math modules. The project lead said, "Scope and time, that was always in the back of my mind and there were some times when the time issue helped us make a scope decision. As a matter of fact, the scope was always – it wasn't difficult but it was just amount of time that we needed, calendar time that we needed to have. So, the time factor played on with the scope." The lead designer


concurred by saying, “Time obviously was a major factor in every decision that we made

regarding what to put in, what to keep out…”

Time was the factor most frequently mentioned by instructional designers as influencing their decision-making regarding the application of the First Principles of Instruction. The 15 instructional designers in this study referenced time as a constraint 128 times during interviews. These 128 references were in addition to the myriad e-mails, recorded meetings, and personal conversations that also referenced time as a major constraint. Instructional designers felt that creativity was inhibited by the time constraint. One team lead stated, "We could be more creative if we and they have had more time."

Other instructional designers specifically stated that the

Demonstration/Show principle was affected most by the time constraint.

Here, too, inhibited creativity was a side effect of not having enough time. An instructional designer reflected,

For physics and chemistry, that was very hard to do the show part because

we were just basically writing down an activity that they should have done

in person. And I think if we had a little more time to kind of be creative

and coming up with more appropriate activity for the internet that would

have been better.

A designer-by-assignment affirmed, “If we had more time and more

instructional designers we could be able to create more, better examples.” The

lead designer indicated if she had to make a decision based on time, she “would

always try to put in demonstrations [because] they really need to demonstrate or

show these concepts.” She continued, “Time was a major factor [in deciding] how

many demonstrations, what type of demonstrations because we really wanted

more video demonstrations [showing] teachers using these instructional strategies.

But we just didn’t have time.”

Another instructional designer said,

I felt like if we’d had more time or maybe more resources I think using

videos to actually show. And then, accompanying that with narration or a

breakout of bullet points, explaining, highlighting maybe certain points of


your demonstration. I think we ended up doing a lot of text on the screen

being narrated, which wasn’t maybe the most exciting or effective way.

Instructional designers also believed that the decisions regarding the

Application/Ask and Integration/Do principles were affected by time. A team lead

stressed that, “The Ask and Do phases take more time and preparation.” An instructional

designer stated, “I know we could have done the [Application/Ask] part better had we had

more time.” A second team lead said, “I believe we struggle with the Ask part when we

have questions for [the learner].” A third team lead explained, “In some parts we would

keep the same…in some parts if we had more time we would add a bit more detailed

images or concept maps, more drawings, in Tell and Show parts and especially in Do

part…so maybe we ignored the Do part in this project.”

Knowledge/Experience Level. Instructional designers made decisions regarding

the use of the First Principles based on their knowledge and prior experiences. Even if an instructional designer had some familiarity with the First Principles (i.e. had taken a class with Merrill or read/studied articles), they would often use design practices that they were comfortable with or had previous experience using. Many instructional designers on this

project were still in school studying instructional design or were recent graduates with

little real-world design experience. The lead designer said, for “the novice designers [this

was their] first instructional design project outside of school and they were familiar with

Gagné. They were familiar with process models like the ADDIE model or Dick &

Carey…they would kind of try to make decisions based on their knowledge of those

things versus their knowledge and understanding of First Principles.”

A team lead said he had known the First Principles of Instruction for a long time (he had taken a class from Dr. Merrill, the author of First Principles, during his

undergraduate years). He said, “I know what they mean but now applying to the real

projects, it was hard. I think that the ultimate issue would be that was my first time…

designing instruction.” Regarding her studies in instructional design, one designer said, “I

know I had never heard of First Principles before this. So, it was very interesting that we

had to assimilate this new information prior to and during the construction of these

modules.”


Instructional designers, both those with years of experience and those without, indicated that the First Principles were easy to understand and practical; however, they were difficult to apply during this instructional design project. One instructional designer asserted, "I think the principle(s) [are] very, very effective for this

kind of project but I suggest the instructional designers as well as subject matter expert

have to learn [the First Principles]…and have to learn how to apply the principle(s) for

reality.” A team lead confirmed that the First Principles of Instruction are “not a complex

model” and it wasn’t the

difficultness of the First Principles, it’s the hardness or the applying an

instructional design model…into a real world [project]. That’s the biggest

problem I think we have ever had. Many of the instructional designers in our team

were really good experienced people but in the classes not in the real-world.

Another team lead acknowledged that he knew what the First Principles of Instruction were; however, he "did not really fully comprehend how to apply these into instruction."

Instructional designers also indicated that after some practice their level of understanding improved and the decisions regarding the use of the First Principles became easier. One team lead reflected that after time things "got a little easier. We could

determine how much is too much information…do we need a picture?” Another team

lead concurred, “After sometime…[instructional designers] understood.”

Existing Materials. Instructional designers indicated that the existing materials

and the content of a project did affect the decisions regarding the use of First Principles.

The lead designer stated,

One of the challenges was just with the original materials themselves… all they had were

discussion questions for in-class face-to-face discussions with the teachers. [We] had to

fill in those gaps in order to put it online because that stuff was just not there in the

original materials. I think that was a challenge for [instructional designers] that would

affect the way they used First Principles because the demonstrations weren’t there in the

original materials.

A team lead declared,

Actually the First Principles of Instruction is really clear. So you know what to do

exactly but the hardest part as I said its particular subject matters we had to deal

with. So, for example, chemistry… we couldn’t decide on the specific and


particular examples, which can be provided to the learners to teach the whole

topic.

Another team lead reflected that if he could “develop the content with the subject

matter experts, we create much more suitable material for the First Principles. I think

because we were limited by the content also with the subject matter experts.” An

instructional designer claimed that the type of math content she was working on (i.e. functions, Euclidean constructions, the Euler segment, polynomials) was difficult to center on real-world problems.

Online Environment. The online environment affected instructional designers’

decision making and their use of the First Principles. Specifically, the Application/Ask

and Integration/Do principles were difficult to apply because the online portal where the

modules were housed was limited in its ability to provide feedback and score the

application activities. An instructional designer said, “I would say the online module, the

format of online learning itself is also one of the challenges, because like I say, the do

part and the ask part are pretty challenging.” The lead designer stated,

We wanted the modules to interact with [the online portal] more, so that when

[the learners] were in the modules they could go along and do their lesson plans

or we would have activities integrated – like more Ask parts…but we found out

that wasn’t possible…I’m not sure why, either time or the [online portal] wasn’t

set up to be able to store the information.

An instructional designer said, “I think [First Principles] was a really good

framework. Maybe with the exception of the Do, because it’s hard to take an online

module and ask teachers to demonstrate…The Do was just, it was kind of left up to [the

learner]…ideally I think the Do would be excellent for face-to-face and a little bit harder

to do online.” A team lead reflected,

Sometimes we couldn’t clearly extract the pure knowledge part or we couldn’t

understand the application they provided in the paper-based (face-to-face)

modules. And, of course, some applications were designed for the face-to-face

sessions. So, we have to find an appropriate application for the electronic version

of the modules, which was difficult for us.


Summary. Initially the instructional designers indicated a lack of decision-making power; however, as indicated above, the designers made a considerable number of instructional design decisions. Most instructional designers are familiar with design processes (i.e. process models like ADDIE) that include analysis and making high-level strategic decisions before the project begins. Novice designers who have had ISD training would recognize that not making the initial decision of which instructional model to use would be limiting to them. The designers-by-assignment, on the other hand, were not familiar with the design process and didn't recognize that not making the initial decision to use First Principles could be deemed limiting. As indicated by the designers' recollections of the types of general decisions they made and how they applied the First Principles, they actually did have quite a bit of decision-making power; however, time constraints also contributed to the feeling that they didn't have enough decision-making power. The designers indicated other barriers, including their own lack of experience or knowledge about the subject matter and the First Principles, that affected their use of the First Principles. Similarly, the designers faced many challenges simultaneously: they had to figure out how to use the existing materials, which required additional research and development of the content, while concurrently trying to understand and apply the First Principles and figure out how to put it all online.

Level of Understanding First Principles

The third research question addressed in this study was – What was the level of

understanding of the First Principles of Instruction by instructional designers? A survey

given to participants tested their knowledge, comprehension, and application of the First

Principles. Three scorers (n = 3) graded the surveys using a scoring rubric developed by the researcher (see Appendix I). A two-way mixed-effects model was used to calculate inter-rater reliability. The intraclass correlation for average measures indicated very high agreement among the three raters (r = .926, lower 90% confidence limit = .905, 95% confidence interval = .943).
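To illustrate the reliability computation reported above, the following is a minimal sketch in Python using the pingouin package, whose ICC3k estimate corresponds to a two-way mixed-effects, average-measures model. The column names and the randomly generated scores are hypothetical placeholders, not the study's data.

    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(0)
    # Hypothetical rubric totals: 15 surveys, each graded by the same 3 raters.
    scores = pd.DataFrame({
        "survey": np.repeat(np.arange(15), 3),
        "rater": np.tile(["r1", "r2", "r3"], 15),
        "score": rng.integers(0, 27, size=45),  # placeholder totals (max = 26)
    })

    icc = pg.intraclass_corr(data=scores, targets="survey",
                             raters="rater", ratings="score")
    # ICC3k is the two-way mixed-effects, average-measures estimate.
    print(icc.loc[icc["Type"] == "ICC3k", ["ICC", "CI95%"]])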

There were five knowledge-level questions, each scored one point for a correct response, for a maximum score of five points. There were six comprehension-level questions; participants could receive a maximum of three points per question, for a maximum score of 18. At the application level there was only one question, with a maximum score of three points.
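For clarity, the scoring scheme just described can be expressed as a short sketch; the function and the sample responses below are illustrative, not part of the study's instrument.

    def score_survey(knowledge_correct, comprehension_points, application_points):
        """knowledge_correct: five booleans (1 point each, max 5);
        comprehension_points: six values from 0-3 (max 18);
        application_points: one value from 0-3."""
        knowledge = sum(knowledge_correct)         # max 5
        comprehension = sum(comprehension_points)  # max 18
        return knowledge, comprehension, application_points

    # Example: 3 correct recalls, 13 comprehension points, 2 application points.
    print(score_survey([True, True, False, True, False], [3, 2, 2, 3, 1, 2], 2))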

Table 4.5

First Principles of Instruction Knowledge Survey Scoresa

                      M       SD     Range
Knowledge (5)         3.07    2.19   0 - 5
Comprehension (18)    12.31   2.48   6.67 - 15
Application (3)       2.18    .56    .67 - 3

Note. a The maximum possible score for knowledge = 5, comprehension = 18, application = 3. n = 15.


Table 4.6

First Principles of Instruction Knowledge Survey Scoresa by Role

                           n    Knowledge (5)    Comprehension (18)    Application (3)
                                M (SD)           M (SD)                M (SD)
Project Leads              2    4.67 (.47)       13.83 (.24)           1.5 (1.17)
Team Leads                 6    3 (2.45)         12.05 (2.38)          2.28 (.44)
Instructional Designers    4    1.67 (2.25)      13.58 (1.26)          2.50 (.34)
Designers-by-Assignment    3    3.01 (2.19)      10.11 (3.69)          2.0 (.33)

Note. a The maximum possible score for knowledge = 5, comprehension = 18, application = 3.

Summary. The participants' levels of understanding of the First Principles varied. Some designers with less experience had more knowledge about the First Principles because they had taken a class on First Principles or had studied them on their own.

Frequency of First Principles Incorporated in Modules

The final research question was – How frequently do the modules incorporate the First Principles of Instruction? Nine modules were evaluated for how often they incorporated the (1) Activation/Tell, (2) Demonstration/Show, (3) Application/Ask, and (4) Integration/Do principles. Each module screen was evaluated for the presence of the First Principles. The number of instances of each principle was summed to give a frequency score, and a percentage was calculated to compare the frequency of each principle within a module (see Table 4.7). Some screens had more than one principle (e.g. a screen could have an instance of Activation/Tell and an instance of Demonstration/Show). Each module had a variable number of screens that were evaluated; these are indicated in the parentheses in Table 4.7. As part of the module template, there was one standardized Ask screen and one standardized Do screen.
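As a concrete illustration of this counting procedure, the sketch below tallies principle instances per screen and converts them to percentages of screens; the screen data shown are invented for illustration.

    from collections import Counter

    PRINCIPLES = ("Tell", "Show", "Ask", "Do")

    def module_frequencies(screens):
        """screens: one set per module screen naming the principles present."""
        counts = Counter(p for screen in screens for p in screen)
        n = len(screens)
        # A single screen can contribute to several principles at once.
        return {p: (counts[p], 100.0 * counts[p] / n) for p in PRINCIPLES}

    # Example: a 4-screen module with one standardized Ask and one Do screen.
    screens = [{"Tell"}, {"Tell", "Show"}, {"Ask"}, {"Do"}]
    for p, (count, pct) in module_frequencies(screens).items():
        print(f"{p}: {count} instances ({pct:.1f}% of screens)")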

The Activation/Tell principle had the majority of instances in each of the modules. High School Earth and Space Science, Physics, and Algebra had Activation/Tell instances on 81% or more of screens and Demonstration/Show instances on only 3.4% to 19% of screens. The Application/Ask principle had the second most instances and the Demonstration/Show principle had the second fewest, while the Integration/Do principle had the fewest total instances in the modules. There was at least one instance of Integration/Do in every module; however, four science modules did not have any instances of the Demonstration/Show principle.

Table 4.7

Module Evaluation Frequency Counts

                                                        Tell            Show              Ask             Do
                                                        (Activation)    (Demonstration)   (Application)   (Integration)
Module (screens evaluated)a                             n      %        n      %          n      %        n      %

K-2nd Grade Science (17)
  Living Organisms w/Backwards Design and
  Standards-based Instruction                           12     71%      7      41%        1b     5.9%     1c     5.9%

3rd-5th Grade Science (43)
  Light w/Ask Questions, Graphic Organizers,
  Demonstrations                                        30     70%      11     26%        12     28%      1      2.3%

6th-8th Grade Science (28)
  Observations and Inferences w/Explicit
  Reflective Approach                                   16     57%      5      18%        11     39%      1      3.8%

H.S. Biology (21)
  Interdependence w/Inquiry Strategy                    11     52%      9      43%        1      4.8%     1      4.8%

H.S. Earth & Space Science (29)
  Earth Systems and Patterns w/Concept Mapping          25     86%      1      3.4%       2      6.9%     1      3.4%

H.S. Chemistry (23)
  Intermolecular Bonding w/Inquiry Strategy             15     65%      11     48%        1      4.3%     1      4.3%

H.S. Physics (16)
  Gravitational Force w/Concept Mapping                 13     81%      3      19%        1      6%       1      6%

H.S. Algebra (25)
  Quadratic Equations w/Explanation and
  Justifications                                        21     84%      2      8%         3      12%      1      4%

H.S. Geometry (21)
  Quadrilaterals w/Developing Quality Definitions,
  Analyzing Geometric Properties, Using
  Manipulative Materials                                12     57%      9      43%        2      9.5%     1      4.8%

Note. a Each module had a different number of screens evaluated, indicated in parentheses; n = number of screens with the principle, % = percent of screens. b, c Each module had one standardized Ask screen and one standardized Do screen.


Table 4.8

Percentage and Instances Ranges of the Use of First Principles

                      Percentage Range    Instances Range
Activation/Tell       52% - 86%           12 - 30
Demonstration/Show    3.4% - 48%          1 - 11
Application/Ask       4.3% - 39%          1 - 12
Integration/Do        2.3% - 6%           Only 1 for each module

The Activation/Tell instances were the most frequent, appearing on 52% to 86% of each module's screens. The Demonstration/Show instances had a larger range of usage, from a very low 3.4% to a moderate 48%. The Application/Ask instances also had a low to moderate range of 4.3% to 39%. Since there was only one instance of the Integration/Do principle per module, its range was low for all modules.
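The ranges in Table 4.8 follow directly from the per-module percentages in Table 4.7; a minimal sketch of that aggregation (values transcribed from Table 4.7) is shown below.

    # Per-module "percent of screens" values, transcribed from Table 4.7.
    percentages = {
        "Activation/Tell":    [71, 70, 57, 52, 86, 65, 81, 84, 57],
        "Demonstration/Show": [41, 26, 18, 43, 3.4, 48, 19, 8, 43],
        "Application/Ask":    [5.9, 28, 39, 4.8, 6.9, 4.3, 6, 12, 9.5],
        "Integration/Do":     [5.9, 2.3, 3.8, 4.8, 3.4, 4.3, 6, 4, 4.8],
    }
    for principle, values in percentages.items():
        print(f"{principle}: {min(values)}% - {max(values)}%")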

Summary

In summary, participants described the conditions under which the First Principles were used. The main condition that may have had an impact on the use of the First Principles was the instructional design setting; specifically, the project requirements, personnel, designer experience, physical setting, and training and meetings were significant conditions under which the First Principles were used.

Participants also indicated the decisions they made, or felt they couldn’t make,

during the instructional design project. First, there were contradictory perceptions of decision-making power. Most designers felt limited in their decision-making power, but a

few others felt they had enough decision-making power. Second, the types of decisions

made by project directors and project leads were generally made up-front and were

strategic or program-planning decisions.

Third, other decisions made by instructional designers included general design decisions, like media selection and placement, and decisions regarding the application of the First Principles. The survey on the level of understanding of the First Principles of Instruction revealed that project leads scored highest at the knowledge and comprehension levels but lowest at the application level, whereas the instructional designers (non-team leads) scored highest at the application level but lowest at the knowledge level.

Fourth, the modules were evaluated for the frequency of each First Principle. The High School Biology, Chemistry, Geometry, and 6-8 Science modules all had a more proportional ratio of Activation/Tell instances to Demonstration/Show instances (i.e. there were relatively more demonstrations of the information being taught). In the next chapter each of these findings is discussed.


CHAPTER FIVE

DISCUSSION

The purpose of this study was to examine the use of the First Principles of

Instruction (Merrill, 2002a) and the decisions made by instructional designers —

including project leads, team leads, and designers-by-assignment. The investigation of

the use of the First Principles was part of an effort to determine if these principles were

conducive to being implemented during a fast-paced project that required the design and

development of a large number of online modules.

The overarching research question for this study was: How were the First

Principles of Instruction used by instructional designers, in a short-term, high volume,

rapid production of online K-12 teacher professional development modules? Four

supporting questions were also addressed: 1) What were the conditions under which the

First Principles of Instruction were used? 2) What design decisions were made during the

project? 3) What is the level of understanding of the First Principles by instructional

designers? 4) How frequently do the modules incorporate the First Principles of

Instruction?

This case study involved 15 participants who were all instructional designers and

designers-by-assignment who worked on 49 science and math professional development

modules for K-12 teachers within a short 11-week time period. Participant interviews; extant data (project management documents, e-mail communications, personal observations, and recordings of meetings); participant surveys; and the evaluation of nine online modules constituted the data for this design and development research study.

General Research Question

The main research question was - How were the First Principles of Instruction

used by instructional designers, in a short-term, high volume, rapid production of online

K-12 teacher professional development modules? In addition, the researcher questioned


whether these principles were conducive to being implemented within this type of

environment. The results indicated the First Principles of Instruction were not used at the

level expected by the lead designer (who served as the researcher) and may not be

conducive to being applied as described by Merrill (2002a, 2007a, 2009a, 2009b) in this

specific case. The researcher expected there would be a more proportional ratio between

the Activation/Tell principle and the Demonstration/Show principle. For example, if there

was an instance of Activation/Tell then there should be an instance of the

Demonstration/Show principle immediately following. There is no hard and fast rule

regarding the frequency of instances of the First Principles within a module. However,

Merrill (2007b) provided an example sequence (see Table 5.1) that supports the idea that

if there is an instance of Activation/Tell then there should be an instance of

Demonstration/Show accompanying it. In this example, Merrill uses the term Do

interchangeably with Ask.

Table 5.1

Possible Strategy Sequence for Teaching Components (Merrill, 2007b, p. 17)

           Task 1       Task 2       Task 3
Topic 1    Tell/Show    Do           Do
Topic 2    Tell/Show    Show         Do
Topic 3    Tell/Show    Tell/Show    Show

The frequency of use of the First Principles in the modules revealed an overuse of the Activation/Tell principle relative to the number of Demonstration/Show and Application/Ask instances; that is, there were many more instances of the Activation/Tell principle than of the Demonstration/Show principle. Six of the nine modules had Activation/Tell instances on 65% or more of screens, and five of the modules had Demonstration/Show instances ranging from 3.4% (only 1 instance out of 29 components) to 26% (11 instances out of 43 components). The researcher expected more instances of the Demonstration/Show and Application/Ask principles. In addition, the researcher expected that the modules would be centered on a real-world problem, showing learners at the beginning of the modules what they would accomplish for the Integration/Do principle. Many factors contributed to the use and misuse of the First Principles of Instruction; these are discussed in this chapter.

This chapter includes a summary of the results, explanations, and probable

conclusions for the outcomes of this study. Limitations of this study are addressed as well

as future research possibilities. Moreover, implications and recommendations for

instructional design practitioners, project managers, and instructional design educators

are provided.

Supporting Research Question 1

The first supporting research question for this study was — What were the

conditions under which the First Principles of Instruction were used? Results indicated

that the project requirements, personnel, designer experience, physical setting, and

training and meetings contributed to decision-making and ultimately to the use and

misuse of the First Principles of Instruction.

Project Requirements. There were two primary project requirements that

affected the use of the First Principles of Instruction. First, the new modules needed to be completed within an 11-week time period. Due to administrative processes that took several weeks to complete, the project was not able to begin when it was initially proposed, leaving a mere 11 weeks to start and complete the project. It is recommended that project leads or managers consider attenuating the requirements and scope of a project to better reflect the time and resources available.

Second, the client required their existing face-to-face materials to be converted to

an online environment. The modules were housed in an online portal. The online portal

was a combination of a course management system and a repository of lesson plans and

activities for K-12 teachers. In addition to being online, the new modules needed to be

independent study. The project lead indicated that it was difficult to “figure out what the

client really wanted.” The difficulty may have stemmed from having little contact with

the client upfront, and it could be suggested that the client didn't know what they wanted and left it to the project lead to determine the requirements for them. It is unclear to the researcher what contact the project lead and directors had with the client prior to the beginning of the project. It is recommended that a thorough analysis be completed before the initiation of a project to determine specific requirements. However, thorough analyses can take a considerable amount of time, which the project leads may have felt could not be spared; as a result, only general requirements were defined upfront rather than more detailed ones. It is believed that spending more time upfront defining the requirements thoroughly would eventually save time in the long run. Leaders can undertake a "capacity analysis" to help identify requirements and align them with the resources available. A capacity analysis consists of identifying project goals and requirements, the resources required and their availability, and key resource constraints that may cause gaps and bottlenecks (Cooper, 1999) in the successful completion of an instructional design project.

Personnel and Designer Experience. Personnel included instructional designers

and designers-by-assignment (i.e. those with no formal instructional design training). Due

to the timing of the project and the length of time getting the project started, personnel

who were available to work on the project were limited. One finding that likely had an impact on how the First Principles were used was the large gap in experience between designers. There were 13 instructional designers with six years or less of instructional design experience and only two designers with 13 and 20 years of experience. In addition, two team leads reported having only one year of instructional design experience (see Table 5.2).

Table 5.2

Means and Standard Deviations of Years of Experience

                           N    M       SD     Range (years)
Project Leads              2    16.5    4.95   13 - 20
Team Leads                 6    3.5     2.17   1 - 6
Instructional Designers    4    1.73    .98    11 mo. - 3
Designers-by-Assignment    3    0       0      0


This lack of experience likely contributed to how instructional design decisions were made regarding the First Principles of Instruction. First, not having enough proficient and expert designers to mentor and coach novices and advanced beginners (see Dreyfus, 2005) could have led to poor decision-making. Gibbons (2003) stated that

instructional designers evolve through different “centric” phases as they develop their

knowledge and gain experience. He described how designers begin at different entry

points. Some instructional designers in the current study may have entered into this

project as media-centric (see Gibbons, 2003), which means designers tend to be more

focused on the media or delivery method. Findings revealed that some designers were

focused more on the online environment and the challenges developing instruction for the

Internet. Conversely, more experienced designers may have been more strategy or model-

centric. “Model-centering encourages the designer to think first in terms of the system

and model constructs that lie at the base of subject-matter knowledge” (Gibbons, 2003).

The variability in the application of First Principles of Instruction could be a result of the

divergent entry points of each designer and the insufficient number of experienced

designers to coach less experienced designers into a convergent entry point.

Another plausible explanation of how designers' experience levels may have affected the use of the First Principles of Instruction stems from research indicating that

entry-level designers often struggle in applying certain employer expected instructional

design skills (Villachica, Marker, & Taylor, 2010). The project leads expected that the

instructional designers would possess instructional design skills at the same level as their

theoretical knowledge and to complete tasks requiring these skills (i.e. conduct content

analysis, create design documents, and apply an instructional design model) without the

assistance of more experienced designers. In addition, the instructional designers may

have been “laboring under a halo effect” (Villachica, Marker, & Taylor, 2010. p. 49).

This could be a result from an inflated view of their knowledge, skills, and abilities,

which could have stemmed from excelling in coursework that may have included the

completion of some real-world instructional design projects.

Physical Setting and Training Meetings. Many of the participants (instructional designers, subject matter experts (SMEs), and project leads) were separated by space and time. The project lead and lead designer were located in two different office

buildings and most of the instructional designers lived locally but telecommuted, and all

but one SME telecommuted. The primary mode of communication between those who

telecommuted was asynchronous e-mail. Many instructional designers were often not

physically present but would either come in and meet face-to-face for weekly meetings

and trainings, or they would meet via web conferencing. Instructional designers and team leads felt the separation in physical space and time made it difficult to collaborate and to give and receive support in the design of the modules. Physical spaces and group

boundaries provide a venue to brainstorm, problem-solve, and generate new ideas

(Sundstrom, De Meuse, & Futrell, 1990). Based on Sundstrom, De Meuse, and Futrell's (1990) research, it can be suggested that if the instructional designers had been working face-to-face, the First Principles of Instruction might have been used more frequently within the modules. The close proximity of instructional designers can foster the exchange of ideas regarding the First Principles as well as facilitate peer feedback; thus, instances of misuse of the principles could be identified and fixed immediately.

Several participants indicated that the initial training was insufficient in providing

them with the skills necessary to apply the First Principles of Instruction in this project.

This finding is consistent with Rowland’s (1992) research on instructional design

practices that included the practice of training instructional designers. Rowland (1992)

stated “our efforts to train designers and to assist designers in their work are based on

theory (i.e. a body of literature) that may be discrepant from practice” (p. 66).

Participants indicated that the initial training was based on the theory (i.e. provided with

journal articles to read) but not on practice leaving participants feeling the training was

insufficient.

There are several reasons why the training may have been insufficient in

providing designers with the skills necessary to apply the First Principles of Instruction

effectively. First, the structure of the training was not always consistent with effective

teaching and learning principles. Merrill (2002a) stated “the most effective

learning…environments are those that are problem-centered and involve the student in

four distinct phases of learning” (p. 44), which are Activation/Tell, Demonstration/Show,

Application/Ask, and Integration/Do. The training did not emulate a task-centered


approach using the First Principles of Instruction. The structure of the initial training

consisted of the leaders talking about how to apply the principles with little

demonstration and no application (i.e. practice) before instructional designers had to

integrate these skills into this real project. Learners, in this case the instructional

designers, most likely had difficulty "deriving deep understanding via traditional didactic approaches" (Oliver & Hannafin, 2001, p. 6). Second, the materials used for the training

consisted of journal articles and a website that was used as a model for how First

Principles of Instruction were applied in an online environment. These articles were

mostly theoretical and descriptive and the illustrative case and example website were

based upon entrepreneurship and writing business plans. This most likely made it

difficult for inexperienced designers to assimilate entrepreneurship and business plans to

math and science standards/benchmarks and instructional strategies that teachers could

use in their classrooms. Research indicates that students (i.e. inexperienced designers)

find it difficult to comprehend expert’s conceptions (Snir & Smith, 1995 in Oliver &

Hannafin, 2001). It can be assumed that reading Merrill’s articles without adequate

guidance, discussion, and application made it difficult for designers to comprehend how

to apply the principles.

One recommendation would be to have structured the training around the real-

world problem faced by the designers — use the existing materials and create an online

module employing the First Principles. The trainers could first demonstrate the steps (i.e.

model the process) to create the modules allowing the instructional designers to visualize

(Merrill, 2002a) and develop a mental model (Oliver & Hannafin, 2001). Guiding the

designers through the application of First Principles using the existing materials could

help foster their understanding and ability to apply the principles within the actual

situation.

Supporting Research Question 2

The second supporting research question was — What design decisions regarding the First Principles of Instruction were made during the project? There were three major topics and several sub-topics identified that impacted instructional designers' decision-making regarding the First Principles of Instruction. The most significant findings were the factors that affected instructional designers' decision-making regarding the First Principles of Instruction. Those factors included time, the knowledge and experience level of the instructional designers, the existing materials that were converted to the online modules, and the online environment. To provide context, the application of the First Principles is discussed first, followed by an explanation of the factors. The designers indicated that they applied some of the First Principles of Instruction but did not consistently apply others (e.g. Application/Ask, Integration/Do).

Third, findings indicated that instructional designers felt their decision-making authority was limited. Conversely, the last finding indicated that while the project directors and leads made the more strategic and program-planning decisions, the instructional designers made quite a few instructional design decisions, like content and media selection as well as instructional strategy decisions.

Application of First Principles of Instruction. Overall, the instructional designers felt the online modules were an improvement over the previously developed face-to-face modules, even though the online modules didn't employ the First Principles of Instruction as described by Merrill (2002a, 2007a, 2009a, 2009b). The designers felt that the principles were straightforward and easy to understand, yet they found the principles difficult to apply.

Designers indicated the Activation/Tell principle was the easiest to apply, which

can be corroborated by the number of instances of the Activation/Tell principle applied

within the modules.

Table 5.3

Percentage and Instances Ranges of the Use of First Principles

                      Percentage Range    Instances Range
Activation/Tell       52% - 86%           12 - 30
Demonstration/Show    3.4% - 48%          1 - 11
Application/Ask       4.3% - 39%          1 - 12
Integration/Do        2.3% - 6%           Only 1 for each module


Instructional designers also deemed the Demonstration/Show principle fairly simple to use. However, they indicated time as a major constraint in deciding when and how to apply this principle. The Application/Ask principle was viewed as challenging and as requiring more instructional design expertise to apply appropriately. The Integration/Do principle was not applied by the instructional designers because the project leads decided to standardize the integration component: the modules were independent study, and assessment of the integration component would be determined by individual school districts. Nevertheless, designers indicated that the Integration/Do principle was also very challenging to apply.

Several factors contributed to the way the First Principles were applied; time, knowledge/experience level, existing materials, and the online environment are discussed in the following Factors Affecting Decisions section. Other plausible explanations as to why instructional designers made the decisions they did regarding the First Principles of Instruction are discussed below.

Merrill (2009b) claims that current instruction is often topic based and presents

information only (i.e. Tell and Ask instruction) with few demonstrations and little

opportunity for learners to practice their new knowledge and skills. Instructional

designers with little real-world experience tend to rely on their own experiences and knowledge as they make decisions (Le Maistre, 1996); however, their knowledge and experiences are limited (Ertmer, York, & Gedik, 2009). Designers may have relied heavily on their prior knowledge of "information-only" instruction and used that prior knowledge as the basis for developing these modules. Le Maistre (1996) found that even though instructional designers received feedback and advice from experts, the designers used their own knowledge, rather than the experts' feedback, to complete a task about 80% of the time. Similarly, when making decisions regarding the First Principles of Instruction, instructional design experts are more likely than less experienced designers to use the declarative knowledge acquired through formal instructional design education (Rauchfuss, 2010).

Another explanation for the limited use of the First Principles was the rapid nature of this project. Richey (2005) stated that applying a model during a project with a tight timeline in a fast-paced environment is difficult for instructional designers, particularly those with limited design experience. Rowland (1992) confirmed that design processes, like the application of the

First Principles, and the quality of instruction “are affected by many factors, among them

the designer’s knowledge, skill, and experience; the design task, the working conditions

and environment; and methods and management” (p. 82). Likewise, Richey (2005) said

some approaches require more experienced designers than do other approaches (see p.

177).

Factors Affecting Decisions. Instructional designers specified that time,

knowledge/experience level, existing materials, and the online environment all

contributed to how they used the First Principles of Instruction. It was revealed that time played a significant role in deciding how and when to apply the First Principles. Every participant indicated that there was a lack of adequate time to apply the principles as they wanted and as Merrill described them. Research supports the claim that time is a major factor in the practice of ISD and the use of ISD models and principles. The research showed that designers adapted their instructional design activities (e.g. eliminated certain tasks) based on time (Wedman & Tessmer, 1993), which may have resulted in products that were less effective (Visscher-Voerman, 1999).

Instructional designers felt that their knowledge and expertise levels affected the use of the First Principles. Research shows that the expertise levels of instructional designers are a factor in how ISD models are interpreted (Perez & Emery, 1995). Moreover, Edmonds, Branch, and Mukherjee (1994) explained that ISD models "vary in the amount of expertise required by individuals to apply the model" (p. 61). In other words, some models are better suited for designers with more expertise. Even though the participants indicated that the First Principles were easy to understand, they clearly stated that the principles were hard to apply, which suggests the principles may be better suited for designers with more experience.

The existing materials and the type of content were revealed as having impacted

designers’ decision-making when using the First Principles. Edmonds, Branch, and

Mukherjee (as cited in Richey, 2005) postulated that the effectiveness of an ISD model or set of principles

is dependent on the extent to which a match [exists] between the application context and the context for which the model was originally intended. The contextual elements they stress are not only for setting, but also differences in type of content and the type of product being produced (p. 176).

The First Principles were not necessarily intended for any specific content; however, Merrill (2007a, 2009b) identified five types of learning outcomes that most instruction

falls under (i.e. information-about, parts-of, kinds-of, how-to, and what-happens-if).

Instructional designers may not have been able to assimilate the content extracted from

the existing materials with these learning outcomes as described in the literature.

Instructional designers found it difficult to use the Application/Ask principle and the Integration/Do principle within an online environment. One explanation revealed by the data was that, due to the time constraint, the online modules and the online portal housing them couldn't interact with one another; therefore, learners' data could not be stored, nor could feedback be embedded within the system. Gibbons (2003)

suggested that inexperienced designers often focus on the delivery medium as they design

their instruction and they may struggle to look past the medium in order to solve the

design problem.

Limited Decision-Making Power. Instructional designers felt their decision-making authority was limited because the project leads had already made the main strategic decisions. Interestingly, the designers-by-assignment felt they had a lot of decision-making authority, while most of the instructional designers felt they didn't have the authority to make decisions. This phenomenon could be attributed to the designers-by-assignment not knowing the process of instructional design and the types of decisions that could be made. Some instructional designers felt they couldn't make decisions because their understanding of the First Principles of Instruction was limited, which hindered their decision-making.

Novice instructional designers tend to think linearly, or step-by-step, as suggested by process models like ADDIE (Rowland, 1992). Designers may have felt limited in their decision-making because they did not go through an entire instructional design process as represented by the ADDIE model, which many ISD programs emphasize when training new designers. ISD process models are "an ideal set of ID activities to be completed, typically in a prescribed sequence" (Wedman & Tessmer, 1993, p. 43). Novice instructional designers in this study probably expected to complete the majority of phases as described in these process models and may have had difficulty knowing how to adapt

if one of the phases was skipped. Perez and Emery (1995) indicated that novice instructional designers (those with less than two years of instructional design experience) had linear, one-dimensional thought processes about design compared to expert designers. Novice designers tend to concentrate on one instructional design factor at a time, like going through the individual phases of the ADDIE model without any overlap. This could mean that the instructional designers in this project expected to do more analyses than just a content analysis of the existing materials. It could also mean that they expected to determine their own instructional strategy, like Gagné's Nine Events of Instruction, as mentioned by some instructional designers during their interviews.

Since the instructional strategy (i.e. the use of the First Principles of Instruction) was already determined, the designers' decision-making about the overall strategy was ultimately limited; however, there were myriad other decisions that could be made within the framework of the First Principles of Instruction. Wedman and Tessmer (1993) suggested that designers in their study might not have made other instructional design decisions because they viewed their own decisions as superfluous, since similar design decisions had been made previously. This belief could have been exacerbated by their lack of knowledge of the First Principles of Instruction and their limited experience as instructional designers.

Designers felt that their lack of comprehension of and experience with the First Principles of Instruction restrained their ability to make decisions. This finding is consistent with Wedman and Tessmer's (1993) study on instructional design practices by novice and expert instructional designers. In their study, "lack of expertise" was one reason instructional designers gave for why they made certain instructional design decisions, like deciding to exclude a design activity (e.g. identifying learning outcomes, selecting instructional strategies).

Strategic/Program Planning Decisions. The strategic and program planning

decisions, like adopting the First Principles of Instruction as the framework for the entire


program of modules, were determined by the project directors and project leads before design and development began. The principles were adopted because the project leads had prior experience using this framework and felt it would be a good fit for the project. Similarly, the project lead created a template to help guide the designers in applying the First Principles of Instruction. Deciding to use the First

Principles and creating a design template for inexperienced designers may be attributed to the project lead's anticipation of the instructional designers' lack of prior experience both with the application of the First Principles of Instruction and with typical instructional design tasks. Expert designers often identify constraints and potential problems early in the project cycle and therefore identify solutions and make strategic decisions that help solve those problems before they truly become problems, or that lessen the impact of a constraint (Rowland, 1992).

The employment of a design template may have affected the use and misuse of

the First Principles of Instruction by designers. It had a positive impact for some

designers by guiding them through the components of a simplified version of the First

Principles of Instruction (i.e. Tell-Show-Ask-Do). However, for some novice designers it

probably contributed to their need to follow rules rigidly when solving design problems

(Dorst & Reymen, 2004). The template may have also led to novice designers’ tendencies

to be task-oriented and focused on the details instead of the underlying principles

(Ertmer, York, & Gedik, 2009; Reiser, 2004).

General Decisions. Instructional designers felt their decision-making was more low-level in terms of the impact those decisions made on the program. Designers indicated the general decisions included the selection of media, content, and sometimes the strategies by which to deliver the media and content. There were conflicting points of view on the types of decisions designers could make. Some designers didn't feel they could make content selection decisions and had to stick to the content provided in the existing materials, while other designers did not hesitate to supplement or replace the existing content with content they felt was more appropriate. The differences in the designers' levels of expertise can explain these conflicting points of view. Dorst and Reymen (2004) explained that instructional designers fall within Dreyfus's (2005) skills-based model that classifies seven levels of expertise (i.e. novice, advanced beginner,


competent, proficient, expert, master, and visionary). The majority of instructional designers in this study fall within the novice, advanced beginner, and competent statuses on Dreyfus's continuum, whereas the project lead and lead designer could be considered experts based on their reported years of experience. A novice instructional designer takes things at face value and consistently tries to follow rules. Novice designers probably took the instructions to use the existing materials and convert them to an online environment as a directive that couldn't be negotiated. Advanced beginners, in contrast, begin to hone their skills and recognize that there are exceptions to rules and situation-specific decisions. As designers continue to gain experience, they become more competent in solving instructional design problems, take more risks, and become more comfortable in the choices they make (Dorst & Reymen, 2004). Instructional designers at the advanced beginner and competent stages of expertise most likely felt they were able to interpret the instructions more fluidly, and they might have recognized that the use of existing materials was situational and not a hard and fast rule.

This may have affected the application of the First Principles because the existing materials didn't contain all of the components necessary to implement the First Principles framework. Since some instructional designers didn't feel they could stray from the existing materials, the content necessary to fulfill each principle may not have been included in the new modules.

Making the decision to apply an instructional design model and prescriptive principles is just the beginning of a series of fundamental design decisions. Even though project leads may determine the use of a model, it is the instructional designers on the front lines, designing the instruction and creating the storyboards, who make the specific design decisions about what gets implemented. In this study, the instructional designers felt they had limited decision-making power; however, they were the ones who operationalized the use of the First Principles in the modules. In reality, they had a tremendous amount of instructional design decision-making authority.

Supporting Research Question 3

The third supporting research question was — What is the level of understanding

of the First Principles of Instruction by instructional designers? The instructional

designers were surveyed on knowledge, comprehension, and application levels of


understanding of the First Principles of Instruction. For the knowledge-level questions, the designers were asked to recall the five First Principles of Instruction in order; any deviation from the order resulted in not receiving a point. Designers scored M = 3.07 out of a possible five points at the knowledge level. This moderately low score can be explained by the fact that while the designers were trained on the First Principles as described in the literature, the project adopted the simplified Tell-Show-Ask-Do framework. Next, instructional designers were asked to describe and provide an example of each of the principles, thus surveying their comprehension of each principle. The comprehension-level score was M = 12.31 out of 18, slightly higher than the knowledge score but still relatively low. Finally, the application score (M = 2.18 out of three) indicated a moderate ability to describe how they would apply the First Principles in a given scenario.

The low to moderate scores are not surprising for each level of understanding. However, it is surprising that designers scored proportionally higher on the application question than on the knowledge and comprehension questions. This may be attributed to the training the designers received. While they received the theoretical background of the First Principles through journal articles, only one designer-by-assignment and one instructional designer indicated that they studied those materials; most of the others indicated reading the materials but not studying them. Another explanation of why the application-level scores are higher than the comprehension and knowledge scores, as explained previously, is that the designers may have been more focused on the design tasks and not on the meaning behind the principles (Ertmer, York, & Gedik, 2009; Reiser, 2004). Finally, the project leads worked with the design teams, describing the steps for applying the First Principles. While the designers may not have followed those steps, they may have remembered them well enough to describe them on the survey. A limitation of this survey is that the researcher could not control for designers using resources to help them answer the questions or for the length of time taken to answer each question.

Supporting Research Question 4

The fourth research question was — How frequently do the modules incorporate the First Principles of Instruction? The modules typically had many more instances of Activation/Tell than of the other principles. As stated previously, the Application/Ask and Integration/Do principles were given standardized screens; instructional designers did not add any other instances of Integration/Do, although some did add instances of the Application/Ask principle to provide practice for the learners. These data support the previous findings about the decisions made by designers and the factors that contributed to those decisions regarding the use of the First Principles. While there is no hard and fast rule for how many instances of each principle should be implemented within the instruction, Merrill (2007b) explained that for each component skill there should be a demonstration of that skill and an opportunity to practice. Sometimes practice requires multiple component skills, so an instance of Application/Ask may not be expected for every component skill. Gardner (2011b) used an instrument similar to the one employed in this study to count the frequency of instances of the First Principles in biology modules. He counted the frequency of instances in modules that did not employ the First Principles and then in modules that were redeveloped using the First Principles. The latter had a more equal ratio of Activation/Tell instances to Demonstration/Show instances. A module using the First Principles had 16 instances of Activation/Tell, 13 instances of Demonstration/Show, no instances of Application/Ask, and seven instances of Integration/Do. Table 5.4 shows a comparison of Gardner's (2011b) module that employs the First Principles with two modules used in this study. Gardner's module has more instances of Demonstration/Show and, as Merrill's (2007b) example illustrates (see Table 5.1), uses the Do principle rather than the Ask principle.

Table 5.4

Comparisons of Gardner's (2011b) Module with 6-8 Grade Science and H.S. Earth and Space Science Modules

                                     Tell    Show    Ask    Do
Gardner (2011b) Module (28)a         16      13      0      7
6-8 Grade Science Module (28)        16      5       11     1
H.S. Earth & Space Science (29)      25      1       2      1

Note: a Total number of course components.


One major difference in the application of the First Principles in his modules versus the modules used for this study was that he created the modules himself under controlled circumstances and is an expert in using the First Principles of Instruction, whereas the modules used for this study were mostly designed by individuals with little or no experience with the First Principles and with little experience in designing instruction.

In summary, the environment of the ISD project played an important role in how

the First Principles were used. The physical location and spatial relationship of

instructional designers impacted how the First Principles were used. Novice designers

need consistent coaching and constructive feedback. Experienced and expert designers should be in close proximity to novice designers to allow for coaching and feedback to

occur. Moreover, designers who are not familiar with First Principles should be trained

on the application of the principles using effective, efficient, and engaging teaching and

learning practices. As suggested by participants, the training should have employed the

First Principles of Instruction.

Time was a major factor that affected instructional designers' decisions. While many ISD projects have strict timelines and it often isn't feasible to alter the timeline, the scope can be negotiated with the client. If the scope cannot be reduced, provisions need to be made to compensate for the limited time; for example, designers who have experience using the First Principles can be hired to provide oversight and guidance to the other designers. The restrictions of time can be averted by having previously established an environment and team with the necessary experience and background knowledge to apply the principles efficiently and efficaciously.

Implications. These findings have implications for instructional systems design (ISD) programs and for managers and leaders of instructional design projects. First, it is recommended that, in addition to teaching the theoretical foundations of instructional design, ISD programs adopt apprenticeship-based learning environments and/or action-learning approaches. An apprenticeship-based learning environment can bridge the gap between theory and practice by providing ISD students with actual instructional design work without the "constraints of a typical 3-credit college course [that] may limit [instructional design] experiences too severely for them to be truly representative of what new professionals will face in their first real instructional design assignments" (Bishop, Schuch, Spector, & Tracey, 2004, p. 20).

Similarly, action learning is “an alternative model of instruction that provides a

strong bases for use in authentic contexts and applied practice settings” (Bannan-Ritland,

2001, p. 40). Action learning is described as both a process and a program that

incorporates real-world problem solving, team-based learning techniques, capitalizes on

individual intellectual resources, and is said to “confront the increasing demands of… job

complexity” (Bannan-Ritland, 2001). Action learning is generally fostered in business

and management programs and is not often seen within ISD programs. In addition, programs may want to look closely at how their curricula align with employer expectations. Villachica, Marker, and Taylor (2010) state, "employer

expectations may vary across ID activities, with employers expecting entry level IDs to

perform some tasks without assistance and others with large amounts of assistance.”

The types of instructional design decisions and the reasons for making those decisions vary based on the expertise level of the instructional designers. Managers and leaders of instructional design projects should be aware that entry-level instructional designers will require some assistance as they hone their ISD skills and develop expertise; in particular, some instructional design tasks require more assistance from experts than others (Villachica, Marker, & Taylor, 2010). In addition, the implementation of performance support systems, education and training opportunities, and workplace mentoring programs can provide the support novice designers need to develop their skills (Villachica, Marker, & Taylor, 2010).

Limitations

This study had four main limitations: (1) the researcher was a participant observer; (2) the study was retrospective, with data collected after the project had been completed and after a gap of approximately six months between the end of the project and data collection; (3) the instruments used for this study were created by the researcher, were specific to the participants of this study, and had not been thoroughly validated (the evaluation instrument was adapted from Merrill (2007b) and had been used in only one previous study (Gardner, 2011b); neither Merrill nor Gardner validated it); and (4) the evaluators had limited experience with the First Principles of Instruction. To counteract some of these limitations, the data came from multiple sources to reduce bias. Even though the data were collected retrospectively, the data sources originated during the actual design project, including e-mail communications, project management documents, design documents, training materials, templates, and the modules themselves. During interviews, the researcher provided participants with cues to stimulate recall in order to elicit the rich information that interviews afford.

Future Research

This study revealed additional questions and opportunities for future research.

First and foremost, another study should be conducted under more typical conditions, with a more heterogeneous group of instructional designers rather than a majority of novices, and on a project with a reasonable deadline. Moreover, an instrument for evaluating curricula against the First Principles of Instruction should be rigorously validated; Merrill (in press) has created such an instrument, but it has not yet been shown to be reliable or valid. Second, more design and development research needs to take place to thoroughly validate the First Principles of Instruction. This study can serve as a building block for further research on the use and internal validation of the principles; similarly, continued external evaluation of the use and effectiveness of the principles is needed. Third, the modules used in this study should be tested for usability and for whether they support the intended learning outcomes. Learner motivation, satisfaction, and, most importantly, knowledge acquisition and transfer should be studied to help validate the efficacy of the First Principles of Instruction.

Conclusion

This study aimed to determine whether the First Principles of Instruction could be implemented in a fast-paced instructional design environment and to identify the conditions and factors that shaped design decisions regarding the use of the principles. Results revealed that the First Principles of Instruction were easy to understand yet difficult to apply under the conditions in which they were employed. The main factor that influenced designers' decision-making was time. In addition to time constraints, the designers' knowledge and experience levels, the instructional design setting, previously developed materials, and the online environment also contributed to the types of design decisions made during this project.

This study has implications for instructional design programs as well as

employers and managers of instructional design projects. Recommendations included

incorporating more apprenticeship-based programs to help designers experience the

challenges of design projects without the constraints of a semester-based project.

Employers should understand the abilities of entry-level instructional designers and provide novice designers with access to expert designers who can mentor and support them as they practice and develop more expert-like thinking and behaviors.

Further research is necessary to contribute to the knowledge base of how

instructional design models are used in various situations. Similarly, more research specifically on the First Principles is needed to validate them.


APPENDIX A

SCIENCE AND MATH STANDARDS INSTRUCTIONAL

MODULES

Program & Grade Band | Modules | Instructional Strategy

Science Grades 3-5
  Pretest/Posttest
  M1-Program Overview
  M2-Observation and Inferences | Explicit – Reflective
  M3-Interpretation and Modeling | Explicit – Reflective
  M4-Distance Size Investigation, Relative Size, and Scale Models | 5E Model
  M5-Light | Ask Questions, Graphic Organizer, Demonstration
  M6-Adaptation | Standards-Based Instruction, Backwards Design

Science Grades K-2
  Pretest/Posttest
  M1-Program Overview
  M2-Properties of Matter | 5E Model
  M3-Observations and Inferences | Explicit – Reflective
  M4-Interpretation and Modeling | Inquiry, Explicit – Reflective Instructional Strategy
  M5-Earth Structures | Concept Mapping
  M6-Living Organisms | Standards-Based Instruction, Backwards Design

Science Grades 6-8
  Pretest/Posttest
  M1-Program Overview
  M2-Earth Structures | Inquiry Based Instruction
  M3-Observations and Inferences | Explicit – Reflective
  M4-Diversity and Evolution of Living Organisms | 5E Model
  M5-The Role of Theories, Laws, Hypothesis and Models | Standards-Based Instruction, Backwards Design

High School Biology
  Pretest/Posttest
  M1-Program Overview
  M2-Interdependence | Inquiry
  M3-Observations & Inferences and Laws & Theories | Explicit – Reflective
  M4-Misconceptions and Evolution | Concept Mapping
  M5-Heredity and Reproduction | Standards-Based Instruction

High School Earth & Space Science
  Pretest/Posttest
  M1-Program Overview
  M2-Earth in Space and Time | Inquiry
  M3-Observations and Inferences | Explicit – Reflective
  M4-Earth Systems and Patterns | Concept Mapping
  M5-Earth Structures and Plate Tectonics | Standards-Based Instruction and Backwards Design

High School Chemistry
  Pretest/Posttest
  M1-Program Overview
  M2-Matter | Inquiry
  M3-The Practice of Science and The Role of Theories, Laws, Hypotheses and Models | Explicit – Reflective
  M4-Matter and Redox Reaction | Concept Mapping
  M5-Energy | Standards-Based Instruction and Backwards Design

High School Physics
  Pretest/Posttest
  M1-Program Overview
  M2-Intermolecular Bonding | Inquiry
  M3-Observation and Inferences | Explicit – Reflective
  M4-Gravitational Force | Concept Mapping
  M5-Exothermic and Endothermic Reactions | Standards-based

High School Algebra
  Pretest/Posttest
  M1-Program Overview
  M2-Polynomials: Variables | Representation and Connections
  M3-Relations and Functions | Explanation and justification
  M4-Linear Equations | Explanation and justifications
  M5-Quadratic Equations | Explanation and justifications

High School Geometry
  Pretest/Posttest
  M1-Program Overview
  M2-Mathematical Definitions and Vocabulary | Developing quality definitions, Using manipulative materials, Working in collaborative groups
  M3-Euclidean Constructions | Real-world exploration
  M4-Concurrency and Theories | Developing quality definitions, Using manipulative materials
  M5-Points of Concurrency in Triangles | Quality Definitions
  M6-Pythagorean Theorem | Problem solving and examining real-world contexts
  M7-Quadrilaterals | Developing Quality Definition, Analyzing Geometric Properties, using manipulative materials

APPENDIX B

DEMOGRAPHICS AND DESIGN KNOWLEDGE SURVEY

Identification Number: _____________________________________________

Age: ______________________

Gender: __________________

Role: __________________________________________________________________

How long did you work on the project? ___________________________

1. What is your highest degree completed?

a. Bachelor’s Degree

b. Master’s Degree

c. Doctorate Degree

d. Other

i. Please Specify

2. Are you currently working toward a degree?

a. Yes

b. No

3. If yes, what degree are you working towards?

a. Master’s Degree

b. Doctorate Degree

c. Please specify what the degree is in (i.e. Instructional Systems)

4. How long have you been working as an instructional designer?

a. Years

b. Months

5. List your various roles in instructional design projects:

6. What is your comfort level in


a. Using the ADDIE model for instructional design projects

b. Using other models for instructional design projects

i. Please specify which ones

c. Applying a particular learning theory to instructional design projects

i. Please specify which learning theories you are most familiar with

d. Developing an instructional module from scratch

e. Developing an instructional module given appropriate content

f. Working with subject matter experts

g. Working with a team of instructional designers

h. Creating media scripts

i. Creating instructional videos

j. Creating audio for use in instruction

k. Communicating design and development needs to programmers

7. Rate your level of expertise in instructional design:

a. Novice

b. Advanced Beginner

c. Proficient

d. Expert

8. Rate your understanding of the First Principles of Instruction:

1. Very Low

2. Low

3. Neither High nor Low

4. High

5. Very High

9. How did you come to know about First Principles of Instruction? Select all that apply (or rank)

a. Took a class with Dr. Merrill

b. Learned about it in a graduate class

c. Learned on my own

d. I have never heard of First Principles of Instruction.

e. Other

i. Please specify


10. What literature did you read to learn about First Principles of Instruction? (Select all that apply.)

a. Trends and Issues in Instructional Design and Technology by Reiser & Dempsey

b. Instructional Design Theories and Models (Volumes I, II, or III) by Reigeluth

c. Journal Articles

d. Online resources (podcasts, websites, wikis)

e. Other – please specify

11. Did you attend the initial kick-off meeting where the task-centered model was discussed?

a. Yes

b. No

c. I don’t remember.

12. Did you read the article titled First Principles of Instruction (Merrill, 2002a)?

a. Yes

b. No

c. I don’t remember.

13. If you read the article, to what degree do you feel you understood the content of the article?

1. Not at All

2. Little

3. Somewhat

4. A Considerable Degree

5. A great deal

14. Did you read the article titled A Task-Centered Instructional Strategy (Merrill, 2007b)?

a. Yes

b. No

c. I don’t remember.

15. If you read the article, to what degree do you feel you understood the content of the article?

1. Not at All

2. Little

3. Somewhat

4. A Considerable Degree


5. A great deal

16. Did you read the article titled A Task-Centered Approach to Entrepreneurship (Mendenhall et al., 2006a)?

a. Yes

b. No

c. I don’t remember.

17. If you read the article, to what degree do you feel you understood the content of the article?

1. Not at All

2. Little

3. Somewhat

4. A Considerable Degree

5. A great deal

18. Did you review the Entrepreneurship website (Mendenhall et al., 2006b) that was sent to you?

a. Yes

b. No

c. I don’t remember.

19. How much time did you spend reviewing the website?

a. 10 minutes or less

b. 10 minutes to 30 minutes

c. 30 minutes to 1 hour

d. 1 hour or more

20. To what degree do you feel going through the website helped your understanding of the First Principles of Instruction?

1. Not at All

2. Little

3. Somewhat

4. A Considerable Degree

5. A great deal.


APPENDIX C

INTERVIEW PROTOCOL AND QUESTIONS

Date:

Time:

Interviewer:

Participant Number:

Instructions:

Introduce yourself and ask the interviewee if they have any questions or concerns before

continuing with the interview. Before beginning the interview read the following

statement:

Thank you for participating in this research study. You have been chosen to participate in

this study because of your involvement with the development of the professional training

modules (as stated in the consent form). To help facilitate the interview process and note

taking I will be audio recording our conversation. Only the researchers will have access

to the audio files. When the study has been completed the audio files will be destroyed.

Do you have any questions or concerns?

• During the interview remember to probe and ask follow-up questions if something

is not clear or needs an explanation. Have participants define what they mean and

be explicit.

• Do not bias their responses or “put words in their mouths.”


Questions:

1. What was your role in this project?

a. Describe/elaborate

2. What tasks did you do?

3. Which modules did you work on? For each module, what tasks did you perform?

Which ones did you have the most influence on?

4. Which modules do you feel most closely incorporate the First Principles of

Instruction?

5. Please give a specific example of how you applied the First Principles of Instruction.

How did you make that design decision?

6. How did you make design decisions?

a. What were the factors that contributed to the decisions you made?

7. What were the client requirements for the project?

8. Were there any limitations or constraints in the project? If so, what were they?

9. Did these constraints or limitations influence your use of the First Principles of

Instruction? If so, how did these limitations and constraints influence your use of the

First Principles of Instruction?

10. How did the workplace environment affect your decision-making?

11. For which tasks was it most difficult to apply the First Principles of Instruction?

12. For which tasks was it easy to apply the First Principles of Instruction?

13. What were the top three things you would do differently regarding the usage of the

First Principles of Instruction, if given the chance?

14. What top three things would you do the same regarding the usage of the First

Principles of Instruction, if given the chance?


APPENDIX D

MODULES RANDOMLY SELECTED FOR EVALUATION

Program & Grade Band | Module Title | Instructional Strategy

Science Grades K-2 | M6-Living Organisms | Standards-Based Instruction, Backwards Design

Science Grades 3-5 | M5-Light | Ask Questions, Graphic Organizer, Demonstration

Science Grades 6-8 | M3-Observations and Inferences | Explicit – Reflective

High School Biology | M2-Interdependence | Inquiry

High School Earth & Space Science | M4-Earth Systems and Patterns | Concept Mapping

High School Chemistry | M2-Intermolecular Bonding | Inquiry

High School Physics | M4-Gravitational Force | Concept Mapping

High School Algebra | M5-Quadratic Equations | Explanation and justifications

High School Geometry | M7-Quadrilaterals | Developing Quality Definition, Analyzing Geometric Properties, using manipulative materials

APPENDIX E

FIRST PRINCIPLES OF INSTRUCTION KNOWLEDGE

SURVEY

Identification Number: ________________________________________

PART I: You have been asked to design a module, using the First Principles of

Instruction, which will instruct teaching assistants about plagiarism, how to identify if

something is plagiarized, and how to help students prevent plagiarism. Before you design

the module your supervisor wants to know what your level of understanding of the First

Principles of Instruction is in order to determine if she needs to provide more training

before you begin to design the plagiarism module.

Please look at the diagram. Does this look familiar? This is the diagram used by

Merrill (2002a) when he describes the First Principles of Instruction. Fill in the blank

with the corresponding First Principle.

1. Label the diagram: 1. ? 2. ? 3. ? 4. ? 5. ?

Next Page (cannot go back)

[Diagram: recreation of Merrill's (2002a) First Principles of Instruction figure, with five numbered blanks (1–5) to label]


PART II: Now that you have filled in the blank with each First Principle of Instruction,

please take some time and describe each of the First Principles of Instruction. Remember

your supervisor is only looking at your knowledge of the First Principles of Instruction.

This activity is not assessing your ability to design instruction. Your supervisor is

collecting information to help her develop training so you can be more successful in

developing instruction using the First Principles of Instruction.

For each item, please try to be thorough when answering. Please include

examples to help illustrate what you mean.

2. Describe what a “whole-task” or “task-centered” problem means and how it is used to promote learning:

3. Describe what it means to “Activate Prior Knowledge” and how it promotes learning:

4. Describe what “Demonstration” means and how it promotes learning:

5. Describe what “Application” means and how it promotes learning:

6. Describe what “Integration” means and how it promotes learning:

Next Page (cannot go back)


PART III: Now that you have had a chance to reflect on what the First Principles of

Instruction are, your supervisor would like you to complete the following activity.

You have been asked to design a module, using the First Principles of Instruction. The

module will instruct teaching assistants about plagiarism, how to identify if something is

plagiarized, and how to help students prevent plagiarism. Plagiarism is the unauthorized

use or close imitation of another author’s work without giving the appropriate credit to

the author (http://dictionary.reference.com/browse/plagiarism; 2012).

Describe the steps you will take and how you will use the First Principles of Instruction

to create this module. Be as specific as possible.

Submit (end of survey)


APPENDIX F

MODULE EVALUATION SHEET

Reviewer’s Name:___________________________________________________

Date: ________________________

Module Number: ______________ Module Name: ________________________

COURSE COMPONENT | TELL – Information Presentation | SHOW – Portrayal Demonstration | ASK – Information Recall, Practice Application | DO – Integration of new knowledge/skills

(Blank evaluation grid: one row per course component, with a column for each of the four phases.)

APPENDIX G

RECRUITMENT E-MAIL

Dear _________________________,

You are invited to participate in a research study about instructional designers and their design decisions.

You were selected as a potential participant because of your involvement with the creation of the Next

Generation Sunshine State Standards (NGSSS) Professional Development modules for Florida K-12 teachers.

The purpose of this study is to examine the:

• Use of the First Principles of Instruction during a short-term, high volume instructional product

development project; and the

• Design and development decisions made by instructional designers.

If you agree to participate in this study, the researchers will interview each participant individually. In addition to interviews, you will be asked to complete a demographic survey, a task-centered instructional strategy knowledge survey, and a processes survey. Based upon your responses in the interviews and on the surveys, the researchers may contact you for follow-up information. The total time commitment would be approximately 2 to 2 ½ hours over a period of no more than 12 weeks.

Your participation is completely voluntary, and you can withdraw from the study at any time; there is no

penalty if you do not wish to participate. All information that we collect will be kept confidential to the extent allowed by law.

When you complete the interviews and surveys you will be compensated with a $30 gift card.

If you agree to participate, please click on the link below to create a confidential alias (a number you will use to identify yourself).

LINK

Please contact me if you have any questions or concerns.

Thank You,

Anne Mendenhall PhD Candidate, Instructional Systems

Educational Psychology and Learning Systems

Florida State University


APPENDIX H

CONSENT FORM

Florida State University Consent Form:

Examining the Use of First Principles of Instruction by Instructional Designers in a Short-term,

High Volume, Rapid Production of Online K-12 Teacher Professional Development Modules

Principal Investigator: Anne M. Mendenhall

Educational Psychology and Learning Systems

Florida State University

Faculty Supervisor: Dr. Tristan Johnson

Learning Systems Institute, Florida State University

Faculty Co-Supervisor: Dr. James Klein

Educational Psychology and Learning Systems

Florida State University

November 18, 2011

Dear Participants,

You are invited to participate in a research study about instructional designers and their design decisions.

You were selected as a potential participant because of your involvement with the creation of the Next Generation Sunshine State Standards (NGSSS) Professional Development modules for Florida K-12

teachers.

Background Information:

The purpose of this study is to examine the:

• Use of the First Principles of Instruction during a short-term, high volume instructional product

development project; and the

• Design and development processes taken and decisions made by instructional designers.

Procedures

If you agree to participate in this study, the researchers will interview each participant individually and as a group. In addition to interviews and focus groups, you will be asked to complete a Demographic and Design Knowledge survey and a First Principles of Instruction Knowledge survey. Based upon your responses in the interviews and on the surveys, the researchers may contact you for follow-up information. The total time commitment would be approximately 2 to 2 ½ hours over a period of no more than 12 weeks.


FSU Human Subjects Committee Approved on 3/01/2012. Void after 1/07/2013. HSC # 2012.7963

Your interview and focus group responses will be audio recorded to help facilitate note taking.

The audio files will be stored digitally and after the completion of the study the audio files will be

destroyed.

Risks

The data collection methods and procedures present minimal risks to participants. The risks associated

with this study are no more than those experienced in daily life.

Benefits

Participants will be able to reflect on instructional design processes and decisions made during the design

and development of the modules. Reflection is a key component of learning. When instructional designers

reflect upon their experiences they will identify what processes and decisions worked well and what decisions didn’t work well. This reflection process will allow designers to learn from their experiences

and apply that knowledge to future instructional design projects.

Confidentiality

The data collected for this study will be kept private and confidential to the extent permitted by law.

Participants’ names will be kept private and confidential to the extent permitted by law. Any publication,

report, or printed articles will not identify individuals by name or allude to an individual person. Participants will be asked to initially put their names on the surveys. This is only for the researchers’

purpose of making sure follow-up data is attributed to the correct person. After the data collection process

has been completed the data with participant names will be destroyed. The researchers will keep your

decision to participate, not to participate, or withdrawal from the study confidential to the extent permitted by law.

Voluntary Nature of the Study

Your participation in this study is voluntary. If you decide not to participate there will be no retribution. If you decide to participate, you are free to decline to answer any questions or withdraw from the study at

any time.

Contacts and Questions

The principal investigator and lead researcher of this study is Anne Mendenhall. Please feel free to ask any questions now or anytime during the study. You are encouraged to contact her by phone, e-mail, or in

person.

If you have any questions or concerns regarding this study and would like to talk to someone other than the researcher(s), you are encouraged to contact the FSU IRB at 2010 Levy Avenue, Research Building

B, Suite 276, Tallahassee, FL 32306-2742, or (850) 644-8633, or by E-mail at

[email protected]


Interview 60 – 70 Minutes

Surveys 45 - 60 Minutes

Approximate Total Time 2 – 2 1/2 Hours


You will be given a copy of this information to keep for your records.

If you choose to participate in this study, please confirm your consent by signing and dating below. If you

have questions or concerns please contact Anne Mendenhall or Dr. Tristan Johnson.

Statement of Consent:

I have read the above information. I have asked questions and have received answers. I consent to

participate in the study.

Signature Date

Signature of Researcher Date



APPENDIX I

SCORING PROTOCOL AND RUBRIC FOR FPI SURVEY

Scoring Protocol and Rubric for:

First Principles of Instruction Knowledge Survey

Date: __________________________ Scorer: _________________________________

Protocol:

1. Each scorer will read the following articles to refresh their understanding of First Principles of Instruction.

Merrill, M. D. (2002a). First Principles of Instruction. Educational Technology Research and

Development, 50(3), 43-59.

Merrill, M. D. (2007b). A Task-Centered Instructional Strategy. Journal of Research on Technology in

Education, 40(1), 33-50.

2. The scorers will meet and score one survey together using the rubric, discussing any discrepancies with the rubric, determining whether it needs to be changed, and changing it accordingly.

3. Scorers will then individually score another survey, discuss discrepancies, and come to a consensus.

4. Once there are very few discrepancies, the scorers will again score individually and inter-rater reliability will be calculated. If reliability is not in an acceptable range, the scorers will meet, discuss individual discrepancies, and come to a consensus.
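Step 4 calls for an inter-rater reliability calculation but does not name a statistic. As an illustration only, here is a minimal Python sketch computing simple percent agreement and Cohen's kappa, one common choice for two raters using a categorical rubric; the item scores shown are invented.

```python
from collections import Counter

def percent_agreement(scores_a, scores_b):
    """Proportion of items on which the two scorers gave the same score."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(scores_a)
    observed = percent_agreement(scores_a, scores_b)
    freq_a = Counter(scores_a)
    freq_b = Counter(scores_b)
    # Chance agreement: probability both scorers assign the same category.
    expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(scores_a) | set(scores_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical 0-3 rubric scores from two scorers on ten survey items.
scorer_1 = [3, 2, 2, 0, 1, 3, 2, 1, 3, 2]
scorer_2 = [3, 2, 1, 0, 1, 3, 2, 2, 3, 2]

print(f"Agreement: {percent_agreement(scorer_1, scorer_2):.2f}")
print(f"Kappa:     {cohens_kappa(scorer_1, scorer_2):.2f}")
```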

Give 1 point for each correct answer. (Maximum 5 Points)

PART I: You have been asked to design a module, using the First Principles of Instruction, which will instruct teaching assistants about plagiarism, how to identify if something is plagiarized, and how to help students prevent plagiarism. Before you design the module your supervisor wants to know what your level of understanding of the First Principles of Instruction is in order to determine if she needs to provide more training before you begin to design the plagiarism module. Please look at the diagram. Does this look familiar? This is the diagram used by Merrill (2002a) when he describes the First Principles of Instruction. Fill in the blank with the corresponding First Principle.

1. Label the diagram: (Acceptable answers)

1. Whole Task, Task-centered, Problem, Problem-centered
2. Activation, Activate Prior Knowledge (Tell, ask questions – ½ point)
3. Demonstration, Show
4. Application, Apply, Practice, Ask
5. Integration, Transfer of knowledge or skill, Do


PART II: Now that you have filled in the blank with each First Principle of Instruction, please take some time and describe each of the First Principles of Instruction. Remember your supervisor is only looking at your knowledge of the First Principles of Instruction. This activity is not assessing your ability to design instruction. Your supervisor is collecting information to help her develop training so you can be more successful in developing instruction using the First Principles of Instruction.

For each item please try and be thorough when answering each question. Please include examples to help illustrate what you mean.

First Principles Description
0 Points: Did not answer.
1 Point: Mostly inaccurate descriptions that are not articulated well.
2 Points: Accurately but not thoroughly explains the phase and/or does not provide an example.
3 Points: Accurately and thoroughly explains the phase and provides an accurate example that illustrates how the phase is applied.

Promote Learning Component
0 Points: Did not answer.
1 Point: Mostly inaccurate description of how the phase promotes or increases learning.
2 Points: Partially describes how the phase promotes or increases learning.
3 Points: Accurately and thoroughly describes how the phase promotes or increases learning.

2. Describe what a "whole-task" or "task-centered" problem means and how it is used

to promote learning:

(2pt. Definition) “Learning is promoted when learners are engaged in solving real-world problems” (Merrill, 2002a, pg. 45).

(3pt. Definition should include one or more examples listed in this definition) “Learning is promoted when learners are shown the task that they will be able to do or the problem they will be able to solve as a result of completing a module or course. Learning is promoted when learners are engaged at the problem or task level, not just the operation or action level. Learning is promoted when learners solve a progression of problems that are explicitly compared to one another” (Merrill, 2002a, pg. 45).

3. Describe what it means to "Activate Prior Knowledge" and how it promotes

learning:

(2pt. Definition) “Learning is promoted when relevant previous experience (prior knowledge) is activated” (Merrill, 2002a, pg. 46).

(3pt. Definition should include one or more examples listed in this definition) “Learning is promoted when learners are directed to recall, relate, describe, or apply knowledge from


relevant past experience that can be used as a foundation for new knowledge. Learning is promoted when learners are provided relevant experience that can be used as a foundation for new knowledge. Learning is promoted when learners are provided or encouraged to recall a structure that can be used to organize new knowledge” (Merrill, 2002a, pg. 46).

4. Describe what "Demonstration" means and how it promotes learning:

(2pt. Definition) “Learning is promoted when the instruction demonstrates what is to be learned rather than merely telling information about what is to be learned” (Merrill, 2002a, pg. 47).

(3pt. Definition should include one or more examples listed in this definition) “Learning is promoted when the demonstration is consistent with the learning goal: (a) examples and non-examples for concepts, (b) demonstrations for procedures, (c) visualizations for processes, and (d) modeling for behavior. Learning is promoted when learners are provided appropriate learner guidance including some of the following: (a) learners are directed to relevant information, (b) multiple representations are used for the demonstrations, or (c) multiple demonstrations are explicitly compared” (Merrill, 2002a, pg. 47-48).

5. Describe what "Application" means and how it promotes learning:

(2pt. Definition) “Learning is promoted when learners are required to use their knowledge or skill to solve problems” (Merrill, 2002a, pg. 49).

(3pt. Definition should include one or more examples listed in this definition) "Learning is promoted when the application (practice) and the posttest are consistent with the stated or implied objectives: (a) information-about practice – recall or recognize information, (b) parts-of practice – locate, name, or describe each part, (c) kinds-of practice – identify new examples of each kind, (d) how-to practice – do the procedures, and (e) what-happens practice – predict a consequence of a process given conditions, or find fault conditions given an unexpected consequence. Learning is promoted when learners are guided in their problem solving by appropriate feedback and coaching, including error detection and correction, and when this coaching is gradually withdrawn. Learning is promoted when learners are required to solve a sequence of varied problems" (Merrill, 2002a, pg. 49).

6. Describe what "Integration" means and how it promotes learning:

(2pt. Definition) "Learning is promoted when learners are encouraged to integrate (transfer) the new knowledge or skill into their everyday life" (Merrill, 2002a, pg. 50).

(3pt. Definition should include one or more examples listed in this definition) “Learning is promoted when learners are given an opportunity to publicly demonstrate their new knowledge or skill. Learning is promoted when learners can reflect on, discuss, and defend their new knowledge or skill. Learning is promoted when learners can create, invent, and explore new and personal ways to use their new knowledge or skill” (Merrill, 2002a, pg. 50).


7. Describe what Tell, Show, Ask, and Do mean:

Tell – corresponds to the Activation phase. In the Tell part, the learners' prior knowledge is activated. Tell is where general information about the concepts/skills is provided.

Show – corresponds to the demonstration phase. This is where the concepts/skills that were taught previously are demonstrated for the learner. A specific portrayal is used to demonstrate the general information.

Ask – corresponds to the application phase. Learners are provided with opportunities to practice and apply their new skills.

Do – corresponds with the integration phase. Learners are provided with a new situation, new artifact to create, or new problem so they can demonstrate their new knowledge or skills.

0 Points: Did not answer.
1 Point: Mostly inaccurately describes the meaning and how the First Principles correlate with Tell, Show, Ask, and Do.
2 Points: Partially describes the meaning and how the First Principles correlate with Tell, Show, Ask, and Do.
3 Points: Accurately and thoroughly describes the meaning and how the First Principles correlate with Tell, Show, Ask, and Do.

PART III: Now that you have had a chance to reflect on what the First Principles of Instruction are, your supervisor would like you to complete the following activity.

You have been asked to design a module, using the First Principles of Instruction. The module will instruct teaching assistants about plagiarism, how to identify if something is plagiarized, and how to help students prevent plagiarism. Plagiarism is the unauthorized use or close imitation of another author’s work without giving the appropriate credit to the author (http://dictionary.reference.com/browse/plagiarism; 2012).

Describe the steps you will take and how you will use the First Principles of Instruction to create this module. Be as specific as possible.

0 Points: No answer.
1 Point: Provides an incorrect response and does not address any of the First Principles.
2 Points: Provides a partially correct response that addresses some of the First Principles but not all.
3 Points: Provides a fully correct response that addresses all of the First Principles.

First Principles Approach should include the following:

1. Identify a typical real-world whole task. "Gather a set of specific whole tasks. Often it is possible to gather artifacts in the workplace. For processes it is often possible to video samples of the process in the workplace" (Merrill, 2007b, pg. 38).

2. Identify "a series of similar tasks of increasing complexity" (Merrill, 2007b, pg. 35). "Sequence the tasks by putting the least complex tasks early in the progression with succeeding tasks those that have more elaborated knowledge and skill components or more component skills than preceding tasks" (Merrill, 2007b, pg. 38).

3. "Adapt the tasks or select alternate tasks as necessary to facilitate a smooth progression and to best enable demonstration and application of each component skill" (Merrill, 2007b, pg. 38).

4. Apply an instructional strategy that includes activating prior knowledge or experiences (Tell), demonstrating the component skills or concepts being taught (Show), providing learners with multiple opportunities to practice and apply their new knowledge (Ask), and providing the learner with an opportunity to integrate their new knowledge in the real world or in a new whole task or problem that simulates the real world. (A sketch of step 2's sequencing rule follows this list.)
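Step 2's sequencing rule, least complex tasks first, can be pictured as ordering task records by a complexity measure. A minimal Python sketch, with hypothetical tasks and an invented complexity score (e.g., the number of component skills a task requires):

```python
# Hypothetical task records for a progression of whole tasks; the
# complexity scores are invented for illustration only.
tasks = [
    {"name": "Identify plagiarism in a short quote", "complexity": 2},
    {"name": "Audit a full student essay",           "complexity": 5},
    {"name": "Paraphrase with proper citation",      "complexity": 3},
]

# Merrill's rule: least complex tasks early, more elaborated tasks later.
progression = sorted(tasks, key=lambda task: task["complexity"])

for position, task in enumerate(progression, start=1):
    print(position, task["name"])
```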


First Principles of Instruction Knowledge Survey Score Sheet

Date: __________________________ Scorer: ________________________________

Participant ID: ____________________________

Total Points: ____________________________

1. Label the diagram:

Item | Points | Notes
1.   |        |
2.   |        |
3.   |        |
4.   |        |
5.   |        |
Total Points:

2. – 6. Descriptions of First Principles of Instruction

Item | Points | Notes
2.   |        |
3.   |        |
4.   |        |
5.   |        |
6.   |        |
Total Points:

7. Description of Tell, Show, Ask, and Do

Item | Points | Notes
7.   |        |

8. Application of First Principles

Item | Points | Notes
8.   |        |
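Putting the rubric and score sheet together, a participant's total is the sum of per-part points: five diagram labels at 1 point each, five description items at up to 6 points each (3 for the description, 3 for its promote-learning component), up to 3 points for the Tell/Show/Ask/Do item, and up to 3 for the application task. The following Python sketch tallies a total under those assumptions; the maxima are inferred from the rubric above rather than taken from a published scoring script.

```python
# Maximum points per survey part, as inferred from the rubric above.
MAX_POINTS = {
    "diagram": 5,        # Part I: 1 point per correctly labeled blank
    "descriptions": 30,  # Items 2-6: (3 + 3) points each
    "tell_show_ask_do": 3,
    "application": 3,
}

def total_score(diagram, descriptions, tell_show_ask_do, application):
    """Sum the per-part points after checking each against its rubric maximum."""
    parts = {
        "diagram": diagram,
        "descriptions": descriptions,
        "tell_show_ask_do": tell_show_ask_do,
        "application": application,
    }
    for name, points in parts.items():
        if not 0 <= points <= MAX_POINTS[name]:
            raise ValueError(f"{name} score out of range: {points}")
    return sum(parts.values())

# Hypothetical participant: 4/5 labels, 22/30 on descriptions, 2/3 and 3/3.
print(total_score(diagram=4, descriptions=22, tell_show_ask_do=2, application=3))
```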


APPENDIX J

SAMPLE PROGRAM LOGIC AND STORYBOARD

TEMPLATES

Note from researcher: These are examples of the program logic and storyboard templates for

Science Grades 3-5. Some of the screens, narration, developer notes, and other pieces of content

have been altered or deleted to maintain anonymity and to reduce the length of the file. This is

merely an example to be used for illustrative purposes.

Program Logic Template for 3-5 Science

1. Overview of Program—[like the content in the Matrix]

2. Standards Framework
   NGSSS Framework with focus on specifics for this program
   Mapping of Content with Benchmarks

3. Program Goals

4. Content Area 1: [Big Idea 1 and 2] Observations and Inferences
   Instructional Strategy: Explicit – Reflective
   Presentation of Content and Instructional Strategy [TELL]
     1.1 Explicate related benchmarks
     1.2 Observations
     1.3 Inferences
     1.4 Explicit – Reflective Instructional Strategy
   Demonstration merging Content and Instructional Strategy [SHOW]
     2.1 Monkey [Big Idea 1 and 2]
     2.2 Fossil Foot Print [Big Idea 1 and 2]
   Assess Content and Instructional Strategy generally [ASK]
     3.1 General questions about Observations and Inferences (MC)
     3.2 Have learners find appropriate activities in [THE ONLINE PORTAL] for benchmarks related to Observations and Inferences
   Practice creating an instructional activity for the specified content [DO]
     4.1 Plan an instructional activity using the Explicit-Reflective instructional strategy that includes the benchmarks related to Observations and Inferences

5. Content Area 2: [Big Idea 1 and 3] Interpretation and Modeling
   Instructional Strategy: Explicit – Reflective [continued]
   Presentation of Content and Instructional Strategy [TELL]
     1.1 Explicate related benchmarks
     1.2 Interpretation
     1.3 Modeling
     1.4 Explicit – Reflective Instructional Strategy [review]
   Demonstration merging Content and Instructional Strategy [SHOW]
     2.1 String Tubes [Big Idea 1 and 3]
   Assess Content and Instructional Strategy generally [ASK]
     3.1 General questions about Interpretation and Modeling (MC)
     3.2 Have learners find appropriate activities in [THE ONLINE PORTAL] for benchmarks related to Interpretation and Modeling
   Practice creating an instructional activity for the specified content [DO]
     4.1 Plan an instructional activity using the Explicit-Reflective instructional strategy that includes the benchmarks related to Interpretation and Modeling

6. Content Area 3: [Big Idea 5] Distance Size Investigation, Relative Size, and Scale Models
   Instructional Strategy: 5E
   Presentation of Content and Instructional Strategy [TELL]
     1.1 Explicate related benchmarks
     1.2 Distance Size Investigation
     1.3 Relative Size
     1.4 Scale Models
     1.5 5E Instructional Strategy
   Demonstration merging Content and Instructional Strategy [SHOW]
     2.1 Ball Activity [Big Idea 5]
   Assess Content and Instructional Strategy generally [ASK]
     3.1 General questions about Distance Size Investigation, Relative Size, and Scale Models (MC)
     3.2 Have learners find appropriate activities in [THE ONLINE PORTAL] for benchmarks related to Distance Size Investigation, Relative Size, and Scale Models
   Practice creating an instructional activity for the specified content [DO]
     4.1 Plan an instructional activity using the 5E Instructional Strategy that includes the benchmarks related to Distance Size Investigation, Relative Size, and Scale Models

S35M2SB—Observations & Inferences Storyboard
ID:
PPT:

Section 01: Introduction
Frame: Title Screen

SCREEN: NARRATION: DEVELOPER NOTES:

Frame: Goals

SCREEN: NARRATION: DEVELOPER NOTES:

Frame: Overview

SCREEN: NARRATION: DEVELOPER NOTES:

Frame: Big Ideas

SCREEN: Science Grades 3 – 5 Professional Development

• Big Idea 1 – The Practice of Science

• Big Idea 2 – The Characteristics of Scientific Knowledge

NARRATION: DEVELOPER NOTES:

Frame: Benchmarks 1 of 2

SCREEN: SC.3.N.1.2 Compare the observations made by different groups using the same tools and

seek reasons to explain the differences across groups NARRATION: DEVELOPER NOTES:

Frame: Benchmarks 2 of 2

SCREEN: Benchmark SC.5.N.2.1 Recognize and explain that science is grounded in empirical

observations that are testable; explanations must always be linked with evidence. NARRATION: DEVELOPER NOTES:

Frame: General Background and Cognitive Development 1 of 12

SCREEN: NARRATION: DEVELOPER NOTES:

Section 02: Presentation of Content and Instructional Strategy [TELL]
Frame: Explicate Benchmarks 1 of 2


SCREEN: SC.3.N.1.2 Compare the observations made by different groups using the same tools

and seek reasons to explain the differences across groups NARRATION: DEVELOPER NOTES:

Frame: Benchmarks 2 of 2

SCREEN: • SC.5.N.2.1 Recognize and explain that science is grounded in empirical observations that are

testable; explanations must always be linked with evidence

NARRATION: DEVELOPER NOTES:

Frame: Observations (TELL) 1 of 4

SCREEN: Picture slide show
NARRATION:
DEVELOPER NOTES: 3 or 4 pictures of people making observations. Dissolve pictures from one to another.

Frame: Observations (SHOW)

SCREEN: Still shot from the video. Video clip to embed here.
NARRATION:
DEVELOPER NOTES: Need a "done" button that advances to the next slide.

Frame: Observations (ASK) 1 of 3

SCREEN: Image for Debbie and Mark. Need list. Use an image with pen writing on it.
NARRATION: Now, compare your list with Debbie and Mark's lists. How does your list compare? Did they have things listed that you didn't have listed?
DEVELOPER NOTES:

Section 04: Assess Content and Instructional Strategy [ASK]
3.1 General questions about Observations and Inferences (MC)
3.2 Have learners find appropriate activities in [the online portal] for benchmarks related to Observations & Inferences

Frame:

SCREEN: Reflection Activity
1. Within [the online portal] review Big Ideas 1 & 2 and review benchmarks.
2. Reflect on how you would present the information you learned in your classroom.
3. Using the text tool, think about and answer the following questions:
• How would you implement these ideas into your classroom?
• What challenges do you anticipate encountering?
• How will you handle each of those challenges when they arise?
• Are there activities you currently use in your classroom that cover the benchmarks?
• How will you incorporate the Explicit/Reflective approach in your teaching?

NARRATION: DEVELOPER NOTES:

Section 05: Next Steps
Frame: Next Steps

SCREEN:
1. Take the Post-test
2. Practice creating an instructional activity for the specified content [DO]
Instructions: Plan an instructional activity using the Explicit/Reflective Approach that includes the benchmarks related to Observations and Inferences.
NARRATION: DEVELOPER NOTES:


APPENDIX K

HUMAN SUBJECTS APPROVAL MEMORANDUM

Office of the Vice President For Research
Human Subjects Committee
Tallahassee, Florida 32306-2742
(850) 644-8673 · FAX (850) 644-4392

APPROVAL MEMORANDUM (for change in research protocol)

Date: 3/2/2012
To: Anne Mendenhall
Dept.: EDUCATIONAL PSYCHOLOGY AND LEARNING SYSTEMS
From: Thomas L. Jacobson, Chair
Re: Use of Human Subjects in Research (Approval for Change in Protocol)
Project entitled: EXAMINING THE USE OF FIRST PRINCIPLES OF INSTRUCTION BY INSTRUCTIONAL DESIGNERS IN A SHORT-TERM, HIGH VOLUME, RAPID PRODUCTION OF ONLINE K-12 TEACHER PROFESSIONAL DEVELOPMENT MODULES

The form that you submitted to this office in regard to the requested change/amendment to your research protocol for the above-referenced project has been reviewed and approved.

If the project has not been completed by 1/7/2013, you must request a renewal of approval for continuation of the project. As a courtesy, a renewal notice will be sent to you prior to your expiration date; however, it is your responsibility as the Principal Investigator to timely request renewal of your approval from the Committee.

By copy of this memorandum, the chairman of your department and/or your major professor is reminded that he/she is responsible for being informed concerning research projects involving human subjects in the department, and should review protocols as often as needed to insure that the project is being conducted in compliance with our institution and with DHHS regulations.

This institution has an Assurance on file with the Office for Human Research Protection. The Assurance Number is FWA00000168/IRB number IRB00000446.

Cc: Tristan Johnson, Advisor
HSC No. 2012.7963


APPENDIX L

PRINCIPAL INVESTIGATOR APPROVAL MEMORANDUM

Rabieh Razzouk Wed, Nov 9, 2011 at 5:21 PM

To: "Mendenhall, Anne"

Cc: Tristan Johnson

Hi Anne,

Sorry this took longer than expected. I just heard back from the DoE. The PI (Laura Lang) and the DoE are ok with the request to use the module. Please let me know if you need anything else. I said that you will probably be willing to share your findings but please let me know if that will be a problem.

Good luck with your study and let me know if I can help.

Take care, Rabieh

R a b i e h R a z z o u k

Associate Director for Development & Administration Learning Systems Institute, Florida State University 4600 University Center C, Tallahassee, FL 32306-2540 - Web: http://www.lsi.fsu.edu

From: Mendenhall, Anne
Sent: Wednesday, November 02, 2011 11:27 AM
To: Rabieh Razzouk
Cc: Tristan Johnson
Subject: Request to use Modules

Hello Rabieh,

Thank you for helping me to seek approval from DOE. I will be glad to have a conversation with them or send them more information if there is some concern. Below is some basic information about the study I'm hoping to conduct with the modules.

The purpose of this study is to explore the:

1. Use of a task-centered instructional design model during a short-term, high volume


instructional product development project.

2. Design and development processes taken by expert instructional designers, novice instructional designers, and designers by assignment; and

3. Learner outcomes (learning, satisfaction, relevance, and usefulness) of the finished products.

The study will involve the use of the 49 science and math professional modules we created and the previous XXXX modules (to show how the products evolved from one stage to the "end" online stage). The request is for the use of the modules for critique and evaluation of the model we employed to create the modules. Additionally, the study involves allowing pre-service and/or in-service teachers using the modules for evaluations of the usefulness, relevance, effectiveness, and learnability of the modules. Some screen captures and materials will be used in the dissertation and possibly in publications and presentations about this study. Thank you, Anne M.

Anne Mendenhall PhD Candidate, Instructional Systems College of Education Florida State University


APPENDIX M

PERMISSION TO USE FIGURES

On Tue, Sep 4, 2012 at 7:42 PM, m david Merrill <> wrote:

Permission granted from me but you may need to contact BYUH. Chad Compton was who signed my permission as

well as Greg.

Dave

On Sep 4, 2012 4:05 PM, "Anne Mendenhall" <> wrote:

I also recreated the First Principles diagram and cited that as well... but just in case, may I have permission to use

that too?

Thanks

Anne

On Tue, Sep 4, 2012 at 5:55 PM, Anne Mendenhall <> wrote:

Hi Dr. Merrill,

How are you doing? I hope all is well. I'm writing to request permission to use the pebble-in-the-pond image in my

dissertation. I recreated it and cited it but manuscript clearance folks think I need to get permission to use it. So, may

I please use the pebble-in-the-pond illustration in my dissertation?

Thanks

Anne

On Tue, Sep 4, 2012 at 10:52 PM, chad compton <> wrote:
Yes, you may use a screen capture of the entrepreneurship course that you worked on while an employee of CITO.

Good luck on your dissertation.

D Chad Compton

Associate Academic Vice President

Brigham Young University-Hawaii

On Tue, Sep 4, 2012 at 1:54 PM, Anne Mendenhall <> wrote:

Dear Dr. Compton,

Dr. Dave Merrill suggested I contact you to request permission to use a screen capture of the Entrepreneurship

Course we developed while working at CITO. I was an employee of CITO from 2003-2008 and while there I managed and participated in the development of the online Entrepreneurship Course. May I have your permission to

use a screen capture (see attached) as part of my dissertation?

Thank You,

Anne Mendenhall


On Wed, Sep 5, 2012 at 10:39 AM, Walter Dick < > wrote:

Anne,

You have my permission to use the illustration of the Dick and Carey model in your dissertation. Walter Dick

Walter Dick

On Sep 4, 2012, at 5:16 PM, Mendenhall, Anne wrote:

Dear Dr. Dick,

My name is Anne and I am a doctoral student of Dr. Jim Klein at Florida State University. My dissertation is about

ISD model use. May I have your permission to use the illustration of the Dick and Carey Model? I recreated it as it

was illustrated in your book The Systematic Design of Instruction and have cited it

dissertation. I use the image to illustrate different types of ISD process models.

Thank You,

Anne Mendenhall


REFERENCES

Andrews, D. H., & Goodson, L. A. (1980). A comparative analysis of models of instructional design. Journal of Instructional Development 3(4), 2-16.

Bannan-Ritland, B. (2001). Teaching instructional design: An action learning approach. Performance Improvement Quarterly, 14(2), 37-52.

Bishop, M., Schuch, D., Spector, J. M., & Tracey, M. W. (2004). Providing novice instructional designers real-world experiences: The PacifiCorp design and development competition. TechTrends, 20(2), 20-21.

Branch, R. M. (1997). Perceptions of Instructional Design Process Models. In R. E. Griffin, D. G. Beauchamp, J. M. Hunter, & C. B. Schiffman (Eds.), VisionQuest: Journeys toward

Visual Literacy. Selected Readings from the Annual Conference of the International Visual

Literacy Association, (pp. 429-433). Retrieved from: http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED408998

Branch, R. M., & Merrill, M. D. (2012). Characteristics in Instructional Design Models. In R. A. Reiser & J. V. Dempsey (Eds.) Trends and Issues in Instructional Design and Technology (pp. 8-16). Boston, MA: Pearson/Allyn and Bacon.

Carliner, S. (1998). How designers make decisions: A descriptive model of instructional design for informal learning in museums. Performance Improvement Quarterly, 11(2), 72-92.

Christensen, T. K., & Osguthorpe, R. T. (2004). How do instructional-design practitioners make instructional-strategy decisions? Performance Improvement Quarterly, 17(3), 45-65.

Clark, R. C. (2003). Building Expertise: Cognitive Methods for Training and Performance

Improvement, 2nd ed. Washington D.C.: International Society for Performance Improvement.

Collis, B., & Margaryan, A. (2005). Design criteria for work-based learning: Merrill's First Principles of Instruction expanded. British Journal of Educational Technology, 36(5), 725-738.

Cooper, R. G. (1999). From Experience: The Invisible Success Factors in Product Innovation. Journal of Product Innovation Management, 16(2), 115-133.

Creswell, J. W. (2009). Research Design: Qualitative, quantitative, and mixed methods approaches (3rd Ed.). Thousand Oaks, CA: Sage Publications.

Creswell, J. W. (2008). Educational Research: Planning, conducting, and evaluating

quantitative and qualitative research. Upper Saddle River, NJ: Pearson Education.


Deutsch, K. (1952). On communication models in the social sciences. The Public Opinion

Quarterly, 16(3), 356-380.

Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction. Boston: Pearson/Allyn and Bacon.

Dorst, K., & Reymen, I. (2004). Levels of expertise in design education. Proceedings International Engineering and Product Design Education Conference IEPDE 2004, Delft, 2004. Retrieved from: http://doc.utwente.nl/58083/1/levels_of_expertise.pdf

Dreyfus, H. L. (2005). Can there be a better source of meaning than everyday practices? Reinterpreting division I of Being and Time in the light of division II. In R. Polt (Ed.) Heidegger’s Being and Time: Critical Essays, (pp. 141-154), Lanham, MD: Rowman & Littlefield Publishers.

Driscoll, M. P. (2005). Psychology of Learning for Instruction (3rd ed.). Boston, MA: Pearson Education.

Driscoll, M. P. (2012). Psychological foundations of instruction. In R. Reiser & J. Dempsey (Eds.) Trends and Issues in Instructional Design and Technology (3rd ed.), (pp. 35-44). Boston, MA: Pearson Education.

Edmonds, G. S., Branch, R. C., & Mukherjee, P. (1994). A conceptual framework for comparing instructional design models. Educational Technology Research and Development, 42(4), 55-72.

Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we know it when we see it? Educational Technology, 45(6), 36-43.

Ertmer, P. A., York, C. S., & Gedik, N. (2009). Learning from the pros: How experienced designers translate instructional design models into practice. Educational Technology,

49(1), 19-26.

Francom, G. M. (2011). Promoting Learner Self-Direction with Task-Centered Learning

Activities in a General Education Biology Course. (Unpublished doctoral dissertation). University of Georgia, Athens, Georgia.

Frick, T. W., Chadha, R., Watson, C., Wang, Y., & Green, P. (2009). College Student Perceptions of Teaching and Learning Quality. Educational Technology Research and

Development, 57(5), 705-720.

Gagné, R. M., Wager, W. W., Golas, K. C., & Keller, J. M. (2005). Principles of Instructional

Design (5th ed.). Belmont, CA: Wadsworth, Cengage Learning.

Gardner, J. L. (2011a). How Award-winning professors in higher education use Merrill’s First Principles of Instruction. International Journal of Instructional Technology and Distance

Learning, 8(5), 3-16.


Gardner, J. L. (2011b). Testing the Efficacy of Merrill’s First Principles of Instruction in

Improving Student Performance in Introductory Biology Courses. (Utah State University). ProQuest Dissertations and Theses, Retrieved from http://search.proquest.com/docview/862644295?accountid=4840

Gardner, J. L. (2010). Applying Merrill’s First Principles of Instruction: Practical methods based on a review of the literature. Educational Technology, 50(2), 20-25.

Gardner, J. L., & Jeon, T. (2009). Creating task-centered instruction for web-based instruction: Obstacles and solutions. Journal of Educational Technology Systems, 38(1), 21-34.

Gibbons, A. S. (2003). What and how do designers design? TechTrends, 47(5), 22–25.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Piscataway, NJ: Transaction Publishers.

Gordon, J., & Zemke, R. (2000). Attack on ISD. Training Magazine, 37(4), 42-53.

Gropper, G. L. (1983). A behavioral approach to instructional prescription. In C. M. Reigeluth (Ed.) Instructional-design theories and models: An overview of their current status (Vol. 1) (pp. 101-161). Hillsdale, NJ: Lawrence Erlbaum Associates.

Guba, E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology, 29(2), 75-91.

Gustafson, K. L., & Branch, R. M. (2002). Survey of Instructional Development Models (4th ed.). Syracuse, NY: ERIC Clearinghouse on Information & Technology, Syracuse University.

Hersey, P., Blanchard, K. H., & Johnson, D. E. (2001). Management of organizational behavior: Leading human resources (8th ed.). Upper Saddle River, NJ: Prentice-Hall.

Hung, W., Smith, T., Harris, M., & Lockard, J. (2010). Development research of a teachers’ educational performance support system: the practices of design, development, and evaluation. Educational Technology Research & Development, 58(1), 61-80. doi:10.1007/s11423-007-9080-3

Johnson, T. E., Mendenhall, A., et al. (2011). Next Generation Sunshine State Standards (NGSSS) Professional Development Modules. Descriptions retrieved from: http://floridastandards.org/ProfessionalDevelopment/ProfessionalDevProgSearch.aspx

Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65-94.

Jones, T. S., & Richey, R. C. (2000). Rapid prototyping in action: A developmental study. Educational Technology Research and Development, 48(2), 63-80.

Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model approach. New York: Springer.

Keller, J. M., & Deimann, M. (2012). Motivation, Volition, and Performance. In R. Reiser & J. Dempsey (Eds.) Trends and Issues in Instructional Design and Technology (3rd ed.), (pp. 84-95). Boston, MA: Pearson Education.

Kim, C., Mendenhall, A., & Johnson, T. E. (2010). A design framework for an online English writing course. In J. M. Spector, D. Ifenthaler, & Kinshuk (Eds.), Learning and Instruction in the Digital Age (pp. 345-360). New York, NY: Springer Science + Business Media.

Kirschner, P., Carr, C., van Merriënboer, J., & Sloep, P. (2002). How expert designers design. Performance Improvement Quarterly, 15(4), 86-104.

Lazarowitz, R., & Lieb, C. (2006). Formative Assessment Pre-test to Identify College Students’ Prior Knowledge, Misconceptions, and Learning Difficulties in Biology. International Journal of Science and Mathematics Education, 4, 741-762.

Le Maistre, C. (1998). What is an expert instructional designer? Evidence of expert performance during formative evaluation. Educational Technology Research and Development, 46(3), 21–36.

Mendenhall, A., Buhanan, C., Suhaka, M., Mills, G., Gibson, G., & Merrill, M. D. (2006a). A Task-Centered Approach to Entrepreneurship. TechTrends: Linking Research & Practice to Improve Learning, 50(4), 84-89. doi:10.1007/s11528-006-0084-3

Mendenhall, A., Buhanan, C., Suhaka, M., Mills, G., Gibson, G., & Merrill, M. D. (2006b). Entrepreneurship. Retrieved from: http://mdavidmerrill.com/Workshops/EntrepreneurCourse/main.swf

Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass Publishers.

Merriam, S. B. (1988). Case study research in education: A qualitative approach. San Francisco: Jossey-Bass.

Merrill, M. D. (in press). First Principles of Instruction. San Francisco, CA: Pfeiffer.

Merrill, M. D. (2009a). First Principles of Instruction. Educational Technology, 46(4), 5-10.

Merrill, M. D. (2009b). Finding e3 (effective, efficient and engaging) Instruction. Educational Technology, 49(3), 15-26.

Merrill, M. D. (2009c). M. David Merrill Interview. Presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009. Retrieved from http://www.editlib.org/p/32137.

Merrill, M. D. (2009d). First Principles of Instruction. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional-design theories and models: Building a common knowledge base (Vol. 3) (pp. 41-56). New York, NY: Routledge.

Merrill, M. D. (2007a). First Principles of Instruction: A synthesis. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and Issues in Instructional Design and Technology (2nd ed.) (pp. 62-71). Upper Saddle River, NJ: Merrill/Prentice Hall.

Merrill, M. D. (2007b). A Task-Centered Instructional Strategy. Journal of Research on Technology in Education, 40(1), 33-50.

Merrill, M. D. (2002a). First Principles of Instruction. Educational Technology Research and Development, 50(3), 43-59.

Merrill, M. D. (2002b). A pebble-in-the-pond model for instructional design. Performance Improvement, 41(7), 39-44.

Merrill, M. D., Barclay, M., & Van Schaak, A. (2008). Prescriptive principles for instructional design. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed.) (pp. 173-184). New York, NY: Lawrence Erlbaum Associates.

Merrill, M. D., & Wilson, B. (2007). The Future of Instructional Design (Point/Counterpoint). In R. A. Reiser & J. V. Dempsey (Eds.), Trends and Issues in Instructional Design and Technology (2nd ed.) (pp. 335-351). Upper Saddle River, NJ: Merrill/Prentice Hall.

Oaks, D. H., & Oaks, K. M. (2009, April). Learning and Latter-day Saints. Liahona, 26-31.

Oliver, K., & Hannafin, M. (2001). Developing and refining mental models in open-ended learning environments: A case study. Educational Technology Research and Development, 49(4), 5-32.

Patton, M. Q. (2002). Qualitative Research and Evaluation Methods (3rd ed.). Thousand Oaks, CA: Sage Publications.

Perez, R. S., & Emery, C. D. (1995). Designer thinking: How novices and experts think about instructional design. Performance Improvement Quarterly, 8(3), 80-95.

Rauchfuss, G. H. (2010). How principled are designers? A study of instructional designers’ use of first principles (Capella University). ProQuest Dissertations and Theses. Retrieved from http://search.proquest.com/docview/741708813?accountid=4840

Reeves, T. C. (2002). Enhancing the worth of instructional technology research through “design experiments” and development research strategies. Paper presented at the 2000 AERA Annual Meeting. Retrieved from: http://www.teknologipendidikan.net/wp-content/uploads/2009/07/Enhancing-the-Worth-of-Instructional-Technology-Research-through3.pdf

Reigeluth, C. M., & Carr-Chellman, A. A. (2009). Situational Principles of Instruction. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional Design Theories and Models: Building a Common Knowledge Base (Vol. III) (pp. 57-61). New York, NY: Taylor and Francis.

Reigeluth, C. M. (1983). Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Lawrence Erlbaum Associates.

Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. Journal of the Learning Sciences, 13(3), 273-304.

Reiser, R. A. (2007). A history of instructional design and technology. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and Issues in Instructional Design and Technology (2nd ed.) (pp. 17-34). Upper Saddle River, NJ: Pearson Education.

Richey, R. C. (1995). Trends in Instructional Design: Emerging Theory-Based Models. Performance Improvement Quarterly, 8(3), 96-110.

Richey, R. C. (2005). Validating Instructional Design and Development Models. In J. M. Spector, C. Ohrazda, A. Van Schaack, & D. Wiley (Eds.), Innovation in Instructional Technology: Essays in Honor of M. David Merrill (pp. 171-185). Mahwah, NJ: Lawrence Erlbaum Associates.

Richey, R. C., & Klein, J. D. (2008). Research on design and development. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of Research on Educational Communications and Technology (3rd ed.) (pp. 748-760). New York, NY: Routledge/Taylor & Francis Group.

Richey, R. C., & Klein, J. D. (2007). Design and Development Research. Mahwah, NJ: Routledge/Lawrence Erlbaum Associates.

Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The Instructional Design Knowledge Base: Theory, research, and practice. New York, NY: Routledge.

Rosenberg-Kima, R. (2012). Effects of Task-Centered vs. Topic-Centered Instructional Strategy Approaches on Problem Solving – Learning to Program in Flash (Unpublished doctoral dissertation). Florida State University, Tallahassee, Florida.

Rothwell, W. J., & Kazanas, H. C. (2008). Mastering the Instructional Design Process: A Systematic Approach. San Francisco, CA: Pfeiffer.

Rowland, G. (1993). Designing and instructional design. Educational Technology Research and Development, 41(1), 79-91.

Rowland, G. (1992). What do instructional designers actually do? An initial investigation of expert practice. Performance Improvement Quarterly, 5(2), 65-86.

Ryder, M. (n.d.). Instructional design models and theories. Retrieved from Instructional Design Central website: http://www.instructionaldesigncentral.com/htm/IDC_instructionaldesignmodels.htm

Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(5), 31-38.

Seale, C. (1999). Quality in qualitative research. Qualitative Inquiry, 5(4), 465-478.

Seel, N. M., & Dijkstra, S. (1997). A historical snapshot on the growth of instructional design. In R. D. Tennyson (Ed.), Instructional design: International perspectives, theory, research, and models (Vol. 1) (pp. 1-13). Mahwah, NJ: Lawrence Erlbaum Associates.

Severin, W. J., & Tankard, J. W. Jr. (2001) Models in mass communication research. In Communication theories: Origins, methods and uses in the mass media (5th ed.) (pp. 47-70). Boston, MA: Allyn and Bacon.

Shanteau, J. (1992). A psychology of experts: An alternative view. In G. Wright & F. Bolger (Eds.), Expertise and Decision Support, (pp. 11-23). New York, NY: Plenum Press. Retrieved from: http://calendar.ksu.edu/psych/cws/pdf/wb_chapter92.PDF

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63-75.

Sink, D. L. (2008). Instructional design models and learning theories. In E. Biech (Ed.), ASTD Handbook for Workplace Learning Professionals (pp. 195-212). Baltimore, MD: American Society for Training & Development.

Snir, J., & Smith, C. (1995). Constructing understanding in the science classroom: Integrating laboratory experiments, student and computer models, and class discussion in learning scientific concepts. In D. N. Perkins, J. L. Schwartz, M. M. West, & M. S. Wiske (Eds.), Software goes to school: Teaching for understanding with new technologies (pp. 233-254). New York: Oxford University Press.

Spector, J. M., Ohrazda, C., Van Schaack, A., & Wiley, D. A. (2005). Epilogue: Questioning Merrill. In J. M. Spector, C. Ohrazda, A. Van Schaack, & D. Wiley (Eds.), Innovation in Instructional Technology: Essays in Honor of M. David Merrill (pp. 303-323). Mahwah, NJ: Lawrence Erlbaum Associates.

Straits, W. J., & Wilke, R. (2006). Interactive Demonstrations: Examples From Biology Lectures. Journal of College Science Teaching, 35(4), 58-59.

Sundstrom, E., De Meuse, K. P., & Futrell, D. (1990). Work teams: Applications and effectiveness. American Psychologist, 45(2), 120-133.

Tennyson, R. D., & Rasch, M. (1988). Linking cognitive learning theory to instructional prescriptions. Instructional Science, 17(4), 369-390.

Thomson. (2002). Thomson Job Impact Study: The next generation of learning. NETG. Retrieved from: http://www.mdavidmerrill.com/Papers/ThompsonJobImpact.pdf

Todorova, N., & Mills, A. (2011). Using Online Learning Systems to Improve Student Performance: Leveraging prior knowledge. International Journal of Information and Communication Technology Education, 7(2), 21-34.

Tracey, M. W. (2009). Design and development research: A model validation case. Educational Technology Research and Development, 57(4), 553-571.

Tufford, L., & Newman, P. (2010). Bracketing in qualitative research. Qualitative Social Work, 11(1), 80-96.

van den Akker, J. J. H. (1999). Principles and methods of development research. In J. van den Akker, R. M. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design Approaches and Tools in Education and Training (pp. 1-14). Dordrecht, The Netherlands: Kluwer Academic Publishers.

van den Akker, J. J. H., Boersma, K. Th., & Nies, A. C. M. (1990). Ontwikkelstrategieën in SLO-praktijken [Development strategies in practices within the Dutch National Institute for Curriculum Development]. Enschede, The Netherlands: Dutch National Institute for Curriculum Development.

van den Akker, J. J. H., & Kuiper, W. (2008). Research on models for instructional design. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of Research on Educational Communications and Technology (3rd ed.) (pp. 739-748). New York, NY: Lawrence Erlbaum Associates.

van Merriënboer, J. J. G., Clark, R. E., & de Croock, M. B. M. (2002). Blueprints for Complex Learning: The 4C/ID-Model. Educational Technology Research and Development, 50(2), 39-61.

Villachica, S. W., Marker, A., & Taylor, K. (2010). But what do they really expect? Employer perceptions of the skills of entry-level instructional designers. Performance Improvement Quarterly, 22(4), 33-51.

Visscher-Voerman, J. (1999). Design approaches in training and education: A reconstructive study. Universiteit Twente, The Netherlands.

Visscher-Voerman, I., & Gustafson, K. L. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research & Development, 52(2), 69-89.

Wedman, J., & Tessmer, M. (1993). Instructional Designers’ Decisions and Priorities: A Survey of Design Practice. Performance Improvement Quarterly, 6(2), 43-57.

Wilson, R. D. (2011). External validation of an instructional design model for high fidelity simulation: Model application in a hospital setting (Arizona State University). ProQuest Dissertations and Theses. Retrieved from http://search.proquest.com/docview/864742502?accountid=4840

Winn, W. (1990). Some implications of cognitive theory for instructional design. Instructional Science, 19(1), 53-69.

Zemke, R., & Rossett, A. (2002). A Hard Look at ISD. Training Magazine, 39(2), 26-35.

BIOGRAPHICAL SKETCH

ANNE MENDENHALL

PROFESSIONAL PREPARATION

Ph.D. The Florida State University - Instructional Systems, completed August 2012
Examining the Use of First Principles of Instruction by Instructional Designers in a Short-term, High Volume, Rapid Production of Online K-12 Teacher Professional Development Modules

MS Utah State University - Instructional Technology, 2003
Emphasis: International Curriculum Development

BS Utah State University - Communication, 1997
Emphasis: Journalism

Certificates Florida State University, 2009
Human Performance Technology, Program for Instructional Excellence, Online Instructional Development

PROFESSIONAL EXPERIENCE

May 2012 - Current Payson Center for International Development, Tulane University

Distance Learning and Instructional Design Consultant. Provide faculty development, curriculum development, and workshops and training to public health and medical faculty in Rwanda.

2008 – Current Learning Systems Institute, The Florida State University

Research Assistant/Instructional Designer, 2011 – Current. Habitat Tracker: Learning About Scientific Inquiry Through Digital Journaling in Wildlife Centers. Development of an iPad application and curriculum for 4th & 5th grade students and their teachers.

Lead Instructional Designer, 2011. Florida PROMISE Professional Development Modules for K-12 Teachers.

Assistant Faculty in Research, 2009-2010 Distance Learning/Instructional Design Consultant at the Universitas Terbuka, the Open University of Indonesia (worked in Indonesia for 7 months)

Project Manager/Research Assistant, 2009 FIPSE Funded Project - The Social Annotation Model: A New Way to Improve Academic Performance and Critical Thinking Skills for College Freshmen

Instructional Designer, 2008 - 2009 Johns Hopkins PACER Higher Education Effort: Disaster Awareness and Preparedness Course

Project Manager/Instructional Designer, 2008 HKW Technologies Funded Project – Instructional Transaction Shell and Academic Writing Course

2003-2008 Center for Instructional Technology and Outreach, Brigham Young University-Hawaii

Manager, Development of Online Curriculum, 2008 Created a faculty development proposal for instructors to develop curriculum online. Managed a team of undergraduate students and instructional designers. Worked with core faculty to implement a problem-centered instructional design model.

Director, Instructional Design and Development, 2004-2008 Managed a team of instructional designers and students as they worked to develop multimedia for faculty. Developed online courses. Worked with the Dean to plan an online program to prepare international students for academic success. Assisted faculty in the development of online and face-to-face courses and provided workshops to faculty and staff.

Instructional Designer, 2003-2004 Developed materials for microenterprise course. Assisted in the design and development of various courses.

2002-2003 Instructional Designer

Faculty Assistance Center for Teaching, Utah State University

1999-2003 Instructional Design and Technology Consultant

HOPE, Inc. (Home and Family Oriented Program Essentials), Logan, UT

1997-2002 Multimedia Specialist

KSAR Distance Learning and Video Productions, Center for Persons with Disabilities, Utah State University

TEACHING EXPERIENCE

2008-2010 Performance Systems Analysis/Human Performance Technology Analysis, Co-Instructor (Online, Graduate Course), Instructional Systems Program, Educational Psychology and Learning Systems Department, The Florida State University

2006 - 2007 Principles of Instructional Design, Adjunct Instructor (Hybrid, Undergraduate), Brigham Young University-Hawaii

2000 – 2003 Introduction to Digital Video/Audio Production, Teaching Assistant (Face-to-face, graduate), Instructional Technology Program, Utah State University

WORKSHOPS & TRAINING

Mendenhall, A. (2012). Engaging Learners in a Collaborative Blended-Learning Environment and in Large Classrooms. New Literacies for the Unified Health Sciences Faculty Development Training and Certificate Program (September 24-28, 2012). Kigali Health Institute. Kigali, Rwanda.

Mendenhall, A. (2012). First Principles of Instruction. Workshop to be given to graduate students for EDE 6925 Advanced Instructional Design and Development. Instructional Systems Program, Florida State University.

Mendenhall, A. (2012). Paradigm Shift in Teaching and Learning, From Traditional to Transformative: Collaboration, Teamwork, and Technology. New Literacies for the Unified Health Sciences Faculty Development Training and Certificate Program (June 4-8, 2012). Kigali Health Institute. Kigali, Rwanda.

Mendenhall, A. (2010). PowerPoint Essentials. Teacher In-Service Training. Universitas Terbuka Primary School.

Mendenhall, A. (2010). Introduction to Academic Writing and Publishing. College of Education and College of Business, Universitas Terbuka. Jakarta, Indonesia.

Mendenhall, A. (2010). Using Audio and Video Tools in Online Distance Learning: Voice Messaging and Web-Conferencing as a Means to Engage and Assess Learners. Universitas Terbuka. Jakarta, Indonesia.

Luschei, T., Spector, M., & Mendenhall, A. (2009 – 2010). Academic Writing for International Journals. Universitas Terbuka. Jakarta, Indonesia.

Mendenhall, A. (2007). Objectives, Assessments, & Outcomes. Faculty Training, School of Computing. Brigham Young University Hawai’i. La’ie, Hawai’i.

Mendenhall, A. (2004). Creating Animations Using Flash. Guest Instructor, IDD 302 Educational Technology. Brigham Young University Hawai’i. La’ie, Hawai’i.

AWARDS & HONORS

Gagné & Briggs Outstanding Doctoral Student Award (2011-2012), Educational Psychology & Learning Systems, Instructional Systems program, College of Education, Florida State University.

Finalist – Ruby Diamond Future Professor Award (2011-2012), Educational Psychology & Learning Systems, College of Education, Instructional Systems program, Florida State University.

Cochran Internship Award (2011) - Educational Communication Technology (ETC) Foundation, the International Conference of Association for Educational Communications and Technology (AECT). Jacksonville, FL.

AECT Graduate Student Mentor Program – (2010). Selected as a graduate student mentee to work with various faculty at the AECT Conference in Anaheim, CA (2010).

Gagné & Briggs Outstanding Student - Service Award (2008-2009), Educational Psychology & Learning Systems, College of Education, Florida State University.

Finalist, National Telly Award (2002); Director and Editor of Deaf Mentor Training: All About Hearing; Client: Susan Watkins, Ph.D., SKI*HI Institute.

Winner, National Telly Award (2001); Director and Editor of Honoring Ute Ways; Client: Jim Barta, Ph.D., Utah State University.

Winner, National Telly Award (2001); Executive Producer and Editor of John Stewart: A Man to Match His Mountain; Client: Logan City School District.

Certificate of Recognition from Logan City School District (2001); Executive Producer and Editor of John Stewart: A Man to Match His Mountain; Client: Logan City School District.

Finalist, Aegis Award (2001); Technical Director and Editor of Taking Turns not Telling My Friend What to Do; Client: Susan Watkins, Ph.D., HOPE, Inc.

Finalist, National Telly Award (2000); Director and Editor of Position Analysis Questionnaire; Client: Connie Mecham, Ph.D., PAQ Services.

Finalist, National Telly Award (2000); Graphic Designer and Animator of The Transition Process; Client: SKI*HI Institute.

PUBLICATIONS

REFEREED JOURNAL ARTICLES

Razon, S., Mendenhall, A., Yesiltas, G. G., Johnson, T. E., & Tenenbaum, G. (2012). Evaluation of a Computer-Supported Learning Tool: Effects on quiz performance, content-conceptualization, and motivation. Journal of Multidisciplinary Research, 4(1), 61-68.

Johnson, T. E., Pirnay-Dummer, P. N., Ifenthaler, D., Mendenhall, A., Karaman, S., Tenenbaum, G. (2011). Text Summaries or Concept Maps: Which better represent reading conceptualization? Technology, Instruction, Cognition & Learning, 8(3-4), 297-312.

Mendenhall, A., Johnson, T. E. (2010). Fostering the development of critical thinking skills, and reading comprehension of undergraduates using a Web 2.0 tool coupled with a learning system. Interactive Learning Environments, 18(3), 263-276.

Francom, G., Bybee, D., Wolfersberger, M., Mendenhall, A., & Merrill, M. D. (2009). A Task-Centered Approach to Freshman-Level General Biology. Bioscene, 35(1), 66-73.

Mendenhall, A., Buhanan, C. W., Suhaka, M., Mills, G., Gibson, G. V., & Merrill, M. D. (2006). A Task-Centered Approach to Entrepreneurship. TechTrends, 50(4), 84-89.

BOOK CHAPTERS

Mendenhall, A., Kim, C., & Johnson, T. E. (2011). Implementation of an online social annotation tool in a college English course. In D. Ifenthaler, Kinshuk, P. Isaías, D. G. Sampson, & J. M. Spector (Eds.), Multiple perspectives on problem solving and learning in the digital age. New York, NY: Springer.

Kim, C., Mendenhall, A., & Johnson, T. E. (2010). A design framework for an online English writing course. In J. M. Spector, D. Ifenthaler, P. Isaías, Kinshuk, & D. G. Sampson (Eds.), Learning and instruction in the digital age: Making a difference through cognitive approaches, technology-facilitated collaboration and assessment, and personalized communications (pp. 345-360). New York, NY: Springer.

Johnson, T. E., Sikorski, E. G., Mendenhall, A., Khalil, M., & Lee, M. (2010). Selection of Team Interventions Based on the Level of Mental Model Sharedness as Determined by the Team Assessment and Diagnostic Instrument (TADI). In D. Ifenthaler, P. Pirnay-Dummer, & N. Seel (Eds.), Computer-Based Diagnostics and Systematic Analysis of Knowledge. New York, NY: Springer.

REFEREED CONFERENCE PROCEEDINGS

Kim, C., Mendenhall, A., & Johnson, T. E. (2009). Implementation of an Online Social Annotation Tool in a College English Course. In Kinshuk, D. G. Sampson, J. M. Spector, P. Isaías, & D. Ifenthaler (Eds.), Proceedings of CELDA 2009, IADIS International Conference on Cognition and Exploratory Learning in the Digital Age, November 20-22. Rome, Italy.

Kim, C., Mendenhall, A., & Johnson, T. E. (2008, October). The application of a task-centered approach to an online English writing course. Proceedings of the IADIS International Conference of Cognition and Exploratory Learning in Digital Age (CELDA), Freiburg, Germany.

PRESENTATIONS

INVITED SPEAKER & PANELIST

Mendenhall, A. (November, 2011). Using Mobile Devices at a Wildlife Center to Promote Scientific Inquiry. Invited panelist for the International Council for Educational Media’s (ICEM) panel titled “Discussion in Emerging Technology: Mobile Learning,” Association for Educational Communication and Technology (AECT). Jacksonville, FL.

Mendenhall, A. (April, 2010). Active-Learning Strategies Using Web 2.0 and Free Online Resources in Teaching and Learning. Invited speaker for the ICT in Teaching and Learning Seminar. Bandar Lampung, Sumatra, Indonesia.

Mendenhall, A. (March, 2010). A Problem-based, Peer Interactive Instructional Strategy in a Blended Learning Environment. Invited speaker for the International Seminar on Instructional Strategies in Higher Education. DIES Natalis UNS XXIV, Universitas Sebelas Maret. Solo, Indonesia.

REFEREED PAPERS AT CONFERENCES

Mendenhall, A., Johnson, T. E., & Klein, J. D. (October, 2012). Examining the Use of the First Principles of Instruction in a Short-term, High Volume, Rapid Production Environment. Paper to be presented at the Association for Educational Communication and Technology (AECT) International Convention. Louisville, KY.

Mendenhall, A., Myers, J., Chen, X., Sadaf, A., & Ari, F. (October, 2012). Tracking AECT Convention Internship Alumni for Program Improvement and to Build a Community of Practice and Support. Paper to be presented at the Association for Educational Communication and Technology (AECT) International Convention. Louisville, KY.

Mendenhall, A., Padmo, D., & Johnson, T. E. (November, 2011). How Shared Mental Models and Team Processes Influence Team Performance in Faculty Teams. Paper presented at the Association for Educational Communication and Technology (AECT) International Convention. Jacksonville, FL.

Mendenhall, A. (November, 2011). A Collegiate Flying Trapeze Team: A Phenomenological Study of Teamwork, Mental Models, and Team Effectiveness. Paper presented at the Association for Educational Communication and Technology (AECT) International Convention. Jacksonville, FL.

Johnson, T. E., Ifenthaler, D., Mendenhall, A., Karaman, S., & Pirnay-Dummer, P. (November, 2011). Validation of Student Assessment using Natural Language and Concept Map Representation Models. Paper presented at the Association for Educational Communication and Technology (AECT) International Convention. Jacksonville, FL.

Mendenhall, A., Marty, P., Douglas, I., Alemanne, N., & Clark, A. (October, 2011). Usability Study of Mobile Learning Technology: A Holistic Evaluation of a Field Observation Experience. Paper presented at the Association for the Advancement of Computing in Education (AACE) E-LEARN World Conference. Honolulu, HI.

Mendenhall, A., Marty, P., Clark, A., & Alemanne, N. (October, 2011). Promoting Scientific Inquiry through Student-Centered Activities and Mobile Learning Technology at a Wildlife Center. Paper presented at the Association for the Advancement of Computing in Education (AACE) E-LEARN World Conference. Honolulu, HI.

Clark, A., Marty, P., Mendenhall, A., & Alemanne, N. (October, 2011). Habitat Tracker: Learning About Scientific Inquiry through Digital Journaling in Wildlife Centers. Paper presented at the Association for the Advancement of Computing in Education (AACE) E-LEARN World Conference. Honolulu, HI.

Mendenhall, A., Park, S., Luschei, T., & Spector, J. M. (October, 2010). The Evaluation of an Online Bahasa Indonesia Language Course for Beginners. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Anaheim, CA.

Johnson, T., Karaman, S., Mendenhall, A., Tenenbaum, G., Pirnay-Dummer, P., & Ifenthaler, D. (October, 2010). Validation of natural language representations and concept maps using reference models. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Anaheim, CA.

Nugraha, B., Antoro, S. D., Rahahyu, U., & Mendenhall, A. (July, 2010). A Scenario-based Approach to a Bahasa Indonesia Course in a Blended Computer-Assisted Learning Environment. Association for the Advancement of Computing in Education (AACE) ED-MEDIA World Conference. Toronto, Canada.

Kim, C., Mendenhall, A., & Johnson, T. E. (April, 2010). An Online Social Annotation Tool for English Education. Paper presented at the American Educational Research Association (AERA) Annual Meeting, Denver, CO.

Reiser, R., Meyers, J., Rosario, I., Mendenhall, A., & Driscoll, M. (April, 2010). Preparing Students to be Skilled Researchers in Instructional Design and Technology. Structured poster presentation at the American Educational Research Association (AERA) Annual Meeting, Denver, CO.

Kim, C., Mendenhall, A., & Johnson, T. E. (October, 2009). Implementation of an Online Social Annotation Tool in a College English Course. Paper presented at the IADIS International Conference of Cognition and Exploratory Learning in Digital Age (CELDA), Rome, Italy.

Mendenhall, A., Myers, J., & Johnson, T. E. (October, 2009). Overcoming Learning Challenges through Student Collaboration using Web 2.0 in an Online Disaster Awareness Course. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Louisville, KY.

Myers, J., Mendenhall, A., Johnson, T. E., & Spector, J. M. (October, 2009). Designing an Online Disaster Awareness and Preparedness Course: Trials and Tribulations. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Louisville, KY.

Kim, C., Mendenhall, A., Johnson, T. E., & Euridge, G. (October, 2009). Implementation of an Online Social Annotation System in a College English Course. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Louisville, KY.

Johnson, T. E., Pirnay-Dummer, P. N., Ifenthaler, D., Mendenhall, A., Karaman, S., & Tenenbaum, G. (October, 2009). Determining the Reliability of Text Summaries and Concept Maps Mental Model Elicitation Techniques Using Reference Models of Experts and Book Chapters. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Louisville, KY.

Belawati, T., Luschei, T., Padmo, D., & Mendenhall, A. (October, 2009). Decentralized Basic Education in Indonesia: Challenges and Opportunities. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention. Louisville, KY.

Archibald, T., Johnson, T. E., Myers, J., Mendenhall, A., Smith, S., Bolick, K., & Cross, J. (November, 2008). Social Annotation Modeling-Learning System: Improving Student Learning and Performance. Paper presented at the Association for Educational Communications and Technology (AECT) International Convention, Orlando, FL.

Kim, C., Johnson, T. E., & Mendenhall, A. (November, 2008). An evidence-based framework for the application of Merrill’s first principles of instruction to an online English writing course. Paper presented at the Association for Educational Communication and Technology (AECT) International Convention. Orlando, FL.

Kim, C., Mendenhall, A., & Johnson, T. E. (October, 2008). The application of a task-centered approach to an online English writing course. Paper presented at the IADIS International Conference of Cognition and Exploratory Learning in Digital Age (CELDA), Freiburg, Germany.

SERVICE

Instructional Systems Program – Florida State University

• Instructional Systems Alumni Association Student Representative (2010 – Current)

• Graduate Policy Committee – Student Representative (2010)

• Instructional Systems Student Association President (2008 – 2009)

Conferences

• Reviewer - IEEE International Conference on Advanced Learning Technologies (ICALT) (2009 – Current)

• Reviewer - Association for Educational Communications and Technology (AECT) Annual Conference (2009 – Current)

• Session Facilitator - Association for Educational Communications and Technology (AECT) Annual Conference (2009 – Current)

• Association for Educational Communications and Technology (AECT) Annual Conference - Training & Performance Division Volunteer (2010-2011)

Instructional Technology Program – Utah State University

• Director of Keynote Speakers – Instructional Technology Institute (2003)

• Focus Group Facilitator – Utah State University Libraries Study

Community Service

• Instructor – Prepared lessons and taught good citizenship skills to young children (ages 4-5 and 8-11), encouraging positive interaction with classmates and friends.

• Organized and supervised a team of women in an international women’s organization. Arranged weekly educational and support meetings with teachers, music specialists, and members of the organization.

• Organized activities for young adults that fostered camaraderie, friendship, and support.

• Taught classes for an international women’s organization that encouraged charitable acts, diversity, friendship, and community service.

RESEARCH AND EVALUATION PROJECTS

• Design and Development Research, including IMI/eLearning development and validation of ISD models

• Team Shared Mental Model Research

• Usability Study of Mobile Learning Technology – including reliability of device and application

• Evaluation of Online Courses and Web-portals

• Evaluation and Research on HyLighter Web 2.0 Social Annotation Technology

PROFESSIONAL ASSOCIATIONS

Association for the Advancement of Computing in Education (AACE)

Association for Educational Communication and Technology (AECT)

The Sloan Consortium (Sloan-C)

ASTD – Tallahassee, FL Chapter

TECHNICAL SKILLS

• Mac/PC Operating Systems

• Audio/Video/DVD Production

• SAKAI, Moodle, Blackboard Learning Management Systems

• Web Conferencing Tools

• Web 2.0 Tools

• Microsoft Office Suite

• Adobe PhotoShop, Acrobat Pro, Illustrator

• Final Cut Pro, QuickTime Pro

• Screen Capture software

• HIMATT (Highly Integrated Model Assessment Technology and Tools)

• HyLighter Social Annotation Tool