
This article was downloaded by: [University of Washington Libraries]
On: 03 July 2014, At: 16:45
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954
Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Cognition and Instruction
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/hcgi20

How Novice Science Teachers Appropriate Epistemic Discourses Around Model-Based Inquiry for Use in Classrooms
Mark Windschitl, Jessica Thompson & Melissa Braaten
University of Washington
Published online: 07 Jul 2008.

To cite this article: Mark Windschitl, Jessica Thompson & Melissa Braaten (2008) How Novice Science Teachers Appropriate Epistemic Discourses Around Model-Based Inquiry for Use in Classrooms, Cognition and Instruction, 26:3, 310-378, DOI: 10.1080/07370000802177193

To link to this article: http://dx.doi.org/10.1080/07370000802177193

PLEASE SCROLL DOWN FOR ARTICLE

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


COGNITION AND INSTRUCTION, 26: 310–378, 2008
Copyright © Taylor & Francis Group, LLC
ISSN: 0737-0008 print / 1532-690X online
DOI: 10.1080/07370000802177193

How Novice Science Teachers Appropriate Epistemic Discourses Around Model-Based Inquiry for Use in Classrooms

Mark Windschitl, Jessica Thompson, and Melissa Braaten
University of Washington

To test whether epistemically unproblematic ways of thinking and talking about science could be transformed during preservice teacher training, we designed a system of learning activities based on a set of heuristics for progressive disciplinary discourse (HPDD). The HPDD outline six design principles of learning environments where the aim is to foster learners’ participation in material and discursive activities that characterize the work of scientists. After tracking participants through university coursework where the HPDD was employed and into their teaching practicums, we found that most came to reconceptualize the interrelated roles of models, theory, evidence, and argument. These ideas ultimately supported a shift in their goals for scientific investigation—from “proving” a hypothesis, to testing and revising explanatory models. Preliminary findings from teaching episodes with their own secondary students indicated that some participants took up “epistemically ambitious” classroom practices, pressing learners to develop testable models of natural phenomena and gather evidence to link observations with underlying explanatory processes.

Meaningful learning in school science entails more than accumulating domain knowledge; it requires the appropriation of specialized epistemic discourses that allow students to organize, develop, and evaluate knowledge according to disciplinary standards (Erickson, 1982; Lampert, 1990; Lemke, 1990; Moje, Collazo, Carrillo, & Marx, 2001; O’Connor & Michaels, 1996). By epistemic discourses in science, we refer to the ways language or other symbolic forms are used to explore the testable, revisable, conjectural, explanatory, and generative nature of scientific ideas (Smith, Maclin, Houghton, & Hennessey, 2000). In order to make sense of these characteristics of scientific knowledge and the practices that embody them, students must be able to hear, see, and read about more advanced others using particular terms, representations, and rhetorical forms within prototypical cases of inquiry (Brown, Ash, Rutherford, Nakagawa, Gordon, & Campione, 1993; Tomasello, 1999).

Address correspondence to Mark Windschitl, University of Washington, 115 Miller Hall, Box 353600, Seattle, WA 98195. E-mail: [email protected]

Unfortunately, most of these “advanced others”—middle and high school science teachers—have had no such apprenticeships themselves. They have experienced “doing science” during their undergraduate years through highly scripted laboratory activities (Gess-Newsome & Lederman, 1993; Trumbull & Kerr, 1993) and lectures where instructors rarely discuss in explicit terms how science generates new ideas or uses evidence to evaluate knowledge claims (Bowen & Roth, 1998; Duschl & Grandy, 2005; King, 1994; Reinvention Center at Stonybrook, 2001; Wenk & Smith, 2004). Further complicating the issue of teacher preparation is the fact that the language of investigative science in schools has not kept pace with current scientific practice. Contemporary views of science have shifted from a focus on experimentation to developing and refining explanatory models (Giere, 1991; Nersessian, 2002). This does not mean that scientists no longer engage in experimentation. Rather, the role of experimentation is becoming “situated in theory and model-building, testing, and revising” (Duschl & Grandy, 2005, p. 7). Here too, teachers have limited experiences. Most have learned about models as visual aids to help explain unproblematic ideas to others rather than as scientific tools to predict phenomena, generate hypotheses, or test ideas (Harrison, 2001; Smit & Finegold, 1995; Van Driel & Verloop, 2002). Modeling and inquiry are, in fact, seen by some teachers as distinctly different enterprises (Windschitl & Thompson, 2006).

For beginning educators, this lack of experience with authentic inquiry and the often narrow conceptions of the role of models in scientific work compromise their ability to engage students in the ambitious reform-based pedagogies advocated by the National Science Education Standards (NRC, 1996) and the Benchmarks for Science Literacy (American Association for the Advancement of Science, 1993). Reform teaching, as described by these documents, requires an integrated understanding of how theory, evidence, and explanation are used in inquiry and also requires an understanding of the role of models in representing and evaluating ideas in inquiry contexts (Schwarz & Gwekwerere, 2007).

Within teacher preparation programs, some multifaceted instructional experiences have helped beginning educators develop better understandings of scientific models (Crawford & Cullin, 2004; DeJong & van Driel, 2001; Schwarz & Gwekwerere, 2007) and, in a few cases, helped them appropriate some of the practices and language of model-based inquiry for use in their own classrooms (Windschitl & Thompson, 2006). However, we know little about which experiences influence the thinking of novice educators and how different forms of scaffolding used in multiple learning contexts support the development of more sophisticated epistemic discourses. To address these critical gaps in our understanding, this study examines an extended apprenticeship in the epistemic discourses of science by 18 preservice teachers, beginning in a methods class and ending in secondary schools where participants created various inquiry-based experiences for their own students. This apprenticeship was based on a set of heuristics for progressive disciplinary discourse (HPDD)—a set of principles applied to the design of learning environments where the aim is to foster learners’ participation in the material and discursive activities of science. Our study addresses the following questions:

1. In what ways does an instructional focus on scientific models and model-based investigations influence how beginning teachers think and talk about the epistemological role of models in inquiry?

2. How do elements of the HPDD framework facilitate changes in participants’ discourse and epistemic reasoning around the role of explanatory models in inquiry?

In addition to these primary questions, we sought data to provide preliminary answers to the following:

3. In what ways does an instructional focus on models influence beginning teachers’ use of models and modeling with their own students?

BACKGROUND

Scientific Epistemology

The language of science is grounded in the types of knowledge used to represent ideas and the special forms of reasoning that reference these ideas in the advancement of explanations of natural phenomena (Sandoval, 2005). Types of knowledge include hypotheses, laws, theories, and of course, models. Scientists create models in the forms of analogies, conceptual drawings, diagrams, graphs, physical constructions, or computer simulations in order to describe and understand the organization of systems, ranging from insect predation to stellar evolution. Certain forms of inquiry that utilize such models encompass all five dimensions of the epistemic nature of knowledge (noted in italics in what follows). In these inquiries, models are treated as subsets of larger, more comprehensive systems of explanation (i.e., theories) that provide crucial frames of reference to help generate hypotheses for testing, act as referents in interpreting observations, and are themselves targets of revision (Darden, 1991; Giere, 1988; Kitcher, 1993; Longino, 1990; Nersessian, 2005; Stewart & Rudolph, 2001). Arguments for the support of conjectural models (as opposed to purely descriptive, empirical models) involve observations (for example, a balloon affixed to the top of a flask will begin to inflate when the flask is heated) used to support explanations involving unobservable entities or processes (in this case, that heat causes molecules of air to move more rapidly, producing a pressure inside the flask that is greater than that outside). Although different domains in science have their own fundamental questions, methods, and standards for “what counts” as evidence, they are all engaged in the same core epistemological pursuit—the development of coherent and comprehensive explanations through the testing of models (Hempel, 1966; Knorr-Cetina, 1999; Kuhn, 1970; Latour, 1999; Longino, 1990).

Models and Inquiry in School Science

In school settings, modeling approaches to science can take root only where inquiry is valued and supported. By inquiry we refer to any context within which questioning, investigating, and explaining phenomena are given priority (Lehrer & Schauble, 2004). In referencing inquiry, the National Science Education Standards specify that, for secondary students, investigations “should culminate in formulating an explanation or model” (NRC, 1996, p. 175); however, neither the nature of models nor the role of models as tools to think with is mentioned. Modeling, on the other hand, is a form of inquiry where the characteristic features of these representations and their role in building new knowledge are made explicit through the investigative enterprise. The general aim of modeling is to test an idea—represented as a system of related processes, events, or structures—against observations in the real world and to assess the adequacy of the representation (i.e., model) against certain standards. Successful instructional frameworks for modeling typically guide students through a number of processes that include: engaging with a question or problem (often through material involvement with a natural phenomenon); developing hypotheses about causal or otherwise associative relationships in the phenomenon; making systematic observations to test these hypotheses; creating models of the phenomena that would account for the observations; evaluating these models against standards of usefulness, predictive power, or explanatory adequacy; and finally, revising the model and applying it in new situations (see Hestenes, 1992; Lehrer & Schauble, 2006; Lesh, Hoover, Hole, Kelly, & Post, 2000; Metcalf, Krajcik, & Soloway, 2000). White and Frederiksen (1998) and Schwarz and White (2005), for example, used such activities with secondary students to foster modeling capabilities around law-like relationships underlying force and motion. Stewart, Hafner, Johnson, and Finkel (1992) and Stewart, Passmore, Cartier, Rudolph, and Donovan (2005) asked learners to work with more probabilistic relationships, posing questions about populations of organisms, investigating their questions by using or adapting models to explain patterns of inheritance, and persuading other members of the class that their model best accounted for the observational data.


Our conception of modeling, which we refer to as model-based inquiry (MBI), is similar to those listed earlier in its basic repertoire of activity. It differs, however, in the emphasis placed on learners generating their initial hypotheses from a tentative model that is developed early in the inquiry process. Included in this model are articulations between observable aspects of the target phenomena and conjectured underlying causal processes. In addition, learners are pressed to create final arguments that explain observations in terms of these underlying processes.

With these parameters in mind, we propose the following as an appropriately ambitious form of inquiry for students at the secondary level, one that embodies important epistemic features of scientific knowledge-building (note: this is not a step-wise protocol):

1. Investigations emerge from a motivating interest in some aspect of the natural world, and students are provided the resources and/or experiences to develop an initial but tentative representation (i.e., model) of the phenomenon.

2. This model suggests unseen processes, properties, or structures, which are potentially explanatory of the target phenomenon.

3. The model is used as a sense-making tool to generate testable hypotheses.

4. The data collected to test the model are used to identify patterns or relationships in the observable world.

5. Arguments are constructed that not only attempt to validate the existence of these patterns but ultimately to support or refute claims about explanatory processes or entities hypothesized in the original model.

All but the fourth characteristic above differ in process and epistemology from the unproblematic “scientific method” used in schools today (i.e., observe, develop question, create hypothesis, design experiment, collect and analyze data, draw conclusions, develop new questions). In much of school science, “observations” are directed by the teacher or guided by student interest but are rarely acknowledged as being influenced by pre-existing theory or models. Consequently, the questions arising from such observations are seldom informed by even a modest understanding of the phenomenon. This reinforces the naive assumption that hypotheses are merely “best guesses” about experimental outcomes (Carey, Evans, Honda, Jay, & Unger, 1989; Sandoval & Morrison, 2003; Smith et al., 2000), when in authentic science, a hypothesis is considered a statement of how aspects of a specific model might map onto real-world situations (Cartwright, 1983; Giere, 1991; Morgan & Morrison, 1999; Nersessian, 2002, 2005).

Students using the scientific method, then, are not often asked to make claims about or even understand the implications of explanatory “theoretical components” in a model. These omissions contribute to a lack of understanding about the nature of theory and models, and increase the likelihood of content-free inquiry—that is, going through the motions of the scientific method without understanding in any depth the phenomenon one is studying. The goal of most school inquiry tasks, as Chinn and Malhotra observed, is “only to uncover easily observable regularities (e.g., plants grow faster in the light than in the dark) . . . not to generate theories about underlying mechanisms” (2002, p. 187).

These limited curricular aims are consistent with what Driver, Leach, Millar, and Scott (1996) refer to as relation-based reasoning. The nature of explanation for relation-based reasoning refers only to connections between features of a phenomenon that are observable (e.g., increased physical activity is accompanied by faster heart rates) without using those to argue about underlying causes. In contrast to relation-based reasoning, authentic science uses model-based reasoning. This perspective takes inquiry to be an empirical test of conjectural models or theories. Explanation in this case involves coherent stories that posit unobservable processes and acknowledges discontinuity between observation and such processes.

Teachers’ Understandings of Models and Inquiry

To optimize classroom learning around epistemically rich forms of model-based inquiry, teachers need a sophisticated understanding of the nature of scientific models as well as how they are used in authentic inquiry. This includes the ideas that models can represent a system of ideas with explanatory power for some process or event, that models can be created in different representational modes for different purposes (e.g., a concept map vs. a pictorial drawing), and that a phenomenon can be conceptualized through models in different ways (e.g., a caloric vs. a kinetic model of heat transfer). They should understand that applying a model to real-world circumstances must take into account the logical limits of the model as well as any underlying assumptions used to build the model. Regarding the function of models, teachers should understand how they can be used to facilitate novel insights into a natural or mathematical system, and how they are used to predict or explain events. The teacher should further understand how one generates meaningful research questions and hypotheses from a model, and how scientific argument is employed to make revisions in models.

The ideals described earlier are not likely to be part of an educator’s conceptual framework. Most teachers, for example, believe that models are useful only as visual aids to help explain complex or abstract ideas to others, or to demonstrate how things work (Cullin & Crawford, 2004; Smit & Finegold, 1995). Teachers rarely mention how models are used in making predictions or used as tools for obtaining information about targets that are inaccessible to direct observation (Harrison, 2001; Justi & Gilbert, 2002; Van Driel & Verloop, 2002). In general there is an awareness of the value of models in teaching science concepts but not of their value in learning about science. Preservice teachers often subscribe to a “folk theory” of scientific inquiry in which models play no discernible role (Windschitl, 2004). Facets of this folk theory include the belief that empirically testing and validating relationships are epistemological “ends-in-themselves” (i.e., these patterns are not used to suggest underlying mechanisms) and that models and theories are optional tools used only at the end of a study to help explain results.

Attempts to reshape teachers’ understandings of models have met with limited success. Windschitl and Thompson (2006), for example, examined 21 preservice secondary teachers as they participated in a series of activities aimed explicitly at developing their understanding of models and how models are used in inquiry. The study culminated in independent inquiries by the students, in which they were required to develop a model of a natural phenomenon, empirically test some aspect of that model, and use the results to support or revise the original model. Researchers found that even though participants were able to talk about the idea of models in sophisticated ways if given contextualized examples, they were almost wholly unfamiliar with generating theoretical models to ground empirical investigations or employing model-based reasoning to make sense of their findings. At the conclusion of their projects, only 2 of 21 participants discussed their findings in terms of conjectural processes and most maintained beliefs that modeling and inquiry were separate enterprises.

DeJong and Van Driel (2001) worked with preservice chemistry teachers to shift their focus from exclusively teaching content to teaching about the nature of models. Their participants discussed articles on modeling, examined model-oriented curricula, and collaboratively developed lessons for teaching about specific models. Despite these activities, most did not come to an understanding of some of the most fundamental functions of models. Crawford and Cullin (2004) had secondary preservice teachers design an open-ended investigation of a plant, soil, and water system, and later build computer models of the relevant environmental phenomena. After the modeling experience, participants shifted their thinking, from models being used by someone to explain an idea to another, to the model being considered by a “user” to understand the phenomena him- or herself. Overall, however, no participant moved from a mid-level understanding to an expert level. In a study with preservice elementary teachers, Schwarz and Gwekwerere (2007) experienced some success using an “engage, investigate, model, and apply” framework to engage them in model-based reasoning and move a majority toward their own model-based lesson designs, although some used models in their lessons in ways inconsistent with model-centered inquiry.

On the whole, instructional interventions with teachers have been only modestly successful, indicating the need for more robust instructional designs that take into account participants’ current conceptions of inquiry, include opportunities to work with models in varied and mutually reinforcing contexts, and routinely connect the principles of model-based inquiry to classroom practice.


Appropriating Specialized Discourses in Science

From the standpoint of discourse, learning about modeling in science is like acquiring a new language (Edwards & Mercer, 1987). For any learner to become competent in specialized fields, they must receive social support in the form of scaffolded opportunities to practice relevant ways of talking and thinking (O’Connor & Michaels, 1996). Gee (2002) describes four interrelated conditions that accommodate such opportunities. First, “bootstrapping” into a new domain happens by actively imagining the perspectives of other more advanced practitioners as they publicly situate meanings within the domain through interaction and dialogue. This requires that the novice witness these practices in settings rich enough to make approximations as to what they might mean (Tomasello, 1999). Authentic settings, however, such as science laboratories, are inherently complex, and only some aspects of the setting are relevant to the situated meanings being constructed. In these cases, learners need overt assistance in recognizing what to pay attention to and what constitutes background noise. Second, beyond directing the learner’s attention, more advanced others must also model cases of talk and action being given situated meanings within the context of practice (Barsalou, 1999a, b; Glenberg, 1997; Glenberg & Robertson, 1999; Tomasello, 1999). Such initial cases should not go beyond the learner’s current intellectual or experiential resources if more complex cases are to be dealt with later on. Third, experience with particular types of complex thinking follows from repeated opportunities to take on various roles and stances within recurring social contexts that support those types of intellectual give-and-take in their proto-forms (O’Connor & Michaels, 1996). Commonly, learners take stances in externalizing their own reasoning, inquiring into the reasoning of others, and comparing perspectives on problems (Brown et al., 1993; Collins, Brown, & Newman, 1989; Lave & Wenger, 1991; Wells, 1993). Finally, learners need feedback when they try out combinations of words, symbols, or images within the context of new practices to test whether their hypotheses about situated meanings “work” (Kress, 2000; Kress, Jewitt, Ogborn, & Tsatsarelis, 2001).

This suggests a continuum of how one takes up new language practices. Initially, a novice merely recognizes new terms being used in a particular context. Farther along in the progression, a learner may be able to recognize why elements of the new language are being used and then use these terms themselves to participate peripherally in conversations. Later, a learner may adopt much of this language to carry on certain meaningful activities in the discipline. Finally, learners may fully appropriate the language when the words and ideas become part of an internally persuasive discourse through their interactions with the learners’ existing ways of talking (Bakhtin, 1981). At this point, learners use it spontaneously to solve new problems and even to adapt the language to make new kinds of meaning in novel situations.


A Framework for the Design of Instructional Activities with Preservice Teachers

Drawing on this language socialization literature and on principles of learning environment design, we organized a secondary science methods course for preservice teachers based on a set of heuristics for progressive disciplinary discourse (HPDD), an adaptation of guidelines developed by Engle and Conant (2002) referred to as “productive disciplinary engagement.” The HPDD outlines six design principles of learning environments where the aim is to foster learners’ participation in material and discursive activities characterizing the work of scientists.

The first principle is for experienced instructors to model prototypical cases of disciplinary activity and discourse early in the time scale of instruction. These cases should be simplified in the sense that they draw on learners’ current hypothesizing resources as they demonstrate new and unfamiliar ways of approaching the work of science (Gee, 2002). This allows learners to situate new meanings of words, symbols, images, or ideas within the context of embodied experience (Tomasello, 1999).

The second principle, problematizing content, is accomplished by encouraging students to pose problems, hypothesize, and challenge ideas, rather than expecting them to ingest concepts or procedures (Hiebert, Carpenter, Fennema, Fuson, Human, & Murray, 1996; Krajcik, Blumenfeld, Marx, & Soloway, 2000; Warren & Rosebery, 1996). Complementing this is the third principle, giving students authority, which means that they are producers of knowledge with ownership over it rather than being consumers of other people’s ideas. Students are given an active role in defining, investigating, and resolving problems (Ball & Bass, 2001; Cobb, Gravemeijer, Yackel, McClain, & Whitenack, 1997; Lampert, 1990; Wenk, 2000).

The fourth principle is giving learners repeated experience in taking on various discursive roles and stances within recurring social contexts that support intellectual give and take—such as supporting or challenging positions with respect to claims and observations made by others (Abd-El-Khalick & Lederman, 2000; Brown et al., 1993; Erickson, 1982; Goldenberg & Gallimore, 1991; Lemke, 1990; Moje, Collazo, Carrillo, & Marx, 2001). Taking on these roles and stances operates within the constraints of the fifth principle, holding students accountable to disciplinary norms. Students are encouraged to express ideas within the bounds of the discipline; that is, recognizing limitations on how questions can be framed, abiding by the canonical methods of investigation, and using appropriate forms of argument (Resnick & Hall, 2001).

The sixth principle is providing relevant resources. Resources include basic necessities such as having enough time to pursue a problem in depth (Collins, Brown, & Newman, 1989; Henningsen & Stein, 1997), having access to key materials and information that explicitly demonstrate how ideas and types of language are used within the discipline (Roth, 1995), and exposure to conceptual tools that may facilitate reasoning or guide complex procedures (Lampert, 1990; Sohmer, 2000).

Instructional Activities

Table 1 shows the chronology of the course activities, the purpose of each, and the associated HPDD features. We describe the activities in what follows and include additional detail in the Findings section.

The initial experience of the methods class involved students in a guided form of model-based inquiry. We intended to demonstrate the characteristic practices of MBI and how these were conceptually and methodologically integrated with one another (HPDD principle 1). The first class took place in a computer lab where participants in groups of three were presented with a live goldfish in a jar and asked "Can a fish drown in perfectly clean water?"—the underlying idea being that increases in water temperature can drive out the dissolved oxygen that fish use for respiration. Participants, in groups, then used concept-mapping software to construct representations of relationships between various factors in a natural habitat that might influence the rate of fish respiration. The purpose of the concept maps was two-fold: first, to serve as an initial model from which we could generate questions for the upcoming investigation; and second, to stimulate conversations about the relationship between the observable and the unobservable in natural systems.
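The underlying idea of the goldfish task can be illustrated with a short computation. This sketch is ours, not part of the course materials: it uses the Benson–Krause empirical fit for dissolved-oxygen saturation in fresh water at sea level, a standard approximation from the limnology literature, to show that warmer water holds markedly less oxygen.

```python
# Illustrative only (not from the study): the inverse relationship between
# water temperature and dissolved oxygen that motivates the question
# "Can a fish drown in perfectly clean water?"
import math

def o2_saturation_mg_per_l(temp_celsius: float) -> float:
    """Approximate dissolved-oxygen saturation of fresh water (mg/L),
    using the Benson-Krause empirical fit (temperature in Kelvin)."""
    t = temp_celsius + 273.15
    ln_do = (-139.34411
             + 1.575701e5 / t
             - 6.642308e7 / t**2
             + 1.243800e10 / t**3
             - 8.621949e11 / t**4)
    return math.exp(ln_do)

# Saturation drops steadily as the water warms:
for c in (5, 15, 25, 35):
    print(f"{c} deg C: {o2_saturation_mg_per_l(c):.2f} mg/L")
```

At roughly 5 °C the fit gives close to 13 mg/L, falling to near 8 mg/L by 25 °C, which is why heating the sealed containers in the class activity reduces the oxygen available for fish respiration.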

The next class period, each participant read one of three different text resources on aquatic ecosystems (one on fish respiration, one on the chemistry of dissolved gases in water, and one on thermal water pollution by factories). They then integrated ideas from these readings using a jigsaw activity and refined their concept-map models accordingly. Following this, students were shown sealed containers of water that had been heated to various temperatures for different lengths of time, then cooled. Students were asked to develop a testable question and a hypothesis based on their concept-map models that would help them learn more about the effects of thermal water pollution on fish respiration, and then execute a brief study. To culminate their study they had to construct a coherent explanation for their findings, state how the data supported this explanation, and describe whether their initial model needed to be changed (HPDD 1, 4, 5).

The following week, students discussed a paper prepared by the instructor on scientific models and the role of models in inquiry (HPDD 6). From a previous study we knew that during undergraduate preparation aspiring teachers are rarely engaged in explicit conversations about specialized forms of scientific knowledge, the purposes of inquiry, or rhetoric in the discipline. To fill these gaps, this text provided an overview of three interrelated ideas. The first section of the paper characterized different forms of knowledge used in science: theories, laws, hypotheses, and models. The second section elaborated on the nature and function of models. Included here was a table showing three levels of how K–12 students think about scientific models—from naive to sophisticated conceptions. In the third section of the paper, model-testing was described as the core pursuit of science with the ultimate goal of providing plausible explanatory accounts of phenomena. To provide a forum for synthesizing the ideas in this paper and applying them to a pedagogical situation, we asked participants in small groups to address the question, "How could you re-design the fish respiration activity for your students, in order to help them develop a more advanced understanding of scientific models?"

TABLE 1
Chronology and Purpose for Instructional Activities

Activity | Purpose | HPDD Features*

Week 1: Creation of conceptual models
• Begin norm-setting of treating science ideas as systems of relationships
• Introduce "what counts" as a conceptual model
• Stimulate conversations about relationship between the observable and the unobservable in natural systems
• Use models to generate questions, hypotheses
HPDD: 1

Week 1: Guided model-based inquiry on fish respiration
• Participants experience "complete" MBI, see interrelatedness of investigative practices
• Introduce norm of refining models as aim of inquiry
• Initial "try out" of language and practice around evidence-based explanation, argumentation
HPDD: 1, 4, 5

Week 2: Exploring text resource on role of models in inquiry, applying these ideas to pedagogical situations
• Resource "fills gaps" from undergraduate work around forms of knowledge and rhetoric in science
• Participants prompted to synthesize ideas in paper with teaching situations imagined from previous fish inquiry
• Initiate conversations about types of activities K–12 students need to understand nature and function of models
HPDD: 6

Week 4: Analyze authentic case studies of theory-grounded investigation
• Create dissatisfaction with idea that science inquiry begins with theory-free observation and culminates with validation of statistical findings
• Foster idea that argumentation includes explanation for phenomena that links data with unobservable hypothesized processes
• Participants consider how to design instruction for students that exemplifies ideas above
HPDD: 2, 4, 5, 6

Weeks 5 & 6: Using Web-based discussion boards during field experiences
• Provide opportunity to take on discursive roles and stances around models and inquiry
• Allows participants to inquire into reasoning of others
• Surfaces which course activities had caused conceptual conflict, which helped refine ideas about models, theory, evidence and argument
HPDD: 4

Week 9: Observe and critique exemplary presentation of MBI project
• Provides "just-in-time" access to MBI discourse related to requirements of their project
• Presentation includes key epistemic features of modeling talk
HPDD: 6

Weeks 2–10: Independent MBI and final presentation
• Participants required to engage in practices/discourses not traditionally associated with The Scientific Method
• Participants apply and coordinate all practices associated with MBI
• Participants see range of how MBI can be applied by observing peers' projects
HPDD: 2, 3, 4

Across the course: Gradually placing more responsibility on participants to think generatively about wider variety of models and to connect use of models with strategies for own classroom instruction (HPDD features 2, 3, 4).

*HPDD features: (1) Model prototypical cases of disciplinary activity/discourse early; (2) Problematize content; (3) Give students authority; (4) Give experience taking on discursive roles and stances in recurring social contexts; (5) Hold students accountable to disciplinary norms; (6) Provide relevant resources.

In the fourth week, we used a group activity to develop the idea that scientific argument includes an explanation for phenomena that links data with hypothesized processes (HPDD 2, 4, 5, 6). A secondary goal for this activity was to reinforce that models often include representations of theoretical mechanisms and that hypotheses are derived from such models. We intended to create a sense of dissatisfaction with the idea that science inquiry begins with theory-free observation and culminates with a validation of statistical findings. To these ends we distributed two-part vignettes on authentic examples of scientific studies that included (a) a description of a particular scientific theory or model (such as plate tectonics) and (b) a description of a study whose hypothesis and investigative design were based on that model and that evaluated the model empirically. We asked participants to "develop tasks for students, such as how they might collaboratively develop an argument linking data from the study with the original theory."

During weeks five and six, participants spent two weeks observing in schools. During this time they communicated with each other via Web-based discussion boards. We asked them to exchange ideas about which course activities had caused conceptual conflict for them, helped them develop new ideas, or refined existing ideas around models, theory, evidence, and argument. The purpose here was to provide a forum for participants to take on different discursive stances with peers about models and inquiry and to inquire into the reasoning of their peers (HPDD 4).

Continuing across the entire fall quarter, the major project for participants was an independent MBI that they would eventually present to their peers in week ten. In the project they were specifically required to incorporate epistemic features of authentic science not typically addressed in school science (HPDD 2, 3, 4). Feedback from instructors played a major role in these projects. Participants were required to develop an informed but tentative scientific model at the outset, based on three content readings of their choice, which included potentially explanatory processes or entities. They were asked to identify which relationships they intended to test (i.e., develop hypotheses from) and, in the end, to generate final arguments in which empirical evidence was used to support or refute claims about explanatory mechanisms. Students were given a "model-testing" guide to help them think about how to develop representations appropriate for testing, how to develop a coherent argument, and what the typical problems were that teacher education students in the past had faced in model-testing investigations (for example, we included a list of model types unsuitable to support an investigation). Near the end of the quarter, participants expressed residual uncertainty about what an MBI presentation "looked and sounded like." We provided a "just-in-time" project presentation by one of the participants who had finished a week early and had incorporated all the required features into her investigation (HPDD 6).

As the course progressed, we gradually placed more responsibility on participants to think generatively about a wider variety of models and to connect the use of models with strategies for their own classroom instruction (HPDD 2, 3, 4). For example, in the early weeks of the course, instructors were responsible for introducing various explanatory models into the classroom discourse and supporting conversations about their salient features. As the quarter progressed, participants began to initiate discussions about a range of model types (conceptual diagrams, drawings, graphs, etc.) across a spectrum of science domains, and they began to refer to pedagogical situations in which students could test these models and/or learn more about the nature and function of models. In winter quarter, the students were entirely responsible for collaboratively developing a unit on gas laws and took turns teaching individual lessons to their peers. Without prompting from the instructor, students decided it would be critical to teach about the gas laws as an inter-related set of models and designed their lessons accordingly.

METHODS

Participants

Our research was a naturalistic inquiry (Denzin & Lincoln, 2003) into how a group of learners responded to a purposefully designed curriculum and to instructional practices that supported this curriculum. We focused specifically on how participants' forms of reasoning and discourse changed over time, and on the conditions that supported these changes. To help us characterize the influence of various instructional experiences on participants' thinking, we routinely sought to make their thinking visible and public through discourse and inscriptions.

The 18 participants were students in a teacher education program at a public university in the northwest United States, all enrolled in a secondary science methods course. All candidates entered the program with at least a bachelor's degree in an area of science. The study took place over 15 months, which included a 6-month methods course (taught by the first author) and 4 months of student teaching in local schools.

Data Sources and Analysis

Eight data sources were used in this study. The first was an extensive pre-course interview which elicited ideas about the nature of science models (for example: "What kinds of things do scientists make models of?"); the function of models (for example: "What would a scientist use a model for?"); the use of models in instruction; the characteristics of authentic investigative science; and their school-related experiences in doing science (Appendix A). All questions were followed by prompts to elaborate, clarify, and provide examples where relevant. These interviews were transcribed in full. Participants' beginning understandings of models were examined from three perspectives: the nature of models, the function of models, and the role of models in inquiry. Participants were rated on each of these based on criteria listed in Tables 2 and 3. A rating of "3" represented ways of talking about models that were most congruent with those of experts, a rating of "1" represented ways of talking about models that were least congruent with those of experts, and "2" represented an intermediate level of sophistication.1 We note here that in coding responses for the function of models we realized some participants understood only one of the two aspects of how models acted as tools to advance scientific ideas (i.e., can be used to predict phenomena and/or can be used to generate new insights into systems of relationships). Understanding both of these related but distinct model functions appeared to indicate a more sophisticated understanding of models than understanding one of the two. Therefore we rated those who understood both aspects a "3+."

1. The ratings were scaled to the range of participants' responses. That is, ratings were relative to others in the participant pool. In the broader population of science learners of all ages, for example, there are individuals who would have held less sophisticated conceptions of the nature and function of models, or the role of models in inquiry, than participants in this study who ranked a "1" in these categories.

TABLE 2
Rating Criteria for Participants' Understanding of Nature and Function of Models

3. Most congruent with expert views
Nature of Models:
• Can portray conceptual/theoretical as well as observable processes and relationships.
• Represent ideas rather than "things."
• Models fallible in concept because they are based on interpretation and inference.
• Models have logical limits and underlying assumptions.
• Models can differ not only because of representational modes, but because a phenomenon is totally reconceptualized.
Function of Models:
• Tools to advance scientific ideas rather than only being a product of inquiry: (1) Are generalizable, can be used to predict. (2) Allow novel insights into relationships, and help generate questions for inquiry.
Examples cited: Kinetic models of molecular motion & idea of harmonic motion.

2. Intermediate
Nature of Models:
• Models portray processes and systems that may not be directly observable, but are taken to be real.
• Models can take form of mathematical representation or set of rules.
• Models of same thing can be different because there are different modes of representation.
Function of Models:
• Facilitates understanding, helps others to understand what an expert knows.
• Are generalizable, used to describe different situations.
• Helps analyze effects/variables of some complicated system.
Examples cited: Fluid flow in watershed mock-up & fruit flies as model organisms.

1. Least congruent with expert views
Nature of Models:
• Models are pictorial or physical replications of "things" considered to be real.
• Object of model may be too small, too large or inaccessible to direct observation.
• Relation of model to thing being modeled: object of model is more complex.
• Models can be different from one another because of different "looks" at the object.
Function of Models:
• To simplify, illustrate, show.
Examples cited: Plastic skeletons & solar system models made of foam.

TABLE 3
Rating Criteria for Participants' Understanding of the Role of Models in Inquiry

3. Most congruent with expert views — Development and testing of models is inquiry; they are methodologically and epistemologically integrated
• Research questions are conceived of within the context of a model.
• Hypotheses are parts of models that will be tested against "the world."
• Models are revised through argument that uses data and logic, must be consistent with evidence, other models, theories.
• Empirical data can be used to argue for theoretical "pieces" (structures or processes) of models.
• Models can change not only as result of empirical "fine-tuning" but also because target phenomenon is reconceptualized in new way.

2. Intermediate — Models and empirical investigations are reciprocally informative
• Scientists typically do scientific inquiry first, then create a model based on data.
• Models can help one think of things to investigate.
• Hypotheses are models.
• Data can be collected from models themselves.
• It is important to collect data on actual phenomenon (rather than exclusively from a model) if possible.
• Models are changed only if they do not match/predict data.

1. Least congruent with expert views — Modeling and inquiry separate enterprises
• Model development not recognized as part of scientific inquiry; models function only to illustrate, simplify, help communicate ideas.
• Hypotheses are "best guesses" from unspecified background knowledge.
• Relationships between empirical observations and theory unspecified.
• Fact that data can be collected from models themselves is unacknowledged.
• Argument may be synonymous with "conclusions;" directed toward determining if questions are answered rather than using patterns in data to support or refute models.

The second set of data sources included videotaped and written artifacts from the guided inquiry (i.e., model-testing activity) that took place during the first two class periods. We developed an initial set of codes informed by a previous study (Windschitl & Thompson, 2006) that focused on participants' perceptions of the role of models in this inquiry, how hypotheses are developed, and variations in understanding the rhetorical purpose of scientific argument. Relevant patterns of talk that appeared in conjunction with these codes stimulated a closer, turn-by-turn analysis of discourse (Castanheira, Crawford, Dixon, & Green, 2001). In addition to these codes, participants' allusions to "what counts as theoretical" or "the unobservable" emerged as a predominant feature, which then became an additional focus of analysis (Erickson, 1992). In the analysis we attended not only to particular terms, but to how participants "offered" ideas for comment, re-framed commentary by others, and amplified or subverted one another's ways of talking about "doing science."

The third data source was audiotape of small group discussions about a paper distributed in class that explicated the roles and functions of scientific models and described levels of sophistication in how young learners understand scientific models. We used the same general analytic strategy as described earlier; however, the codes applied to the nature and function of models (keying in on ideas from Table 2), the role of models in inquiry (from Table 3), and the use of models in pedagogical contexts. In particular we were sensitive to suggestions of teaching with models (using models only as props) as opposed to teaching about models (marking the curricular importance of the nature and function of models).

The fourth set of data sources was videotape of small group and whole class discussions about the authentic cases of theory-grounded investigations. We transcribed all discussions and coded for participants' talk about how hypotheses are generated in authentic science, the role of models and theory in designing studies, and the nature of scientific argument versus the language of "conclusions." Because similar patterns of talk emerged in all participant groups, we selected a representative small group conversation as a "telling case" (Mitchell, 1984) in order to communicate findings from this instructional episode.

The fifth data set was gathered from web-based discussion boards that participants used while observing schools in mid-quarter. Because the prompting questions (Appendix B) referred to their reading of the paper we had distributed in class and to the guided inquiry at the beginning of the quarter, we used the same codes as with our third data source. We added, however, a focus on comparative and contrastive talk between MBI and features of The Scientific Method.

The sixth data set was a cluster of sources around a long-term model-testing project, including student proposals for the projects, e-mails from participants seeking guidance with their project, and videotape of the final presentations. For the proposals and the e-mails we attended to what participants felt was becoming "problematic" for them as they planned and executed their own model-based inquiry. We looked again for language indicating residual conflicts between what they were being required to do in the model-testing project and their existing conceptions of proper investigative method and epistemology.

The presentations were videotaped and analyzed based on five questions:

1. How did participants use background content materials to develop a conceptually integrated initial model of the target phenomenon?
2. How did they incorporate hypothesized causal processes or entities into this model?
3. How did they test hypotheses derived from this model?
4. Did they analyze and represent data appropriately?
5. How did they construct evidence-based scientific arguments?

These final arguments were evaluated as to whether they were theory-directed or method-directed.2 Theory-directed argument, used in authentic science but rarely in school science, takes as its object of critique a set of changes proposed to the underlying model based on the empirical findings. The basis of these arguments is the empirical conclusions (statements of significant differences, co-variation, etc.). The point of theory-directed argument is to convince others that the possibility or character of unseen mechanisms was supported by evidence from observable outcomes. Method-directed argument, on the other hand, features empirical conclusions such as co-variance, significant differences, changes over time, and so on as the object of the critique. The basis of the argument is how well the study was designed, how carefully and systematically the data were collected, and how accurate or appropriate the analysis was. The point of this form of argument is to convince others that empirically determined assertions about relationships between observables were valid. School science commonly employs only method-directed argument.

The seventh data set was a combination of written reflections on the inquiry process and end-of-course interviews (Appendix C). These were analyzed as a coupled set of documents. First, participants were rated on the same 1–3 scales (from Tables 2 and 3) for understanding the nature, function, and roles of models in inquiry. We then coded for final conflicts or coherences with the Scientific Method, and the projected roles of models in their own teaching.

The eighth data set encompassed multiple observations of each student the following fall as they began their teaching practicums in local schools (observational protocols and first-pass analysis in Appendix D). We observed each participant at least three times (with two exceptions), targeting classes that participants described as "inquiry-oriented." We observed full class periods, during which we would script classroom dialogue, adding memos about particular discourse moves the participants made with their pupils regarding ideas around hypotheses, models, explanations, and evidence. In the after-class debriefing session, we asked about their pupils' further exposure to these concepts through assigned tasks and talk, and queried participants about what they thought was successful or found problematic during the class period. In the analysis we focused on whether and how they scaffolded students' ideas around the nature or function of scientific models.

2. Method-directed and theory-directed argument are both part of authentic science.

FINDINGS

Initial Ideas About Models and Investigative Science

Participants’ Understandings of the Nature and Function of Models

Initially, less than half of participants held views of the nature of models congruent with those of experts (criteria in Table 2). Ella, who was rated a 3, referred to models as representations of ideas that are, in turn, products of the interpretation of data:

Ella: The thing that's being modeled [atomic orbitals] is I guess usually an idea or concept, and the model lends some structure to that concept or idea.

Interviewer: Can there be more than one model of the same thing?

Ella: Ah, because of the information we have and the data collected so far, there are different interpretations of that data that can be concluded from it. I mean, I know for a fact that there can be more than one model to explain a concept.

More than a third of participants were rated a 1–2 for understanding the nature of models, meaning that the participant talked initially and predominantly about models as representations of "things considered to be real" (rating a 1), and during the course of the conversation mentioned, in passing and without elaboration, a reference to models as representing processes (rating a 2). If, however, the participant mentioned that models could represent "things" considered to be real and later elaborated on the idea that models could represent processes as well, then that individual was given the higher of the two ratings (2).
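The disambiguation rule above can be sketched as a small coding function. This is our illustration only: the `NatureOfModelsCodes` fields and the function below are hypothetical stand-ins, not the authors' actual coding instrument, and real ratings drew on the fuller criteria in Table 2.

```python
# Hypothetical sketch of the rating logic described in the text: an unelaborated,
# in-passing mention of process-representation yields a "1-2", while an
# elaborated mention earns the higher rating of "2".
from dataclasses import dataclass

@dataclass
class NatureOfModelsCodes:
    models_as_things: bool         # talks of models as replications of real "things"
    mentions_processes: bool       # says models can represent processes
    elaborates_on_processes: bool  # elaborated vs. in-passing mention

def rate_nature_of_models(c: NatureOfModelsCodes) -> str:
    if c.mentions_processes and c.elaborates_on_processes:
        return "2"    # given the higher of the two ratings
    if c.models_as_things and c.mentions_processes:
        return "1-2"  # in-passing, unelaborated mention of processes
    if c.models_as_things:
        return "1"
    return "unrated"  # would require the fuller criteria in Table 2

print(rate_nature_of_models(NatureOfModelsCodes(True, True, False)))  # "1-2"
```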

Similar to participants' understanding of the nature of models, only half held sophisticated conceptions of the function of models. For example, when asked what the purpose of a model was, Jenna (rated 2) replied: "I would use a model to explain to someone else, either if it was a child in simpler terms, or if it was to another scientist in maybe visual terms—to give further detailed explanation of what you're trying to present."

When asked if models were important to teach about in science classrooms, sixteen of eighteen participants said "yes," but only three of these suggested that the reason was for students themselves to create models and use them as a component of inquiry.

How Participants Talked About Inquiry and the Role of Models in Investigations

In evaluating participants' understanding of the role of models in inquiry (criteria detailed in Table 3), we could not rate any individual unambiguously as a 3. Five participants were rated 2–3; however, these mentioned, at most, only two of the level 3 ideas described in Table 3. The majority of participants talked about models and empirical investigations as being reciprocally informative (rated 2). Four talked about modeling and inquiry as entirely separate processes (rated 1).

Virtually all participants said that inquiry began with a question or hypothesis and then was followed by collecting and analyzing data. Hypotheses were talked about as educated guesses from unspecified background knowledge but never referenced as being derived from a more comprehensive system of relationships. The primary features of experimentation involved identifying relevant variables and creating controls or control groups. When asked, "What is the purpose of an experiment?" most participants said it was an activity "to get evidence," "prove a hypothesis," or "answer questions."

To probe beyond these routine responses, we asked, "What happens after you analyze your data?" Participants expressed a generally unproblematic view of the relationship between data and any claims made. Ten of the eighteen participants simply said they would "write up conclusions" or "summarize the findings and then go on to the next question." One participant explained: "Well, you analyze your data and look for trends and report your results based on that. And then you should have some kind of conclusion about what you saw, whether or not it was correct, where your errors were." Interestingly, for about one-third of the participants, their description of a scientific investigation actually ended with the collection of data, suggesting that the remainder of the inquiry was somehow self-evident or determined by the data alone. The idea of developing scientific arguments that connected data with explanatory processes was entirely absent in the interviews.

When participants were asked, "In order to advance science, do scientists conduct investigations or create models, or do they do both?" almost every respondent said that models were indeed important. However, models were talked about almost exclusively as the end-products of investigation. Only two respondents said that models were catalysts for generating research questions or for gaining new insights into natural systems. Five participants acknowledged that one could conduct empirical studies on models themselves (as opposed to collecting data on natural systems); however, three of these participants warned that studying only models would be inadvisable since then scientists would only gain an understanding of the models themselves, but not "the real thing." Examples here include collecting data from computer simulations of weather systems and data from tabletop mock-ups of watersheds. Without specific prompting, only one participant mentioned that models played any particular role in inquiry.

Interestingly, several individuals had carried out research studies during their undergraduate years but were unaware of their own use of models in these investigations. For example, Jenna recounted an experiment in which she re-created a natural system of light energy within a greenhouse setting to study how shade-loving plants would respond if canopy trees overhead were cut down:

I think it was a great idea to try to find out if a plant could protect itself from UV [ultra-violet] rays, and we were thinking the bigger picture of as you cut down forests you have more of an understory of plants so you're going to get a lot more sun, all of a sudden, so could the plant protect itself by using pigments? So we had the experiment set up to use a UV—a UVB gun basically, so we would have the control—we did have access to a greenhouse. . .

Minutes later Jenna was asked if she had ever used models in her classes or in her research, to which she replied:

Jenna: Not in my experiments we didn't.

Interviewer: Not in the UV light study?

Jenna: We didn't use a model for that.

In sum, participants were generally unaware of the roles models played in investigations and in advancing science. In addition, they had difficulty expressing how special forms of rhetoric were employed by scientists to offer conjectures linking the material world and the world of ideas.

Responses to Instruction

We organize participants’ responses to instructional experiences around three temporally distributed themes. The first is “Tensions and sense-making discourse around ‘What’s theoretical?”’ (in the first two weeks of instruction); the second is “Pedagogical reasoning as a context for exploratory discourse around models and inquiry” (from the second week through the seventh week); and the third is “Embodying the language of scientific models in investigative practice” (from the seventh to the tenth week). For each of these time frames, we describe epistemological issues that participants worked through, the types of instructional activity that supported this work, and how their talk evolved in response to these activities.


Phase 1: Tensions and Sense-Making Discourse Around “What’s Theoretical?”

The Role of the Unobservable in Science

We preface this section with the reminder that a fundamental epistemic consideration of MBI is that explanatory models will often include underlying causal processes or structures that are not directly observable. Explicit conversations about such “theoretical” factors and their relationship to the observable world are not common in science education, as the following account indicates.

The first 90-minute class took place in a computer lab, where participants in groups of three were presented with a live goldfish in a jar and asked, “Can a fish drown in perfectly clean water?” They were initially directed to observe the gill movements of the fish and count the number of respirations per minute. Participants then used concept-mapping software to construct representations of relationships between various factors in a natural habitat (e.g., water temperature, fish activity) that might influence the rate of fish respiration. They were shown examples of concept maps that depicted other science phenomena to familiarize them with the general form of this representative genre. To stimulate an initial conversation about the relationship between the observable and the unobservable, the instructor asked participants to identify which nodes in their concept maps represented “theoretical” entities, processes, or properties, and to highlight those in green.

For participants, “theoretical” was described on a whiteboard at the front of the lab as any thing or process that was “too small, too big, too fast, too slow, inaccessible, or abstract to observe directly”; and examples were given of each. As we videotaped one group developing their concept maps, it became clear that participants were struggling to find shared meaning about the idea of “theoretical” within the context of the fish respiration example.

Nadia: Does observable mean that I can see it or that I can measure it? Because I can measure the oxygen in the water, but I can’t see it.

Katlin: But [the instructor] said “directly observable” the first time around.

Nadia: But if you put a probe in the water you can tell how much oxygen is in there.

Instructor: (comes to group) Well, with the technology we have, it is getting harder to tell what is theoretical. Because we can’t see, for example, hydrogen ions directly, but we can see when they change the color of pH paper. . .

Katlin: So it’s fuzzy. . .

Instructor: It is a fuzzy concept, but maybe we can use the idea of “not directly observable.”


Nadia: So if I don’t have any tools to measure something, it can become theoretical—that helps.

Katlin: So it’s the naked eye—well I guess if I had a microscope I could see it. People have been able to see molecules haven’t they? With scanning electron microscopes?

Instructor: Um-hmm.

Katlin: Then in that case oxygen and carbon dioxide would not be theoretical. . .

In this discussion, the idea of “what’s theoretical” encompassed questions about what it meant to “directly observe” a phenomenon and the issue of whether measuring something qualifies as “making an observation.” These issues were further complicated when we returned to a whole group discussion and another participant, Justin, introduced the possibility of observing an abstraction.

Justin: So, if you can test for something like water quality, is it still theoretical? Say you can test for the presence of nitrates in fertilizer, but you can’t directly observe it, or would testing be considered “observing”?

Instructor: That’s a good question, so water quality is an abstract concept?

Justin: The quality of it is an abstract concept, but a pollutant. . .

Instructor: Like nitrogen?

Justin: Nitrates—

Instructor: Yeah, would it be easier if we got away from talk about things that are theoretical and just referred to things as not directly observable?

Justin: I think that would be easier to categorize for me.

Instructor: So how did the conversation go for others?

Ethan: We were bothered by each other’s ideas about what was theoretical (group laughter).

Michael: Well, I was contemplating the idea of observing, which would imply the idea that you could see it; whereas you could, to me, very clearly measure the amount of chemical whereas water quality is this subjective analysis where you have to come to some agreement on what it is, but if you . . . if you are inferring from a very empirical piece of data to something that’s, ahhh, defined, subjectively defined . . . once you start to infer from data then, I think you move into “theoretical.”

In this exchange the instructor made a bid to recast “theoretical” as “not directly observable” but also recognized that Michael had just raised one of the key ideas of model-based inquiry—theoretical inference. The instructor took this opportunity to amplify Michael’s statement:


Instructor: So inferring from data gets you into—you are taking things you can see and measure—and that pertains to Nadia’s question—you are taking something you can see and measure, you analyze that and then you make inferences about something you can’t see and measure?

Michael’s response, in turn, revoices the instructor’s words; however, another participant, Allison, then picks up only on Michael’s later, complicating statement about measuring abstractions:

Michael: You are taking a step beyond the empirical data. Water quality is a definition we make up, it’s not like saying parts per million, it’s kind of like, well, a simpler explanation would be to say “it’s hot outside.”. . . We’re making an inference, it is 95 degrees outside, that would be data, so theoretically according to our inference would then fall into the realm of theory, based on a subjective definition.

Allison: So we, like they were, we were bothered because there’s a lot of things you can measure parts of, but not like gravity, you can measure what gravity does, but not measure “gravity” like a theoretical concept. So you can observe parts of things, I don’t know, like there is such a big gray area (looks at Rachel and Andrea, they nod in affirmation).

The instructor then attempted to expand the idea of “theoretical” into two versions that applied to the different examples participants had just discussed:

Instructor: OK, how should we think then about this idea? Should we simplify it, should we break it into small pieces? Should we talk about things that are observable directly? Or things that are observable with instruments or other technologies? Should we ditch the word “theoretical” ‘cause it’s too vague?

Ethan: Do you mean for right now? I think that over the next couple of weeks it would be nice if we came up with a definition of what do you mean by theory, what you mean by observable data and stuff like that. In this instance it would be nice not to talk about theory (group laughter).


Instructor: OK. So for now, let’s just recognize that there are some things that are observable and measurable without any instruments at all, there are some things that are observable if you have instruments like a microscope or pH paper, and then there are complete abstractions like the idea of water quality, which you can’t measure directly because it is an idea. Would this be a better way to sort this out? Is that the language you want to use? (everyone nods, several vocalize “yeah”).

As a forum for intellectual give and take on the idea of “what’s theoretical,” the tasks and prompts were successful in generating sense-making discourse. However, while we had intended to scaffold students into using the idea of underlying (theoretical) mechanisms as a way of reasoning on another level about observable phenomena, they were struggling to first develop consensual meaning for the term “theoretical.”

By the end of class, eight different concept maps had been produced. These included various networks of connections between fish, plant life, water, water temperature, pollutants, dissolved oxygen, CO2, fertilizers, and so on. Even though students were asked to outline in green any parts of these systems that had unobservable elements, two of the six maps had nothing identified despite including conceptual items such as “abiotic factors” and “pH” or unobservable entities such as “dissolved oxygen.” The remaining maps identified entities or properties such as “dissolved oxygen,” “CO2,” “pH,” “water quality” and “pollution” as theoretical factors; however, no group identified any processes or relationships as theoretical.

Immersion in Guided Inquiry, Avoiding Scientific Argument

At the end of the first class, we wanted to reinforce the notion that inquiry questions had to be generated from an informed model of the phenomenon of interest. To these ends, we engaged in the jigsaw activity described earlier and had them refine their original models. Following this, students were shown sealed containers of water that had been heated to various temperatures (72°F, 100°F, 150°F, 200°F) for varying lengths of time (1, 5, 10 minutes) then cooled. Students were asked to develop a testable question based on their concept-map models that would help them learn more about the effects of thermal pollution on fish respiration, execute a brief study over the next 90 minutes, and present their inquiry to their peers.

We scaffolded their thinking by asking them to create a poster based on a template with: (a) their question, (b) a hypothesis based on the model they had created, (c) a representation of the data they collected, (d) interpretations of their data, and (e) culminating arguments. These directions were explained and remained projected on a screen throughout class. A description of the culminating arguments was included in the template: participants had to include a coherent explanation for their findings, state how the data supported this explanation, and describe how hypothesized relationships in their initial model were supported or in need of revision.

None of the groups had difficulty settling on a question or designing a controlled experiment to test their hypotheses. The challenge came after they had analyzed the data and began trying to use that data to develop an argument rather than simply stating a conclusion about the data. In the following dialogue, Ella, Michael, and Justin have finished their observations and are drawing up their poster. Ella, the participant in charge of drawing up the poster for her group, begins deliberating with her colleagues about how to interpret their study. Note that she avoids using the word “argument,” substituting non-specific terms and phrases (identified with an asterisk*) in its place.

Ella: How should I arrange this? There are 5 sections (she looks up at the PowerPoint list).

Michael: Our hypothesis says that fish will breathe faster in low O2 water because it will need a greater flow of water over the gills to get the same amount of O2.

Ella: So I’ll put the conclusions and the other stuff* on another poster.

Justin: So what does our graph show? (Michael points to it) You can make it a two-part conclusion. What does the graph show about the data, not only does it not support the hypothesis, but what DOES it show? That there is no difference?

Ella: Nope, that there is no increase, it actually decreased a little.

Michael: Except for our last data point then, so that was what we were going to try to explain.

Ella: Do you want that to be part of the discussion or part of the conclusion?*

Michael: We can say that the behavior of the fish was so erratic we couldn’t take our final data point, and so the data is incomplete?

Ella: That would be part of #5*, right?

Instructor: Is that part of #5? Is that a possible explanation of your results? And then how these data supported that?

Michael: No.

Ella: No. OK, in my past experience, if you haven’t been able to actually like document numerical data or qualitative data you just don’t count it until you are discussing later* what happened.

Michael: That makes sense—our data actually shows the converse of our hypothesis.

Ella: So rate of respiration—

Michael: Gill movement—our data show that gill movement actually decreased with less dissolved O2, right?


Justin: So the erratic behavior resulted from O2 deprivation?

Ella: OK, so you want to make that #5 then?*

When it came time to share their poster with others, Ella’s group attempted an “argument” for their outcomes but it did not describe explanatory links between what they observed and any conjectured mechanisms. Their poster read:

Conclusion: Quantitative data does not support the hypothesis! Gill movement decreased with a decrease in O2, as shown in the graph. Note: Fish’s erratic behavior at 212°F prevented collection of quantitative data.

Argument: We think this erratic movement was due to such low dissolved O2 levels,since the fish was gulping air at the surface.

One other group attempted an explanation for their outcomes, but as with Ella’s group, it was little more than an allusion to the relationship between fish respiration and the presence of oxygen in the water, without reference to the role of heat. Two groups recorded no argument at all on their posters. Only two of the six groups offered a coherent explanation for their results as well as how their explanation related to the data they had collected. One of these wrote:

Conclusion: Fish respiration is affected by water temperature. As H2O temp goes up, respiration goes up.

Argument: According to our background readings, the amount of dissolved O2 in water increases as temperature decreases. Since fish require oxygen to respire, less oxygen results in more “breathing movements.” This explains why the fish took more breaths in the water that had been heated, because this H2O contained less oxygen.

In recapping participants’ experiences over the early days of instruction, clearly, the use of language around two related epistemological aspects of scientific thinking was unfamiliar and problematic for them. On the first day, Ethan summed up their struggles in talking about the unobservable by stating, “. . . it would be nice not to talk about theory.” On the second day, only a handful of students were able to link the findings of their inquiry with underlying explanations (despite the fact that they had, minutes before, shared potential explanations through the jigsaw activity for how heated water affects fish respiration). The majority of students entirely avoided the rhetorical strategy of argumentation, falling back instead on the vague language of “conclusions” that most of them had voiced in the initial interviews. Despite such difficulties, later discussions during the course will show how these initial tenuous engagements with new ways of talking about investigative science set the stage for important shifts in thinking.


Phase 2: Pedagogical Reasoning as a Context for Exploratory Discourse Around Models and Inquiry

The second major phase of instruction involved participants using text resources, sequenced in two separate learning activities, as stimuli to publicly compare exploratory stances on models and inquiry within the context of imagined pedagogical situations. We present dialogue prompted by these texts and their accompanying activities as well as later talk from an electronic discussion board that clarifies how participants’ thinking had changed over these weeks.

The “Models and Inquiry” Paper as a Conceptual Resource

In the days following the guided inquiry, students read a paper on scientific models, the role of models in inquiry, and how student understanding of models falls along a spectrum of sophistication. To provide a forum for synthesizing the ideas in this paper and applying them to a pedagogical situation, we asked participants in small groups to address the question, “How could you re-design the fish respiration activity for your students, in order to help them develop a more advanced understanding of scientific models?”

From a videotape of one small group discussion, one participant’s opening comments reveal how he uses a pedagogical scenario to embody three elements of model-based inquiry from the paper: (a) models can bridge the observable and unobservable worlds, (b) multiple models can be used to understand a phenomenon, and (c) models generate new questions.

Cody: I’ll go first. The activity—my activity would involve a few inquiries into fish respiration models, where students would find models that consist of a few tangible components, and then one that may be purely conceptual . . . so you can even have students go out and find models or you can furnish models for them. And then after they’ve looked at a couple of styles and types of models used to understand fish, then they will create their own model to help them conceptualize some of the theoretical processes around fish respiration. So hopefully through them seeing what models are used for, for more than just the basic sense of replicating, you know, to a more advanced, like trying to hash out the process more—some of the conceptual processes. They might raise some questions using the models, kind of like we did the other day.

In a subsequent turn, Brooke picks up on a thread of Cody’s idea from the paper (that multiple models can be used to understand phenomena) that is congruent with her pre-existing conceptions of models as organisms that can “stand in” experimentally for other organisms (e.g., mice for humans in medical research).


Brooke: So I’ll go next. So instead of having activities I had a bunch of questions I could ask them. So, if you were to study fish respiration without having the fish available, what other model would you use?

Jenna: OOoooo.

Brooke: What if we used different liquids, would it still be a good model to study fish respiration? Like if you added saltwater, or polluted water? If you had some liquid not associated with fish, would it still be a good model? Would the results from the fish respiration inquiry still be a good model for other sea creatures like a shark, a jellyfish, or a whale, why or why not? Do you think this experiment is a good model for fish in their native habitat?

Justin: I like that first idea, but I’m not clear about the next part, if you don’t have the fish available. . . ?

Brooke: So if you wanted to do an experiment on a fish, but didn’t have one available, what would you use?

Justin: So you could use some other animal?

Brooke: Yeah or even an air conditioner, to represent the gills—you know what I mean?

In the next exchange, Justin and Cody build upon a theme that has been implicitly introduced by Brooke—that models are versatile personal re-constructions of a scientific idea, and that students are capable of developing them.

Justin: So I can see using another animal as a model, ‘cause that happens in science all the time. So, I had some questions too, but I also had an exercise. So this was trading models with other students and paying attention to what was different. So, what’s different between their model and yours? The purpose is to show there isn’t any one “right” model. So, the criteria for a good model is something like “it has good predictive value” or something like that.

Cody: That’s what I was thinking too was giving your student a bunch of models and saying “test these.” And like you could trade models with other students in class that you felt comfortable with and test each other’s models.

Midway through the conversation, Jenna is puzzled about how students would think about models in authentic ways. Justin and Ella draw from the paper to offer their conception of “more authentic” ways to use models and connect this to an idea that has been emerging in the discourse over the past few minutes: that students can develop a sense of ownership of models and even membership in a science community through the testing of models.


Jenna: So in question #4, when it said to “Get students to think in more authentic ways about models,” what did you think it meant to “think in authentic ways?”

Justin: I didn’t think the question was worded very well, but I went to the table in the paper and tried to think of ways to get my students to do the behaviors associated with expert use of models—to predict, to understand that there are multiple models.

Ella: I took it from the standpoint of making students think they are part of a scientific community. That they are supposed to revise models and theories. When I was in school, the only persons we ever thought would engage in these activities were scientists who were at some grand big-name university—

Justin: And know lots of stuff you don’t know—

Ella: Exactly. I want them to take away, so like it says here (reading from the assigned paper) that they can create and critique a model, and that they are active participants, so the model is not just a static thing.

Despite this dialogue around students understanding models as tools, Jenna responds by describing how she would get them to create, in her words, “a more authentic model” rather than getting her students to think in more authentic ways about models. Jenna sees this re-interpretation of the assigned task as best accomplished by allowing students to collect a wider variety of data.

Jenna: OK, so I’ll go. I thought I might take the activity we did and use it only as a pre-experiment. And then take the structure out of it because we were so limited. I would give them pH paper, let them have dissolved oxygen meters, then let them create what I thought would be a more authentic model.

Later, in whole class discussion, Morgan is only the second person (Cody was the first) to suggest that models serve as conceptual scaffolds for arguments that involve linking empirical observations with underlying causes. In the process, perhaps unintentionally, she challenges her peers who had not articulated sound scientific arguments about the results of their own fish respiration experiments in the previous class period.

Morgan: So I would also ask students “How does this model answer the question that was asked? How does this model demonstrate the relationship between how much oxygen is in the water to how fast the fish breathes?”

Maria then responds to Morgan by suggesting that using a model in this way can be made part of a new kind of lab write-up for students—a more sophisticated, problematized variation of a common classroom practice. Maria, whose extensive research experience at the university was well-known to her classmates, lends credibility to the idea that authentic science is grounded in the development and testing of models.

Maria: It seems like it is a common practice to have students perform the experiment, and then do a lab write-up, right? Isn’t that (turns to Morgan) basically what you are asking them to do? The lab write-up is another way to write up the model. You have something like the lab write-up that is not too foreign to them but it has a different meaning this time around. And quite frankly that’s what scientists do, I mean if you look at a scientific paper—it is an explanation of a model, you know you are talking about the background information, this is the experiment we did, this is the data we collected, and you could turn that into a concept map if you wanted to.

In summarizing this discourse, models were positioned by several participants through imagined pedagogical situations as: (a) objects of critique and revision in authentic science; (b) tools to support thinking; and (c) versatile, personal constructions whose use signals membership in a scientific community. However, one fundamental idea, touched upon in the first class regarding the use of models to coordinate the theoretical with the observable, was taken up only briefly by Cody in small group and then addressed by Morgan in whole group conversation, without follow-up by peers.

Only three participants expressed limited responses as to how a teacher might push students’ reasoning about models. One individual, who had a background not in science but in mechanical engineering, stated that she understood models in terms of material prototypes used to see what types of changes needed to be made in the design of “real things” like bridges, tunnels, airplane wings, and so on. In her conversation with peers she referred to applicability and prediction as the two important functions of models that students should understand. Along these lines, she had only one task for students in this assignment: “They should think of human activities that increase thermal pollution, given the model they have created.”

For most participants, the opportunity to take stances on the paper appeared to help align their language and thinking around the ideas that models are testable representations, created in various forms by both scientists and students, which serve to generate ideas and predict outcomes.


A “Failed” Activity: When Traditional School Science Talk is Good Enough

Part of our instructional design was to build credibility in epistemological aspects of model-testing that are commonly glossed over by the scientific method. During week four we distributed two-part vignettes on authentic examples of scientific studies (from Giere, 1991) that included: (a) a description of a particular scientific theory or model; and (b) a description of a study, whose hypothesis and investigative design were based on that model, and that evaluated the model empirically. We used these vignettes as part of a lesson on designing group work, asking participants to “develop tasks for students, such as how they might collaboratively develop an argument linking data from the study with the original theory.”

One group of participants was presented with a theory about how memory has a chemical basis that might be transferable from one organism to another. The second half of the vignette described an experiment in which rats were trained to run a maze. The rats then had their cerebral fluid withdrawn and injected into the brains of other rats. This second group of rats learned to run the same maze in a fraction of the time of the first group. As participants began discussing this case, it became clear that, rather than pursuing the idea of scientific argument, they were fixating on the details of the experiment itself. In response, one participant, Aaron, re-introduced the idea of “mechanism” from discussions in previous classes in an attempt to address underlying processes. No one, however, picked up his line of thinking and his classmates continued to preoccupy themselves with the experimental details:

Aaron: But what would you want them to get out of this? So these people have done an experiment, and they have a conclusion, but I’m not sure that they—

Katlin: They had a control and they did a trial on untrained rats,

Aaron: They did something, but they don’t know anything about the mechanisms whatsoever—

Katlin: But kids could learn something from this about identifying the parts of an experiment,

Andrea: Maybe they could identify holes in their thinking?

Katlin: Um humm—

Aaron attempts a second time to introduce the idea that, without some possible explanation for these findings or at least an attempt to identify underlying assumptions for the study, one cannot generate an argument. His colleagues, however, turn the dialogue toward “argument as ethical debate” and “argument as validating empirical conclusions.”


Allison: But how does this work?

Aaron: (laughing) We have no idea! That’s why you can’t go any farther with it, because you can say they injected these mice, but. . . . And they need to ask questions about how this experiment was done, maybe these rats could see each other and that maybe this effect was due to more than injecting them.

Katlin: So you are saying that maybe the rats communicated with each other through ESP?

Aaron: Well, we could talk about the assumptions they are basing this all on. . .

Jenna: Or you could (reading from the handout) have students develop an argument—yeah, you could have them develop an argument for or against the ethics of doing this, you’re going to have some kids supporting this.

Katlin: I was wondering too about how kids understand this conclusion, that because the rats that were injected learned the maze in three hours instead of 24, that that implies some aid to their memory.

Aaron: So students wouldn’t know what that means?

Jenna: So they couldn’t conclude—

Katlin: They can’t read or write or express stuff well in some schools, anyway so then to get back to critiquing this—

Aaron: So did these researchers have enough information to validate this as a true experiment?

Katlin: I don’t think there is enough information here to make an argument for or against this. They could critique the scientific experiment, but also critique the amount of information that they are given in this report.

Jenna: Ooo that’s good.

The group then continued to talk about the importance of helping kids to attend to the parts of the experimental procedure rather than the potential links between data and the theory. Eventually they came to consensus on splitting students into groups to critique different phases of the scientific method used in the study. The group’s final plan for using the vignette in collaborative work for high school students was summarized as: “List facts that are important; what information is missing? What other information would they want? In the end, do they support the conclusions that were reached by the scientists?”

The course of the aforementioned dialogue mirrored that of all the other groups of participants and was marked by the incorporation of unspecified language around “drawing and supporting conclusions,” “identifying important facts,” and noting “what’s missing.” Their conversations were focused primarily on the experiment and not in any way on potential claims or explanations. The vagueness of typical school science language around conclusions was clearly sufficient, in the minds of participants, to complete this class activity. From an instructional design standpoint, one problem with this exercise was selecting phenomena for which there was a highly complex underlying explanation. But even more problematic, we furnished no case examples of argumentation that situated the rhetoric within the context of a study, real or simulated.

Participants Reflect on Changes in Their Thinking

As the course progressed, science ideas that came up in day-to-day class conversation (e.g., mechanical advantage in pulleys, simple patterns of inheritance) were routinely framed by the instructor as models that could be tested and the resulting evidence related to their explanatory roots. Mid-way through the fall quarter, participants spent two weeks observing in schools. During this time they communicated with each other via Web-based discussion boards. We asked them to exchange ideas about which course activities had caused conceptual conflict for them, helped them develop new ideas, or refined existing ideas around models, theory, evidence, and argument. We also asked them to begin a dialogue about their upcoming model-testing project.

In these online discussions, three overlapping lines of talk predominated. The first of these was that models, particularly in the form of concept-maps, were being appropriated by participants as sense-making tools. Second, participants began to suspect that scientists use models implicitly if not explicitly in their investigations. And third, model-testing was being seen as a more comprehensive, integrated approach to investigations than the scientific method; although four participants maintained that young learners should experience “the basics”—in the form of the scientific method—before any attempts at model-testing.

In the first passages of online discourse on models as sense-making tools, Justin wrote that he “didn’t realize that Inspiration® [concept-mapping software] could help students learn.” Michael responded, “Yeah, the software allowed me to get my mind around some complicated content. I like how ecology uses models constantly to illuminate complex systems.” Other participants discussed how models help both generate questions and push the inquiry beyond description to explanation. Cody for example explained:

I think the major difference as well as advantage of a model over the scientific method is the way that a model in itself can lead to ways to test the idea whereas the hypothesis of the scientific method lacks this inherent feature. The concept of the model is very different from the scientific method, which I am more familiar with, yet I am beginning to see the value and versatility of models.

Ella responded:

On one hand, I think that you have to “zoom in” on the relationship you want to test within the framework of the model, then apply the scientific method to that specific relationship, process, property, whatever. In that way, I agree. But on the other hand, maybe model-testing is just an example of the scientific method that is more fleshed out at the beginning. Doing research enables you to ask more informed, relevant questions leading to more informed, relevant observations and so forth.

Andrea added that “With model-testing, you have an initial understanding of your question, like which paper towel holds water better, that will help you explain why and not only which is better.” [emphasis in original]

A second line of conversation emerged around the idea initiated by Maria a few days earlier—that scientists use models to guide their investigations. Maria had described how scientists constantly revised conceptual drawings that covered the hallway whiteboards in her former lab. Ella picked up on this theme and explained her changes in thinking about how scientists use models:

I thought I’d take a bite at this from another angle: how the reading [Models and Inquiry paper] conflicted with my preconceptions. Until the past year or so, my perception was that the scientific method was a linear series of steps in a process that we learned in school to mimic scientists, not for the purpose of truly investigating for our own sake. The biggest surprise for me in realizing that the two (model-testing & scientific method) are essentially the same activity, is that I was missing a huge chunk of the scientific method: the iterations of developing a model & learning more in-depth info before generating a hypothesis. As with Olivia, the scientific method I learned leaped from observation directly to hypothesis—model-testing is much more involved and more forgiving. I think this is the way scientists probably intended the scientific method be interpreted by teachers all along.

Julia, who had close personal ties with a researcher doing climate modeling, speculated about how scientists used models without being aware of them:

I think what is different about this approach (described in our class paper) is that it makes explicit what I observe most scientists doing anyway. I think what is different though is that many scientists might not articulate it this way or even necessarily be aware of it. I definitely have observed it to be a far more fluid (less linear) . . . with ideas being refined and reformulated throughout the entire process.

Nadia too added: “I think that the more sophisticated thinker, or scientist, probably already has a mental model in her head. She is informed about her area of expertise and is therefore asking questions that are informed by her model and will in turn inform her model.”

The third line of discourse focused on model-testing as a more comprehensive, integrated way to understand phenomena. Andrea for example wrote:


What I find to be different between the scientific method and model-testing is this: Model-testing is all about relationships. It can be very broad and yet very detailed. You observe a natural process and can connect it to create a spider web of ideas. You can then investigate a part of that model, but you’re always thinking in terms of how it is connected in the larger scheme of relativity.

Michael added:

Yes, and the notion of complex systems that have multiple variables interacting is often de-emphasized in teaching kids about the scientific method. The model approach, in my opinion, might get around this by virtue of the fact that a model cannot easily be removed from the system that you are studying.

Allison related how this more integrated understanding of phenomena would allow the investigator to learn from experiences that did not lead to the expected results:

The benefit I see to model testing vs. scientific method as it usually is presented is that with model testing there is more scaffolding and framework so one doesn’t have to throw out the entire idea if the experiment doesn’t match the prediction. In other words the goal of model testing is to create a more complete overall picture, not just see if a prediction is correct.

Allison, however, later voiced an opinion shared by three other participants, that model-testing was not a set of practices that school-age learners could readily engage in:

I think model testing can be quite a bit more difficult to understand, so perhaps that can be something that should be introduced only after the students have been introduced to the basics of “doing science.”

Ethan similarly talked about attempting model-testing only at the end of the year:

One suggestion for how to do such a project as a class is not to have it as a year-long thing, but to do it at the end of the school year, after building up the students’ understanding of the scientific method. The majority of the year could be spent on teaching some of the basics of the scientific method through basic experiments.

Accompanying these three lines of discussion of how model-testing was gaining credibility in the minds of most participants were a number of statements indicating that participants couched their learning in terms of acquiring a new language. Morgan wrote: “I am beginning to see science through these definitions of models, theories, etc. . . . I really like the idea of model-testing, but like you guys I don’t feel like I am ‘fluent’ in models yet.” To which Cody replied: “This is kind of a bad joke, but I feel like I am a MSL (Model as a Second Language) student having grown up only really knowing and using the scientific method.”

The overall picture created by these online conversations is that the design of instruction up to this point had created a case for the credibility of model-testing, while at the same time casting doubts about the authenticity of the scientific method as traditionally taught. Much of this analysis by participants was grounded in a context of imagined pedagogical situations, in which their own students would be scaffolded through an investigation. These passages connect model-testing to four of the five epistemic characteristics of scientific knowledge: its testability, its revisability, its generativity, and its conjectural nature. Although most participants were becoming convinced that models play a crucial role in science learning and in scientific investigations, they faced new challenges (around conjecture and theory) when it came time to integrate these new conceptions into their own investigations.

Phase 3: Embodying the Language of Scientific Models in Investigative Practice

The Confusion Around “What’s Theoretical?” Re-Emerges as: “What am I Studying?”

As the class passed the halfway point of fall quarter, participants turned in proposals for their model-testing projects. This was an opportunity for the instructor to provide feedback on aspects of their projects that could become problematic if not addressed early. Earlier in the quarter participants had been given the requirements for this independent investigation that they were to complete by the end of term (about 10 weeks).

Participants had little trouble outlining portions of the model-testing process that were most like traditional school science—designing relevant data collection protocols and ways to analyze those data. However, the requirements to propose underlying explanations through an initial model (unlike school science) had caused an unresolved question from earlier in the course—that is, “What counts as theoretical?”—to re-emerge as “What am I studying?” This issue surfaced in a number of personal communications (e-mails) about the proposals to the instructor. In all, seven participants sent e-mails to the instructor seeking advice, and all seven sought help with this question.

Julia, for example, asked, “What do you mean by the ‘theoretical piece’ of the model I am testing for (an unobservable mechanism)? Can you give me an example? The model I am working with has to do with rates of photosynthesis in an aquatic plant.” Another participant, Nadia, reprised a question she had posed on the first day of the course: “Is it just unobservable and therefore theoretical if I can’t see it with my naked eye? Are we supposed to list all the theoretical mechanisms or just the ones we are testing for?” A third participant, Allison, was studying the relationship between the age of trees and leaf senescence (time at which their leaves drop in the fall). In her initial model, she included theoretical processes such as the production of sugars in the leaves and the breakdown of chlorophyll, but had not hypothesized why age would have any influence on leaf drop. Allison wrote: “I am not sure what you mean by ‘mechanism.”’ Shortly after this e-mail, in an after-class discussion with Allison and other students, I pressed her to consider from her background readings what might influence trees to metabolize differently as they age. Another student suggested that trees use “magnesium or manganese” to produce chlorophyll and that the ability of the trees to take up these minerals could be hindered with age. Allison felt this “opened up” the idea of mechanism for her and she eventually incorporated this hypothesis into her model.

Other participants were now piloting their studies and struggling to come up with claims that could link empirical data to unobservable processes. Katlin, who was studying the relationship between enzyme activity in saliva and temperature of foods, stated: “I don’t have enough connection between my data and the theoretical level. I didn’t look at the molecules, I needed, you know, one of those big microscopes that can see at that level.” We recognized that she was holding to the idea that scientists always directly observe the phenomenon that they eventually make claims about. We responded by asking her how scientists study sub-atomic particles, to which she replied: “They used an ‘if-then’ statement, like if our theoretical mechanism is ____, then we should see ____.” Katlin later conducted a second pilot study after which she wrote:

To make matters worse, the data do not appear to address the CAUSE for the relationship between temperature and enzyme activity. This, in my case is the kinetic energy and therefore collision rate of individual molecules. The cause is also the denaturation of enzymes at high temps. Any advice on how to relate the data to the model? [emphasis in original]

She then offered to re-cast her study as purely descriptive—a regression to a typical school science activity without arguments linking data to underlying explanations. She wrote: “I could present the project as an exploration of the experimental system I used: the interactions of starch, amylase, iodine and temperature, and sharing some of the things I found.”

These conversations, in sum, revealed a residual uncertainty about core epistemic considerations of model-based inquiry: seeking explanations that transcend the empirical. Some participants, like Allison, confronted the difficulty of “going deeper” with their initial models to be able to hypothesize about cause; others, like Katlin, although familiar with the underlying mechanisms in their studies, questioned the propriety of using the observable to make claims about the unobservable.


Given these uncertainties about method and epistemology around the testing of a model, we decided to provide an exemplary case of model-testing activity and disciplinary discourse. One of the participants, Morgan, had finished her study mid-way through the quarter and had met all the criteria of the project. We asked her to present one week before the other participants. Immediately after she presented her study of the environmental influences on leaves changing colors in the fall, the class debriefed how her hypothesis was informed by a model and how her data formed the basis for claims about explanatory processes not directly observable.

Were participants themselves capable of model-based inquiry?

We preface this section with findings from a previous study (Windschitl & Thompson, 2006) in which participants in a similar methods class were not provided the scaffolding of the HPDD framework. Most were unable to construct initial conjectural models suitable for testing. Some, for example, constructed experimental flowcharts rather than representations of the system they were investigating. Only 2 of the 21 participants were able to incorporate evidence into arguments that utilized model-based reasoning.

In the current study, participants’ performances were markedly different. All eighteen participants were able to (1) use background content materials to develop a coherent, conceptually integrated initial model of the phenomenon they planned to study. Twelve of the eighteen participants were additionally able to: (2) incorporate hypothesized causal processes or entities into this model, (3) test hypotheses derived from this model, and (4) analyze and represent data appropriately. Ten participants achieved these benchmarks and were also able to: (5) construct evidence-based scientific arguments that coordinated their data with underlying theoretical mechanisms, and (6) make changes in their models where appropriate (including proposing new hypotheses based on the data collected).

Table 4 summarizes how participants designed and carried out their model-testing projects, including whether participants used theory-directed argument (associated with model-based reasoning) or method-directed argument (associated with relation-based reasoning).

As mentioned earlier, ten of the eighteen participants (labeled as Inquiry Type I) were able to design a fully authentic study, including using both method- and theory-directed arguments. Other participants (labeled Inquiry Type II) had relevant theoretical mechanisms incorporated into their models and actually tested specifically for those theoretical phenomena via observable relationships. However, in making their final claims, they employed method-directed argument, using data only to characterize relationships between observable variables (see Table 4). We note here, however, that we rated their “aim of final arguments” based strictly on the public presentation of claims to the class. This is a highly conservative assessment of their thinking since these participants mentioned during later questioning by the audience precisely how their findings related to underlying processes in their models.

TABLE 4
Inquiry Types Characterized by Nature of Models, Aims of Inquiry, Aims of Final Arguments

Inquiry Type I (Julia, Olivia, Katlin, Jenna, Andrea, Michael, Morgan, Justin, Ethan, Aaron)
Nature of initial model: Theoretical mechanisms incorporated into model; model represents relationships between observable and unobservable processes. Example (Julia): How concentrations of CO2 affect rates of photosynthesis in elodea (aquatic plants).
Aim of inquiry in relation to initial model: Tests for theoretical phenomena in model via observable relationships.
Aim of final arguments: Theory-directed (data used to support proposed theoretical relationships in model) and method-directed (data used to characterize relationships between observable variables). (1) Claim: Low concentrations of CO2 limit photosynthesis because it is necessary for synthesis of glucose; high CO2 also limits because of high pH. (2) Evidence: Graphs show both high and low CO2 linked with low O2 (a by-product of photosynthesis).
Change in original model: Conjectures with models are supported; no substantive changes.

Inquiry Type II (Ella, Maria)
Nature of initial model: Theoretical mechanisms incorporated into model; model represents relationships between observable and unobservable processes. Example (Ella): How ethylene and enzymes control fruit ripening process.
Aim of inquiry in relation to initial model: Tests for theoretical phenomena in model via observable relationships.
Aim of final arguments: Method-directed only (data used to characterize relationships between observable variables). (1) Claim: Presence of ethylene in enclosed spaces does not speed ripening. (2) Evidence: Data doesn’t show higher “ripeness” score for fruits in bags vs. open-air.
Change in original model: No substantive change in response to study.

Inquiry Type III (Allison, Rachel, Nadia)
Nature of initial model: Model contained theoretical processes and entities but none related to target phenomenon in research question. Example (Allison): Any correlations between tree age and onset of leaves dropping in the fall.
Aim of inquiry in relation to initial model: Tests for character of observable relationships identified in model without reference to underlying mechanisms.
Aim of final arguments: Theory-directed (data used to support proposed theoretical relationships not originally in model) and method-directed (data used to characterize relationships between observable variables). (1) Claim: DNA produces proteins to build chlorophyll needed to keep leaves green, tree alive. (2) Evidence: Positive correlation between tree age and onset of leaf drop in fall.
Change in original model: Adds one node for “age-induced DNA degradation,” but no mechanisms. Adds other relevant details to tree physiology.

Inquiry Type IV (Megan, Brooke, Cody)
Nature of initial model: No theoretical mechanisms in model; model represents a relationship between multiple observable variables. Example (Cody): How granule size of CaC2 fuel in miner’s lamp relates to the burn time. Unobservable chemical reactions mentioned, but not incorporated into model.
Aim of inquiry in relation to initial model: Tests for character of observable relationships identified in model.
Aim of final arguments: Method-directed only (data used to characterize relationships between observable variables). (1) Claim: Decreasing CaC2 granule size reduces burn time by increasing reaction rate. (2) Evidence: As finer grain sizes of carbide used in lamp, burn time decreased.
Change in original model: Adds new property of CaC2: total surface area of granules.

The remainder of the participants tested only for observable relationships in their models, but approached their inquiries in different ways. Allison (Inquiry Type III) had developed a model of tree physiology that included theoretical phenomena (production of chlorophyll, for example) but none of these related to the target phenomenon expressed in her research question (“Is there a relationship between tree age and leaf senescence?”). Interestingly, when it came time to discuss her argument, she used data to suggest some theoretical processes around DNA degradation in older trees—a hypothetical process that was not present in her initial model.

Megan, Brooke, and Cody (Inquiry Type IV) did not include theoretical processes in their models. Brooke and Megan had worked together to test how taste buds were distributed on the human tongue. Their initial model was not a set of conceptual relationships, but a spatial distribution of taste-sensitive areas on the tongue. Cody’s model portrayed how samples of calcium carbide (a combustible compound used in miner’s lamps) would burn at different rates depending on its grain size. These individuals all tested for observable relationships and used method-directed arguments, but as with other participants, their understanding of the processes was likely under-estimated by the conservative manner in which we rated their models and final claims.

Final Interviews: Shifts in Thinking About Models and Inquiry

At the end of the six-month course, participants wrote reflections on the model-testing processes and were interviewed about their understandings of the nature and function of models, the model-testing framework for inquiry, and the relevance of models to teaching and learning science. We first present a quantitative summary of the changes in their thinking about models and inquiry (Table 5) and then discuss three major themes that emerged from this data.

Before the course, more than half of participants were rated at level 1 or 2 for understanding the nature of models. By the end of the course, ten participants had improved their understanding and eleven of the seventeen were rated as having expert-level understandings (one participant withdrew because of illness). Before the course, only two participants were rated at the highest level of understanding of the function of models in science (3+); by the end of the study, five participants were given this rating. Of the sixteen participants who did not already have expert level understanding in this area, nine of them improved. Before the course, no participant had an expert level understanding of the role of models in inquiry; by the end of the course, seven provided evidence of an expert level understanding, and all but two participants had improved their understanding. No participant remained at a “Level 1” understanding.

TABLE 5
Pre- to Post-Course Changes in Understanding Nature and Function of Models, and Role of Models in Inquiry

                 Nature               Function             Role in Inquiry
Name         Pre    Post   Net    Pre    Post   Net    Pre    Post   Net
Michael      3      3      —      3+     3+     —      2      3      ↑
Rachel       3      3      —      3      3+     ↑      2      3      ↑
Maria        3      3      —      2–3    3+     ↑      2      3      ↑
Ethan        3      2–3    ↓      2–3    2–3    —      2–3    2–3    —
Ella         3      3      —      2      3      ↑      2–3    3      ↑
Justin       2      3      ↑      3      3      —      2      2–3    ↑
Julia        2      3      ↑      3+     3      ↓      2–3    2–3    —
Morgan       2–3    3      ↑      2–3    3      ↑      2      2–3    ↑
Aaron        3      3      —      1–2    3      ↑      1      1–2    ↑
Cody         1–2    2–3    ↑      3      3+     ↑      2      2–3    ↑
Allison      1–2    3      ↑      3      3      —      2–3    3      ↑
Brooke       2      2      —      2      2      —      1–2    2      ↑
Megan        1–2    2      ↑      2      3      ↑      2      2–3    ↑
Olivia       1–2    2      ↑      2–3    2–3    —      1–2    2–3    ↑
Jenna        1–2    3      ↑      2      2      —      1      3      ↑
Andrea       1–2    3      ↑      1–2    3      ↑      2      3      ↑
Nadia        1–2    2      ↑      2      3+     ↑      1      2–3    ↑

Participants’ final understandings of models appeared to be shaped by instruction in this course, but also by their own intellectual history in particular fields of science and by past research experiences. Julia, who talked about models as “systems of relationships” throughout the course, was engaged to a graduate student who did computer modeling of ocean weather systems. She referred to his constant quest to “identify relevant variables,” “quantify influences,” and “move from simple to complex systems.” Brooke, on the other hand, used her immunology lab experience to talk about models as organisms (such as mice) that stand in for other organisms (such as humans) in order to safely test medical procedures or drugs. Even by the end of the course, Brooke still used this as her primary frame for talking about models, and consequently had difficulty thinking of a model as a set of conceptual relationships. Olivia, who had been a mechanical engineer, could initially only talk about models as physical replicas of larger material systems (model bridges for example) and saw their function as predicting the kinds of behaviors and stresses the “real” system would experience. Unlike Brooke, she broadened her view of models significantly after the second week of class. Megan, who had worked extensively in chemistry labs, had a very sophisticated view of the relationships between structure and function in models, drawing on examples of molecular representations of proteins. However, when it came time for her to consider a model as a set of ideas or conceptual relationships, she expressed a deep confusion that was later explained by a remark during her final interview: “Well, I had one big preconception before this class; I thought models were physical things that represented other physical things. So, the idea of model-testing was impossible to me.”

Returning now to the predominant themes in the reflections and interviews, two patterns demonstrate significant shifts in thinking about how model-based inquiry supports discourses around the testable, revisable, explanatory, conjectural, and generative nature of scientific ideas. The first pattern was that virtually all participants, after conducting their inquiries, talked about model development as a crucial initial phase of inquiry. Mid-way through the first quarter some of them had begun to see the value of creating models in order to provide a broader sense-making representation of the phenomena they were going to study. But it was only during the model-testing project that the value of models was extended to helping them “see” the theoretical relationships that they eventually sought evidence for. Maria, for example, wrote this about her inquiry into chemical reactions in milk:

The skill of model-construction provided the foundation and structure for my thinking and activities. The search for a testable relationship required analysis of and reflection on the model, which resulted in the need to develop a deeper understanding of milk, dishwashing soap and food coloring. . . . Another iteration of model analysis and reflection showed me that I needed a deeper understanding of the properties of the different molecules in the model. Once I understood whether a molecule was hydrophobic or hydrophilic, I was better able to examine the potential interactions of each of the molecules.

Julia wrote similarly about the connections between her growing understanding of models and the clarification of what phenomenon she was actually going to test for:

By far the most difficult step was identifying the process I was interested in and developing a model to frame and organize my thinking. While reading, I drew on various permutations of my model. As I learned more, the mechanisms in which I was interested became evident and I refined my model.

Olivia too suggested that models provided insights for her, and that models were indispensable to authentic science: “If a scientist can define known relationships well, it is easier to add in relationships that are unknown and therefore theoretical. Once these are added the model becomes a tool for thinking.”


A second pattern involved experiencing model-testing as a different orientation to the relationship between scientific ideas, questions, and the use of evidence. Allison, for example, wrote in her reflection:

At first, I did not think that this was an essential part of inquiry, but my own experience has shown me that making a model really helps students think about how different factors are related. Once you start to see these relationships between phenomena, it is much easier to find relevant questions to ask. For inquiry to be authentic, it is not enough to analyze data and say “from my observations, I have come to the conclusions that X causes Y—the end.” Students must also relate it back to the initial research and the model and at the very least say whether or not their results were consistent with what it led them to believe was happening.

Michael, as well, contrasted model-testing favorably against the scientific method:

So the scientific method doesn’t involve the front end in terms of, you have a hypothesis or you have a question but you haven’t looked at the concept and created a map of your understanding to get to a question you’re not, so it’s not connected, the actual process of understanding something, it’s just, well, “I have a question.” A scientist I don’t think would ever do that. And the second thing is the idea that your hypothesis is based on some part of your model and that you’re always looking back at that model. . .

Other participants told us explicitly of fundamental epistemological shifts in thinking about scientific knowledge as a human construction:

Megan: So that was one way that I came to realize that a model could be changed, was looking through that reading.

Interviewer: The paper on Models and Inquiry?

Megan: Yeah, I don’t know, I mean how many different ways you can have a model? I mean I still have those pictures from that paper in my head, it just sort of blew my mind. And then the model-testing project, like the premise of it being that you would change your model in the end, I mean that was the whole basis of the entire thing was to see if your model works or not. [vocal emphasis in original]

Interviewer: So was that something unusual for you?

Megan: Well, yeah, because, well, I think that’s the part about learning about education or maybe learning about teaching that I had no clue about.

Despite the fact that the majority of participants saw the development of models as crucial to inquiry, five of them suggested in the final interviews that the scientific method was still a viable process for inquiry. Cody and Ethan believed that the scientific method was the empirical, data-gathering core of the model-testing process. Cody explained: “Although we were using models, the inquiry was still framed as an experiment to test a model which was done in the process generally accepted as the scientific method.” Cody believed model-testing to be an essential overall approach to inquiry for K–12 students; Ethan, however, maintained that model-testing was too complex and should be attempted only at the end of the school year.

A third theme from final interviews and reflections was that participants had changed little in the way they talked about scientific argument. When asked in the final interview, “What happens in a study after an investigator analyzes the data?” they provided explanations similar to their pre-course answers, expressing a generally unproblematic view of the relationship between data and any claims made. Approximately half of the eighteen participants simply said they would “write up conclusions” or “summarize the findings” or “go on to the next question.” Allison, who had spoken so eloquently about relating one’s conclusions back to the model earlier in the interview, said later that in considering data from a study, a student should “ask themselves what it means” and “use critical thinking” in the process. The language around “claims” and “argument” was not taken up by participants in the way that “models,” “theoretical mechanisms,” or “evidence” had been, despite the fact that in their projects they had struggled with identifying what theoretical processes they were using evidence to make claims about—and, for the most part, doing so successfully.

Post-Script: Trying Out the Language of Modeling in Their Own Classrooms

The following fall we observed participants student teaching in their middle or high school classrooms. We rated from 1–3 how they engaged their own students in activities around the nature and function of models and the role they play in inquiry, basing the ratings on the levels of understanding in Tables 2 and 3. Table 6 shows the range of implementation of the epistemic practices and talk. Participants fell into five groups. Two participants consistently taught about science ideas as models, allowing their students to explore the nature and function of models through class activity (Group 1). Three participants talked about models with students but were inexplicit about the nature and function of models (Group 2). Three participants did not describe ideas as models but did attempt guided versions of model-based inquiry (Group 3). Three participants incorporated the term “model” into their classroom discourse, but only as nominal references applied to science ideas (Group 4). And four participants frequently used representations to help students understand phenomena, but did not recognize these or teach about these as models (Group 5).


TABLE 6
Types of Model-related Instruction by Participants During Teaching Practicums

For each example participant, ratings of understanding/use of models are listed in order: pre-coursework, post-coursework discourse, discourse during student teaching.

I. Consistent explicit teaching of ideas as models (2 student teachers)
Example—Maria: Model testing. Students were given three models that each partially explained why the seasons occur. Students assigned to different groups, made predictions about the seasons using their (limited) models, then did experiments with their models. Students were asked to make claims using evidence, discuss the advantages/limitations of using only one explanatory model.
Maria: Use of models within inquiry, 2 / 3 / 3; Nature of models, 3 / 3 / 3; Function of models, 2–3 / 3+ / 3.

II. Consistent, but inexplicit teaching of ideas as models (3 student teachers)
Example—Julia: Model development post inquiry. Students examined gas production in a water plant under various light conditions to understand photosynthesis and respiration. Although the laboratory only had a few observable components (the presence of oxygen and carbon dioxide) she emphasized unobservable/theoretical components by having students draw a diagram of what was occurring inside the plant. Asked students: “Why would we choose this form of representation?” “Would diagram be same for land as well as water plants?”
Julia: Use of models within inquiry, 2–3 / 2–3 / 3; Nature of models, 2 / 3 / 3; Function of models, 3+ / 3 / 3.

III. Did not describe ideas as models but used features of model-based inquiry (3 student teachers)
Example—Cody: Model development. Students speculated about how and why batteries worked, then did an investigation to examine the relationship between time and energy transfer during the process of recharging batteries. Post inquiry, students drew representations of how battery systems work; Cody pressed them to infer about theoretical components focusing on energy flow and transfer of charge.
Cody: Use of models within inquiry, 2 / 2–3 / 3; Nature of models, 1–2 / 2–3 / 3; Function of models, 3 / 3+ / 2.

IV. Consistent but only nominal use of models (3 student teachers)
Example—Justin: Examining pre-existing models. Students conducted experiments on the wave and particle theories of light. Justin called these “models” but these were not explicitly taught as models. Students examined various representations of the atom, labeled as models, but were not asked to discuss these as models. Justin developed a question about whether or not two atomic models could be used to understand the same phenomena or if models simply replaced one another, however he did not ask students the question.
Justin: Use of models within inquiry, 2 / 2–3 / 1; Nature of models, 2 / 3 / 3; Function of models, 3 / 3 / 2.

V. Using representations but not recognizing or teaching about them as models (4 student teachers)
Example—Jenna: Doing inquiry without modeling talk. Students manipulated food web relationships using a computer simulation but Jenna did not relate this to real-world ecosystems. Students conducted guided independent inquiries into factors affecting germination of seeds with “pocket gardens.” Jenna did not frame the inquiry in terms of models.
Jenna: Use of models within inquiry, 1 / 3 / 1; Nature of models, 1–2 / 3 / 1; Function of models, 2 / 2 / 1.


On one extreme, participants like Maria (Group 1) translated nearly the entire breadth of epistemic practice from the methods course into her classroom, engaging her students in model-testing and linking observations with theoretical explanations. In one example, she gave three different models to students that each potentially explained why the seasons occur: (a) the tilt of the earth and how it affects the number of daylight hours, (b) how the height of the sun above the horizon affects the directness of the sun’s rays, and (c) the changing distance between the sun and the earth during the course of the year. Students were assigned to one of the three model groups and made predictions about the seasons using their (limited) models. They then ran laboratory experiments with their models. In the presentation of their findings they were asked to make a claim about the differential heating of the earth using evidence they gathered from the lab activities, and discuss the limitations of investigating the seasons with a single explanatory model.

Another participant, Jenna (Group 5), represents a contrasting case. In a unit on energy flow through ecosystems she used a computer simulation to model the relationships between organisms in food webs. She had students “trace energy through the program,” and prompted them to “remove an organism from the web, and see what happens.” While engaging students in a form of model-testing, she never talked explicitly with students about the nature or function of models or their use in science. Later in the unit, Jenna had students create “pocket gardens” out of plastic envelopes lined with wet paper towels and a few bean seeds. Initially students simply recorded observations in their science notebooks, but eventually they began to generate questions. The criterion for “good questions,” however, was that they were testable, not that they made sense in terms of any model of germination. Jenna then helped her students design experiments to test hypotheses (about the effect of light, moisture, or temperature on seed germination). These experiments had the potential to shed light on the mechanisms underlying plant development as well as on the use of models to predict outcomes or help generalize to the natural world, but Jenna did not recognize the power of the pocket gardens as models nor did she help students understand the gardens as models. She, like many participants, was unable to “step outside” the content to see how these material activities could help explain, predict, or generalize to a range of phenomena.

DISCUSSION AND CONCLUSIONS

All participants, to varying degrees, appropriated more sophisticated epistemological views of how models, theory, evidence, and argument are used in scientific inquiry. Among the key conceptions developed by participants:

• Models can represent relationships between ideas rather than exclusively portraying things or processes that have an objective reality.



• Models are broadly applicable sensemaking tools that can provide insights into the systemic nature and unobservable aspects of phenomena.

• In authentic science, models are used to generate questions and hypotheses for investigations.

• Model-testing can help generate explanations that link evidence from observations with unobservable explanatory structures and processes.

For a number of participants, these ideas ultimately supported a shift in their goals for scientific investigation—from “proving” a hypothesis, to testing and revising explanatory models.

In accounting for these changes in thinking, it appears that no single experience within the course was unambiguously associated with revisions in participants’ conceptions. Rather, there is some evidence that the strategic combination of mutually reinforcing experiences and the consistent infusion of elements of the HPDD throughout the course provided conditions for change in many participants’ thinking and talk. For example, when participants were asked on the first day of instruction to develop concept maps that included theoretical components (HPDD: problematizing content, taking on various intellectual roles and stances, modeling prototypical cases of disciplinary activity) the result was a productive tension around the idea of “what counts as theoretical” that set the stage for an emerging dialogue throughout the course—not merely about the linguistic conventions of describing what can’t be seen—but about the idea that empirical data can be used to make claims about the unobservable. This dialogue was supported by ideas expressed in the Models and Inquiry paper (HPDD: availability of relevant resources), carried through in the online discussion (HPDD: taking on various intellectual roles and stances) and continued as participants struggled to identify what they were studying in their own inquiries (HPDD: problematizing content, giving students authority, holding students accountable to disciplinary norms).

Similarly, participants’ conceptions of scientific models appeared to develop both through multiple experiences and by persistent applications of the HPDD. The initial whole-class inquiry around fish respiration was designed to model a prototypical case of disciplinary activity and discourse, and at the same time, involve the preservice teachers as legitimate participants (HPDD: giving students authority, holding students accountable to disciplinary norms). The students, however, remained unsure about the functions and nature of what were being referred to as scientific models until they read the Models and Inquiry paper. This conceptual resource appeared to “fill gaps” in students’ thinking about the role of models in our guided inquiry, but also helped them make sense of other forms of scientific knowledge (theories, hypotheses, laws) that had never been discussed as part of scientific epistemology during their undergraduate careers. Following this combination of engaging in guided inquiry and processing the ideas of the paper, participants’ conceptions of both models and inquiry began to shift. The roles of



models were being seen as more than representations; they were being visualized as tools to organize and expand one’s understanding of a phenomenon and as an object of critique whose revision signaled an advance in scientific thinking. However, it was not until participants were faced with developing their own models by purposefully synthesizing ideas from various information resources, with the intention of generating hypotheses, that they realized model-building could help them envision underlying explanatory processes.

We are not suggesting unproblematic causal links between specific features of instruction and changes in participants’ understandings of science. We do, however, believe our system of instructional experiences provided conditions that allowed most participants to restructure thinking and language around key epistemic aspects of model-based inquiry. We note too that the HPDD could not foster change without its principles being instantiated in instructional activities that were shaped by our knowledge of how adult learners think about and respond to inquiry-based science. It was crucial, for example, that we knew from previous studies the range of model-types participants might try to base their own inquiry on (graphs, flowcharts of experimental procedure, etc.) and to instead direct them to develop concept maps representing the target phenomenon itself. We also knew from previous studies that the “credibility” of model-testing as authentic science had to be fostered over time. Our point here is that frameworks like the HPDD can serve only as a general guide to the design of instruction; they must be coordinated with a layer of discursive and material activity that is informed by an understanding of how particular kinds of learners (in this case preservice teachers with significant content knowledge, pre-existing ideas about models and theory, and a history of science inquiry experiences) engage with a specific set of epistemological and methodological ideas (involving the relationships between scientific models, evidence, theory, and argument).

To support the credibility of claims asserting the effectiveness of the HPDD framework we triangulated data from various sources (Denzin, 1978) and considered alternative explanations for our findings. The principal alternative explanation for participants’ changes in discourse is that they learned to reiterate ways of talking about models and MBI that the instructor had marked as important during the course. These reiterations could have masked superficial understandings; that is, participants may have never actually reconceptualized scientific activity in terms of a more sophisticated epistemology. In the post-course interview, for example, it is reasonable to suspect that participants amended their original pre-course interview responses based on what they heard during their course experiences in order to provide the interviewer with more “correct answers.” This, however, seems implausible or at least only a partial explanation, given that several data sources indicated participants did meaningfully integrate ideas from the course into more epistemically authentic frameworks for thinking about models and inquiry. For example, early in the quarter after participants had read the models



paper, they constructed their own imagined pedagogical situations in order to talk about models as: (a) objects of critique and revision, (b) tools to support thinking, and (c) versatile personal constructions whose use signaled membership in a scientific community. One individual, in fact, mentioned that the premise of doing inquiry for the purpose of testing and changing a model had been incomprehensible to her before reading the paper. Later, during participants’ online discussions, they wrote again about models as sense-making tools and as representations that scientists use in authentic forms of inquiry. They also noted that model-testing seemed a more comprehensive and integrated approach to inquiry than the scientific method, suggesting further that model-testing was about evaluating ideas rather than predictions—a conceptual contrast that the instructors had not explicitly mentioned during the course. Additional evidence for their understanding of models was provided in the model-testing project itself, where ten of eighteen participants were able to develop an initial model of a phenomenon that included hypothesized causal mechanisms, test hypotheses derived from their models, construct evidence-based arguments that coordinated data with underlying processes, and make appropriate changes to their models. These performances seemed to reflect an understanding not only of the roles of models in inquiry, but also of how the epistemic characteristics of scientific knowledge (being testable, revisable, conjectural, explanatory, generative) are embodied within the context of inquiry. We also cite as evidence of understanding the nature and function of models the fact that during student teaching a majority of participants encouraged their own pupils to use models in scientifically productive ways. Taken together, these converging data appear to contradict the hypothesis that participants had developed only rote understandings of the role of models in inquiry or that their responses to questions and tasks were merely emulative of the instructor’s talk.

Not all ideas and practices associated with MBI, however, were appropriated by participants. Scientific argument, for example, was rarely incorporated into participants’ day-to-day discourse. Although the majority of participants were eventually able to conduct independent inquiries and construct scientific arguments that coordinated their data with conjectured theoretical mechanisms, the regular use of argument as a specialized form of rhetoric was not evident during the course. To explain this from a linguistic perspective, we know that individuals take up new terms (like “claim” and “argument”) when these come to have meanings that are distinguishable from similar expressions (like “conclusions”) and when new language allows one to express things that the current language will not allow (Edwards & Mercer, 1987; O’Connor & Michaels, 1996; Tomasello, 1999). Throughout the course we were successful in getting participants to take up new kinds of epistemic talk around the use of scientific models, but we never engaged students in the idea of claims and argument by consistently holding them accountable to use this language as a disciplinary norm, nor did we provide regular experiences in taking on various roles and stances around critiquing scientific



rhetoric. We conclude that it was not the HPDD that was ineffective, but rather our application of its principles. As we noted earlier, most participants successfully employed model-based reasoning to develop legitimate claims and arguments in their own inquiries; we believe this was due to explicit guidance in their requirements for the final inquiry presentations (HPDD: providing conceptual resources) coupled with the modeling by one student of her inquiry and final arguments in the days immediately before the rest of the class was to present (HPDD: modeling prototypical cases). From the sociocultural view, this is an example of mastery without appropriation (Herrenkohl & Wertsch, 1999).

The overall storyline of the data in this study suggests that the individual instructional activities leading up to and including the final model-testing project were each necessary—but insufficient as stand-alones—to facilitate conceptual change at a deep and integrated level. This study supports the idea that changing fundamental conceptions about science takes sustained discourse on epistemological ideas (see Smith et al., 2000). In Smith and Wenk’s (2006) examination of college students’ epistemological orientations, they found that individuals have interrelated sets of conceptions that constrain thinking about how science generates new knowledge. They concluded that “if students’ ideas of one aspect of epistemology are coordinated or supported by ideas about other aspects, a great deal of reorganization and reconceptualization would need to occur to move to a more sophisticated stance” (p. 774). Our findings provide empirical bridging between this hypothesis and important questions about why inquiry experiences alone in school science fail to alter students’ epistemological frameworks (Meichtry, 1992; Sandoval & Morrison, 2003; Schwartz, Lederman, & Crawford, 2004). For example, returning to the idea of argument, in this study and a similar study (Windschitl & Thompson, 2006) participants seemed predisposed to rely initially on method-directed argument as a way to culminate their inquiry projects. We see this disposition to use method-directed argument as supported by a tacit but potentially well-connected web of logic. From the standpoint of the inquirer, if the aim of such argument is to characterize relationships between variables, then the “explanations” involved in the argument articulate only “how something happens,” never asserting “why something happens.” If the inquirer then is not concerned with cause, then there is no press at the outset of the investigation to think about models of phenomena in terms of more fundamental (i.e., causal) theoretical processes.
In such cases one can fall back on the school science rhetoric of “conclusions” to describe phenomena exclusively in observable terms and side-step entire discourses around theory, conjecture, and causal explanation. In this study, when several participants were setting up their own model-testing project, they faced incongruity between this school science logic and the logic behind theory-directed argument as advocated in the course. Some participants’ uncertainties around argument led them to reconsider the reasons for data collection, asking essentially, “What am I generating evidence about and for?” and to question the very object of their investigations



asking, “What am I studying?” (a phenomenon? variables? an idea?). Thus, if learners are to shift from method-directed arguments that characterize school science to theory-directed arguments that characterize authentic forms of inquiry, they would have to undergo a radical reorganization of interconnected ideas linking beliefs about method with epistemic frames about the nature of scientific knowledge.

Other findings from this study reveal highly consequential ways that preservice teachers’ undergraduate backgrounds and research experiences shape their thinking about inquiry. First, many participants had conceptions of models that limited or subverted the idea that model development was a form of inquiry. Brooke, for example, from her immunology work, could think of models only as organisms (like mice) that are surrogates for other organisms (like humans) in medical research. Despite the HPDD scaffolding, she and others with similar backgrounds were unable to generate conceptual models and, more importantly, to consider how empirical data help generate claims about theoretical processes. This suggests that if learners do not have at least a mid-level understanding of the nature of models, engaging in some of the core epistemic discourses associated with MBI (around theory, conjecture, explanation) will likely be difficult for them. Second, a small number of participants held to the conception that, although model-testing represented authentic science and was a potentially valuable practice for school-age learners, it constituted a set of activities that were merely appended on the “front-end” of the scientific method. Despite the intensive course discussions and experiences around more advanced ways of thinking about inquiry, for them the scientific method remained positioned as the data-intensive core of disciplinary work.

A final note on our preliminary findings from K–12 classrooms: More than half the participants (8 of the 15 that student-taught) prompted their own students to use models, explicitly or implicitly, to make predictions, connect observations with underlying explanatory processes, and refine scientific ideas. Their pre- and post-course ratings of understanding models were roughly predictive of the degree of sophistication they employed in using model-based instruction with young learners. These results are promising given what we know about the broader literature on instruction by novice teachers. There is substantial evidence, for example, that aspiring teachers in every subject matter area enter teacher education with traditional and often naive conceptions of teaching and of how knowledge in their discipline is generated and evaluated (Ball & McDiarmid, 1990; Calderhead & Robson, 1991; Grossman, Schoenfeld, & Lee, 2005). We also know that even when novice teachers are exposed to powerful conceptual frameworks to help them think about their subject matter, about organizing instruction, and about analyzing classroom events, they often fail to take these up (Bransford & Stein, 1993; Grossman, Valencia, Evans, Thompson, Martin, & Place, 2000), or, when placed in their own classrooms, revert to conservative, teacher-centered instruction over



time (Simmons et al., 1999; Tabachnick & Zeichner, 1999). If our participants are to maintain their trajectory toward meaningful, ambitious, and skilled pedagogy, then the early years of professional practice (induction) must include support that will keep them from abandoning such practices “in favor of what they may perceive as safe, less complex activities” (Feiman-Nemser, 2001, p. 1029).

One way to think about this kind of support is in terms of the pedagogical content knowledge (PCK) needed by teachers to engage their students in modeling discourses. The concept of PCK (Shulman, 1986, 1987) is based on the assumption that teaching is a complex activity requiring the integration of knowledge from various domains including knowledge of subject matter, teaching context, and general pedagogical strategies (Grossman, 1990). PCK refers to teachers’ unique understanding of how to make content comprehensible to others through the use of metaphor, demonstrations, activities, investigations, and examples tailored to meet the needs of particular groups of learners (Zembal-Saul, Blumenfeld, & Krajcik, 2000). Understanding the role of models in science has emerged recently as an important part of PCK. The effective use of scientific models by teachers is now being conceptualized as something more than simply representing to students, in diagrams, charts, or graphs, phenomena perceived to be objective and unproblematic (as characterized by Levels 1 and 2 of Table 2). Rather, part of one’s skill in teaching “representational competence” (Lehrer & Schauble, 2004; Magnussen, Krajcik, & Borko, 1999) is to help learners treat models as objects of critique and revision. Models and modeling also figure prominently in what Davis and Krajcik (2005) refer to as PCK for disciplinary practices. They define this as “knowledge of how to help students understand the authentic activities of the discipline, the ways knowledge is developed in a field, and beliefs that represent a sophisticated understanding of how the field works” (p. 5). Because MBI represents a form of authentic disciplinary practice, particular forms of knowledge about the nature and function of models—described by Schwarz and White (2005) as meta-modeling knowledge—undergird this type of PCK for teachers. Our Level 3 descriptions in Tables 2 and 3, for example, capture such aspects of meta-modeling knowledge.

What our findings add to the literature is an array of specific elements of PCK for disciplinary practice that came to light when we, as methods course instructors, and our participants, as secondary science teachers, attempted to translate knowledge about models and modeling into instructional episodes. We see three sub-categories of PCK for disciplinary practice that are important for introducing any form of model-based inquiry into classrooms. The first is (1) identifying models with pedagogical potential. Our data from participants’ teaching practices reveal that they needed assistance (1a) recognizing scientific ideas as models within different subject matter domains, in part because scientific ideas are rarely portrayed as testable, conjectural, or revisable forms of knowledge by common curricula. More specifically, teachers required help in (1b) selecting models in

Dow

nloa

ded

by [

Uni

vers

ity o

f W

ashi

ngto

n L

ibra

ries

] at

16:

45 0

3 Ju

ly 2

014

NOVICE SCIENCE TEACHERS 365

science that were amenable to MBI. These are models that have at their core anidea with broad applicability in a domain (e.g., wave motion, inheritance, chemicalbonding, homeostasis) and that have comprehensible explanatory underpinningsfor secondary students. The second sub-category is (2) determining how modelscan be used to explore the epistemic features of scientific knowledge; essentiallyidentifying what it means to teach an idea as a model within the context of theircurriculum. This includes (2a) crafting questions and tasks that help students seemodels as testable systems of relationships, and (2b) helping students see the func-tional value of models in the scientific enterprise (e.g., for prediction, insight). Thisknowledge sets the stage for (2c) orchestrating classroom discourse that opens upfor students the explanatory and conjectural nature of scientific models. The thirdsub-category is (3) promoting representational competence in students (Lehrer& Schauble, 2004); that is, (3a) helping them understand how and why one cri-tiques salient features of representations, and (3b) facilitating students’ creationof their own models as tools for understanding scientific ideas. We believe thatthese elements of PCK for disciplinary activity encompass the fundamental un-derstandings and skills necessary to scaffold learners through authentic forms ofMBI. They represent the application of subject matter and disciplinary knowledgeto conditions for learning, and are an appropriate “grain size” to serve as referentsfor instruction in teacher preparation or in professional development. This said,we also believe that explicit examples of these kinds of practices-in-action are yetto be developed, as are discourse tools that can help teachers enter into the kindsof conversations suggested by these new elements of PCK.

In closing, our HPDD framework supported many of the teacher candidates in trying out a new language around investigative science. When our participants take charge of their own classrooms, we will eventually gain a clearer retrospective on how preservice experiences influenced these nascent epistemic discourses. This knowledge will, in turn, help us determine what types of continuing supports are necessary to move novice teachers from mere competence to expertise in "talking and doing science" with young learners.

ACKNOWLEDGMENTS

This research was funded by the Teachers for a New Era Project sponsored by the Carnegie Corporation, Annenberg Foundation, and the Rockefeller Foundation. The opinions expressed within are exclusively those of the authors.

We thank Annemarie Palincsar and several anonymous reviewers for their patient guidance and invaluable insights in helping us craft the final form of this research report.


REFERENCES

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

Abd-El-Khalick, F., & Lederman, N. (2000). The influence of history of science courses on students' views of the nature of science. Journal of Research in Science Teaching, 37, 1057–1095.

Ball, D., & Bass, H. (2001, March). What mathematical language is entailed in teaching children to reason mathematically? In Mathematical Sciences Education Board (Ed.), Knowing and learning mathematics (pp. 26–34). Washington, DC: National Academy Press.

Ball, D. L., & McDiarmid, W. (1990). The subject-matter preparation of teachers. In W. R. Houston (Ed.), Handbook for research on teacher education (pp. 437–449). New York: Macmillan.

Bakhtin, M. (1981). The dialogic imagination: Four essays by M. M. Bakhtin (C. Emerson & M. Holquist, Trans.). Austin: University of Texas Press.

Barsalou, L. W. (1999a). Language comprehension: Archival memory or preparation for situated action. Discourse Processes, 28, 61–80.

Barsalou, L. W. (1999b). Perceptual symbol systems. Behavioral and Brain Sciences, 22, 577–660.

Bowen, G. M., & Roth, W. M. (1998, April). Isolation of variables and enculturation to a reductionist epistemology during ecology lectures. Paper presented at the annual conference of the American Educational Research Association, San Diego, CA.

Bransford, J. D., & Stein, B. S. (1993). The IDEAL problem solver (2nd ed.). New York: Freeman.

Brown, A., Ash, D., Rutherford, M., Nakagawa, K., Gordon, A., & Campione, J. (1993). Distributed expertise in the classroom. In G. Salomon (Ed.), Distributed cognitions. New York: Cambridge University Press.

Calderhead, J., & Robson, M. (1991). Images of teaching: Student teachers' early conceptions of classroom practice. Teaching and Teacher Education, 7, 1–8.

Carey, S., Evans, R., Honda, M., Jay, E., & Unger, C. (1989). "An experiment is when you try it and see if it works": A study of 7th grade students' understanding of the construction of scientific knowledge. International Journal of Science Education, 11, 514–529.

Cartwright, N. (1983). How the laws of physics lie. Oxford: Clarendon Press.

Castanheira, M. L., Crawford, T., Dixon, C. N., & Green, J. L. (2001). Interactional ethnography: An approach to studying the construction of literate practices. Linguistics and Education, 11, 353–400.

Chinn, C., & Malhotra, B. (2002). Epistemologically authentic inquiry in schools: A theoretical framework for evaluating inquiry tasks. Science Education, 86, 175–218.

Cobb, P., Gravemeijer, K., Yackel, E., McClain, K., & Whitenack, J. (1997). Mathematizing and symbolizing: The emergence of chains of signification in one first-grade classroom. In D. Kirschner & J. A. Whitson (Eds.), Situated cognition: Social, semiotic, and psychological perspectives (pp. 151–233). Mahwah, NJ: Lawrence Erlbaum Associates.

Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing and mathematics. In L. B. Resnick (Ed.), Knowing, learning and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Lawrence Erlbaum Associates.

Crawford, B., & Cullin, M. (2004). Supporting prospective teachers' conceptions of modeling in science. International Journal of Science Education, 1379–1401.

Cullin, M. J., & Crawford, B. A. (2004, April). The interplay between prospective science teachers' modeling strategies and understandings. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Vancouver, British Columbia.

Darden, L. (1991). Theory change in science: Strategies from Mendelian genetics. New York: Oxford University Press.


Davis, E., & Krajcik, J. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3–14.

DeJong, O., & van Driel, J. H. (2001). Developing pre-service teachers' content knowledge and PCK of models and modeling. Paper presented at the annual conference of the National Association of Research in Science Teaching, St. Louis, MO.

Denzin, N. (1978). Sociological methods. New York: McGraw-Hill.

Denzin, N., & Lincoln, Y. (2003). The discipline and practice of qualitative research. In N. Denzin & Y. Lincoln (Eds.), The landscape of qualitative research: Theories and issues (pp. 1–45). Thousand Oaks: Sage Publications.

Driver, R., Leach, J., Millar, R., & Scott, P. (1996). Young people's images of science. Buckingham, UK: Open University Press.

Duschl, R., & Grandy, R. (2005, February). Reconsidering the character and role of inquiry in school science: Framing the debates. In R. Duschl & R. Grandy (Eds.), Inquiry Conference on Developing a Consensus Research Agenda (pp. 319). Rutgers University.

Edwards, D., & Mercer, N. (1987). Common knowledge: The development of understanding in the classroom. London: Routledge.

Engle, R., & Conant, F. (2002). Guiding principles for fostering productive disciplinary engagement: Explaining an emergent argument in a community of learners classroom. Cognition & Instruction, 20(4), 399–483.

Erickson, F. (1982). Classroom discourse as improvisation. In L. C. Wilkinson (Ed.), Communication in the classroom (pp. 153–182). New York: Academic Press.

Erickson, F. (1992). Ethnographic microanalysis of interaction. In J. Presseile (Ed.), Handbook of qualitative research in education (pp. 201–225). San Diego, CA: Academic.

Feiman-Nemser, S. (2001). From preparation to practice: Designing a continuum to strengthen and sustain teaching. Teachers College Record, 103(6), 1013–1055.

Gee, J. (2002). Learning in semiotic domains. In D. Schallert, C. Fairbanks, J. Worthy, B. Maloch, & J. Hoffman (Eds.), The 51st yearbook of the National Reading Conference (pp. 23–32). Oak Creek, WI: National Reading Conference, Inc.

Gess-Newsome, J., & Lederman, N. (1993). Pre-service teachers' knowledge structures as a function of professional teacher education: A year-long assessment. Science Education, 77, 25–45.

Giere, R. N. (1988). Explaining science: A cognitive approach. Chicago: University of Chicago Press.

Giere, R. N. (1991). Understanding scientific reasoning (3rd ed.). New York: Harcourt Brace Jovanovich College Publishers.

Glenberg, A. M. (1997). What is memory for? Behavioral and Brain Sciences, 22, 1–55.

Glenberg, A. M., & Robertson, D. A. (1999). Indexical understanding of instructions. Discourse Processes, 28, 1–26.

Goldenberg, C., & Gallimore, R. (1991). Changing teaching takes more than a one-shot workshop. Educational Leadership, 49, 69–72.

Grossman, P. (1990). The making of a teacher: Teacher knowledge and teacher education. New York: Teachers College Press.

Grossman, P., Valencia, S., Evans, K., Thompson, C., Martin, S., & Place, N. (2000). Transitions into teaching: Learning to teach writing in teacher education and beyond. Journal of Literacy Research, 32(4), 631–662.

Grossman, P., Schoenfeld, A., & Lee, C. (2005). Teaching subject matter. In Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 201–231). San Francisco: Jossey-Bass.

Harrison, A. G. (2001, April). Models and PCK: Their relevance for practicing and pre-service teachers. Paper presented at the annual meeting of the National Association of Research in Science Teaching, St. Louis, MO.


Hiebert, J., Carpenter, T. P., Fennema, E., Fuson, K., Human, P., Murray, H., et al. (1996). Problem-solving as a basis for reform in curriculum and instruction: The case of mathematics. Educational Researcher, 25(4), 12–21.

Hempel, C. G. (1966). Philosophy of natural science. Englewood Cliffs, NJ: Prentice Hall.

Henningsen, M., & Stein, M. K. (1997). Mathematical tasks and student cognition: Classroom-based factors that support or inhibit high-level mathematical thinking and reasoning. Journal for Research in Mathematics Education, 28, 524–549.

Herrenkohl, L. R., & Wertsch, J. V. (1999). The use of cultural tools: Mastery and appropriation. In I. Sigel (Ed.), Development of mental representation: Theories and applications (pp. 415–435). Hillsdale, NJ: Lawrence Erlbaum Associates.

Hestenes, D. (1992). Modeling games in the Newtonian world. American Journal of Physics, 60(8), 732–748.

Justi, R., & Gilbert, J. (2002). Science teachers' knowledge about models and attitudes towards the use of models and modeling in learning science. International Journal of Science Education, 24(12), 1273–1292.

King, A. (1994). Inquiry as a tool in critical thinking. In D. F. Halpern (Ed.), Changing college classrooms: New teaching and learning strategies for an increasingly complex world (pp. 13–38). San Francisco: Jossey-Bass.

Kitcher, P. (1993). The advancement of science: Science without legend, objectivity without illusions. New York: Oxford University Press.

Krajcik, J., Blumenfeld, P., Marx, R., & Soloway, E. (2000). Instructional, curricular, and technological supports for inquiry in science classrooms. In J. Minstrell & E. van Zee (Eds.), Inquiring into inquiry learning and teaching in science (pp. 283–315). Washington, DC: American Association for the Advancement of Science.

Knorr-Cetina, K. (1999). Epistemic cultures: How sciences make knowledge. Cambridge, MA: Harvard University Press.

Kress, G. (2000). Multimodality. In B. Cope & M. Kalantzis (Eds.), Multiliteracies: Literacy and learning in the design of social futures (pp. 182–202). London: Routledge.

Kress, G., Jewitt, C., Ogborn, J., & Tsatsarelis, C. (2001). Multimodal teaching and learning: The rhetorics of the science classroom. London: Continuum.

Kuhn, T. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.

Lampert, M. (1990). When the problem is not the question, and the solution is not the answer: Mathematical knowing and teaching. American Educational Research Journal, 27, 29–64.

Latour, B. (1999). Pandora's hope: Essays on the reality of science studies. Cambridge, MA: Harvard University Press.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.

Lehrer, R., & Schauble, L. (2004). Modeling natural variation through distribution. American Educational Research Journal, 41(3), 635–679.

Lehrer, R., & Schauble, L. (2006). Scientific thinking and scientific literacy: Supporting development in learning in context. In W. Damon, R. M. Lerner, K. A. Renninger, & I. E. Sigel (Eds.), Handbook of child psychology (6th ed., Vol. 4). Hoboken, NJ: John Wiley and Sons.

Lemke, J. (1990). Talking science: Language, learning, and values. Norwood, NJ: Ablex.

Lesh, R., Hoover, M., Hole, B., Kelly, A., & Post, T. (2000). Principles for developing thought revealing activities for students and teachers. In A. Kelly & R. Lesh (Eds.), The handbook of research design in mathematics and science education (pp. 591–646). Mahwah, NJ: Lawrence Erlbaum Associates.

Longino, H. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton, NJ: Princeton University Press.


Magnussen, S., Krajcik, J., & Borko, H. (1999). Nature, sources, and development of pedagogical content knowledge for science teaching. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge: The construct and its implications for science education (pp. 95–132). Boston, MA: Kluwer.

Meichtry, Y. J. (1992). Influencing student understanding of the nature of science: Data from a case of curriculum development. Journal of Research in Science Teaching, 29(4), 389–407.

Metcalf, S. J., Krajcik, J., & Soloway, E. (2000). Model-It: A design retrospective. In M. Jacobson & R. B. Kozma (Eds.), Innovations in science and mathematics education: Advanced designs for technologies in learning (pp. 77–116). Mahwah, NJ: Lawrence Erlbaum Associates.

Mitchell, J. C. (1984). Case studies. In R. F. Ellen (Ed.), Ethnographic research: A guide to general conduct (pp. 237–241). Orlando, FL: Academic Press.

Moje, E. B., Collazo, T., Carrillo, R., & Marx, R. (2001). "Maestro, what is 'quality'?": Language, literacy, and discourse in project-based science. Journal of Research in Science Teaching, 38, 469–498.

Morgan, M. S., & Morrison, M. (1999). Models as mediators. Cambridge: Cambridge University Press.

National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.

Nersessian, N. (2002). The cognitive basis of model-based reasoning in science. In P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 17–34). Cambridge: Cambridge University Press.

Nersessian, N. (2005). Interpreting scientific and engineering practices: Integrating the cognitive, social, and cultural dimensions. In M. Gorman, R. D. Tweney, D. Gooding, & A. Kincannon (Eds.), Scientific and technological thinking. Hillsdale, NJ: Lawrence Erlbaum Associates.

O'Connor, M. C., & Michaels, S. (1996). Shifting participant frameworks: Orchestrating thinking practices in group discussion. In D. Hicks (Ed.), Child discourse and social learning (pp. 63–102). Cambridge: Cambridge University Press.

Reinvention Center at Stonybrook. (2001, May). Reinventing undergraduate education: Three years after the Boyer report. Retrieved January 2006, from www.sunysb.edu/reinventioncenter/boyerfollowup.pdf

Resnick, L. B., & Hall, M. W. (2001). The principles of learning: Study tools for educators (version 2.0) [CD-ROM]. Pittsburgh, PA: Institute for Learning, LRDC, University of Pittsburgh.

Roth, W-M. (1995). Authentic school science: Knowing and learning in open-inquiry science laboratories. Boston: Kluwer.

Sandoval, W. (2005). Understanding students' practical epistemologies and their influence on learning through inquiry. Science Education, 89(4), 634–656.

Sandoval, W., & Morrison, K. (2003). High school students' ideas about theories and theory change after a biological inquiry unit. Journal of Research in Science Teaching, 40(4), 369–392.

Schwarz, C., & Gwekwerere, Y. (2007). Using a guided inquiry and modeling framework (EIMA) to support pre-service K-8 science teaching. Science Education, 91(1), 158–186.

Schwarz, C., & White, B. (2005). Meta-modeling knowledge: Developing students' understanding of scientific modeling. Cognition and Instruction, 23(2), 165–205.

Schwartz, R., Lederman, N., & Crawford, B. (2004). Developing views of the nature of science in authentic contexts: An explicit approach to bridging the gap between nature of science and scientific inquiry. Science Education, published online May 12, www.interscience.wiley.com.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14.

Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57, 1–22.

Simmons, P., Emory, A., Carter, T., Coker, T., Finnegan, B., Crockett, D., et al. (1999). Beginning teachers: Beliefs and classroom actions. Journal of Research in Science Teaching, 36(8), 930–954.


Smit, J. J., & Finegold, M. (1995). Models in physics: Perceptions held by final-year pre-service physical science teachers studying at South African universities. International Journal of Science Education, 19, 621–634.

Smith, C. L., Maclin, D., Houghton, C., & Hennessey, M. G. (2000). Sixth-grade students' epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction, 18(3), 349–422.

Smith, C., & Wenk, L. (2006). Relations among three aspects of first-year college students' epistemologies of science. Journal of Research in Science Teaching, 43(8), 747–785.

Sohmer, R. E. (2000). "A page so big not one can fall off": Apprenticeship as the architecture of intersubjectivity in an after school science program for inner city middle school students. Dissertation Abstracts International, 61, 928.

Stewart, J., Hafner, R., Johnson, S., & Finkel, E. (1992). Science as model-building: Computers and high school genetics. Educational Psychologist, 27, 317–336.

Stewart, J., & Rudolph, J. (2001). Considering the nature of scientific problems when designing science curricula. Science Education, 85, 207–222.

Stewart, J., Passmore, C., Cartier, J., Rudolph, J., & Donovan, S. (2005). Modeling for understanding in science education. In T. Romberg, T. Carpenter, & F. Dremock (Eds.), Understanding mathematics and science matters (pp. 159–184). Mahwah, NJ: Lawrence Erlbaum Associates.

Tabachnick, B. R., & Zeichner, K. (1999). Idea and action: Action research and the development of conceptual teaching in science. Science Education, 83(3), 309–322.

Tomasello, M. (1999). The cultural origins of human cognition. Cambridge, MA: Harvard University Press.

Trumbull, D., & Kerr, P. (1993). University researchers' inchoate critiques of science teaching: Implications for the content of pre-service science teacher education. Science Education, 77(3), 301–317.

Van Driel, J. H., & Verloop, N. (2002). Experienced teachers' knowledge of teaching and learning of models and modeling in science education. International Journal of Science Education, 24(12), 1255–1272.

Warren, B., & Rosebery, A. S. (1996). "This question is just too easy!" Students' perspectives on accountability in science. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 97–125). Mahwah, NJ: Lawrence Erlbaum Associates.

Wells, G. (1993). Reevaluating the IRF sequence: A proposal for the articulation of theories of activities and discourse for the analysis of teaching and learning in the classroom. Linguistics and Education, 5, 1–37.

Wenk, L. (2000). Improving science learning: Inquiry-based and traditional first-year college science curricula. Dissertation Abstracts International, 61(10), 3885A. (University Microfilms No. AAT 9988852)

Wenk, L., & Smith, C. (2004, April). The impact of first-year college science courses on epistemological thinking: A comparative study. Paper presented at the annual meeting of the National Association of Research in Science Teaching, Vancouver, BC.

White, B., & Fredericksen, J. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3–118.

Windschitl, M. (2004). Caught in the cycle of reproducing folk theories of "Inquiry": How pre-service teachers continue the discourse and practices of an atheoretical scientific method. Journal of Research in Science Teaching, 41(5), 481–512.

Windschitl, M., & Thompson, J. (2006). Transcending simple forms of school science investigations: Can pre-service instruction foster teachers' understandings of model-based inquiry? American Educational Research Journal, 43(4), 783–835.

Zembal-Saul, C., Blumenfeld, P., & Krajcik, J. (2000). Influence of guided cycles of planning, teaching, and reflection on prospective elementary teachers' science content representations. Journal of Research in Science Teaching, 37(4), 318–339.


APPENDIX A

Pre-Course Interview Protocol

All questions are followed by prompts for elaboration, examples, and clarification.

• What area of science do you want to specialize in when you go into the schools?

Prompt for how they became interested in a specialty: Was it affinity from childhood, influential teacher, recent experience in university coursework, research experience, previous career?

• Can you tell me anything, in addition to your coursework, that's got you involved in science, like research experiences as an undergrad or in a career?

Prompt for features of any research experiences that signal intellectual involvement with posing questions, generating and using evidence—as opposed to "technical assistance" to a researcher or mentor.

• Tell me a little bit about your history of science-related coursework as an undergraduate.

Prompt for nature of lab experiences, number and level of courses in various science domains.

• Can you remember a time when you felt you learned a lot about how science is done?

Prompt: If asked what this means, reference learning how scientists develop questions, make decisions about what to study and how, what the outcomes of "doing science" are. Probe for talk about characteristic practices of the discipline.

• Did you ever have anyone in your coursework talk about investigating a science idea—not the content, but the actual process?

Prompt: Such as in university lectures, was it ever an explicit topic to discuss what counts as a scientific question, hypothesis, what counts as evidence, etc.?

• Did any instructor or teacher ever give you the chance to do your own investigations any time? This includes the span between middle school and the most recent courses you've taken.


Prompt: Has anyone helped you with a guided investigation or a guided inquiry where they might have taken on parts of the process and you participated fully in other aspects?

• • • • •

• When you hear people talk about "advancing science" or "making progress" in science, what does that mean to you?

Probe for whether it is accumulating new facts or is it developing new ways (theory) to think about phenomena?

• When someone uses the term "experiment," what comes into your mind? Can you think of examples?

Prompt: Is experimentation synonymous with scientific investigation?

Prompt for notion of always needing a "controlled randomized experimental design" or are there alternatives?

• What qualities are essential to make something a scientific investigation as opposed to investigations that non-scientists would engage in?

• When scientists go through the process of posing a question and then they design a way to collect data and then they analyze that data, what process follows the analysis of data?

Prompt: If respondent mentions "conclusions," unpack that.

• How do you recognize a scientific argument from other kinds of argument that historians or lawyers might engage in?

• What makes a scientific argument convincing?
• Should creativity play any role in science? If so, what role? If not, why not?
• Have any of your instructors ever talked about scientific theory, what a theory is?

Prompt: If they mention any connections between the scientific method or science advances and theory: What do you see as the connection?

• What would be the difference between a scientist who says, "I have a theory about something," and a person out there on the street somewhere, the average pedestrian, who says "I have a theory" about X?


• Have you ever had any instructor discuss the term "law"? What a law is as opposed to theory?

• • • • •

• How about the term model? Have any of your instructors used the term model? If so, can you elaborate?
• Have you ever used a scientific model? What was the context for that? How did you use it?
• If you had to talk with middle school or high school students about things that scientists make models of, what examples might you give them?
• What about the purpose of creating models?

Prompt if they allude to "real things"—What do you mean by real things?

Prompt for "What does it mean to use a model to explain?" Do you mean to explain to another person?

• When creating a model, what types of things do you have to think about or consider?

Prompt: What kinds of choices do you have to make?

• What is the relationship between a model and the thing that's being modeled?
• Can you have more than one model for the same thing?

Prompt: Can you think of an example where you might have two models for the same thing? Why?

• Is there a way to decide if one model is better than another one?

Prompt: What criteria are used to determine if one model is better than another?

Prompt: What are shortcomings some models might have?

• Would a scientist ever change a model? Why or why not?

Prompt for any other reason than because of new facts coming to light.

• • • • •

• Is teaching about models important in the area of science that you're specializing in?


Prompt: If respondent begins talking about teaching with models, probe whether it is important to teach about models.

• What is it you want your students to understand about the processes of science by the end of your school year with them?

APPENDIX B

Response Prompts for On-Line Web Discussion in Mid-Quarter

Participants are asked to respond to these questions and also to write generous commentary on at least three other classmates' question responses.

Response prompt #1. In the paper we read on models ("Teaching About Science Ideas—As Models"), different parts of the paper may have:

• reinforced what you already understood about ideas around scientific investigations, theories, models
• conflicted with what you understood about these ideas
• confused you about these ideas, or,
• given you an "a-ha" that changed the way you think about these ideas.

Please comment about any of the four kinds of reactions, listed above, that you may have had. Not all (reinforcement, conflict, confusion, "a-ha") may apply.

• • • • •

Response prompt #2. During the first and second class, when we built the model and conducted the fish inquiry, different parts of this activity may have:

• reinforced, conflicted, confused, or given you an "a-ha" about how you understand that investigative science is done.

Please comment about any of the four kinds of reactions, listed above, that you may have had. Not all (reinforcement, conflict, confusion, "a-ha") may apply.

• • • • •

Response prompt #3. During any of our class discussions or in talking with your classmates, different parts of these conversations may have:

• reinforced, conflicted, confused, or given you an "a-ha" about how investigative science is done.


Please comment about any of the four kinds of reactions, listed above, that you may have had. Not all (reinforcement, conflict, confusion, "a-ha") may apply.

APPENDIX C

Post-course Interview Protocol

All questions are followed by prompts for elaboration, examples, and clarification.

• When you hear people talk about "advancing science" or "making progress" in science, what does that mean to you?

Probe for whether it is accumulating new facts or is it developing new ways (theory) to think about phenomena?

• When someone uses the term "experiment," what comes into your mind? Can you think of examples?

Prompt: Is experimentation synonymous with scientific investigation?

Prompt for notion of always needing a "controlled randomized experimental design" or are there alternatives?

• What qualities are essential to make something a scientific investigation as opposed to investigations that non-scientists would engage in?

• When scientists go through the process of posing a question and then they design a way to collect data and then they analyze that data, what process follows the analysis of data?

Prompt: If respondent mentions "conclusions," unpack that.

• How do you recognize a scientific argument from other kinds of argument that historians or lawyers might engage in?

• What makes a scientific argument convincing?
• Should creativity play any role in science? If so, what role? If not, why not?
• What would be the difference between a scientist who says, "I have a theory about something," and a person out there on the street somewhere, the average pedestrian, who says "I have a theory" about X?

• • • • •


� If you had to talk with middle school or high school students about thingsthat scientists make models of, what examples might you give them?

� What about the purpose of creating models?

Prompt if they allude to “real things”—What do you mean by real things?

Prompt for “What does it mean to use a model “to explain?” Do you mean to explainto another person?

� When creating a model, what types of things do you have to think about orconsider?

Prompt: What kinds of choices do you have to make?

• What is the relationship between a model and the thing that's being modeled?

• Can you have more than one model for the same thing?

Prompt: Can you think of an example where you might have two models for the same thing? Why?

• Is there a way to decide if one model is better than another one?

Prompt: What criteria are used to determine if one model is better than another?

Prompt: What are shortcomings some models might have?

• Would a scientist ever change a model? Why or why not?

Prompt for any reason other than new facts coming to light.

• Is teaching about models important in the area of science that you're specializing in?

Prompt: If respondent begins talking about teaching with models, probe whether it is important to teach about models.

• Are there any differences between real science, school science, and the science you did during the model-testing project?

• • • • •

• Is there any role for models in your unit plan? How are they used?



• What is it you want your students to understand about the processes of science by the end of your school year with them?

• How will you know that your students understand how science is done?

APPENDIX D

Student Teaching Protocols, Initial Analysis

Observation Protocol

Context Notes (contextual influences on curricular decisions, e.g., timing, equipment, cooperating teacher)

1. Script all teacher and student talk during the lesson.
2. Add notes re: teacher language around models, evidence, data, claims, arguments, observable/unobservable data, theoretical components, hypotheses/hypothesizing, and highlight the degree of sophistication with which students used this type of talk.
3. Highlight questions the teacher asked and questions students asked (differentiating clarifying and scientific questions: CQ & SQ).

Debrief Lesson with Teacher

1. What did you try that seemed successful? Why would you call it successful?
2. What were your goals for this lesson? (inquiry goals, content goals, skill-based goals)
3. Do you think your students met those goals? What do you think your students were thinking about? What did you hear them talking about?
4. What informed your planning for this lesson? (university course work, CT, text; listen for impact of broader school context and for productive/non-productive conflicts across contexts) (We provided additional prompts not relevant to this study.)

5. How did your students' prior knowledge or their current thinking help you design this lesson?

6. How might you adapt your next lesson based on what you saw today?

Initial Pass at Analysis

1. Did student-teacher talk about/scaffold ideas about evidence and supporting claims or explanations? How? What was the nature of the claims/explanations (i.e., did they stop with description or discuss why/underlying mechanisms)?



2. Did they talk about/scaffold ideas about scientific models/representations? How? Were these nominal references to models or more sophisticated ideas about models? Reference level of nature/function of models.

3. Is there evidence that they used student thinking to adjust instruction?

• 1st, did they provide opportunities to hear students' ideas by eliciting students' ideas or engaging the students in sense-making talk? Describe.

• 2nd, did they use students' words or ideas?

• 3rd, did they modify their instruction or differentiate instruction for some based on how students were learning?

4. Is there evidence of student learning?

• Evidence for how their pupils used evidence to support claims and explanations

• Evidence for how their pupils understood scientific models

• Evidence for how their pupils understood a specific science concept/idea
