
Teaching and Teacher Education 45 (2015) 83–93

Contents lists available at ScienceDirect

Teaching and Teacher Education

journal homepage: www.elsevier.com/locate/tate

Studying teacher noticing: Examining the relationship among pre-service science teachers' ability to attend, analyze and respond to student thinking*

Tara Barnhart*, Elizabeth van Es 1

School of Education, University of California, Irvine, 3200 Education, Irvine, CA 92697-5500, USA

Highlights

• We investigated two cohorts of teacher candidates in a preparation program.
• One cohort was enrolled in a course to develop reflective skills using video.
• Course participants attended to and analyzed student ideas differently.
• Sophisticated analyses and responses required high sophistication in attending.
• Significant relationships existed between attending and analyzing.

Article info

Article history:
Received 12 February 2014
Received in revised form 19 September 2014
Accepted 22 September 2014
Available online

Keywords:
Reflective teaching
Student teaching
Student teachers
Secondary school science
Teacher noticing
Pre-service teacher education

* This research was supported in part by the Knowles Science Teaching Foundation. The opinions expressed are those of the authors and do not necessarily reflect the opinions of the supporting agency.
* Corresponding author. Tel.: +1 562 708 9703.
E-mail addresses: [email protected] (T. Barnhart), [email protected] (E. van Es).
1 Tel.: +1 949 824 7819.

http://dx.doi.org/10.1016/j.tate.2014.09.005
0742-051X/© 2014 Elsevier Ltd. All rights reserved.

Abstract

This study investigates pre-service teachers' capacities to attend to, analyze, and respond to student thinking. Using a performance assessment of teacher competence, we compare two cohorts of science teacher candidates, one that participated in a video-based course designed to develop these skills and one that did not. Course participants demonstrate more sophisticated levels of attention to and analysis of student ideas. Analysis of the relationship among skills reveals that sophisticated analyses and responses to student ideas require high sophistication in attending to student ideas. However, high sophistication in attending to student ideas does not guarantee more sophisticated analyses or responses.

© 2014 Elsevier Ltd. All rights reserved.

1. Introduction

Few factors have a greater impact on learning than the quality of a student's teacher (Darling-Hammond, 2010; Strauss & Sawyer, 1986; Whitehurst, 2002). While there is disagreement about what specific skills or dispositions make one teacher more effective than another (Goldhaber & Anthony, 2004), there is consensus around the importance of teachers being able to critically analyze their practice (Little & Horn, 2007; Windschitl, Thompson, & Braaten, 2011). Teachers who have opportunities to rigorously reflect on their work and connect it to research and theory during their professional preparation are better able to identify and respond to dilemmas of practice, more likely to take an analytic stance toward their work, and demonstrate a willingness to take risks and explore alternative pedagogical approaches (Darling-Hammond & Bransford, 2005; Davis, Petish, & Smithey, 2006; Zeichner & Liston, 1996). Moreover, research on teacher expertise shows that expert teachers can distinguish between important and unimportant information in a complex situation, can reason about what they observe and can use this analysis to make more informed teaching decisions (Berliner, 2001).

Despite its value, building reflective and analytic skills can be challenging, particularly in the context of pre-service teacher education. The fieldwork sites may not promote systematic and rigorous analysis; the preconceptions pre-service teachers bring into the profession can interfere with what they choose to reflect on and how they reason about the effectiveness of their teaching; and pre-service teachers may lack the observation skills and pedagogical content knowledge required for sophisticated analyses of teaching and learning (Borko & Livingston, 1989; Hammerness, Darling-Hammond, & Bransford, 2005; Hiebert, Morris, Berk, & Jansen, 2007; Schoenfeld, 2011; Star & Strickland, 2008).

A further complication is that simply adding requirements to teacher education programs to analyze practice, be it through coursework assignments or high-stakes portfolio assessments, without providing guidance in what should be analyzed, for what purpose, and how, results in superficial learning and may even be mis-educative (Dewey, 1933; Loughran, 2002; Zeichner & Liston, 1996). This is because, without structured support and appropriate framing, pre-service teachers' analyses tend to focus on the actions and behaviors of the teacher rather than student thinking, learning and sense-making (Hammer, 2000; Levin, Hammer, & Coffey, 2009), and tend to be judgmental and lack evidential support and coherence (Davis, 2006; Sandoval, Deneroff, & Franke, 2002).

The purpose of this study is to investigate how a video-based course, Learning to Learn from Teaching (Santagata & van Es, 2010), supported secondary science pre-service teachers in learning to analyze and reflect on teaching and learning in systematic ways. A central component of the course was learning to use evidence of student thinking as it unfolds in the lesson to draw inferences about the effectiveness of instruction and using this analysis to make subsequent pedagogical decisions. This type of analysis requires that pre-service teachers: a) attend to student thinking and learning and the interactions that unfold among students and between teachers and students, b) interpret student understanding from these interactions, and c) decide next steps based on this analysis. Recent research refers to this collection of skills as teacher noticing (Jacobs, Lamb, & Philipp, 2010). Of particular interest in this study is the relationship among these skills. To date, several researchers have investigated these skills in isolation, with some researchers focusing on pre-service teachers' abilities to articulate clear learning goals (Jansen, Bartell, & Berk, 2009; Morris, Hiebert, & Spitzer, 2009), others on their ability to attend to student thinking and learning in a lesson (Levin et al., 2009; Nicol & Crespo, 2004; Ruiz-Primo & Furtak, 2007; Star & Strickland, 2008; van Es & Sherin, 2002; van Zee & Minstrell, 1997), and others on the use of evidence to support claims of teaching effectiveness (Morris, 2006; Santagata & Yeh, 2013; Stockero, 2008). Less attention has been paid to how development of one skill influences development of the others and how they coordinate to construct a coherent analysis of teaching practice. An empirical investigation into this relationship will advance research on the constructs of noticing and analysis of teaching and has implications for the design of teacher education.

While research suggests that pre-service teachers can develop analytic skills in the context of a course where they analyze their own and others' teaching (Pang, 2011; Santagata & Angelici, 2010), few studies have examined whether they draw on these skills when they analyze their own teaching after the conclusion of the course. Thus, the central research questions for this study include: a) Do pre-service teachers who participated in a course designed to scaffold systematic analysis of teaching through video analysis draw on these skills when analyzing their own teaching, as compared to a cohort of teachers who did not participate in the course? and b) How are the skills of systematic analysis of teaching related to each other? To investigate these questions, we compare written analyses of teaching of a cohort of secondary science teacher candidates that participated in the Learning to Learn from Teaching (LLfT) course to one that did not in the context of a performance assessment for teacher credentialing. To be clear, we do not focus on noticing and analysis during instruction. Though research points to the value of preparing beginning teachers to learn to notice and respond during instruction (Kazemi, Lampert, & Franke, 2009; Windschitl, Thompson, Braaten, & Stroupe, 2012), we focus on their written analyses of teaching because research also suggests that teachers who systematically analyze teaching become more adept at responding to student ideas (Windschitl et al., 2011). In addition, we conjectured that the written responses would provide us with access to pre-service teachers' ways of attending, analyzing and responding, not all of which may be possible to observe if we were to analyze videos of instruction.

The questions we investigate are particularly relevant in science education internationally. In the US, proposals for the improvement of science teaching and learning emphasize teaching students how to collect, interpret, and evaluate evidence to formulate scientific explanations (American Association for the Advancement of Science [AAAS], 2009; National Research Council [NRC], 2007, 2012; Sandoval et al., 2002; Windschitl, Thompson, & Braaten, 2008). This is also the case in other countries where science curricula emphasize the importance of developing scientifically literate citizens and is reflected in the highest proficiency levels on the Program for International Student Assessment [PISA] (OECD, 2013; Waddington, Nentwig, & Schanze, 2007). Because the scientific body of knowledge is in constant flux with increasing amounts of information being added or modified (Gleick, 2011), students must be able to ask critical questions and possess appropriate skepticism about proposed explanations and interpretations of scientific phenomena. Moreover, they need to successfully navigate the flood of information to participate knowledgeably in public discussions about science and technology and be sensible consumers of information about science and related issues (NRC, 2012). Thus, learning science is not only about knowing science content, but also involves attending to and reasoning about scientific ideas, generating and testing models of scientific phenomena, and being effective problem solvers (AAAS, 2009; Levin et al., 2009; Ministry of Education-Singapore, 2013; NRC, 2012). To achieve this vision of science education requires that pre-service teachers develop strategies for systematically analyzing their ability to build students' scientific reasoning skills and for assessing students' progress in achieving these goals.

2. Theoretical framework

2.1. The importance of reflection and analysis for learning to teach

This study is framed by research on reflection, teacher noticing and lesson analysis. Dewey (1933) and Schön (1983) each made contributions to the field by defining the phases and mind-set needed for critical reflection. Each believed reflective practitioners should be engaged in ongoing, systematic, rigorous, and disciplined meaning-making with the aim of improving practice. They identified effective reflection as an analytic approach to problem-solving distinct from informal ways of thinking about teaching and instruction in which teachers direct their attention to particular details of practice, make sense or give reason to these details, and use their analysis of these details to develop hypotheses about how to solve dilemmas of classroom practice. In this way, teachers become skilled professionals rather than mindless technicians who consume and implement instruction designed by others.

An important characteristic of these reflective models is the use of evidence from teaching to inform practical theories to test in practice. In Dewey's (1933) perspective, all rigorous reflection is grounded in experience. When an experience is perceived as a basis upon which to build future decisions, it becomes evidence. Similarly, Schön (1983) writes that professionals bring past experience to bear on new situations and problems, allowing them to use experience as evidence to form new theories for future action.

However, research on teacher noticing (Erickson, 2011) suggests that what teachers attend to and use as evidence from their experiences may not provide them with the kind of information they need to draw meaningful inferences about student learning. This is particularly true of inexperienced teachers. Novice teachers typically focus on superficial features of classroom interactions or use student behaviors, such as raising hands enthusiastically, staying on-task, and following classroom routines in an orderly fashion to deduce that students understood the lesson (Carter, Cushing, Sabers, Stein, & Berliner, 1988; Copeland, Birmingham, DeMeulle, D'Emidio-Caston, & Natal, 1994; So, 2012; Star & Strickland, 2008; Star, Lynch, & Perova, 2011). Erickson (2011) also found that novice teachers focused on the learning of the whole class, rather than attending to unique student ideas, making sense of those ideas, and understanding how they fit into achieving the broader learning goal. This results in overgeneralizations about student learning, leaving novices with an inaccurate sense of the effectiveness of the lesson (Lloyd & Mukherjee, 2012; Loughran, 2002; van Es & Sherin, 2002; van Manen, 1977).

Simply encouraging pre-service teachers to pay attention to details of practice is not sufficient to improve practice. Zeichner and Liston (1987), and more recently Levin et al. (2009) contend that a key characteristic of effective reflection is how pre-service teachers frame problems of practice because this influences what details they identify as worthy of attention and how these details will be analyzed and used to construct testable theories about practice. This suggests that pre-service teachers must be provided with tools and frameworks to help guide what they attend to in teaching, how they interpret these events, and how they draw inferences from these experiences to make informed teaching decisions. Providing an organizational frame may permit more sophisticated attention to salient details and enable the transformation of these noticed details into evidence that can be used to inform future instructional decisions (Davis, 2006; Levin et al., 2009; Santagata & Angelici, 2010; Stürmer, Könings, & Seidel, 2012).

2.2. Learning to systematically analyze teaching

Recent research proposes frameworks to support pre-service teachers in developing productive approaches for analyzing teaching (Gelfuso & Dennis, 2014; Hiebert et al., 2007; Santagata & Angelici, 2010; Windschitl et al., 2012). Consistent across these models is a focus on defining a clear learning goal; using evidence from practice, such as student work or video of classroom interactions, to inform analysis; and examining the cause-effect relationship between particular teaching practices and student learning. Teaching is framed as an experiment with the teacher engaged in a methodical approach to examine and learn from practice (Spitzer, Phelps, Beyers, Johnson, & Sieminski, 2011).

We propose that to engage in this kind of systematic analysis requires development of a variety of skills, including attending to what is noteworthy in classroom data, analyzing and interpreting that data with respect to defined goals, and deciding how to respond, what research refers to as teacher noticing (Jacobs et al., 2010; Star & Strickland, 2008; van Es & Sherin, 2002). While pre-service teachers tend to focus their attention on classroom management and tasks, research shows that with structured guidance, they can become better attuned to and inferential about the relationship between teaching and learning (Star et al., 2011; van Es & Sherin, 2002). Other researchers emphasize the importance of using analysis to make evidence-based decisions on how to respond to students' thinking, noting that it can improve with experience and practice (Hiebert et al., 2007; Jacobs et al., 2010; Santagata & Angelici, 2010; Stürmer et al., 2012).

2.3. Relationships among skills for analyzing teaching

Studies of teacher noticing, lesson analysis, and reflection often conceptualize a connection among the elements of attending, analyzing, and responding (Franke & Kazemi, 2001; Levin et al., 2009; Santagata, Zannoni, & Stigler, 2007; Star & Strickland, 2008; van Es & Sherin, 2002, 2008). For example, Davis (2006) examined what aspects of teaching pre-service teachers consider, emphasize, and integrate when they productively reflect on their science teaching. A more integrated reflection was one that made connections between features of instruction that pre-service teachers noticed. Research on lesson analysis implies a relationship between attending and making sense of instruction by examining how pre-service teachers learn to collect and use evidence to draw inferences about teachers' instruction and student learning (Morris, 2006). Jacobs et al. (2010) "envision the existence of a nested relationship among the three component skills such that deciding how to respond on the basis of children's understandings can occur only if teachers interpret children's understanding, and these interpretations can be made only if teachers attend to the details of children's strategies" (p. 197).

Though research implies a relationship among the skills for systematically analyzing teaching, what is less clear is how skills are connected and how the level of sophistication on one skill may or may not relate to the level of sophistication on another. Thus, the purpose of this study is twofold. First, we investigate if participation in a video-based course designed to facilitate systematic analysis of teaching supported secondary science teachers in analyzing their own practice in sophisticated ways as compared to a cohort who did not enroll in the course. By using the term systematic analysis of teaching, we refer to demonstrating greater attention to the particulars of student thinking, using evidence of classroom interactions to draw inferences about student learning, and making instructional decisions based on their attention to and analysis of student thinking. We conjectured that candidates in the course would show higher levels of sophistication on these three skills. We compare the two cohorts' analyses of teaching from the Performance Assessment for California Teachers (PACT). The PACT is one of three state-approved portfolio assessment models pre-service teachers must complete in order to earn state licensure. Because the PACT requires pre-service science candidates to plan, enact, and reflect on an instructional sequence using video, it provides a rich data source that captures the skills of interest in this study.

Second, we examine the relationships among the skills for systematic analysis of teaching across the two cohorts to gain insight into how the skills interact with each other. We hypothesized that significant relationships existed between pairs of skills and that these skills build on each other. Specifically, we conjectured that high sophistication on attending would support higher levels of sophistication on analyzing, and higher sophistication on attending would in turn support higher levels of sophistication on responding. Findings of this analysis will provide greater insight into the construct of teacher noticing, as well as inform pre-service teacher educators' efforts to design instruction to help pre-service teachers develop their ability to systematically analyze and reflect on teaching.

Table 1
Characteristics of LLfT and non-LLfT cohorts.

                                   Non-LLfT           LLfT
Gender
  Male                             3                  4
  Female                           5                  12
Mean age in years                  31.4 (30–37)       30.1 (27–38)
Mean UC G.P.A.                     3.30 (2.23–3.98)   3.36 (2.97–3.97)
Undergraduate major
  Biology                          3                  6
  Chemistry                        4                  1
  Psychology                       1                  3
  Ecology/Environmental Science    0                  4
  Physics                          0                  1
  Earth Science                    0                  1
Mean Inquiry Lesson Score          2.35 (1.25–3)      2.21 (1–3)

Note. Mann–Whitney results for gender U(23) = 56, Z = .46, n.s. For mean age U(23) = 46.5, Z = −1.04, n.s. For mean GPA U(23) = 64, Z = .03, n.s. For Inquiry Lesson Score U(23) = 39.5, Z = −1.18, n.s. All tests were two-tailed. Range is in parentheses. *p < .05.


3. Research design

3.1. Study context

This study takes place in a nine-month, three-quarter teacher credential program at a large, public university in the western United States. It is part of a larger research program that examines how pre-service teacher candidates learn to notice and analyze teaching using video case studies. All candidates complete supervised fieldwork each quarter, primarily observing a classroom teacher in the first quarter and eventually taking on full responsibility as a student teacher by the beginning of the third quarter. They also enroll in courses related to content area methods, general pedagogy, learning theory, language and literacy in the content areas, and equity and diversity. All candidates seeking a secondary teaching credential enrolled in the Learning to Learn from Teaching (LLfT) course in the first quarter of the program during the year of this study. The course draws extensively on best practices for using video cases and uses structured frameworks to scaffold and apprentice teacher candidates into reflective work in an authentic practice-based context (Hiebert et al., 2007; Rodgers, 2002; van Es & Sherin, 2002). In particular, the course intends to develop pre-service teachers' skills in three domains: attention to student thinking; interpretation of student thinking; and planning and enactment of strategies to make student thinking visible (Santagata & van Es, 2010).

The course consists of three main phases (Santagata & van Es, 2010). The first phase introduces students to theories about analytic and responsive teaching practice. The second phase focuses on providing structured practice using frameworks such as those proposed by Hiebert et al. (2007) and Rodgers (2002) to analyze student thinking and strategies that teachers in video case studies employed to promote classroom discourse to make student thinking visible (Blythe, 1997; Hufferd-Ackles, Fuson, & Sherin, 2004; Lemke, 2002; van Zee & Minstrell, 1997). In the final phase, students apply these skills to design, video record, and reflect on their own teaching practice.

3.2. Participants

Participants consisted of two cohorts of secondary pre-service science teachers: one cohort that participated in the LLfT course (n = 16) and one cohort that did not (n = 8). The number of credential program participants doubled between 2007 and 2009 because of a concerted effort to increase program participation. All candidates complete the PACT as part of the credentialing process. When considering age, undergraduate major and grade point averages, there were no noteworthy differences between the two cohorts except for the addition of the LLfT course to the credential program requirements. Using the "Essential Features of Classroom Inquiry and Their Variations" framework from the Inquiry and the National Science Education Standards: A Guide for Teaching and Learning (NRC, 2000, p. 29), we also compared the lessons that were the basis for the PACT videos and responses. Because the objects of the analysis likely informed and influenced what participants had available to attend to, analyze and respond to, we wanted to ensure that there was no qualitative difference between the two cohorts in terms of their lessons. Each lesson plan was given a score of 1–4, with four representing the most student-directed lesson and one the most teacher-directed (see Table 1).

3.3. Data

Data for the study consists of candidates' responses as part of the Performance Assessment for California Teachers (PACT). Modeled after the portfolio assessments of the National Board for Professional Teaching Standards (NBPTS, 2014), the Interstate New Teacher Assessment and Support Consortium (http://www.ccsso.org/Resources/Programs/Interstate_Teacher_Assessment_Consortium_(InTASC).html) (Council of Chief State School Officers, 2014), and the Connecticut State Department of Education (Pecheone & Chung, 2006), secondary science teacher candidates complete a subject-specific PACT Teaching Event at the beginning of the third quarter of study in the credential program. The Teaching Event consists of five tasks centered on developing students' scientific concepts and scientific inquiry: context for learning, planning, instruction, assessment, and reflection. Candidates submit written lesson plans for a three to five day learning segment involving "critical teaching and learning tasks in the credential area," an assessment tool, student work samples, a video of their teaching, and a written reflection (Pecheone & Chung, 2006). These materials are submitted and scored using a published rubric to determine competency to earn a teaching credential.

We focused on the candidates' responses to three questions related to analyzing their videos of teaching that specifically assess the skills of interest in this study. These include: a) In the instruction seen in the clips, how did you further the students' knowledge and skills and engage them intellectually while collecting, analyzing, and interpreting data from a scientific inquiry? Provide examples of both general strategies to address the needs of all of your students and strategies to address specific individual needs; b) Describe the strategies you used to monitor student learning during the learning task shown on the video clips. Cite one or two examples of what students said and/or did in the video clips or in assessments related to the lesson(s) that indicated their progress toward accomplishing the lessons' learning objectives; c) Reflect on the learning that resulted from the experiences featured in the video clips. Explain how, in your subsequent planning and teaching, successes were built upon and missed opportunities were addressed. The authors did not participate in any activities related to supporting candidates in completing the PACT or in the scoring of the assessments. A member of the research team not related to this part of this study de-identified all science candidates' PACT Teaching Event responses of any personal and field-work information prior to analysis.

3.4. Analysis

We adopted a mixed methods approach to our analysis for several reasons (Tashakkori & Teddlie, 2010). Because candidates' written responses were grounded in their experiences, we used qualitative methods to honor the rich and nuanced nature of their ways of viewing and analyzing teaching. This allowed us to construct a framework to characterize variations in the ways that they attended to, analyzed, and responded to instructional interactions (Miles, Huberman, & Saldaña, 2014). At the same time, we used quantitative methods to provide us with analytic power for exploring the relationship among these skills. The result is a valid, reliable, and grounded exploration and explanation of the construct of pre-service teacher noticing and reflection in the context of a credential preparation program.

Data analysis for the first research question took place in several phases. In the first phase, the first author selected a random sample of PACT responses from six candidates, three from each cohort, and used a sentence-by-sentence coding technique to gain insight into the similarities and differences between candidates' noticing and analysis of teaching in this context (Corbin & Strauss, 2008). This analysis was informed by prior research on teacher noticing, lesson analysis, and teacher reflection (Davis, 2006; Hiebert et al., 2007; Levin et al., 2009; Mason, 1998; Santagata, 2011; van Es, 2011; van Es & Sherin, 2002), which highlights the importance of three components: what teachers attend to, how they analyze instruction, and how they choose to respond to students. This literature also draws attention to the importance of specificity of language, linking elements of teaching and learning, and making evidence-based claims about the effectiveness of instruction. From this coding, the first author wrote analytic memos for each of the six cases to capture the focus and content of candidates' attention and analysis (e.g. student thinking, student behavior, teacher behavior, classroom climate, science content), the level of detail used to describe and analyze events, and the relationship among what they observed, their analyses, and their suggestions for improving teaching. When we reviewed these memos we identified three skills, the ways these candidates attended to, analyzed, and planned to respond to instruction, as well as variations in the sophistication of each skill. For the second phase of analysis, we developed a three-level framework to characterize the differences among the six cases for each skill. The resulting framework is displayed in Table 2.

In this framework, attending is concerned with what candidates attended to when they observed their teaching in their PACT response. Analyzing is concerned with their interpretation and sense-making of the events that they highlighted. Responding is concerned with how participants wrote about how they took up and used student ideas to inform their teaching and proposed appropriate next steps in a logical way. In categorizing responses in terms of sophistication, we intended to capture a range of simplistic, generic observations to more robust, substantive analyses.

Table 2
Levels of sophistication for noticing skills.

Attending
  Low sophistication: Highlights classroom events, teacher pedagogy, student behavior, and/or classroom climate. No attention to student thinking.
  Medium sophistication: Highlights student thinking with respect to the collection of data from a scientific inquiry (science procedural focus).
  High sophistication: Highlights student thinking with respect to the collection, analysis, and interpretation of data from a scientific inquiry (science conceptual focus).

Analyzing
  Low sophistication: Little or no sense-making of highlighted events; mostly descriptions. No elaboration or analysis of interactions and classroom events; little or no use of evidence to support claims.
  Medium sophistication: Begins to make sense of highlighted events. Some use of evidence to support claims.
  High sophistication: Consistently makes sense of highlighted events. Consistent use of evidence to support claims.

Responding
  Low sophistication: Does not identify or describe acting on specific student ideas as topics of discussion; offers disconnected or vague ideas of what to do differently next time.
  Medium sophistication: Identifies and describes acting on a specific student idea during the lesson; offers ideas about what to do differently next time.
  High sophistication: Identifies and describes acting on a specific student idea during the lesson and offers specific ideas of what to do differently next time in response to evidence; makes logical connections between teaching and learning.

To further refine the framework, members of the research team who were not affiliated with this particular study applied the framework to the six cases and discussed variations in the ratings. Issues raised in the discussion were used to further clarify and refine the operational definitions of the framework in an iterative fashion (Corbin & Strauss, 2008).

In the third phase of analysis, we applied the framework to code the additional cases (n = 24) to examine any differences that existed between cohorts in the level of sophistication for all three skills (attending, analyzing, and responding). We reviewed the candidates' responses to all three PACT questions and then assigned an overall score for each category (Miles et al., 2014). Sophistication scores were assigned a numerical value to reflect the ordinal nature of the data as follows: low = 1, medium = 2, and high = 3. To ensure inter-rater reliability, members of the research team independently coded a randomly-selected subset of five cases. Inter-rater agreement of 86% was achieved across the three skills (93% for attending, 73% for analyzing, and 93% for responding). We then conducted a Mann–Whitney U test to examine differences between LLfT and non-LLfT participants as compared to each other on each of the three skills (attending, analyzing, and responding) and their combined rank score on the three skills. We used one-tailed tests because prior research suggests that teacher candidates learn to systematically analyze teaching with structured opportunities to learn, like that provided in the LLfT course (Chung & van Es, 2014; Santagata & Angelici, 2010; Star et al., 2011).
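To make this comparison step concrete, the following minimal sketch (written in Python with SciPy; the score lists are hypothetical placeholders, not the study's data) illustrates how a one-tailed Mann–Whitney U test could be applied to ordinal sophistication scores for a single skill.

# Minimal sketch of the cohort comparison described above, assuming SciPy is available.
# The ordinal scores (low = 1, medium = 2, high = 3) are hypothetical placeholders,
# not the scores reported in this study.
from scipy.stats import mannwhitneyu

llft_attending = [3, 3, 2, 3, 2, 1, 3, 3, 2, 3, 1, 3, 2, 3, 3, 3]   # hypothetical LLfT cohort (n = 16)
non_llft_attending = [2, 1, 2, 3, 2, 1, 2, 3]                       # hypothetical non-LLfT cohort (n = 8)

# One-tailed test: is the LLfT distribution shifted toward higher sophistication?
u_stat, p_value = mannwhitneyu(llft_attending, non_llft_attending, alternative="greater")
print(f"U = {u_stat}, one-tailed p = {p_value:.3f}")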

Our second research question examined the relationship among skills. For this analysis, we combined the two cohorts because we needed a wide range of responses to fully explore the construct of noticing and analyzing teaching. The findings from our first analysis revealed differences between the two cohorts in terms of sophistication of analyzing teaching, providing us with a range that would allow us to probe these relationships. Because the data are not normally distributed, the sample size is small, and the data are ordinal rather than categorical, Kendall's Tau values were calculated to determine the strength of the relationship among the skills across all cases (attending × analyzing; attending × responding; analyzing × responding). One-tailed tests were used because the literature suggests that these three skills would be positively correlated.
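As a minimal illustration of this correlational step, the sketch below (again Python with SciPy; the paired scores are hypothetical placeholders) shows how Kendall's tau-b could be computed for one skill pair, with the two-sided p-value halved to obtain a one-tailed value when the observed association is in the hypothesized positive direction.

# Minimal sketch of the skill-pair correlation described above, assuming SciPy is available.
# Each position holds one candidate's ordinal scores (1 = low, 2 = medium, 3 = high);
# the values are hypothetical placeholders, not the study's data.
from scipy.stats import kendalltau

attending = [1, 2, 2, 3, 3, 1, 2, 3, 3, 2, 1, 3, 2, 3, 3, 2, 1, 3, 2, 3, 2, 3, 1, 2]
analyzing = [1, 2, 1, 3, 2, 1, 2, 2, 3, 2, 2, 3, 2, 2, 3, 2, 1, 2, 2, 3, 1, 2, 2, 2]

tau_b, p_two_sided = kendalltau(attending, analyzing)  # tau-b adjusts for tied ranks
# One-tailed p for the directional hypothesis of a positive association,
# appropriate only when the observed tau is positive.
p_one_sided = p_two_sided / 2 if tau_b > 0 else 1 - p_two_sided / 2
print(f"tau-b = {tau_b:.3f}, one-tailed p = {p_one_sided:.3f}")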

4. Results

In the following sections, we present the results for each research question. We first present the comparisons between the two cohorts in terms of their attending, analyzing, and responding. Then, we report on the relationship among learning to attend to the particulars of student thinking and candidates' abilities to analyze and respond to student ideas.


Table 4
Comparison of mean attending, analyzing, responding, and total scores between two cohorts.

Skill          Non-LLfT candidates   LLfT candidates
Attending†     2                     2.5
Analyzing      1.75                  2
Responding†    1.5                   1.9
Total score*   5.3 (4–7)             6.4 (3–9)
N              8                     16

Note. Mann–Whitney results for attending U(23) = 88.0, Z = 1.47, p = .07. For analyzing U(23) = 77.0, Z = .80, p = .21. For responding U(23) = 86, Z = 1.35, p = .10. For total score U(23) = 95.0, Z = 1.90, p = .03. All tests were one-tailed. Range is in parentheses. *p < .05. †p ≤ .10.


4.1. Differences between cohorts' analyses of teaching

When comparing the two different groups, data analysis revealed that LLfT candidates were more sophisticated in attending to specific instances of student thinking, analyzing this evidence, and commenting on adjustments to instruction in response to a student idea as compared to the cohort that did not enroll in the course. Table 3 displays the differences between the two cohorts in terms of level of sophistication in all three skills. We observed the greatest difference between the cohorts in the sophistication of attending to student thinking about the collection, analysis, and interpretation of data from a scientific inquiry.

Looking at the distribution of the level of sophistication for the two cohorts in Table 3, although there are low scoring individuals in both groups, most of the candidates who participated in the course scored in the medium to high range for the three skills, while a greater number of the candidates who did not experience the course scored in the low to medium range for all three skills. Also noteworthy is that only two non-LLfT candidates scored in the high sophistication range for any skill, on the dimension of attending.

In terms of the framework, we observed high levels of sophistication for over 60% of candidates in the video-based course compared to only 25% of the comparison group. More than half of both cohorts' analysis skills were coded as medium sophistication, suggesting they were using some evidence to support the claims they made in their PACT responses when prompted to do so, but only members of the LLfT cohort showed high levels of analysis by consistently citing evidence in their responses. Finally, 75% of the course participants identified and described acting on or proposing a response to a specific student idea, compared to 50% of the comparison group.

The distribution of scores across the three levels of sophistication shown in Table 3 suggests that the candidates who enrolled in the course demonstrated higher levels of sophistication in each skill. To explore this further, an overall mean sophistication score was calculated for both cohorts for each skill by summing each candidate's scores on each skill (low = 1, medium = 2, high = 3) and obtaining the mean score for each cohort. Analysis reveals that the cohorts are marginally statistically significantly different in their demonstration of attending and responding on the PACT (see Table 4).

Though the differences in sophistication for the skill of analyzing did not reach statistical significance, the difference between the two cohorts shows the same pattern as attending and responding, with LLfT candidates demonstrating higher sophistication than those who did not take the course.

Table 3
Comparison of the sophistication of attending, analyzing, and responding to teaching between two cohorts.

Skill                      Non-LLfT participants   LLfT participants
Attending
  Low sophistication       25% (2)                 13% (2)
  Medium sophistication    50% (4)                 25% (4)
  High sophistication      25% (2)                 63% (10)
Analyzing
  Low sophistication       25% (2)                 19% (3)
  Medium sophistication    75% (6)                 63% (10)
  High sophistication      0% (0)                  19% (3)
Responding
  Low sophistication       50% (4)                 25% (4)
  Medium sophistication    50% (4)                 56% (9)
  High sophistication      0% (0)                  19% (3)
N                          8                       16

Note. N values are in parentheses.

When these small but appreciable differences in the sophistication of each skill are aggregated to generate a mean total sophistication score for both cohorts, the LLfT candidates' responses are statistically significantly more sophisticated. The following excerpts from candidate responses typical for each cohort illustrate these differences.

Roger, a non-LLfT candidate, shared a segment from a middle school general science lesson in which he asked students to work in groups to develop a test to determine the salinity of a solution that would yield the "crispiest," or most turgid, potato slice. They were provided materials and equipment such as salt, water, a triple beam balance, beakers, and a graduated cylinder to develop their test and collect data. Roger defined the goal for this lesson for students to design a controlled experiment, collect data, and identify sources of error. When prompted to analyze how he promoted student inquiry through the collection and analysis of data, he wrote the following:

Specifically, the inquiry lab seen in the first clip highly engaged the students as they were allowed to explore a scientific problem using methods they created themselves. They were able to have fun trying to solve a problem on their own, using their own creativity, rather than being told to follow a very structured lab procedure.

Rather than focus on the substance of students' scientific thinking, Roger mentioned the importance of students to "have fun," use their "creativity," and solve a problem "on their own." He uses general language to describe students' participation. He also uses general language in the way he frames his learning goal. Although the goal was mentioned, it was procedural rather than conceptual and not specific to the content of focus. Though concepts of osmosis, diffusion, and the role of the cell wall in plant cells could have been potential topics of investigation, they were not discussed in this lesson or in the subsequent lesson.

Later in the analysis, Roger responded to a prompt to describe what he did to monitor student progress. Roger explained that he conducted a demonstration of one method to determine the "crispiest potato" and charged his students with finding any errors he made during his procedure. He wrote:

During the demonstration, I asked the students to write down the errors that they spotted. At the end of the demonstration, I asked the students to tell me what they noticed. I then used these observations to further discuss the importance of the quality of data in an experiment. I directly questioned the students so that they would make the discovery that all of the errors made had to do with just one potato. I did this after all errors had been reported by asking different students which potato I had made the error with. Each student stated that the potato I had made the error with was potato C. I then related this discovery back to the graph to show how great of an effect the errors had on my data. This questioning helped keep the students engaged and interested in the outcome of the experiment as well as obtain a greater understanding of the learning objective.

Several features stand out in this response. First, it is teacher-centered and focused almost exclusively on the teacher's own actions. Roger makes one mention of an idea generated by the students: they observed "that the potato I had made the error with was potato C," but he does not elaborate on this idea except to say that he linked this response to the bar graph he created to demonstrate the results. He closes his response with a claim that his questioning kept the students "engaged and interested," but does not provide any specific evidence to support this claim. Second, though Roger's lesson is designed to develop students' skills in data collection and he claims that he provides freedom for students to develop their own procedures, it is not in the service of a broader conceptual goal of cellular transport or a "big idea" such as the relationship of structure to function. Third, Roger's attempt to engage students in identifying his errors, in essence figuring out what he already knows in a type of guessing game, is not in the spirit of inquiry as advocated by the Next Generation Science Standards (Achieve, Inc., 2014). In this context, his attention to student ideas is different than attending to students' novel thinking. He is not attempting to uncover and build on student ideas, but rather to lead students to his own "correct" answer. We do not know what meaning Roger draws from his students' answer to know if or how well students understood why the procedures used on potato C changed the results.

In contrast, Walter's PACT response is typical of the LLfT candidates. His Biology students were collecting data to understand the factors that influence Adenosine Triphosphate (ATP) production and the role of oxygen and carbon dioxide in cellular respiration. Walter's students prepared a sample of water and Bromothymol Blue and blew bubbles of carbon dioxide into it for a set amount of time under different conditions (when at rest, after light exercise, and after vigorous exercise). The more acidic the water became from the introduction of the carbon dioxide, the more sodium hydroxide would be required to return it to its original color. A noteworthy aspect of his response is that he highlights two examples of how he helped students deepen their knowledge of the science concept while collecting data. In the first, he describes an interaction with one of the lab groups that is confused as to why their solution requires so much more sodium hydroxide than their classmates' solutions.

The students discussed how it was because the person performing the experiment was a tuba player. The dialogue about the tuba player got the students building on their prior knowledge that all individuals are different and that by being a tuba player you would probably have a greater lung capacity. I then asked the group that if the solution is taking a long time to turn colors is the solution very acidic or basic and the students answered acidic, which I responded by asking, meaning it has a lot of and they answered CO2. Therefore, I am scaffolding the students into understanding the concepts and recognizing the connections of how one piece of information tells us some part of the reaction that is occurring.

Walter's response is different than Roger's in several ways. First, he uses specific, student-centered examples to support his claim that he is helping "further the students' knowledge and skills while collecting data." He provided a detailed example of students attempting to make sense of their lab experience. They expressed confusion as to why their results were taking so long and suggested it might be because one of them was a tuba player and therefore had greater lung capacity. Walter pursued this idea, pushing the students to make the connection between what might be important about lung capacity in relation to the amount of carbon dioxide he introduced into the solution, how that changed the pH of the solution, and why that pH change required more sodium hydroxide to neutralize it.

Not only is Walter specific in his description of the student thinking in this interaction, he is also specific about the science concept he identifies for students: the relationship among carbon dioxide, blood pH, and aerobic capacity. It was not enough that students recognize the relationship but that they understood the biological mechanism that drives the relationship among the three variables. This level of attention in the description and content of student thinking is emblematic of a typical LLfT candidate response.

Another distinction in Walter's PACT response is how he built on students' science thinking during instruction. For example, he wrote about how he pressed the class during their examination of their pooled data displayed on the board. He questioned students about why different groups' data may be different from each other. The students responded that "individuals are different." As in the previous example, he elaborated on his follow-up response to his students' answer to document how he monitored student progress toward the learning goal:

I then furthered the students thinking by asking, "what would be different about people?" The students then came up with responses about different people take in different levels of oxygen, some people are more fit, and different muscle mass. When students would suggest these differences, I would ask the students to expand on their thinking and why it would make a difference. Therefore, the students were building on their knowledge by connecting the content from lecture to the concepts involved in the lab.

As before, Walter makes a claim about what he is attempting to do, in this case drawing students' attention to the connection between physiological differences and the resulting changes in cellular respiration rate, provides a detailed account of what his students say and how he responds to their ideas, and then makes a statement about the significance of the evidence he presented. Additionally, Walter is calling out specific interactions that involve the students' making meaning of a specific science concept, not just the skill of collecting data. The details he includes in his narrative serve as evidence to support a claim he is making about what students learned and how he assisted them in the process. Linking student ideas and instructional actions was more common in the LLfT candidates' responses than the non-LLfT candidates' responses.

These findings suggest that the LLfT candidates were demonstrating higher levels of sophistication on multiple skills in their responses. We then explored the differences between the two cohorts in terms of the co-occurrence of medium and high levels of sophistication.

Table 5 shows that 60% of LLfT participants demonstrated medium or high levels of sophistication in all three skills, compared to 25% of the comparison group. Further, none of the participants in the comparison group demonstrated high levels of sophistication in all three areas and a quarter of this group exhibited low levels of sophistication in multiple skills. The majority of candidates from the course scored at medium and high levels of sophistication in all three skills. In contrast, the majority of the comparison group scored at medium and high levels of sophistication in only two skills combined. This suggests a relationship between the three skills of attending, analyzing and responding, the focus of our second research question. We now turn to present our results in the next section.

Table 5
Comparison of LLfT and non-LLfT participants' sophistication of PACT responses in combination.

                                                    Non-LLfT cohort   LLfT cohort
Low sophistication in all areas                     0                 6% (1)
Medium or high sophistication in one area only      25% (2)           6% (1)
Medium or high sophistication in two areas only     50% (4)           25% (4)
Medium and high sophistication in all three areas   12.5% (1)         44% (7)
Medium in all areas                                 12.5% (1)         6% (1)
High sophistication in all areas                    0                 12.5% (2)
N                                                   8                 16

Note. N values are in parentheses.


4.2. Relationship among attending, analyzing, and responding

To answer the second research question, we explored the relationships among the three skills by examining if sophistication of any one skill related to the sophistication of any other skill. In particular, we hypothesized that high sophistication on attending would support higher levels of sophistication on analyzing, and higher sophistication on attending would in turn support higher levels of sophistication on responding.

To examine these relationships, we explored each pair of skills (attending × analyzing, analyzing × responding, and attending × responding) in turn. Table 6 displays a relationship among sophistication in all three skill pairs.

We anticipated that there would be statistically significant relationships among the levels of sophistication of all skill pairs, that low levels of sophistication in some skills would inhibit candidates from demonstrating high levels of sophistication in others, and that candidates would be more likely to demonstrate high levels of sophistication in the skill of attending compared to analyzing and responding. The results that follow largely support this conjecture.

The first skill pair we analyzed was attending and analyzing. As expected, candidates who demonstrated low sophistication in attending were unlikely to demonstrate high levels of sophistication in analyzing and vice versa. The Kendall's tau B coefficient indicates a positive and highly statistically significant relationship between these two skills, similar to the relationship found in prior research, τB = .387, p = .003 (Santagata, 2011; van Es, 2011). This pairing of attention to student ideas and analysis of what those ideas tell the teacher about student understanding may be an indication that the teacher is exhibiting more selective attention and focusing on significant classroom events rather than attending to a wide range of events that take place in the classroom (van Es & Sherin, 2002).

Table 6
Observed relationship between attending, analyzing, and responding for all participants.

                      Attending (level of sophistication)
                      Low     Medium   High
Analyzing   Low       1       4        0
            Medium    3       3        9
            High      0       1        3
τB = .387, p = .003

                      Responding (level of sophistication)
                      Low     Medium   High
Analyzing   Low       4       1        0
            Medium    3       12       0
            High      1       0        3
τB = .572, p = .001

                      Responding (level of sophistication)
                      Low     Medium   High
Attending   Low       1       3        0
            Medium    4       3        1
            High      3       7        2
τB = .140, p = .397


An example of this is in Walter's response detailed above. He called out specific student ideas about how the amount of carbon dioxide is influenced by differences among people. Rather than accepting "people are different" as an acceptable answer to explain the variation in carbon dioxide production and moving on, he probed students to uncover what his students meant when they said that people were "different." Walter is not just simply seeking "correct" answers; rather he is seeking to more fully understand his students' ideas and the depth of their understanding by asking them to elaborate on their answers. Walter then interpreted what these elaborations tell him about his students' developing understanding about the relationship between cellular respiration rate and physiology.

We now turn to the relationship between analyzing student thinking and responding to student ideas. The Kendall's tau value supports the qualitative analysis that these skills appear to be tightly coupled, τB = .572, p = .001. A positive and highly statistically significant pattern was observed between analyzing and responding in this sample. High sophistication in analyzing tended to co-occur with high sophistication in responding, medium sophistication in analyzing with medium sophistication of responding, and low sophistication in analyzing with low sophistication in responding. Participants were slightly more likely to demonstrate sophistication in analysis than in responding to student ideas. Additionally, there was only one case of highly sophisticated analysis of student ideas without a highly sophisticated response and no cases of highly sophisticated responses with only medium or low levels of sophistication in analyzing.

To score at high levels of sophistication in responding, participants had to propose logical changes to future lessons and document an adjustment they made to instruction in real time in response to a student idea. A medium level of sophistication required that participants propose logical changes to future lessons. To be considered logical, the responses had to be informed by evidence cited in their PACT narrative. An example of the typical logic for proposed adjustments is Walter's response to the analysis of his lesson. He identified a learning goal for his biology students: that they should understand the relationship among exercise, oxygen demand, pulse, and ATP production. This example of homeostasis is an important concept in high school biology. Yet, when prompted to suggest a possible change he would make to this lesson in the future, he mentioned that many students did not write extensive responses to the prompts on their written lab report. He proposed the following: "A change I would make would be to place lines on the lab report pages, which would give the students an idea of how much writing would be required for each question." This proposed adjustment, while related to how students may experience writing a lab report, has little relevance to his stated learning goal and therefore received a lower score for responding. Though it is accompanied by a justification based on what he noticed about his students' writing, it is not a response to students' ideas about homeostasis. Documentation of these types of management and behavioral adjustments was common among all participants in the PACT. They serve as an important reminder of how difficult it can be for pre-service teachers to act upon their analysis of student ideas in the moment by adjusting instruction. In fact, only three of 24 participants scored at high levels of responding.

The final skill pair to be analyzed was attending and responding. Participants were more likely to demonstrate higher levels of sophistication in attending than in responding. Moreover, participants who demonstrated low sophistication in attending were unlikely to demonstrate high levels of sophistication in responding. However, the qualitative examples, combined with the low and non-statistically significant Kendall's tau B coefficient for this skill pair, suggest that these two skills are not meaningfully related to each other, τB = .140, p = .397. Taken together, the results suggest that highly sophisticated analysis depends on highly sophisticated attention to student ideas and that, in turn, highly sophisticated responses, whether proposed or enacted, depend on highly sophisticated analyses of student ideas. Though teachers frequently react to students during instruction, these reactions may be lacking in quality (Scribner, 2003). This suggests that attending without analysis does not typically lead to sophisticated responses to student thinking.
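The Kendall's tau B coefficients in Table 6 can be recovered directly from the three contingency tables. The following is a minimal sketch of our own (not part of the study's analysis), assuming Python with numpy and scipy available: it expands each 3 × 3 table of counts into paired ordinal scores and computes tau-b with scipy.stats.kendalltau. The coefficients reproduce those reported above; the p-values depend on the approximation scipy applies and so may differ slightly from the values reported in the text.

```python
# Sketch: reproduce the Kendall's tau-b coefficients in Table 6 from the counts.
import numpy as np
from scipy.stats import kendalltau


def tau_b_from_counts(counts):
    """Expand a contingency table of counts into paired ordinal scores
    (0 = low, 1 = medium, 2 = high) and return Kendall's tau-b and its p-value."""
    counts = np.asarray(counts)
    row_scores, col_scores = [], []
    for i in range(counts.shape[0]):
        for j in range(counts.shape[1]):
            n = int(counts[i, j])
            row_scores.extend([i] * n)  # level of the row skill for each participant
            col_scores.extend([j] * n)  # level of the column skill for each participant
    return kendalltau(row_scores, col_scores)  # scipy's default variant is tau-b


# Counts from Table 6 (rows = second-named skill, columns = first-named skill).
tables = {
    "attending x analyzing":  [[1, 4, 0], [3, 3, 9], [0, 1, 3]],
    "responding x analyzing": [[4, 1, 0], [3, 12, 0], [1, 0, 3]],
    "responding x attending": [[1, 3, 0], [4, 3, 1], [3, 7, 2]],
}

for name, table in tables.items():
    tau, p = tau_b_from_counts(table)
    print(f"{name}: tau_b = {tau:.3f}, p = {p:.3f}")
# Expected tau_b values: ~.387, ~.572, ~.140 (n = 24 participants in each table).
```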

5. Discussion

We sought to investigate whether a video-based course supported secondary science teacher candidates in learning to systematically analyze teaching by developing their skills at attending, analyzing, and responding, as compared to a cohort of candidates who did not enroll in the course, and to explore the relationship among these three skills as they arose in the candidates' reflections. We discuss our findings below.

With respect to the first research question, this study reveals that the candidates who participated in the video-based course, Learning to Learn from Teaching, demonstrated higher sophistication overall in their attention, analysis, and responses to student thinking. This finding is consistent with other research showing that pre-service teachers can learn to attend to student ideas when instructed to do so (Davis, 2006; Davis & Smithey, 2009; Levin et al., 2009; Star et al., 2011). However, unlike previous studies, we examined candidates' analyses of teaching several months after the conclusion of the course. Despite three months separating completion of the course and completion of the PACT assessment, course participants demonstrated higher levels of sophistication with respect to the three skills, whereas the comparison group demonstrated little to no attention to student thinking despite the explicit prompting of the PACT assessment to do so. Such disciplined awareness is more typical of expert practitioners (Berliner, 2001; Mason, 2002). This finding is encouraging, particularly because prior research suggests that pre-service teachers ignore, discount, or discard what they learn during their preparation programs (Korthagen, 2007; Richardson, 2003), and that their teaching contexts often do not provide the structure and support to promote the enactment of practices they develop in their preparation coursework (Thompson, Windschitl, Braaten, & Stroupe, 2013; Valencia, Martin, Place, & Grossman, 2009).

In addition, though the participants in the course demonstrated greater sophistication overall, they did not consistently demonstrate high levels of sophistication in all three skills in tandem. In fact, many of them demonstrated what has been described as "mixed" noticing (van Es, 2011), with higher levels of sophistication on some dimensions and lower levels on others. One possible explanation is that the course lasted only ten weeks and the reflective skills at the center of the course were not explicitly reinforced in other aspects of the program. Research suggests that coherence in learning goals across pre-service teachers' experiences in a teacher credential program results in greater opportunities to learn core knowledge, skills, and practices for beginning teaching (Darling-Hammond, 2006; McDonald, Kazemi, & Kavanagh, 2013). From this perspective, the influence that participation in the course appears to have had on the candidates' abilities to systematically analyze teaching and learning, despite the limited scope of the intervention, is encouraging.

Our second research question concerned the relationship among the three skills. While research on reflection (Dewey, 1933; Schön, 1983), awareness (Mason, 1998), and noticing (van Es, 2009) suggests that these are recursive processes, our analysis reveals that systematic analyses of instruction may rely on skilled "noticing" (Sherin, Jacobs, & Philipp, 2011). The low occurrence of sophisticated analysis and responding without sophisticated attention suggests that learning to see the important details of student thinking with respect to particular content is a cornerstone for more sophisticated analysis and instructional responses. Prior research finds that it is not a simple matter to support candidates in learning to see the details of student thinking: to learn to identify what about student ideas is or is not important, how students' scientific thinking is evolving, and how students are reasoning and making sense of scientific phenomena, rather than the correctness or accuracy of their ideas (Hiebert et al., 2007; Kazemi & Stipek, 2001; Rodgers, 2002; Thompson, Braaten, & Windschitl, 2009). Thus, preparing pre-service teachers to learn how to attend to student ideas may be a critical prerequisite for the more integrated, disciplined, and effective reflection proposed by research (Davis, 2006; Hiebert et al., 2007; Rodgers, 2002).

We also found that a high level of attention to student ideas does not guarantee high levels of analyzing or responding to those ideas. That is, learning to see the details of student thinking and the details of interactions related to content as they unfold during instruction does not ensure that candidates will take up those ideas and use them as evidence to analyze teaching and learning or to inform instructional responses. The number of candidates scoring at high levels of sophistication was greatest in the skill of attending, then analyzing, with responding having the lowest number of candidates scoring at high levels of sophistication. The distribution of candidates' scores across the low, medium, and high levels of sophistication in the three skills indicates that attending, analyzing, and responding to student ideas may be successively more difficult and complicated skills. It may be that analysis is the bridging skill between attending and responding. Evidence does not speak for itself; it only gains meaning and value when it is "shaped, organized, and thought about" (Earl & Timperley, 2008). Analysis is what gives reason to the other two skills and distinguishes them as part of effective reflection rather than simply seeing and reacting (Rodgers, 2002). Engaging in this sort of interpretation and sense-making, and using what is observed to draw inferences, is challenging even for experienced teachers (Little & Curry, 2008).

Further complicating the relationship among skills is how reflection and analysis are framed for teachers. Reflection must have a purpose to be effective, and that purpose must be framed as a way to problematize teaching (Loughran, 2002; Zeichner, 1987). Levin et al. (2009) found that the focus on covering the curriculum and establishing routines in field sites likely influences whether or not novice teachers attend to student thinking and how teachers respond to student ideas. They suggest that developing teacher candidates' framing of instructional interactions is essential for helping them shift toward a more student-centered, responsive approach to instruction.

While the results of this study are encouraging, we acknowledge several limitations. First, although the findings regarding the differences between the course participants and non-participants are statistically significant and echo similar findings in the literature, the small sample size curtails the broader implications that can be drawn from these results. Second, though the same instructors taught both the 2007 and 2009 courses and no other substantial changes were made to the program between 2007 and 2009, it may be the case that other aspects of the candidates' experience influenced how they came to analyze instruction, such as the extent to which their supervisors and mentor teachers prompted them to reflect in systematic ways, or the content of different coursework may have provided opportunities for more substantive reflection. Understanding how candidates' experiences in the video-based course interact with their experiences in the credential program as a whole is an important subject for future inquiry. Third, we only looked at candidates' proposed instructional responses to student thinking as documented in the PACT, rather than how they actually responded during instruction as captured in the videos of instruction. That said, by analyzing their written responses, we gained insight into how they make sense of what they observe, which is more difficult to capture by analyzing instructional interactions. Analysis of their written responses, coupled with the videos of their teaching, would likely provide a more robust picture of what candidates attend to in teaching and how they respond. It is also the case that we used only one data source, the PACT assessment, to analyze pre-service teachers' reflective capacities. Because of the summative, high-stakes nature of the PACT assessment for teacher credentialing, the candidates may have been less inclined to critically analyze their instruction by focusing on areas of improvement because they wanted to provide positive depictions of their practice. A more robust study would draw on multiple data sources that capture pre-service teachers' reflection on practice. Finally, we only looked at candidates' reflection on and analysis of teaching at one point in time. Further analysis of pre-service teachers' learning to systematically analyze teaching over the course of the credential program, as well as longitudinal studies following pre-service teachers into the early years of teaching, will also yield valuable information about the influence that courses like Learning to Learn from Teaching have on teachers' professional trajectories.

6. Conclusion

Preparing future teachers to enact the rigorous and responsive instruction at the core of recent reforms in science education (Achieve, Inc., 2014) requires that teacher candidates learn to attend to the nuances of student ideas, to interpret what they see and hear, and to determine how best to respond, if they are to be effective, ethical professionals (Ball & Cohen, 1999). Preparing candidates to learn to systematically attend, analyze, and respond is also essential if teacher educators seek to develop teachers who treat teaching as a learning profession, in which they learn in and from their practice over time (Feiman-Nemser, 2001; Hiebert et al., 2007; Lampert, 2010; Zeichner, 1987). Teacher preparation programs must continue to develop these capacities while researchers track the development of these skills beyond teacher education programs and into the induction years of teaching. Such research will provide important insights into how these capacities develop in relation to each other over time and the influence they have on teachers learning in and from their practice.

References

Achieve, Inc. (2014). The next generation science standards. Retrieved May, 2014 from http://www.nextgenscience.org/next-generation-science-standards.
American Association for the Advancement of Science. (2009). Benchmarks for scientific literacy. New York, NY: Oxford University Press.
Ball, D. L., & Cohen, D. K. (1999). Developing practice, developing practitioners: toward a practice-based theory of professional education. In L. Darling-Hammond, & G. Sykes (Eds.), Teaching as the learning profession: Handbook of policy and practice (pp. 3–32). San Francisco, CA: Jossey-Bass.
Borko, H., & Livingston, C. (1989). Cognition and improvisation: differences in mathematics instruction by expert and novice teachers. American Educational Research Journal, 26(4), 473–498.
Berliner, D. C. (2001). Learning about and learning from expert teachers. International Journal of Educational Research, 35(5), 463–482.
Blythe, T. (1997). The teaching for understanding guide. San Francisco, CA: Jossey-Bass.
Carter, K., Cushing, K., Sabers, D., Stein, P., & Berliner, D. (1988). Expert-novice differences in perceiving and processing visual classroom information. Journal of Teacher Education, 39(3), 25–31.
Chung, H. Q., & van Es, E. (2014). Pre-service teachers' use of tools to systematically analyze teaching and learning. Teachers and Teaching: Theory and Practice, 20(2), 113–135.
Copeland, W. D., Birmingham, C., DeMeulle, L., D'Emidio-Caston, M., & Natal, D. (1994). Making meaning in classrooms: an investigation of cognitive processes in aspiring teachers, experienced teachers, and their peers. American Educational Research Journal, 31(1), 166–196.
Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: Sage Publications.
Council of Chief State School Officers (2014). Retrieved May, 2014 from http://www.ccsso.org/Resources/Programs/Interstate_Teacher_Assessment_Consortium_(InTASC).html.
Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher Education, 57(3), 300–314.
Darling-Hammond, L. (2010). Recognizing and developing effective teaching: What policy makers should know and do. Partnership for Teacher Quality Policy Brief, May, 2010. Retrieved January, 2014 from http://www.nea.org/assets/docs/HE/Effective_Teaching_-_Linda_Darling-Hammond.pdf.
Darling-Hammond, L., & Bransford, J. (2005). Preparing teachers for a changing world: What teachers should know and be able to do. San Francisco, CA: Jossey-Bass.
Davis, E. (2006). Characterizing productive reflection among preservice elementary teachers: seeing what matters. Teaching and Teacher Education, 22(3), 281–301.
Davis, E., Petish, D., & Smithey, J. (2006). Challenges new science teachers face. Review of Educational Research, 76(4), 607–651.
Davis, E., & Smithey, J. (2009). Beginning teachers moving toward effective elementary science teaching. Science Education, 93(4), 745–770.
Dewey, J. (1933). How we think. Buffalo, NY: Prometheus Books.
Earl, L. M., & Timperley, H. (2008). Understanding how evidence and learning conversations work. In L. M. Earl, & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 1–12). New York, NY: Springer.
Erickson, F. (2011). On noticing teacher noticing. In M. Sherin, V. Jacobs, & R. Philipp (Eds.), Mathematics teacher noticing: Seeing through teachers' eyes (pp. 17–34). New York, NY: Routledge.
Feiman-Nemser, S. (2001). From preparation to practice: Designing a continuum to strengthen and sustain teaching. The Teachers College Record, 103(6), 1013–1055.
Franke, M., & Kazemi, E. (2001). Learning to teach mathematics: focus on student thinking. Theory Into Practice, 40(2), 102–109.
Gelfuso, A., & Dennis, D. (2014). Getting reflection off the page: the challenges of developing support structures for pre-service teacher reflection. Teaching and Teacher Education, 38(1), 1–11.
Goldhaber, D., & Anthony, E. (March 8, 2004). Can teacher quality be effectively assessed? The Urban Institute. Retrieved February, 2014 from http://www.urban.org/UpLoadedPDF/410958_NBPTSOutcomes.pdf.
Gleick, J. (2011). The information: A history, a theory, a flood. New York, NY: Harper Collins.
Hammer, D. (2000). Student resources for learning introductory physics. American Journal of Physics, Physics Education Research Supplement, 68(S1), S52–S59.
Hammerness, K., Darling-Hammond, L., & Bransford, J. (2005). How teachers learn and develop. In L. Darling-Hammond, & J. Bransford (Eds.), Preparing teachers for a changing world: What teachers should know and be able to do. San Francisco, CA: Jossey-Bass.
Hiebert, J., Morris, A., Berk, D., & Jansen, A. (2007). Preparing teachers to learn from teaching. Journal of Teacher Education, 58(1), 47–61.
Hufferd-Ackles, K., Fuson, K. C., & Sherin, M. G. (2004). Describing levels and components of a math-talk learning community. Journal for Research in Mathematics Education, 35(2), 81–116.
Jacobs, V. R., Lamb, L. L., & Philipp, R. A. (2010). Professional noticing of children's mathematical thinking. Journal for Research in Mathematics Education, 41(2), 169–202.
Jansen, A., Bartell, T., & Berk, D. (2009). The role of learning goals in building a knowledge base for elementary mathematics teacher education. The Elementary School Journal, 109(5), 525–536.
Kazemi, E., Lampert, M., & Franke, M. (2009, July). Developing pedagogies in teacher education to support novice teacher's ability to enact ambitious instruction. In R. Hunter, B. Bicknell, & T. Burgess (Eds.), Crossing divides: Proceedings of the 32nd annual conference of the Mathematics Education Research Group of Australasia (Vol. 1, pp. 12–30). Palmerston North, NZ: MERGA.
Kazemi, E., & Stipek, D. (2001). Promoting conceptual thinking in four upper-elementary mathematics classrooms. The Elementary School Journal, 102(1), 59–80.
Korthagen, F. (2007). The gap between research and practice revisited. Educational Research and Evaluation, 13(3), 303–310.
Lampert, M. (2010). Learning teaching in, from, and for practice: what do we mean? Journal of Teacher Education, 61(1–2), 21–34.
Lemke, J. (2002). Discursive technologies and the social organization of meaning. Folia Linguistica, 35(1–2), 79–96.
Levin, D. M., Hammer, D., & Coffey, J. E. (2009). Novice teachers' attention to student thinking. Journal of Teacher Education, 60(2), 142–154.


Little, J. W., & Curry, M. W. (2008). Structuring talk about teaching and learning: the use of evidence in protocol-based conversation. In L. M. Earl, & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 29–42). New York, NY: Springer.
Little, J. W., & Horn, I. S. (2007). 'Normalizing' problems of practice: Converting routine conversation into a resource for learning in professional communities. In L. Stoll, & K. S. Louis (Eds.), Professional learning communities: Divergence, detail and difficulties (pp. 79–92). Maidenhead, UK: Open University Press.
Loughran, J. (2002). Effective reflective practice: In search of meaning in learning about teaching. Journal of Teacher Education, 53(1), 33–43.
Lloyd, M. M., & Mukherjee, M. (2012, October). Tell me what you see: Pre-service teachers' recognition of exemplary digital pedagogy. Paper presented at Australian Computers in Education Conference, Perth, Australia.
Mason, J. (1998). Enabling teachers to be real teachers: Necessary levels of awareness and structure of attention. Journal of Mathematics Teacher Education, 1, 243–267.
Mason, J. (2002). Researching your own practice: The discipline of noticing. London: Routledge Falmer.
McDonald, M., Kazemi, E., & Kavanagh, S. S. (2013). Core practices and pedagogies of teacher education: a call for a common language and collective activity. Journal of Teacher Education, 64(5), 378–386.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). Thousand Oaks, CA: Sage Publications.
Ministry of Education-Singapore. (2013). Science syllabus: Primary. Retrieved February, 2014 from www.moe.gov.sg/education/syllabuses/sciences/files/science-primary-2014.pdf.
Morris, A. K. (2006). Assessing pre-service teachers' skills for analyzing teaching. Journal of Mathematics Teacher Education, 9(5), 471–505.
Morris, A. K., Hiebert, J., & Spitzer, S. M. (2009). Mathematical knowledge for teaching in planning and evaluating instruction: what can preservice teachers learn? Journal for Research in Mathematics Education, 40(5), 491–529.
National Board for Professional Teaching Standards (2014). Retrieved May, 2014 from http://www.nbpts.org/mission-history.
National Research Council. (2000). Inquiry and the National Science Education Standards: A guide for teaching and learning. Committee on Development of an Addendum to the National Science Education Standards on Scientific Inquiry, Center for Science, Mathematics, and Engineering Education. The National Academies Press. Retrieved February, 2014 from http://www.nap.edu/download.php?record_id=9596.
National Research Council. (2007). Taking science to school: Learning and teaching science in Grades K-8. Committee on Science Learning, Kindergarten through Eighth Grade. In R. A. Duschl, H. A. Schweingruber, & A. W. Shouse (Eds.), Board on Science Education, Center for Education, Division of Behavioral and Social Sciences and Education. The National Academies Press. Retrieved February, 2014 from http://www.nap.edu/download.php?record_id=11625.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Committee on a Conceptual Framework for New K-12 Science Education Standards. Board on Science Education, Division of Behavioral and Social Sciences and Education. National Academies Press. Retrieved February, 2014 from http://www.nap.edu/download.php?record_id=13165.
Nicol, C., & Crespo, S. (2004). Learning to see in mathematics classrooms. In Proceedings of the 28th Conference of the International (Vol. 3, pp. 417–424). Retrieved February, 2014 from ftp://134.76.12.4/pub/EMIS/proceedings/PME28/RR/RR263_Nicol.pdf.
OECD. (2013). PISA 2012 results: What students know and can do – Student performance in mathematics, reading, and science (Vol. 1). Retrieved February, 2014 from http://dx.doi.org/10.1787/9789264201118-en.
Pang, J. (2011). Case-based pedagogy for prospective teachers to learn how to teach elementary mathematics in Korea. ZDM, 43(6–7), 777–789.
Pecheone, R. L., & Chung, R. R. (2006). Evidence in teacher education: the Performance Assessment for California Teachers (PACT). Journal of Teacher Education, 57(1), 22–36.
Richardson, V. (2003). Preservice teachers' beliefs. Advances in Teacher Education, 6, 1–22.
Rodgers, C. (2002). Seeing student learning: teacher change and the role of reflection. Harvard Educational Review, 72(2), 230–253.
Ruiz-Primo, M. A., & Furtak, E. M. (2007). Exploring teachers' informal formative assessment practices and students' understanding in the context of scientific inquiry. Journal of Research in Science Teaching, 44(1), 57–84.
Sandoval, W. A., Deneroff, V., & Franke, M. L. (2002, April). Teaching, as learning, as inquiry: Moving beyond activity in the analysis of teaching practice. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Santagata, R. (2011). From teacher noticing to a framework for analyzing and improving classroom lessons. In M. Sherin, V. Jacobs, & R. Philipp (Eds.), Mathematics teacher noticing: Seeing through teachers' eyes (pp. 152–168). New York, NY: Routledge.
Santagata, R., & Angelici, G. (2010). Studying the impact of the lesson analysis framework on preservice teachers' abilities to reflect on videos of classroom teaching. Journal of Teacher Education, 61(4), 339–349.
Santagata, R., & van Es, E. A. (2010). Disciplined analysis of mathematics teaching as a routine of practice. In J. Luebeck, & J. W. Lott (Eds.), Association for Mathematics Teacher Education Monograph Series (Vol. 7, pp. 109–123). San Diego: Association for Mathematics Teacher Educators.
Santagata, R., & Yeh, C. (2013). Learning to teach mathematics and to analyze teaching effectiveness: evidence from a video- and practice-based approach. Journal of Mathematics Teacher Education, December, 1–24.
Santagata, R., Zannoni, C., & Stigler, J. W. (2007). The role of lesson analysis in pre-service teacher education: an empirical investigation of teacher learning from a virtual video-based field experience. Journal of Mathematics Teacher Education, 10(2), 123–140.
Schoenfeld, A. H. (2011). Toward professional development for teachers grounded in a theory of decision making. ZDM, 43(4), 457–469.
Schön, D. A. (1983). The reflective practitioner. New York, NY: Basic Books.
Scribner, J. P. (2003, April). Do-it-yourselfers or engineers? Bricolage as a metaphor for teacher work and learning. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Sherin, M., Jacobs, V., & Philipp, R. (Eds.). (2011). Mathematics teacher noticing: Seeing through teachers' eyes. New York, NY: Routledge.
So, W. W.-M. (2012). Quality of learning outcomes in an online video-based learning community: potential and challenges for student teachers. Asia-Pacific Journal of Teacher Education, 40(2), 143–158.
Spitzer, S. M., Phelps, C. M., Beyers, J. E., Johnson, D. Y., & Sieminski, E. M. (2011). Developing prospective elementary teachers' abilities to identify evidence of student mathematical achievement. Journal of Mathematics Teacher Education, 14(1), 67–87.
Star, J., Lynch, K., & Perova, N. (2011). Using video to improve preservice mathematics teachers' abilities to attend to classroom features. In M. Sherin, V. Jacobs, & R. Philipp (Eds.), Mathematics teacher noticing: Seeing through teachers' eyes (pp. 117–133). New York, NY: Routledge.
Star, J., & Strickland, S. (2008). Learning to observe: using video to improve preservice mathematics teachers' ability to notice. Journal of Mathematics Teacher Education, 11(2), 107–125.
Stockero, S. (2008). Using a video-based curriculum to develop a reflective stance in prospective mathematics teachers. Journal of Mathematics Teacher Education, 11(5), 373–394.
Strauss, R. P., & Sawyer, E. A. (1986). Some new evidence on teacher and student competencies. Economics of Education Review, 5(1), 41–48.
Stürmer, K., Könings, K. D., & Seidel, T. (2012). Declarative knowledge and professional vision in teacher education: effect of courses in teaching and learning. British Journal of Educational Psychology, 83, 467–483.
Tashakkori, A., & Teddlie, C. (Eds.). (2010). Sage handbook of mixed methods in social & behavioral research. Thousand Oaks, CA: Sage Publications.
Thompson, J., Braaten, M., & Windschitl, M. (2009, June). Learning progression as vision tools for advancing novice teachers' pedagogical performance. Paper presented at Learning Progressions in Science (LeaPS) conference, Iowa City, IA.
Thompson, J., Windschitl, M., Braaten, M., & Stroupe, D. (2013). Developing a theory of ambitious early-career teacher practice. American Educational Research Journal, 50(3), 574–615.
van Manen, M. (1977). Linking ways of knowing with ways of being practical. Curriculum Inquiry, 6(3), 205–228.
van Zee, E., & Minstrell, J. (1997). Using questioning to guide student thinking. Journal of the Learning Sciences, 6(2), 227–269.
van Es, E., & Sherin, M. (2002). Learning to notice: Scaffolding new teachers' interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.
van Es, E., & Sherin, M. (2008). Mathematics teachers' "learning to notice" in the context of a video club. Teaching and Teacher Education: An International Journal of Research and Studies, 24(2), 244–276.
van Es, E. A. (2009). Participants' roles in the context of a video club. Journal of the Learning Sciences, 18(1), 100–137.
van Es, E. (2011). A framework for learning to notice student thinking. In M. Sherin, V. Jacobs, & R. Philipp (Eds.), Mathematics teacher noticing: Seeing through teachers' eyes (pp. 134–151). New York, NY: Routledge.
Valencia, S. W., Martin, S. D., Place, N. A., & Grossman, P. (2009). Complex interaction in student teaching: Lost opportunities for learning. Journal of Teacher Education, 60, 304–322.
Waddington, D., Nentwig, P., & Schanze, S. (2007). Standards in science education: Making it comparable. Munster, Germany: Waxman Publishing.
Whitehurst, G. (2002). Scientifically-based research on teacher quality: research on teacher preparation and professional development. In Proceedings of the White House conference on preparing tomorrow's teachers. Retrieved February, 2014 from http://tepserver.ucsd.edu.
Windschitl, M., Thompson, J., & Braaten, M. (2008). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941–967.
Windschitl, M., Thompson, J., & Braaten, M. (2011). Ambitious pedagogy by novice teachers: who benefits from tool-supported collaborative inquiry into practice and why? Teachers College Record, 113(7), 1311–1360.
Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
Zeichner, K. (1987). Preparing reflective teachers: an overview of instructional strategies which have been employed in preservice teacher education. International Journal of Educational Research, 11(5), 565–575.
Zeichner, K., & Liston, D. (1987). Teaching student teachers to reflect. Harvard Educational Review, 57(1), 23–49.
Zeichner, K., & Liston, D. (1996). Reflective teaching: An introduction. Mahwah, NJ: Lawrence Erlbaum Associates.