
RELC Journal 1–18
© The Author(s) 2014
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0033688214546962
rel.sagepub.com

Exploring Teacher Questioning as a Formative Assessment Strategy

Yan Jiang
Faculty of Education, The University of Hong Kong, Hong Kong

Abstract
This study explored teacher questioning as a formative assessment strategy by examining the practices of teachers of English as a Foreign Language in Chinese tertiary institutions. It investigated how teachers deployed questions to stimulate student thinking, uncover students’ current level of learning, and allow responses to inform pedagogic decisions. The research methods were classroom observations and interviews. This paper highlights the practice of one experienced teacher who conducted quality questioning to gauge and facilitate learning. The paper provides practical insights into how questioning can be developed as a formative assessment method and recommends equipping teachers with further knowledge and skills to carry out effective questioning.

Keywords
Formative assessment, teacher questioning, teaching and learning

Introduction

Assessment plays a central role in teaching and learning, and its formative function has received steady consideration since Black and Wiliam’s (1998a) influential work demonstrated that formative assessment can be a powerful way to enhance student learning in general education contexts. In English language education, however, there have been relatively few empirical investigations of formative assessment in the classroom (Carless, 2011). The essential issue remains: how can the learning potential of formative assessment be realized?

Teacher questioning, a widely used educational technique, has the potential to promote learning by allowing teachers to gather important information about the current state of student knowledge. This, in turn, permits instructors to make better pedagogical decisions (Black et al., 2003).

Corresponding author:
Yan Jiang, Faculty of Education, The University of Hong Kong, 102 Hui Oi Chow Science Building, 999077, Hong Kong. Email: [email protected]


In the field of formative assessment, a considerable amount of research has been conducted on strategies such as peer/self-assessment and sharing assessment criteria with learners. However, relatively few studies have focused on teacher questioning (e.g. Black et al., 2003; Ruiz-Primo and Furtak, 2006, 2007). In the field of second language (L2) classroom interaction, researchers have examined questioning practice in depth (e.g. Long and Sato, 1983; Tan, 2007), whereas questioning, with specific reference to its use as an assessment tool, has remained relatively underexplored.

This study bridges the fields of formative assessment and L2 classroom interaction by exploring teacher questioning as an assessment strategy. This paper is drawn from a wider study examining the practices of six teachers of English as a Foreign Language (EFL) in two Chinese universities. To illustrate how teacher questions might be used to gauge and promote learning, it sketches the main trends in these teachers’ practices and reports in detail on one skillful teacher whose questioning showed the most potential to enhance learning. It is hoped that this classroom-based research may contribute to the knowledge base by providing insights into how questioning can be developed as an assessment tool. It also offers concrete suggestions on how to improve questioning skills.

Literature Review

The framework of this study draws on findings from the fields of formative assessment and L2 classroom interaction. This section clarifies the relationship between formative assessment and classroom teaching and reviews studies on questioning from both areas.

Formative Assessment and Classroom Teaching

Formative assessment is a term open to different interpretations. According to Black and Wiliam (2009), assessment is formative when ‘evidence about student achievement is elicited, interpreted, and used by teachers, learners, or their peers, to make decisions about the next steps in instruction that are likely to be better, or better founded, than the decisions they would have taken in the absence of the evidence that was elicited’ (2009: 9). Seen in this light, the priority of formative assessment is to enhance instruction and/or promote learning by following the procedure of eliciting, interpreting, and using evidence.

Formative assessment takes various forms and ‘may be plotted at different points along a more “formal” to “informal” continuum’ (Rea-Dickins, 2001: 437). That is, both formal preplanned tasks such as classroom quizzes and informal ad hoc activities like teacher questioning can be seen as different versions of formative assessment.

Researchers generally agree that formative assessment and classroom teaching are interrelated. Rea-Dickins (2001) asserts that assessment strategies, especially informal ones, are routinely embedded within good classroom practice. Carless (2011) echoes this view by illustrating key assessment strategies reflecting characteristics of good teaching. In addition, Leung (2005) points out that formative assessment occurs spontaneously during ordinary instruction even if teachers may consider themselves to be teaching rather than assessing.

It can be inferred from the above that formative assessment is an inseparable part of effective teaching: the two share the same goal of enhancing learning. Despite the similarities, formative assessment requires the teacher to seek, interpret, and use evidence about student learning, whereas good teaching does not necessarily follow this procedure.

Questioning as a Formative Assessment Strategy

Classroom questioning has a typical sequence: teacher initiation, student response, and teacher feedback/evaluation (IRF/E) (Mehan, 1979). Formative assessment also involves three steps: eliciting, interpreting, and using the evidence (Black and Wiliam, 2009). It is worth noting that questioning may not be an assessment tool in all situations. For example, when it is adopted to develop student interest rather than to check learning, questioning is a teaching technique and not an assessment tool. Similarly, even when questioning is aimed at diagnosing learning, it would be inapposite to label it a formative assessment strategy if follow-up actions are not taken to facilitate learning.

Thus, to develop questioning as a formative assessment tool, there is a need to go beyond the standard IRF. First, the questions posed should be critical to the development of students’ understanding (Black et al., 2003). Second, the responses elicited should represent student thinking so as to facilitate teachers’ subsequent decision making. Third, the follow-up actions teachers take should be meaningful interventions which move learners towards their learning goals (Hill and McNamara, 2012). In brief, to explore questioning as an assessment tool, we need to examine the entire process of questioning and make sure that each stage serves the learning purpose.

In the formative assessment area, Black, Harrison, Lee, Marshall, and Wiliam (2003) have demonstrated how questioning can be used as an assessment strategy in content classrooms. For example, in their study, much time was spent framing quality questions, group discussions were conducted to allow deep thinking, and rich follow-up activities created further learning opportunities. Other researchers (e.g. Ruiz-Primo and Furtak, 2006, 2007) have confirmed Black et al.’s (2003) finding that quality questioning makes both teaching and learning more effective, although these studies were limited in number and were conducted in science education.

In the field of classroom interaction, questioning has also been extensively investigated. Studies relating to each stage of questioning are reviewed below.

Initiation

Diverse criteria have been proposed to categorize teacher questions. For example, Bloom’s (1956) taxonomy places questions in an ascending order of cognitive demand: lower cognitive questions ask for factual recall, and higher ones require higher-order thinking such as synthesis and evaluation. Long and Sato’s (1983) categories are based on whether or not the information elicited is known by the questioner: display questions request known information, while referential questions request unknown information. Richards and Lockhart (1994) divide questions into procedural and convergent/divergent questions. The former have to do with classroom routines and management, and the latter engage students in the content of learning by eliciting similar or diverse responses.

Although researchers generally believe that answers to low cognitive questions do not necessarily represent understanding and that higher ones are able to engage learners in deeper thinking (Black et al., 2003), classroom research has revealed that low cognitive questions dominate in practice. In their 1983 study of ESL classrooms, Long and Sato reported that display recall questions greatly outnumbered referential questions. Over the ensuing three decades, little has changed regarding teachers’ emphasis on recall questions (Tan, 2007). It has been noted that low cognitive questions tend to engage students in rote learning and discourage critical thinking.

Response

Student talk represents the externalization of individual thinking coded in language (Leung and Mohan, 2004). Research in ESL/EFL classrooms, nonetheless, reports that student responses are not always accurate representations of their thinking. The first problem is no answer, that is, student reticence in response to teacher questions, which has been found to be a problem encountered by most teachers (Tsui, 1996). If students repeatedly respond to questions with silence, how much information can we expect to get about student thinking? The second problem concerns teacher answers, i.e. teachers answering their own questions. Excessive teacher answers deprive students of the opportunity to exhibit their thinking and make them more teacher-dependent (Hu et al., 2004). A third problem is choral answers: questions answered by students as a group or whole class. This type of response, in contrast to individual answers, limits the amount of information that can be obtained about individual students (Chick, 1996).

Evaluation

In a comprehensive review of teacher evaluation practices, Hattie and Timperley (2007) propose a model differentiating teacher feedback at four levels: task level, process level, self-regulation level, and self level. Although they further point out that feedback at the self-regulation and process levels promotes learning more effectively, research evidence shows that classroom feedback commonly operates at the less effective self and task levels. Similarly, Black and Wiliam (2009) propose a formative mode of classroom interaction: upon receiving responses, the teacher’s attention should focus on what he/she can learn about student thinking. In this model, the teacher’s work is far less predictable. Nonetheless, teachers in content classrooms commonly look for a particular response and lack the flexibility or confidence to deal with the unexpected (Black and Wiliam, 1998b). In these situations, the question/answer dialogue becomes ritualized, and as a result teacher evaluation has less potential to enhance learning. In language classrooms, teachers have also been observed to follow a plan of predetermined actions regardless of the wide range of responses elicited (Musumeci, 1996), or to react to student responses by simply saying whether the answers are right or wrong (Leung and Mohan, 2004).


Although previous studies have investigated teacher practice, research scrutinizing the entire process of questioning is sparse. This study therefore examined the whole questioning cycle and checked whether each stage fulfilled its learning function.

Research Questions

The research questions which formed the focus of the study are outlined below:

1. What types of questions are posed by the teachers and to what extent do they seem to benefit learning?

2. What types of responses are elicited by teacher questions and do they represent student thinking?

3. What actions are taken by the teachers upon receiving responses and to what extent do they appear to promote learning?

Methods

Participants

The study was conducted in two tertiary institutions in the People’s Republic of China. UA is a private foreign language college and UB is a state-supported foreign language university. Three Chinese EFL teachers and 15 students from UA and three Chinese EFL teachers and 16 students from UB participated in the study. Purposeful sampling helped select teachers skillful in questioning, and maximum variation sampling was employed to identify teachers from different educational backgrounds (Merriam, 1998).

Data Collection

Classroom observations were conducted to obtain firsthand information about teacher questioning practices. Four teachers (T1 and T2 from UA, T3 and T4 from UB) were observed for eight consecutive sessions, which yielded about 12 hours of data for each teacher. The other two teachers (T5 from UA and T6 from UB) were less enthusiastic about being observed, so three sessions of observation were conducted, producing four-and-a-half hours of data for each. Audio-recordings were made to capture teacher questions, student responses, and teacher reactions to these responses.

Semi-structured interviews were carried out to explore participants’ perceptions. The interview questions for teachers and students followed similar patterns, for example: ‘What types of questions do you/teachers usually ask?’ ‘What types of questions do you prefer and why?’ ‘How do you/teachers react to the answer when it is (in)correct?’ Six individual teacher interviews, 12 individual student interviews, and eight group interviews of three or four students were carried out, resulting in 40 hours of interview data. The interviews were conducted in Mandarin, the native language of the participants, so as to facilitate natural communication. Key issues in the interviews were cross-checked with the participants to validate the researcher’s interpretation in accordance with trustworthiness procedures.


Data Analysis

Observational data were analyzed by going through the recordings, sorting out episodes involving question-answer interaction, and transcribing the interactions verbatim. Teacher questions and teacher follow-up actions to student responses were coded according to Richards and Lockhart’s (1994) typology and Hattie and Timperley’s (2007) classification system, respectively. Since existing coding schemes did not seem suitable for student responses in the current study, I developed my own based on a continuing ‘dialogue’ between the research questions, the literature, and the data (Ritchie and Spencer, 2002). Measures were also taken to ensure the trustworthiness of the categories, including participant checks of transcripts and analyses and constant comparative analysis between the data and emerging propositions.
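To make the tallying step concrete, the sketch below (not part of the original study) shows how coded question-response episodes might be cross-tabulated into the kind of teacher-by-question-by-response counts reported in Tables 3 to 5. The episode tuples and labels here are hypothetical placeholders.

```python
# Illustrative sketch only: cross-tabulating coded episodes.
# The tuples and labels below are hypothetical, not the study's data.
from collections import Counter

# Each episode is (teacher, question_type, response_type), as coded
# from the transcribed question-answer interactions.
episodes = [
    ("T3", "convergent", "individual"),
    ("T3", "divergent", "individual"),
    ("T3", "convergent", "no_answer"),
    # ... one tuple per coded episode
]

# Totals per teacher-question pair, and counts per full triple.
question_counts = Counter((t, q) for t, q, _ in episodes)
link_counts = Counter(episodes)

# Row percentages of each response type within a teacher-question
# pair, mirroring the layout of Table 5.
for (teacher, q_type, r_type), n in sorted(link_counts.items()):
    total = question_counts[(teacher, q_type)]
    print(f"{teacher} {q_type} -> {r_type}: {n}/{total} ({100 * n / total:.0f}%)")
```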

Interviews were transcribed, the transcriptions repeatedly read through, codes or comments assigned, and, finally, similar codes aggregated to form themes, which offered qualitative insights into questioning practices.

Findings

Table 1 and Table 2 below provide definitions and examples of different question types and response types to give readers a flavor of the categorization. Table 3 shows the average number of questions that the teachers raised in each session, and Table 4 presents how these questions were answered. Table 5 provides detailed information about how each type of question was answered.

The general trends in the teachers’ practices can be seen in Table 3 and Table 4: 1) the questions teachers raised most frequently were convergent questions (81%), followed by procedural questions (12%) and, finally, divergent questions (7%); 2) the largest proportion of these questions elicited individual answers (48%), a high percentage elicited choral answers (36%), and small proportions were answered by the teachers themselves (8%) or received no answer (8%).

A great deal of variation can also be noted among the teachers’ practices. For instance, T3’s questioning pattern was distinct from those of the other participants. He asked the most divergent questions (24%) and the fewest convergent questions (67%), and his questions went unanswered more often than those of the other teachers. This pattern, nevertheless, was favored by students and perceived to have benefited learning. In the following, T3’s practice is discussed in detail, and findings are presented in accordance with the research questions.

RQ1 Teacher Questions and the Extent to Which They Benefit Learning

On average, T3 raised 66 questions during an 80-minute lesson: 44 were convergent (67%), 16 divergent (24%), and 6 procedural (9%). The section below further examines the first two categories, which closely relate to the content of learning.


Convergent Questions. Convergent questions, as the classroom data show, generally required factual recall: some (58%) focused on the content, structure, and language of the texts, and the answers could be found in the text; the rest (42%), although also related to the text in some way, demanded world knowledge not specified in the text. Table 6 lists examples of these questions.

As can be seen from the table, both types of questions checked student knowledge: text-oriented ones aimed to assess student mastery of the text, whereas world-knowledge questions targeted students’ background knowledge of the world.

Table 1. Question Types and Examples.

Procedural question
Definition: It has to do with classroom procedures, routines, and classroom management (Richards and Lockhart, 1994).
Example: ‘Did you finish the homework?’ (T2-Session 2)

Convergent question
Definition: It has to do with the content of learning. It encourages similar and short responses, and focuses on the recall of previously presented information (Richards and Lockhart, 1994).
Example: ‘What is quick-fry?’ (T4-Session 3)

Divergent question
Definition: It has to do with the content of learning. It encourages diverse responses and requires higher-level thinking (Richards and Lockhart, 1994).
Example: ‘What are the advantages of smoking?’ (T3-Session 2)

Table 2. Response Types and Examples.

Student individual answer
Definition: The answer offered by an individual student to a teacher question.
Example: T1: ‘Today we will further discuss the letter Cathy wrote. Do you remember what it is talking about?’ S1: ‘Suggestion.’ (T1-Session 2)

Student choral answer
Definition: The answer provided by students as a whole class to a teacher question.
Example: T1: ‘Actually Cathy wrote the letter for two reasons. The first is directly to show what?’ Ss: ‘Disappointment.’ (T1-Session 2)

Teacher answer
Definition: The answer offered by the teacher himself/herself.
Example: T2: ‘Do you know where Madrid is?’ T2: ‘You don’t know? 马德里 (the Chinese counterpart for Madrid) is in Spain.’ (T2-Session 3)

No answer
Definition: Student reticence in response to a teacher question.
Example: T3: ‘Is the US a member of OPEC?’ Ss: [silence] (T3-Session 4)

Note. T = teacher; S1 = one student; Ss = students as a group/whole class.


Students were asked what they thought of text-oriented questions; while acknowledging their assessment function, they did not seem to favor this type of question because of its rigidity. As two students remarked,

C3-A4: I believe those questions are necessary because they test our knowledge and lay the foundation for further learning.
C3-H2: We actually lack motivation to respond to those questions with fixed answers in texts and we have been fed up with them since secondary school.

In the case of world-knowledge questions, students seemed to be aware of their functions, as can be seen from these excerpts:

C3-A1: World-knowledge questions provide a supplement to what we learn in the course. When you get the answer, you gain some knowledge not specified in textbooks.
C3-A3: Many of us merely focus on the textbook and know very little about the world, while T3 believes world knowledge is important, so he tries to raise our awareness by posing those questions.

Table 3. Types of Questions.

Teacher   Questions/session   Procedural     Convergent     Divergent
          n      %            n      %       n      %       n      %
T1        67     100          6      9       59     88      2      3
T2        137    100          20     15      108    79      9      6
T3        66     100          6      9       44     67      16     24
T4        73     100          2      3       66     90      5      7
T5        177    100          24     13      150    85      3      2
T6        56     100          7      13      44     78      5      9
Average   96     100          11     12      78     81      7      7

Table 4. Types of Responses.

Teacher   Questions/session   Choral answer   Individual answer   Teacher answer   No answer
          n      %            n      %        n      %            n      %         n      %
T1        67     100          35     52       26     39           3      4.5       3      4.5
T2        137    100          61     45       64     47           7      5         5      3
T3        66     100          8      12       39     59           8      12        11     17
T4        73     100          37     51       25     34           9      12        2      3
T5        177    100          57     32       82     46           16     9         22     13
T6        56     100          8      14       38     68           6      11        4      7
Average   96     100          34     36       46     48           8      8         8      8


Students also noted the teacher’s emphasis on world knowledge, and their speculations turned out to match the teacher’s stated purposes. T3 explained,

T3: I notice my students don’t care about what is going on outside campus, which is not supposed to be. I therefore raise questions involving world knowledge and try to cultivate their interest.

The above findings appear to indicate that while convergent questions in general remained at a low cognitive level of recall, they functioned as an assessment tool to check whether students had mastered what they were supposed to learn, either information in the textbook or information about the world. This paved the way for future learning. World-knowledge questions in particular diagnosed student weaknesses and guided students to explore knowledge beyond the text rather than confining their ideas to it. In this sense, convergent questions in this class benefited student learning.

Table 5. Links Between Types of Questions and Types of Responses.

Teacher             Questions/session   Choral answer   Individual answer   Teacher answer   No answer
                    n      %            n      %        n      %            n      %         n      %
T1    procedural    6      100          2      33       3      50           0      0         1      17
      convergent    59     100          33     56       21     36           3      5         2      3
      divergent     2      100          0      0        2      100          0      0         0      0
T2    procedural    20     100          8      40       10     50           1      5         1      5
      convergent    108    100          53     49       46     42           6      6         3      3
      divergent     9      100          0      0        8      89           0      0         1      11
T3    procedural    6      100          0      0        4      67           0      0         2      33
      convergent    44     100          8      18       23     52           6      14        7      16
      divergent     16     100          0      0        12     75           2      13        2      13
T4    procedural    2      100          0      0        2      100          0      0         0      0
      convergent    66     100          37     56       19     29           9      14        1      1
      divergent     5      100          0      0        4      80           0      0         1      20
T5    procedural    24     100          4      17       11     46           2      8         7      29
      convergent    150    100          53     35       68     45           14     9         15     10
      divergent     3      100          0      0        3      100          0      0         0      0
T6    procedural    7      100          0      0        6      86           0      0         1      14
      convergent    44     100          8      18       28     64           6      13        2      5
      divergent     5      100          0      0        4      80           0      0         1      20
Ave.  procedural    11     100          2.5    23       6      55           0.5    4         2      18
      convergent    78     100          32     41       34     44           7      9         5      6
      divergent     7      100          0      0        6      86           0.5    7         0.5    7

Table 6. Convergent Questions in T3’s Lesson.

Text-oriented question: ‘The UN has six main organs, what are they?’ (T3-Session 4), raised after students had read a text about the UN.
World-knowledge question: ‘What happened in Libya recently?’ (T3-Session 4), raised when students were about to learn a unit discussing international affairs.



Divergent Questions. Divergent questions in T3’s practice greatly outnumbered those in other teachers’ practices. Examples are listed in Table 7, below.

As can be seen above, these questions shared some characteristics, even though they were initiated in different contexts. First, they were cognitively more challenging. To figure out the answers to them, students were challenged to recall the presentation, apply some criteria to evaluate it so as to identify its strengths, and finally organize their thoughts and express them in the target language (see Example 1, above). This process was likely to engage learners in deep thinking where cognitive skills, such as recall, application, evaluation and synthesis, were employed. Second, these questions related classroom learning to the students’ real lives. In Example 3, the definition of ‘rights’ learned in a text laid down a foundation for students to explore the concept of ‘human rights’, and further facilitated students’ reflection on ‘whether you enjoy human rights appropriately’. Thus, the questions connected what students learned in class to what they experienced off campus. Third, these questions were open in nature and allowed learners to express different ideas. In Example 2, students did not have to retrieve any specific piece of information they had previously learnt; rather, any idea that they were able to justify could be a ‘correct’ answer. In view of this, divergent questions can be seen as important incentives to elicit learners’ real thinking.

Interviews seemed to support the above observations. For example, C3-H2 was one of the students who repeatedly related divergent questions to high cognitive thinking skills.

C3-H2: We prefer divergent questions because they enable us to think actively, explore some issues and express our opinions. They seem to be at a higher level than those with fixed answers. They require me to think more and benefit my critical thinking skills.

In addition to their benefits for cognitive development, divergent questions were perceived to ensure psychological safety. As one low-achieving student remarked,

C3-L1: I prefer divergent questions because there is no fixed answer. So I can say whatever I like without feeling too much pressure in front of my peers.

Table 7. Divergent Questions in T3’s Lesson.

1. ‘What have you learned from (a student’s name) presentation?’ (T3-Session 1), posed after one student had finished his oral presentation.
2. ‘What do you think of the statement that we are deprived of the right to take drugs?’ (T3-Session 3), raised after students had learned a unit discussing social problems.
3. ‘What do you understand by human rights? As human beings, do you enjoy your human rights appropriately?’ (T3-Session 6), raised after students had learned a text discussing rights.



Students also believed this type of question allowed for the expression of personal ideas, which reciprocally informed teachers about their thinking.

C3-A1: Divergent questions provide us with opportunities to articulate our opinions, which also help teachers to know what we are thinking.

To summarize the above findings, there is evidence that divergent questions benefited learning. First, such questions engaged learners in deep thinking and thus contributed to their cognitive development: students not only had to recall information, but also had to apply ‘old knowledge’ to ‘new situations’; likewise, learners were required to provide not only factual information but also their own judgment. Second, the open and non-threatening nature of divergent questions boosted learners’ willingness to share personal opinions and therefore increased their classroom participation. Third, also thanks to their open nature, these questions were capable of eliciting richer learner information. Based on this information, teachers were better able to gauge student needs and make pedagogical decisions accordingly.

RQ2 Student Responses and Whether They Represent Thinking

Of the 66 questions T3 posed, 39 elicited individual responses (59%), 11 elicited no response (17%), 8 led to choral responses (12%), and 8 were answered by the teacher himself (12%). Due to space limitations, the following analysis focuses on the first two categories.

Individual Answers. Individual answers were invited from time to time in class, and this was supported by the interviews.

C3-L2: On most occasions, questions were answered by individual students. I think it is more effective: you don’t know who will be nominated so everyone has to think.
T3: When students were less responsive, I had to nominate one of them. I selected from the quiet students because they seemed to need more encouragement. If it is possible, I’d like to engage everyone in every session.

It seems that the teacher’s purpose in requiring individual answers was understood by the students: students were expected to think about the question, be prepared to give an answer, and be actively involved in the question-answer interaction.

Further analysis revealed an interesting finding: around one fourth of individual answers were elicited after pair/group discussions. Some were associated with convergent questions, when students were checking answers during listening/reading tasks; others were related to divergent questions, when students were exchanging ideas with peers. For example, in Classroom Observation T3-Session 3, the teacher asked the question, ‘What do you think of the statement that we are deprived of the right to take drugs?’, and during the next nine minutes, students had discussions in six groups. After that, representatives from each group reported to the whole class. In this episode, T3 did not require an immediate response after posing the question. Rather, he provided some time for thinking and a ‘forum’ for discussion. Based on group ideas, student thinking was elicited.

When the teacher was asked why he followed a question-discussion-response sequence, he stated that ‘everyone may approach the topic from one perspective, while group discussions give students different ideas and make them think more thoroughly.’ When students were asked for their viewpoints, they all seemed to welcome this practice, as the following comments bear out:

C3-H2: Group discussions give us more time to formulate the answer. We need time to organize our ideas and express them in a logical way.
C3-A1: You get different ideas from peers, which stimulate your thinking, and in return you have more ideas.
C3-A2: I’m reluctant to offer an answer in class for fear that my answer is stupid. Group discussions help test my ideas and make me less anxious.

Group discussions, as illustrated in the quotes above, seemed to have the potential to elicit responses which accurately represented student thinking. First, when more time was provided to process the question and diverse ideas were received, student answers were more likely to represent deep rather than superficial thinking. Second, when a micro-community was established in which ideas could be shared without fear of derision, students became more willing to voice their thinking. Third, when group ideas were elicited after discussions, a better-rounded picture of both individual and group thinking was obtained.

In sum, individual responses generally required students to think actively and be engaged in classroom interaction, and those elicited after group discussions were more likely to reflect learners’ real thinking and exhibit both individual and group learners’ ideas.

No Answer. Out of 66 teacher questions, 11 elicited no response; a further examination revealed that most of these unanswered questions were convergent. Since convergent questions simply required factual recall and were assumed to be easier to answer, why did students repeatedly fail to offer answers? To discover the reasons, the classroom data were re-examined.

Classroom Observation T3-Session 4
Some acronyms were listed in the course-book, and T3 asked his students, ‘What is acronym? What is abbreviation? And what is initialism?’ The whole class was quiet. After waiting for a few seconds, he continued, ‘To satisfy your curiosity, find subtle differences between them. And you can tell me, rather than wait for me to tell you.’ T3 left these questions as homework and moved on to a reading task.

In this episode, the question ‘what is acronym, abbreviation, and initialism’ seemed to demand little mental processing on the part of the students. In other words, students should have been able to answer the question by retrieving relevant information from their memory. Nevertheless, a careful examination of the student course-book revealed that the required information was not specified; nor had it been taught previously, as suggested in student interviews. Accordingly, a lack of knowledge was probably the reason for the silence.

When T3 was asked about this, he stated that he had largely anticipated the students’ lack of response, and that he posed the question to inform students of his expectations and to prompt them to find the information independently afterwards.

In short, to a great extent, a lack of response in this class reflected the scope of student knowledge in some areas and their deficiencies in others. It also uncovered the gap between the teacher’s expectations and the students’ actual state of learning.

RQ3 Teacher Actions and the Extent to Which They Promote Learning

T3’s reactions to student responses are discussed by looking at two examples. The first illustrates how the teacher reacted to student reticence, and the second shows how he followed individual responses.

Reactions to No Answer. The unanswered question, ‘What is acronym, abbreviation, and initialism’, was left as an assignment in Session Four. In the subsequent lesson, T3 asked the students to share the information they had acquired in groups, and then invited representatives to report group ideas. In responding to the silence, T3 did not push his students to come up with an answer, nor did he offer any ‘teacher expertise’. Instead, he advised students to search for relevant information independently after class and to work collaboratively in class. In other words, the unanswered question was not dismissed, but assigned as both a self-learning and a collaborative task.

The teacher gave his reasons for doing so, stating:

T3: I used to provide answers directly when students had no ideas. Later on, I realized students would quickly forget what I taught them. Now I believe in learning by doing. I ask them to search for information and let them tell me, and teach me in class, though I myself am fully prepared.

Students seemed to hold positive attitudes towards this practice, as they commented,

C3-H2: I think T3 knows the answers himself, but he is determined not to let us get them so easily. He expects us to search for answers and gain deeper understanding. We learn the most when we teach others.
C3-H1: I am strongly in favor of this practice, because ‘tell me, I will forget; show me, I will remember; involve me, I will learn.’
C3-A3: I felt ashamed when nobody could offer an answer. I made my decision at that silent moment that I would work harder so as to perform better next time.

As indicated above, T3’s reactions to the no-answer responses were likely to promote student learning. On the one hand, by exposing students’ ignorance in certain areas and not ‘feeding’ them with information immediately, the teacher probably aroused the students’ eagerness for knowledge and increased their motivation to learn. On the other hand, by encouraging students to explore by themselves and by promoting peer tutoring, the teacher made students take more responsibility for their own learning, thus fostering self-reliant and collaborative learners.

Reactions to Student Individual Answers. When listening to individual responses, T3 generally showed positive attitudes towards his students’ ideas. The following classroom data demonstrate this tendency.

Classroom Observation T3-Session 3: T3 invited students to share group opinions on the topic ‘What do you think of the statement that we are deprived of the right to take drugs?’

T3: What’s the idea in your group?
S1: We don’t think taking drugs is a right because… (T3 wrote S1’s idea on the PowerPoint)
T3: Any other ideas from your group? How do other people think of it?
S2: Taking drugs is not necessarily a right… (T3 wrote on the PowerPoint)
T3: How about in your group?
S3: I found something interesting from the internet… So whether to take drugs or not is our own rights. (T3 wrote on the PowerPoint)
T3: All right, it is controversial… S4, do you have different ideas?
S4: Taking drugs is not your own choice since…
T3: …Do you have different ideas, S5? In your group?
S5: We are all against this statement…
T3: …OK. So far most students come up with something similar. I’d like to hear different ideas.
S6 (voluntarily offered an answer): I think we can peacefully use the drugs.
T3: Tell us more about peaceful drug use.
S6: In western countries… but according to Chinese government policies…
T3: S6 just challenged the policy. Do you think all the policies defined by the government are acceptable? …Have you got more different ideas to support the statement? Try to think critically, try to think differently. Do not just follow the majority.

T3’s reactions in this scenario seemed to involve three steps: first, he accepted student ideas by adding the key words to the PowerPoint; second, he identified a weakness in the responses (the answers were similar and lacked critical thinking) and repeatedly called for different ideas; and third, he guided students to explore the question in depth by asking them to think critically.

The teacher’s way of reacting to student responses was appreciated by his students. Below, two students refer to the teacher’s openness and respect:

C3-H1: I like his practice of putting our ideas on the PowerPoint. It seems he respects our thoughts and acknowledges our contribution. Whether the opinion is right or wrong, he won’t despise it.


C3-H2: I think T3 is extremely tolerant. As long as you are willing to share, he is willing to listen, so you won’t have worries when voicing out your opinions.

Students also seemed to notice the teacher’s identification of their problem, as they remarked:

C3-A3: He made me realize there is not necessarily a fixed correct answer for any question. The right answer is the one you can justify.
C3-H1: I didn’t know how to think critically before. T3 always asks us to think differently and to challenge someone else’s idea. Now I enjoy exploring the same issue from different perspectives.

The teacher also commented on why he attached great importance to critical thinking skills.

T3: I care more about how students arrived at an answer than the answer itself. And I always encourage my students to challenge ‘the correct answer’ if they have their own opinions. In my lessons, I accept answers different from those specified in the teacher’s book as long as the justification is given. Our students tend to follow the authority and lack critical thinking, which is what I’d like to change.

To summarize the above findings, the teacher’s reactions to individual responses tended to exert a positive impact on the learners. First, by largely accepting student answers and being open to diverse ideas, the teacher established a safe classroom environment where learners were more ready to articulate different opinions and risk making mistakes. Second, by identifying learners’ weaknesses and asking for different answers, he guided students to form their own judgment. This practice was likely to promote deep rather than surface learning and cultivate learners’ critical thinking skills.

Discussion and Conclusion

This study explored teacher questioning as a formative assessment strategy. With respect to the research questions, it was found that teachers raised significantly more convergent, lower cognitive questions (81%) than divergent, higher cognitive questions (7%). The majority of teacher questions elicited individual responses (48%) and choral answers (36%), with small proportions being answered by the teachers themselves (8%) or eliciting no response (8%). In one skillful EFL teacher’s (i.e. T3’s) practice, both convergent and divergent questions seemed beneficial to learners. Convergent questions checked student mastery of textual or world knowledge. When these questions elicited no response, indicating student knowledge deficiency in certain areas, they were used to direct self-learning and collaborative tasks, in which learners were encouraged to take responsibility for their own learning rather than relying on the teacher. Divergent questions appeared to engage learners in higher-order thinking and increase classroom participation. When these questions elicited responses that exhibited learner weaknesses, such as a lack of creativity, meaningful intervention was conducted to facilitate deep learning and to develop critical thinking skills.

T3’s practice in the current study adds further evidence in support of Black et al.’s (2003) claim that questioning can become an essential part of classroom assessment when teachers dedicate themselves to quality questioning. As the findings indicate, efforts can be made at each stage of the questioning cycle to support learning. In the initiation stage, convergent questions can be used to facilitate student mastery of knowledge, and divergent questions have the potential to engage learners in deep thinking. In the response stage, convergent questions expose learners’ potential ‘blind spots’. In the evaluation stage, a lack of student responses can be utilized to inspire independent and collaborative learning.

The findings of the current study echo Tan’s (2007) result that a high proportion of questions (87%) were at lower intellectual levels. Unlike Tan’s (2007) observation that recall questions focused on the text and engaged students in rote learning, however, half of the recall questions T3 initiated were world-knowledge questions. They not only directed students’ attention to knowledge beyond textbooks, but also encouraged independent after-class inquiry. Also in contrast to Tan’s (2007) finding that higher cognitive questions were rarely asked, such questions were frequently integrated into T3’s instruction, aiming to engage learners in deep learning and to develop students’ critical viewpoints.

The teacher’s reactions to student responses in this study also differed from the findings of previous research. For example, while Black and Wiliam (1998b) noted that teachers in content classrooms tend to direct students toward giving expected answers, T3 in this study cared more about how students arrived at an answer and led them to explore the question in depth. Also, in contrast to the ESL teachers in Musumeci’s (1996) study, who followed predetermined actions regardless of the diversity of the responses elicited, T3 conducted meaningful interventions to move learners forward based on the responses exhibiting learner weaknesses.

A number of implications for teaching and teacher education can be derived from this study. First, considering the dominance of low cognitive questions in most ESL/EFL teachers’ practice, instructors should not only work to increase higher-order questions to stimulate thinking, but should also strive to manage recall questions skillfully so as to exploit their learning potential. As T3 illustrated, teachers can deliberately deploy low cognitive questions (especially those one step higher than students’ current level) to test the extent of students’ knowledge and to make students explore knowledge further. The process of arriving at the answer can then become part of a deeper process of independent and collaborative learning, through which deep learning is likely to occur and learner responsibility to be cultivated. Second, given the assessment function of collecting accurate and rich information about learners, how best to elicit responses is an issue to consider. Group discussions may be conducted to allow more time for students to formulate answers and to enable every student to express their ideas. Third, to ensure that the information elicited actually represents student thinking and is thus meaningful input for subsequent teacher decision making, a trusting classroom environment should be established. Teachers are advised to listen carefully to what their students say rather than merely offering predetermined answers, to treat student responses as provisional, and to show due respect to different ideas.

The current paper is among the first to explore teacher questioning from a formative assessment perspective in the Chinese tertiary context. Further research might explore how teacher training programs can develop teachers’ knowledge of formative assessment strategies, as well as effective ways to implement quality questioning.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

References

Black P, Wiliam D (1998a) Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice 5(1): 7–74.

Black P, Wiliam D (1998b) Inside the Black Box: Raising Standards through Classroom Assessment. London: King’s College London School of Education.

Black P, Wiliam D (2009) Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability 21(1): 5–31.

Black P, Harrison C, Lee C, Marshall B, and Wiliam D (2003) Assessment for Learning: Putting It into Practice. Maidenhead: Open University Press.

Bloom BS (ed.) (1956) Taxonomy of Educational Objectives. New York: Longmans, Green.

Carless D (2011) From Testing to Productive Student Learning: Implementing Formative Assessment in Confucian-heritage Settings. New York: Routledge.

Chick J (1996) Safe-talk: collusion in apartheid education. In: Coleman H (ed.) Society and the Language Classroom. Cambridge: Cambridge University Press, 21–39.

Hattie J, Timperley H (2007) The power of feedback. Review of Educational Research 77(1): 81–112.

Hill K, McNamara T (2012) Developing a comprehensive, empirically based research framework for classroom-based assessment. Language Testing 29(3): 395–420.

Hu Q, Nicholson A, and Chen W (2004) A survey on the questioning pattern of college English teachers. Foreign Language World 6: 22–27.

Leung C (2005) Classroom teacher assessment of second language development. In: Hinkel E (ed.) Handbook of Research in Second Language Teaching and Learning. Mahwah, NJ: Lawrence Erlbaum Associates, 869–88.

Leung C, Mohan B (2004) Teacher formative assessment and talk in classroom context: assessment as discourse and assessment of discourse. Language Testing 21(3): 335–59.

Long MH, Sato CJ (1983) Classroom foreigner talk discourse: forms and functions of teachers’ questions. In: Seliger HW, Long MH (eds) Classroom-Oriented Research in Second Language Acquisition. Rowley, MA: Newbury House, 268–86.

Mehan H (1979) Learning Lessons: Social Organization in the Classroom. Cambridge, MA: Harvard University Press.

Merriam SB (1998) Qualitative Research and Case Study Applications in Education. San Francisco, CA: Jossey-Bass.

Musumeci D (1996) Teacher-learner negotiation in content-based instruction: communication at cross-purposes? Applied Linguistics 17(3): 286–325.

Rea-Dickins P (2001) Mirror, mirror on the wall: identifying processes of classroom assessment. Language Testing 18(4): 429–62.

Richards JC, Lockhart C (1994) Reflective Teaching in Second Language Classrooms. Cambridge: Cambridge University Press.

Ritchie J, Spencer L (2002) Qualitative data analysis for applied policy research. In: Huberman AM, Miles MB (eds) The Qualitative Researcher’s Companion. Thousand Oaks, CA: SAGE, 305–30.

Ruiz-Primo MA, Furtak EM (2006) Informal formative assessment and scientific inquiry: exploring teachers’ practices and student learning. Educational Assessment 11(3–4): 205–35.

Ruiz-Primo MA, Furtak EM (2007) Exploring teachers’ informal formative assessment practices and students’ understanding in the context of scientific inquiry. Journal of Research in Science Teaching 44(1): 57–84.

Tan Z (2007) Questioning in Chinese university EL classrooms: what lies beyond it? RELC Journal 38(1): 87–103.

Tsui A (1996) Reticence and anxiety in second language learning. In: Bailey K, Nunan D (eds) Voices from the Language Classroom: Qualitative Research in Second Language Education. Cambridge: Cambridge University Press, 145–67.