Revealing Writing, Concealing Writers: High-Stakes Assessment in an Urban Elementary Classroom

Journal of Literacy Research 45(2), 99–141
© The Author(s) 2013
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/1086296X13475621
jlr.sagepub.com

Elizabeth Dutro1, Makenzie K. Selland1, and Andrea C. Bien1

1University of Colorado at Boulder, Boulder, CO, USA
2University of Washington, Seattle, WA, USA

Corresponding Author: Elizabeth Dutro, Literacy Studies, University of Colorado at Boulder, 249 UCB, Boulder, CO 80309, USA. Email: [email protected]

Abstract

Drawing on the combined theoretical lenses of positioning theory and academic literacies, this article presents case studies of four children from one urban classroom, two of whom scored at or above proficient on the large-scale writing assessments required by their district and state and two of whom scored below. Using criteria from state rubrics, we closely analyzed the writing products children produced for high-stakes assessments and classroom writing projects as well as drew on a range of qualitative data to contextualize children’s writing within the complex relationships with writing observed across the school year. Our findings suggest test scores may be inaccurate or highly malleable based on relations between the features of the writing children produced, students’ identities as writers and preferred practices, quirks of the testing context, and arbitrary features of the test itself. Indeed, our analyses found that some children’s test scores misrepresented their capabilities as demonstrated in the writing they produced both within and outside of the testing situation. Furthermore, the form of the assessments risked positioning these children in just the ways that would frustrate rather than promote their attempts to put their best writing on the page. Our data suggest that the children’s test scores did not provide the information about achievement in writing that such tests are assumed to convey and that both the form of on-demand writing assessments and the dichotomized sorting they facilitate potentially undermine some of the very goals, often articulated by policy makers, underlying the push for accountability through testing.


Keywords

assessment, diagnosis, at risk, struggling, diversity, literacy and equity, writing, composition, discourse, cultural analysis, qualitative (general)

In an era of high-stakes accountability, few words are imbued with more weight than proficient. Although the definition of proficient may be straightforward, "expert; skilled," determining what counts as skilled enough in writing to make judgments about which young writers in classrooms have earned the adjective proficient is far from simple. Mirabel, Philip, April, and Max, four children in an urban fourth and fifth grade classroom, embody that complexity. All brought to their classroom facilities with writing, perceptions of their own competence as writers, and experiences within and outside of school that affected their positioning within what counted as proficient writing on the high-stakes assessments they encountered. In the year in which we followed their journeys as writers, two of these children scored at or above proficient and two did not. Our inquiry was fueled by questions about children's experiences with high-stakes writing assessments, the content of the writing they produce for those assessments and for other purposes and audiences, and the relationships with writing that became visible across a school year. The assumptions about proficient writing embedded within standardized tests suggest a binary notion of proficiency that is seemingly easy to define, and our questions were particularly concerned with children's positioning within those two distinct storylines. Although it is possible for individuals and groups to contest or subvert the positions ascribed them, in this dichotomy of "proficient" and "not proficient" there is little room for students to foster nuanced understandings of themselves as writers and, indeed, no direct route to refute or reorganize the pronouncement of the test. If weighty decisions are going to be made on the basis of test scores—as they increasingly are—we wanted to build better understandings of just what those scores might reveal or conceal about young writers.

Since No Child Left Behind (NCLB) was signed into law in 2002, researchers have increasingly addressed the process and consequences of test-based accountability in literacy (e.g., Bomer, 2005; Dyson, 2006; Hillocks, 2002; Scherff & Piazza, 2005; Zacher, 2011). Such research is made all the more relevant in light of the Obama administration's reform initiative "Race to the Top," which directly ties competition for increased federal funding to states' use of standardized test scores to identify what counts as successful teaching and learning. Indeed, our reading of the research literature in writing suggests an increasingly widening chasm between literacy theories and research grounded in the social and cultural dimensions of writing and the views of writing embedded in current policy. Although we do not claim that this account of children's experiences with school writing and large-scale writing assessment can bridge that gap between classroom-based research and policy, we do view this work as both a call for those bridges to be built and an attempt to add to that construction.


Toward that end, along with other researchers in literacy (Collins, 2011; McVee, Baldassarre, & Bailey, 2004; McVee, Brock, & Glazier, 2011), we draw on positioning theory (Davies & Harre, 1990; Harre, Moghaddam, Rothbart, & Sabat, 2009) to begin to understand how these students crafted ongoing identities for themselves as writers while they wrote to and through varied audiences and multiple contexts, not the least of which included their experiences writing for large-scale assessments, a context imbued with institutional power. Through a positioning lens, we focus our attention on the interweaving threads of formal and informal discourses that both imposed positions and made positions available to these young writers.

Through their academic literacies perspective, Lea and Street (1998) discuss the power and influence of the institutionalized discourses students encounter in schools around writing. By fusing positioning theory with Lea and Street's conception of academic literacies, we argue that a narrow understanding of writing constructed only through an institutional lens can limit the import of other, often more complex, discursive productions of self as writer available in the varied contexts of students' lived experiences. In addition, constricted notions of writing narrow what it is possible to see when assessing the linguistic features of children's writing. Viewing children's writing experiences through the issues of identity, social contexts, and institutional power that these perspectives afford both exposes what is at stake in the institutionalized views of writing that children encounter in school and raises key issues surrounding writing curriculum and assessment, particularly for those children who have been traditionally least well served by U.S. public schools.

Theoretical Framework

Below, we discuss in more detail our composite framework that draws on notions of academic literacies and positioning to understand a piece of the current story of children writing in elementary schools.

An Academic Literacies Perspective

Following from its grounding in new literacy studies, one of the key assumptions of an academic literacies perspective is that writing is a social and cultural practice, an assumption supported by substantive research in K-12 writing. For instance, it is well established that children's interactions with writing cannot be understood apart from their social relationships, the experiences they bring to the classroom, the microculture of the classroom, their perspectives on school and themselves as learners, and the texts and cultural resources with which they engage outside of school (e.g., Bloome & Egan-Robertson, 1993; Dixon, Frank, & Green, 1999; Dyson, 1993, 2001; Egan-Robertson, 1998; Lensmire, 1994; Lemke, 1995; McCarthey, 2001; Moll, Saez, & Dworin, 2001; Sperling & Woodlief, 1997).

Lea and Street (1998) articulate a perspective of academic literacies in higher education contexts, describing how


[t]he processes of student writing and tutor feedback are defined through implicit assumptions about what constitutes valid knowledge within a particular context, and the relationships of authority that exist around the communication of these assumptions. The nature of this authority and the claims associated with it can be identified through both formal, linguistic features of the writing involved and in the social and institutional relationships associated with it. (p. 160)

This argument is highly applicable to children's experiences with writing in the contexts of large-scale assessment, and, with very few exceptions, such tests are constructed and evaluated such that they cannot address issues of identity and social context. Indeed, such a notion seems anathema to conceptualizations of large-scale assessments as an objective measure of student achievement, even as research has demonstrated the importance of attention to identity and social context in the creation and implementation of literacy curriculum and assessment (for discussions of this literature, see Lewis, Enciso, & Moje, 2007; Vasudevan & Campano, 2009). Furthermore, Lea and Street argue for the importance of examining the linguistic features of student writing within this framework in relation to authoritative views of what counts as proficient writing, an analytic move that was important to our study.

Positions Offered to Young Writers

Positioning theory further helps us expand our view to include the numerous experiences with writing that converge to constitute a child's writing life in school. Grounded in specific locations and social networks in which students develop understandings of what it means to write well, children engage in multiple and often contradictory discourses regarding their performances as writers. Davies and Harre (1990) note that, in constructing stories of self and other, we perform a "complex weaving together of the positions (and the cultural/social/political meanings that are attached to those positions) that are available within any number of discourses" (p. 60). Our use of "discourse" is consistent with poststructuralist theories that take Foucault's (e.g., 1977) idea that discourses are systems of thought, belief, and practices in which particular meanings and subject positions—that is, ways of defining oneself and being read by others in any given situation—are made available. These positions include certain "rights, duties, and obligations" as well as expectations about how individuals should enact them (McVee et al., 2011, p. 5). In accepting, rejecting, and dialoguing with such expectations, individuals negotiate positions through interactions with the storylines available in the local, socially contextualized experiences of daily lives. However, individuals differ in their power to position themselves and others, located, as they are, in specific social networks that confer power in distinct ways.

As Harre and Van Langenhove (1998) explain, position in this theoretical context refers to a "complex cluster of generic personal attributes, structured in various ways, which impinges on the possibilities of interpersonal, intergroup, and even intrapersonal action through some assignment of rights, duties, and obligations to an individual as are sustained by the cluster" (p. 1). The illustrative example cited by these authors, and highly relevant to our analysis, is the assignment of competence or incompetence to individuals in particular endeavors. Those deemed incompetent lose the right to make meaningful contributions within that context, even as they retain the possibility of resisting or challenging the position they have been ascribed. In this way, how individuals, groups, and institutions use language to locate self and others constructs and constricts opportunity for participation, engagement, and action. Our analysis points to large-scale writing assessment as one such endeavor in which individual children are positioned as competent or incompetent, which, in turn, leads to the construction of groups who are assumed to share a narrative thread in particular storylines of achievement (e.g., children whose scores suggest writing proficiency or struggle). Positioning theory offers a lens to explore both the complexity and the consequences of these dynamics in assessments of children's writing (McVee et al., 2011), which is particularly helpful when considering the ways high-stakes tests, imbued with great institutional power, contribute to the positioning of young writers. Reliance on large-scale assessments that define proficiency through a child's position above a particular cut score necessarily inscribes dichotomies of success and failure that are belied by the complexities that lie behind those scores. In Davies's (2000) words,

Binary logic constitutes the world in hierarchical ways through its privileging of one term or category within the binary . . . being positioned as one who belongs in or is defined in terms of the negative or dependent term can lock people into repeated patterns of powerlessness. (p. 107)

It is just such dichotomies that need to be challenged through research that reveals the nuance of writing struggle and facility subsumed by starkly defined categories.

As Dyson (2006) has argued, the neutral way in which writing conventions are traditionally approached (and, we would add, are embedded in large-scale writing tests) is contested by the "fluid world of diverse linguistic resources and varied communicative practices" in classrooms (pp. 13-14). Reducing writing to a universal, value-neutral practice casts a shadow over the multiple dimensions of students' identities and limits the positions available to them (e.g., to be an invested writer working on organization or creative writer with exquisite voice who is working on conventions). There is also the potential that the binary discourses of "proficiency" we highlight and ascribe importance to (i.e., discourses that are institutionally powerful) are the ones that have the most potential to be taken up as students engage in the process of producing coherent storylines of themselves as writers. This is consequential for students whose potential is hidden by achievement scores and other institutionally constructed narratives that define them as struggling and risk alienating them from school altogether.


The Effects of High-Stakes Tests

Many of the most accomplished scholars of assessment argue again and again that these measures are inadequate, providing one partial lens on what a child knows and is able to do and, thus, should be viewed as just that—incomplete (e.g., Koretz, 2008; Linn, 2000; Shepard, 2002). Shepard (1995), for instance, argues that high-stakes consequences are not an effective lever of change and that "even authentic measures are corruptible and, when practiced for, can distort curriculum and undermine professional autonomy" (p. 38). Indeed, much research over the past two decades suggests that the emphasis on testing constrains curriculum as well as teachers' attempts to provide students with rich learning experiences focused on higher-order inquiry (e.g., Abrams, Pedulla, & Madaus, 2003; Barksdale-Ladd & Thomas, 2000; Berliner, 2007; Darling-Hammond & Rustique-Forrester, 2005; Lomax, West, Harmon, Viator, & Madaus, 1995; Mathison & Freeman, 2003; McCarthey, 2008; Noddings, 2002; Zacher, 2011). Researchers have also found that some students respond with anxiety, faulty assumptions, frustration, or decreased motivation to the testing process and awareness of their scores (as these studies also find, certainly other students seem to take the process in stride or even enjoy it, particularly those who score well; e.g., Dutro & Selland, 2012; Enciso, 2001; Gershman, 2004; Jones et al., 1999; Perna & Thomas, 2009; Roderick & Engel, 2001; Wheelock, Bebell, & Haney, 2000). Indeed, post-NCLB reviews of the literature on high-stakes testing conclude that there is not convincing evidence that such testing has its intended effect of increasing student learning and that it may have unintended, significant negative consequences for students, especially those in high-poverty schools (Laitsch, 2006; Nichols, 2007). As Nichols and Berliner (2007) emphasize, high-stakes policies often lead to corruption and diminished instruction and learning opportunities for the very children they purport to serve. In writing specifically, Hillocks (2002) studied state writing tests and found that in some states the high-stakes writing assessments were driving instruction in ways that emphasized simplistic notions of genre, purposes for writing, and process.

However, even in the midst of such research on the nature and consequences of large-scale writing assessment, the chasm between these evidence-based arguments and the will of federal policy widens. In 2010, federal funding was allocated to a handful of states based largely on their use of high-stakes test scores to determine student competency and link test scores to measures of teacher effectiveness. As the focus on standardized test scores as the primary measure of both student and teacher competency continues to dominate the policy landscape, it is crucial to build deeper understandings of what high-stakes tests may miss and dismiss about individual students' knowledge and competencies.

Study Context

In this section, we describe the larger qualitative study in which our analysis is situated, the children who participated, and the classroom writing context.


Research Setting

The data analyzed for this article were gathered in Year 2 of a 2-year classroom-based study of children's experiences in literacy and mathematics in a fourth and fifth grade urban classroom (Dutro, Kazemi, & Balf, 2005, 2006; Dutro, Kazemi, Balf, & Lin, 2008). The study's goals were to construct in-depth understandings of children's social and academic experiences within and across subject areas in one classroom over a significant period. Thus, our design combined participant observation and other ethnographic tools, case study (of the focus classroom, as well as particular children and instructional events), discourse analysis, and, for some of the case study analyses, including those on which we focus here, document analysis. The research involved a close collaboration between university-based researchers and Sue, the classroom teacher. Sue was in her third and fourth years of teaching during the study and had established relationships with the university researchers through professional development projects. Sue had completed her credentials at a local university and began her master's work during the research period. Elizabeth and her collaborator, Elham Kazemi, along with research assistants, visited the classroom two to four days a week throughout the school year.

Participants

A total of 23 children participated in the larger project. Their school was located in a large northwestern city and reflected the racial and ethnic diversity of the city's shifting demographics, with a student population including Native American (4%), Asian American (21%), Black (22%), Latino (23%), and White (31%) children. In addition to Native American, African American, Asian American, and White families who had lived in the United States for two or more generations, this school included many families who had more recently emigrated from Africa (primarily Ethiopia, Eritrea, and Somalia), Southeast Asia, Pakistan, and Mexico. Students' families were primarily working class and lower middle class, with a few students facing more significant economic struggles. Of the children, 60% qualified for free or reduced-price lunches. In this article, we focus on the experiences of four children, whom we describe in more detail through their case stories in a later section: April, a biracial Japanese American–White girl in fifth grade; Max, a Vietnamese American boy in fourth grade; Mirabel, a Mexican American girl in fifth grade; and Philip, a Native American boy in fourth grade.

Classroom Writing Context

In what follows, we attempt to provide a portrait of Sue's instruction as context for our case studies. Although our questions are not centrally focused on instruction, we acknowledge that our less detailed attention to instruction is a limitation of this analysis.


As in most classrooms, the curriculum and instruction students encountered played a complex role as the children negotiated their writing and identities as writers. Their teacher, Sue, did not have a district-prescribed writing curriculum and often taught reading and writing as integrated units focused on particular themes and genres, often embedding writing in content-area instruction. For instance, the children read folktales and adventure narratives while also writing their own, from prewriting to publication. They also participated in integrated units, such as a several-weeks-long unit on the Iditarod sled dog race that incorporated reading, writing, social studies, science, and mathematics. Students also encountered stand-alone writing projects, such as the Young Authors story they drafted and eventually published following a well-known children’s author’s visit to their school.

Like that of many early career teachers, Sue's writing instruction was developing as she attempted to incorporate aspects of writing instruction that reflected a process-oriented approach to classroom writing (e.g., Calkins, 1994; Fletcher & Portalupi, 2001; Graves, 2003; National Council of Teachers of English [NCTE], 2004), a philosophy of writing instruction to which she was introduced in her credentialing program and that was reinforced in district professional development. This approach, which grew from innovative programs of research in the 1960s and 1970s, is widely considered best practice in the field, as evidenced by its prominence in white papers and position statements disseminated by professional organizations (e.g., NCTE, 2004), approaches to writing pedagogy within teacher-training across university-based and alternative programs, and wide adoption by many districts, schools, and teachers. Process writing will be familiar to many readers, and thus we point to just a few of the primary principles of this approach to provide context for both the classroom instruction and the high-stakes tests we discuss, including the belief that everyone has the capacity to be a writer and that writing can be taught; writing is a social and cultural process that will vary for different writers across various writing situations; writers need to have the grammar and spelling skills necessary to edit their work, but those skills should not be conflated with other traits of writing; and writing instruction should prepare people to write for many and varied purposes and audiences. As research has also emphasized (e.g., Delpit, 1995; Lensmire, 1994), progressive methods of teaching writing may work in very complicated ways for some children and need to be examined critically, particularly in classrooms serving students of color and emergent bilinguals.

Sue's talk about her goals for writing pointed toward some of those commitments, as she emphasized her desire to provide opportunities for children to experience the writing process from conceptualization to publication, write from their own experience, and experience choice and flexibility in topic in some assignments and projects. She also spoke of her commitment to exposing them to a wide range of genres, particularly those they would encounter on the district and state assessments they took each year. As she enacted her goals, we documented opportunities for children to engage the writing process, writing often and for varied purposes and audiences, encouragement to draw on their experiences and interests, regular informal writing conferences, and allowance for some choice of focus and topic. Although we observed some lessons focused on skills and strategies related to craft, conventions, organization, and genre, Sue is the first to admit that she could have incorporated more explicit instruction of various aspects of writing during her literacy block. Indeed, incorporating more minilessons in writing was one of her goals for her instruction in subsequent years. We also observed Sue adapting her writing program in response to the social and cultural issues that arose for individual children as they engaged with school writing. We will suggest that the instructional methods that they encountered in Sue's classroom supported these four children's writing in important ways, even as experienced writing teachers will notice ways in which Sue could provide more effective support as her teaching developed.

Data Collection and Analysis

Our analysis centered on the following research question: In the context of insights about children's social and intellectual identities, their experiences with writing within and outside of the classroom, and close analysis of their writing products, what do their scores on the state and district writing assessment reveal and conceal about these children as writers?

Classroom Data and Case Studies

Our data included field notes of observations and informal interactions with each child (78 total) and audio and video recordings (62 total) of more than 80 lessons and discussions for which Mirabel, April, Max, and Philip were present. We also drew on assessment data in writing, including students' self-assessments and their scores on high-stakes assessments. In addition, our data included transcripts of conversations with Sue and a semistructured interview with each student, which we conducted in the spring. The interview protocol ensured all students were asked a common set of questions, while providing us flexibility to follow the children's lead based on their responses as well as to ask questions specific to our observations of students' experience. We also collected writing produced by the students across the school year. The written artifacts children produced across the year included an ongoing dialogue reading journal, a math journal in which they described their problem-solving strategies, research reports for social studies, focused writing exercises related to minilessons (e.g., constructing effective paragraphs, complex sentences, practicing leads and conclusions), and more extended writing projects in which students took their writing through the writing process from prewriting through publication. In the case studies we discuss below, the children speak of some common writing experiences that represented some of those major writing projects during the year, including a journal they kept as part of a major unit of study on the Iditarod sled dog race, a project in which they completed an original folktale from prewriting to publication, and a story they each wrote and published as part of a Young Authors celebration at their school. Because these projects were the most visible in children's talk about writing and represented important writing experiences in the classroom, they are a significant presence across our case studies.

Although our analysis certainly involved a focus on the products of the children's writing, those products were contextualized within attention to their writing practices across our time in their classroom (Street, 1993). Practices to which we attended included children's stances toward various kinds of writing across contexts, their body language and emotions during writing events, and their views of writing and themselves as writers. For instance, the tears April shed during the state writing assessment related to both her ongoing struggles with writer's block and her identity as a successful, accomplished writer who understood the importance of doing well on the test. Philip's behaviors around writing—arms at his sides or playing with his pencil, eyes looking everywhere but at his paper—provided crucial information about his relationship to writing through which to consider the products he produced.

Both in the larger project and in the more focused questions we describe here, our analysis drew on tools of grounded theory to ensure our continual immersion in the data and to identify, hone, and revise patterns that emerged from our notes and transcripts over time (Strauss & Corbin, 1998). All field notes and transcripts were entered into the qualitative data analysis program ATLAS.ti (Muhr, 1996) and coded to create inductive codes arising from our ongoing immersion in the data and deductive codes that arose from the study's conceptual framework, design, and questions (see Appendix A). Drawing on conceptual codes related to our theoretical framework helped us locate particular events, writing samples, or transcripts to analyze more closely for instances of positioning. Turning to those identified episodes in the data, we used tools from critical discourse analysis (Fairclough, 1989; Luke, 1995) to examine and learn from instances that confirmed or challenged emerging themes, as well as to center our analyses on how children positioned themselves and were positioned by others through the local discourses surrounding classroom writing events and the testing process. Specifically, we identified segments of transcripts for critical discourse analysis based on those we had coded as related to children implicating themselves or being implicated in particular storylines surrounding testing, as well as what it means to be a competent, struggling, engaged, frustrated, eager, or disengaged writer. In addition, we continually shared our evolving interpretations with each other. Our collaboration provided multiple perspectives (classroom teacher and participant–observers) through which to check and reexamine our emerging understandings. The research team met biweekly to discuss and analyze project data. We also examined our emerging understandings across data sources and tools of analysis to ensure that the understandings gleaned from discourse analysis were consistent.

We chose four children as focal students after observing their experiences in writing for much of the year. Each child experienced the writing curriculum in ways unique to their experience, but also represented patterns we saw across groups of children related to inconsistencies between their experiences with classroom writing and the process and results of high-stakes assessment. For instance, April represented students who were deemed successful in school writing, but whose accomplishments were embedded with complex negotiations with both writing curriculum and instruction and the structures of high-stakes assessment. Philip represented other students in his class who struggled with writing, but responded well to curricular adjustments that allowed more freedom in choice of topic. We chose four students for two reasons. First, pairs of students within the four cases shared proficiency designations, but were very differently positioned from one another as writers in the classroom. Second, given our goals to examine the complexities within relationships of proficiency designations and children's writing across the school year, we needed a number of case studies manageable for depth of conducting and sharing analysis, while also conveying the complexities within and across children's experiences. We attend to the uniqueness of each child's case, while also discussing ways that their case is instructive for thinking about the wider experiences of children.

Guided by frameworks for constructing case studies from substantive qualitative data (Dyson & Genishi, 2005; Yin, 2008), we used our analyses across data sources to write descriptions of each child through the following lenses: positioning within discourses of writing both in the classroom and embedded in the state assessment and its related documents (website descriptions of the assessment for teachers and the public, the rubric and descriptions of each section), the social context in which each child was embedded in the classroom, and how children performed (verbally, nonverbally, and in their writing) an identity in relation to writing. In addition, our cases include close analyses of children's writing.

Assessment Data

In this section, we focus on the assessment data, including the form and structure of the state and district writing assessments, the state-identified criteria for proficient writing used to determine assessment scores, and the methods we used in our analyses of children's writing.

Our analysis centered on two large-scale assessments—a district-level direct writing assessment given to Sue's fifth graders and the writing portion of the state assessment of student learning, administered to Sue's fourth graders. Students in Grades 3 to 8 and 10 took the criterion-referenced state assessment of student learning each spring in reading and math, and Grades 4, 7, and 10 were also tested in writing. The direct writing assessment required essay-length responses to provided prompts within a 90-minute time limit. The prompts were distributed to students randomly and focused on one of three genres: narrative, expository, or persuasive. Student writing was scored on a 4-point rubric with a score of 3 or 4 necessary to meet proficiency. The state assessment of student learning required students to write to both a narrative and an expository prompt. Responses were scored on a scale of 1 to 4 for content, organization, and style and a scale of 0 to 2 on conventions (see Appendix A). A total of 9 points out of the possible 12 was required to meet proficiency.
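Read literally, the state's scoring scheme yields up to 6 points per prompt (1 to 4 for content, organization, and style, plus 0 to 2 for conventions), and the two prompts together account for the 12 possible points against which the cut score of 9 is applied. The short sketch below (in Python, with function and variable names of our own invention) simply illustrates that arithmetic under the assumption that the two prompt totals are summed; it is not the state's actual scoring procedure.

def prompt_score(content_org_style, conventions):
    # One prompt: a 1-4 rubric score for content, organization, and style,
    # plus a 0-2 score for conventions, as described above.
    assert 1 <= content_org_style <= 4 and 0 <= conventions <= 2
    return content_org_style + conventions

def meets_proficiency(narrative_total, expository_total, cut_score=9):
    # Assumption: the two prompt totals are summed toward the 12-point maximum,
    # and 9 or more points meets proficiency.
    return narrative_total + expository_total >= cut_score

# Example: 3 + 2 on the narrative prompt and 3 + 1 on the expository prompt
# totals 9 of 12 points, just meeting the cut score.
print(meets_proficiency(prompt_score(3, 2), prompt_score(3, 1)))  # True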


Our analysis focused on the following features that the state emphasizes as hallmarks of proficient writing (based on state documents and as reflected in the state rubric in Appendix A):

• Focus
  - An adequate focus on purpose, topic, and audience throughout the piece
  - A main idea with adequate, elaborated, and mostly relevant supporting details
• Ways of Structuring (we included attention to cohesive devices in this category)
  - A sense of sequence with a clear beginning, middle, and end
  - Adequate transitions
  - Intentional paragraphing
• Voice, Word Choice, and Sentence Fluency
  - An effort to engage audiences; a sense of who the author is; the use of emotions in stories; use of metaphor and imagery
  - An awareness of word choice: appropriate, adequate vocabulary; beginning to use engaging words; limited use of tired words
  - A variety of sentence structures and lengths: simple and complex sentences; introductory phrases and clauses
• Conventions and Spelling
  - Occasional lapses in spelling of common words that don't impede understanding of text; correct spelling of high-frequency words
  - An understanding of the basic rules of conventions: usage, punctuation, capitalization, structure
  - Lapses in conventions caused by excitement in the process of writing

For our analysis, two researchers separately examined each sample of children's writing in relation to each category of proficient writing, including children's writing completed in the context of classroom writing assignments and their assessment essays (and, in Mirabel's case, samples of writing completed outside of school). One researcher made notes on her analysis of each element in relation to each piece of writing and used those notes to construct an account of the analysis. A second researcher then read and commented on the account based on her separate analysis of the children's writing. Any discrepancies between the two analyses were then discussed and reconciled. Discrepancies were minor and primarily involved level of detail in the analysis, which, in each instance, resulted in our opting to draw on the more detailed analysis. In addition, because the district writing assessment was designed to provide data on writing achievement in some of the grade levels that did not take the state writing assessment, it was consistent in design and focus with the state writing assessment (one difference in the logistics of its scoring was that the district folded conventions into a 4-point scale). Given that the criteria for writing proficiency were the same across the two assessments, we used the state criteria for proficient writing in our analysis of children's writing for both assessments. In addition, because both of the fourth graders we discuss below (Max and Philip) received consistent scores across their two essays for the state assessment, we focused our close analysis of their assessment writing on one of the two essays they completed for the state assessment. For the fifth graders (Mirabel and April), our analysis focused on the essay each wrote for the district assessment. Our goal in discussing children's experiences with district and state-level writing assessments is to illuminate how each child was positioned within the testing experience (how it unfolded for them), the writing they produced, and how they were evaluated. In other words, our lens is focused on the children and their writing, and we do not intend to analyze or evaluate any one particular district or state writing assessment.

Findings

In this section, we present our case studies of Mirabel, April, Max, and Philip. Although our level of detail varies according to relevant issues for each child, each case includes the child's positioning within the institutional measurements for proficiency, including analyses of the children's test essays and classroom writing, accounts of the child's identity in relationship to writing, and the sociocultural contexts in which each child's motivations and challenges unfolded in the classroom.

Mirabel: "I Like to Write, and That's All."

Mirabel was a Latina fifth grader from a working-class family, tall for her age and more physically developed than most of the other girls in the class. She was bilingual in Spanish and English and, although she spoke English fluently, received weekly pullout support from the school's English language learner instructor. For most of the year, she was quiet in class, and our field notes indicate that Sue spoke early in the year about Mirabel as one of the students who did not seem to have friends in the class. At recess, Mirabel often opted to stay in the classroom and look at the Japanese anime books she brought from home or chat with Sue or other adults in the room. She sometimes spoke to classmates who sat near her, but mostly she watched and listened. As we will discuss, Mirabel's social positioning in the classroom changed radically toward the end of the year in ways that seemed deeply connected to writing.

Mirabel struggled with certain aspects of writing. Conventions of spelling and punctuation were very challenging, making it sometimes difficult to read and understand her writing. In a report on Mirabel's progress, Sue noted that Mirabel's primary challenges were spelling and the use of complete sentences, both of which she thought might be related to second language issues. As Sue also noted, Mirabel's writing was consistently coherent and she was capable of writing a lot, as evidenced in the relatively lengthy entries in her reading response and Iditarod journals. In our observations, she was often highly engaged during writing tasks—rarely distracted, writing diligently, and listening closely to Sue's instruction. Although her writing was not particularly neat and she used much invented spelling, the organization and appearance of her work seemed to matter to her, further demonstrating her commitment to writing. The following description is from our field notes in March as children worked on their journals for the Iditarod unit:

I notice that Mirabel has erased an entire page of her journal. I ask her why she erased. She said that she had written on the wrong page. She showed me that she had recopied the entire entry onto the page it should have been on (to be consecutive).

Mirabel's assessment. Mirabel received a score of 1.8 on the district's writing assessment, well below the 3 required for proficiency. On her report card she received scores of 1 (significantly below grade level) on sentence fluency and conventions, 2 (below grade level) for word choice, and 3 (at grade level) for ideas and content, voice, and organization. Although she received some positive feedback from Sue, her scores on writing assessments indicated only one thing about Mirabel as a writer: that she was a decidedly struggling writer.

However, close analysis of her assessment essay reveals many features of writing valued by the district and state. She wrote to the expository prompt asking her to describe what she would do if she were in charge of recess activities (see Figure 1). Although a noticeable struggle with conventions and spelling is evident in Mirabel's writing for the exam, her work to attend to these aspects of her writing is also apparent. She employs inventive spelling that is comprehensible, such as "woud" for "would," "mulupashon" for "multiplication," and "anser" for "answer." She uses precise capitalization and indents her paragraphs intentionally. In addition, from the very beginning, her essay demonstrates strong organization centered on a clear main idea that remains consistent throughout the piece. In her first sentence, she states, "If I was in charge of recess activities, I would choose multiplication and division memory" (in our analyses, we quote the translations of the children's writing to better highlight features of the writing beyond conventions). Although she uses a period instead of a comma and employs invented spelling, there is no confusion about the topic of her essay. This clear topic sentence is followed by a logical middle and conclusion that provide multiple supporting details, including how such math practice would help her peers to "remember the answer when you are given a test" and focus on "behaving while learning." Furthermore, these multiple details supporting her choice of activity are linked through a series of connecting words that, although perhaps unsophisticated, make her organizational structure very clear, including the transitional phrase "I choose this because it helps to remember the answer when you are given a test" at the start of her second paragraph. To conclude her essay, Mirabel writes, "So that is why I think multiplication and division memory will be helpful," clearly indicating a summary statement. Through these details, and others, Mirabel supports her main idea with logical reasons and fluidly connects them.

Mirabel's essay displays not only strong organization, but also a keen awareness of her audience and understanding of the expository essay genre. This begins with her essay's topic sentence, which, in addition to introducing the topic for the piece, also restates the prompt, a technique often taught as a way to ensure that connection to the prompt is clear to readers. Subsequent paragraphs also begin with topic sentences that indicate their focus. Of note, the institutional context of this essay is significant, as she makes language choices that signal the institutional functions her essay will fulfill. As she refers in each paragraph to aspects of schooling and assessment, her choice of topic and supporting details suggests attention to the concerns of her perceived audience of test assessors.

Furthermore, in her choice of topic and supporting details, Mirabel leverages this awareness of audience to also demonstrate her developing understanding of voice. Approaching voice as "expressive confidence" that suggests the writer's beliefs that her ideas are worthy of attention (Luce-Kapler, Catlin, Sumara, & Kocher, 2011, p. 162), we note that Mirabel crafts her response to include the aspects of school she understands are important to her audience. The expressive nature of voice is also demonstrated in the variety of her sentence lengths and structures, although, at first glance, this could be obscured by errors in punctuation use. For instance, in her first paragraph she begins with a longer, complex sentence followed by three shorter sentences. Technically, many of her sentences can be classified as run-ons or fragments, primarily because of her use of periods in place of commas. However, her essay demonstrates a young writer's experimentation with a variety of sentence lengths and structure. She does not rely on simple sentence constructions to avoid punctuation; thus, her mistakes could be interpreted as evidence of her understanding of the importance of voice and complex constructions, a writing skill that is arguably more sophisticated than proper use of punctuation (clearly an area in which she needs more explicit instruction and support).

Figure 1. Mirabel's district writing assessment.

Our analysis of Mirabel’s essay shows skills across the traits valued by the state, but these proficiencies were subsumed by her final (not proficient) score. As we discuss below, what the scores also cannot capture is Mirabel’s devotion to writing and her self-identification as a writer.

Mirabel's writing behind her test score. Despite the feedback she had received via assessment scores, when asked about her favorite subject in school during her spring interview, Mirabel said, "I like to write, and that's all." She also indicated some of the parameters of school writing that hindered or supported her desire to engage with the writing process, positively noting the new positions made available to her in Sue's class, notably through the curricular design of "Young Authors":

Mirabel: I just like writing it, but I never liked revising. But, when I got into Ms. Bryant's class, with her Young Authors, I really wanted to finish the book.

Elizabeth: Right. And, so, how did revising it feel then?

Mirabel: Well, it was something I wanted to write. I worked really hard on it, so I wanted to write that more than other things.

Reflecting on her writing process in her interview, Mirabel consistently emphasized the importance of choosing her own topics, highlighting how her relationship with writing was affected by the increased opportunity for some autonomy within writing projects.

It was the Young Authors story that came to represent a significant shift in Mirabel's experiences in Sue's room. Soon after the Young Authors stories were published and shared with the class, we learned that Mirabel was particularly prolific in her out-of-school writing. Her story, written in early spring, launched a series of lengthy sequels over the ensuing months (see Figure 2). Like many professional writers, Mirabel spoke of her characters in that story and its sequels as compelling her to write. Speaking of the role played by her protagonist, she said, "It's like, every time I write it, it's like Rosy, I'm writing it for Rosy, because Rosy wants me to write it." The ongoing saga, which seemed to be modeled on some of the telenovelas she watched at home, featured a protagonist based in part on herself and some key characters named after her classmates. Her young author's story was well received, and she soon began bringing her sequels into school and sharing them with classmates. She was extremely enthusiastic about her writing, and, in turn, her classmates were an engaged and enthusiastic audience.


Her story, “True Love 2,” reveals many insights into Mirabel’s strengths and progress as a writer. To turn first to organization, although Mirabel does not use paragraphs in this story, the focus and logical order of the story are clear. She introduces her protagonist at the start and proceeds to a clear beginning, middle, and end in which characters are introduced, the central conflict is revealed and builds to a climax, and then the conflict is resolved at the end. Thus, Mirabel demonstrates not only an ability to organize her writing, but also a well-developed understanding of narrative plot structure.

Figure 2. Excerpt from Mirabel’s “True Love 2” story.


With regard to conventions, Mirabel exhibits many of the same strengths and struggles in her out-of-school writing as she does in her test essay. She again uses capitalization effectively, but seems to have more difficulties with spelling, punctuation, and run-on and fragment sentences, making it harder to decode her out-of-school writing. However, juxtaposing these two pieces reveals her more careful attention to spelling and conventions on the test, suggesting her awareness that these aspects of writing particularly matter when writing for that audience. She understands that editing is important, even though she does not yet have the skills in conventions necessary to perfect her work at that stage in the writing process.

In “True Love 2,” we also see further evidence of Mirabel’s strong facility with aspects of voice, as evidenced by her character development and use of dialogue and foreshadowing. For instance, she uses dialogue skillfully to uncover character motivations and plot details. At the end of her story, when Rosy and D.J. simultaneously ask each other, “Do you want to go with me?” she uses dialogue to push the plot forward and show, rather than tell, her story’s happy resolution. Mirabel’s story is also told with clear attention to her peer audience. She uses foreshadowing to create suspense early in the story when she writes, “Last year you would not believe what happened.” Later in the story, Rosy describes how D.J. “told me he would never break up with me again. At least that’s what I thought,” to hint at upcoming tension between the couple. Such techniques demonstrate evidence of strong voice and audience awareness, as Mirabel strategically works to grab and hold her readers’ attention.

If all writing is social, the social aspects of writing were quite explicitly and literally apparent in Mirabel’s experience. As she shared her writing with others, Mirabel began to forge relationships with her classmates, while also using the relationships of the classroom to grow her competency with and ideas for writing. Once her stories became a topic of interest among her peers, as she explained, Mirabel began to talk with others about her writing process: “Well I was writing the fourth book and I didn’t know what to write and Melody gave me the idea, why don’t you have some more, new characters, and I thought that’s a good idea!” As this example illustrates, Mirabel used the social resources available in the classroom to confirm and expand her ideas and her process as she positioned herself and was positioned by others as writer. This cyclical relationship between writing growth and her social connections was also seen in her developing friendships. Between March and the end of the school year, she became close friends with a group of socially well-connected and well-liked girls in the class. She no longer spent her recesses in the classroom, but walked the playground with her new friends, talking, giggling, and occasionally stopping at the bars to twirl and spin. Elizabeth asked if she thought her writing had affected her relationships with others in the class, and Mirabel replied,

When I started to write then its like I got [pause] I was like a loner in the beginning of the year and then me and Brittney started hanging out a lot of the time ’cause we were sort of alike, like we liked a lot of the same things. So I got a lot of more friends. I just got some more friends and they have been really helpful in school.

In her end-of-year self-assessment, Mirabel wrote,

I try to work on organizing first. I write a rough draft. Then I find out where the paragraphs go. Details, I try to put as much details as possible. I’m very creative. Once I get an idea I start to do it. Then I made a book. I’m making a series of books. I get plenty of ideas. I try to make sure my spelling is right, but I make a lot of mistakes. I need to work on paragraphs and spelling.

Although she is aware of her conventions struggles, she writes much more extensively about her craft of writing, a direct contrast to what appears to be the emphasis on conventions in the scoring of her assessment essay. The district writing assessment was able to highlight areas where Mirabel could use targeted instruction as a writer. However, her score did not reflect what she does accomplish in her assessment essay and in no way was it able to convey the more complex story of Mirabel’s writing life, in which she came to be positioned as an accomplished writer. Yet it is her designation as not proficient that will be sent to her parents and recorded in her cumulative file for future teachers.

Max: “I Need To Do Better in Writing”

Max, a Vietnamese American boy from a lower-middle-class family, was a quiet, responsible student who participated in classroom discussions only if Sue called on him. When she did ask him to participate, he often appeared uncomfortable and mumbled his answers. Although he did not voluntarily speak up in class, he often appeared engaged—he smiled at Sue’s jokes, he copied notes from the overhead, he followed along on the right page. Max had a good rapport with Sue and seemed to feel secure in her class, but he was not socially connected to other children in the class. Except when directed to work with others in small groups, he generally kept to himself in the classroom. On the playground he did not often interact with other children, tending to walk alone around the playground or play independently among other children on the play structure.

Max believed himself to be a struggling writer. Perhaps nothing better captures Max’s feelings about writing than his experiences with his Young Authors story. In his story, “The Boy Who Died,” Max wrote about a boy who perished as a result of his struggles with writing (Dutro et al., 2006). Max writes of his protagonist, “He’d rather be in a war or get a trip to Russia or even Antarctica. He had the flu for two days! He had one day to finish it, from rough draft to the end. He was depressed, really depressed.” The story ends with the boy unable to produce a coherent draft: “He was so miserable. He died by not breathing.” The experience of the protagonist viscerally captures the emotional response writing struggles evoked for Max.


Max’s assessment. Max scored proficient on the state writing test. He received the persuasive prompt, “Should pets be allowed in school?” Because revision played such a strong role in his assessment essay, we show the changes he made from first draft to final version (see Figure 3).

Max’s changes to his draft show attention to grammar, word choice, flow, and coherence. His revisions to his introductory paragraph, for example, resulted in a much more engaging and coherent opening to his argument. His second paragraph, in contrast, showcased some of the issues that contributed to Max’s feedback in school and his own views of himself as a struggling writer. Although his essay does focus on two overarching ideas—why pets shouldn’t be in school and why they are problematic in general—a close analysis of Max’s writing reveals that the introduction and conclusion are clearly connected, but the middle of the essay focuses on details that are not clearly related to his beginning and end. He begins by discussing how pets transform a school into a zoo, but then changes topic when he describes the costs of pet care, the factors that make pets vulnerable to stealing and experimentation, and the fact that pets “might run away.” His creative concluding sentence then returns to his original idea, providing further reasons why pets should not come to school.

It’s not good to bring pets to school. Unless maybe bring the pet after school. Or if you live near a summer school and its summer. After summer school show your pet if it a dog. Pet make TRANSOFRM a school to a zoo. If you bring a pet to school. Where are you going to put it? Pets make people distracted are distracting at school and make a mess. A cat would mess and may scratch worksheets. A dog would be wild and that would leads to a mess. Besides pets need food and if a dog would be sick eating pizza and other type of food instead of dog food. Pets would be sick eating our food. they get sick eating food like pizza. PETS WOULD NEED TO GO TO THE VET EVERDAY. WHICH TAKES LOT OF MONEY.

Pets cost lots and lots of money. If they PETS were stolen someone would sell them someplace else or maybe even in a different conutrie. YOU BE SAD EVEN IF YOU HAD A NEW PET. The person who stole somebody pet will sell to maybe scientists who test medicine, chemical, and stuff like that Reseach, Chemical, Medicine and more department for money. I THINK THEY SHOULD GO TO A CASINO. The person who stole the pet would sell to scientists is because expermints. The scientist will do test as in expermint to pets. Besides all this pet can run away. THE PET MIGHT NOT COME BACK SO YOU HAVE TO PUT A PICTURE OF THE PET AND MAKE SURE YOU WRITE ON IT: LOST (PET KIND) BOUTY: (SHOULD BE CLOSE TO THE PET PRICE) PHONE: (YOUR PHONE NUNBER. SOME PEOPLE MIGHT LOOK FOR THE PET.

Pets need caring but you can’t care for it when you are in school because you have pay attention in class. Plus pets make distraction. Pets should stay at home. When you come back home have a great time and remmber when homework time ether put your pet in a doghouse or littbox and basket. AFTER SCHOOL SPEND SOME TIME WITH YOUR PET.

Figure 3. Max’s state assessment essay (with revisions).


Although Max takes us through a somewhat jagged organizational structure, he does provide details to support his ideas. For example, he offers specific examples of just what dogs and cats would do in school that would be problematic and result in transforming school into a “zoo.” He mentions that pizza (a common school lunch) is not appropriate for pets to eat. Even in the middle of the essay, he describes how if your pet were stolen, you would have to spend time creating “Lost” flyers.

Max demonstrates strong voice in his essay. He begins the essay with a metaphor, using the image of a “zoo” to help the reader see and understand the effects of bringing pets to school. He follows this metaphor with an effective use of a rhetorical question, asking, “If you bring a pet to school. Where are you going to put it?” When he later describes the problem of people stealing pets to sell to laboratories, he admonishes the thieves by describing how “they should go to a casino” for money instead, showing a focus on engaging his audience and a hint of humor. Max also employs strong word choice. For instance, pets don’t just turn a school into a zoo, they “transform” it. They are not just a problem for students, but are “distracting.” In these ways, Max showcases the qualities related to voice identified by the state as markers of proficiency.

In the more technical aspects of writing, Max shows facilities and struggles. He has occasional trouble with subject–verb agreement (e.g., “You be sad even if you had a new pet.”) and pronoun references (e.g., “Pets need caring, but you can’t care for it when you are in school.”); however, his spelling and capitalization are very strong. Although many of his sentences are short and simple, he peppers his essay with compound and complex sentences. Much like Mirabel, his experimentation with different sentence structures is often marked by lapses in correct punctuation, but shows a child on the cusp of more complex sentence fluency and punctuation use. Given our analysis, it is striking that Max achieved proficiency on the state exam, as he demonstrated many of the same errors as Mirabel and his writing is sometimes less clearly organized than her piece.

Max’s writing behind his test score. Throughout the year, Max expressed a sense of privacy that seemed to directly affect his writing, feelings that may have been connected to cultural practices (though, given our data, we can only speculate). In an interview, Elham tried to talk with him about his Young Authors story. She asked if he could tell her about writing “The Boy Who Died,” and Max replied, “Ahhh . . . no. I don’t want to talk about it. . . . Yeah. It’s kinda embarrassing telling your stories. They’re like private.” In addition to expressing discomfort with sharing “personal” stories in writing, Max was also consistently critical of books that he felt talked too much about people’s feelings. Thus, his sense of a perceived boundary between public and private positioned him in difficult ways within some of the genres students are expected to engage in school.

In addition to his struggles to “make sense,” in his interview at the end of the school year, Max talked about his sense of himself as a writer.

Elham: Would you say, Max, that you like to write?

Max: [giggles] No!

Elham: No. How come?


Max: Because I’m not good at it, and I need go to summer school because I need to focus on writing more.

Elham: Uh huh. Tell me about that. How do you feel about that?

Max: I think it’s good to help me.

Max’s sense of his abilities in writing seemed closely tied to his impressions of how others judged him as a writer and suggests that he had internalized the message that writing was difficult for him and that he needed to improve. Indeed, Max did often struggle to express his ideas in ways that made sense to others, an issue that complicated many writing assignments for him (see Figure 4). It took time and close analysis for us to decipher Max’s meaning in his reading journal entry, and although not all of Max’s writing was this convoluted, it was very often difficult to access his writing. Max’s written self-assessment at the end of the year was also revealing of his sense of his own writing abilities: “I think I only improved a little bit. I think using paragraphs is the one thing I most improved on. I think writing neatly is the one thing I most have to improve on and work on.” He was willing to give himself only a little credit for improving in writing, even though it was an area in which he had worked very hard. Max also focused on using paragraphs and handwriting, two characteristics of writing that are indeed important for the effective communication of ideas, but are, by comparison, relatively superficial issues that do not capture the improvement he had shown in ideas, voice, and coherence.

Figure 4. Max’s reading journal.

Although he was deemed proficient by the test, Max continued to identify as a struggling writer. Though on different sides of the cut score for proficiency, Max and Mirabel share a misalignment between their test scores and writing identities. In addition, aside from her more pronounced conventions struggles, our analysis suggests that Mirabel’s assessment essay was comparable to Max’s in other aspects of writing. Thus, Max and Mirabel’s experiences with the tests point to what can be masked by the binary positions made available through assessment scores.

April: “I’m a Pretty Good Writer”

April, a fifth grader from a lower-middle-class family, was very self-assured, both academically and socially. She seemed to have positive relationships with all children in the class and appeared to particularly enjoy Sue’s sense of humor, often sitting in class with a small smile on her face as she listened to Sue. April’s father was biracial Japanese American and White and her mother was White, and although April lived with her mother, she saw her father regularly and identified strongly with her Japanese American ancestry. She attended Japanese language school and excitedly told us that her grandmother promised to take her to Japan after she had learned the language.

April’s assessment. April’s scores on the district assessment reflected her facility with academic writing. She received 3.58 out of 4.0 on the direct writing assessment. She also received 3s and 4s on her report card, indicating work at and above grade level standards. Her scores rightfully attest to her strengths as a writer. Indeed, her success in these measures of writing proficiency could lead to the assumption that writing was not at all difficult for her, an assumption that conceals some significant struggles she faced with the writing process. Thus, for April, our discussion of her assessment focuses on process, rather than a close analysis of her writing.

Over time, it became clear that April was particularly paralyzed by imposed “prewriting” activities, a key part of the writing process as taught in this and many classrooms. In her words, many of the common prewriting strategies emphasized in school did not help her (e.g., brainstorming, graphic organizers, outlines) and, “The only way that I can prewrite, and it’s not actually prewriting, is do it in my head. I can just think. I just have to think because I can’t do any of those other things.” Of course, the thinking April describes is prewriting, but, in retrospect, Sue could see how her students may have associated the concept of prewriting with more formal, visible processes, such as the use of the graphic organizers she provided.

April demonstrated a strikingly sophisticated understanding of high-stakes writing tests as a genre. As she explained in an interview,

I mean, there’s even a lot of narrative writing that’s really boring. Like, in stuff like [the district writing assessment], you often get topics like “Write about what could happen if you were invisible” or something like that, that’s really fun, but then at the same time, that’s like, “oh, come on, everybody would think of that.” You’d want something totally different. I want to think of my own thing. I wish they could let you think of that, but then it wouldn’t be, it probably wouldn’t be much of a test because then everybody would be writing something different.

April understands that large-scale assessments are their own kind of genre; they are structured as they are for the particular purpose of comparing students’ writing, and therefore, it wouldn’t work to allow students to choose their own topics. Even as she critiques the pedestrian nature of the prompts in assessments, she understands their function well, which would understandably raise her own sense of the stakes involved in her performance.

It was difficult to watch April during the direct writing assessment, as the district assessment required a formal prewriting process and, like most writing assessments, prohibited interaction about ideas. April had the expository prompt “Write about a skill you’ve learned that has made your life more fun.” Several times April picked up her pencil and set it back down, until, a half hour into the time allotted, it was clear that she was close to tears. Elizabeth walked over and crouched by her desk. April looked up with full eyes and shook her head in frustration. Elizabeth asked April if there were things she enjoyed doing. Tearfully, April whispered, “I like to read.” “That sounds like a skill you learned that made life more fun.” April nodded and immediately began to write. She wrote nonstop until time was called and turned in the essay that would subsequently score so well (see Figure 5).

Indeed, April’s assessment essay does reflect her effective use of all areas of writing we identified for our analysis, including focus, structure and cohesion, voice, and conventions and spelling. It is also apparent that April’s scores on the direct writing assessment might have looked very different if she had not been able to resolve her writer’s block. We are highly cognizant of the complex ethical—and, we would argue, moral—issues related to the structures of high-stakes writing assessment raised by Elizabeth’s brief exchange with April. However, here we emphasize that a moment of interaction—just the kind of interchange with others many of us count on as writers to support our process—was part of a context in which her score reflected her capabilities in writing.


April’s writing behind her test score. April was a highly successful writer by all of the institutional measures she encountered. She seemed to enjoy writing and, once she arrived at a topic and generated ideas, felt confident in her writing. She was always diligent in her attention to writing assignments and classmates sitting at her table often turned to her for help. As revealed in her interview, April was very aware of her own strengths and challenges as a writer:

A lot of times, when I start writing, I get really bad writer’s block. And, um, I got it even worse then [during the state test] because it just happened to have, like, a really hard topic and a really boring one. . . . I like stuff like Young Authors story, and if I have, like, a favorite type, like persuasive and all that, I like narrative a lot. The other types are, like, expository and persuasive, but I’m not that good at, and I’m not that good at them because, usually, well, when we do [that state assessment], my two topics were expository, and that kind of writing can be really boring.

April speaks knowledgeably about genre and expresses clear personal preferences. Although she says she’s “not that good at” expository and persuasive genres, her understanding of those genres serves her well in both classroom writing and the district assessment. What she describes as “writer’s block” (a writerly term with which she is clearly familiar and that further indicates an identity as writer) was no small issue for her. In our field notes, we noted several times when she seemed frustrated, sometimes to the point of tears, with a writing assignment, and she often felt extreme anxiety at the start of writing projects. Here, we describe one of those situations.

Figure 5. April’s district writing assessment.

One day in early winter, Sue passed out a graphic organizer to help students organize their ideas for a writing assignment. April stared at the paper, tried several times to fill it out, and, on the verge of tears, raised her hand for help. She told Sue that she could not think of anything to write and that the graphic organizer was not helping her. Sue told her she didn’t have to use the organizer and suggested that she write “I don’t know what to write” over and over until she thought of an idea. As April recalled later, Sue came by several minutes later, saw a page filled with the phrase, and said, “Man, you’re stubborn!” She and Sue then had a conversation in which April said that she kept thinking about penguins, but wanted to write a tale from Africa. As April related it in a subsequent interview, “Ms. Bryant’s like, ‘So, write about penguins,’ and I’m like, ‘But penguins don’t live in Africa!’ and she’s like, ‘Exactly. Write about them in Antarctica!’ I was like, ‘Oh, duh.’” She went on to publish a story about penguins set in Antarctica. From then on, Sue never required April to use the graphic organizers she provided to support students’ writing and, in this way, positioned April as an expert of her own writing process. It also became clear that conversation was often helpful at the start of writing projects as April sorted through her ideas and attempted to begin writing. Although these were insights into April as writer that Sue could attempt to address in the classroom, those strategies were explicitly disallowed by the writing assessment.

With all of her strengths—her proficiency across writing traits, understandings of genre and the purposes of writing, and her awareness of her own strengths and challenges—April’s success in writing required flexibility and interactional support. The test’s parameters, by design, did not account for these needs. Her assessment scores capture much about her strengths in writing, but they cannot convey what it takes for her to demonstrate those strengths or the multiple processes that converged to position April as proficient.

Philip: “We Do Too Much Writing”

Philip, a Native American fourth grader from a working-class family, is a member of the Elwha tribe. He had a mischievous smile and enjoyed sharing jokes with Sue. Although he shared his smile often, he would also become easily frustrated and sometimes had conflicts with children from other classes on the playground. He seemed to have positive social connections with his classmates, though, both in the classroom and on the playground where he was an enthusiastic participant in soccer, basketball, and other sports. Two years before we met Philip, he had experienced a traumatic loss when his mother was violently murdered. Philip and his older brother subsequently lived with their grandmother, and their uncle, chief of what Philip described as “their village” at the nearby tribal lands, was also actively involved in Philip’s life.

Philip rarely completed an assignment. Our data files tell their own story in Philip’s case. Our collections of student work, at least three inches thick for most students in the class, are a slim half inch for Philip. In our field notes, we soon came to recognize a common description of what we came to call Philip’s “writing stance”: sitting at his desk, his arms hanging beside his body, pencil lying next to his paper, and his eyes alternately staring at the page or exploring the room around him. For instance, when the children were writing in their Iditarod journals, Sue would periodically stop by his desk, urging him to write and repeating instructions. Our field notes include descriptions such as, “I notice Philip in his usual routine of writing a sentence or two and then playing. Now he is stretching, using his pencil to scratch his back”; “For the past several minutes, Philip has been looking straight ahead”; “Philip yawns and looks at the clock.”

Philip’s assessment. Philip’s scores on the writing portion of the state assessment of student learning reflected the struggles with classroom writing we observed. He received a 1.7 and a 1.2 out of 6 on the essay we analyze below. Our notes during the assessment closely track Philip’s writing process, and it was painful to watch. At 35 minutes into the assessment, we noted that he had written three sentences. Every few minutes, we noted his progress—he held his head in his hands, he played with his pencil, he sat for several minutes at a time without picking up his pencil, he got a drink, he played with a piece of paper. We worried that he would have nothing resembling a completed essay. However, more than halfway through the allotted time, he focused long enough to produce a rough draft that he neatly copied onto the test paper (see Figure 6).

Close analysis of Philip’s story highlights features of his writing that signal important competencies, despite aspects that also reveal challenges. For instance, his story includes a clear beginning, middle, and end, and he both develops a central plot that aligned with the prompt and provides details to support his main idea. His organization scheme—from finding the bag, to encountering the monster and, finally, emerging from peril—follows a logic that suggests understanding of plot structure for narrative texts. Although he often relies on repetitive and unsophisticated cohesive devices such as “so,” their inclusion represents an understanding of the need to link events in his story together with transitional phrases. Even as the flow of his story is sometimes fragmented, overall, a developing understanding of organization is clear.

Although Philip’s use of violent imagery may point to superficial understandings of audience (the assessors, in this case) and writing tests as a genre, he does demonstrate understanding of the suspense genre. For example, he includes the kinds of exciting twists and turns associated with a scary and suspenseful story. In one instance, Philip signals an unexpected turn of events, writing, “I got scared and ran in my house, but then I thought of something.” His use of voice is revealed in the level of detail, specific word choice, and vivid descriptions he leverages to draw the reader into his story.

Philip’s story also reveals his developing understandings of conventions of spelling and punctuation. For example, although his spelling is often inaccurate, his inventive spelling is easily discernible and does not disrupt the flow for the reader. Moreover, although he has not yet mastered comma use, he consistently and correctly uses apostrophes, periods, and capital letters. In spite of his tentative understandings about the function of the comma, he does include compound and complex sentences in his story, which suggests he may be on the cusp of integrating commas more consistently into his writing.

Given our observations of Philip, the score this writing received was not surprising. However, our analysis of his response displayed facilities with aspects of writing that Philip rarely displayed in the classroom. Below, we discuss key exceptions to that pattern.

Figure 6. Philip’s state writing assessment.

Philip’s writing behind his test score. The state assessment was one of a few times when Philip produced a completed piece of writing. Although he seemed to have difficulty throughout the process of writing the state assessment, there were two writing events during the year when his writing seemed to come easier and we were provided glimpses of motivation to write. One of these was the Young Authors story. As with much of Philip’s writing, we do not have a copy of his story—he said he had taken it home or lost it prior to when we collected them. However, he recounted it in detail during his interview. Sue related that he had seemed very engaged in his writing of the story, titled “The Goat, the Farmer, and the Chickens.” In Philip’s words, “It was about, like, how this guy found a goat, and, um, he was, um, he had thorns in his legs, so he took him home, patched him up, and then he was trying to let him go a week later. . . .” He described an elaborate plot involving the developing relationship between the goat and the farmer, culminating in a cougar attack on the farm in which the goat and farmer save each other and the chickens. He enthusiastically recounted details of the story, even though it had been more than four months since he completed it.

This enthusiasm did not extend to his general feelings about classroom writing. In an interview, he explained, “We do too much writing. . . . I’ve got to just sit there, and just think, and sometimes you get cramps.” Philip also suggested that it was important to him to have some control over the content of his writing, recounting experiences in his third grade classroom and comparing them to his experiences in Sue’s classroom:

Philip: If you’re like in my other teacher’s classroom, Ms. P’s, you’ve gotta, soon as you walk in that door, you’ve gotta start writing, for at least four hours.

Elizabeth: Wow!

Philip: Um, only sometimes we have to do a little math, and, um, well, we just gotta sit there and think and think and think. We can’t make stuff up. But, in the Iditarod journal, you got to make up stuff, because you couldn’t see what they were doing. [In the other classroom] we had to write what she told us to.

Both the Young Authors story and the Iditarod journal were examples of writing events that allowed the children some control and flexibility over the focus and content of their writing. A homework assignment during the Iditarod unit provided a clue about why Philip may have valued flexibility in school writing.

The assignment involved writing to a prompt asking students to imagine they were a child in one of the villages along the Iditarod route and write about how they felt and what they did when the Iditarod racers rode through their village. The following day, Sue showed us Philip’s homework (see Figure 7). Although the writing assignment had a prompt, Philip essentially ignored it. Instead, he seized an opportunity to write about his own tribal village and the tribe’s relationship to the environment.


Philip’s writing in this instance exhibited qualities that had not been visible in other writing situations, including the state assessment. His story is focused and organized, including details to support each statement he offers about life with his tribe. For instance, Philip writes about hunting elk, deer, and bear in the Elwha forest, carefully outlining the multiple uses of the animals’ hides that “were used to keep warm for bedding and wraps.” Here, he also employs a transitional phrase, “Most of all the hide gave us shelter,” which was far more sophisticated than the basic connecting words included in his state test.

Philip’s essay showcases a strong writing voice in every aspect delineated in the state’s guidelines for proficiency. He engages readers by bringing them into the world of his tribe and provides a clear sense of the author’s identity and commitments. He also draws on emotion, metaphor, and imagery. For instance, the essay takes a particularly poetic turn when Philip writes about the river as “she” and describes what she offers to the tribe. Philip also weaves evocative word choices and phrasing throughout his essay, writing, for instance, about the “delicious salmon that filled everyone’s bellies” and traveling “up and down the coast of Juan de fuca to fish halibut and crab.” In ways not visible in his state assessment, this essay also demonstrates his ability to use a variety of sentence structures and lengths, including both simple and complex sentences. Consistent with his use of conventions on his state exam, his use of correct punctuation is inconsistent, but he continues to spell many words correctly or with an easily discernible inventive spelling that retains the flow of the story. Overall, through this essay, Philip reveals not only his facilities with important aspects of writing, but also his deep understandings of and connections with his tribe and his uncle’s stories about the life and history of their family.

Figure 7. Philip’s essay about his tribe.

We wondered what it was about the prompt that allowed Philip to interpret it in this way. Perhaps it was the word village, which was the term he used when talking about his relatives’ home. It may be that he identified with the Native Americans in Alaska whom he assumed to be living in the villages along the Iditarod route. When we asked him about this piece of writing, he was very pleased to share it. He smiled and said, “Yeah, native.” He explained that his uncle often told him stories of his tribe and that he had written some of them down for this assignment. Philip seemed to take great pride in this piece, and it also represented writing that he completed and completed independently, both of which were rare for him. The relevance of the prompt to his lived experiences created an opportunity for Philip to position himself as expert, drawing on his family stories, something he knew and cared about, and a practice akin to that of many award-winning writers. Importantly, Sue celebrated his interpretation of the prompt, praising his efforts as opposed to insisting that he adhere to the directions of the assignment. Although his state assessment showed some strengths that were obscured by his very low score, his essay about his tribe revealed writing capacities that challenged both his sense of himself (and our sense of him) as a disengaged writer and his official designation as a decidedly struggling writer. Philip’s experience underscores the role of personal knowledge, culture, and deep connection in fostering writing identities and in ensuring that children’s best efforts are allowed to count in measures of achievement.

Discussion and Conclusions

As our case studies reveal, these young writers were positioned in varying and various ways in relation to the discourses of writing they encountered. Their ideas about themselves as writers were consistent with their designation on the writing assessment for just two of them. April believed herself to be a successful writer, and, indeed, she was deemed well above proficient, despite challenges with the process. Still, the inflexibility of the assessment threatened to derail even this confident writer. Philip thought he had trouble in writing and much of his class work, and his scores on the state assessment were consistent with his self-assessment, despite the promise he demonstrated when writing connected to his life and interests. In contrast, both Mirabel’s and Max’s feelings about themselves as writers were inconsistent with how they were positioned by their designations on the high-stakes assessment. Mirabel increasingly claimed an identity as “writer.” At home and in the classroom, her writing was a personal, intellectual, and social investment and seemed to be a primary source of her academic identity. Given Mirabel’s struggles with conventions, her accomplishments and her identity as writer threatened to be overshadowed by her scores. Max had constructed an identity as a struggling writer based on his experiences in writing throughout his school career. Although Max needed support crafting writing that clearly expressed his ideas, he worked hard on his writing. Ultimately, he received a proficient score on the state writing assessment, even as he continued to position himself as a struggling writer.

In this section, we revisit the theories guiding our study to discuss the implications of key arguments arising from our findings. Namely, our analyses point to how test scores may be inaccurate or highly malleable based on relations between students’ identities as writers and preferred practices, quirks of the testing context, and arbitrary features of the test itself.

Although writing instruction is not our central focus, instruction is certainly key to children’s opportunities in school writing, which raises a question: Would different writing instruction have mitigated the children’s experience of the assessment? Perhaps. However, Sue represents many teachers who embrace the philosophies of writing instruction that are recommended by prominent writing researchers and professional organizations, but are in the process of building their skills as teachers of writing. In addition, our core argument is not that high-stakes testing in its current form can position students in less problematic ways by improving instruction, although we believe that is possible and an important goal. Rather, we contend that “both the formal, linguistic features of the writing” and the “social and institutional relationships associated with it” (Lea & Street, 1998, p. 60) demonstrate how the test scores these children received do not and cannot capture their writing competencies and challenges in a way that warrants imbuing such assessments with their current weight and stakes.

In light of the contradictions we found between children’s writing and their experiences with high-stakes assessment, we find the lenses on discourses and implicit institutional assumptions offered by positioning theory and academic literacies compelling and analytically useful for examining the stakes of accountability policies for children. For instance, chief among the current discourses at work in large-scale assessment is the binary of proficient–not proficient or competent–incompetent inscribed in an isolated writing experience that holds very high stakes for children, teachers, and schools. Children are, thus, presented with a storyline from the early years of schooling that requires them to be positioned within that binary, as their location above or below a cut score is inscribed in files and circulated among parents, teachers, and administrators. In addition, “The relationships of authority that exist around the communication of assumptions” (Lea & Street, 1998, p. 60) about writing are highly contradictory for both teachers and students. We imagine this is not the only state and district in which the explicitly stated philosophies surrounding writing instruction found in state documents, as well as the writing workshop approaches to instruction encouraged by districts, are undermined by the form of the assessments used. Indeed, this is an area ripe for further research.

Challenging binary constructions of proficiency matters because when children are identified early as lacking proficiency through an assessment that holds immense authority and currency, it follows that it could be difficult for them to reposition themselves as they negotiate their identity as a writer in nuanced ways over time. As research has revealed, if a child is positioned as a “problem,” either academically or behaviorally, those narratives often follow him or her and affect his or her opportunities to be a positive participant in classrooms and to positively identify with school learning (e.g., Collins, 2003, 2011; Ferguson, 2000; Valenzuela, 1999). Such findings underscore Davies and Harre’s (1990) argument that “[o]nce having taken up a particular position as one’s own, a person inevitably sees the world from the vantage point of that position” (p. 89).

The potential of such positioning to have a corrosive impact on a child’s relationship with school seems particularly salient for children like Mirabel and Philip, whose scores are so low that Mirabel’s facilities with and strong identity as a writer and Philip’s highly apparent potential for deep connection to writing could be overlooked. This is even more significant for Mirabel and Philip given that they belong to racial groups too often positioned on the negative side of the “achievement gap.” As Gutierrez, Larson, and Kreuter (1995) warn, “By not honoring the negotiation processes that must occur in communities where multiple and hybrid discourses exist, students from diverse linguistic and cultural backgrounds become marginalized and resistant participants in the learning process” (p. 413). We resonate deeply with literacy researchers’ efforts to redefine the discourse of achievement gaps as inequitable access to opportunity and as an indication of the education debt owed to groups who continue to face egregious systemic inequities in the United States (e.g., DeShano da Silva, Huguley, Kakli, & Rao, 2007; Ladson-Billings, 2006).

Our findings also point to an additional and striking level of unfairness, as the opportunities children did pursue and the successes they demonstrated were discounted in what is officially inscribed as achievement. Similar to what Lea and Street (1998, 2000) found in their work with university-level student writers, all four children illustrate how institutional structures determining what counts as successful writing will likely get it wrong if fuller accounts of students’ writing are not taken into account. Indeed, the fallibility of large-scale assessments is apparent in analyses conducted by policy researchers and in scandals and lawsuits surrounding scoring mistakes in the SAT, teacher competency tests, and state-level achievement tests (e.g., Booher-Jennings, 2005; Clark, 2004; Newsinferno.com, 2006). In short, Mirabel’s and Philip’s scores may represent differential access to opportunity grounded in current and historical systemic inequities in education and society, but those scores also misrepresent their capabilities in writing.

Indeed, our data suggest that both the form of on-demand writing assessments and the dichotomized sorting they facilitate potentially undermine some of the key goals, often articulated by policy makers, underlying the push for accountability through testing, including preparing students who can navigate the writing demands of the workplace and providing officials with an accurate picture of what students know and are able to do in writing (Common Core State Standards Initiative, 2010; U.S. Department of Education, n.d.). For instance, professional writers and those who employ writing in their careers write about topics they know and in which they are invested, even when required to write to tight deadlines. Thus, it seems unreasonable to expect novices or even advanced students to do their best writing or to demonstrate their potential for using writing effectively for authentic purposes when starting from the disadvantage of surprise prompts and prescribed processes. Furthermore, if students’ facilities and challenges as writers are rendered invisible by an assessment—as was the case for all four children—students’ scores are simply not providing the information about achievement in writing that the tests are assumed to convey. As April expressed so well, tests wouldn’t be tests if they allowed students to write whatever and however they wished. This meant, though, that the district and state writing assessments risked positioning the children in just the ways that would frustrate rather than promote their attempts to put their best writing on the page.

Our data also revealed a clear disconnect, if not total contradiction, between classroom writing practices and high-stakes tests. Again, our lenses of academic literacies and positioning theory provided a frame for considering the positions available to children within the implicit assumptions about what counted as legitimate knowledge and valid writing processes embedded in institutional discourses of the daily life of the classroom and the event of the writing assessments. As with the proficient versus not-proficient dichotomy discussed above, viewing students over time in the classroom and, then, in testing situations, raised troubling binaries in which children were required to position themselves. Key examples included flexible versus fixed notions of writing process, choice versus prescription, and social–interactive versus individual–solitary.

Although experts in writing pedagogy will recognize how Sue was developing as a writing teacher, our data showed important ways in which she adjusted her instruction and responses to students in light of what she learned about them across the year. For instance, Mirabel’s love of writing was apparent only once she was allowed the freedom to bring her out-of-school writing into the classroom, April needed permission to ignore the prewriting tools that Sue developed as part of writing instruction, Max’s strong feelings about what was appropriate to share in writing required Sue to adjust her expectations about how he would engage personal narrative, and Philip’s positive response to writing about his tribe led her to find as many ways as possible for him to engage that personal knowledge and experience. Each child represents the importance of honoring cultural and personal knowledge and preferred practices in fostering writing skills and processes in classrooms. This was poignantly the case for Philip, whose talent for description and evocative turn of phrase was not visible in school until the day he wrote about his tribe. The issues that facilitated and complicated children’s writing were resolved only through Sue’s knowledge of individual students’ needs and her flexibility to make adjustments to meet those needs (which is not to say, as she would be the first to admit, that more classroom support could have been proffered to children’s benefit). However, although pressures around testing may be more or less intensive for teachers in various school and district contexts, the discourse of proficiency and what counts as achievement as tied to test scores is embedded in federal policy. Hillocks’s (2002) research suggests that because high-stakes accountability policies often lead to intensive test preparation, state writing assessments limit instruction and lead to “truncated thinking” (p. 201). Such research points to the significance of institutional discourses surrounding writing achievement, for, in Dyson’s (2006) words, “If explicit institutional value is not placed on the practices guiding teachers’ and children’s composing, then teachers may have trouble allowing for, building on, and indeed hearing children’s landscape of voices, their foundational resource” (p. 37).

Given our findings, it is not surprising that we join with many other researchers across areas of education in arguing for assessments that give real institutional weight to more complex, and therefore more accurate, notions of young writers and their unique strengths and challenges (e.g., Hillocks, 2002; Laitsch, 2006; Ysseldyke, 2005). As our data and the theories on which we draw suggest, it is both theoretically and ethically imperative to understand the social, cultural, and intellectual resources students bring to writing, the unique challenges they face, and the competencies they exhibit that an on-demand test can miss. We are also pragmatic, however, and believe that teachers can act as important mediators of children’s experiences with the current form of large-scale writing assessment. Positioning theory offers a lens on how such mediation might be envisioned, first, as a way to open up new storylines to children about themselves as writers and, second, to explicitly situate the role of large-scale assessment in the larger narrative of children’s writing lives. Although Sue acted on her growing knowledge of her students as writers across the year, she did not make direct links between students’ individual identities and preferences as writers and the test that they would encounter on a designated day in spring.

A rich body of practitioner-oriented literature has grown from the idea that large-scale assessment should be taught to children as a distinct genre with particular features and expectations (e.g., Fountas & Pinell, 2001; Glennon & Greene, 2007; Montgomery, Calkins, Falk, & Santman, 1998; Routman, 2004). In writing, this instructional approach must be layered: Assessments are a particular genre that are embedded with multiple writing genres with which children will be required to engage in specific ways. This layered approach to genre instruction allows students to access the discourses surrounding large-scale assessment: what it is, why it is part of their educational experience, and how they can situate it within their larger classroom writing lives. In addition, we imagine the importance of individual conferences with students in which the quirks and nuances of their own positioning as writers are brought into explicit conversation with what they will encounter in large-scale assessments. To support children in crafting a storyline for themselves as writers, teachers certainly need support to hone their instruction. However, they also need to be critical consumers of assessments and explicit advocates for children in the assessment process, examining the limits of high-stakes assessment, the institutional assumptions about writing and the writing process embedded in those tests (and how those may contradict what they hear about “best practice” teaching from the very officials distributing the exams), and the complex ways individual children are positioned within the dichotomies inscribed in their scores. Our hope is that our study might serve as a context for conversation and a source of analytic tools for the teachers, teacher educators, and researchers who are involved in the crucial, daily work of supporting and understanding children as writers.

Children are discovering and experiencing writing within an increasingly dichotomized notion of proficiency and incompetence, a binary that is defied by the nuanced writing practices of children even as it is further substantiated by the potential misrepresentation of children’s competencies inscribed in their scores. As current policy constructs learning as a “race to the top,” these children’s experiences are a call to challenge a discourse that requires some children to be winners and others losers in that contest. Policy makers and administrators assume test scores reveal an accurate story of student achievement in writing. Mirabel, Max, April, and Philip represent what might lie lost, concealed, or misconstrued behind those scores. We need to re-envision what is possible when we define proficiency in academic writing, since we are simultaneously, however indirectly, defining the children who write in our schools.

Appendix A

Examples of Inductive and Deductive Codes Used in Analysis

Examples of inductive codes (with subcodes in parentheses):
• Assessment (self, teacher, high-stakes)
• Competence (self, others, content area)
• Norms for participation
• Type of participation (clarification, answer direct question, pose question, shares illustrative story, expresses confusion, provides explanation)
• Use of humor
• Teacher invitation to participate
• Child-initiated participation
• Frustration during writing (codes for body language and verbal)
• Engagement during writing (body language and verbal)
• Reference to out-of-school writing
• Family
• Writing-to-self connection
• Writing-to-peer connection
• Writing-to-teacher connection
• Writing-to-larger social issues/questions connection

Examples of deductive codes (with subcodes in parentheses):
Content area codes:
• Math
• Literacy (codes for reading and writing)
  ○ Specific math or literacy assignments/events
  ○ Individual children
Conceptual codes (paired with a content area code when relevant):
  ○ Intellectual/academic identities (e.g., perceived competence—self and others)
  ○ Writing/writer identities
  ○ Social identities (race, class, gender)
  ○ Social identities (family, peers, teacher, community)


Appendix BState Fourth Grade Writing Assessment Rubrics

Grade 4: Content, Organization, and Style Scoring Guide

Points Description

4 points:
• Maintains consistent focus on topic and has selected relevant details
• Has a logical organizational pattern and conveys a sense of wholeness and completeness
• Provides transitions which clearly serve to connect ideas
• Uses language effectively by exhibiting word choices that are engaging and appropriate for intended audience and purpose
• Includes sentences, or phrases where appropriate, of varied length and structure
• Allows the reader to sense the person behind the words

3 points:
• Maintains adequate focus on the topic and has adequate supporting details
• Has a logical organizational pattern and conveys a sense of wholeness and completeness, although some lapses occur
• Provides adequate transitions in an attempt to connect ideas
• Uses adequate language and appropriate word choices for intended audience and purpose
• Includes sentences, or phrases where appropriate, that are somewhat varied in length and structure
• Provides the reader with some sense of the person behind the words

2 points:
• Demonstrates an inconsistent focus and includes some supporting details, but may include extraneous or loosely related material
• Shows an attempt at an organizational pattern, but exhibits little sense of wholeness and completeness
• Provides transitions which are weak or inconsistent
• Has a limited and predictable vocabulary which may not be appropriate for the intended audience and purpose
• Shows limited variety in sentence length and structure
• Attempts somewhat to give the reader a sense of the person behind the words

1 point:
• Demonstrates little or no focus and few supporting details which may be inconsistent or interfere with the meaning of the text
• Has little evidence of an organizational pattern or any sense of wholeness and completeness
• Provides transitions which are poorly utilized, or fails to provide transitions
• Has a limited or inappropriate vocabulary for the intended audience and purpose
• Has little or no variety in sentence length and structure
• Provides the reader with little sense of the person behind the words


Grade 4: Conventions Scoring Guide

Points Description

2 points:
• Consistently follows the rules of Standard English for grammar and usage
• Consistently follows the rules of Standard English for spelling of commonly used words
• Consistently follows the rules of Standard English for capitalization
• Consistently follows the rules of Standard English for punctuation
• Exhibits the use of complete sentences except where purposeful fragments are used for effect

1 point:
• Generally follows the rules of Standard English for grammar and usage
• Generally follows the rules of Standard English for spelling of commonly used words
• Generally follows the rules of Standard English for capitalization
• Generally follows the rules of Standard English for punctuation
• Generally exhibits the use of complete sentences except where purposeful fragments are used for effect

0 points:
• Mostly does not follow the rules of Standard English for grammar and usage
• Mostly does not follow the rules of Standard English for spelling of commonly used words
• Mostly does not follow the rules of Standard English for capitalization
• Mostly does not follow the rules of Standard English for punctuation
• Exhibits errors in sentence structure that impede communication
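Read together, the two scoring guides in this appendix define a two-part score: a 1-4 rating for content, organization, and style, and a separate 0-2 rating for conventions. The article does not describe how the state combines or reports these ratings; the sketch below simply illustrates, under the assumption of two independent scales, how a scored response could be recorded and range-checked. The class and field names are hypothetical.

```python
from dataclasses import dataclass

# Illustrative only: the two scales in Appendix B treated as independent
# ratings. This is not the state's actual scoring or reporting procedure.
CONTENT_ORG_STYLE_POINTS = range(1, 5)  # 1-4 points
CONVENTIONS_POINTS = range(0, 3)        # 0-2 points

@dataclass
class ScoredResponse:
    response_id: str        # hypothetical identifier, not from the article
    content_org_style: int  # 1-4 per the first scoring guide
    conventions: int        # 0-2 per the conventions scoring guide

    def __post_init__(self) -> None:
        if self.content_org_style not in CONTENT_ORG_STYLE_POINTS:
            raise ValueError("content/organization/style score must be 1-4")
        if self.conventions not in CONVENTIONS_POINTS:
            raise ValueError("conventions score must be 0-2")

# Example: a response rated 3 for content/organization/style and 1 for conventions.
print(ScoredResponse(response_id="sample-01", content_org_style=3, conventions=1))
```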

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received the following financial support for the research, authorship, and/or publication of this article: Funding for research was provided by a Royalty Research Fund.

References

Abrams, L., Pedulla, J., & Madaus, G. (2003). Views from the classroom: Teachers’ opinions of statewide testing programs. Theory Into Practice, 42, 18-29.

Barksdale-Ladd, M., & Thomas, K. (2000). What’s at stake in high-stakes testing: Teachers and parents speak out. Journal of Teacher Education, 51, 384-397.

Berliner, D. (2007). The incompatibility of high-stakes testing and the development of skills for the twenty-first century. In R. Marzano (Ed.), On excellence in teaching (pp. 113-144). Bloomington, IN: Solution Tree Press.


Bloome, D., & Egan-Robertson, A. (1993). The social construction of intertextuality and classroom reading and writing. Reading Research Quarterly, 28(4), 303-333.

Bomer, R. (2005). You are here: The moment in literacy education. Research in the Teaching of English, 40, 373-379.

Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas accountability system. American Educational Research Journal, 42(2), 231-268.

Calkins, L. (1994). The art of teaching writing. New York, NY: Heinemann.

Clark, M. (2004, July 15). Some did pass teaching test: Essays were graded too harshly. Cincinnati Enquirer. Retrieved from http://www.enquirer.com/editions/2004/07/15/loc_teachertest15.html

Collins, K. M. (2003). Ability profiling and school failure: One child’s struggle to be seen as competent. Mahwah, NJ: Lawrence Erlbaum.

Collins, K. M. (2011). Discursive positioning in a fifth grade writing lesson: The making of a bad, bad boy. Urban Education, 46(4), 741-785.

Common Core State Standards Initiative. (2010). Preparing America’s students for college and career. Retrieved from http://www.corestandards.org/the-standards/english-language-arts-standards

Darling-Hammond, L., & Rustique-Forrester, E. (2005). The consequences of student testing for teaching and teacher quality. Yearbook of the National Society for the Study of Education, 104(2), 289-319.

Davies, B. (2000). A body of writing: 1990-1999. New York, NY: Rowman Altamira.

Davies, B., & Harre, R. (1990). Positioning: The discursive production of selves. Journal for the Theory of Social Behavior, 20(1), 43-63.

Delpit, L. (1995). Other people’s children: Cultural conflict in the classroom. New York, NY: New Press.

DeShano da Silva, C., Huguley, J. P., Kakli, Z., & Rao, R. (Eds.). (2007). The opportunity gap: Achievement and inequality in education. Cambridge, MA: Harvard Education Press.

Dixon, C. N., Frank, C. R., & Green, J. L. (1999). Classrooms as cultures: Understanding the constructed nature of life in classrooms. Primary Voices, 7(3), 4-8.

Dutro, E., Kazemi, E., & Balf, R. (2005). The aftermath of “you’re only half”: Multiracial identities in the literacy classroom. Language Arts, 83, 96-106.

Dutro, E., Kazemi, E., & Balf, R. (2006). Making sense of “the boy who died”: Tales of a struggling successful writer. Reading and Writing Quarterly, 22, 325-356.

Dutro, E., Kazemi, E., Balf, R., & Lin, Y. (2008). “What are you and where are you from?” Race, identity, and the vicissitudes of cultural relevance in an urban elementary classroom. Urban Education, 43, 269-300.

Dutro, E., & Selland, M. (2012). “I like to read, but I know I’m not good at it”: Children’s perspectives on high-stakes literacy assessment in a high-poverty classroom. Curriculum Inquiry, 42, 340-367.

Dyson, A. H. (1993). The social worlds of children learning to write in an urban primary school. New York, NY: Teachers College Press.

Dyson, A. H. (2001). Donkey Kong in Little Bear country: A first grader’s composing development in the media spotlight. Elementary School Journal, 101(4), 417-433.


Dyson, A. H. (2006). On saying it right (write): “Fix-its” in the foundations of learning to write. Research in the Teaching of English, 41(1), 8-42.

Dyson, A. H., & Genishi, C. (2005). On the case: Approaches to language and literacy research. New York, NY: Teachers College Press.

Egan-Robertson, A. (1998). Learning about culture, language, and power: Understanding relationships among personhood, literacy practices, and intertextuality. Journal of Literacy Research, 30(4), 449-488.

Enciso, P. (2001). Taking our seats: The consequences of positioning in reading assessments. Theory Into Practice, 40, 166-174.

Fairclough, N. (1989). Language and power. New York, NY: Longman.

Ferguson, A. A. (2000). Bad boys: Public schools in the making of Black masculinity. Ann Arbor: University of Michigan Press.

Fletcher, R., & Portalupi, J. (2001). Writer’s workshop: The essential guide. New York, NY: Heinemann.

Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). London, UK: Allen Lane.

Fountas, I., & Pinell, G. S. (2001). Guiding readers and writers, Grades 3-6: Teaching comprehension, genre, and content literacy. Portsmouth, NH: Heinemann.

Gershman, K. (2004). They always test us on things we haven’t read: Teen laments and lessons learned. New York, NY: Hamilton Books.

Glennon, D. M., & Greene, A. H. (2007). Test talk: Integrating test preparation into reading workshop. New York, NY: Stenhouse.

Graves, D. (2003). Writing: Teachers and children at work. Portsmouth, NH: Heinemann.

Gutierrez, K., Larson, J., & Kreuter, B. (1995). The scripted classroom: The value of the subjugated perspective. Urban Education, 29(4), 410-442.

Harre, R., Moghaddam, F. M., Rothbart, D., & Sabat, S. (2009). Recent advancements in positioning theory. Theory and Psychology, 19, 5-31.

Harre, R., & Van Langenhove, L. (1998). Positioning theory: Moral contexts of international action. London, UK: Wiley-Blackwell.

Hillocks, G. (2002). The testing trap: How state writing assessments control learning. New York, NY: Teachers College Press.

Jones, G., Jones, B. D., Hardin, B., Chapman, L., Yarbrough, T., & Davis, M. (1999). The impact of high-stakes testing on teachers and students in North Carolina. Phi Delta Kappan, 81(3), 199-203.

Koretz, D. (2008). Measuring up: What educational testing really tells us. Cambridge, MA: Harvard University Press.

Ladson-Billings, G. (2006). From the achievement gap to the education debt: Understanding achievement in U.S. schools. Educational Researcher, 35(7), 3-12.

Laitsch, D. A. (2006). Assessment, high stakes, and alternative visions: Appropriate use of the right tools to leverage improvement. Boulder, CO: National Education Policy Center. Retrieved from http://www.scribd.com/doc/38015472/EPSL-0611-222-EPRU#archive

Lea, M. R., & Street, B. V. (1998). Student writing in higher education: An academic literacies approach. Studies in Higher Education, 23(2), 157-172.


Lea, M., & Street, B. (2000). Student writing and staff feedback in higher education: An academic literacies approach. In M. Lea & B. Stierer (Eds.), Student writing in higher education: New contexts (pp. 32-46). Buckingham, UK: Society for Research into Higher Education & Open University Press.

Lemke, J. L. (1995). Textual politics: Discourse and social dynamics. London, UK: Taylor & Francis.

Lensmire, T. (1994). When children write: Critical re-visions of the writing workshop. New York, NY: Teachers College Press.

Lewis, C., Enciso, P., & Moje, E. (2007). Reframing sociocultural research on literacy: Identity, agency, and power. New York, NY: Routledge.

Linn, R. L. (2000). Assessments and accountability. Educational Researcher, 29(2), 4-16.

Lomax, R., West, M. M., Harmon, M. C., Viator, K. A., & Madaus, G. F. (1995). The impact of mandated standardized testing on minority students. Journal of Negro Education, 64, 171-185.

Luce-Kapler, R., Catlin, S., Sumara, D., & Kocher, P. (2011). Voicing consciousness: The mind in writing. Changing English, 18, 161-172.

Luke, A. (1995). Text and discourse in education: An introduction to critical discourse analysis. Review of Research in Education, 21, 3-48.

Mathison, S., & Freeman, M. (2003, September 19). Constraining elementary teachers’ work: Dilemmas and paradoxes created by state mandated testing. Education Policy Analysis Archives, 11, 34. Retrieved from http://epaa.asu.edu/epaa/v11n34/

McCarthey, S. (2001). Identity construction in elementary readers and writers. Reading Research Quarterly, 36(2), 123-152.

McCarthey, S. J. (2008). The impact of No Child Left Behind on teachers’ writing instruction. Written Communication, 25, 462-505.

McVee, M. B., Baldassarre, M., & Bailey, N. (2004). Positioning theory as lens to explore teachers’ beliefs about literacy and culture. In C. M. Fairbanks, J. Worthy, B. Maloch, J. V. Hoffman, & D. L. Schallert (Eds.), 53rd National Reading Conference yearbook (pp. 281-295). Oak Creek, WI: National Reading Conference.

McVee, M., Brock, C., & Glazier, J. (2011). Sociocultural positioning in literacy: Exploring culture, discourse, narrative, and power in diverse educational contexts. New York, NY: Hampton Press.

Moll, L., Saez, R., & Dworin, J. (2001). Exploring biliteracy: Two case examples of writing as a social practice. Elementary School Journal, 101(4), 435-449.

Montgomery, K., Calkins, L., Falk, B., & Santman, D. (1998). A teacher’s guide to standardized reading tests: Knowledge is power. Portsmouth, NH: Heinemann.

Muhr, T. (1996). ATLAS.ti: The knowledge workbench [Computer software]. Thousand Oaks, CA: Scolari.

National Council of Teachers of English. (2004). NCTE beliefs about the teaching of writing. Retrieved from http://www.ncte.org/positions/statements/writingbeliefs

Newsinferno.com. (2006, April 12). SAT scoring debacle prompts apology by College Board President. Retrieved from http://newsinferno.social2b.com/sat-scoring-debacle-prompts-apology-by-college-board-president/


Nichols, S. L. (2007). High-stakes testing. Journal of Applied School Psychology, 23(2), 47-64.

Nichols, S. L., & Berliner, D. C. (2007). Collateral damage. Cambridge, MA: Harvard Education Press.

Noddings, N. (2002). High stakes testing and the distortion of care. In J. Paul, C. D. Lavely, E. Cranton-Gingras, & L. Taylor (Eds.), Rethinking professional issues in special education (pp. 69-82). New York, NY: Ablex/Greenwood.

Perna, L., & Thomas, S. (2009). Barriers to college opportunity: The unintended consequences of state-mandated testing. Educational Policy, 23, 451-479.

Roderick, M., & Engel, M. (2001). The grasshopper and the ant: Motivational responses of low-achieving students to high-stakes testing. Educational Evaluation and Policy Analysis, 23, 197-227.

Routman, R. (2004). Writing essentials: Raising expectations and results while simplifying teaching. Portsmouth, NH: Heinemann.

Scherff, L., & Piazza, C. (2005). The more things change, the more they stay the same: A survey of high school students’ writing experiences. Research in the Teaching of English, 39(3), 271-301.

Shepard, L. A. (1995). Using assessment to improve learning. Educational Leadership, 54(5), 38-43.

Shepard, L. A. (2002). The hazards of high-stakes testing: Hyped by many as the key to improving the quality of education, testing can do more harm than good if the limitations of tests are not understood (Issues in Science and Technology 19). Washington, DC: National Academy of Sciences.

Sperling, M., & Woodlief, L. (1997). Two classrooms, two writing communities: Urban and suburban tenth-graders learning to write. Research in the Teaching of English, 31(2), 205-239.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Street, B. (1993). Cross-cultural approaches to literacy. Cambridge, UK: Cambridge University Press.

U.S. Department of Education. (n.d.). Race to the Top program: Executive summary. Retrieved from http://www2.ed.gov/programs/racetothetop/index.html

Valenzuela, A. (1999). Subtractive schooling: US Mexican youth and the politics of caring. Albany: State University of New York Press.

Vasudevan, L., & Campano, G. (2009). The social production of adolescent risk and the promise of adolescent literacies. Review of Research in Education, 33, 310-353.

Wheelock, A., Bebell, D., & Haney, W. (2000). Student self-portraits as test-takers: Variations, contextual differences, and assumptions about motivation. Teachers College Record. Retrieved from http://www.tcrecord.org

Yin, R. K. (2008). Case study research: Design and methods. Thousand Oaks, CA: Sage.

Ysseldyke, J. (2005). Assessment and decision-making for students with disabilities: What if this is as good as it gets? Learning Disabilities Quarterly, 28, 125-128.

Zacher, J. (2011). The over-testing of English language learners and their teachers. New York, NY: Teachers College Press.


Author Biographies

Elizabeth Dutro is an associate professor in the School of Education at the University of Colorado at Boulder. Her research investigates the intersections of literacy, identity, and children and youth’s opportunities for positive, sustained relationships with schooling in racially diverse, high-poverty classrooms. A primary strand of her work analyzes the presence and consequences of out-of-school life experiences and discourses of race, class, and gender in students’ encounters with literacy curricula, instruction, and high-stakes accountability policy. She was a recipient of the Promising Researcher Award and the Alan C. Purves Award from the National Council of Teachers of English.

Makenzie K. Selland is a doctoral candidate in literacy at the University of Colorado at Boulder and recently accepted a position as an assistant professor of education at Utah Valley University. Her research interests include literacy curriculum and teacher education, specifically around storytelling in the student teaching experience. Before coming to graduate school, she taught middle and high school English in Washington, D.C.

Andrea C. Bien is a doctoral candidate in curriculum & instruction: literacy studies at the University of Colorado at Boulder. Her research examines how conceptualizations of reading proficiency and mandated curricula intersect to shape literacy classroom teaching and learning practices in diverse and high-poverty classrooms.