<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <docs>http://www.rssboard.org/rss-specification</docs>
    <atom:link rel="self" type="application/rss+xml" href="https://escholarship.org/uc/jwa/rss"/>
    <ttl>720</ttl>
    <title>Recent jwa items</title>
    <link>https://escholarship.org/uc/jwa/rss</link>
    <description>Recent eScholarship items from Journal of Writing Assessment</description>
    <pubDate>Fri, 15 May 2026 11:15:26 +0000</pubDate>
    <item>
      <title>Exploring Fairness and Seeking Social Justice for Writing Assessment: ePortfolios, Language Difference, and Metacognition</title>
      <link>https://escholarship.org/uc/item/6m55q8xg</link>
      <description>&lt;p&gt;This quantitative validation analysis applies antiracist methods to longitudinal ePortfolio assessment data to study language difference through the lens of a metacognitive literacy construct. With interdisciplinary research reshaping the field of writing assessment using quantitative and intersectional demographic approaches, this essay advances language as a meaningful register of validity evidence and an indicator of fairness across linguistically and racially heterogeneous students sorted into three cohorts that establish comparisons of ePortfolio assessment scores from samples tracking from 2016–2020. To contribute to the critical study of social justice in writing assessment, this exploratory analysis offers nuanced responses to its guiding heuristic question: Can ePortfolios be instruments of fairness in a local assessment ecology? For this formative curricular assessment, rigorous statistical methods complicate claims derived from the ePortfolio assessment results,...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6m55q8xg</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Queen, Brad</name>
      </author>
      <author>
        <name>Kirby, Kate</name>
      </author>
      <author>
        <name>Eslami, Maryam</name>
      </author>
      <author>
        <name>Denaro, Kameryn</name>
        <uri>https://orcid.org/0000-0001-9175-1640</uri>
      </author>
    </item>
    <item>
      <title>On Neurodivergence/Disability and Labor-Based Grading: A Response to Kryger and Zimmerman (2020) and Carillo (2021)</title>
      <link>https://escholarship.org/uc/item/4gf435hz</link>
      <description>This essay responds to existing scholarship on neurodivergence/disability and labor-based grading, contending that current critiques define labor-based grading too narrowly and conflate the lack of quantitative grades with a lack of scaffolding. The essay further suggests that labor-based or other alternative assessment approaches, especially those which move away from authoritative, quality-based judgments of student work, invite students to express agency over and open a conversation about expectations around writing processes and habits. The article concludes by calling for additional research and conversation about how labor-based approaches may account for access and accessibility.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4gf435hz</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Von Bergen, Megan</name>
      </author>
    </item>
    <item>
      <title>Continuing the Conversation: A Response to Megan Von Bergen’s “On Neurodivergence/Disability and Labor-Based Grading”</title>
      <link>https://escholarship.org/uc/item/21t5n34z</link>
      <description>Continuing the Conversation: A Response to Megan Von Bergen’s “On Neurodivergence/Disability and Labor-Based Grading”</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/21t5n34z</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Carillo, Ellen</name>
      </author>
    </item>
    <item>
      <title>Editors’ Introduction: New Editors, New Directions in Writing Assessment</title>
      <link>https://escholarship.org/uc/item/1n62d85w</link>
      <description>&lt;p&gt;The JWA 18.2 editors’ introduction contains a welcome by the new journal editors, Mathew Gomes, Lizbett Tinoco, and Stacy Wittstock. It also provides an overview of the two articles and symposium in the issue: Bradley Queen, Kate Kirby, Maryam Eslami, and Kameryn Denaro’s (2025) exploration of ePortfolios as instruments of fairness; Daniel Ernst’s (2025) examination of the use of automated writing evaluation (AWE) technology in writing assessment; and Megan Von Bergen’s (2025) critique of labor-based grading discussions by Kryger and Zimmerman (2020) and Carillo (2021), followed by responses from Griffin X. Zimmerman (2025) and Ellen Carillo (2025).&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/1n62d85w</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Gomes, Mathew</name>
      </author>
      <author>
        <name>Tinoco, Lizbett</name>
      </author>
      <author>
        <name>Wittstock, Stacy</name>
      </author>
    </item>
    <item>
      <title>The Effects of Automated Writing Evaluation Technology on Improving Student Writing</title>
      <link>https://escholarship.org/uc/item/0q34x1vh</link>
      <description>&lt;p&gt;Advances in automated writing evaluation technology have shifted the aims of the tools from reliably and holistically scoring and ranking essays to providing formative and analytical feedback to users for improving writing. This study uses a quasi-experimental design to test the ability of one automated writing evaluation program to improve college student writing. Using a comparative judgment model of assessment, four college writing instructors evaluated pairs of essays with one per pair treated by the program and selected the better of each pair. The essays treated by the automated evaluation program significantly underperformed the null hypothesis of 50%. Results suggest the automated evaluation program fails to improve student writing in the eyes of instructors. Theories and implications for why are discussed.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/0q34x1vh</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Ernst, Daniel</name>
      </author>
    </item>
    <item>
      <title>Troubling Definitions, Expanding Conceptions: A Response</title>
      <link>https://escholarship.org/uc/item/01s2d32g</link>
      <description>Troubling Definitions, Expanding Conceptions: A Response</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/01s2d32g</guid>
      <pubDate>Mon, 8 Dec 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Zimmerman, Griffin Xander</name>
      </author>
    </item>
    <item>
      <title>Editor’s Introduction: Placement and Its Discontents or The Long Winding Road toward Change</title>
      <link>https://escholarship.org/uc/item/9m68h5g3</link>
      <description>&lt;p&gt;The JWA 18.1 editor's introduction contains Carl Whithaus's reflections on 10 years editing the journal. It also provides an overview of the six articles in the issue: Sallie Koenig, Catrina Mitchum, and Rochelle Rodrigo's (2025) exploration of completion rubrics on student learning and agency in online asynchronous courses; Maggie Fernandes, Emily Brier, and Megan McIntyre's (2025) critique of "ungrading" and development of alternative writing assessments to more effectively achieve the goals of "ungrading"; Kate L Pantelides and Erin Whittig's (2025) section introduction updating us on Student Self Placement (SSP); Amy Ferdinandt Stolley, Dauvan Mulally, and Craig Hulst's (2025) 30-year retrospective on how Directed Self Placement (DSP) has developed and changed over time at Grand Valley State University; Genie Giaimo and Kristina Reardon’s (2025) examination of how SSP can encourage changes across different writing courses at a small liberal arts college; and, Jessica...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9m68h5g3</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Everything Old Is New Again: Reconsidering DSP Amid the Changing Academic Landscape at Grand Valley State University</title>
      <link>https://escholarship.org/uc/item/7696n7gf</link>
      <description>&lt;p&gt;As the origin of directed self-placement (DSP), Grand Valley State University is in the unique position of having created, adapted, and maintained a DSP program for almost thirty years. This article explores the history of GVSU’s placement practices to articulate what we have learned about DSP amid our institution’s changing academic landscape. Using interviews and reflections from past and current administrators who lead our placement practices, we demonstrate that the philosophical foundation of DSP—student self-efficacy—remains the guiding light of our placement practices. However, we argue that multiple changes experienced at many institutions, including new admissions standards, changing student demographics, and the lingering effects of the COVID-19 pandemic, require WPAs to consider new questions about DSP to ensure that our placement practices promote equity and access to all students.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7696n7gf</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Ferdinandt Stolley, Amy</name>
      </author>
      <author>
        <name>Mulally, Dauvan</name>
      </author>
      <author>
        <name>Hulst, Craig</name>
      </author>
    </item>
    <item>
      <title>Wrap-around support via a directed self-placement model: A treatment for SLAC writing programs</title>
      <link>https://escholarship.org/uc/item/62k3q7kp</link>
      <description>In this paper, two WPAs at small and highly selective liberal arts colleges (SLACs) discuss the process of developing and implementing a “wrap-around” directed self placement (DSP) model. Beginning with a braided narrative, the authors discuss the impetus for the DSP, its impact on course placement, as well as using DSP data to create robust support plans for individual students. Of course, given the elite nature of the authors’ institutions, we also discuss how to apply a DSP model in a competitive and highly selective context where there are few, if any, developmental courses. Here, we offer possibilities for DSPs at SLACs that include retention and persistence tracking, as well as tracing self-efficacy by disciplinary specialization (i.e., STEM). We end by sharing our instruments and guidance on how SLAC WPAs can use DSP in novel and more comprehensive ways.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/62k3q7kp</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Giaimo, Genie</name>
      </author>
      <author>
        <name>Reardon, Kristina</name>
      </author>
    </item>
    <item>
      <title>Afterword: Finding the Right Note in Writing Placement</title>
      <link>https://escholarship.org/uc/item/4s96g8sw</link>
      <description>Afterword: Finding the Right Note in Writing Placement</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4s96g8sw</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Nastal, Jessica</name>
      </author>
      <author>
        <name>Messer, Kris</name>
      </author>
    </item>
    <item>
      <title>The Trouble With “Ungrading”: Toward Disciplinary Specificity in Alternative Writing Assessment</title>
      <link>https://escholarship.org/uc/item/3s97p28z</link>
      <description>Responding to the emergent discourse around “ungrading,” this essay articulates the need for disciplinary conversations about alternative writing assessments, conversations that center work on antiracism, Black Linguistic Justice, and anti-ableist composition pedagogies and policies. From that foundation, we argue, we have the chance to build concrete, specific, and equitable alternative assessment practices that also include the practices and voices of the faculty and graduate students most likely to be teaching first-year composition courses.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3s97p28z</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Fernandes, Maggie</name>
      </author>
      <author>
        <name>Brier, Emily</name>
      </author>
      <author>
        <name>McIntyre, Megan</name>
      </author>
    </item>
    <item>
      <title>Using Completion Rubrics to Grade Engagement in Online Spaces</title>
      <link>https://escholarship.org/uc/item/3m47g95t</link>
      <description>&lt;p&gt;This study examines how completion rubrics impact student learning and agency in online asynchronous courses. The study was conducted during the Fall 2021 term in three 7.5-week courses: two sections of ENGL101 and one section of ENGL300. The analysis focuses on student survey responses. We found that student responses focused on defining labor, coming to terms with invisible labor, how they experienced this new assessment system, their perceptions about the connection between assessment and learning, and finally four distinct time-related themes. First, time emerged as a theme while students defined labor. Second, it appeared repeatedly as students discussed invisible labor and grading not accounting for time a task might take. Third, students distinguished between how previous experience and skills impact an individual’s time on task. Finally, students associated saving time with gaining agency and being able to prioritize other areas outside of the class. Completion rubrics...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3m47g95t</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Koenig, Sallie</name>
      </author>
      <author>
        <name>Mitchum, Catrina</name>
      </author>
      <author>
        <name>Rodrigo, Shelley</name>
      </author>
    </item>
    <item>
      <title>Collaboratively Building Our SSP Scholarship (Because Placement is Still Everyone's Business)</title>
      <link>https://escholarship.org/uc/item/1nm1x2zp</link>
      <description>Collaboratively Building Our SSP Scholarship (Because Placement is Still Everyone's Business)</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/1nm1x2zp</guid>
      <pubDate>Wed, 9 Apr 2025 00:00:00 +0000</pubDate>
      <author>
        <name>Pantelides, Kate L</name>
      </author>
      <author>
        <name>Whittig, Erin</name>
      </author>
    </item>
    <item>
      <title>Editor’s Introduction: The “Accidental California Issue” – Critical Questions about Fairness and Equity in Writing Assessment and Placement</title>
      <link>https://escholarship.org/uc/item/9hn4297f</link>
      <description>&lt;p&gt;JWA 17.2 features five articles that explore evolving practices and critical questions around fairness and equity. Daniel Gross (2024) examines the implications of construct validity in the discontinuation of the Analytical Writing Placement Examination (AWPE) at the University of California. Julia Voss, Loring Pfeiffer, and Nicole Branch (2024) share how they used interviews from programmatic assessment to understand student learning outcomes in ways that value minoritized students’ experiential knowledge. Edward Comstock (2024) investigates the interplay between self-efficacy and programmatic assessment, emphasizing the value of qualitative methods in evaluating writing programs. Sarah Hirsch, Kenneth Smith, and Madeleine Sorapure (2024) present on Collaborative Writing Placement (CWP). Julie Prebel and Justin Li (2024) critique a first-year writing portfolio assessment through lenses of equity, curricular design, performance, and reliability.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9hn4297f</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>The Strange Loop of Self-Efficacy and the Value of Focus Groups in Writing Program Assessment</title>
      <link>https://escholarship.org/uc/item/6vn170k5</link>
      <description>It’s long been presumed that increases in self-efficacy are correlated with other “habits of mind,” including more effective metacognitive strategies that will enable writing skills to transfer to different situations. Similarly, it’s long been understood that high self-efficacy is associated with more productive habits of mind and more positive emotional dispositions towards writing tasks. However, this two-year assessment of College Writing classes at a private, mid-sized, urban four-year university complicates these assumptions. By supplementing substantial survey data with the analysis of data collected in focus groups, we found that the development of self-efficacy does not necessarily correlate to the development of more sophisticated epistemological beliefs—beliefs about how learning happens—nor the development of rhetorically effective “writing dispositions.” In short, suggesting the value of focus groups in assessment, we discovered a “strange...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6vn170k5</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Comstock, Edward</name>
      </author>
    </item>
    <item>
      <title>Assessment is Constructed and Contextual: Identity, Information Literacy, and Interview-Based Methodologies in the First-Year Writing Classroom</title>
      <link>https://escholarship.org/uc/item/52k3376g</link>
      <description>&lt;p&gt;Over the past twenty years, the field of writing assessment has moved from critical theories that question traditional models of validity and objectivity (Huot, 2002; Lynne, 2004) to scholarship that exposes how traditional assessment perpetuates inequality (Inoue, 2015, 2019) and advocates new approaches that take social justice as their central goal (Poe et al., 2018). We report on a collaboration between two writing instructors and one librarian that assessed first-year writing (FYW) students' information literacy when researching and writing with popular news sources. In addition to the typical practice of analyzing students' written work, this project used interviews as an assessment methodology. This research produced three important findings: 1) minoritized students demonstrated superior critical information literacy skills compared to majoritized students; 2) these differences were made visible through the use of multiple measures (written artifacts and interviews); and 3) the...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/52k3376g</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Voss, Julia</name>
      </author>
      <author>
        <name>Branch, Nicole</name>
      </author>
      <author>
        <name>Pfeiffer, Loring</name>
      </author>
    </item>
    <item>
      <title>Collaborative Writing Placement: Partnering with Students in the Placement Process</title>
      <link>https://escholarship.org/uc/item/4b85378n</link>
      <description>&lt;p&gt;This paper will discuss how the Writing Program at the University of California, Santa Barbara “flipped the script” on placement by implementing a model that emphasizes the importance of student voices. Our Collaborative Writing Placement (CWP) shares many similarities with Directed Self-Placement (DSP) in that its instrument consists of survey questions and reflective writing opportunities (Aull, 2021; Gere, Ruggles, et al., 2013). But it differs from DSP in that students work with writing faculty in choosing the first-year course that is the best fit for them. Through an examination of our initial data, and the first two years of CWP’s implementation, our paper will discuss how the CWP offers another avenue for promoting student agency and generating more equitable placement outcomes.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4b85378n</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Hirsch, Sarah</name>
      </author>
      <author>
        <name>Smith, Kenny</name>
      </author>
      <author>
        <name>Sorapure, Madeleine</name>
      </author>
    </item>
    <item>
      <title>Multifaceted Equity: Critiquing a First-Year Writing Assessment through Curricular, Performance, and Reliability Lenses</title>
      <link>https://escholarship.org/uc/item/3qr482bb</link>
      <description>&lt;p&gt;This article examines whether a college’s new portfolio-based first-year writing assessment process is equitable. We build on the existing literature by arguing that equity must be assessed through multiple, complementary facets within the writing assessment ecology. We present and operationalize three lenses through which to examine the equity of a first-year writing assessment process. The curricular lens shows that the new writing assessment is more aligned with and improved the classroom pedagogy of our first-year seminars. The performance lens shows the ongoing disparities between students across demographic backgrounds. Finally, the reliability lens reveals faculty differences in how they interpret the writing rubric. We conclude that while the new portfolio-based writing assessment is more equitable, it is also constrained by institutional structures and systems of power that prevent it from being equitable, period.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3qr482bb</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Prebel, Julie</name>
      </author>
      <author>
        <name>Li, Justin</name>
      </author>
    </item>
    <item>
      <title>Construct Validity and the Demise of the Analytical Writing Placement Examination (AWPE) at the University of California: A Tale of Social Mobility</title>
      <link>https://escholarship.org/uc/item/3gm126sq</link>
      <description>&lt;p&gt;In 2021, the University of California System ended its decades-old timed writing assessment for course placement, due in part to challenges presented by the COVID-19 pandemic. Beyond practical crisis, however, the event marks a sea change in educational philosophy away from a universalizing model of cognitive development, which dominated in the 1970s and 1980s, towards a concern for social mobility and student self-assessment. The article explores the historical factors that led to this change, including the emergence of the social mobility index as a new method for evaluating student success. It also unpacks UC's discourse on preparatory education and levels of proficiency, emphasizing instead fairness in writing assessment.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3gm126sq</guid>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Gross, Daniel M.</name>
      </author>
    </item>
    <item>
      <title>Localizing Directed Self-Placement: UX Stories and Methods</title>
      <link>https://escholarship.org/uc/item/9sm4851w</link>
      <description>This article seeks to address the need for research supporting localization efforts in placement assessment. We argue that as a highly technical communication endeavor, directed self-placement (DSP) and its developers can benefit from research in technical and professional communication (TPC). We synthesize the theoretical relations between DSP and TPC, especially regarding models of localization, and demonstrate how implementing user experience (UX) design can help address placement equity concerns by foregrounding accessibility and usability from the beginning. We follow this discussion of DSP and TPC scholarship with storied examples from our institution, providing a sample range of UX methods that (1) are flexible across contexts, (2) are relatively manageable to implement, and (3) are cognizant of WPA, staff, and students’ time, labor, and compensation concerns. We propose DSP as a form of advocacy, and we demonstrate how UX method/ologies are an excellent choice for DSP...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9sm4851w</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Kryger, Kathleen</name>
      </author>
      <author>
        <name>Mitchum, Catrina</name>
      </author>
      <author>
        <name>Higgins, Aly</name>
      </author>
    </item>
    <item>
      <title>Informing Self-Placement: A Polyvocal Narrative Case Study</title>
      <link>https://escholarship.org/uc/item/9fw125t3</link>
      <description>This article provides a polyvocal narrative of the development, initial assessment, and ongoing revision of an Informed Self-Placement (ISP) process initially implemented during the COVID pandemic. The authors intersperse a collectively narrated description of how the ISP unfolded in its first two years with individual reflections on those experiences from a variety of positions and identities. Data so far suggest that this ISP process has narrowed but not fully closed racial equity gaps in first-year writing placement while maintaining enrollments and academic performance in the first-year writing course sequence. Persistent equity issues reside not only in the ISP instrument itself but also in the systems by which students learn about the ISP and the opportunities they have to complete it.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9fw125t3</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Toth, Christie</name>
      </author>
      <author>
        <name>Andrus, Jennifer</name>
      </author>
      <author>
        <name>Onwuzuruoha, Nkenna</name>
      </author>
      <author>
        <name>Clawson, Nicole</name>
      </author>
      <author>
        <name>Fraser, Pietera</name>
      </author>
      <author>
        <name>Fochs, Aubrey</name>
      </author>
      <author>
        <name>Rivera Aguilar, Samuel</name>
      </author>
    </item>
    <item>
      <title>(Re)Placing Personalis: A Study of Placement Reform and Self-Construction in Mission-Driven Contexts</title>
      <link>https://escholarship.org/uc/item/8bv4d2tb</link>
      <description>Recent movements in higher education have opened many opportunities for writing program administrators to reform first-year writing placement procedures, including continued development and adaptation of Directed Self Placement (DSP) models alongside ongoing research into their potential to foster student agency and advance linguistic, racial, and social justice in the academy. Our study traces and compares the efforts of two writing program administrators to reform flawed placement processes at their two mission-driven liberal arts institutions—one, a small Lasallian university and Hispanic-serving Institution in Northern California; the other, a private research Jesuit university located in New York City. Using inter-institutional, grounded theory research, this study examines students' reflections on their placement choices to understand “&lt;i&gt;substantive validity&lt;/i&gt;,” inquiring intentionally into ways that students self-locate with regard to their self-placement assessments...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8bv4d2tb</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Sweeney, Meghan</name>
      </author>
      <author>
        <name>Colombini, Crystal</name>
      </author>
    </item>
    <item>
      <title>Placement is Everyone’s Business: A Love Letter to Our SSP Coalition</title>
      <link>https://escholarship.org/uc/item/7v9830xh</link>
      <description>In this introduction to the special issue, the co-editors offer the umbrella term "methods of student self-placement" (SSP) to refer to any placement mechanism that includes student choice so that we can further build a theoretical apparatus, gather much-needed empirical data, and subsequently flesh out meaningful differences in approaches. They argue that just as SSP asks us to rethink the mission of first-year writing, it also asks us to rethink some of the divisions in Writing Studies because placement work is meaningful across the university. Ultimately, they conclude that SSP isn't an easy fix for systemic problems in higher education, but it is powerful in fully acknowledging the complexity of placement and meeting students' diverse learning needs.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7v9830xh</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Pantelides, Kate L</name>
      </author>
      <author>
        <name>Whittig, Erin</name>
      </author>
    </item>
    <item>
      <title>Editor’s Introduction: Special Issue on Student Self Placement (SSP)</title>
      <link>https://escholarship.org/uc/item/7mp54031</link>
      <description>Editor’s Introduction: Special Issue on Student Self Placement (SSP)</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7mp54031</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>It Takes a Campus: Agility in the Development of Directed Self-Placement</title>
      <link>https://escholarship.org/uc/item/7gg6v4gj</link>
      <description>Transitioning from a conventional placement model for first-year writing to a student self-placement (SSP) model requires many stakeholders to shift their perspectives on students, assessment, and the nature of the work of writing program administrators (WPAs). This article recounts the communicative and administrative agility involved in launching SSP while simultaneously researching its effects on student success. It also foregrounds the shifts in numerous roles--including those of instructors, students, and advisors, and even our own roles as WPA-researchers--that have been prompted by the transition to SSP. In particular, this article explores the connections between those roles and academic paternalism--an attitude that presumes to know what is best for students, that doubts students' abilities to make good placement decisions, and that treats conventional placement outcomes as the measure against which SSP should be judged. Adherence to academic paternalism and its investment...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7gg6v4gj</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Whitney, Kelly</name>
      </author>
      <author>
        <name>Skinner, Carolyn</name>
      </author>
    </item>
    <item>
      <title>Directed Self-Placement for Multilingual, Multicultural International Students</title>
      <link>https://escholarship.org/uc/item/3wz6s1qb</link>
      <description>Directed self-placement (DSP) methods remain relatively rare in multilingual writing programs because such methods present unique ethical and academic risks. Grounded in five years of institutional research, this article reports on a first-year writing program in which DSP is the sole means of placement for international students and in which the international student population is linguistically, educationally, and culturally diverse. We offer logistical and technical guidance for creating DSP programs for multilingual writers, and we argue that DSP can be a vehicle for more equitable, socially just writing placement for multilingual, multicultural writers.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3wz6s1qb</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Johnson, Kristine</name>
      </author>
      <author>
        <name>Vander Bie, Sara</name>
      </author>
    </item>
    <item>
      <title>After Implementation: Assessing Student Self-Placement in College Writing Programs</title>
      <link>https://escholarship.org/uc/item/3pc0n4dz</link>
      <description>&lt;p&gt;While a growing body of research provides instruction on how to implement student self-placement (SSP) for college writing courses, there is a gap in the literature about how to evaluate SSP after implementation. This article offers strategies and recommendations for assessing SSP processes, based on the authors’ experiences of developing a new SSP mechanism and evaluating its effectiveness over several years. This article presents statistical data from our analysis of our institution’s SSP, which informs a heuristic set of questions that others can use to evaluate the effectiveness of their own SSP after implementation. This analysis demonstrates the value of evaluating SSP processes for writing programs, as well as outlining issues that may emerge and should be considered when analyzing SSP.&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3pc0n4dz</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Arnold, Lisa R</name>
      </author>
      <author>
        <name>Jiang, Lei</name>
        <uri>https://orcid.org/0000-0002-8084-2667</uri>
      </author>
      <author>
        <name>Hassel, Holly</name>
        <uri>https://orcid.org/0000-0002-7714-6388</uri>
      </author>
    </item>
    <item>
      <title>Supporting Student Linguistic Identity and Autonomy in Directed Self Placement Through Linguistic Domains Using Qualtrics Scoring</title>
      <link>https://escholarship.org/uc/item/2jw1298d</link>
      <description>&lt;p&gt;In this article, we review the current and dynamic state of multilingual writers, especially their experiences in Composition and with English self-placement methods. Then, we position our institution and department’s theoretical underpinnings for support of multilingual writers and their self-placement, and we describe how we utilized Cavazos and Karaman’s (2021) Translingual Disposition Questionnaire as a framework for our recent revision of our Directed Self-Placement survey and used Qualtrics scoring tools to provide students with feedback on a novel language domain. Our intent was to offer multilingual students transparency and choice in the English placement process so they could select the first-year Composition course that best matched their needs. We hope that other WPAs gain insight on how to integrate asset-based philosophies and linguistic domains using Qualtrics scoring to offer their multilingual students more autonomy in their first...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/2jw1298d</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Decker, Laura</name>
      </author>
      <author>
        <name>Taormina-Barrientos, Brianne</name>
      </author>
    </item>
    <item>
      <title>Self-Characterization in the Self-Placement Assessment Ecology: Complicating the Stories We Tell about DSP’s Effects and Effectiveness</title>
      <link>https://escholarship.org/uc/item/07c5j8tp</link>
      <description>Scholarship on student self-placement (SSP) emphasizes the importance of understanding methods like directed self-placement (DSP) as dynamic assessment ecologies (e.g., Inoue, 2015; Nastal et al., 2022; Wang, 2020), with implications not only for placement but also for how students conceptualize writing and themselves (e.g., Johnson, 2022). What can be learned about SSP’s ecological impacts by more meaningfully attending not just to patterns in students’ placement decisions but also to the qualitative content of their (self-)reflections and (self-)characterizations? Leveraging a dataset of more than 5,000 SSP pathways, we examine a corpus of short-answer survey responses, totaling more than half a million words, in which students wrote about their strengths as writers and what writing tasks they find most challenging. Students’ words help us understand how they see themselves as writers and how they conceive of college writing expectations. Through data analysis, this study found...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/07c5j8tp</guid>
      <pubDate>Wed, 27 Mar 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Tinkle, Theresa</name>
      </author>
      <author>
        <name>Godfrey, Jason</name>
      </author>
      <author>
        <name>Hammond, J. W.</name>
        <uri>https://orcid.org/0000-0001-7226-0408</uri>
      </author>
      <author>
        <name>Moos, Andrew</name>
      </author>
    </item>
    <item>
      <title>Editor's Introduction: Volume 1, Issue 1</title>
      <link>https://escholarship.org/uc/item/6jn072sf</link>
      <description>Editor's Introduction: Volume 1, Issue 1</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6jn072sf</guid>
      <pubDate>Fri, 5 Jan 2024 00:00:00 +0000</pubDate>
      <author>
        <name>Huot, Brian</name>
      </author>
      <author>
        <name>Blake Yancey, Kathleen</name>
      </author>
    </item>
    <item>
      <title>Time as a “Built-In Headwind”:  The Disparate Impact of Portfolio Cross-assessment on Black TYC students</title>
      <link>https://escholarship.org/uc/item/8c91s9dt</link>
      <description>This study of a departmental portfolio cross-assessment practice sheds light on factors that appear to influence assessment outcomes for Black students and helps to tease out some of the reasons why this assessment ecosystem has a disparate impact on these students. The findings, drawn from student outcomes data and student survey data, suggest that it isn’t only, or even primarily, Black students’ linguistic variety that leads to higher failure rates. The writing qualities most commonly flagged on Black students’ failing portfolios are likely related to the very different material conditions in which they write their papers. These conditions challenge the framing of “time” and “labor” as neutral, non-racially-inflected resources to which all students have equal access and which are not often conceptualized as part of the construct of writing ability. As TYCs across the country reform their placement mechanisms for greater access and equity and place more and more students...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8c91s9dt</guid>
      <pubDate>Mon, 4 Dec 2023 00:00:00 +0000</pubDate>
      <author>
        <name>Del Principe, Annie</name>
      </author>
    </item>
    <item>
      <title>Achieving High Goals: The Impact of Contract Grading on High School Students' Academic Performance, Avoidance Orientation, and Social Comparison</title>
      <link>https://escholarship.org/uc/item/70g147sg</link>
      <description>This article examines American high school students’ (N=439) self-worth protection behaviors, maladaptive coping mechanisms, and academic performance under a contract grading system, which has been understudied in contemporary secondary classrooms. The quantitative analysis revealed that under the contract grading system, 97% (n=421) earned a passing grade (i.e., A, B, or C) on the assessment and 90% (n=390) fulfilled the contract by reaching mastery (A) or proficiency (B). Compared to the previous year, students with prior experience were 19% more likely to earn an A and 16% more likely to earn a B under the grading contract despite increased workload demands. The qualitative analysis of 40 semi-structured interviews revealed that performance improved as a result of the contract’s clarity of purpose, which limited task avoidance and facilitated task-oriented effort toward a desirable goal. Students enrolled in regular courses experienced the most significant grade improvement...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/70g147sg</guid>
      <pubDate>Mon, 4 Dec 2023 00:00:00 +0000</pubDate>
      <author>
        <name>Watson, Emily</name>
        <uri>https://orcid.org/0000-0002-1941-7872</uri>
      </author>
    </item>
    <item>
      <title>What Do We Reward in Reflection? Assessing Reflective Writing with the Index for Metacognitive Knowledge</title>
      <link>https://escholarship.org/uc/item/3dc6w4hg</link>
      <description>Reflection is a staple of contemporary writing pedagogy and writing assessment. Although the power of reflective writing has long been understood in writing studies, the field has made little progress in articulating how to assess this reflective work. Developed at the crossroads of research in reflection and metacognition, the Index for Metacognitive Knowledge (IMK) is designed to help writing researchers, teachers, and students articulate what is being rewarded in the assessment of reflection and to articulate the role of metacognitive knowledge in critical reflective writing. The IMK was used to code final portfolio introductions from first-year writing courses in order to analyze the distribution of the three kinds of metacognitive knowledge (declarative, procedural, and conditional) and to explore the quality and complexity of students’ metacognitive knowledge. Inter-rater reliability testing showed the IMK to be highly reliable (Fleiss’ kappa = .834). The...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3dc6w4hg</guid>
      <pubDate>Mon, 4 Dec 2023 00:00:00 +0000</pubDate>
      <author>
        <name>Ratto Parks, Amy</name>
      </author>
    </item>
    <item>
      <title>Contract Grading and the Development of an Efficacious Writerly Habitus</title>
      <link>https://escholarship.org/uc/item/2c4979b8</link>
      <description>&lt;p&gt;Contract grading has been shown to reduce stress and anxiety, promote self-directed learning, and disrupt unjust educational norms (Cowan, 2020; Inoue, 2019; Medina &amp;amp; Walker, 2018). Yet, there is growing recognition of challenges associated with the approach, including the unintended effects of deemphasizing grades (Inman &amp;amp; Powell, 2018) and the possibility that labor-based contracts, in particular, may put some students at a disadvantage (Carillo, 2021). This article reports selected findings from an IRB-approved multi-semester, comparative study of labor-based and labor-informed contract grading in first-year writing courses at a large private research university. The study affirms several findings from existing research on contract grading. In particular, it shows the approach mitigates students’ stress and anxiety and increases their overall satisfaction with grading. Contract grading shifts the assessment ecology of the first-year writing classroom so that the...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/2c4979b8</guid>
      <pubDate>Mon, 4 Dec 2023 00:00:00 +0000</pubDate>
      <author>
        <name>DasBender, Gita</name>
        <uri>https://orcid.org/0000-0002-0482-4926</uri>
      </author>
      <author>
        <name>Mickelson, Nate</name>
      </author>
      <author>
        <name>Souffrant, Leah</name>
      </author>
    </item>
    <item>
      <title>Editor’s Introduction: Contract Grading, Portfolios, and Reflection</title>
      <link>https://escholarship.org/uc/item/00m15609</link>
      <description>The articles in this issue examine the continuing use and development of contract grading in college and high school writing courses (DasBender et al. and Watson); time and labor as important influences despite most often being seen as outside of the construct of writing (Del Principe); and the treatment of reflection within writing assessment theory and practice (Ratto Parks).</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/00m15609</guid>
      <pubDate>Mon, 4 Dec 2023 00:00:00 +0000</pubDate>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Access, Digital Writing, and Achievement: The Data in Two Diverse School Districts</title>
      <link>https://escholarship.org/uc/item/8rb1j91b</link>
      <description>&lt;p&gt;Students must compose texts using keyboards for college and career success. This study focuses on writing done in two school districts by students in Grades 4-11 on Google Docs to understand the relationships among digital device access, digital writing time, and standardized English language arts assessment scores. Our data cover three academic years: 2014-15, 2015-16, and 2016-17. We describe the amount of time spent writing in this mode, how it changed over grade levels, and the relationship between Google Docs writing time and access to digital devices. Fixed-effects regression showed that the amount of time spent writing digitally increased significantly over this period. Males and English learners spent fewer minutes writing in Google Docs compared to females and fluent English speakers. Students of color tended to spend more time writing in this mode than our White students. Device density (the number of school-provided digital devices per student) predicted the number...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8rb1j91b</guid>
      <pubDate>Mon, 5 Dec 2022 00:00:00 +0000</pubDate>
      <author>
        <name>Tate, Tamara</name>
        <uri>https://orcid.org/0000-0002-1753-8435</uri>
      </author>
      <author>
        <name>Warschauer, Mark</name>
        <uri>https://orcid.org/0000-0002-6817-4416</uri>
      </author>
    </item>
    <item>
      <title>Confronting the Ideologies of Assimilation and Neutrality in Writing Program Assessment through Antiracist Dynamic Criteria Mapping</title>
      <link>https://escholarship.org/uc/item/7rq4n47t</link>
      <description>&lt;p&gt;This article contributes to conversations about antiracist writing program assessment, with particular attention to the evaluation of first-year writing samples. In an effort to confront the racist ideologies of assimilation and neutrality, I employed a modified version of dynamic criteria mapping (DCM) that involved surveying students, conducting instructor focus groups, and analyzing writing prompts. The triangulated results informed the development of an assessment tool that was used to examine 89 writing samples. The goal of this assessment was not to produce a set of standards that mirror community values but rather to describe what was happening in the writing program and then use that information to facilitate critical reflection on the ways in which classroom practices align with or depart from the programmatic goal of delivering socially just writing instruction. By sharing my own experiences, I hope to help other writing program administrators (WPAs) develop processes...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7rq4n47t</guid>
      <pubDate>Mon, 5 Dec 2022 00:00:00 +0000</pubDate>
      <author>
        <name>Stewart, Mary</name>
      </author>
    </item>
    <item>
      <title>Two Sisters and a Heuristic for Listening to Multilingual, International Students’ Directed Self-Placement Stories</title>
      <link>https://escholarship.org/uc/item/5s9458xk</link>
      <description>Directed self-placement (DSP) is considered useful in linguistically and culturally diverse writing programs, but questions of self-efficacy and institutional knowledge sustain hesitancy in using DSP with English as an additional language (EAL) writers. This interview study grounded in sociocultural literacy theory explores multilingual, international students’ engagement with writing placement and courses, showcasing two quadrilingual, bicultural, international student sisters, Hemani and Kavya. Despite nearly identical linguistic, cultural, and educational backgrounds upon concurrently entering a writing program, they experienced DSP differently and enrolled in different sections: Hemani in mainstream and Kavya in EAL courses. Hemani shares DSP’s positive impacts on her writing program trajectory whereas Kavya’s story uncovers lost opportunities and feelings of otherness. Findings affirm that multilingual, international student placement is complex and that DSP is highly contextual....</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5s9458xk</guid>
      <pubDate>Mon, 5 Dec 2022 00:00:00 +0000</pubDate>
      <author>
        <name>Horton, Analeigh E.</name>
      </author>
    </item>
    <item>
      <title>Editor’s Introduction: Assessing Writing Programs and Their Impacts on Students</title>
      <link>https://escholarship.org/uc/item/01d2s22m</link>
      <description>This editor's column provides an overview of Tamara Tate and Mark Warschauer's "Access, Digital Writing, and Achievement," Mary Stewart’s “Confronting the Ideologies of Assimilation and Neutrality in Writing Program Assessment through Antiracist Dynamic Criteria Mapping," and Analeigh Horton's “Two Sisters and a Heuristic for Listening to Multilingual, International Students’ Directed Self-Placement Stories.”</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/01d2s22m</guid>
      <pubDate>Mon, 5 Dec 2022 00:00:00 +0000</pubDate>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>The Injustice of Opportunity: Clarence DeWitt Thorpe, Articulation, and the Inter-Institutional Ecology of Writing Assessment</title>
      <link>https://escholarship.org/uc/item/3jv5r0kk</link>
      <description>&lt;p&gt;Because writing assessment’s present is bound to its past (Elliot, 2005; Poe et al., 2018), scholarship has pointed to the need for more critical inquiries of local “assessment ecologies” (Inoue, 2015) to better understand the effects of past injustices (Hammond, 2018; Harms, 2018). In understanding how opportunities are allocated unjustly within processes like articulation, compositionists must be willing to understand how postsecondary ecologies have systematically attempted to deny opportunities for certain student groups. This article does so by examining the 1935 Michigan Committee on the Articulation of High School and College English, a committee in charge of redefining readiness for first-year writing across the state of Michigan, led by Professor Clarence DeWitt Thorpe of the University of Michigan. The work of Thorpe’s committee has been an underexamined component of historical assessment ecologies in the Midwest and beyond. Under Thorpe’s guidance, this committee...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3jv5r0kk</guid>
      <pubDate>Fri, 19 Nov 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Moos, Andrew</name>
      </author>
    </item>
    <item>
      <title>The Effects of Student-Fashioning and Teacher-Pleasing in the Assessment of First-Year Writing Reflective Essays</title>
      <link>https://escholarship.org/uc/item/3c26v2hh</link>
      <description>The use of reflective essays has become a key artifact of outcome-based writing assessment in the field of writing studies (White, 2005). However, scoring reflective essays may be influenced by textual features irrelevant to most outcomes and assessment rubrics.&amp;nbsp; Two problematic features are teacher-pleasing, which Yancey (1996) called the “schmooze factor,” and student-fashioning, which Miura (2018) related to identity formation. In this article, we present two mixed methods studies to examine the effects of these particular textual features on the direct assessment of first-year writing (FYW) reflective essays. In the first pilot study, we identified four textual features relevant to teacher-pleasing and student-fashioning. In the second quasi-experimental study, we created a sample of FYW essays with and without these features. Two assessment teams then scored the essays in order to determine whether these features had statistically significant effects on assessment scores....</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3c26v2hh</guid>
      <pubDate>Fri, 19 Nov 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Pruchnic, Jeff</name>
      </author>
      <author>
        <name>Barton, Ellen</name>
      </author>
      <author>
        <name>Trimble, Thomas</name>
      </author>
      <author>
        <name>Primeau, Sarah</name>
      </author>
      <author>
        <name>Weiss, Hillary</name>
      </author>
      <author>
        <name>Varty, Nicole Guinot</name>
      </author>
      <author>
        <name>Moore, Tanina Foster</name>
      </author>
    </item>
    <item>
      <title>Editors’ Introduction, Volume 14 Issue 1</title>
      <link>https://escholarship.org/uc/item/16p348d9</link>
      <description>Editors’ Introduction, Volume 14 Issue 1</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/16p348d9</guid>
      <pubDate>Fri, 19 Nov 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Slouching Toward Sustainability: Mixed Methods in the Direct Assessment of Student Writing</title>
      <link>https://escholarship.org/uc/item/9z65k7wj</link>
      <description>The development of present-day assessment culture in higher education has led to a disciplinary turn away from statistical definitions of reliability and validity in favor of methods argued to have more potential for positive curricular change. Such interest in redefining reliability and validity also may be inspired by the unsustainable demands that large-scale quantitative assessment would place on composition programs. In response to this dilemma, we tested a mixed-methods approach to writing assessment that combined large-scale quantitative assessment using thin-slice methods with targeted, smaller-scale qualitative assessment of selected student writing using rich features analysis. We suggest that such an approach will allow composition programs to (a) directly assess a representative sample of student writing with excellent reliability, (b) significantly reduce total assessment time, and (c) preserve the autonomy and contextualized quality of assessment sought in current...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9z65k7wj</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Pruchnic, Jeff</name>
      </author>
      <author>
        <name>Susak, Chris</name>
      </author>
      <author>
        <name>Grogan, Jared</name>
      </author>
      <author>
        <name>Primeau, Sarah</name>
      </author>
      <author>
        <name>Torok, Joe</name>
      </author>
      <author>
        <name>Trimble, Thomas</name>
      </author>
      <author>
        <name>Foster, Tanina</name>
      </author>
      <author>
        <name>Barton, Ellen</name>
      </author>
    </item>
    <item>
      <title>Editorial Board 2019</title>
      <link>https://escholarship.org/uc/item/9x616148</link>
      <description>Editorial Board 2019</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9x616148</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
    </item>
    <item>
      <title>Rebuilding Habits: Assessing the Impact of a Hybrid Learning Contract in Online First-Year Composition Courses</title>
      <link>https://escholarship.org/uc/item/9sp0g53j</link>
      <description>This article examines a pilot study of a learning contract in an online first-year writing program. The program uses a master-class model with a shared curriculum and serves more than 3,500 students a semester. In this pilot, we implemented the contract within half of our courses. Our goal was to understand the impact of a learning contract on student retention in our first-year writing courses. We also hoped to determine if the learning contract helped shift student and instructor focus from grades to skill transfer. In this article, we first discuss the process of developing a learning contract, including the challenges of collaborating with faculty to address their needs and concerns; building instructor and instructional designer buy-in; and working through the limitations of the learning management system (LMS) to implement the contract in online courses. Second, we assess the results of the initial pilot to determine whether the contract functioned as we hoped by tracking...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9sp0g53j</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Stuckey, Michelle A.</name>
      </author>
      <author>
        <name>Erdem, Ebru</name>
      </author>
      <author>
        <name>Waggoner, Zachary</name>
      </author>
    </item>
    <item>
      <title>Recommendations or choices? A review of Decisions, Agency, and Advising</title>
      <link>https://escholarship.org/uc/item/9jj077d4</link>
      <description>Recommendations or choices? A review of Decisions, Agency, and Advising</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9jj077d4</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>di Gennaro, Kristen</name>
      </author>
    </item>
    <item>
      <title>Book Review: Labor-Based Grading Contracts: Building Equity and Inclusion in the Writing Classroom by Asao B. Inoue</title>
      <link>https://escholarship.org/uc/item/9hw9p7hc</link>
      <description>Grading writing, or judging language, can be difficult. Asao B. Inoue's Labor-Based Grading Contracts problematizes traditional assessment practices that assess writing "quality". Inoue explains how this type of practice operates to reproduce White supremacy because language standards are tied to historical White racial formations. He suggests an alternative assessment method (e.g., grading contracts) that is based on labor and compassion. If you find yourself dissatisfied with classroom grading practices or wanting to understand how writing assessment can be constructed to do social justice work, then Inoue's Labor-Based Grading Contracts is a great read. &lt;br&gt;
  &lt;br&gt;&lt;strong&gt;Keywords&lt;/strong&gt;: grading contracts, race, writing assessment, labor</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9hw9p7hc</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Wood, Shane A.</name>
      </author>
    </item>
    <item>
      <title>Editorial Board 2017</title>
      <link>https://escholarship.org/uc/item/9fw8r6n1</link>
      <description>Editorial Board 2017</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9fw8r6n1</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
    </item>
    <item>
      <title>Teaching and Learning in an 'Audit Culture':  A Critical Genre Analysis of Common Core Implementation</title>
      <link>https://escholarship.org/uc/item/9653s1fk</link>
      <description>This article examines classroom materials and sample assessments to understand the effects of Common Core implementation on the teaching and learning of writing. Drawing on theories of genre systems and intertextuality, the article focuses on the ways in which a Common Core-aligned senior English Language Arts textbook and sample writing assessment recontextualize the standards in writing prompts, criteria, and written instructions related to argumentative writing. This critical genre analysis demonstrates the ways in which a theory of writing is transformed in the implementation of the standards, and makes visible the ways in which the implementation process privileges the goals and needs of an accountability mandate rather than the teachers and students enacting the standards.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9653s1fk</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Jacobson, Brad</name>
      </author>
    </item>
    <item>
      <title>Big Data, Learning Analytics, and Social Assessment</title>
      <link>https://escholarship.org/uc/item/94b4m5f9</link>
      <description>This article explores the value of using social media and a community rubric to assess writing ability across genres, course sections, and classes. From Fall 2011 through Spring 2013, approximately 70 instructors each semester in the first-year composition program at the University of South Florida used one rubric to evaluate over 100,000 student essays. Between Fall 2012 and Spring 2013, students used the same rubric to conduct more than 20,000 peer reviews. The rubric was developed via a datagogical, crowdsourcing process (Moxley, 2008; Vieregge, Stedman, Mitchell, &amp;amp; Moxley, 2012). It was administered via &lt;i&gt;My Reviewers&lt;/i&gt;, a web-based software tool designed to facilitate document review, peer review, and writing program assessment. This report explores what we have learned by comparing rubric scores on five measures (Focus, Organization, Evidence, Style, and Format) by project, section, semester, and course and by comparing independent evaluators'...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/94b4m5f9</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Moxley, Joe</name>
      </author>
    </item>
    <item>
      <title>Big Rubrics and Weird Genres:  The Futility of Using Generic Assessment Tools Across Diverse Instructional Contexts</title>
      <link>https://escholarship.org/uc/item/93b9g3t6</link>
      <description>Interest in "all-purpose" assessment of students' writing and/or speaking appeals to many teachers and administrators because it seems simple and efficient, offers a single set of standards that can inform pedagogy, and serves as a benchmark for institutional improvement.  This essay argues, however, that such generalized standards are unproductive and theoretically misguided.  Drawing on situated approaches to the assessment of writing and speaking, as well as many years of collective experience working with faculty, administrators, and students on communication instruction in highly specific curricular contexts, we demonstrate the advantages of shaping assessment around local conditions, including discipline-based genres and contexts, specific and varied communicative goals, and the embeddedness of communication instruction in particular "ways of knowing" within disciplines and subdisciplines.  By sharing analyses of unique genres of writing and speaking at our institutions, and...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/93b9g3t6</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Anson, Chris M.</name>
      </author>
      <author>
        <name>Dannels, Deanna P.</name>
      </author>
      <author>
        <name>Flash, Pamela</name>
      </author>
      <author>
        <name>Gaffney, Amy L. Housley</name>
      </author>
    </item>
    <item>
      <title>Moving Beyond the Common Core to Develop Rhetorically Based and Contextually Sensitive Assessment Practices</title>
      <link>https://escholarship.org/uc/item/9325025k</link>
      <description>Much political and disciplinary debate has occurred regarding the Common Core State Standards and the development and implementation of concomitant standardized tests generated by the two national assessment consortia: the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC). In entering the debate about K-12 standardized assessment, the authors critique the top-down model of assessment that has dominated K-12 education and is currently being promoted by the national assessment consortia, and how the assessments associated with the national assessment consortia promote an interpretation of college readiness from a skill-based framework. Moreover, we examine PARCC by using content analysis to illustrate how it is an inflexible assessment measure that fails to capture the complexity of learning, specifically in literacy based on more than thirty years of disciplinary research. In contrast, using the construct...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9325025k</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Clark-Oates, Angela</name>
      </author>
      <author>
        <name>Rankins-Robertson, Sherry</name>
      </author>
      <author>
        <name>Ivy, Erica</name>
      </author>
      <author>
        <name>Behm, Nicholas</name>
      </author>
      <author>
        <name>Roen, Duane</name>
      </author>
    </item>
    <item>
      <title>Write Outside the Boxes: The Single Point Rubric in the Secondary ELA Classroom</title>
      <link>https://escholarship.org/uc/item/9175z7zs</link>
      <description>The conversation around writing assessment in educational settings has been developed by research, practices, and legislation over the last 100 years. This article focuses on secondary writing assessment, where instructors are typically limited by local and statewide requirements. The debate on the use of rubrics (such as the traditional analytic and holistic) and the use of narrative/feedback assessment has shaped secondary writing instruction and assessment, but it has largely been driven by stakeholders outside of the classroom. This article presents the Single Point Rubric (SPR) as a possible tool to work against the problematic applications of analytic and holistic rubrics without the commitment of time, focus, and energy that narrative feedback assessment demands. Rooted in decades-old concepts of grid-grading, the SPR combines the formulaic and time-saving components of rubrics with the differentiated and individualized components of narrative response and grading via detailed...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/9175z7zs</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Wilson, Jenna</name>
      </author>
    </item>
    <item>
      <title>Editors' Introduction (Fall 2018)</title>
      <link>https://escholarship.org/uc/item/8xd940nc</link>
      <description>Editors' Introduction (Fall 2018)</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8xd940nc</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Introduction to a Special Issue on a Theory of Ethics for Writing Assessment</title>
      <link>https://escholarship.org/uc/item/8nq5w3t0</link>
      <description>Editors' introduction to the Special Issue on a Theory of Ethics for Writing Assessment.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8nq5w3t0</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Editors' Introduction, Volume 13, Issue 1</title>
      <link>https://escholarship.org/uc/item/8bh1h777</link>
      <description>Editors' Introduction, Volume 13, Issue 1</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8bh1h777</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Introduction: Writing Assessment, Placement, and the Two-Year College</title>
      <link>https://escholarship.org/uc/item/8393560s</link>
      <description>Two-year colleges are experiencing rapid change, much of which is driven by reform-minded higher education researchers, philanthropists, and policymakers seeking to improve degree completion rates in the nation's open-admissions community colleges. As part of this broader push for reform, placement has come under increased scrutiny, and many two-year colleges are reevaluating and reimagining longstanding placement practices. To set the context for the 2019 special issue of Journal of Writing Assessment on Writing Placement at Two-Year Colleges, this introductory essay reviews five scholarly conversations essential for understanding the issues and stakes: 1) the distinctive histories, missions, demographics, and constraints and opportunities of open admissions two-year colleges; 2) the nature, problems, and possibilities of the reform pressures currently bearing on two-year colleges and placement; 3) the history of writing placement assessment and the theoretical debates surrounding...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/8393560s</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Toth, Christie</name>
      </author>
      <author>
        <name>Nastal, Jessica</name>
      </author>
      <author>
        <name>Hassel, Holly</name>
      </author>
      <author>
        <name>Giordano, Joanne Baird</name>
      </author>
    </item>
    <item>
      <title>Collaborative Placement of Multilingual Writers: Combining Formal Assessment and Self-Evaluation</title>
      <link>https://escholarship.org/uc/item/7z6683m6</link>
      <description>Placement of multilingual writers within writing programs is an important and challenging issue. If students perceive that the placement process is rigid and unfair, this perception may affect their attitudes and motivation levels while taking courses in the writing program. The purpose of this study was to see whether a specific subgroup of students (n = 65) in a large university writing program for multilingual students could be successful if allowed to collaborate, with guidance, in their own placement. Various data were collected about these students in their first quarter after matriculating in the writing program: instructors' initial ratings, students' outcomes in their initial course (final portfolio scores and course grades), and students' satisfaction levels with their placement after they had completed the course (via a brief survey). These data were compared to another group of students (n = 65) who received similar placement scores but were not given the choice to...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7z6683m6</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Ferris, Dana</name>
      </author>
      <author>
        <name>Lombardi, Amy</name>
      </author>
    </item>
    <item>
      <title>Introduction to Volume 8</title>
      <link>https://escholarship.org/uc/item/7wp2j31t</link>
      <description>Editors' Introduction to Volume 8</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7wp2j31t</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Editors' Introduction: Special Issue on Two-Year College Writing Placement</title>
      <link>https://escholarship.org/uc/item/7vg91466</link>
      <description>Editors' Introduction: Special Issue on Two-Year College Writing Placement</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7vg91466</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>Innovation and the California State University and Colleges English Equivalency Examination, 1973-1981: An Organizational Perspective</title>
      <link>https://escholarship.org/uc/item/7rt5v9p2</link>
      <description>This article examines the origin and development of the English Equivalency Exam (EEE) used by California State University and Colleges between 1973 and 1981. Although an episode in the history of writing assessment that has been well documented, the EEE bears revisiting through the lens of an organizational perspective, with special attention to the process of innovation. Attention to management processes and the contexts in which they occur can inform the perspectives of professionals in language assessment and strengthen their commitment to action undertaken on behalf of students. &lt;p&gt;&lt;b&gt;Keywords&lt;/b&gt;: assessment management; California State University and Colleges English Equivalency Examination; history of writing assessment; holistic scoring; innovation&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7rt5v9p2</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Haswell, Richard</name>
      </author>
      <author>
        <name>Elliot, Norbert</name>
      </author>
    </item>
    <item>
      <title>Critique of Mark D. Shermis &amp;amp; Ben Hamner, 'Contrasting State-of-the-Art Automated Scoring of Essays: Analysis'</title>
      <link>https://escholarship.org/uc/item/7qh108bw</link>
      <description>Although the unpublished study by Shermis &amp;amp; Hamner (2012) received substantial publicity about its claim that automated essay scoring (AES) of student essays was as accurate as scoring by human readers, a close examination of the paper's methodology demonstrates that the data and analytic procedures employed in the study do not support such a claim. The most notable shortcoming in the study is the absence of any articulated construct for writing, the variable that is being measured.  Indeed, half of the writing samples used were not essays but short one-paragraph responses involving literary analysis or reading comprehension that were not evaluated on any construct involving writing.  In addition, the study's methodology employed one method for calculating the reliability of human readers and a different method for calculating the reliability of machines, this difference artificially privileging the machines in half the writing samples.  Moreover, many of the study's conclusions...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7qh108bw</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Perelman, Les C.</name>
      </author>
    </item>
    <item>
      <title>Book Review: Henry Chauncey: An American Life by Norbert Elliot</title>
      <link>https://escholarship.org/uc/item/7pg8n4ww</link>
      <description>If you want to read a history of writing assessment as it developed during the 20th century within the narrow and specialized confines of the Educational Testing Service (ETS), you can't do better than Norbert Elliot's &lt;i&gt;On a Scale: A Social History of Writing Assessment in America &lt;/i&gt; (2005). If your curiosity about ETS is not satisfied by that enormously careful and detailed history, and if you want to gain a close-up, intimate understanding of the person one author called ETS's "first president and abiding institutional deity" (Owen, 1985, p. 1), then you can't do better than Elliot's new biography, &lt;i&gt;Henry Chauncey: An American Life&lt;/i&gt;. &lt;p&gt;
Later in this review, I will revisit those last two "ifs."&lt;/p&gt;</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7pg8n4ww</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Broad, Bob</name>
      </author>
    </item>
    <item>
      <title>Afterword</title>
      <link>https://escholarship.org/uc/item/7mn0z56w</link>
      <description>Afterword</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7mn0z56w</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>O'Neill, Peggy</name>
      </author>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
    </item>
    <item>
      <title>Validity inquiry of race and shared evaluation practices in a large-scale, university-wide writing portfolio assessment</title>
      <link>https://escholarship.org/uc/item/7m18h956</link>
      <description>This article examines the intersections of students' race with the evaluation of their writing abilities in a locally-developed, context-rich, university-wide, junior-level writing portfolio assessment that relies on faculty articulation of standards and shared evaluation practices.  This study employs sequential regression analysis to identify how faculty raters operationalize their definition of good writing within this university-wide writing portfolio assessment, and, in particular, whether students' race accounts for any of the variability in faculty's assessment of student writing.  The findings suggest that there is a difference in student performance by race, but that student race does not contribute to faculty's assessment of students' writing in this setting.  However, the findings also suggest that faculty employ a limited set of the criteria published by the writing assessment program, and faculty use non-programmatic criteria--including perceived demographic variables--in...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7m18h956</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
    </item>
    <item>
      <title>Language Background and the College Writing Course</title>
      <link>https://escholarship.org/uc/item/7hr047zk</link>
      <description>In an era of growing linguistic diversity, assessment of all college writing courses needs to include a focus on multilingual equity: How well does the course serve the needs of students with varying language backgrounds and educational histories? In this study, an Education and Language Background (ELB) survey was developed on a scale measuring divergence from default assumptions of college students as U.S.-educated monolingual English speakers. This survey data was used in assessment of a junior-level college writing course by correlating student ELB data with writing sample scores. On the pre-test, multilingual students and immigrants educated in non-U.S. systems scored significantly lower, but by the post-test this effect had disappeared, suggesting that junior-level writing instruction may be of especial utility to such students. Survey data also revealed important language and education differences between students who began their career at College Y in first-year composition...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7hr047zk</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Hall, Jonathan</name>
      </author>
    </item>
    <item>
      <title>Dynamic Patterns: Emotional Episodes within Teachers' Response Practices</title>
      <link>https://escholarship.org/uc/item/7gz1c60w</link>
      <description>Responding to student writing is one activity where teachers' emotions become relevant, but there are limited scholarly conversations directly discussing emotion as a component of teachers' responses to student writing. This article brings together scholarship on emotion, survey results, and narrative description of two specific teachers to suggest the relationship between emotion and response: a dynamic, recursive episode pattern of values, triggers, emotions, and actions. The results of 146 surveys of writing teachers reporting on emotions in their response practices provide a contextual grounding for a closer examination of the interrelated emotional episode of one teacher, Brittney. An awareness of the emotional episode of response promotes reflection and acts as a catalyst for teachers to think about their teacherly identity.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7gz1c60w</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Caswell, Nicole I.</name>
      </author>
    </item>
    <item>
      <title>Helping Faculty Self-Regulate Emotional Responses in Writing Assessment:  Use of an Overall Response Rubric Category</title>
      <link>https://escholarship.org/uc/item/7ds38413</link>
      <description>Faculty evaluation of student learning artifacts is a critical activity as accrediting bodies call for campuses to promote "cultures of assessment." Also important are opportunities for faculty engagement and development that assessment projects provide. However, such projects come with significant challenges to facilitators and faculty scorers themselves. Faculty bring their own expertise and beliefs about student learning and writing to the assessment context, all of which can have emotional valence. Assessment sessions may emphasize faculty scoring to components of a rubric, perhaps eliminating a holistic score from the conversations because holistic scores are not viewed as actionable data points and thus are often not parts of the rubric (McConnell &amp;amp; Rhodes, 2017).  As actionable data and reliability among scorers are emphasized in assessment, and holistic scores fall away, are we losing an important scoring tool by removing a place for assessment scorers to log their...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7ds38413</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Neely, Michelle E.</name>
      </author>
    </item>
    <item>
      <title>ePortfolios: Foundational Measurement Issues</title>
      <link>https://escholarship.org/uc/item/7bm4t40t</link>
      <description>Using performance information obtained for program assessment purposes, this quantitative study reports the relationship of ePortfolio trait and holistic scores to specific academic achievement measures for first-year undergraduate students. Attention is given to three evidential categories: consensus and consistency evidence related to reliability/precision; convergent evidence related to validity; and score difference and predictive evidence related to fairness. Interpretative challenges of ePortfolio-based assessments are identified in terms of consistency, convergent, and predictive evidence. Benefits of these assessments include the absence of statistically significant differences in ePortfolio scores for race/ethnicity sub-groups. Discussion emphasizes the need for principled design and contextual information as prerequisite to score interpretation and use. Instrumental value of the study suggests that next-generation ePortfolio-based research must be alert to sample size,...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7bm4t40t</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Elliot, Norbert</name>
      </author>
      <author>
        <name>Rudniy, Alex</name>
      </author>
      <author>
        <name>Deess, Perry</name>
      </author>
      <author>
        <name>Klobucar, Andrew</name>
      </author>
      <author>
        <name>Collins, Regina</name>
      </author>
      <author>
        <name>Sava, Sharla</name>
      </author>
    </item>
    <item>
      <title>An Annotated Bibliography of Writing Assessment: Machine Scoring and Evaluation of Essay-length Writing</title>
      <link>https://escholarship.org/uc/item/73s8c743</link>
      <description>This installment of the &lt;i&gt;JWA&lt;/i&gt; annotated bibliography focuses on the phenomenon of machine scoring of whole essays composed by students and others. "Machine scoring" is defined as the rating of extended or essay writing by means of automated, computerized technology. Excluded is scoring of paragraph-sized free responses of the sort that occur in academic course examinations. Also excluded is software that checks only grammar, style, and spelling. Included, however, is software that provides other kinds of evaluative or diagnostic feedback along with a holistic score. While some entries in this bibliography describe, validate, and critique the ways computers "read" texts and generate scores and feedback, other sources critically examine how these results are used. The topic is timely, since the use of machine scoring of essays is rapidly growing in standardized testing, sorting of job and college applicants, admission to college, placement into and exit out of writing courses,...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/73s8c743</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Haswell, Richard</name>
      </author>
      <author>
        <name>Donnelly, Whitney</name>
      </author>
      <author>
        <name>Hester, Vicki</name>
      </author>
      <author>
        <name>O'Neill, Peggy</name>
      </author>
      <author>
        <name>Schendel, Ellen</name>
      </author>
    </item>
    <item>
      <title>Forum: Issues and Reflections on Ethics and Writing Assessment</title>
      <link>https://escholarship.org/uc/item/7076v8f0</link>
      <description>We hope this special issue adds to the body of knowledge created by the writing studies community with respect to the opportunities that can be created when assessment is seen in terms of the creation of opportunity structure. This hope is accompanied by a reminder of our strength as we annually encounter approximately 48.9 million students in public elementary and secondary schools and 20.6 million students in postsecondary institutions (Snyder &amp;amp; Dillow, 2015). Our influence is remarkable as we touch the lives of many, one student at a time.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/7076v8f0</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Elliot, Norbert</name>
      </author>
      <author>
        <name>Slomp, David</name>
      </author>
      <author>
        <name>Poe, Mya</name>
      </author>
      <author>
        <name>Cogan, John Aloysius, Jr.</name>
      </author>
      <author>
        <name>Broad, Bob</name>
      </author>
      <author>
        <name>Cushman, Ellen</name>
      </author>
    </item>
    <item>
      <title>Reframing Reliability for Writing Assessment</title>
      <link>https://escholarship.org/uc/item/6w87j2wp</link>
      <description>This essay provides an overview of the research and scholarship on reliability in college writing assessment from the author's perspective as a composition and rhetoric scholar.  It argues for reframing reliability by drawing on traditions from fields of college composition and educational measurement with the goal of developing a more productive discussion about reliability as we work toward a unified field of writing assessment.  In making this argument, the author uses the concept of framing to argue that writing assessment scholars should develop a shared understanding of reliability.  The shared understanding begins with the values—such as accuracy, consistency, fairness, responsibility, and meaningfulness—that we have in common with others, including psychometricians and measurement specialists, instead of focusing on the methods.  Traditionally, reliability has been framed by statistical methods and calculations associated with positivist science although psychometric theory...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6w87j2wp</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>O'Neill, Peggy</name>
      </author>
    </item>
    <item>
      <title>Perceptions of Fairness in Summer Bridge Classrooms with Contract Grades</title>
      <link>https://escholarship.org/uc/item/6qc0j26v</link>
      <description>This narrative explains how a summer bridge student turned writing fellow effectively communicated peers' comments about fairness in grading to her former professor as she prepared to teach her summer bridge writing course again. Co-authored by the instructor and undergraduate student, this reflection explores both undergraduate understandings of fairness in the context of contract grading as well as the teacher-student relationship. Both teacher and student advocate for the use of contract grading in summer bridge writing classrooms. However, they argue that systems of grading need to be clarified and contextualized for pre-college students who sometimes express confusion about college standards and/or may overextend lessons learned during their first college course to their fall semester when not all professors will use contract grading.
&lt;br&gt;
  &lt;br&gt;&lt;strong&gt;Keywords:&lt;/strong&gt; fairness, summer bridge, writing fellows, revision, contract grading</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6qc0j26v</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Reardon, Kristina</name>
      </author>
      <author>
        <name>Guardado-Menjivar, Vanessa</name>
      </author>
    </item>
    <item>
      <title>Directed Self-Placement at Two-Year Colleges: A Kairotic Moment</title>
      <link>https://escholarship.org/uc/item/6g81k736</link>
      <description>As national reform efforts are reshaping community college policies with the goal of improving degree completion rates, many two-year colleges are rethinking longstanding course placement processes. Directed Self-Placement (DSP) has emerged as one increasingly visible and viable option for placing students into introductory English and mathematics courses. However, higher education researchers advocating placement reform demonstrate little familiarity with the extensive scholarly literature on DSP in writing studies. To date, that literature has focused almost exclusively on 4-year institutions, with few studies of DSP at two-year colleges. This article begins to address these gaps by (a) reviewing writing studies scholarship on DSP to identify key theoretical insights that are missing in the community college placement reform literature and (b) presenting findings from semi-structured interviews with implementation leaders at twelve 2-year colleges that have attempted DSP. These...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/6g81k736</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Toth, Christie</name>
      </author>
    </item>
    <item>
      <title>Editorial Board 2020</title>
      <link>https://escholarship.org/uc/item/63c4w48s</link>
      <description>Editorial Board 2020</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/63c4w48s</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
    </item>
    <item>
      <title>Afterword:  Volume 6, 2013</title>
      <link>https://escholarship.org/uc/item/5wm773wr</link>
      <description>Volume 6 of the &lt;i&gt;Journal of Writing Assessment&lt;/i&gt;, our second as co-editors and the third as an online open-access journal, is complete. Looking back at the five articles we published, important themes emerged related to the impact of digital technologies on writing assessment and the connections between rubrics and raters.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5wm773wr</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>O'Neill, Peggy</name>
      </author>
    </item>
    <item>
      <title>Globalizing Plagiarism &amp;amp; Writing Assessment: A Case Study of Turnitin</title>
      <link>https://escholarship.org/uc/item/5vq519dr</link>
      <description>This article examines the plagiarism detection service Turnitin.com's recent expansion into international writing assessment technologies. Examining Turnitin's rhetorics of plagiarism alongside scholarship on plagiarism detection illuminates Turnitin's efforts to globalize definitions of and approaches to plagiarism. If successful in advancing their positions on plagiarism, Turnitin's products could be proffered as a global model for writing assessment. The proceedings of a Czech Republic conference partially sponsored by Turnitin demonstrate troubling constructions of the "student plagiarist". They demonstrate, too, a binary model of west and nonwest that stigmatizes nonwestern institutions and students. These findings support an ongoing attention to the global cultural work of corporate plagiarism detection and assessment.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5vq519dr</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Canzonetta, Jordan</name>
      </author>
      <author>
        <name>Kannan, Vani</name>
      </author>
    </item>
    <item>
      <title>Understanding Proficiency: Analyzing the Characteristics of Secondary Students' On-Demand Analytical Essay Writing</title>
      <link>https://escholarship.org/uc/item/5tw695cz</link>
      <description>This study investigated the different characteristics of not-pass (n = 174), adequate-pass (n = 173), and strong-pass (n = 114) text-based, analytical essays written by middle and high school students. Essays were drawn from the 2015-2016 Pathway writing and reading intervention pretests and posttests. Results revealed the use of relevant summary was an important difference between not-pass and adequate-pass essays where significantly more adequate-pass essays used summary in a purposeful rather than general way. In contrast, major characteristics that set apart strong-pass essays from adequate-pass essays involved providing analysis and including a clear conclusion or end. Factors that affected these characteristics such as whether the writer made claims and comments about the text are discussed, and some instructional strategies are suggested. 
&lt;br&gt;
  &lt;br&gt;&lt;strong&gt;Keywords&lt;/strong&gt;: writing proficiency, writing instruction, adolescent literacy, text-based analytical writing,...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5tw695cz</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Chen, Vicky</name>
      </author>
      <author>
        <name>Olson, Carol B.</name>
      </author>
      <author>
        <name>Chung, Huy Quoc</name>
      </author>
    </item>
    <item>
      <title>College Students' Use of a Writing Rubric: Effect on Quality of Writing, Self-Efficacy, and Writing Practices</title>
      <link>https://escholarship.org/uc/item/5f79g6kz</link>
      <description>Fifty-six college students enrolled in two sections of a psychology class were randomly assigned to use one of three tools for assessing their own writing: a long rubric, a short rubric, or an open-ended assessment tool.  Students used their assigned self-assessment tool to assess drafts of a course-required, five-page paper.  There was no effect of self-assessment condition on the quality of students' final drafts, or on students' self-efficacy for writing.  However, there was a significant effect of condition on students' writing beliefs and practices, with long rubric users reporting more productive use of self-assessment than students using the open-ended tool.  In addition, across conditions, most students reported that being required to assess their writing shaped their writing practices in desirable ways. 

&lt;b&gt;Keywords&lt;/b&gt;: rubrics, self-efficacy, self-assessment, working memory, writing quality, writing beliefs, college writers</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5f79g6kz</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Covill, Amy E.</name>
      </author>
    </item>
    <item>
      <title>The Empirical Development of an Instrument to Measure Writerly Self-Efficacy in Writing Centers</title>
      <link>https://escholarship.org/uc/item/5dp4m86t</link>
      <description>Post-secondary writing centers have struggled to produce substantial, credible, and sustainable evidence of their impact in the educational environment.  The objective of this study was to develop a college-level writing self-efficacy scale that can be used across repeated sessions in a writing center, as self-efficacy has been identified as an important construct underlying successful writing and cognitive development.  A 20-item instrument (PSWSES) was developed to evaluate writerly self-efficacy.  505 university students participated in the study.  Results indicate that the PSWSES has high internal consistency and reliability across items and construct validity, which was supported through a correlation between tutor perceptions of client writerly self-efficacy and client self-ratings.  Factor analysis revealed three factors: local and global writing process knowledge, physical reaction, and time/effort.  Additionally, across repeated sessions, the clients' PSWSES scores appropriately...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5dp4m86t</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Schmidt, Katherine M.</name>
      </author>
      <author>
        <name>Alexander, Joel E.</name>
      </author>
    </item>
    <item>
      <title>The Path to Competency-Based Certification: A Look at the LEAP Challenge and the VALUE Rubric for Written Communication</title>
      <link>https://escholarship.org/uc/item/5575w31k</link>
      <description>Although originally designed by writing professionals, AAC&amp;amp;U's VALUE Written Communication rubric is one small part of a larger national vision for higher education. This article traces that vision through multiple AAC&amp;amp;U publications from 2002-2017 to demonstrate the way advocacy-based philanthropy and competency-based education has shifted the VALUE initiative away from institutionally-based assessment toward national accountability. With the General Education Maps and Markers (GEMs) pathway initiative of 2015 and the creation of the VALUE Institute national scoring database in 2018, the VALUE rubrics may be used to compare writing instruction at universities, to facilitate state-wide transfer agreements, and to certify students' degree completion. In so doing, much of the original value of the rubric for writing studies is lost. When used on a national scale, it is impossible to modify for local context. I argue experts in writing assessment need greater awareness of...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5575w31k</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Grouling, Jennifer</name>
      </author>
    </item>
    <item>
      <title>Measuring Civic Writing: The Development and Validation of the Civically Engaged Writing Analysis Continuum</title>
      <link>https://escholarship.org/uc/item/5552v71s</link>
      <description>As youth increasingly access the public sphere and contribute to civic life through digital tools, scholars and educators are rethinking how civically engaged writing is taught, nurtured, and assessed. This article presents the conceptual underpinnings of the National Writing Project's Civically Engaged Writing Analysis Continuum (CEWAC), a new tool for assessing youth's civically engaged writing. It defines four attributes of civically engaged writing using qualitative analysis of expert interviews and literature: employs a public voice, advocates civic engagement or action, argues a position based on reasoning and evidence, and employs a structure to support a position. The article also presents reliability and validity evidence for CEWAC. The study finds that CEWAC has a moderate to high level of exact agreement and a high level of exact or adjacent agreement. Covariation analyses showed that, even with similar scoring patterns, CEWAC's attributes hold at least a moderate level...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/5552v71s</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Friedrich, Linda</name>
      </author>
      <author>
        <name>Strother, Scott</name>
      </author>
    </item>
    <item>
      <title>Forum: Two-Year College Writing Placement as Fairness</title>
      <link>https://escholarship.org/uc/item/4zv0r9b2</link>
      <description>&lt;b&gt;Keywords&lt;/b&gt;: two-year colleges, placement, developmental education reform, ethics, policy</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4zv0r9b2</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Gilman, Holly</name>
      </author>
      <author>
        <name>Giordano, Joanne Baird</name>
      </author>
      <author>
        <name>Hancock, Nicole</name>
      </author>
      <author>
        <name>Hassel, Holly</name>
      </author>
      <author>
        <name>Henson, Leslie</name>
      </author>
      <author>
        <name>Hern, Katie</name>
      </author>
      <author>
        <name>Nastal, Jessica</name>
      </author>
      <author>
        <name>Toth, Christie</name>
      </author>
    </item>
    <item>
      <title>Three Interpretative Frameworks: Assessment of English Language Arts-Writing in the Common Core State Standards Initiative</title>
      <link>https://escholarship.org/uc/item/4zb222xg</link>
      <description>We present three interpretative frameworks by which stakeholders can analyze curricular and assessment decisions related to the Common Core State Standards Initiative in English Language Arts-Writing (CCSSI ELA-W). We pay special attention to the assessment efforts of the Smarter Balanced Assessment Consortium (Smarter Balanced) and the Partnership for Assessment of Readiness for College and Careers (PARCC). Informed by recent work in educational measurement and writing assessment communities, the first framework is a multidisciplinary conceptual analysis of the targeted constructs in the CCSSI ELA-W and their potential measurement. The second framework is provided by the Standards for Educational and Psychological Testing (2014) with a primary focus on foundational principles of validity, reliability/precision, and fairness. The third framework is evidence-centered design (ECD), a principled design approach that supports coherent evidentiary assessment arguments. We first illustrate...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4zb222xg</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Elliot, Norbert</name>
      </author>
      <author>
        <name>Rupp, Andre A.</name>
      </author>
      <author>
        <name>Williamson, David M.</name>
      </author>
    </item>
    <item>
      <title>Editors' Introduction: JWA Special Issue on Contract Grading</title>
      <link>https://escholarship.org/uc/item/4xw238dr</link>
      <description>Editors' Introduction: JWA Special Issue on Contract Grading</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4xw238dr</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
      <author>
        <name>Whithaus, Carl</name>
      </author>
    </item>
    <item>
      <title>The Effect of Scoring Order on the Independence of Holistic and Analytic Scores</title>
      <link>https://escholarship.org/uc/item/4xb1c9gj</link>
      <description>The Effect of Scoring Order on the Independence of Holistic and Analytic Scores</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4xb1c9gj</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Singer, Nancy Robb</name>
      </author>
      <author>
        <name>LeMahieu, Paul</name>
      </author>
    </item>
    <item>
      <title>Introduction to Volume 6, 2013</title>
      <link>https://escholarship.org/uc/item/4x12800x</link>
      <description>Welcome to Volume 6 of the &lt;i&gt;Journal of Writing Assessment&lt;/i&gt;. This is the third volume of the online, open-access format, and the second volume in which we are serving as editors. As we mentioned in the last volume, our policy is to publish articles as they complete the review process so that the scholarly work is published as quickly as it is accepted for publication. As a result, JWA doesn’t construct an issue in the same way a print journal does. Instead, the issue grows organically, and when we complete the calendar year we will provide some retrospective comments. You can access all of the archives of JWA for free. The archives include access to all of the print issues of JWA which were made available through the generosity of Hampton Press.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4x12800x</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
    </item>
    <item>
      <title>Beyond Tradition: Writing Placement, Fairness, and Success at a Two-Year College</title>
      <link>https://escholarship.org/uc/item/4wg8w0ng</link>
      <description>This archival study analyzed the impact of a writing skills placement test at a minority-serving community college. With special emphasis on 1,029 students in the lowest level of developmental writing class, attention was given to both performance (grades and grade point average) and to student placement (in terms of sex and race/ethnicity) from 2012-2016. With findings indicating undue burden on Black students as the result of the placement test, the case study is used to raise questions of success, its formulation, and the instrumental value of the case for next-generation fairness measures for two-year colleges.
&lt;br&gt;
  &lt;br&gt;&lt;b&gt;Keywords&lt;/b&gt;: college composition, fairness, two-year colleges, writing assessment, writing placement</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4wg8w0ng</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Nastal, Jessica</name>
      </author>
    </item>
    <item>
      <title>Building Student Agency Through Contract Grading in Technical Communication</title>
      <link>https://escholarship.org/uc/item/4v65z263</link>
      <description>The scholarship on contract grading has focused on the impacts in first-year writing, but little work explores how contract grading is used in other writing contexts, specifically technical communication. In fact, a focus on contract grading can align with the social justice turn in technical communication if viewed as a way to enact feminist and antiracist pedagogies. In this reflection, we--an instructor of an introductory technical communication service course and a student who took that class--share our experiences around contract grading. After providing an overview of the course and institutional context, we reflect together on our experiences around student perceptions and attitudes as well as the impact contract grading had on us as teacher and learner. We conclude with lessons learned and how instructors can take up contract grading in their technical communication classrooms. Our goal is to share our experiences that could lead to scholarship on assessment practices...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4v65z263</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Mallette, Jennifer C.</name>
      </author>
      <author>
        <name>Hawks, Amanda</name>
      </author>
    </item>
    <item>
      <title>Situating writing assessment practices: A review of A Guide to College Writing Assessment</title>
      <link>https://escholarship.org/uc/item/4kw4c2mx</link>
      <description>Situating writing assessment practices: A review of A Guide to College Writing Assessment</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4kw4c2mx</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Getchell, Kristen</name>
      </author>
    </item>
    <item>
      <title>An Integrated Design and Appraisal Framework for Ethical Writing Assessment</title>
      <link>https://escholarship.org/uc/item/4bg9003k</link>
      <description>In my introduction to this special issue, I highlighted the insufficiency of key measurement concepts--fairness, validity, and reliability--in guiding the design and implementation of writing assessments. I proposed that the concept of ethics provides a more complete framework for guiding assessment design and use. This article advances the philosophical foundation for our theory of ethics articulated by Elliot (this issue). Starting with fairness as first principle, it examines how safety and risk can be addressed through the application of an integrated design and appraisal framework (IDAF) for writing assessment tools. The paper is structured around two case studies set in Alberta, Canada. Case Study 1 applies Kane's (2013) IUA model of validation to an appraisal--Alberta's English 30-1 (grade 12 academic English) diploma exam program--highlighting in the process the limitations in contemporary validity theory. Case Study 2 examines an assessment design project I am currently...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/4bg9003k</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Slomp, David</name>
      </author>
    </item>
    <item>
      <title>Editorial Board 2016</title>
      <link>https://escholarship.org/uc/item/49n8m2fg</link>
      <description>Editorial Board 2016</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/49n8m2fg</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
    </item>
    <item>
      <title>Stories About Grading Contracts, or How Do I Live Through the Violence I've Done?</title>
      <link>https://escholarship.org/uc/item/3zw9h7p9</link>
      <description>Stories About Grading Contracts, or How Do I Live Through the Violence I've Done?</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3zw9h7p9</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Inoue, Asao B.</name>
      </author>
    </item>
    <item>
      <title>Let Them In: Increasing Access, Completion, and Equity in English Placement Policies at a Two-Year College in California</title>
      <link>https://escholarship.org/uc/item/3nh6v5d0</link>
      <description>This article uses a disparate impact analysis framework to assess the impact of a policy change in writing assessment that roughly doubled the proportion of students placing into college English at Butte College, a two-year college in California. After establishing the disparate impact of placement, we tracked how students performed in college English, subsequent college courses, and overall college completion under the new policy. We found that substantially more students completed college English compared to previous cohorts, with Asian, African American, Latinx, and Native American students' completion of college English doubling or tripling. Upon taking subsequent college courses, students placing into college English under the new policy performed as well as those who had qualified for college English under the more restrictive policy. Overall college completion outcomes, including degree completion and meeting the criteria for transferring to 4-year universities, have generally...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3nh6v5d0</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Henson, Leslie</name>
      </author>
      <author>
        <name>Hern, Katie</name>
      </author>
    </item>
    <item>
      <title>Automated Essay Scoring in Innovative Assessments of Writing from Sources</title>
      <link>https://escholarship.org/uc/item/3nf6r4kv</link>
      <description>This study examined automated essay scoring for experimental tests of writing from sources. These tests (part of the CBAL research initiative at ETS) embed writing tasks within a scenario in which students read and respond to sources. Two large-scale pilots are reported: One was administered in 2009, in which four writing assessments were piloted, and one was administered in 2011, in which two writing assessments and two reading assessments were administered. Two different rubrics were applied by human raters to each prompt: a general rubric intended to measure only those skills for which automated essay scoring provides relatively direct measurement, and a genre-specific rubric focusing on specific skills such as argumentation and literary analysis. An automated scoring engine (e-rater) was trained on part of the 2009 dataset, and cross-validated against the remaining 2009 dataset and all the 2011 data. The results indicated that automated scoring can achieve operationally acceptable...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3nf6r4kv</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Deane, Paul</name>
      </author>
      <author>
        <name>Williams, Frank</name>
      </author>
      <author>
        <name>Weng, Vincent</name>
      </author>
      <author>
        <name>Trapani, Catherine S.</name>
      </author>
    </item>
    <item>
      <title>Are We Whom We Claim to Be? A Case Study of Language Policy in Community College Writing Placement Practices</title>
      <link>https://escholarship.org/uc/item/3kh925t2</link>
      <description>This article undertakes a qualitative case study and critical examination of the language ideologies implicit in placement procedures at an urban community college in Washington State. By focusing on the racism inherent in the tacit language policy embedded in the school's placement and assessment procedures, the author proposes strategies to effect change, both at this specific institution and others that employ similar tacit language policies. 
&lt;br&gt;
  &lt;br&gt;&lt;b&gt;Keywords&lt;/b&gt;: Language policy, inequity, placement practices, Language Minority students, standard English</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3kh925t2</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Gilman, Holly</name>
      </author>
    </item>
    <item>
      <title>Recognizing Multiplicity and Audience across the Disciplines: Developing a Questionnaire to Assess Undergraduates' Rhetorical Writing Beliefs</title>
      <link>https://escholarship.org/uc/item/3gr7j9dz</link>
      <description>How do students feel about expressing uncertainty in their academic writing? To what extent do they think about their readers as they compose? Understanding the enactment of rhetorical knowledge is among the goals of many rich qualitative studies about students' reading and writing processes (e.g. Haas &amp;amp; Flower, 1988; Roozen, 2010). The current study seeks to provide a quantitative assessment of students' rhetorical beliefs based on a questionnaire. This study reports on (1) the development of the Measure of Rhetorical Beliefs and (2) demonstration of the measure's construct validity and utility by comparing undergraduates' rhetorical and epistemological beliefs, as well as their composing process, across different majors. The new Measure of Rhetorical Beliefs (MRB) was administered to engineering, business, and liberal arts and science majors, along with the Inventory of Process in College Composition (Lavelle and Zuercher, 2001) and the Epistemological Belief Inventory (Schraw,...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3gr7j9dz</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Neely, Michelle</name>
      </author>
    </item>
    <item>
      <title>Introduction from the New Editors</title>
      <link>https://escholarship.org/uc/item/3b34m9k0</link>
      <description>Welcome to the first volume of &lt;i&gt;JWA&lt;/i&gt; under our editorship. For the last 14 months, we have been working with Brian Huot (the former editor), Hampton Press (the former publisher), and technology support to move &lt;i&gt;JWA&lt;/i&gt; to an online, open-access format. Because of everyone's cooperation, we were able to get all of the &lt;i&gt;JWA&lt;/i&gt; archives online. We also helped to publish Volume 4, Brian's last volume as editor.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/3b34m9k0</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Kelly-Riley, Diane</name>
      </author>
    </item>
    <item>
      <title>The Micropolitics of Pathways: Teacher Education, Writing Assessment, and the Common Core</title>
      <link>https://escholarship.org/uc/item/392847v0</link>
      <description>Within writing assessment scholarship, disciplinary discussions about the politics of pathways regularly question how reforms mediate education and affect education actors. This article complements and complicates these conversations by attending to the micropolitics of pathways: how local education actors mediate reform-related standards, and, in the process, pave what they believe to be locally-meaningful pathways. Taking the Common Core State Standards (CCSS) as our point of departure, our study centers on one important site for micropolitical work that has, to date, gone unstudied in CCSS-focused writing assessment research: teacher education, which involves coordination between secondary and postsecondary actors who might differently interpret and engage with externally-imposed reforms. Our findings suggest that while standards may be politically intended to mediate education and standardize pathways, teachers micropolitically interpret and repurpose those standards--strategically...</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/392847v0</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Hammond, J. W.</name>
      </author>
      <author>
        <name>Garcia, Merideth</name>
      </author>
      <author>
        <name>Salt Lake Community College</name>
      </author>
    </item>
    <item>
      <title>A Theory of Ethics for Writing Assessment</title>
      <link>https://escholarship.org/uc/item/36t565mm</link>
      <description>This paper proposes a theory of ethics for writing assessment. Based on a definition of fairness as the identification of opportunity structures created through maximum construct representation under conditions of constraint--and the toleration of constraint only to the extent to which benefits are realized for the least advantaged--the theory is expressed in terms of its tradition, boundary, order, and foundation. To examine the force of the theory, a thought experiment demonstrating action based on the theory is offered so that its weaknesses and strengths are identified. Intended for the research specialization of writing assessment, the theory has generalization implications for the field of writing studies.</description>
      <guid isPermaLink="true">https://escholarship.org/uc/item/36t565mm</guid>
      <pubDate>Wed, 29 Sep 2021 00:00:00 +0000</pubDate>
      <author>
        <name>Elliot, Norbert</name>
      </author>
    </item>
  </channel>
</rss>
