B.28: Research on Responding and Document Assessment—Assessing Writing and Responding Using Traditional and Big Data Methods
Reviewed by LauraAnne Carroll-Adler, University of Southern California, Los Angeles, CA (lauraana@usc.edu)
Speakers: David Martins, Rochester Institute of Technology, Rochester, NY, “Pragmatic Approaches to Assess Writing and Improve Instruction across Language and Cultural Difference”
Sandy Vandercook, Leavell College/New Orleans Baptist Theological Seminary, New Orleans, LA, “Am I Wasting my Time? Teachers’ Beliefs about Written Response and Their Actual Written Response Practices”
(Chair and 3rd speaker did not attend)
The first presenter, David S. Martins, opened by drawing his audience into the presentation, asking which aspects of his topic interested them most. In his paper, he discussed practices developed for assessing first-year composition (FYC) work across several campuses, including international sites in Kosovo, Dubrovnik, and Dubai.
Although the loss of one presenter could have provided more time for what was a fairly complex, data-laden presentation, Martins stayed within the 15-minute allotment. He presented several slides that covered the criteria for assessment used by the teams of readers: one slide, for example, listed Scope, Content, Purpose, Integration, and Variety, each on a 0–4 point scale. Another described the methodology used for grading, norming, and rechecking the scoring of essays across the several campuses. Martins noted the flexibility of the setup; instructors were able to Skype in and participate from other countries, and even from a car stuck in a storm. He concluded with a set of questions and challenges still to be addressed, such as aligning data collection from these assessments with the goals and needs of the various participating institutions.
The premise behind this presentation was interesting, and it surely holds promise for exploring methods of proceeding in an increasingly mobile and online environment. Many of the slides, however, were presented too quickly to copy or even photograph; they were also not accessible on the Conference on College Composition and Communication (CCCC) Connected Community site. I am hoping they will eventually be made available there or published in some form, since the project and its implications point the way towards continued growth in transnational assessment practices.
The second speaker was Sandra F. Vandercook, whose presentation centered on individual assessment, or, more specifically, on commenting practices. She proposed examining the practice of commenting from the teachers' point of view by suggesting we ask ourselves what our goals are for our comments.
She discussed her study of four writing instructors who were interviewed about their teaching goals and beliefs, observed in class, and then asked to submit samples of comments on student writing. A few themes emerged from the study. Primarily, Vandercook argued that teachers need to think of themselves as responding as readers. Teachers also needed and wanted an opportunity to reflect on their own pedagogical values and beliefs. She referred to Chris Anson's (2012) work, "What Good is it? The Effects of Teacher Response on Students' Development," on the complex social and instructional setting of the act of reading/grading, and concluded by reminding instructors to bring their own beliefs and their knowledge of individual students' needs into coherence with the messages they receive from institutional programs. The complete text, including the slides and an extensive list of references, has been uploaded to the Connected Community site.
Questions from the audience began with one for Martins on the prevalence of large-scale writing examinations such as statewide teacher exams and those administered by Educational Testing Service. It was noted that the process of norming associated with scoring these exams has several purposes (teacher training, professional development, and practical clarification of department goals) in addition to facilitating the grading itself.
A second question for Vandercook raised the issue of students who prefer written comments on hard copies to online remarks. It was noted that, as of now, many students see in-text, pen-and-paper comments as more accessible and more personal than typed online comments.
References
Anson, Chris. (2012). What good is it? The effects of teacher response on students’ development. In Norbert Elliot & Les Perelman (Eds.), Writing assessment in the 21st century: Essays in honor of Edward M. White (pp. 187–202). Cresskill, NJ: Hampton Press.