
Linguistics Seminar Series Presentation - "In quest of learner profiles in L2 writing classrooms: The development and validation of a profile-based rating scale for a post-admission ESL writing placement test"

Event Type
panel discussion
Department of Linguistics
Lucy Ellis Lounge, 1080 Foreign Languages Building, 707 S. Mathews Ave., Urbana
Oct 29, 2018   4:00 - 5:00 pm  
Xun Yan, Assistant Professor of Linguistics; Hyunji (Hayley) Park, Linguistics PhD student; Ha Ram (Hannah) Kim, Linguistics PhD student
Free and open to the public.
Originating Calendar
School of Literatures, Cultures and Linguistics Calendar

    Post-admission language assessments and support programs tend to serve students from a limited range of proficiencies due to the admission-selection process (Cho & Bridgeman, 2012; Ginther & Yan, 2018; Yan, Kim & Kotnarowski, forthcoming). While students may not differ substantially in general proficiency, they differ greatly in their strengths and weaknesses in subskills. This makes conventional holistic, proficiency-based rating scales incongruent with the placement and diagnostic needs of post-admission language assessments. This presentation reports on a two-year collaborative project between language testers and teachers to develop and validate a new scale for the English Placement Test (EPT) at UIUC, with the goal of better capturing the range of writing performances among ESL students on campus.


    Following a data-driven approach, a group of testers and teachers participated in a four-stage, iterative scale development process, in which they (1) reflected on the range of writing performances in ESL courses, (2) evaluated four rounds of sample essays from the test and developed descriptors for the new scale, (3) pilot-rated new essays using the new scale, and (4) revised the scale descriptors based on rater performance and feedback. The collaboration led to the emergence of a profile-based rating scale. The scale consists of four proficiency levels and six writing profiles, capturing students’ relative strengths and weaknesses in two major sub-components of academic writing ability: argumentation and lexico-grammar.


    To validate the profile-based scale, a sample of 300 benchmark essays was rated analytically on both argumentation and lexico-grammar. Many-facet Rasch modeling (Fox & Bond, 2013) and cluster analysis were performed on the subscale scores to extract the writing performance profiles represented in the test. The results confirmed the distinct writer profiles specified in the new scale. Findings of this study suggest that the profile-based scale is valid and better suited to the placement and diagnostic purposes of post-admission writing assessments. More importantly, collaboration between teachers and testers can promote assessment-related dialogue and practices within language support programs, strengthening the alignment across curriculum, instruction and assessment.
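To illustrate the clustering step described above, the sketch below groups essays by their two analytic subscores. This is a hypothetical toy example, not the EPT's actual data or analysis: the score band (1–6), the essay scores, and the initial cluster centers are all invented for illustration, and a real analysis would also fit a many-facet Rasch model to adjust scores for rater severity before clustering.

```python
def kmeans(points, centers, iters=20):
    """Plain k-means with fixed initial centers (deterministic toy version)."""
    for _ in range(iters):
        # Assignment step: each essay joins the nearest profile center
        clusters = [[] for _ in centers]
        for p in points:
            dists = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            clusters[dists.index(min(dists))].append(p)
        # Update step: move each center to the mean of its cluster
        centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return centers, clusters

# Invented (argumentation, lexico-grammar) subscores on a 1-6 band,
# built to show four distinct profiles:
essays = [
    (2, 2), (2, 3), (3, 2),  # weaker on both components
    (5, 2), (5, 3), (6, 2),  # strong argumentation, weak lexico-grammar
    (2, 5), (3, 5), (2, 6),  # weak argumentation, strong lexico-grammar
    (5, 5), (6, 5), (5, 6),  # strong on both components
]
centers, clusters = kmeans(essays, [(2, 2), (5, 2), (2, 5), (5, 5)])
for c, cl in zip(centers, clusters):
    print(f"profile center=({c[0]:.2f}, {c[1]:.2f}) n={len(cl)}")
```

With well-separated toy data the four recovered centers match the four intended profiles; on real subscale scores one would also compare cluster solutions of different sizes before settling on a profile count.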
