Date of Award

Spring 1-1-2017

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

First Advisor

Kathy Escamilla

Second Advisor

Guillermo Solano-Flores

Third Advisor

Arturo Aldama

Fourth Advisor

Susan Hopewell

Fifth Advisor

Edd Taylor

Abstract

Current policy mandates that emergent bilingual (EB) students take standardized assessments before they are fully proficient in English. Additionally, standardized assessments are quickly transitioning to online administration. The additional design features inherent in computer-based assessments are likely to increase the number of construct-irrelevant factors and affect the accessibility of assessment items for EBs.

This study used frequency analyses, descriptive statistics, item semiotic complexity measures, Spearman’s rank-order correlation, one- and two-way analysis of variance, and Chi-square test of independence to examine the semiotic components and complexity of items included in a Smarter Balanced Assessment Consortium Grade 8 practice test, and how the semiotic features of the items affected non-EB and EB students as they solved four of the items.

Results indicate that the SBAC items differ from items in curriculum resources in their ideational, interpersonal, and textual metafunctions, as well as in their intersemiotic relations. The average item semiotic complexity was approximately 69 components, meaning that students had to correctly interpret, as intended by the item writers, 69 different constituents to answer an item correctly. Regardless of language group, students' reported actions and thinking involved very few of the semiotic components or intersemiotic relations belonging to the test items.

For all items combined, EBs had a higher total cognitive load than non-EBs. As item semiotic complexity increased, EBs' scores decreased, while non-EBs' scores remained constant once a certain level of item semiotic complexity was reached. Results indicate that as cognitive load decreased, total score increased for both language groups. Additionally, total cognitive load negatively affected both groups of students during problem-solving, and non-EBs during interpretation.

This study provides the PARCC and SBAC assessment consortia with information relevant to designing computer-based test items that use multiple semiotic resources in ways that are sensitive to the characteristics of EB students. Results from this study contribute to the establishment of a research agenda on the relationship between the semiotic properties of test items and the achievement of EB students on large-scale tests. Additionally, the information gained from this study informs teachers on how they can use semiotic resources to support their students in the classroom.