Date of Award

Spring 1-1-2018

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

First Advisor

Derek C. Briggs

Second Advisor

Benjamin Shear

Third Advisor

William Penuel

Fourth Advisor

Kai Larsen

Fifth Advisor

Gary McClelland

Abstract

Survey researchers have long been aware that respondents may not always be attentive when answering surveys. Only recently, however, have researchers studied the detection and impact of the answer patterns that respondents provide when not enacting critical steps in the expected response process. I term these answer patterns Low Cognitive Effort (LCE) response vectors, where the term response vector refers to the complete set of item responses provided by a single respondent. Previous research suggests that the most promising LCE detection methods flag between 10% and 20% of respondents and that statistics like Cronbach’s alpha may shift by 5 to 15% when LCE vectors are present in samples.

This study examines both the detection and impact of LCE vectors using simulated and empirical data. Three person fit statistics from Item Response Theory modeling that have not previously been examined as LCE detection methods are compared with eight previously examined detection methods. Further, this study examines how simulated LCE vectors affect common survey properties and statistics when plausible amounts of these responses are present in samples.
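To illustrate the kind of person-fit statistic referred to above, the following sketch computes the standardized log-likelihood statistic lz under a Rasch model. This is only a generic example of an IRT person-fit index; the dissertation does not specify here which three statistics it studies, and the function and parameter names are assumptions.

```python
import numpy as np

def lz_person_fit(responses, theta, b):
    """Standardized log-likelihood (lz) person-fit statistic under a Rasch model.

    responses: 0/1 item scores for one respondent
    theta: the respondent's latent trait estimate
    b: item difficulty parameters
    Large negative values indicate a response vector that fits the model poorly.
    (Illustrative sketch only; names and setup are assumptions.)
    """
    u = np.asarray(responses, dtype=float)
    p = 1.0 / (1.0 + np.exp(-(theta - np.asarray(b, dtype=float))))
    # observed log-likelihood of the response pattern
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    # expectation and variance of the log-likelihood under the model
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - e) / np.sqrt(v)
```

For example, a respondent who misses the easy items but answers the hard ones correctly (an aberrant pattern) receives a markedly more negative lz than one whose pattern follows item difficulty.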

In general, this study finds that the impact of LCE vectors is minimal under the most plausible circumstances. Further, the results corroborate previous findings that researchers should be most concerned with any amount of invariant, “straight line” response vectors, because these appear to have the largest impacts on conclusions. When seeking to remove LCE vectors, this study finds that a combination of a subset of the studied methods performs very well at flagging potential LCE responses.
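The invariant, “straight line” pattern mentioned above is the simplest LCE vector to detect: every item receives the same answer, so the response vector has zero variability. A minimal sketch of such a check (the function name and example data are assumptions, not the dissertation’s actual methods):

```python
def is_straight_line(responses):
    """Flag a response vector whose answers are all identical (invariant)."""
    return len(set(responses)) == 1

# hypothetical response vectors on a 5-point scale
print(is_straight_line([3, 3, 3, 3, 3]))  # invariant "straight line" pattern
print(is_straight_line([1, 4, 2, 5, 3]))  # varied pattern
```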
