Description

Experience sampling methods involve repeatedly gathering self-report data from participants in daily life. Careless responding is known to partly invalidate such self-report data. The literature on techniques for detecting this type of responding is well established in classic survey research, but scarce for experience sampling data. We adapted existing careless responding detection techniques from survey studies and developed novel techniques for the experience sampling context. This resulted in 11 detection techniques, capturing overly fast reaction times, rhythmic tapping, too little and too much consistency in responses, and outlier responses. In three studies, we evaluated the interrelations between these techniques, how well they predicted actual (instructed or self-reported) careless responding, and how accurately combining them can identify careless responses in experience sampling research. Results showed that counts of psychometric antonym and synonym violations correlated with outlier indicators and with each other, suggesting that too little consistency is rare and could indicate careless responding. Moreover, using mixed-effects logistic regression models, we showed that the proposed techniques accurately classified instructed careless responses, with average time per item as the most powerful indicator. Furthermore, low correlations between our detection techniques and scores on a self-reported attention item suggest that the latter may not be an effective indicator of careless responding. In my symposium talk, I will present these techniques and findings, including recommendations for applying them in experience sampling research.
|Event title||Society for Ambulatory Assessment conference 2023|
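To make the families of indicators mentioned above concrete, here is a minimal sketch of four of them: synonym/antonym violation counts (too little consistency), longest identical-answer run (too much consistency), and mean time per item (overly fast responding). The item-pair indices, the 1-to-7 scale, and the `max_diff` thresholds are illustrative assumptions, not the thresholds used in the studies.

```python
import statistics

# Hypothetical item pairs assumed to measure near-identical content ("psychometric synonyms").
SYNONYM_PAIRS = [(0, 1), (2, 3)]
# Hypothetical pairs assumed to measure opposite content ("psychometric antonyms"),
# on an assumed 1..7 scale, so a consistent antonym answer mirrors its partner as 8 - x.
ANTONYM_PAIRS = [(4, 5)]

def synonym_violations(responses, max_diff=2):
    """Count synonym pairs whose answers differ by more than max_diff scale points."""
    return sum(1 for i, j in SYNONYM_PAIRS if abs(responses[i] - responses[j]) > max_diff)

def antonym_violations(responses, scale_max=7, max_diff=2):
    """Count antonym pairs whose answers disagree after reverse-scoring one item."""
    return sum(1 for i, j in ANTONYM_PAIRS
               if abs(responses[i] - (scale_max + 1 - responses[j])) > max_diff)

def longstring(responses):
    """Length of the longest run of identical answers (too much consistency)."""
    best = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def mean_time_per_item(item_times_seconds):
    """Average response time per item; very low values suggest careless responding."""
    return statistics.mean(item_times_seconds)
```

For example, a consistent response vector such as `[6, 6, 2, 2, 7, 1]` yields zero synonym and antonym violations, whereas straight-lining (`[7, 7, 7, 7, 7, 7]`) triggers the antonym check and a longstring of 6. In practice, such per-questionnaire indicators would feed into a classification model, such as the mixed-effects logistic regression described in the abstract.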