Quantitative Psychology and Individual Differences
Faculty of Psychology and Educational Sciences
KU Leuven, Belgium
Prof. Rianne Janssen
On October 11th, 2016, Dries Debeer defended his thesis, entitled
Item-position effects and missing responses in large-scale assessments: Models and applications
Psychometric models for differential item performance
In educational and psychological measurement, it is often – if not always – assumed that test scores and item responses depend only on the measured attribute of interest, and that the measurement is invariant with respect to the administration conditions. However, it has been repeatedly shown that the administration context can violate this measurement invariance. These context effects, and how to deal with them, are the focus of my research.
Currently I am working on the effects of item position. In achievement testing, the use of alternate test forms containing the same items in different orders is a common strategy to prevent copying and enhance test security. Consequently, the same items are administered at different positions in the different test forms. These changes in item position can threaten the assumption of measurement invariance, or of item parameter invariance. Within the IRT framework, we are developing an integrated approach to detect and model these position effects. Combining the logic of Differential Item Functioning (DIF) models and the Linear Logistic Test Model (LLTM), this approach addresses both the item side and the person side of the issue, as it allows for individual differences in the effect of item position.
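The idea of a position effect with individual differences can be illustrated with a minimal simulation sketch, assuming a Rasch-type model extended with a linear position effect: logit P(X_pi = 1) = theta_p - beta_i + (gamma + delta_p) * pos(i), where gamma is the average position effect and delta_p a person-specific deviation. All names and parameter values here are illustrative, not the thesis's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)

n_persons, n_items = 500, 20
theta = rng.normal(0.0, 1.0, n_persons)   # person ability
beta = rng.normal(0.0, 1.0, n_items)      # item difficulty
gamma = -0.02                             # average (fatigue-like) position effect
delta = rng.normal(0.0, 0.01, n_persons)  # person-specific position-effect deviation

# Two alternate test forms: the same items in original and reversed order.
positions_a = np.arange(n_items)
positions_b = positions_a[::-1]


def response_prob(theta, beta, gamma, delta, positions):
    """P(correct) under a Rasch model with a linear, person-specific position effect."""
    logit = (theta[:, None] - beta[None, :]
             + (gamma + delta)[:, None] * positions[None, :])
    return 1.0 / (1.0 + np.exp(-logit))


p_a = response_prob(theta, beta, gamma, delta, positions_a)
p_b = response_prob(theta, beta, gamma, delta, positions_b)
```

Because the same item sits at different positions in the two forms, its success probabilities differ across forms, which is exactly the violation of item parameter invariance the approach is designed to detect.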
In the future, the framework will be extended to tackle other confounding context effects. First, we will develop an approach that models omissions and “not reached” items as different types of non-response. Second, we will focus on context-related differential item functioning, where the functioning of an item depends on the content of the previously administered item(s).