Angelina Kuchina

Department of Methodology and Statistics
Tilburg School of Social and Behavioral Sciences
Tilburg University

Project
Psychometric innovations in monitoring learning progress in students

To ensure that students can get the best out of themselves, it is important to tailor learning activities to their unique individual needs. To accomplish this goal, we need to closely monitor each individual's learning progress so that we can match the learning tasks accordingly. This process of feedback and feed-up requires the regular collection of reliable and valid measurements of learning progress. What progress has been made? Are there subjects in which students progress exceptionally well or lag behind? At the same time, these measurements should be obtained as efficiently as possible and preferably be naturally embedded in the learning process. Testing time should therefore be kept as short as possible. This in turn means that all available information – whether from the test itself (answers or response times) or from related contextual information – must be used optimally. The project's central research question is: How can we measure learning progress as validly and reliably as possible, in a practical way, using innovative psychometric techniques that integrate different sources of information about the students? Simulation research and the analysis of existing longitudinal data sets will be used to answer this question.
Project 1: One of the goals of educational measurement is to monitor the learning progress of students. When learning progress is monitored, this is typically done not for a single domain (e.g., addition within arithmetic), but for many related domains at the same time (e.g., addition, subtraction, multiplication, and division). In psychometrics there is a long tradition of investigating whether using separate scores on such domains (i.e., subscale scores) is better than using a composite score that combines the highly related domains (Tate, 2004; Brennan, 2012; Sinharay, 2010; Haberman & Sinharay, 2010; Yao & Boughton, 2007). Subscale scores are aimed at providing a more fine-grained picture of a student's profile, but since subscale scores are based on a smaller number of items, they are less reliable than composite scores and therefore do not always provide added value. This question of added value is even more relevant in the context of change scores – the differences between a student's scores at two (or more) measurement occasions – since change scores are less reliable than the scores themselves (Bereiter, 1963; Cronbach & Furby, 1970; Denney, Rapport, & Chung, 2005; Finney et al., 1980; Kim & Camilli, 2014; Linn & Slinde, 1977; Lord, 1963; O'Connor, 1972; Raaijmakers, 2016; Sandell & Wilczek, 2016; Son & Morrison, 2010; Williams & Kaufmann, 2012). In this project we will investigate under which conditions subscale change scores provide added value for evaluating the learning progress of students. In a simulation study we will consider a wide range of realistic conditions in terms of the reliability of the subscores and composite scores, the correlations between the subdomains, the amount of change, and whether the change is related to the level of the student. Data will be generated using multidimensional dynamic item response theory models.
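The lower reliability of change scores can be made concrete with the classical test theory result for the reliability of a difference score D = X − Y (see, e.g., Lord, 1963; the notation below is ours, not taken from the project itself):

```latex
\rho_{DD'} \;=\; \frac{\sigma_X^2\,\rho_{XX'} + \sigma_Y^2\,\rho_{YY'} - 2\,\rho_{XY}\,\sigma_X \sigma_Y}
                      {\sigma_X^2 + \sigma_Y^2 - 2\,\rho_{XY}\,\sigma_X \sigma_Y}
% With equal variances and a common reliability \rho this reduces to
\rho_{DD'} \;=\; \frac{\rho - \rho_{XY}}{1 - \rho_{XY}}
```

For example, two occasion scores that each have reliability 0.80 but correlate 0.70 yield a change score with reliability (0.80 − 0.70)/(1 − 0.70) ≈ 0.33, illustrating why the added value of (subscale) change scores cannot be taken for granted.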
Both multidimensional models (in which all subscales and the relations between them are modeled) and unidimensional models (in which either all subscales are combined or a single subscale is considered in isolation) will be used to estimate the change in person abilities between the measurement occasions. When evaluating the added value of subscale change scores, we will examine whether the subscale change scores or the composite change scores are better predictors of the true change on the subscales. Furthermore, at the individual level we will examine the quality of decisions about the presence of change. Based on the results of the simulation studies, we will provide guidelines for practitioners on the use of subscales when evaluating change.
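A minimal sketch of the kind of simulation described above, using simple sum scores in place of the model-based change estimates the project will actually use, and a basic Rasch model instead of a full multidimensional dynamic IRT model. All specifics (sample size, subdomain correlation, amounts of change, item counts) are illustrative assumptions, not the project's design:

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 500, 10     # persons; items per subscale (assumed values)
rho = 0.8                        # assumed correlation between the two subdomains

# True abilities at occasion 1 and true change (delta) per subdomain
cov = np.array([[1.0, rho], [rho, 1.0]])
theta1 = rng.multivariate_normal([0.0, 0.0], cov, size=n_persons)
delta = rng.multivariate_normal([0.5, 0.2], 0.10 * cov, size=n_persons)
theta2 = theta1 + delta

# Fixed Rasch item difficulties: one row of items per subscale,
# administered at both occasions
b = rng.normal(0.0, 1.0, size=(2, n_items))

def responses(theta):
    """Dichotomous Rasch responses; one person-by-item matrix per subscale."""
    return [rng.binomial(1, 1.0 / (1.0 + np.exp(-(theta[:, [d]] - b[d]))))
            for d in range(2)]

r1, r2 = responses(theta1), responses(theta2)
comp_change = sum(r.sum(1) for r in r2) - sum(r.sum(1) for r in r1)

cors = {}
for d in range(2):
    sub_change = r2[d].sum(1) - r1[d].sum(1)   # observed subscale change score
    cors[d] = (np.corrcoef(sub_change, delta[:, d])[0, 1],    # subscale vs. true change
               np.corrcoef(comp_change, delta[:, d])[0, 1])   # composite vs. true change
    print(f"subdomain {d}: subscale r = {cors[d][0]:.2f}, composite r = {cors[d][1]:.2f}")
```

Comparing the two correlations per subdomain across conditions (reliability, subdomain correlation, amount of change) is the logic behind the planned evaluation; the actual study would replace the sum scores with estimates from the multidimensional and unidimensional IRT models.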
Project 2: to be determined.
Project 3: to be determined.
Project 4: to be determined.

Supervisors
Dr. Wilco Emons
Dr. Maria Bolsinova
Prof. Jeroen Vermunt

Period
20 September 2023 – 20 September 2027