Sara Steegen

Quantitative Psychology and Individual Differences
Faculty of Psychology and Educational Sciences
KU Leuven, Belgium

Supervisors
Prof. Wolf Vanpaemel & Prof. Francis Tuerlinckx

On October 17th, 2017, Sara Steegen defended her thesis entitled

Towards better research practices in psychology

Summary
Psychology faces a deep crisis of confidence and is at risk of losing its credibility. Researchers are being criticized for the way they conduct studies, analyze data, and report results. In response to these concerns about research quality in psychology, several recommendations have been made to overcome the problem. The goal of this dissertation is to contribute to this enterprise of improving research quality in the field of psychology.

In Chapter 1, we carry out a replication study, implementing the most commonly made recommendation for good research practices. In Chapter 2, we extend the class of recommendations that focus on transparency by highlighting the importance of increased transparency about arbitrary choices in data processing. We suggest that, instead of performing only one analysis, researchers perform a multiverse analysis: carrying out the analysis across the whole set of alternatively processed data sets corresponding to a large set of reasonable processing scenarios.
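To make the idea concrete, the sketch below runs one and the same analysis over every combination of a few arbitrary processing choices. The data, the processing options (outlier cutoffs and transformations), and the t-test are invented stand-ins for illustration, not the analyses reported in the dissertation.

```python
from itertools import product

import numpy as np
import pandas as pd
from scipy import stats

# Simulated stand-in data: two groups, one outcome measure.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "group": rng.integers(0, 2, size=200),
    "score": rng.normal(loc=10.0, scale=1.0, size=200),
})

# Two arbitrary processing choices, each with several defensible options.
outlier_rules = {"z < 2.5": 2.5, "z < 3.0": 3.0, "keep all": None}
transforms = {"raw": lambda x: x, "log": np.log}

results = []
for (rule, cutoff), (name, transform) in product(outlier_rules.items(),
                                                 transforms.items()):
    d = data.copy()
    d["score"] = transform(d["score"])
    if cutoff is not None:
        d = d[np.abs(stats.zscore(d["score"])) < cutoff]
    _, p = stats.ttest_ind(d.loc[d["group"] == 0, "score"],
                           d.loc[d["group"] == 1, "score"])
    results.append({"outliers": rule, "transform": name, "p": p})

# One row per reasonably processed data set: the full grid of outcomes is the multiverse.
print(pd.DataFrame(results))
```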

Chapters 3 and 4 cover topics concerning Bayes factors, which are being advocated as a Bayesian alternative to null hypothesis significance testing. In Chapter 3, we compare the Bayes factor with an alternative Bayesian model selection method: the Prior Information Criterion (PIC). We show that the PIC can lead to conclusions that not only differ widely from the conclusions based on the Bayes factor, but are also highly undesirable. Finally, in Chapter 4, we extend the core idea of Bayes factors, considering average fit rather than best fit, to qualitative data. We explore the potential of Parameter Space Partitioning, a model evaluation tool that focuses on qualitative data patterns, as a model selection method, focusing on average model fit with respect to the qualitative aspects of the data.
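The "average fit" idea behind a Bayes factor can be illustrated with a small numeric example that is not taken from the dissertation: for binomial data, the marginal likelihood under the alternative is the likelihood averaged over a prior on the success probability, and the Bayes factor is the ratio of that average fit to the fit of the point null.

```python
import numpy as np
from scipy import stats

# Hypothetical data: 14 successes in 20 trials.
k, n = 14, 20

# "Fit" of the point null H0: theta = 0.5 is simply the likelihood at that value.
m0 = stats.binom.pmf(k, n, 0.5)

# Marginal likelihood of H1: the likelihood *averaged* over a uniform prior on theta,
# approximated by brute-force Monte Carlo to make the averaging explicit.
theta = np.random.default_rng(1).uniform(size=200_000)
m1 = stats.binom.pmf(k, n, theta).mean()

print(f"BF10 = {m1 / m0:.2f}")  # > 1 favours H1, < 1 favours H0
```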

Project

Model selection for qualitative data patterns
Evaluating different models against empirical data and identifying the model that most closely approximates the processes that generated the phenomenon is known as model selection. Much more than in other sciences, such as physics, formal modeling in psychology is characterized by a proliferation of competing accounts of human and other animal behavior. In psychology especially, therefore, model selection is an issue of key importance.

Recently, a new model evaluation method has been proposed, called Parameter Space Partitioning (PSP; Pitt, Kim, Navarro, & Myung, 2006). Rather than considering quantitative model predictions, as most other methods do, PSP focuses on the theoretically relevant qualitative predictions of a model. It searches all the qualitative data patterns that a model can generate by partitioning its parameter space into regions that correspond to these patterns. In a first project, we proposed a score based on PSP that captures a model's global adequacy in describing the observed qualitative trends. This score can be used to select between models if one is interested in their qualitative behaviour.
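As a rough illustration of the PSP idea (the method itself uses an MCMC search of the parameter space; Pitt et al., 2006), the sketch below evaluates an invented two-parameter toy model on a grid, records the qualitative pattern (here, the rank order of three predicted responses) produced by each parameter setting, and estimates the relative size of each pattern's region.

```python
from collections import Counter
from itertools import product

import numpy as np

def toy_model(a, b):
    """Predicted responses in three conditions for an invented two-parameter model."""
    return np.array([a, a + b, a * b])

def pattern(predictions):
    """Qualitative data pattern: the rank order of the predicted responses."""
    return tuple(np.argsort(predictions))

# Crude stand-in for PSP's MCMC search: sweep a grid over the parameter space
# and record which qualitative pattern each parameter setting generates.
grid = np.linspace(0.1, 2.0, 200)
region_sizes = Counter(pattern(toy_model(a, b)) for a, b in product(grid, grid))

# The relative region sizes approximate how much of the parameter space
# each qualitative pattern occupies.
total = sum(region_sizes.values())
for pat, count in region_sizes.most_common():
    print(pat, f"{count / total:.2%}")
```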

In future projects, we will apply PSP to psychological models in fields such as category learning to gain insight into their qualitative behaviour. Furthermore, other model selection methods (e.g., Bayesian methods) will be explored, with applications in psychology.

Financed by
KU Leuven