
Investigating Item Parameter Drift across Computer- and Paper-Based Assessment in PISA 2015

The importance of international large-scale assessments (LSAs) has grown because of their role in evaluating the effectiveness of education systems. The Programme for International Student Assessment (PISA) has been transitioning from paper-based assessment (PBA) to computer-based assessment (CBA). However, this migration raises new issues concerning the equivalence of assessment modes, validity, and fairness. Although the OECD has conducted some item analyses for PISA 2015 (OECD, 2016), further evidence is needed on the equivalence of the assessment modes in terms of items' statistical properties. To my knowledge, few studies have made such comparisons at the item level (Terluin, Brouwers, Marchand, & De Vet, 2018). In light of this, the present study aims to compare CBA and PBA in terms of item parameter drift (IPD) in the PISA 2015 mathematics test items. The OECD average mathematics score will be used as the criterion for selecting the sample countries. Differential item functioning (DIF) methods will be used to investigate IPD; the relationship between the two concepts has been described as follows: "When DIF occurs across testing occasions, rather than across groups, it is called item parameter drift" (Han et al., 2012). Although IPD affects test validity, fairness, and score interpretation (DeMars, 2004; Han et al., 2012), few studies have investigated IPD in assessment settings (Terluin et al., 2018; Wu et al., 2017; Sachse & Haag, 2017). The main research question is: Is there item parameter drift between the CBA and PBA forms of PISA 2015?
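The abstract does not commit to a specific DIF procedure, so the following is only a minimal sketch of one common option, logistic-regression DIF (Swaminathan & Rogers, 1990), to illustrate how mode-related drift could be screened item by item. Everything in it is assumed: the response matrix, the mode indicator, and the drift injected on item 0 are simulated placeholders, not PISA 2015 data.

```python
# Sketch of a logistic-regression DIF screen comparing two assessment modes.
# All data below are simulated; "mode" stands in for a hypothetical CBA/PBA flag.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, n_items = 1000, 5
theta = rng.normal(size=n)                 # latent ability
mode = rng.integers(0, 2, size=n)          # 0 = PBA, 1 = CBA (hypothetical labels)

# Rasch-like response simulation, with artificial uniform drift on item 0
diffic = rng.uniform(-1.0, 1.0, size=n_items)
logits = theta[:, None] - diffic[None, :]
logits[:, 0] += 0.8 * mode                 # injected mode effect on item 0
responses = (rng.uniform(size=(n, n_items)) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

total = responses.sum(axis=1)              # observed-score matching variable
for j in range(n_items):
    y = responses[:, j]
    # Reduced model: matching score only; full model adds mode and its interaction
    reduced = sm.add_constant(total.astype(float))
    full = sm.add_constant(np.column_stack([total, mode, total * mode]).astype(float))
    m0 = sm.Logit(y, reduced).fit(disp=0)
    m1 = sm.Logit(y, full).fit(disp=0)
    lr = 2.0 * (m1.llf - m0.llf)           # LR test for uniform + non-uniform DIF, df = 2
    print(f"item {j}: LR = {lr:.2f}, p = {chi2.sf(lr, df=2):.4f}")
```

Matching on the observed total score keeps the sketch self-contained; a more direct operationalisation of IPD would compare item parameters calibrated separately in each mode under an IRT model, which is closer to how drift is defined in the literature cited above.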
