Bibliographic record - detail view
Authors | Kopp, Jason P.; Jones, Andrew T. |
---|---|
Title | Impact of Item Parameter Drift on Rasch Scale Stability in Small Samples over Multiple Administrations |
Source | In: Applied Measurement in Education, 33 (2020) 1, pp. 24-33 (10 pages) |
Full text | PDF |
Language | English |
Document type | print; online; journal article |
ISSN | 0895-7347 |
DOI | 10.1080/08957347.2019.1674303 |
Keywords | Item Response Theory; Psychometrics; Sample Size; Sampling; Test Items; Test Bias; Accuracy; Test Validity; Scores; Simulation; Error of Measurement |
Abstract | Traditional psychometric guidelines suggest that at least several hundred respondents are needed to obtain accurate parameter estimates under the Rasch model. However, recent research indicates that Rasch equating yields accurate parameter estimates with sample sizes as small as 25. Item parameter drift under the Rasch model has been investigated previously, but not under small-sample conditions. The current study simulated data using the Rasch model to investigate the impact of varying proportions, magnitudes, and directions of item drift across multiple administrations with sample sizes of 10, 25, and 50. High proportions of high-magnitude drift strongly affected bias, RMSE, and classification accuracy. High proportions of positive drift (items becoming harder across administrations) were particularly problematic for classification accuracy, with accuracy falling below 60% in the most extreme conditions. Results indicate that item drift affects Rasch examinee ability estimates similarly under both small- and large-sample conditions, and that even small to moderate degrees of unaddressed item drift can substantially affect the validity of scores and pass-fail decisions. (As Provided). |
Notes | Routledge. Available from: Taylor & Francis, Ltd., 530 Walnut Street, Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2024/01/01 |
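The abstract describes simulating dichotomous responses under the Rasch model and modeling positive item drift as item difficulties shifting upward between administrations. A minimal sketch of such a simulation is below; the test length, drift proportion, and drift magnitude are illustrative assumptions, not the study's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

def rasch_responses(theta, b, rng):
    """Simulate dichotomous responses under the Rasch model:
    P(X=1 | theta, b) = 1 / (1 + exp(-(theta - b)))."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    return (rng.random(p.shape) < p).astype(int)

n_examinees = 25            # one of the study's small-sample conditions
n_items = 40                # illustrative test length (assumption)
theta = rng.normal(0, 1, n_examinees)  # examinee abilities
b = rng.normal(0, 1, n_items)          # item difficulties, administration 1

# Positive drift: a proportion of items becomes harder at administration 2
drift_prop, drift_mag = 0.2, 0.5       # illustrative values (assumption)
drifted = rng.choice(n_items, int(drift_prop * n_items), replace=False)
b2 = b.copy()
b2[drifted] += drift_mag

x1 = rasch_responses(theta, b, rng)    # administration 1
x2 = rasch_responses(theta, b2, rng)   # administration 2, with unaddressed drift
```

Comparing ability estimates recovered from `x1` and `x2` while treating the item parameters as fixed is what exposes the bias, RMSE, and classification-accuracy effects the abstract reports.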