Bibliographic Record - Detail View
Author(s) | Shin, Jinnie; Gierl, Mark J. |
---|---|
Title | Generating Reading Comprehension Items Using Automated Processes |
Source | In: International Journal of Testing, 22 (2022) 3-4, pp. 289-311 (23 pages) |
PDF full text |
Language | English |
Document type | print; online; journal article |
ISSN | 1530-5058 |
DOI | 10.1080/15305058.2022.2070755 |
Keywords | Reading Comprehension; Test Construction; Test Items; Natural Language Processing; Automation; Computer Assisted Testing; High Stakes Tests; College Entrance Examinations; Foreign Countries; Test Format; Reading Tests; South Korea |
Abstract | Over the last five years, tremendous strides have been made in advancing the AIG methodology required to produce items in diverse content areas. However, the one content area where enormous problems remain unsolved is language arts, generally, and reading comprehension, more specifically. While reading comprehension test items can be created using many different item formats, fill-in-the-blank remains one of the most common when the goal is to measure inferential knowledge. Currently, the item development process used to create fill-in-the-blank reading comprehension items is time-consuming and expensive. Hence, the purpose of the study is to introduce a new systematic method for generating fill-in-the-blank reading comprehension items using an item modeling approach. We describe the use of different unsupervised learning methods that can be paired with natural language processing techniques to identify the salient item models within existing texts. To demonstrate the capacity of our method, 1,013 test items were generated from 100 input texts taken from fill-in-the-blank reading comprehension items used on a high-stakes college entrance exam in South Korea. Our validation results indicated that the generated items produced higher semantic similarities between the item options while depicting little to no syntactic differences with the traditionally written test items. (As Provided). |
Notes | Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2024/01/01 |
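The abstract describes pairing unsupervised learning with NLP to identify salient item models and generate fill-in-the-blank items. As a rough illustration only, the core idea (blank out a salient word in a passage and draw distractor options from the same text) can be sketched with a naive frequency heuristic. Everything below, including the names `generate_item` and `STOPWORDS`, is an illustrative assumption and not the authors' actual method:

```python
# Naive fill-in-the-blank item sketch: the paper uses unsupervised
# learning plus NLP to find salient item models; here a simple word-
# frequency heuristic stands in for that step (assumption, not the
# published method).
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "that", "it"}

def generate_item(passage: str, n_distractors: int = 3) -> dict:
    """Blank the most frequent content word and offer distractors
    drawn from other content words in the same passage."""
    words = re.findall(r"[A-Za-z]+", passage.lower())
    content = [w for w in words if w not in STOPWORDS and len(w) > 3]
    counts = Counter(content)
    key, _ = counts.most_common(1)[0]  # word chosen for the blank
    stem = re.sub(rf"\b{key}\b", "_____", passage, flags=re.IGNORECASE)
    distractors = [w for w, _ in counts.most_common() if w != key]
    return {"stem": stem, "key": key,
            "options": [key] + distractors[:n_distractors]}

item = generate_item(
    "Reading tests measure comprehension. A reading passage supports "
    "inference, and comprehension items probe that inference."
)
print(item["stem"])
print(item["options"])
```

A production pipeline in the paper's spirit would replace the frequency heuristic with learned item models and would validate options via semantic similarity, as the validation study reports; this sketch only shows the stem/key/distractor structure of such an item.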