Bibliographic record - Detail view
Authors | Organisciak, Peter; Newman, Michele; Eby, David; Acar, Selcuk; Dumas, Denis |
---|---|
Titel | How Do the Kids Speak? Improving Educational Use of Text Mining with Child-Directed Language Models |
Source | (2023), (26 pages)
Full-text PDF |
Additional information | ORCID (Organisciak, Peter) ORCID (Newman, Michele) ORCID (Eby, David) ORCID (Acar, Selcuk) ORCID (Dumas, Denis) |
Language | English |
Document type | print; online; monograph |
Keywords | Learning Analytics; Child Language; Semantics; Age Differences; Elementary School Students; Gender Differences; Racial Differences; Educational Assessment; Test Format; Computation; Information Science; Measurement Techniques; Computational Linguistics; Vocabulary Development; Language Usage; Language Acquisition; English; Models; Generalization; Speech Communication; Culture Fair Tests; Language Tests; Creative Thinking; Television; Programming (Broadcast); Video Technology; Childrens Literature; Databases; Information Sources; Learning Processes; Teaching Methods; Children's language; Kindersprache; Semantik; Age; Difference; Age difference; Altersunterschied; Geschlechterkonflikt; Rassenunterschied; Education; assessment; Bewertungssystem; Testentwicklung; Informationswissenschaft; Messtechnik; Linguistics; Computerlinguistik; Wortschatzarbeit; Sprachgebrauch; Sprachaneignung; Spracherwerb; English language; Englisch; Analogiemodell; Language test; Sprachtest; Kreatives Denken; Fernsehen; Fernsehtechnik; Programmgestaltung; Children's literature; Kinderliteratur; Datenbank; Information source; Informationsquelle; Learning process; Lernprozess; Teaching method; Lehrmethode; Unterrichtsmethode |
Abstract | Purpose: Most educational assessments tend to be constructed in a closed-ended format, which is easier to score consistently and more affordable. However, recent work has leveraged computational text methods from the information sciences to make open-ended measurement more effective and reliable for older students. This study asks whether such text applications need to be adapted when used with samples of elementary-aged children. Design/methodology/approach: This study introduces domain-adapted semantic models for child-specific text analysis, to enable better educational assessment of elementary-aged children. A corpus compiled from a multi-modal mix of spoken and written child-directed sources is presented, used to train a children's language model, and evaluated against standard non-age-specific semantic models. Findings: Child-oriented language is found to differ in vocabulary and word sense use from general English, while exhibiting lower gender and race biases. The model is evaluated in an educational application of divergent thinking measurement and shown to improve on generalized English models. Originality: Research in computational measurement of open-ended responses has thus far used models of language trained on general English sources or domain-specific sources such as textbooks. This paper is the first to study age-specific language models for educational assessment. Additionally, while there have been several targeted, high-quality corpora of child-created or child-directed speech, the corpus presented here is the first developed with the breadth and scale required for large-scale text modeling. Research limitations/implications: The findings demonstrate the need for age-specific language models in the growing domain of automated divergent thinking measurement and strongly encourage the same for other educational uses of computational text analysis by showing a measurable difference in the language of children. 
Social implications: Understanding children's language more representatively in automated educational assessment allows for more fair and equitable testing. Further, child-specific language models have fewer gender and race biases. [This paper was published in "Information and Learning Sciences."] (As Provided). |
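The divergent thinking application mentioned in the abstract is commonly operationalized as semantic distance: a response is scored as more original the farther its embedding lies from the prompt's. A minimal sketch of that idea follows; the tiny hand-made vectors and the `originality_score` helper are illustrative assumptions, not the paper's trained child-directed model.

```python
# Hypothetical sketch of semantic-distance scoring for divergent thinking.
# In practice the embeddings would come from a trained semantic model
# (e.g. one fitted on a child-directed corpus, as the paper proposes);
# the toy 3-dimensional vectors here exist only to make the code runnable.
import math

EMBEDDINGS = {
    "brick":    [0.9, 0.1, 0.0],
    "house":    [0.8, 0.2, 0.1],   # a common, nearby use of a brick
    "doorstop": [0.1, 0.9, 0.3],   # a more remote, unusual use
}

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def originality_score(prompt, response):
    """Score a response as semantic distance (1 - cosine similarity)
    from the prompt: more distant responses rate as more original."""
    return 1.0 - cosine(EMBEDDINGS[prompt], EMBEDDINGS[response])

# A common use ("house" for "brick") should score lower than an
# uncommon one ("doorstop").
assert originality_score("brick", "house") < originality_score("brick", "doorstop")
```

Under this scheme, the paper's claim is that swapping a general-English embedding model for a child-directed one changes the distances, and hence the scores, in ways that better reflect how children actually use words.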
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2024/01/01 |