Bibliographic record - detail view
Authors | Zhai, Xuesong; Xu, Jiaqi; Chen, Nian-Shing; Shen, Jun; Li, Yan; Wang, Yonggu; Chu, Xiaoyan; Zhu, Yumeng |
Title | The Syncretic Effect of Dual-Source Data on Affective Computing in Online Learning Contexts: A Perspective from Convolutional Neural Network with Attention Mechanism |
Source | In: Journal of Educational Computing Research, 61 (2023) 2, pp. 466-493 (28 pages) |
Full text | PDF available |
Additional information | ORCID (Chen, Nian-Shing); ORCID (Wang, Yonggu) |
Language | English |
Document type | print; online; journal article |
ISSN | 0735-6331 |
DOI | 10.1177/07356331221115663 |
Keywords | Affective Behavior; Nonverbal Communication; Video Technology; Online Courses; Middle School Students; Artificial Intelligence; Electronic Learning; Emotional Response; COVID-19; Pandemics; Educational Technology; Models; Affective disturbance; Active behaviour; Affektive Störung; Non-verbal communication; Nonverbale Kommunikation; Online course; Online-Kurs; Middle school; Middle schools; Student; Students; Mittelschule; Mittelstufenschule; Schüler; Schülerin; Künstliche Intelligenz; Emotionales Verhalten; Unterrichtsmedien; Analogiemodell |
Abstract | Affective computing (AC) has been regarded as a relevant approach to identifying online learners' mental states and predicting their learning performance. Previous research mainly used a single-source data set, typically learners' facial expressions, to compute learners' affect. However, a single facial expression may represent different affects under different head poses. This study proposed a dual-source data approach to solve the problem. Facial expression and head pose are two typical data sources that can be captured from online learning videos. The current study collected a dual-source data set of facial expressions and head poses from an online learning class in a middle school. A deep learning neural network using AlexNet with an attention mechanism was developed to verify the syncretic effect of the proposed dual-source fusion strategy on affective computing. The results show that the dual-source fusion approach significantly outperforms the single-source approach in AC recognition accuracy (dual-source Attention-AlexNet model: 80.96%; single-source facial expression: 76.65%; single-source head pose: 64.34%). This study contributes to the theoretical construction of the dual-source data fusion approach and to the empirical validation of the effect of the Attention-AlexNet neural network approach on affective computing in online learning contexts. (As Provided). |
Notes | SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: https://sagepub.com |
Recorded by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2024/01/01 |
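The abstract describes fusing two data sources (facial-expression and head-pose features) via an attention mechanism on top of AlexNet embeddings. The paper's actual Attention-AlexNet architecture is not reproduced here; the following is only a minimal, generic sketch of attention-weighted fusion of two modality feature vectors, with all function and parameter names being illustrative assumptions.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_fuse(face_feat, pose_feat, attn_scores):
    """Attention-weighted sum of two modality feature vectors.

    face_feat / pose_feat stand in for CNN embeddings of the
    facial-expression and head-pose channels; attn_scores are
    hypothetical raw attention scores, one per modality.
    """
    w_face, w_pose = softmax(attn_scores)
    return [w_face * f + w_pose * p for f, p in zip(face_feat, pose_feat)]

# Toy vectors; the attention scores here lean toward the facial channel.
fused = attention_fuse([0.2, 0.8], [0.6, 0.1], attn_scores=[1.5, 0.5])
```

In the paper's setting the attention scores would themselves be learned from the data, so the network can decide per sample how much each source should contribute to the fused affect representation.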