
Bibliographic record - detail view

Authors: Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco
Title: Learning Orthographic Structure with Sequential Generative Neural Networks
Source: In: Cognitive Science, 40 (2016) 3, pp. 579-606 (28 pages)
Availability: PDF full text
Language: English
Document type: print; online; journal article
ISSN: 0364-0213
DOI: 10.1111/cogs.12258
Keywords: Orthographic Symbols; Neurological Organization; Models; Probability; Sequential Learning; Cognitive Processes; English; Syllables; Alphabets; Accuracy; Vocabulary Development; Prediction; Word Recognition; Markov Processes; Sensory Experience; Language Processing
Abstract: Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in connectionist modeling. Here, we investigated a sequential version of the restricted Boltzmann machine (RBM), a stochastic recurrent neural network that extracts high-order structure from sensory data through unsupervised generative learning and can encode contextual information in the form of internal, distributed representations. We assessed whether this type of network can extract the orthographic structure of English monosyllables by learning a generative model of the letter sequences forming a word training corpus. We show that the network learned an accurate probabilistic model of English graphotactics, which can be used to make predictions about the letter following a given context as well as to autonomously generate high-quality pseudowords. The model was compared to an extended version of simple recurrent networks, augmented with a stochastic process that allows autonomous generation of sequences, and to non-connectionist probabilistic models (n-grams and hidden Markov models). We conclude that sequential RBMs and stochastic simple recurrent networks are promising candidates for modeling cognition in the temporal domain. (As Provided)
Notes: Wiley-Blackwell. 350 Main Street, Malden, MA 02148. Tel: 800-835-6770; Tel: 781-388-8598; Fax: 781-388-8232; e-mail: cs-journals@wiley.com; Web site: http://www.wiley.com/WileyCDA
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Updated: 2020-01-01
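The abstract compares the generative networks against non-connectionist n-gram baselines that predict the next letter from a preceding context and can sample pseudowords. As a purely illustrative sketch (not code from the paper; the mini-corpus and all function names are made up for this example), a character bigram model of this kind can be trained and sampled as follows:

```python
import random
from collections import defaultdict, Counter

# Hypothetical mini-corpus standing in for the English monosyllable
# training set described in the abstract.
corpus = ["cat", "can", "cap", "man", "map", "mat", "tan", "tap"]

# Count letter-to-letter transitions, with ^ and $ as start/end markers.
counts = defaultdict(Counter)
for word in corpus:
    chars = ["^"] + list(word) + ["$"]
    for prev, nxt in zip(chars, chars[1:]):
        counts[prev][nxt] += 1

def next_char_distribution(context):
    """Return P(next letter | previous letter) as a dict."""
    total = sum(counts[context].values())
    return {c: n / total for c, n in counts[context].items()}

def generate_pseudoword(rng=random.Random(0)):
    """Sample letters one at a time until the end marker $ is drawn."""
    out, ctx = [], "^"
    while True:
        chars, weights = zip(*counts[ctx].items())
        ctx = rng.choices(chars, weights=weights)[0]
        if ctx == "$":
            return "".join(out)
        out.append(ctx)
```

The paper's sequential RBMs and stochastic simple recurrent networks play the same two roles shown here (conditional prediction and autonomous generation), but with learned distributed representations of context rather than raw letter counts.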
Check document delivery services and library holdings

Location-independent services
Libraries holding the journal "Cognitive Science":
Link to the German Union Catalogue of Serials (ZDB)

Article delivery service of the German libraries (subito):
Transfer the record data into the subito order form
