Istituto di linguistica computazionale "Antonio Zampolli"


Journal contribution

Type: Journal article

Titolo: Modeling Word Learning and Processing with Recurrent Neural Networks

Year of publication: 2020

Format: Electronic

Authors: Marzi, C.

Author affiliations: Institute for Computational Linguistics, Italian National Research Council

CNR authors:


Language: English

Abstract: The paper focuses on what two different types of recurrent neural networks, a recurrent Long Short-Term Memory (LSTM) and a recurrent variant of self-organizing memories, the Temporal Self-Organizing Map (TSOM), can tell us about how speakers learn and process a set of fully inflected verb forms selected from the top-frequency paradigms of Italian and German. Thanks to their re-entrant layer of temporal connectivity, both architectures can develop a strong sensitivity to sequential patterns that are highly attested in the training data. The main goal is to evaluate the learning and processing dynamics of verb inflection data in the two networks, focusing on the effects of morphological structure on word production and word recognition, as well as on generalization to untrained verb forms. For both models, results show that production, recognition, and generalization are facilitated for verb forms in regular paradigms. However, the two models are differently influenced by structural effects: the TSOM is more prone to adaptively strike a balance between learnability and generalization on the one hand, and discriminability on the other.
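The TSOM-style processing the abstract describes (a map of units whose activation mixes input similarity with a re-entrant temporal bias from the previously winning unit) can be sketched roughly as follows. All names, unit counts, learning rates, and update rules here are illustrative assumptions for exposition, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

ALPHABET = "abcdegilnorstv#"  # '#' marks word boundaries
N_UNITS = 30

def one_hot(ch):
    v = np.zeros(len(ALPHABET))
    v[ALPHABET.index(ch)] = 1.0
    return v

# Spatial weights: what input symbol each unit responds to.
spatial = rng.random((N_UNITS, len(ALPHABET)))
# Temporal (re-entrant) weights: expected transitions between units.
temporal = np.full((N_UNITS, N_UNITS), 1.0 / N_UNITS)

def process(word, alpha=0.5, lr=0.1):
    """Process a word one symbol at a time; return the chain of
    best-matching units (BMUs), with Hebbian-style weight updates."""
    prev = None
    chain = []
    for ch in "#" + word + "#":
        x = one_hot(ch)
        # Activation: input similarity plus temporal expectation
        # propagated from the previously winning unit.
        act = -np.linalg.norm(spatial - x, axis=1)
        if prev is not None:
            act += alpha * temporal[prev]
        bmu = int(np.argmax(act))
        spatial[bmu] += lr * (x - spatial[bmu])    # move unit toward input
        if prev is not None:
            temporal[prev, bmu] += lr              # strengthen transition
            temporal[prev] /= temporal[prev].sum() # keep rows normalized
        chain.append(bmu)
        prev = bmu
    return chain

# Repeated exposure to a tiny "paradigm": the map specializes units
# for recurrent symbol sequences (e.g. the shared suffixes -o / -e).
for _ in range(20):
    for w in ["credo", "crede", "vedo", "vede"]:
        process(w)

# Compare the BMU chains of two forms sharing the ending -do.
c1, c2 = process("credo"), process("vedo")
```

The point of the sketch is only the interaction highlighted in the abstract: highly attested sequential patterns get dedicated, strongly connected chains of units, which is what facilitates production and generalization for regular paradigms.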

Abstract language: English

Total pages: 14


Journal: Information
Publisher: Molecular Diversity Preservation International (MDPI)
Country of publication:
Language: English
ISSN: 2078-2489

Volume: 11

Issue: 6

DOI: 10.3390/info11060320

Refereed: Yes, international

Publication status: Published version

Indexed by:

  • Google Scholar [iMOzt94AAAAJ&h]
  • ResearchGate [https://www.researchgate.net/publication/342170932_Modeling_Word_Learning_and_Processing_with_Recurrent_Neural_Networks]

Keywords:

  • word-learning
  • serial word processing
  • recurrent neural networks
  • long short-term memories
  • temporal self-organizing memories

URL: https://www.mdpi.com/2078-2489/11/6/320

Date of acceptance: 10/06/2020

Other information: Paper 320 in the Special Issue "Advances in Computational Linguistics"

CNR structures:

Attachments: Modeling Word Learning and Processing with Recurrent Neural Networks_Marzi_2020 (application/pdf)
