Consiglio Nazionale delle Ricerche

Product type: Journal article
Title: Modeling Word Learning and Processing with Recurrent Neural Networks
Year of publication: 2020
Format: Electronic
Author(s): Marzi, C.
Author affiliations: Institute for Computational Linguistics, Italian National Research Council
CNR authors and affiliations
  • CLAUDIA MARZI
Language(s)
  • English
Abstract: The paper focuses on what two different types of Recurrent Neural Networks, namely a recurrent Long Short-Term Memory and a Temporal Self-Organizing Map (a recurrent variant of self-organizing memories), can tell us about how speakers learn and process a set of fully inflected verb forms selected from the top-frequency paradigms of Italian and German. Thanks to a re-entrant layer of temporal connectivity, both architectures can develop a strong sensitivity to sequential patterns that are highly attested in the training data. The main goal is to evaluate the learning and processing dynamics of verb inflection data in the two neural networks, focusing on the effects of morphological structure on word production and word recognition, as well as on generalization to untrained verb forms. For both models, results show that production, recognition, and generalization are facilitated for verb forms in regular paradigms. However, the two models are influenced differently by structural effects, with the Temporal Self-Organizing Map more prone to adaptively striking a balance between learnability and generalization, on the one hand, and discriminability, on the other.
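The serial, symbol-by-symbol processing of inflected forms described in the abstract can be illustrated with a minimal sketch of a simple recurrent (Elman-style) forward pass. The toy paradigm, vocabulary, hidden-layer size, and untrained random weights below are illustrative assumptions, not the article's actual models or data.

```python
import numpy as np

# Toy inflected forms (fragment of the Italian "parlare" paradigm);
# these example words are an assumption for illustration only.
forms = ["parlo", "parli", "parla"]
vocab = sorted(set("".join(forms)) | {"#"})  # '#' marks word boundaries
idx = {ch: i for i, ch in enumerate(vocab)}

V, H = len(vocab), 8  # vocabulary size, hidden (re-entrant) layer size
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(H, V))   # input -> hidden weights
W_rec = rng.normal(scale=0.1, size=(H, H))  # hidden -> hidden (temporal connectivity)

def encode(word):
    """Process a word one symbol at a time, returning the final hidden
    state as a temporal encoding of the whole sequence."""
    h = np.zeros(H)
    for ch in "#" + word + "#":
        x = np.zeros(V)
        x[idx[ch]] = 1.0                     # one-hot input symbol
        h = np.tanh(W_in @ x + W_rec @ h)    # hidden state carries sequence history
    return h

codes = {w: encode(w) for w in forms}
```

Because all three forms share the stem "parl-", their hidden-state trajectories coincide over the first symbols and diverge only at the inflectional ending, which is the kind of sequence-level sensitivity to shared sublexical patterns the abstract refers to.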
Abstract language: English
Other abstract: -
Other abstract language: -
Pages from: -
Pages to: -
Total pages: 14
Journal: Information (Basel)
Active since: 2010
Publisher: Molecular Diversity Preservation International, Basel
Language: English
ISSN: 2078-2489
Key title: Information (Basel)
Proper title: Information. (Basel)
Abbreviated title: Information (Basel)
Journal volume: 11
Journal issue: 6
DOI: 10.3390/info11060320
Peer reviewed: Yes, international
Publication status: Published version
Indexing (in controlled databases)
  • Google Scholar (code: iMOzt94AAAAJ&h)
  • ResearchGate (code: https://www.researchgate.net/publication/342170932_Modeling_Word_Learning_and_Processing_with_Recurrent_Neural_Networks)
Keywords: word learning, serial word processing, recurrent neural networks, long short-term memories, temporal self-organizing memories
Link (URL, URI): https://www.mdpi.com/2078-2489/11/6/320
Parallel title: -
Date of acceptance: 10/06/2020
Notes/Other information: Paper 320 in Special Issue on "Advances in Computational Linguistics"
CNR structures
  • ILC — Istituto di linguistica computazionale "Antonio Zampolli"
CNR modules/activities/subprojects
  • DUS.AD016.007.001: Interdisciplinary approaches to theoretical and computational models of lexical acquisition in mono- and multilingual contexts
  • DUS.AD016.075.004: (Bio-)computational models of language use
European projects: -
Attachments
Modeling Word Learning and Processing with Recurrent Neural Networks_Marzi_2020
Description: Published_version:pdf
Document type: application/pdf