Consiglio Nazionale delle Ricerche

Product type: Journal article
Title: A Universal Approximation Result for Difference of Log-Sum-Exp Neural Networks
Year of publication: 2020
Format: -
Author(s): Calafiore, Giuseppe C.; Gaubert, Stephane; Possieri, Corrado
Author affiliations: Centre de Mathématiques Appliquées; Consiglio Nazionale delle Ricerche; Politecnico di Torino
CNR authors and affiliations
  • CORRADO POSSIERI
Language(s)
  • English
Abstract: We show that a neural network whose output is obtained as the difference of the outputs of two feedforward networks with exponential activation function in the hidden layer and logarithmic activation function in the output node, referred to as log-sum-exp (LSE) network, is a smooth universal approximator of continuous functions over convex, compact sets. By using a logarithmic transform, this class of network maps to a family of subtraction-free ratios of generalized posynomials (GPOS), which we also show to be universal approximators of positive functions over log-convex, compact subsets of the positive orthant. The main advantage of difference-LSE networks with respect to classical feedforward neural networks is that, after a standard training phase, they provide surrogate models for design that possess a specific difference-of-convex-functions form, which makes them optimizable via relatively efficient numerical methods. In particular, by adapting an existing difference-of-convex algorithm to these models, we obtain an algorithm for performing an effective optimization-based design. We illustrate the proposed approach by applying it to the data-driven design of a diet for a patient with type-2 diabetes and to a nonconvex optimization problem.
Abstract language: English
Other abstract: -
Other abstract language: -
Pages from: 5603
Pages to: 5612
Total pages: -
Journal: IEEE Transactions on Neural Networks and Learning Systems
Active since: 2012
Publisher: Institute of Electrical and Electronics Engineers, New York, NY, USA
Country of publication: United States of America
Language: English
ISSN: 2162-237X
Key title: IEEE Transactions on Neural Networks and Learning Systems
Journal volume number: 31
Journal issue: 12
DOI: 10.1109/TNNLS.2020.2975051
Peer reviewed: -
Publication status: Published version
Indexing (in controlled databases)
  • Scopus (code: 2-s2.0-85096363848)
Keywords: Neural networks, Optimization, Difference of convex, DCA
Link (URL, URI): http://www.scopus.com/record/display.url?eid=2-s2.0-85096363848&origin=inward
Parallel title: -
License: -
Embargo end date: -
Acceptance date: -
Notes/Other information: -
CNR structures
  • IASI — Istituto di analisi dei sistemi ed informatica "Antonio Ruberti"
CNR modules/activities/subprojects
  • DIT.AD007.010.001 : MCISCO - Modellistica, controllo e identificazione di sistemi complessi
European projects: -
Attachments
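
The abstract above describes the difference-LSE architecture: two log-sum-exp (LSE) blocks, each computing the logarithm of a sum of exponentials of affine functions of the input, with the output taken as the difference of the two block outputs. The following is a minimal NumPy sketch of that functional form only; the parameter names (A1, b1, A2, b2), the dimensions, and the random values are illustrative assumptions and do not reproduce the paper's notation, training procedure, or difference-of-convex optimization algorithm.

import numpy as np

def lse(x, A, b):
    # One LSE block: exponential activations in the hidden layer and a
    # logarithmic output node, i.e. log(sum_k exp(A[k] @ x + b[k])),
    # which is convex in x.
    return np.log(np.sum(np.exp(A @ x + b)))

def diff_lse(x, A1, b1, A2, b2):
    # Difference-LSE network: difference of two LSE blocks, hence a
    # difference of convex functions of the input x.
    return lse(x, A1, b1) - lse(x, A2, b2)

# Toy evaluation on a 2-D input with 3 hidden units per block
# (sizes and weights are arbitrary, for illustration only).
rng = np.random.default_rng(0)
x = np.array([0.5, -1.0])
A1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)
A2, b2 = rng.normal(size=(3, 2)), rng.normal(size=3)
print(diff_lse(x, A1, b1, A2, b2))

Under the componentwise change of variables x = log(z), exponentiating the output of diff_lse yields a ratio of posynomial-type functions of z, which is consistent with the correspondence to ratios of generalized posynomials (GPOS) stated in the abstract.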