Simple item record

dc.contributor.author: Scardapane, Simone
dc.contributor.author: Van Vaerenbergh, Steven
dc.contributor.author: Hussain, Amir
dc.contributor.author: Uncini, Aurelio
dc.contributor.other: Universidad de Cantabria
dc.date.accessioned: 2025-01-23T17:36:39Z
dc.date.available: 2025-01-23T17:36:39Z
dc.date.issued: 2020-04
dc.identifier.issn: 2471-285X
dc.identifier.issn: 2376-4562
dc.identifier.other: TEC2014-57402-JIN
dc.identifier.other: TEC2016-81900-REDT
dc.identifier.uri: https://hdl.handle.net/10902/35145
dc.description.abstract: Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers. However, several analytical properties of the complex domain (such as holomorphicity) make the design of CVNNs a more challenging task than their real counterpart. In this paper, we consider the problem of flexible activation functions (AFs) in the complex domain, i.e., AFs endowed with sufficient degrees of freedom to adapt their shape given the training data. While this problem has received considerable attention in the real case, very limited literature exists for CVNNs, where most activation functions are generally developed in a split fashion (i.e., by considering the real and imaginary parts of the activation separately) or with simple phase-amplitude techniques. Leveraging over the recently proposed kernel activation functions, and related advances in the design of complex-valued kernels, we propose the first fully complex, nonparametric activation function for CVNNs, which is based on a kernel expansion with a fixed dictionary that can be implemented efficiently on vectorized hardware. Several experiments on common use cases, including prediction and channel equalization, validate our proposal when compared to real-valued neural networks and CVNNs with fixed activation functions.
dc.description.sponsorship: The work of Simone Scardapane was supported in part by the Italian MIUR, "Progetti di Ricerca di Rilevante Interesse Nazionale", GAUChO project, under Grant 2015YPXH4W 004. The work of Steven Van Vaerenbergh was supported by the Ministerio de Economía, Industria y Competitividad (MINECO) of Spain under grant TEC2014-57402-JIN (PRISMA). Amir Hussain was supported by the UK Engineering and Physical Sciences Research Council (EPSRC) grant no. EP/M026981/1.
dc.format.extent: 11 p.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers, Inc.
dc.rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.source: IEEE Transactions on Emerging Topics in Computational Intelligence, 2020, 4(2), 140-150
dc.subject.other: Neural networks
dc.subject.other: Activation functions
dc.subject.other: Kernel methods
dc.subject.other: Complex domain
dc.title: Complex-valued neural networks with nonparametric activation functions
dc.type: info:eu-repo/semantics/article
dc.relation.publisherVersion: https://doi.org/10.1109/TETCI.2018.2872600
dc.rights.accessRights: openAccess
dc.relation.projectID: info:eu-repo/grantAgreement/MINECO//TEC2014-57402-JIN/ES/TECNICAS AVANZADAS DE APRENDIZAJE MAQUINA PARA RECONOCIMIENTO DE PATRONES EN SERIES TEMPORALES/
dc.identifier.DOI: 10.1109/TETCI.2018.2872600
dc.type.version: acceptedVersion
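
The abstract above describes activation functions built as a kernel expansion over a fixed dictionary with trainable mixing coefficients. The following is a minimal illustrative sketch of that idea in NumPy, not the authors' implementation: the grid dictionary, the simple Gaussian-style kernel, and the parameter names and sizes (grid_size, limit, gamma) are assumptions chosen for brevity rather than values from the paper.

```python
import numpy as np

def complex_gaussian_kernel(z, d, gamma=1.0):
    # Gaussian-type kernel on the complex plane; illustrative choice only.
    return np.exp(-gamma * np.abs(z - d) ** 2)

class ComplexKAF:
    """One flexible activation: a kernel expansion over a fixed dictionary."""

    def __init__(self, grid_size=10, limit=2.0, gamma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed dictionary: a grid of complex points, never adapted during training.
        ticks = np.linspace(-limit, limit, grid_size)
        self.dictionary = (ticks[:, None] + 1j * ticks[None, :]).ravel()
        self.gamma = gamma
        # Trainable complex mixing coefficients (the only adapted parameters).
        self.alpha = 0.01 * (rng.standard_normal(self.dictionary.size)
                             + 1j * rng.standard_normal(self.dictionary.size))

    def __call__(self, z):
        # z: array of complex pre-activations; evaluate the kernel against every
        # dictionary point and mix with the trainable coefficients.
        K = complex_gaussian_kernel(np.asarray(z)[..., None],
                                    self.dictionary, self.gamma)
        return K @ self.alpha

# Usage: apply the flexible activation to a batch of complex pre-activations.
kaf = ComplexKAF()
z = np.array([0.3 + 0.7j, -1.2 + 0.1j])
print(kaf(z))  # two complex outputs, one per input
```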

