Complex-valued neural networks with nonparametric activation functions
Date
2020-04
Rights
© 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Published in
IEEE Transactions on Emerging Topics in Computational Intelligence, 2020, 4(2), 140-150
Publisher
Institute of Electrical and Electronics Engineers, Inc.
Link to publication
Keywords
Neural networks
Activation functions
Kernel methods
Complex domain
Abstract
Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers. However, several analytical properties of the complex domain (such as holomorphicity) make the design of CVNNs more challenging than that of their real-valued counterparts. In this paper, we consider the problem of flexible activation functions (AFs) in the complex domain, i.e., AFs endowed with sufficient degrees of freedom to adapt their shape given the training data. While this problem has received considerable attention in the real case, very limited literature exists for CVNNs, where most activation functions are generally developed in a split fashion (i.e., by considering the real and imaginary parts of the activation separately) or with simple phase-amplitude techniques. Building on the recently proposed kernel activation functions, and related advances in the design of complex-valued kernels, we propose the first fully complex, nonparametric activation function for CVNNs, which is based on a kernel expansion with a fixed dictionary that can be implemented efficiently on vectorized hardware. Several experiments on common use cases, including prediction and channel equalization, validate our proposal when compared to real-valued neural networks and CVNNs with fixed activation functions.
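The abstract describes an activation built as a kernel expansion over a fixed dictionary with learnable mixing coefficients. The NumPy sketch below illustrates that general idea only; the grid dictionary, the Gaussian kernel on |z - d|², and the function and parameter names (`complex_kaf`, `gamma`, `alpha`) are illustrative assumptions, not the specific kernel or hyperparameters used in the paper.

```python
import numpy as np

def complex_kaf(z, dictionary, alpha, gamma=1.0):
    """Sketch of a kernel activation function on complex inputs.

    z          : complex pre-activations, shape (...,)
    dictionary : fixed complex dictionary elements, shape (D,)
    alpha      : learnable complex mixing coefficients, shape (D,)
    gamma      : kernel bandwidth (hyperparameter)

    Assumption: a Gaussian kernel on the squared modulus |z - d|^2,
    standing in for whatever complex-valued kernel the paper adopts.
    """
    diff = z[..., None] - dictionary            # pairwise differences, shape (..., D)
    K = np.exp(-gamma * np.abs(diff) ** 2)      # kernel matrix, real-valued here
    return K @ alpha                            # complex combination, shape (...,)

# Fixed dictionary: a small grid over the complex plane (a design choice,
# consistent with "fixed dictionary" but not taken from the paper).
grid = np.linspace(-2.0, 2.0, 5)
dictionary = (grid[:, None] + 1j * grid[None, :]).ravel()   # 25 points

rng = np.random.default_rng(0)
alpha = rng.standard_normal(dictionary.size) + 1j * rng.standard_normal(dictionary.size)

out = complex_kaf(np.array([0.3 + 0.1j, -1.0j]), dictionary, alpha)
```

Because the dictionary is fixed and shared, the kernel matrix `K` is a single dense matrix product per layer, which is what makes the expansion efficient on vectorized hardware; only `alpha` is trained.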
Collections
- D21 Articles [417]
- D21 Research Projects [326]