On selecting activation functions for neural network-based digital predistortion models
Date
2025-02
Rights
© EMW Publishing. The Electromagnetics Academy. Reproduced courtesy of The Electromagnetics Academy
Published in
Progress in Electromagnetics Research C, 2025, 152, 111-120
Publisher
EMW Publishing
Abstract
Neural networks have become a focal point for their ability to effectively capture the complex nonlinear characteristics of power amplifiers (PAs) and to facilitate the design of digital predistortion (DPD) circuits. This is accomplished through nonlinear activation functions (AFs), a cornerstone of any neural network architecture. In this paper, we examine the influence of eight carefully selected AFs on the performance of neural network-based DPD, exploring in particular their interaction with both the depth and the width of the network. In addition, we provide an extensive performance analysis using two crucial metrics: the normalized mean square error (NMSE) and the adjacent channel power ratio (ACPR). Our findings highlight the superiority of the exponential linear unit (ELU) AF among the functions considered, particularly within deep neural network (DNN) frameworks.
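For context, the two ingredients the abstract names can be sketched briefly. The following is a minimal illustration, not code from the paper: the standard ELU activation and the NMSE figure of merit (in dB) computed between a reference PA output and a model's prediction; all names are assumptions for illustration.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit: x for x > 0, alpha * (exp(x) - 1) otherwise.

    Standard textbook definition; not taken from the paper itself.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def nmse_db(y_ref, y_model):
    """NMSE in dB: total error power normalized by reference signal power.

    Typical definition used in DPD evaluation; symbol names are illustrative.
    """
    y_ref = np.asarray(y_ref, dtype=complex)
    y_model = np.asarray(y_model, dtype=complex)
    num = np.sum(np.abs(y_ref - y_model) ** 2)
    den = np.sum(np.abs(y_ref) ** 2)
    return 10.0 * np.log10(num / den)
```

A more negative NMSE indicates a closer match between the model output and the reference signal; ACPR is measured in the spectral domain and is not sketched here.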
Collections this item belongs to
- D12 Artículos [360]