Sparse multivariate Gaussian mixture regression
Date
2015-05
Rights
© 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Published in
IEEE Transactions on Neural Networks and Learning Systems, 2015, 26(5), 1098-1108
Publisher
Institute of Electrical and Electronics Engineers
Link to the publication
Keywords
Gaussian function mixture
Function approximation
Regression
Logarithmic utility function
Sparsity
Abstract
Fitting a multivariate Gaussian mixture to data is an attractive as well as challenging problem, especially when sparsity in the solution is demanded. Achieving this objective requires the concurrent update of all parameters (weights, centers, and precisions) of all multivariate Gaussian functions during the learning process. Such is the focus of this paper, which presents a novel method founded on the minimization of the error of the generalized logarithmic utility function (GLUF). This choice, which allows us to move smoothly from the mean square error (MSE) criterion to one based on the logarithmic error, yields an optimization problem that resembles a locally convex problem and can be solved with a quasi-Newton method. The GLUF framework also facilitates a comparative study between both extremes, concluding that the classical MSE optimization is not the most adequate for the task. The performance of the proposed technique is demonstrated in simulated as well as realistic scenarios.
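The abstract describes the overall recipe: parameterize a Gaussian mixture regressor, define an error criterion that interpolates between the MSE and a logarithmic error, and update all weights, centers, and precisions jointly with a quasi-Newton method. The sketch below illustrates that recipe in Python. It is not the authors' code: the specific GLUF form `J(e) = (lam/2) * log(1 + e^2/lam)` is an assumed interpolation (it recovers the MSE as `lam` grows large), the model `model()` and data are toy constructions, and SciPy's BFGS stands in for whichever quasi-Newton variant the paper uses.

```python
# Hypothetical sketch of GLUF-style Gaussian mixture regression.
# Assumptions (not from the paper): the exact loss form, the 1-D toy
# data, K=2 components, and SciPy's BFGS as the quasi-Newton solver.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def model(theta, x, K):
    """Mixture of K Gaussian bumps: sum_k w_k * exp(-p_k * (x - c_k)^2).

    theta packs all parameters updated concurrently:
    weights theta[:K], centers theta[K:2K], precisions theta[2K:].
    """
    w, c, p = theta[:K], theta[K:2 * K], theta[2 * K:]
    bumps = np.exp(-np.abs(p)[None, :] * (x[:, None] - c[None, :]) ** 2)
    return (w[None, :] * bumps).sum(axis=1)

def gluf_loss(theta, x, y, K, lam=1.0):
    """Assumed GLUF-style error: (lam/2) * log(1 + e^2/lam) per sample.

    As lam -> infinity this tends to e^2/2, i.e. the MSE criterion;
    small lam emphasizes the logarithmic-error end of the trade-off.
    """
    e = y - model(theta, x, K)
    return 0.5 * lam * np.log1p(e ** 2 / lam).sum()

# Toy data: a single Gaussian bump plus noise.
x = np.linspace(-3.0, 3.0, 200)
y = 2.0 * np.exp(-1.5 * (x - 0.5) ** 2) + 0.05 * rng.standard_normal(x.size)

K = 2
theta0 = np.concatenate([np.ones(K), np.linspace(-1.0, 1.0, K), np.ones(K)])
# BFGS is a quasi-Newton method: all parameters move in one joint update.
res = minimize(gluf_loss, theta0, args=(x, y, K), method="BFGS")
print("final loss:", gluf_loss(res.x, x, y, K))
```

Sweeping `lam` in `gluf_loss` is one way to reproduce the kind of MSE-versus-logarithmic-error comparison the abstract mentions, since both extremes live in the same one-parameter family.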
Collections
- D12 Artículos [360]