Using time series foundation models for few-shot remaining useful life prediction of aircraft engines
Date
2025-07
Rights
Attribution 4.0 International
Published in
Computer Modeling in Engineering and Sciences, 2025, 144(1), 239-265
Publisher
Tech Science Press
Abstract
Predictive maintenance often involves imbalanced multivariate time series datasets with scarce failure events. This poses challenges for model training: the data are high-dimensional and require domain-specific preprocessing, which frequently leads to large, complex models. Inspired by the success of Large Language Models (LLMs), transformer-based foundation models have been developed for time series (TSFMs). These models have been shown to reconstruct time series in a zero-shot manner, capturing diverse patterns that effectively characterize the series. This paper proposes using a TSFM to generate embeddings of the input data space, making it more interpretable for machine learning models. To evaluate the effectiveness of our approach, we trained three classical machine learning algorithms and one neural network on embeddings generated by the TSFM Moment to predict the remaining useful life of aircraft engines. We tested models trained with both the full training dataset and with only 10% of the training samples. Our results show that training simple models, such as support vector regressors or neural networks, on embeddings generated by Moment not only accelerates the training process but also improves performance in few-shot learning scenarios, where data is scarce. This suggests a promising alternative to complex deep learning architectures, particularly in industrial contexts with limited labeled data.
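The pipeline summarized above lends itself to a compact implementation. Below is a minimal sketch, not the authors' published code: it assumes the open-source momentfm package and its MOMENTPipeline embedding interface (checkpoint name and call signature taken from that package's documentation, so verify against the installed version), and it uses synthetic stand-in sensor windows and RUL labels purely for illustration.

```python
# Minimal sketch (not the paper's code): embed engine sensor windows with a
# frozen TSFM, then train a simple regressor on the embeddings to predict RUL.
# Assumes the open-source `momentfm` package (pip install momentfm); the exact
# API may differ between versions.
import numpy as np
import torch
from momentfm import MOMENTPipeline
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

# Load MOMENT in embedding mode (checkpoint name as published by its authors).
model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-large",
    model_kwargs={"task_name": "embedding"},
)
model.init()
model.eval()

def embed(windows: np.ndarray) -> np.ndarray:
    """Map (n_windows, n_channels, seq_len) sensor windows to fixed-size
    embeddings. MOMENT expects sequences of length 512; shorter windows
    would need padding/masking in practice."""
    with torch.no_grad():
        out = model(x_enc=torch.tensor(windows, dtype=torch.float32))
    return out.embeddings.cpu().numpy()

# Toy stand-in data: 200 univariate windows with synthetic RUL labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1, 512)).astype(np.float32)
y = rng.uniform(0.0, 125.0, size=200)  # hypothetical RUL targets, in cycles

Z = embed(X)

# Few-shot setting from the paper: train on only 10% of the samples.
n_train = int(0.1 * len(Z))
svr = SVR(kernel="rbf", C=10.0).fit(Z[:n_train], y[:n_train])
rmse = mean_squared_error(y[n_train:], svr.predict(Z[n_train:])) ** 0.5
print(f"RMSE on held-out windows: {rmse:.2f}")
```

Note the design choice the abstract highlights: only the lightweight regressor is fitted, while the foundation model stays frozen, which is what keeps training fast and makes the 10% few-shot setting practical.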
Collections
- D30 Articles [105]
- D30 Research Projects [127]