Video action recognition in SoC FPGAs driven by neural architecture search
Authors
Suárez Plata, Daniel Nicolás; Hernández Fernández, Pedro; Fernández Solórzano, Víctor Manuel; Marrero Callicó, Gustavo
Date
2025
Rights
© 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Published in
40th Conference on Design of Circuits and Integrated Systems, Santander, 2025, 150-155
Publisher
Institute of Electrical and Electronics Engineers, Inc.
Link to the publication
Keywords
Neural architecture search
SoC FPGA
CNN-RNN architectures
Video action recognition
Reinforcement learning
Embedded AI
Hardware-aware NAS
Abstract
This work presents a hardware-aware Neural Architecture Search (NAS) framework for video-based human action recognition, targeting real-time deployment on FPGA-based System-on-Chip (SoC) platforms. The proposed method explores a constrained search space of Convolutional Neural Network (CNN)-Recurrent Neural Network (RNN) architectures aligned with a hardware-software pipeline in which CNNs are mapped to FPGA Deep Learning Processing Units (DPUs) and RNNs to embedded ARM cores. A reinforcement learning (RL)-based controller, guided by a position-based discounted reward strategy, progressively learns to generate architectures that emphasize high-impact design decisions. Experiments on the UCF101 dataset demonstrate that the proposed architectures achieve 81.07% accuracy, among the highest reported for CNN-RNN models relying exclusively on spatial information. The results validate the effectiveness of the proposed framework in driving hardware-compatible and performance-optimized architecture exploration.
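The position-based discounted reward mentioned in the abstract can be illustrated with a minimal sketch. The paper's exact formula is not reproduced here; this sketch merely assumes that the architecture's final reward is distributed across the controller's sequence of design decisions, discounted by position so that earlier (higher-impact) choices receive a larger share of the learning signal. The function name, the discount factor `gamma`, and the geometric form are all illustrative assumptions.

```python
# Hypothetical sketch of a position-based discounted reward (assumed
# geometric form; the paper's actual formula may differ).
def position_discounted_rewards(final_reward, num_decisions, gamma=0.9):
    """Spread a sampled architecture's reward over the controller's
    decision positions, discounting by position t so that early,
    high-impact decisions dominate the policy-gradient update."""
    return [final_reward * gamma ** t for t in range(num_decisions)]

# Example: a 4-decision architecture scoring 0.81 validation accuracy.
rewards = position_discounted_rewards(0.81, 4)
```

Under this assumption, the first decision keeps the full reward while each later position is scaled down by `gamma`, biasing the controller toward refining the choices that most affect accuracy and hardware compatibility.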
Collections
- D50 Congresos [476]
- D50 Proyectos de Investigación [445]