ABOUT THIS ARTICLE
Published online: 21 Jun 2023
Pages: 2599 - 2606
Received: 05 Sep 2022
Accepted: 26 Dec 2022
DOI: https://doi.org/10.2478/amns.2023.1.00440
© 2023 Hongyun Liu, published by Sciendo
This work is licensed under the Creative Commons Attribution 4.0 International License.
In this paper, dancers' emotions are identified using an integrated deep-learning model. First, initial features carrying important emotional information are extracted from the time, frequency, and time-frequency domains, respectively. These features are then processed by a deep belief network improved with glial cell chains. Finally, a restricted Boltzmann machine integrates the higher-level abstract features and predicts the emotional state. Results on the DEAP dataset show that the correlations between EEG channels can be discovered and exploited by the glial chains, and that the fused deep-learning model effectively combines EEG emotional features across the time, frequency, and time-frequency domains.
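The abstract's first stage, extracting emotion-relevant features from the time and frequency domains of an EEG signal, can be sketched as follows. This is a minimal illustration, not the paper's actual feature set: the function name, the specific statistics (mean, standard deviation), and the choice of EEG frequency bands are assumptions for demonstration only.

```python
import numpy as np

def extract_features(eeg, fs=128):
    """Hypothetical sketch of per-channel EEG feature extraction.

    Returns simple time-domain statistics plus power in the classic
    EEG frequency bands; the paper's exact features are not specified here.
    """
    # Time-domain features: amplitude statistics of the raw signal
    mean = eeg.mean()
    std = eeg.std()

    # Frequency-domain features: band power from the FFT periodogram
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    bands = {"theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}
    band_power = {name: psd[(freqs >= lo) & (freqs < hi)].sum()
                  for name, (lo, hi) in bands.items()}

    return {"mean": mean, "std": std, **band_power}

# Usage: a synthetic 10 Hz sine (alpha-band rhythm) should concentrate
# its power in the alpha band rather than the neighbouring bands.
fs = 128
t = np.arange(0, 4, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t)
feats = extract_features(signal, fs)
assert feats["alpha"] > feats["beta"]
assert feats["alpha"] > feats["theta"]
```

In the paper's pipeline, features like these would then be fed to the glial-chain-improved deep belief network for fusion and to the restricted Boltzmann machine for emotion-state prediction.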