dc.contributor | Universitat Ramon Llull. IQS | |
dc.contributor.author | Solis Arrazola, Manuel Alejandro | |
dc.contributor.author | Sánchez-Yáñez, Raúl | |
dc.contributor.author | Gonzalez-Acosta, Ana M. S. | |
dc.contributor.author | Garcia-Capulin, C. H. | |
dc.contributor.author | Rostro Gonzalez, Horacio | |
dc.date.accessioned | 2025-02-11T07:18:59Z | |
dc.date.available | 2025-02-11T07:18:59Z | |
dc.date.issued | 2025-01 | |
dc.identifier.issn | 2504-2289 | ca |
dc.identifier.uri | http://hdl.handle.net/20.500.14342/4905 | |
dc.description.abstract | This study explores children’s emotions through a novel approach combining Generative Artificial Intelligence (GenAI) and Facial Muscle Activation (FMA). It examines GenAI’s effectiveness in creating facial images that produce genuine emotional responses in children, alongside FMA’s analysis of muscular activation during these expressions. The aim is to determine whether AI can realistically generate and recognize emotions similar to human experiences. The study involves generating a database of 280 images (40 per emotion) of children expressing various emotions. For real children’s faces from public databases (DEFSS and NIMH-CHEFS), five emotions were considered: happiness, anger, fear, sadness, and neutral. In contrast, for AI-generated images, seven emotions were analyzed, comprising the previous five plus surprise and disgust. A feature vector is extracted from these images, indicating lengths between reference points on the face that contract or expand depending on the expressed emotion. This vector is then input into an artificial neural network for emotion recognition and classification, achieving accuracies of up to 99% in certain cases. This approach offers new avenues for training and validating AI algorithms, enabling models to be trained with artificial and real-world data interchangeably. The integration of both datasets during the training and validation phases enhances model performance and adaptability. | ca
dc.format.extent | 18 p. | ca
dc.language.iso | eng | ca |
dc.publisher | MDPI | ca |
dc.relation.ispartof | Big Data and Cognitive Computing, 2025, 9(1), 15 | ca
dc.rights | © The author(s) | ca
dc.rights | Attribution 4.0 International | * |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
dc.subject.other | Generative artificial intelligence | ca |
dc.subject.other | Facial emotion recognition | ca |
dc.subject.other | Facial muscle activation | ca |
dc.subject.other | Artificial neural networks | ca |
dc.subject.other | Artificial intelligence | ca
dc.subject.other | Facial expression | ca
dc.subject.other | Emotions in children | ca
dc.title | Eliciting Emotions: Investigating the Use of Generative AI and Facial Muscle Activation in Children’s Emotional Recognition | ca |
dc.type | info:eu-repo/semantics/article | ca |
dc.rights.accessLevel | info:eu-repo/semantics/openAccess | |
dc.embargo.terms | none | ca
dc.subject.udc | 004 | ca |
dc.subject.udc | 159.9 | ca |
dc.identifier.doi | https://doi.org/10.3390/bdcc9010015 | ca |
dc.description.version | info:eu-repo/semantics/publishedVersion | ca |
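Illustrative note: the abstract describes extracting a feature vector of lengths between facial reference points and feeding it to an artificial neural network for emotion classification. The following minimal Python sketch outlines that pipeline; it is not the authors' implementation, and the landmark-pair indices, helper names, and network sizes below are assumptions for illustration only.

import numpy as np
from sklearn.neural_network import MLPClassifier

def landmark_distances(landmarks, pairs):
    # Feature vector: Euclidean lengths between facial reference points
    # that contract or expand with the expressed emotion.
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j]) for i, j in pairs])

# Hypothetical reference-point pairs (e.g., mouth corners, brow-to-eye gaps);
# the paper's actual point set is not reproduced here.
PAIRS = [(48, 54), (17, 36), (26, 45), (51, 57), (19, 37)]

def train_classifier(X, y):
    # X: one distance-based feature vector per face image; y: emotion labels
    # (e.g., happiness, anger, fear, sadness, neutral, surprise, disgust).
    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
    clf.fit(X, y)
    return clf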