
Facial expression recognition (FER) constitutes a key domain in affective computing and applied psychology, as it enables the systematic assessment of emotional states through observable facial cues. The present study examined the psychometric and methodological properties of two deep convolutional neural network architectures—AlexNet and DenseNet-201—in the automatic classification of emotional expressions using the FER-2013 dataset. Both models employed transfer learning and data augmentation procedures to enhance generalization and robustness. Comparative analyses were conducted across seven emotion categories (anger, disgust, fear, happiness, sadness, surprise, and neutrality) using standard performance indices—accuracy, precision, recall, and F1 score. AlexNet achieved a validation accuracy of 82.94%, whereas DenseNet-201 yielded 84.91%. DenseNet-201 demonstrated superior discriminative capacity, particularly in the recognition of subtle emotional states such as fear and disgust, which are often more challenging to detect both computationally and psychologically. To support interpretability and construct validity, Class Activation Mapping (CAM) was applied to identify the facial regions most influential in the classification process, offering insight into the visual cues underlying automated emotion assessment. Overall, findings highlight the methodological trade-off between model simplicity and psychometric precision: while AlexNet is suitable for efficient, lightweight applications, DenseNet-201 provides a more accurate and psychologically representative model of facial affect recognition. These results contribute to the integration of advanced computational techniques into psychometric models of emotion measurement and assessment.
Facial Expression Recognition, Deep Learning, FER-2013, Class Activation Mapping, DenseNet-201, Transfer Learning, Data Augmentation, Emotion Classification, AlexNet
