User Perceptions of Emotion for Robot Digital Faces Across Ages
Abstract
A natural way to observe and analyze emotions is through facial expressions. Accordingly, we sought to understand age-related differences in the emotion labels participants assign to robot digital faces. The study was conducted online via the Prolific platform, and each participant was compensated $3 for their time and effort. The sample comprised n = 95 older adults (65+; M = 70.27, SD = 4.66), n = 103 middle-aged adults (36-64; M = 47.12, SD = 8.08), and n = 101 young adults (18-35; M = 28.37, SD = 4.44). A series of 62 chi-square analyses was run to examine differences between the age groups: 31 on perceptions of the emotion displayed by each face and 31 on emoji classification perceptions for each face. Results suggest that digital faces eliciting strong emotions are easier to classify, yielding congruent perceptions across the age groups.