- Changes in situations cause changes in how personality traits are perceived (Joshi, Gunes & Goecke, ICPR’14; Celiktutan & Gunes, IEEE TAC’16). Previous research claims that the traits (extraversion, agreeableness, conscientiousness, neuroticism and openness) are stable and change little over time. We challenged this claim by hypothesising that when the situation and the interaction change (we call this context), people exhibit different aspects of their personality. We conducted a study to investigate the change in the perceptions marked by external observers (raters) when the same individual interacts with different embodied conversational agents (virtual characters) displaying varied conversational and social attributes. Our results showed that context plays an important role in altering the raters’ perceptions and scores: varying the situational context causes different facets of people’s personality to manifest (Celiktutan & Gunes, IEEE TAC’16). The analysis with respect to the three virtual agents (Poppy – cheerful and positive, Spike – angry and aggressive, and Obadiah – sad and miserable) showed that a stronger relationship between automatic prediction and situational context was established for Poppy and Obadiah with the audio-visual labels. This confirmed that subjects interacting with Poppy and Obadiah were perceived as more active and expressive, whereas their cues were more subtle when interacting with Spike.
- Observer ratings of personality are more consistent when only the visual communication channel is available (Joshi, Gunes & Goecke, ICPR’14). For the same set of subjects and the same set of visual-only or audio-visual clips, the mean difference in observer ratings is lower when only visual displays of the subjects are available. Cognitive overload (the availability of more cues and modalities) affects the raters’ perceptions while the audio-visual feed is processed, and there are possible individual differences in modality preference during perception (some raters focus more on what is said rather than how it is said, or vice versa).
- Assessments provided by each rater vary, but the credibility of each rater can be measured and modelled (Joshi, Gunes & Goecke, ICPR’14). To assess variations in a rater’s assessments of the different trait dimensions, each rater was shown a pre-selected audio-visual clip and a visual-only clip twice. We evaluated the credibility of a rater by assigning a weight to every rater based on their consistency in assessing the same clip. Automatic personality prediction incorporating this weighted model outperforms the average model, predicting each trait dimension more accurately.
- A number of personality dimensions are perceived and rated as more dynamic than others (Celiktutan & Gunes, IEEE TAC’16). Raters perceived conscientiousness, openness and likeability as more static (once the raters make up their minds, their perceptions do not change), while agreeableness, extraversion and facial attractiveness were perceived as more dynamic (perceptions vary and change over time during the interaction). Our automatic prediction experiments also support this phenomenon: we obtained lower prediction performance for extraversion and facial attractiveness than for conscientiousness, openness and likeability.
- Automatic prediction needs to treat and model each personality trait differently (Celiktutan & Gunes, ICIP’14; Celiktutan & Gunes, IEEE TAC’16). Some traits are better modelled using multiple communication channels and dynamic models, whereas others are better modelled using a single communication channel and static models. Our experimental results showed that combining face or body appearance features with audio features is the best solution for predicting conscientiousness and engagement. For other traits, however, unimodal features provide better results – e.g., facial attractiveness is best modelled using facial shape and configuration features, and likeability is best modelled using histogram of optical flow and histogram of gradients features.
- Predicting personality traits based on history is more robust and accurate (Celiktutan & Gunes, IEEE TAC’16). Instead of performing frame-by-frame analysis and treating each time instant separately, more accurate and robust automatic prediction is achieved by processing information from past frames using a dynamic approach.
- Facial attractiveness perception and computation is affected by the subject’s behaviour (Kalayci, Ekenel & Gunes, IEEE ICIP’14). Previous research focussed only on the perception and analysis of static appearance for automatic attractiveness prediction. Our experimental results showed that the perception and computation of human physical traits (i.e., facial attractiveness) are affected by both static appearance (e.g., facial proportions, symmetry) and dynamic behaviour (e.g., lifting eyebrows, smiling).
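The rater-credibility weighting described above can be sketched as follows. This is an illustrative assumption of how such a scheme might look, not the project's actual implementation: the function names, the inverse-difference weighting formula and the example scores are all made up for illustration. The idea is that a rater who scores the same repeated clip consistently earns a higher weight, and a credibility-weighted mean replaces the plain average as the label.

```python
# Sketch of rater-credibility weighting (illustrative only; not the
# project's actual code). A rater who gives similar scores to the
# same clip shown twice receives a higher weight.

def rater_weights(first_pass, second_pass):
    """Weight each rater by test-retest consistency on a repeated clip.

    first_pass, second_pass: lists of scores (one per rater) for the
    same clip rated twice. A smaller absolute difference between the
    two passes yields a higher weight.
    """
    diffs = [abs(a - b) for a, b in zip(first_pass, second_pass)]
    raw = [1.0 / (1.0 + d) for d in diffs]   # consistency -> raw weight
    total = sum(raw)
    return [w / total for w in raw]          # normalise to sum to 1

def weighted_label(scores, weights):
    """Credibility-weighted label for one clip (vs. a plain average)."""
    return sum(s * w for s, w in zip(scores, weights))

# Example: rater 2 is inconsistent (7 vs 3) and gets down-weighted.
w = rater_weights([5, 7, 6], [5, 3, 6])
label = weighted_label([4, 9, 5], w)
```

With these hypothetical numbers, the inconsistent rater's high score of 9 pulls the weighted label far less than it would pull an unweighted mean.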
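The history-based prediction idea can also be illustrated with a minimal sketch. An exponential smoother stands in here for the project's dynamic model (which it is not); the function name, the `alpha` parameter and the sample scores are assumptions for illustration. The point it demonstrates is that folding past frames into the current estimate yields a steadier trajectory than treating each time instant separately.

```python
# Minimal illustration of history-based smoothing of per-frame trait
# predictions (a stand-in for a dynamic model; illustrative only).

def smooth_predictions(frame_scores, alpha=0.3):
    """Exponentially weighted average over past frames.

    frame_scores: per-frame trait predictions (e.g. for extraversion).
    alpha: weight of the newest frame; lower alpha = more history.
    """
    smoothed = []
    state = frame_scores[0]                     # initialise from frame 0
    for s in frame_scores:
        state = alpha * s + (1 - alpha) * state # blend new frame with past
        smoothed.append(state)
    return smoothed

# Noisy per-frame scores fluctuating around a true value of about 5.
noisy = [5.0, 7.0, 3.0, 6.0, 4.0, 5.5]
steady = smooth_predictions(noisy)
```

The smoothed sequence spans a narrower range than the raw per-frame scores, mirroring the robustness gain reported for the dynamic approach over frame-by-frame analysis.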
We extended our research from the MAPTRAITS project to the Being There project and conducted comparative experiments with extraverted versus introverted robot conditions, showing that (Celiktutan & Gunes, IEEE RO-MAN’16):
- Perceived enjoyment with the NAO robot is significantly correlated with participants’ extraversion trait (which validates the similarity rule);
- The NAO robot’s perceived empathy positively correlates with participants’ extraversion trait – extraverted people feel more in control of their interactions and judge them as more intimate and less incompatible;
- Perceived enjoyment with the robot is highly correlated with the agreeableness trait of the participants;
- A significant relationship was established between the robot’s perceived realism and participants’ neuroticism trait – people who score high on neuroticism tend to perceive their interactions as forced and strained, and therefore the robot’s artificial behaviours might appear realistic to them.