Research findings suggest that personality traits such as extraversion, agreeableness, and openness to experience are tightly coupled with human abilities and behaviours encountered in daily life: emotional expression, linguistic production, success in interpersonal tasks, leadership ability, general job performance, teacher effectiveness, academic ability, as well as interaction with technology. In fact, human users tend to anthropomorphise computers and virtual agents, treating them as social beings and interpreting their behaviour much as they would in everyday human-human interaction.
The problem of assessing people's personality is important for multiple research and business domains, such as computer-mediated staff assessment and training, human-computer interaction, and human-robot interaction. Despite growing interest in personality traits and their effects on human life in general, and despite recent advances in machine analysis of human behavioural signals (e.g., vocal expressions and physiological reactions), pioneering efforts on machine analysis of personality traits have started to emerge only recently: (i) there exist only a small number of efforts based on unimodal cues such as written text, audio, speech, or static facial features; (ii) despite tentative efforts on multimodal personality trait analysis, the dynamics (duration, speed, etc.) of multiple cues, which have been shown to be important in human judgments of personality, have mostly been neglected; (iii) although personality research suggests that every trait exists in all people to a greater or lesser degree (i.e., a person can be anywhere on a continuum ranging from introversion to extraversion), none of the proposed efforts have attempted to assess personality traits continuously in time and space (i.e., how a person can be rated along multiple trait dimensions at a given interaction time and context); and (iv) how machine (automatic) trait analysis can be utilised for personalised, social, and adaptive human-virtual-agent interaction has not been investigated.
Overall, both common everyday technology (e.g., personal computers, smartphones) and the more sophisticated systems people use nowadays (e.g., computer games, assistive technologies, embodied virtual agents) lack the capability to understand their human users' personality and behaviour, and to provide socially intelligent, adaptive, and engaging human-computer interaction.
To address these issues and limitations, the MAPTRAITS project delivers a set of audio-visual tools that analyse and predict human personality traits dynamically from multiple nonverbal cues and channels (i.e., upper body, head, face, voice, and their dynamics) in continuous time and trait space. As a proof of concept, the MAPTRAITS technology has been developed for automatic matching of virtual agent and user personalities, i.e., automatically modelling which types of users would like to engage with which types of virtual agents, with the aim of enhancing user engagement. The motivation for choosing this application area lies in its significance: (i) research has shown that people's attitudes toward machines and conversational agents are shaped by the perceived personality of the agent as well as by their own personality, and (ii) humans are social beings whose everyday lives increasingly revolve around interacting with computers, virtual agents, and robots, which are becoming popular as companions, coaches, user interfaces to smart homes, and household robots.
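To make the matching idea concrete, the sketch below scores user-agent compatibility as the cosine similarity between Big Five trait vectors. This is a minimal illustration under assumed conventions, not the MAPTRAITS implementation: the trait names, the [0, 1] score range, and the similarity-based matching rule are all assumptions introduced here for clarity.

```python
import math

# Assumed trait dimensions (Big Five), in a fixed order.
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]


def trait_vector(scores: dict) -> list:
    """Map a dict of per-trait scores in [0, 1] to a fixed-order vector."""
    return [scores[t] for t in TRAITS]


def compatibility(user: dict, agent: dict) -> float:
    """Cosine similarity between user and agent trait vectors, in [-1, 1]."""
    u, a = trait_vector(user), trait_vector(agent)
    dot = sum(x * y for x, y in zip(u, a))
    return dot / (math.hypot(*u) * math.hypot(*a))


def best_agent(user: dict, agents: dict) -> str:
    """Pick the agent profile most compatible with the user's traits."""
    return max(agents, key=lambda name: compatibility(user, agents[name]))
```

In a continuous-time setting, `user` would be re-estimated from the audio-visual cues at each interaction interval, so the selected agent behaviour could adapt as the predicted trait scores evolve.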