How Can AI Help in Personality Prediction?


Can AI understand your personality just by looking into your eyes? That's what researchers are working on, and this technology can help humans better understand each other.

The artificial intelligence community recognizes that androids are not going to be the omnipotent objects that replace humans as science fiction stories and urban legends would have us believe. Cognitive capabilities are innately human and too intricate to replicate in humanoid robots.

As robot ethicist Kate Darling states in her TED Talk, robots have greater value as agreeable partners collaborating with humans. By reading human eye movements, robots can understand people and stay in sync with the rhythm of their lives.

People and Social Robots

Psychological research has confirmed the truth in the adage that eyes are a window to the soul. Eye movements are spontaneous responses to external stimuli that reflect the personalities of humans. Insecure humans, for example, are likely to feel more nervous in response to adverse events and experience quicker eye movements.

Social robots, aided by computer vision and other sensors, are designed to learn about the state of mind of their human partners and assist them—perhaps by communicating reassuring messages when their companion is nervous.

We spoke to Nikolas Kairinos, co-founder and CEO of a company developing social robots that personalize learning for enhanced engagement. By tracking the eye movements, facial micro-expressions, and body movements of humans (though not yet personality types), it is possible to gauge attention and its impact on learning.

“We studied learning engagement in the real world of a diversity of contexts, across cultures, and genders to build an engagement engine. Our social robots observe attention and make decisions on whether to change the methods of engagement such as by using a different medium like a video to be more effective.”

Personality Traits

Psychological research has expanded the set of human personality traits that can be quantified and analyzed with machine learning.


Personality traits of humans have been defined by the five-factor model, which has been supplemented by three other models: the Dark Triad, the Reinforcement Sensitivity Theory (sometimes described as the Behavioral Inhibition System (BIS)/Behavioral Approach System (BAS) model), and the HEXACO model.

The five-factor model includes:

  • Openness describes people ready to go off the beaten path, who show creativity and intellectual curiosity.

  • Agreeableness indicates people who are compassionate, altruistic and polite.

  • Conscientiousness describes people who are diligent, efficient and dutiful in completing tasks well.

  • Extraversion marks a person who is socially adept, expressive and confident.

  • Neuroticism is the hallmark of people given to anxiety, mood swings and dark thoughts.

The Dark Triad covers three traits: psychopathy (a lack of empathy), narcissism, and Machiavellianism (manipulativeness).

The Behavioral Inhibition System (BIS) is characteristic of people who experience anxiety at the prospect of frustrating experiences, misfortune, or punishment. The Behavioral Approach System (BAS) describes people who live for rewards: achievement, enjoyment, and positive incentives.

The HEXACO model overlaps with the five-factor model on openness, agreeableness, conscientiousness, and extraversion; it adds honesty-humility and replaces neuroticism with emotionality.

The data used to determine personality traits is captured by SMI Eye-Tracking Glasses (ETG), which have two infrared cameras, one pointed at each eye, working as sensors that read natural eye and gaze behavior.

The cameras record pupil dilation (along the X and Y axes), rapid eye movements called saccades, as well as fixations, blinks, and relative gaze direction.

Additionally, physiological responses can be tracked, as in other studies: EEG data for brain activity, galvanic skin response for skin conductivity, and pupil size.

Further processing of the raw signals weighs their importance based on knowledge gained from previous research.
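As a rough illustration of this processing step, here is a minimal sketch that reduces raw per-sample eye-tracking signals to per-subject summary features. The channel names, sampling rate, and statistics are assumptions for illustration, not the actual pipeline used in the research:

```python
# Minimal sketch: reduce raw eye-tracking samples to summary features.
# Channel names and statistics are illustrative assumptions only.

def summarize(samples, hz=60):
    """samples: list of dicts with 'pupil' (size) and 'blink' (bool) per frame."""
    n = len(samples)
    duration_s = n / hz
    pupils = [s["pupil"] for s in samples]
    mean_pupil = sum(pupils) / n
    var_pupil = sum((p - mean_pupil) ** 2 for p in pupils) / n
    blink_rate = sum(1 for s in samples if s["blink"]) / duration_s
    return {
        "mean_pupil": mean_pupil,
        "var_pupil": var_pupil,
        "blinks_per_second": blink_rate,
    }

# Two seconds of synthetic samples at 60 Hz.
samples = [{"pupil": 3.0 + 0.1 * (i % 5), "blink": i % 30 == 0}
           for i in range(120)]
features = summarize(samples)
```

Features like these, rather than the raw signal, are what the downstream classifiers consume.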

External stimuli are provided by videos. For example, emotional stability can be gauged from responses to energetic songs or graphic violence.

The data is analyzed with supervised machine learning methods, including decision trees, regression, and ensembles of classifiers trained on labeled data. For new subjects whose personalities are unknown, labels are predicted by models trained on previous subjects whose personalities have been identified.
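To make the labeled-data step concrete, here is a minimal sketch that uses a nearest-neighbour rule as a stand-in for the decision trees and classifier ensembles the studies describe. The feature vectors and trait labels are invented for illustration:

```python
import math

# Training data: per-subject feature vectors (e.g. saccade rate,
# blink rate, mean pupil size) paired with a known trait label.
# Values and labels are illustrative, not real measurements.
labeled = [
    ([0.9, 0.2, 3.1], "high_neuroticism"),
    ([0.8, 0.3, 3.0], "high_neuroticism"),
    ([0.2, 0.1, 3.6], "low_neuroticism"),
    ([0.3, 0.2, 3.5], "low_neuroticism"),
]

def predict(features):
    """Label a new subject with the trait of the closest known subject."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(labeled, key=lambda pair: dist(pair[0], features))
    return label

new_subject = [0.85, 0.25, 3.05]
print(predict(new_subject))  # the nearest training point decides the label
```

The principle is the same whatever the classifier: patterns learned from subjects with known personalities are transferred to subjects whose personalities are unknown.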

Deciphering Eye Movements

Inexpensive eyeglasses, embedded with sensors, can track eye movements. The impact of external stimuli (video footage or images) is measured through spontaneous eye movements, especially saccades, which are physiological responses that are hard to consciously manipulate.

Saccade rates higher than normal indicate a pronounced stimulus; saccades are also more rapid among people who are impulsive and impatient.
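A common way to pick saccades out of a gaze trace is to flag samples whose gaze velocity exceeds a threshold. The sketch below uses arbitrary screen units and an assumed threshold; real systems use calibrated degrees-per-second velocities:

```python
# Minimal velocity-threshold saccade detector. Gaze positions are in
# arbitrary screen units sampled at `hz`; the 300-units-per-second
# threshold is an assumption for illustration, not a calibrated value.

def count_saccades(xs, ys, hz=60, threshold=300.0):
    """Count velocity-threshold crossings from below (saccade onsets)."""
    saccades = 0
    in_saccade = False
    for i in range(1, len(xs)):
        dx, dy = xs[i] - xs[i - 1], ys[i] - ys[i - 1]
        velocity = (dx * dx + dy * dy) ** 0.5 * hz  # units per second
        if velocity > threshold and not in_saccade:
            saccades += 1
            in_saccade = True
        elif velocity <= threshold:
            in_saccade = False
    return saccades

# A fixation, a jump (saccade), another fixation, and a jump back.
xs = [0.0, 0.0, 0.0, 100.0, 100.0, 100.0, 0.0, 0.0]
ys = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
print(count_saccades(xs, ys))  # 2
```

Counting onsets per unit time yields the saccade rate that the studies relate to impulsiveness and stimulus strength.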

Brain function is affected by levels of serotonin and dopamine. Low levels of serotonin correlate with aggression, poor impulse control, and dark moods. Dopamine stimulates the prefrontal cortex for sharper cognition and the brain functions associated with motivation, emotion, and desire for rewards.

Skin conductivity increases in response to external stimuli such as fear: fear-induced sweat moistens the skin and increases its conductivity.

Laboratory results have confirmed the predicted outcomes. The expected outcomes were based on the emotions represented by images from the International Affective Picture System, with corresponding arousal and valence (eliciting a "good" or "bad" feeling) scores. The video stimuli were drawn from the FilmStim dataset and included the following movies: "Seven," "Life is Beautiful," "American History X," "Blue," "Dangerous Minds," "A Fish Called Wanda," and "Trainspotting." The measured emotional responses were then classified into personality traits.

An overall accuracy of 86% was achieved for the postulated categories of behavior when analyzed with supervised machine learning (ML). The classifier improved on earlier ones across all measures of HEXACO; on BIS/BAS it did worse on all counts but one, which it matched; and on the Dark Triad it matched all but two of the metrics, on which it did better.

Going forward, efforts to reproduce the results in the wild, or in the field, are underway. Kairinos explains:

“Academic research is prone to confirmatory bias. We are taking a machine learning approach that takes a wide variety of contextual variables into account. Later, we will collaborate with academics for a theoretical understanding of the data.”

Emulating Human Behavior Through the Eyes

Social robots appear behaviorally realistic when they reproduce the communicative movements of the human eye. “Social Eye Gaze in Human-Robot Interaction: A Review,” written by Henny Admoni and Brian Scassellati of the Department of Computer Science at Yale University, concluded that the cost, complexity, and fragility of robots rise when they seek to reproduce eye movements such as saccades.

We spoke to one of the authors of the review, Henny Admoni, currently an assistant professor at the Carnegie Mellon University Robotics Institute in Pittsburgh, about how progress can be made despite the challenges of recreating features of the biological eye in robots.

“I'm not convinced that robots will need to be more human-like to interact well with people. As humans, we tend to attribute anthropomorphic signals to many non-human-like objects and behaviors.”

Anthropomorphism is the willing suspension of disbelief that lets people attribute more human traits to robots than is objectively warranted. “With smart designs, we won't need expensive attributes like pupil dilation for robots to behave socially with people,” Admoni concluded.

She did acknowledge that real-world context presents a whole range of new challenges.

“Social interaction can be influenced by the relationship between the social actors, the social norms of their cultures and their current location, even the layout of the furniture in their environment.”


Investigations into the interaction between humans and social robots have opened a cornucopia of knowledge about not only human psychology but also robot design for collaboration. This work will likely transform education, elderly care, and households.



Kishore Jethanandani

Kishore Jethanandani is a futurist, economics nut, innovation buff, business technology writer, and an entrepreneur in the computer vision, wearable devices, and IoT space. He specializes in writing about emerging intersecting technologies.