
AI in Emotional Intelligence

Smile on the Face, Sadness in the Eyes: Bridging the Emotion Gap with a Multimodal Dataset of Eye and Facial Behaviors

This research introduces the Eye-behavior-aided Multimodal Emotion Recognition (EMER) dataset and the EMERT model to address the 'emotion gap' between facial expressions and genuine emotions. By integrating facial expressions with eye behaviors (movement sequences, fixation maps), EMER provides a rich, multi-view annotated dataset of 1,303 spontaneous emotional samples from 121 participants. The EMERT model, utilizing modality-adversarial feature decoupling and a multi-task Transformer, significantly outperforms existing state-of-the-art methods in emotion recognition, demonstrating the crucial role of eye behaviors for robust and nuanced emotional understanding. This work paves the way for more reliable ER systems in human-computer interaction, mental health, and intelligent virtual agents.

Executive Impact & Key Metrics

Understanding true emotional states, beyond superficial expressions, can revolutionize enterprise applications. Our findings demonstrate significant advancements in robust emotion recognition, crucial for intelligent systems.

1,303 Dataset Samples
121 Participants
+9.73% ER Performance Boost (UAR, 3-class)

Deep Analysis & Enterprise Applications

Select a topic to dive deeper, then explore the specific findings from the research, rebuilt as interactive, enterprise-focused modules.

Understanding the Emotion Gap

The 'emotion gap' refers to the disparity between outward facial expressions and an individual's true internal emotional state. Conventional Facial Expression Recognition (FER) often struggles here, as facial cues can be consciously masked or suppressed. This research highlights the need to integrate more intuitive physiological signals, like eye behaviors, to bridge this gap for more reliable Emotion Recognition (ER).

Bridging The Gap Between Surface Expressions & True Emotion

Introducing the EMER Dataset

EMER is the first eye-behavior-aided multimodal dataset specifically designed to address the emotion gap. It features 1,303 spontaneous emotional samples from 121 diverse participants, capturing synchronized facial expression videos, eye movement sequences, and eye fixation maps. Crucially, it provides both Facial Expression Recognition (FER) and Emotion Recognition (ER) labels, enabling detailed analysis of their discrepancies.
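To make the dataset's structure concrete, here is a minimal sketch of what one such multimodal record could look like in code. The field names, shapes, and label strings are illustrative assumptions, not the actual EMER schema; the key point is that each sample carries both a FER label (what the face shows) and an ER label (the annotated true emotion), so the emotion gap is directly observable.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class EmotionSample:
    """One hypothetical multimodal sample (field names are illustrative)."""
    face_video: np.ndarray      # (frames, H, W, 3) facial expression clip
    eye_movements: np.ndarray   # (T, 2) gaze coordinates over time
    fixation_map: np.ndarray    # (H, W) aggregated fixation heat map
    fer_label: str              # what the face shows
    er_label: str               # the annotated true emotion

    def has_emotion_gap(self) -> bool:
        # The "emotion gap": the surface expression disagrees
        # with the annotated true emotion.
        return self.fer_label != self.er_label

sample = EmotionSample(
    face_video=np.zeros((8, 64, 64, 3)),
    eye_movements=np.zeros((120, 2)),
    fixation_map=np.zeros((64, 64)),
    fer_label="happy",
    er_label="sad",
)
print(sample.has_emotion_gap())  # a "smile on the face, sadness in the eyes" case
```

Keeping both label sets on every record is what enables the discrepancy analysis the dataset was built for.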

1,303+ Multimodal Samples for Nuanced ER

EMER vs. Existing Multimodal Datasets

Feature                   Existing Datasets (e.g., MAHNOB-HCI)   EMER Dataset
# Participants            27                                     121
# Samples                 565                                    1,303
Data Quality              Clean                                  Clean
Non-invasive Sensors      No (EEG)                               Yes (eye tracker, camera)
Visual Facial Images      Yes                                    Yes
Eye Movement Data         Yes                                    Yes
Eye Fixation Maps         Yes                                    Yes
Both ER & FER Annotation  No                                     Yes
Emotion Gap Analysis      No                                     Yes

The EMERT Model: A Novel Approach

The Eye-behavior-aided MER Transformer (EMERT) is designed to leverage the rich multimodal data in EMER. It employs a Multimodal Feature Extraction (MFE) module, followed by Modality-Adversarial Feature Decoupling (MAFD) to isolate emotion-generic and emotion-unique features. Finally, an Emotion-sensitive Multi-task Transformer (EMT) guides the fusion for robust, gap-bridging emotion recognition.

EMERT Model Pipeline

Multimodal Feature Extraction (MFE)
Modality-Adversarial Feature Decoupling (MAFD)
Emotion-sensitive Multi-task Transformer (EMT)
Robust Emotion Recognition Output
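The pipeline above can be sketched end-to-end in a few lines of numpy. This is a structural illustration only: the real EMERT uses learned neural backbones, a trained modality discriminator for the adversarial decoupling, and a full multi-task Transformer, none of which are reproduced here. All weights are random and all dimensions are made up; the sketch just shows how features flow through MFE, MAFD, and EMT.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # common feature dimension (illustrative)

def mfe(raw, w):
    """Multimodal Feature Extraction: project each raw modality to a
    shared-size embedding (a linear stand-in for the real backbones)."""
    return raw @ w

def mafd(feat, w_generic, w_unique):
    """Modality-Adversarial Feature Decoupling: split a modality feature
    into an emotion-generic part (in training, pushed so a modality
    discriminator cannot tell modalities apart -- the adversarial step,
    omitted here) and an emotion-unique part."""
    return feat @ w_generic, feat @ w_unique

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def emt(tokens, wq, wk, wv, w_out):
    """Emotion-sensitive Multi-task Transformer (one attention layer as a
    stand-in): fuse the decoupled tokens and emit 7-class scores."""
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    attn = softmax(q @ k.T / np.sqrt(D))
    fused = (attn @ v).mean(axis=0)   # pool the fused tokens
    return softmax(fused @ w_out)     # emotion probability distribution

# Toy raw inputs for the three modalities (dimensions are invented).
face, eye_seq, fix_map = rng.normal(size=32), rng.normal(size=24), rng.normal(size=48)

tokens = []
for raw in (face, eye_seq, fix_map):
    feat = mfe(raw, rng.normal(size=(raw.size, D)))
    generic, unique = mafd(feat, rng.normal(size=(D, D)), rng.normal(size=(D, D)))
    tokens += [generic, unique]

probs = emt(np.stack(tokens), *(rng.normal(size=(D, D)) for _ in range(3)),
            rng.normal(size=(D, 7)))
print(probs.round(3))  # a 7-class distribution summing to 1
```

Note the design choice the sketch preserves: each modality contributes two tokens (generic and unique) to the fusion stage, so the Transformer can attend to shared emotional content and modality-specific cues separately.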

Significant Performance Improvements

Extensive benchmarks show EMERT's superior performance across various tasks. For 7-class ER, EMERT achieved the highest WAR (59.28%), UAR (52.62%), and F1 (55.71%), significantly outperforming state-of-the-art methods like Self_MM. The integration of eye behaviors proved crucial, enhancing both ER and FER tasks and demonstrating resilience to noise.

+9.73% UAR improvement for 3-class ER over SOTA

EMERT vs. SOTA for 7-Class ER (Selected Metrics)

Model               WAR (↑)   UAR (↑)   F1 (↑)
MulT [34]           57.33     48.03     53.53
Self_MM [46]        54.08     42.89     47.48
NORM-TR [49]        59.13     49.28     53.04
EMERT (Our Method)  59.28     52.62     55.71
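For readers unfamiliar with the three metrics in the table, here is a small self-contained function computing them from label arrays. WAR (weighted average recall) is overall accuracy, while UAR (unweighted average recall) averages per-class recalls so that rare emotions count equally; F1 here is the macro average. The toy data below is invented purely to show why UAR can fall below WAR.

```python
import numpy as np

def er_metrics(y_true, y_pred, n_classes=7):
    """Compute WAR, UAR, and macro-F1 from integer label arrays."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    war = (y_true == y_pred).mean()          # WAR = overall accuracy
    recalls, f1s = [], []
    for c in range(n_classes):
        tp = ((y_true == c) & (y_pred == c)).sum()
        fn = ((y_true == c) & (y_pred != c)).sum()
        fp = ((y_true != c) & (y_pred == c)).sum()
        rec = tp / (tp + fn) if tp + fn else 0.0
        prec = tp / (tp + fp) if tp + fp else 0.0
        recalls.append(rec)
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return float(war), float(np.mean(recalls)), float(np.mean(f1s))

# Toy 3-class example: class 2 is rare and always missed,
# which drags UAR well below WAR.
war, uar, f1 = er_metrics([0, 0, 0, 1, 1, 2], [0, 0, 0, 1, 1, 1], n_classes=3)
print(war, uar, f1)  # ≈ 0.833, 0.667, 0.600
```

This is why the paper reports both: a model can post a high WAR while systematically failing on under-represented emotions, and UAR exposes exactly that failure.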

Enterprise Application: Enhanced Customer Experience

Imagine a video-enabled support desk where AI could accurately gauge a customer's true emotional state, even when they try to mask frustration behind a polite demeanor. By analyzing facial expressions alongside subtle eye behaviors such as gaze patterns and fixations (as the EMERT model enables), the system could detect underlying sadness or anger, proactively escalate to a human agent, or offer tailored, empathetic responses, significantly improving customer satisfaction and retention. This moves beyond superficial sentiment analysis toward genuine emotional intelligence.

Calculate Your Potential ROI with Emotionally Intelligent AI

Estimate the impact of advanced emotion recognition on your operational efficiency and customer engagement.


Your Roadmap to Emotionally Intelligent AI

A structured approach to integrating advanced emotion recognition into your enterprise operations.

Phase 1: Discovery & Strategy

Assess current emotional intelligence needs, define use cases, and formulate an AI strategy aligned with business goals. Identify data sources and integration points for eye-tracking and facial expression data.

Phase 2: Data Integration & Model Training

Integrate multimodal data streams (facial video, eye movements, fixation maps). Utilize the EMER dataset for transfer learning, and train custom EMERT models on your specific enterprise data.

Phase 3: Pilot Deployment & Optimization

Deploy EMERT-powered AI in a pilot environment. Monitor performance, gather feedback, and fine-tune models for accuracy and robustness in real-world scenarios.

Phase 4: Full-Scale Integration & Continuous Improvement

Scale emotionally intelligent AI across the enterprise. Establish continuous learning loops for model updates, ensuring sustained high performance and adapting to evolving emotional cues.

Ready to Bridge the Emotion Gap in Your Enterprise?

Unlock deeper emotional intelligence for your AI systems. Schedule a personalized strategy session with our experts to explore how EMERT can transform your operations.

Ready to Get Started?

Book Your Free Consultation.

Let's Discuss Your AI Strategy!
