The short definition

Emotional AI — also called emotion AI, affective computing, or emotional analytics — is a field of artificial intelligence that measures and interprets human emotional states from physiological signals. The primary signal source is the face: involuntary muscle activations that occur in the 200–500 millisecond window following a stimulus, before any conscious moderation of expression has occurred.

The foundational science is the Facial Action Coding System (FACS), developed by psychologists Paul Ekman and Wallace Friesen in 1978. FACS defines 44 Action Units — each corresponding to a specific facial muscle or muscle group — that in combination can describe virtually every human facial expression of emotion. Emotional AI automates the analysis of these Action Units at the speed of a live research session.
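As an illustrative sketch only (not EchoDepth's internal representation), a few well-documented Action Units can be modelled as simple records pairing an AU number with the facial muscle it corresponds to:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ActionUnit:
    """One FACS Action Unit: a numbered, anatomically grounded facial movement."""
    number: int
    name: str
    muscle: str

# A handful of well-documented Action Units from Ekman & Friesen's taxonomy.
ACTION_UNITS = {
    1: ActionUnit(1, "Inner brow raiser", "frontalis (pars medialis)"),
    4: ActionUnit(4, "Brow lowerer", "corrugator supercilii"),
    6: ActionUnit(6, "Cheek raiser", "orbicularis oculi"),
    12: ActionUnit(12, "Lip corner puller", "zygomaticus major"),
}

def describe(au_number: int) -> str:
    """Human-readable label for an Action Unit."""
    au = ACTION_UNITS[au_number]
    return f"AU{au.number}: {au.name} ({au.muscle})"

print(describe(12))  # AU12 is the movement most associated with a smile
```

The point of the structure is that each unit is anatomical, not interpretive: FACS records which muscles moved, and the emotional interpretation happens downstream.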

The output is typically expressed as a VAD score: Valence (positive vs negative affect), Arousal (intensity of the response) and Dominance (the participant's sense of control). Valence and arousal are the two axes of Russell's Circumplex Model of Affect (1980); dominance is the third axis of the related PAD model (Mehrabian & Russell, 1974). Together, the three dimensions describe the emotional state at any moment during an interaction.
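A minimal sketch of what a VAD score might look like as data (the field names and [-1, 1] scaling are assumptions for illustration, not EchoDepth's schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VADScore:
    """One moment's emotional state on three axes, each scaled to [-1, 1]."""
    valence: float    # negative <-> positive affect
    arousal: float    # calm <-> intense
    dominance: float  # feeling controlled <-> feeling in control

    def quadrant(self) -> str:
        """Rough circumplex quadrant from valence and arousal alone."""
        v = "positive" if self.valence >= 0 else "negative"
        a = "high-arousal" if self.arousal >= 0 else "low-arousal"
        return f"{a} {v}"

# e.g. a relaxed, pleasant response to a concept:
print(VADScore(valence=0.6, arousal=-0.3, dominance=0.4).quadrant())
# -> low-arousal positive
```

The quadrant reading uses only the two circumplex axes; dominance adds the third dimension for states the plane cannot separate, such as anger vs fear (both negative and high-arousal, but opposite in felt control).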

Emotional AI at a glance

What it measures

Involuntary facial muscle activations, voice patterns, and text-emotion divergence — the gap between what people write and what they feel.

What it bypasses

Social desirability bias, peer moderation in focus groups, articulation gaps, and post-session recall distortion — the structural failures of self-report research.

What it produces

VAD scores, emotion timelines, text-emotion divergence analysis, and actionable research reports — data to information to knowledge.

Why self-report research fails — and what emotional AI fixes

The problem with surveys, focus groups, depth interviews and moderated research is not the methodology — it is the fundamental act of asking. When a participant is asked how they feel about something, three distortions occur simultaneously:

  • Articulation gap: people are poor at translating felt emotion into words. "Interesting" is the default response when the genuine reaction is uncertain or mixed.
  • Social desirability bias: in any research context with a researcher present, participants produce the response they believe is expected or appropriate. In focus groups, this is amplified by peer dynamics.
  • Recall distortion: post-session surveys ask participants to remember how they felt during an experience — not to report how they feel now. Memory smooths emotional peaks and compresses valleys.

Emotional AI bypasses all three. The measurement happens involuntarily, in real time, at the moment of stimulus exposure — before any of these distortions have occurred. The participant cannot suppress or modulate an Action Unit activation any more than they can suppress a blink reflex.

How emotional AI works in practice

A standard emotional AI research session runs as follows. The participant joins remotely on any camera-equipped device — laptop, desktop, tablet. No specialist hardware, no facility, no installation is required. They provide explicit informed consent for emotion capture, scoped to the session and purpose.

During the session, a product concept, advertising creative, interview question or written stimulus is presented. The emotional AI system scores 44 facial Action Units per video frame in real time. These activations are mapped to VAD coordinates — producing a continuous emotion timeline that shows exactly when and how strongly the participant responded to each element of the stimulus.
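As a hedged illustration of that per-frame pipeline (the AU-to-valence weights below are invented for the sketch, not a published or EchoDepth calibration), mapping frame-level Action Unit scores to a timeline might look like:

```python
# Illustrative only: weights are invented for demonstration. In practice the
# AU-to-VAD mapping is learned or calibrated, not hand-set like this.
AU_VALENCE_WEIGHTS = {6: 0.4, 12: 0.5, 1: -0.1, 4: -0.5}

def frame_valence(au_activations: dict[int, float]) -> float:
    """Collapse one frame's AU activation scores (0..1) into a valence estimate."""
    v = sum(AU_VALENCE_WEIGHTS.get(au, 0.0) * level
            for au, level in au_activations.items())
    return max(-1.0, min(1.0, v))  # clamp to [-1, 1]

def emotion_timeline(frames: list[dict[int, float]], fps: float = 30.0):
    """Timestamped valence trace: one (seconds, valence) point per video frame."""
    return [(i / fps, frame_valence(f)) for i, f in enumerate(frames)]

# Three frames: neutral, then a genuine (AU6 + AU12) smile building.
frames = [{}, {6: 0.3, 12: 0.4}, {6: 0.8, 12: 0.9}]
for t, v in emotion_timeline(frames):
    print(f"t={t:.3f}s valence={v:+.2f}")
```

The same pattern extends to arousal and dominance; the output is a continuous trace that can be aligned frame-by-frame against the stimulus being shown.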

For culture and people analytics applications, emotional AI is applied differently: the participant's existing written survey responses are scored for emotional signal — detecting where the language is positive but the underlying emotion is not. This text-emotion divergence analysis is the primary tool for identifying masked dissatisfaction in workplace culture data.
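A sketch of the divergence idea under stated assumptions: both scores would come from upstream models (a linguistic sentiment score and a felt-emotion score for the same response); here they are supplied directly, and the threshold is arbitrary.

```python
def divergence(stated_sentiment: float, felt_valence: float) -> float:
    """Positive divergence = the language is more positive than the emotion."""
    return stated_sentiment - felt_valence

def flag_masked_dissatisfaction(responses, threshold: float = 0.5):
    """Return responses whose words are upbeat but whose emotional signal is not."""
    return [r for r in responses
            if r["stated"] > 0 and divergence(r["stated"], r["felt"]) >= threshold]

responses = [
    {"id": 1, "stated": 0.8, "felt": 0.7},   # genuinely positive
    {"id": 2, "stated": 0.7, "felt": -0.2},  # positive words, negative emotion
]
print(flag_masked_dissatisfaction(responses))  # flags response 2 only
```

Response 2 is the interesting case: a text-only sentiment tool would score it as positive, while the divergence flags it for follow-up.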

Key distinction

Emotional AI is not facial recognition.

Facial recognition identifies who someone is by matching facial geometry to a stored identity. Emotional AI identifies what someone is feeling by measuring involuntary muscle activations. No individual is identified. No images are stored. EchoDepth retains Action Unit activation scores only — no raw video.

Where emotional AI is used

Pharma market research

Patient and HCP research where social desirability bias is a known, material problem. Emotional AI captures genuine response to drug messaging, treatment communication and marketing materials — without the professional composure that makes pharma qual systematically unreliable.

EchoDepth for pharma →

FMCG consumer insights

Concept testing, packaging research and advertising pre-testing — identifying genuine purchase desire rather than the "interesting but wouldn't buy" response that focus groups reliably produce. Approximately 80–90% of FMCG new product launches fail; emotional AI captures the genuine emotional signal that helps predict which products will succeed.

EchoDepth for FMCG →

People analytics & culture

HR analytics and employee engagement measurement where culture surveys show stable scores but retention is declining. Emotional AI adds the layer that surveys cannot: the felt experience beneath the stated response. Text-emotion divergence analysis identifies the masked dissatisfaction that precedes voluntary turnover.

EchoDepth for culture →

Research agency methodology

Adding an emotional intelligence layer to existing qualitative research methodology — alongside focus groups, depth interviews and ethnographic studies. The combination of stated response and felt response is significantly more powerful than either alone; the discrepancies are often the most commercially valuable finding.

All use cases →

What emotional AI cannot do

Emotional AI is a depth research tool, not a large-scale quantitative survey replacement. It excels at revealing the why behind behavioural and attitudinal data — but it produces directional insight with samples of 20–50, not statistically representative data across thousands of respondents. For broad market sizing, incidence and prevalence studies, traditional survey platforms remain appropriate.

Emotional AI also cannot reliably detect deception (despite popular press suggestions). It measures emotional state — not intention. A participant who is anxious about being observed is registering genuine anxiety, not guilt. The science does not support polygraph-style inference from emotional signal, and EchoDepth does not claim otherwise.

EchoDepth Insight — emotional AI for commercial research

EchoDepth Insight is an emotional AI and facial coding software platform built by Cavefish Ltd, Cardiff, Wales. It applies 44 FACS Action Units and VAD scoring to product testing, pharma research, FMCG consumer insights, people analytics and culture measurement — delivered entirely remotely via browser, with no hardware required.

The platform goes beyond data delivery: EchoDepth translates raw emotional signal into structured insight and then into actionable knowledge — the specific findings and recommendations a research team or board can act on. As Gethin Thomas, CEO of Iterate, puts it: "Using EchoDepth from Cavefish allows us to validate ideas quickly, minimising the risk of launching a product or idea."

Evaluating EchoDepth against existing tools? See EchoDepth vs iMotions and EchoDepth vs Culture Amp.

Frequently asked questions about emotional AI

What is emotional AI?
Emotional AI is a field of artificial intelligence that measures human emotional states from involuntary physiological signals — primarily facial muscle movements, voice patterns and text. Unlike sentiment analysis, which scores the words people choose to write, emotional AI captures the signal that precedes any verbal response. The core science is FACS (Facial Action Coding System), which defines 44 Action Units corresponding to specific facial muscle activations. (Ekman & Friesen, 1978.)
How does emotional AI differ from sentiment analysis?
Sentiment analysis is a linguistic tool: it scores the words people write or say. Emotional AI is a physiological measurement tool: it captures the involuntary muscle response that occurs before any words are formed. The critical difference is that emotional AI detects masked dissatisfaction — the gap between how someone describes feeling and how they actually feel — which no text-based sentiment tool can reach, regardless of how sophisticated the language model is.
Is emotional AI the same as facial recognition?
No. Facial recognition identifies who someone is by matching facial geometry to a stored identity — it is a surveillance technology. Emotional AI identifies what someone is feeling by measuring involuntary muscle activations — it is a research measurement technology. EchoDepth does not record, store or process any imagery that could identify an individual. Only Action Unit activation scores are retained, which cannot be reverse-engineered to reconstruct an image or identity.
Can participants fake their emotional response?
No. FACS analysis measures involuntary facial muscle activations in the 200–500 millisecond window immediately following stimulus exposure — before any conscious control of expression is possible. Even trained actors cannot suppress the micro-expression patterns that reveal genuine emotional response at this timescale. This involuntary capture is the core scientific advantage of emotional AI over self-report methodology.
Is emotional AI GDPR compliant?
EchoDepth Insight is UK GDPR compliant by design. No raw video is retained. Only VAD scores and Action Unit activations are stored. All sessions require explicit informed consent with time-bound, purpose-limited data agreements. Culture survey text analysis processes anonymised text only — no camera or biometric data is required for this application.
What is the difference between emotional AI and traditional qualitative research?
Traditional qualitative research — focus groups, depth interviews, ethnography — captures stated experience: what participants say and how they describe feeling. Emotional AI captures felt experience: the physiological response that occurs before and alongside any verbal description. The two are complementary. The most powerful research methodology runs them in parallel, using the discrepancy between the stated and felt response as the primary analytical lens. EchoDepth is designed to work alongside existing qual methodology, not to replace it.

Go deeper

Related reading

The science

What is FACS analysis — and why does it matter for research?

The 44 Action Units of the Facial Action Coding System, how they are measured, and why involuntary muscle activation is more reliable than any self-report.

Culture measurement

Culture survey vs emotional AI — what is the difference?

Why culture surveys measure stated sentiment and emotional AI measures felt sentiment — and what the difference means for people analytics.

Pharma research

Social desirability bias in pharma research

Why HCP and patient research is especially vulnerable to social desirability bias — and how FACS analysis eliminates it.

EchoDepth Insight

Ready to see emotional AI applied to your research?

Book a 30-minute discovery call. No commitment — we will show you exactly what EchoDepth would surface on your specific research challenge.