The argument

We have spent decades measuring understanding by asking people to demonstrate it — tests, assessments, quizzes, surveys. The problem is not methodology. The problem is timing.

Understanding arrives emotionally before it arrives verbally. The moment a concept clicks registers 200–500 milliseconds before anyone can write it down, tick a box, or tell a trainer they've got it. By the time any self-report measure captures a response, the lived moment of comprehension — or confusion — has already passed.

Emotion signals understanding more accurately than any survey or test. And AI can now measure that signal in real time, individually, remotely — without disrupting the training environment that produced it.

Claim 1

Emotion precedes language

The felt response to a stimulus occurs 200–500ms before verbal processing begins. Every self-report measure captures the rationalisation, not the experience.

Claim 2

AI can read that signal

44 facial Action Units scored in real time — confidence, hesitation, stress, genuine comprehension. Involuntary. Measurable. The difference between someone who's got it and someone nodding along.

Claim 3

Trainers can intervene earlier

When the emotional signal is available in real time, trainers no longer need to wait for test scores. The need to redirect, repeat or reinforce is visible before anyone raises their hand.

Jonathan Prescott

CEO & Co-Founder, Cavefish Ltd · Cardiff, Wales

Former Director of Digital at The Royal Mint and Director of Digital Performance at Assurant ($10bn global insurance group). MBA, Bayes Business School. Co-founder of Cavefish, developers of EchoDepth — an emotional AI platform used in research, pharma, FMCG consumer insights, people analytics and defence training.

ITEC speaker profile →  ·  About Cavefish →

The technology behind the talk

EchoDepth is the emotional AI platform built by Cavefish. It uses 44 facial Action Units from FACS — the Facial Action Coding System, the scientific standard for coding facial movement (Ekman & Friesen, 1978) — to score Valence, Arousal and Dominance in real time during any browser-based session.

No hardware. No facility. No specialist operator. Participants join on their own device. EchoDepth captures the involuntary physiological response at the moment of stimulus exposure — before any verbal or written response has formed — and translates it into structured, actionable insight.
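To make the pipeline concrete, here is a minimal sketch of the general shape of an Action-Unit-to-VAD scoring step: per-frame AU intensities are collapsed into Valence, Arousal and Dominance estimates via a weighted mapping. The specific AUs, weights, and function names below are hypothetical illustrations only — EchoDepth's actual model is proprietary and is not described by this code.

```python
from dataclasses import dataclass


@dataclass
class VAD:
    """One Valence/Arousal/Dominance estimate."""
    valence: float
    arousal: float
    dominance: float


# Hypothetical per-AU contributions (NOT EchoDepth's real coefficients).
# E.g. AU12 (lip corner puller, a smile component) is given positive valence;
# AU4 (brow lowerer, a frown component) is given negative valence.
AU_TO_VAD_WEIGHTS = {
    "AU04": VAD(-0.6, 0.3, 0.2),    # brow lowerer
    "AU05": VAD(0.0, 0.7, -0.1),    # upper lid raiser
    "AU12": VAD(0.8, 0.2, 0.3),     # lip corner puller
    "AU15": VAD(-0.5, -0.2, -0.4),  # lip corner depressor
}


def score_frame(au_intensities: dict[str, float]) -> VAD:
    """Collapse one frame of AU intensities (0.0-1.0) into a VAD estimate."""
    v = a = d = 0.0
    for au, intensity in au_intensities.items():
        w = AU_TO_VAD_WEIGHTS.get(au)
        if w is None:
            continue  # AUs without an illustrative weight contribute nothing
        v += w.valence * intensity
        a += w.arousal * intensity
        d += w.dominance * intensity
    return VAD(v, a, d)


# Example: a strong smile with a mild lid raise yields high valence,
# moderate arousal.
frame = {"AU12": 0.9, "AU05": 0.4}
estimate = score_frame(frame)
```

In a real pipeline this per-frame step would feed a time series, so that the 200–500ms window around stimulus onset can be isolated from later, rationalised expression.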

Deployed across pharma research, FMCG consumer insights, culture measurement, and — as this session demonstrates — human performance and training contexts.

At ITEC 2026?

Let's meet.

If you're at ExCeL on 14–16 April and want to talk about how emotional AI applies to your training, research, or people programmes — get in touch before the event and we'll find time.

Get in touch

Methodology

Why conventional research methods miss the signal

The structural problems with asking people how they feel — and why physiological measurement changes the picture.

Platform

What is emotional AI?

FACS, VAD scoring, and the science behind measuring genuine emotional response.

Publication

The Emotional Decision Intelligence Brief

Jonathan Prescott's guide to predicting emotional risk in high-stakes enterprise decisions.

EchoDepth Insight

See what EchoDepth measures that your current research cannot.

Book a 30-minute discovery call. We will show you what the emotional AI layer surfaces on your specific challenge — training, research, culture, or concept testing.