The measurement gap

The difference between footfall and genuine engagement is worth knowing.

The exhibition intelligence gap: UK businesses spend over £2bn annually on exhibition and events marketing (AEO research). The primary metric used to evaluate stand performance is footfall — a measure of proximity, not engagement. What visitors actually felt about what they saw remains, in almost every case, entirely unmeasured.

Every events team has experienced the same problem: high footfall, warm conversations, a stand that "felt busy" — and leads that converted poorly. Or the opposite: a quieter stand with visitors who stopped longer, asked better questions, and became the strongest pipeline.

The difference between those two outcomes is not the number of people who passed the stand. It is the emotional response of the people who stopped. Genuine curiosity is physiologically distinct from polite acknowledgement. Authentic interest in a product demonstration looks different from the social performance of interest. Facial Action Coding System (FACS) analysis captures that distinction at the moment it occurs — before any verbal response, card swipe, or post-event survey.

EchoDepth measures VAD (Valence, Arousal, Dominance) continuously from facial Action Unit analysis across every visitor interaction. The result is the first accurate picture of what your exhibition investment actually produced emotionally — and which elements of your stand, content, or product actually moved people.

Traditional event measurement — what you get

Footfall counts — measure proximity, not interest; pass-by traffic inflates numbers

Exit surveys — recall bias; visitors describe experiences more positively than they felt them in the moment

Lead card volumes — a measure of willingness to engage, not emotional resonance with your product

Staff impressions — anecdotal, inconsistent across a busy stand, and unavailable for most interactions

Post-show pipeline — outcome metric only; tells you nothing about what created or destroyed the opportunity

EchoDepth — what you get instead

First-exposure signal — what visitors felt in the first 500ms before any social performance began

Genuine vs polite interest — VAD scoring separates authentic engagement from courtesy attention

Demo moment analysis — which product features or content moments drove emotional peaks or drops

Time-of-day patterns — when visitor emotional engagement peaks across the event day

Confusion and friction signals — where visitors showed cognitive load or disengagement during demonstrations

Where EchoDepth Insight applies

Four events intelligence applications.

From stand-level engagement to conference audience analysis — each producing a signal that no existing events measurement tool can reach.

01 — Exhibition stands

Stand effectiveness measurement

Camera-based EchoDepth analysis at your stand captures visitor emotional response from first approach through to departure. Identifies which stand elements, product displays, or digital content generate genuine interest — and which generate polite dwell. Outputs: emotional engagement map by stand zone, product response curves, and staff interaction quality signals.

Typical events: Industry trade shows, B2B exhibitions, product launches, tech expos
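The zone-level engagement map could be assembled from raw VAD samples along these lines. This is a minimal sketch: the `VadSample` shape, the field ranges, and the `zone_engagement_map` helper are illustrative assumptions, not EchoDepth's actual output schema.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

# Hypothetical sample shape -- the real capture output may differ.
@dataclass
class VadSample:
    zone: str        # stand zone identifier, e.g. "demo-pod-1"
    valence: float   # assumed range: -1.0 (negative) .. +1.0 (positive)
    arousal: float   # assumed range:  0.0 (calm)     ..  1.0 (activated)

def zone_engagement_map(samples: list[VadSample]) -> dict[str, dict[str, float]]:
    """Aggregate per-visitor VAD samples into mean valence/arousal per zone."""
    by_zone: dict[str, list[VadSample]] = defaultdict(list)
    for s in samples:
        by_zone[s.zone].append(s)
    return {
        zone: {
            "mean_valence": mean(s.valence for s in zs),
            "mean_arousal": mean(s.arousal for s in zs),
            "samples": float(len(zs)),
        }
        for zone, zs in by_zone.items()
    }
```

A zone with high sample count but low mean arousal is the "polite dwell" pattern described above; high mean valence and arousal with fewer samples is the genuine-interest pattern.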

02 — Product demonstrations

Demo moment analysis

During product demonstrations, EchoDepth captures second-by-second emotional response curves — showing exactly which features landed with genuine interest, which triggered confusion or concern, and where attention dropped. Across multiple demonstration sessions, patterns emerge: the moments that consistently engage versus the moments that consistently lose the audience.

Typical events: Product launches, hands-on demos, interactive exhibits, roadshows
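A second-by-second response curve reduces to "moments" by flagging sharp rises and falls in smoothed valence. The sketch below is an assumption-laden illustration, not EchoDepth's detection logic: it assumes one valence sample per second, and the smoothing window and threshold values are invented.

```python
def find_moments(valence: list[float], window: int = 3, threshold: float = 0.2):
    """Flag seconds where smoothed valence rises ("peak") or falls ("drop") sharply.

    Illustrative only: window and threshold are made-up parameters.
    Returns a list of (second, kind) tuples.
    """
    # Trailing moving average to suppress frame-level noise.
    smoothed = []
    for i in range(len(valence)):
        chunk = valence[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    moments = []
    for t in range(1, len(smoothed)):
        delta = smoothed[t] - smoothed[t - 1]
        if delta >= threshold:
            moments.append((t, "peak"))
        elif delta <= -threshold:
            moments.append((t, "drop"))
    return moments
```

Run across multiple demo sessions, the timestamps that recur as peaks or drops are the consistently-engaging and consistently-losing moments the section describes.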

03 — Conference presentations

Audience response intelligence

EchoDepth analyses audience facial response during conference sessions, keynotes, and panel discussions — producing aggregate emotional response curves across the presentation. Identifies which statements, data claims, or narrative moments generate genuine resonance versus passive attendance. Pre/post comparison shows whether the session shifted audience emotional disposition toward or away from the presenting organisation.

Typical events: Industry conferences, investor days, results presentations, analyst briefings

04 — Consumer events

Visitor experience optimisation

For consumer-facing events — brand activations, pop-up experiences, retail exhibitions — EchoDepth measures the emotional journey of visitors across the event space, identifying peak engagement zones, friction moments, and the experiences most likely to generate genuine brand affinity versus Instagram-worthy but emotionally shallow interactions. Outputs feed directly into event redesign and future activation planning.

Typical events: Brand activations, consumer expos, retail pop-ups, experiential marketing

The emotional signal

What 44 Action Units reveal about a visitor that no survey can.

Genuine interest

AU6+AU12 Duchenne response with elevated arousal. The real signal — distinct from the polite smile (AU12 without AU6) that visitors produce socially.

Suppressed confusion

AU4 brow compression with moderate arousal and valence drop. Visitors rarely say they're confused — they smile and move on. EchoDepth captures it before the conversation ends.

Polite engagement

Moderate arousal, neutral valence, low dominance. The most common state at most stands. The visitor is present but not moved — and will not remember the interaction 48 hours later.

Purchase intent signal

High valence, high dominance, sustained arousal. The emotional state associated with genuine desire — the visitor who will follow up, not just take the brochure.

Concern or scepticism

AU1+AU2 brow raise with valence drop. Often suppressed during conversations. The visitor who seems engaged but is developing a material objection they may never voice.

Disengagement point

Arousal collapse, attention-narrowing AUs. The exact moment in a demonstration or presentation when a visitor mentally left — even while physically remaining. Consistently identifying this moment across sessions drives meaningful improvement.
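The six signatures above can be read as an ordered rule set over AU activations and VAD scores. This is a minimal sketch under assumed thresholds and an assumed AU encoding, not the production model:

```python
def classify_state(aus: set[str], valence: float, arousal: float, dominance: float) -> str:
    """Map an AU set plus VAD scores to one of the engagement states described above.

    Rule order and all thresholds are illustrative assumptions.
    AU codes: AU1/AU2 brow raise, AU4 brow lowerer, AU6 cheek raiser, AU12 lip-corner pull.
    VAD assumed in valence -1..1, arousal 0..1, dominance 0..1.
    """
    if "AU6" in aus and "AU12" in aus and arousal > 0.5:
        return "genuine interest"          # Duchenne response with elevated arousal
    if "AU4" in aus and valence < 0 and 0.3 <= arousal <= 0.6:
        return "suppressed confusion"      # brow compression, moderate arousal, valence drop
    if "AU1" in aus and "AU2" in aus and valence < 0:
        return "concern or scepticism"     # brow raise with valence drop
    if valence > 0.5 and dominance > 0.5 and arousal > 0.5:
        return "purchase intent signal"    # high valence, high dominance, sustained arousal
    if arousal < 0.2:
        return "disengagement"             # arousal collapse
    return "polite engagement"             # present but not moved
```

Note the fallback: AU12 without AU6 carries no special weight here, so the social smile lands in "polite engagement", mirroring the distinction drawn under "Genuine interest".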

Deployment

How it works at your event.

1 — Pre-event configuration

Camera zones are defined for your stand or venue. No specialist hardware required — standard camera-equipped devices run the EchoDepth capture layer. Consent signage and DPIA documentation are provided as standard.

2 — Live capture at the event

EchoDepth analyses 44 facial Action Units continuously across visitor interactions. Your stand team operates normally — the system runs in background. Optional live dashboard shows aggregate engagement signals in real time for stand management.

3 — Intra-day optimisation

Aggregate emotional engagement signals from morning sessions inform afternoon stand adjustments — content sequencing, demo flow, stand layout priorities. The first event day produces actionable intelligence for the second.

4 — Post-event report

Structured report covering: stand zone engagement heatmap, product demonstration response curves, best and worst engagement moments, time-of-day patterns, and comparative benchmarks if multiple events are tracked. Actionable recommendations for the next event cycle.

Common questions

Events intelligence — your questions answered.

Do visitors need to consent to being analysed?

Yes. Clear consent signage at the point of entry to the capture zone is required, informing visitors that emotional AI analysis is in use. EchoDepth provides standard consent signage templates and DPIA documentation for all event deployments. Where events operate under existing photography or recording consent frameworks, these can be extended to cover EchoDepth analysis with appropriate additions. EchoDepth does not store raw video or build persistent visitor profiles — it produces aggregate emotional intelligence data.

What hardware is needed at the stand?

Standard camera-equipped laptops, tablets, or dedicated webcam setups are sufficient for EchoDepth capture. The system is designed for exhibition environments — no specialist biometric hardware, no intrusive scanning equipment. Multiple camera zones can be run from a central device for larger stands. EchoDepth operates over standard Wi-Fi; 4G/5G backup is recommended for exhibition venues with variable connectivity.

Can EchoDepth be used at international trade shows?

Yes. EchoDepth is deployable at any event globally. Data processing is GDPR-compliant by design; for events in non-UK/EU jurisdictions, specific data transfer documentation is provided. FACS-based facial Action Unit analysis is culturally validated across populations — the involuntary physiological expression patterns that EchoDepth measures are consistent across ethnicities and cultural backgrounds (Ekman's cross-cultural FACS validation research). Consent signage is available in multiple languages.

How is this different from NPS or post-show surveys?

Post-show surveys and NPS measure recall, not real-time experience. By the time a visitor completes a survey, their emotional memory has been moderated by fatigue, social desirability, and the overall tone of the event — not the specific interaction with your stand. EchoDepth captures the genuine involuntary emotional response at the moment of the interaction, before any social moderation. The two approaches are complementary: EchoDepth data explains what caused the NPS scores you observe.

What does an events intelligence engagement look like commercially?

EchoDepth events intelligence is available as a per-event engagement or as an annual events programme. A per-event deployment covers pre-event configuration, on-site capture support, and post-event report delivery. Annual programmes include cross-event benchmarking, trend analysis across your event calendar, and dedicated account support. Contact us for pricing based on event size, number of stand zones, and required reporting depth.

Know what your next event actually produced.

Talk to us about deploying EchoDepth Insight at your next trade show, conference, or brand activation.

Book a discovery call

How the platform works