LUCID MIR.AI|未来

NEURAL_SYNCHRONY

THE MACHINE WATCHES. YOU WATCH IT WATCHING.

You are about to enter a feedback loop. This interface maps your face to a 468-landmark mesh and decodes your micro-expressions with the Facial Action Coding System (FACS), reading you while you read your own data in real time.

Neural Synchrony begins when the wetware (your 43 facial muscles) and the software (this machine) achieve resonance. The question is not whether you can fool it. The question is: what does it reveal about you? Edge AI. Local processing only. No biometrics stored or transmitted.

OFFLINE
SCANNING CONFIDENCE: 0.0%
AFFECTIVE SPECTRUM
ATTENTION (GAZE + APERTURE) 0%
JOY (AU6 + AU12) 0%
SADNESS (AU1 + AU15) 0%
ANGER (AU4 + AU7) 0%
FEAR (AU1 + AU2 + AU20) 0%
DISGUST (AU9 + AU10) 0%
MORPHOMETRICS (RAW)
SMILE_VEC 0.000
BROW_DIST (AU4) 0.000
INNER_BROW (AU1) 0.000
EYE_OPEN (AU5) 0.000
MOUTH_WIDTH (AU20) 0.000
SENSITIVITY MATRIX
JOY_BIAS 1.0
SAD_BIAS 1.0
ANGER_BIAS 1.0
RESEARCH_PROTOCOL
1. THE PREMISE

The human face is a high-bandwidth data transmission surface. This engine measures Resonance (did the biology react?) rather than Exposure (did the pixel load?).

2. PRIVACY CONSENT

All processing occurs locally in your browser. No video/images are transmitted to any server. MediaPipe FaceMesh runs on-device. Camera access requires explicit user consent via browser permission dialog.
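
A minimal sketch of that consent boundary, assuming nothing beyond standard browser APIs: navigator.mediaDevices.getUserMedia is the call that raises the permission dialog, and the resulting MediaStream never leaves the tab. The helper names are hypothetical, not this engine's code.

  // Consent sketch: getUserMedia raises the browser permission dialog;
  // the stream lives only in this tab and is never serialized or sent.
  // Helper names (openLocalCamera/closeLocalCamera) are illustrative.
  async function openLocalCamera(video: HTMLVideoElement): Promise<void> {
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { width: 640, height: 480 },
      audio: false,
    });
    video.srcObject = stream; // frames stay inside the media pipeline
    await video.play();
  }

  // Revocation is symmetric: stopping the tracks ends capture instantly.
  function closeLocalCamera(video: HTMLVideoElement): void {
    const stream = video.srcObject as MediaStream | null;
    stream?.getTracks().forEach((track) => track.stop());
    video.srcObject = null;
  }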

3. CALIBRATION

Position face within frame. Neutral lighting preferred. System loads neural network for 468-landmark mesh detection. First frames establish baseline geometry.
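
A sketch of one plausible baseline pass. The 30-frame window (roughly one second of capture) and the class itself are illustrative assumptions, not the engine's published calibration:

  // Baseline sketch: average the first CALIBRATION_FRAMES of each raw
  // metric so later frames can be read as deltas from neutral geometry.
  // The 30-frame window is an assumption, not a documented constant.
  const CALIBRATION_FRAMES = 30;

  class Baseline {
    private sums = new Map<string, number>();
    private frames = 0;

    feed(metrics: Record<string, number>): void {
      if (this.frames >= CALIBRATION_FRAMES) return;
      for (const [key, value] of Object.entries(metrics)) {
        this.sums.set(key, (this.sums.get(key) ?? 0) + value);
      }
      this.frames++;
    }

    get ready(): boolean {
      return this.frames >= CALIBRATION_FRAMES;
    }

    neutral(key: string): number {
      return (this.sums.get(key) ?? 0) / Math.max(this.frames, 1);
    }
  }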

4. LIVE CAPTURE

30fps analysis extracts: lip corner positions (smile vector), eye aperture (attention), brow distance (confusion/focus). Raw landmark data rendered as wireframe overlay.
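
A geometry sketch for those three raw channels over the 468-point mesh. The landmark indices (61/291 mouth corners, 159/145 left eyelids, 234/454 face sides, 55/285 inner brows) are commonly used FaceMesh points, cited here as assumptions to verify against the topology map:

  type Pt = { x: number; y: number; z: number };

  const dist = (a: Pt, b: Pt): number => Math.hypot(a.x - b.x, a.y - b.y);

  // Normalize by face width so metrics survive distance-to-camera changes.
  function rawMetrics(lm: Pt[]) {
    const faceWidth = dist(lm[234], lm[454]);       // cheek-to-cheek span
    return {
      SMILE_VEC: dist(lm[61], lm[291]) / faceWidth, // mouth-corner spread
      EYE_OPEN: dist(lm[159], lm[145]) / faceWidth, // left eye aperture (AU5)
      BROW_DIST: dist(lm[55], lm[285]) / faceWidth, // inter-brow distance (AU4)
    };
  }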

5. FACS INTERPRETATION

Ekman's Facial Action Coding System maps muscle movements to Action Units (AUs); a scoring sketch follows this list:

  • JOY: AU6 + AU12 (Duchenne Marker = genuine)
  • SADNESS: AU1 + AU15 (inner brow raise + lip corner depress)
  • FEAR: AU5 + AU20 (lid raise + lip stretch)
  • ANGER: AU4 + AU7 (brow lower + lid tense)
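
One plausible way to fold those pairs into scores, scaled by the SENSITIVITY MATRIX biases from the panel above. The equal weights and clamping are assumptions of this sketch, not the engine's tuned values:

  interface AUs {
    au1: number; au4: number; au5: number; au6: number;
    au7: number; au12: number; au15: number; au20: number;
  }

  // Equal-weight averages per AU pair are illustrative placeholders.
  function emotions(au: AUs, bias = { JOY_BIAS: 1.0, SAD_BIAS: 1.0, ANGER_BIAS: 1.0 }) {
    const clamp = (v: number): number => Math.min(1, Math.max(0, v));
    return {
      JOY: clamp((au.au6 + au.au12) / 2) * bias.JOY_BIAS,
      SADNESS: clamp((au.au1 + au.au15) / 2) * bias.SAD_BIAS,
      FEAR: clamp((au.au5 + au.au20) / 2),
      ANGER: clamp((au.au4 + au.au7) / 2) * bias.ANGER_BIAS,
    };
  }
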
6. DATA HANDLING

Session data is ephemeral. Page reload clears all computed metrics. No cookies, no storage, no analytics. The machine forgets the moment you leave.

7. ADTECH APPLICATION

This technology enables a shift from impression-based to emotion-based ad measurement:

  • Pre-roll attention verification
  • Creative A/B testing via emotional response
  • Engagement scoring beyond click-through

FULL_METHODOLOGY
01. THE PREMISE

The human face is a high-bandwidth data transmission surface. Historically, advertising measured Exposure (did the pixel load?). This engine measures Resonance (did the biology react?).

By synthesizing Wetware (43 facial muscles) with Software (Computer Vision), we convert ephemeral human emotion into hard, programmatic data structures.
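
As an illustration of what "hard, programmatic data structures" can mean here, a hypothetical per-frame record whose field names mirror the live panel (not a wire format, since nothing leaves the device):

  // Illustrative shape only; field names mirror the panel above.
  interface ResonanceFrame {
    t: number;          // ms since session start
    confidence: number; // mesh-tracking confidence, 0..1
    attention: number;  // gaze + aperture composite, 0..1
    affect: {
      joy: number; sadness: number; anger: number;
      fear: number; disgust: number; // FACS-derived scores, 0..1
    };
  }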

02. SIGNAL DECOMPOSITION (FACS)

We do not guess emotions. We track Action Units (AUs) based on Ekman's Facial Action Coding System.

JOY (The Duchenne Marker)
AU6 (Orbicularis Oculi) + AU12 (Zygomaticus Major)
True happiness involves the eyes. A fake smile only engages the mouth. We track the Eye Aperture Vector to validate authenticity.
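
A minimal authenticity check along those lines; the 15% thresholds against the calibrated neutral values are illustrative, not validated cutoffs:

  // Duchenne sketch: genuine joy needs the eyes (AU6 narrows the
  // aperture) as well as the mouth (AU12 spreads the corners).
  // Threshold factors 1.15 / 0.85 are assumptions of this sketch.
  function isDuchenne(smileVec: number, eyeOpen: number,
                      neutralSmile: number, neutralEye: number): boolean {
    const mouthEngaged = smileVec > neutralSmile * 1.15; // AU12 engaged
    const eyesEngaged = eyeOpen < neutralEye * 0.85;     // AU6 engaged
    return mouthEngaged && eyesEngaged;                  // a fake smile fails here
  }
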
SADNESS (The Hardest Metric)
AU1 (Inner Brow) + AU15 (Lip Depressor)
Evolutionarily designed to solicit help. The key signal is the "Omega" sign on the forehead—where the Inner Brow raises higher than the Outer Brow.
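
A sketch of that inner-versus-outer comparison, taking the two brow heights as already-extracted y coordinates; the 20% margin over the neutral gap is an assumption:

  // Omega-sign sketch: image-space y grows downward, so a positive
  // (outerBrowY - innerBrowY) gap means the inner brow sits higher.
  function omegaSign(innerBrowY: number, outerBrowY: number,
                     neutralGap: number): boolean {
    const gap = outerBrowY - innerBrowY;
    return gap > neutralGap * 1.2; // AU1 dominating: hypothetical 20% margin
  }
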
FEAR vs. SURPRISE
The Upper Lid Vector (AU5)
Both widen the eyes to intake data. The difference lies in tension:
Surprise: AU26 (Jaw Drop) - Passive intake.
Fear: AU20 (Lip Stretcher) - Horizontal tension preparing for threat.
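
A toy discriminator along those lines; every ratio below is an illustrative threshold, not a calibrated value:

  type WideEyeState = "SURPRISE" | "FEAR" | "NEUTRAL";

  // Both states need AU5 (raised upper lid); AU20 vs AU26 splits them.
  function classifyWideEyes(
    eyeOpen: number, jawDrop: number, mouthWidth: number,
    neutral: { eye: number; jaw: number; mouth: number },
  ): WideEyeState {
    if (eyeOpen < neutral.eye * 1.2) return "NEUTRAL";    // AU5 not engaged
    if (mouthWidth > neutral.mouth * 1.15) return "FEAR"; // AU20: horizontal tension
    if (jawDrop > neutral.jaw * 1.3) return "SURPRISE";   // AU26: passive drop
    return "NEUTRAL";
  }
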
ANGER (Focus/Aggression)
AU4 (Brow Lowerer) + AU7 (Lid Tightener)
Characterized by Compression. The glabella distance shrinks, eyelids tighten. In AdTech, mild anger signals often correlate with High Focus/Intent.
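
A compression score in that spirit, reusing the calibrated neutrals and the ANGER_BIAS knob from the sensitivity matrix; the normalization is an assumption of this sketch:

  // Anger-as-compression sketch: both terms rise as geometry shrinks
  // below its calibrated neutral. Averaging the two AUs is illustrative.
  function angerScore(browDist: number, eyeOpen: number,
                      neutral: { brow: number; eye: number },
                      angerBias = 1.0): number {
    const au4 = Math.max(0, 1 - browDist / neutral.brow); // glabella shrinks
    const au7 = Math.max(0, 1 - eyeOpen / neutral.eye);   // lids tighten
    return Math.min(1, (au4 + au7) / 2) * angerBias;
  }
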
03. THE ADTECH APPLICATION
  • Creative Optimization: A/B testing assets based on Joy/Surprise spikes rather than clicks.
  • Contextual Safety: Detecting Disgust/Anger to avoid placing brands in emotionally toxic environments.
  • Attention Economics: Moving beyond "Viewability" to "Cognitive Load"—measuring how hard the brain is processing the visual field.