BCI Adaptive Horror Game
Real-time EEG “fear index” dynamically tunes horror intensity.
Demo Video
This prototype reads real-time EEG from a Unicorn headset to compute a fear index. The index modulates darkness, audio tension, enemy behavior, and jumpscare intensity so the experience “breathes” with the player’s state.
I led game design and Unity implementation: EEG ingestion, parameter mapping, event system for encounters, and visual effects (vignette, lens distortion).
Game Design
You play an archaeologist exploring a sentient cave. The cave reacts to emotions: the more afraid you are, the harsher it becomes. Core loop: explore → encounter → survive → branch via calm vs. panic.
- Environment: global illumination and ambience scale with fear index.
- Jumpscares: short (<200 ms) spikes tuned by the index (cooldown + intensity); a gating sketch follows this list.
- Enemy & AI: perception radius, chase desire, and spatial audio (footsteps/breathing) adapt in real time.
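As one illustration of the jumpscare gating, here is a minimal Unity sketch of a fear-tuned trigger. The class, fields, and the exact cooldown/intensity mapping are assumptions for illustration, not the project's actual code.

```csharp
using UnityEngine;

// Illustrative sketch: jumpscare gating driven by the 0..1 fear index.
// Higher fear -> shorter cooldown and a louder, harsher spike.
public class JumpscareController : MonoBehaviour
{
    public AudioSource stinger;          // short (<200 ms) audio spike
    public float baseCooldown = 30f;     // seconds between scares at zero fear
    public float minCooldown = 8f;       // floor so scares never spam

    private float nextAllowedTime;

    // Called by an encounter trigger; fearIndex is the 0..1 EEG value.
    public void TryTrigger(float fearIndex)
    {
        if (Time.time < nextAllowedTime) return;

        float cooldown = Mathf.Lerp(baseCooldown, minCooldown, fearIndex);
        stinger.volume = Mathf.Lerp(0.4f, 1f, fearIndex);
        stinger.Play();

        nextAllowedTime = Time.time + cooldown;
    }
}
```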
EEG Pipeline
We compute the PSD in the alpha and beta bands, derive relative power vs. baseline (z-score), normalize it to 0–1, and feed the value into a parameter-mapping layer in Unity (see the sketch after the list below).
- Device: Unicorn Hybrid Black (8 electrodes)
- Runtime: C# processing library, standalone or Unity package
- API: exposes a float fearIndex value that systems subscribe to (curve mapping)
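A minimal sketch of the normalization step, assuming per-window alpha/beta band powers are already provided by the processing library. The beta/alpha ratio feature and the sigmoid squash to 0–1 are illustrative choices, and the class and field names are hypothetical, not the library's actual API.

```csharp
using System;

// Hypothetical sketch: turn per-window band powers into a 0..1 fear index,
// relative to baseline statistics collected during a calibration phase.
public sealed class FearIndexEstimator
{
    private readonly float baselineMean;
    private readonly float baselineStd;

    public FearIndexEstimator(float baselineMean, float baselineStd)
    {
        this.baselineMean = baselineMean;
        this.baselineStd = Math.Max(baselineStd, 1e-6f);
    }

    // alphaPower / betaPower: band power for the current EEG window.
    public float Compute(float alphaPower, float betaPower)
    {
        float ratio = betaPower / Math.Max(alphaPower, 1e-6f);   // arousal proxy (assumed feature)
        float z = (ratio - baselineMean) / baselineStd;          // relative power vs. baseline
        return 1f / (1f + (float)Math.Exp(-z));                  // squash z-score into 0..1
    }
}
```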
Implementation
Built in Unity. We used free Asset Store packs to quickly prototype the level, enemy AI, and controller. A curve-driven parameter layer (ScriptableObjects + events) lets design tweak responses without code; see the sketch after the list below.
- Stack: Unity (URP), C#, ScriptableObjects, Post Processing
- Audio: distance-aware footsteps/breathing, fear-driven ambience
- FX: red vignette, subtle DOF wobble, film grain, mild distortion
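Below is a minimal sketch of the curve-driven parameter layer, assuming a ScriptableObject that holds a designer-editable AnimationCurve and a consumer that reads the shared fear index. Asset, class, and field names are illustrative, and the two classes would live in separate files in a real Unity project.

```csharp
using UnityEngine;

// Designers author a response curve as an asset and tweak it without code.
[CreateAssetMenu(menuName = "Horror/FearResponseCurve")]
public class FearResponseCurve : ScriptableObject
{
    [Tooltip("Maps fear index (0..1) to a game parameter (0..1).")]
    public AnimationCurve response = AnimationCurve.Linear(0f, 0f, 1f, 1f);

    public float Evaluate(float fearIndex) => response.Evaluate(Mathf.Clamp01(fearIndex));
}

// Example consumer: darkens ambient light as fear rises.
public class FearDrivenAmbience : MonoBehaviour
{
    public FearResponseCurve curve;
    [Range(0f, 1f)] public float fearIndex;   // fed by the EEG layer each frame

    void Update()
    {
        RenderSettings.ambientIntensity = 1f - curve.Evaluate(fearIndex);
    }
}
```

The same pattern drives the other systems (vignette, audio tension, enemy perception): each consumer holds its own curve asset, so tuning stays in the editor rather than in code.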
Gallery

