
Entwine: Cognitive Assessment in VR

Image of Nelson playing a VR Maze game while wearing Galea, a multimodal physiological recording tool. Streams of the electroencephalogram (EEG), electrodermal, and heart rate signals are shown on the right.

Project Goals

Little work has been done to evaluate how BCI-driven game mechanics affect players during gameplay, or to extend existing game mechanics to be more accessible to people with disabilities. Project Entwine’s main goal is to explore how BCI technologies might shape the VR experience and how, by using BCI paradigms, existing game mechanics can be adapted to reach a more neurodiverse population. We aim to better understand what input mechanisms are available to the player and how they can be integrated with VR to reach a larger audience while prioritizing usability and comfort during VR experiences. We propose to run studies that support (1) standardized data collection and feature extraction from a wide range of sensors, and (2) real-time adaptation of AR/VR content based on the user’s current mental and affective state. The platform will be scalable and adaptable to a broad range of applications.

Contributions and Responsibilities

  • Help design a streamlined data pipeline for the Affective 360 Video Experiment, connecting Unity to BrainFlow, LSL, and NeuroPype’s artifact removal.
  • Co-design the Tetris experiment, assessing the appropriate timing of cognitive-load and relief tasks and developing post-processing of the multimodal data.
  • Document and integrate the signal-processing codebase for studying P300 visually evoked potentials, the SSVEP Maze game, and multifocal visually evoked potentials.
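To illustrate the kind of signal processing behind an SSVEP-controlled game: each on-screen target flickers at a distinct frequency, and the EEG over visual cortex shows elevated power at the frequency of the target the player is looking at. The sketch below is a minimal, hypothetical detector using a simple FFT power comparison on synthetic data; the function name, parameters, and demo values are illustrative and not taken from the project's actual codebase (which may well use richer methods such as canonical correlation analysis).

```python
import numpy as np

def detect_ssvep_target(eeg, fs, candidate_freqs):
    """Pick the stimulus frequency with the most spectral power in the EEG.

    eeg: 1-D array of samples from an occipital channel (illustrative input).
    fs: sampling rate in Hz.
    candidate_freqs: flicker frequencies of the on-screen targets.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the frequency bin nearest each candidate flicker rate.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic demo: a 12 Hz oscillation buried in noise.
fs = 250
t = np.arange(fs * 4) / fs  # 4 seconds of data
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

In a live pipeline, the raw stream would arrive via BrainFlow or LSL and pass through artifact removal before a detector like this maps the chosen frequency back to an in-game action.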