AR/VR/XR and Brain Sensing
Last page edit: April 8, 2024.
This project investigates the use of brain-sensing electrodes (EEG) as a novel modality to enhance XR interaction. The project has several subcategories, such as measuring the user's attention or their mental imagery process.
In this project we propose a prototype that combines an existing AR headset, the Microsoft HoloLens 2, with a Brain-Computer Interface (BCI) system based on our AttentivU project, and we perform several tasks to validate this concept.
Application 1. Assessing Internal and External Attention in AR using Brain Computer Interfaces.
Most research combining AR and Brain-Computer Interface (BCI) systems does not take advantage of opportunities to integrate the two data streams. Additionally, AR devices that use a Head-Mounted Display (HMD) face one major problem: the constant proximity of the screen makes it hard to avoid distractions within the virtual environment. In this first application, we propose to reduce this distraction by incorporating information about the user's current attentional state. We first introduce a clip-on solution for AR-BCI integration. A simple game was designed for the Microsoft HoloLens 2 that changed in real time according to the user's state of attention, measured via electroencephalography (EEG). The system only responded if the attentional orientation was classified as "external." Fourteen users tested the attention-aware system; we show that this augmentation of the interface improved the system's usability. We conclude that more systems would benefit from clearly visualizing the user's ongoing attentional state, as well as from tighter integration of AR and BCI headsets.
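The core interaction loop described above can be sketched as follows. This is an illustrative sketch only, not the project's actual implementation: the function names, the sliding-window smoothing, and the 0.5 threshold are assumptions; in the real system the internal/external label comes from an EEG classifier on the AttentivU device.

```python
from collections import deque

def classify_attention(score, threshold=0.5):
    """Map a classifier score in [0, 1] to an attentional-orientation label.

    (Hypothetical: the real system uses an EEG-based classifier.)
    """
    return "external" if score >= threshold else "internal"

def smooth_labels(scores, window=3, threshold=0.5):
    """Majority-vote over a sliding window to suppress spurious label flips."""
    labels, recent = [], deque(maxlen=window)
    for s in scores:
        recent.append(classify_attention(s, threshold))
        labels.append(max(set(recent), key=list(recent).count))
    return labels

def gate_events(events, labels):
    """Deliver a game event only when attention is classified as external."""
    return [e for e, lab in zip(events, labels) if lab == "external"]
```

For example, `gate_events(["hit", "miss"], ["external", "internal"])` passes through only the event that coincided with externally oriented attention, matching the behavior described above where the system responds only in the "external" state.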
Application 2. Are You Still Watching?
In 2023/2024 we replicated the work from Application 1, this time using a stand-alone AR lens called Monocle.
Check the paper for more details.
Application 3. A Pilot Study using Covert Visuospatial Attention as an EEG-based Brain Computer Interface to Enhance AR Interaction.
In the third application we investigated the feasibility of a BCI based on covert visuospatial attention (CVSA), the process of focusing attention on different regions of the visual field without overt eye movements. The system operated without relying on any stimulus-driven responses.
The proof-of-concept presented in this application opens up interesting possible applications of AR EEG-BCIs that use CVSA. The inherent gaze independence of CVSA makes it a potential alternative for patients who cannot produce overt eye movements. Its intuitiveness (a natural shift of attention toward regions or objects of interest in the visual field) makes it a possible candidate for BCI-driven navigation devices, such as wheelchairs or robots, as well as for yes-no communication. The absence of external stimulation, as required by ERP/SSVEP paradigms, may make it more suitable for use over longer periods, as it allows more engaging, comfortable, and direct operation and is better adapted to out-of-lab interaction for different user groups.
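A common EEG signature of CVSA is alpha-band (8-12 Hz) lateralization: attending covertly to one hemifield suppresses alpha power over the contralateral parieto-occipital cortex. The sketch below decodes the attended side from that effect. It is a minimal illustration, not the pipeline used in the study: the sampling rate, channel pairing (e.g. PO7/PO8), and simple FFT periodogram are all assumptions for the example.

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed for this example)

def alpha_power(signal, fs=FS, band=(8.0, 12.0)):
    """Mean power in the alpha band via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def attended_side(left_chan, right_chan):
    """Classify covert attention from the alpha lateralization index.

    left_chan/right_chan: 1-D arrays from left/right parieto-occipital
    electrodes (e.g. PO7/PO8, an assumed montage). Alpha is suppressed
    contralateral to the attended hemifield.
    """
    p_left, p_right = alpha_power(left_chan), alpha_power(right_chan)
    lat = (p_right - p_left) / (p_right + p_left)
    # Positive index: alpha higher over the right hemisphere, i.e. the
    # left hemisphere is suppressed, i.e. attention to the right hemifield.
    return "right" if lat > 0 else "left"
```

In practice a classifier trained per user would replace the fixed sign test, but the lateralization index captures the gaze-independent cue that makes CVSA attractive for the use cases listed above.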
Application 4. A Pilot Study using Apple Vision Pro as an EEG-based Brain Computer Interface to Enhance AR Interaction.
In the fourth application we enhance the Apple Vision Pro headset with brain-sensing capabilities via electroencephalography (EEG) electrodes.
Research and Publications
Nataliya Kosmyna, Chi-Yun Hu, Yujie Wang, Qiuxuan Wu, Cassandra Scheirer, and Pattie Maes. Assessing Internal and External Attention in AR using Brain Computer Interfaces: A Pilot Study. 2021 International Symposium on Wearable Computers. Association for Computing Machinery, New York, NY, USA, 43-47.
N. Kosmyna, Q. Wu, C.-Y. Hu, Y. Wang, C. Scheirer and P. Maes. 2021 IEEE 17th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2021, pp. 1-6.
Nataliya Kosmyna. AttentivU: A Wearable Pair of EEG and EOG Glasses for Real-Time Physiological Processing. 2020.