REBOOT is a "hands-off" VR/PC prototype for pain research and rehabilitation, built to help address financial barriers in the delivery of VR neurofeedback interventions. Check out the website @ for more info!

✨ About

This project represents the collaborative efforts of a small team, submitted to NeurAlbertaTech's 2021 inaugural natHACKS, a 64-hour hackathon organized around neurotechnology.

Designed as an open-source prototype for use in pain research and rehabilitation, the game is compatible with both Muse and OpenBCI hardware, improving accessibility and addressing some of the financial barriers in the delivery of VR neurofeedback interventions. We hope to expand on these concepts and extend the project's utility to novel psychological research along both therapeutic and foundational dimensions, including investigations into the intersections of physical and mental pain perception.

Play your brains out! 🎉

🧾 Summary (~250 words)

Modern pain-management research is exploring non-pharmacological interventions based on distraction therapy. To further investigate this approach, we developed an attentional neurofeedback application in which in-game mechanics reward concentration. The main application is cross-platform (currently supporting VR, desktop, and web) and contains four modules: a biosignal recorder, mini-games, an analysis module, and a personalized profile.

The biosignal module uses BrainFlow and muse-lsl; tested hardware includes the Muse 2 (no dongle required) and OpenBCI boards. Mini-game scenes were built with the Unity Engine, using Shader Graph for original VFX; Blender and TreeIT were used to model virtual assets and scenarios. For maximum VR-headset compatibility, we used OpenXR and the XR Interaction Toolkit (tested on Oculus Quest 1 and 2). React with TypeScript, Figma, and Unity WebGL provide web support.

The analysis module incorporates BrainFlow's MLModel library and C# math libraries that handle the differing data types of the supported boards; it currently supports two metrics: the alpha/beta power-band ratio and a pre-trained concentration ML model. Feedback is then provided to the user through a UI that signals the level of concentration achieved, and the metrics are analyzed, visualized, and used to drive game mechanics (e.g., levitating in-game objects, modifying their scale, or changing the color and transparency of environmental properties). Performance metrics are saved in Unity via PlayerPrefs, and raw signals are exported to CSV.

We aim to further develop the gamification component with procedurally generated scenarios based on personalized user preferences and Brains@Play integration.
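The alpha/beta power-band ratio mentioned above is computed by BrainFlow's DataFilter in the real pipeline; as a minimal, self-contained sketch of the same idea, the metric can be approximated with a periodogram. Function names, band edges, and the synthetic test signal below are illustrative, not taken from the project's code:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power over the DFT bins in [f_lo, f_hi] Hz.
    A stand-in for the band-power computation BrainFlow performs on raw EEG."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

def alpha_beta_ratio(signal, fs):
    """Alpha (8-12 Hz) to beta (13-30 Hz) power ratio.
    Band edges here are conventional values, not the project's exact settings."""
    return band_power(signal, fs, 8.0, 12.0) / band_power(signal, fs, 13.0, 30.0)

# Demo: a 10 Hz (alpha-band) tone plus weak noise yields a ratio well above 1.
fs = 256                                   # typical EEG sampling rate in Hz
t = np.arange(2 * fs) / fs                 # two seconds of samples
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.1 * rng.standard_normal(t.size)
print(alpha_beta_ratio(eeg, fs) > 1.0)     # → True
```

In the actual application this ratio would be computed per analysis window on data streamed from the board, rather than on a fixed buffer as shown here.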

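The feedback loop that maps the concentration metric to game mechanics runs in C# inside Unity; as a language-neutral sketch in Python, smoothing a noisy 0-1 concentration score and mapping it to one mechanic (levitation height) might look like the following. The class name, smoothing factor, and height scale are all illustrative assumptions:

```python
class ConcentrationMapper:
    """Smooth a raw 0-1 concentration score with an exponential moving
    average and map it to a game parameter (here: levitation height in
    metres). A sketch only; the project's Unity/C# code may differ."""

    def __init__(self, alpha=0.2, max_height=2.0):
        self.alpha = alpha            # EMA smoothing factor (0 < alpha <= 1)
        self.max_height = max_height  # height reached at sustained full focus
        self.smoothed = 0.0

    def update(self, score):
        score = min(max(score, 0.0), 1.0)              # clamp noisy model output
        self.smoothed += self.alpha * (score - self.smoothed)
        return self.smoothed * self.max_height

mapper = ConcentrationMapper()
height = 0.0
for _ in range(200):                  # sustained full concentration...
    height = mapper.update(1.0)
print(round(height, 2))               # → 2.0 (converges to max_height)
```

The smoothing keeps levitating objects from jittering when the per-window metric fluctuates; the same smoothed value could just as easily drive scale, color, or transparency changes.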