Inspiration

Our inspiration for this project was

What it does

Our software takes in a continuous stream of 6-channel, 250 Hz EEG input and passes it to the NeuroLM neural network, which produces attention and engagement scores alongside a 512-dimensional embedding for each window of EEG readings (window length is adjustable from 1 to 10 seconds). The current embedding can then be compared to embeddings previously produced while watching different videos, in effect mapping the current neural signals to other videos that elicited the same neural response. This has many applications, such as recommending videos that maximize a user's attention or engagement on a given topic, which could be useful for crafting personalized study plans that help students optimize learning.

The other part of the project is a small wearable camera that takes snapshots of everyday events when the attention and engagement readings cross a threshold, letting wearers capture, remember, and relive the moments they implicitly found most important in their daily lives.
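The embedding comparison above can be sketched as a cosine-similarity nearest-neighbor lookup. This is an illustrative sketch, not the project's actual code: the names `EMBED_DIM`, `video_embeddings`, and `recommend` are hypothetical stand-ins, and the stored embeddings here are random placeholders for ones NeuroLM would produce.

```python
# Hypothetical sketch: match the current EEG embedding against embeddings
# recorded while watching previous videos, via cosine similarity.
import numpy as np

EMBED_DIM = 512  # NeuroLM embedding size from the description above

# Placeholder store: video id -> embedding captured while watching it.
# In the real system these would come from the NeuroLM network.
rng = np.random.default_rng(0)
video_embeddings: dict[str, np.ndarray] = {
    "lecture_01": rng.standard_normal(EMBED_DIM),
    "lecture_02": rng.standard_normal(EMBED_DIM),
    "cat_video": rng.standard_normal(EMBED_DIM),
}

def recommend(current: np.ndarray, top_k: int = 2) -> list[str]:
    """Return ids of the videos whose stored embeddings are most
    similar (cosine similarity) to the current neural embedding."""
    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = sorted(video_embeddings.items(),
                    key=lambda kv: cos(current, kv[1]), reverse=True)
    return [vid for vid, _ in scored[:top_k]]
```

Querying with an embedding identical to a stored one returns that video first, since cosine similarity of a vector with itself is 1.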

How we built it

  • hardware: OpenBCI Ultracortex EEG headset and Seeed Studio XIAO ESP32S3 camera board
  • software: NeuroLM, (put DB and fullstack software here)
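The streaming side of the pipeline, buffering 6-channel, 250 Hz samples into a fixed window and firing the camera when attention crosses a threshold, can be sketched as below. Assumptions are labeled in the comments: `score_window` stands in for the NeuroLM forward pass, and `take_snapshot` stands in for signaling the XIAO board; neither is the project's real API.

```python
# Hypothetical sketch of the EEG-windowing / snapshot-trigger loop.
from collections import deque

SAMPLE_RATE_HZ = 250
N_CHANNELS = 6
WINDOW_SECONDS = 5                 # adjustable from 1 to 10 s
WINDOW_SAMPLES = SAMPLE_RATE_HZ * WINDOW_SECONDS
ATTENTION_THRESHOLD = 0.8

buffer: deque = deque(maxlen=WINDOW_SAMPLES)

def score_window(window) -> float:
    """Placeholder for the NeuroLM forward pass: returns a dummy
    'attention' score from the first channel's mean magnitude."""
    return sum(abs(sample[0]) for sample in window) / len(window)

def take_snapshot() -> None:
    """Placeholder for triggering the XIAO ESP32S3 camera over Wi-Fi."""
    print("snapshot!")

def on_sample(sample: list[float]) -> bool:
    """Feed one 6-channel sample; once a full window accumulates,
    score it and trigger a snapshot if attention is high enough.
    Returns True when a snapshot was taken."""
    assert len(sample) == N_CHANNELS
    buffer.append(sample)
    if len(buffer) < WINDOW_SAMPLES:
        return False
    attention = score_window(buffer)
    buffer.clear()  # start accumulating the next window
    if attention >= ATTENTION_THRESHOLD:
        take_snapshot()
        return True
    return False
```

Feeding one full window of samples produces exactly one scoring decision; the buffer then resets for the next window.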

Challenges we ran into

  • EEG readings becoming corrupted mid-session
  • finicky EEG acquisition software
  • difficulty integrating the neural network with the frontend
  • difficulty setting up Wi-Fi on the XIAO board

Accomplishments that we're proud of

What we learned

Many of us had never worked with EEG data and lacked a neuroscience background, so learning about this area was a novel and rewarding experience.

What's next for Brainstorm

The world!!!

Built With
