Inspiration
We were inspired by the challenges many readers face in maintaining focus while working through dense or complex material, and envisioned a solution that uses neurofeedback to create a more responsive and personalized learning environment.
What it does
SixthSense enriches the reading experience by letting users receive real-time updates from authors, engage with content dynamically through voice interaction, and maintain focus via neurofeedback-driven prompts.
How we built it
We used the Brave API for real-time information access, Llama 3 as our core language model, Neurosity for capturing EEG data, ElevenLabs for realistic voice generation and interaction, and Meta Ray-Bans as the platform for our user experience.
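The core feedback loop described above can be sketched in a few lines: an EEG-derived focus score drives the decision to fire a re-engagement prompt. This is a minimal illustration only; the function names, threshold, and windowing are our assumptions, not the actual Neurosity or ElevenLabs APIs.

```python
# Hypothetical sketch of the neurofeedback loop: names and values
# are illustrative, not the real SixthSense implementation.

FOCUS_THRESHOLD = 0.5  # assumed cutoff on a 0..1 focus score

def should_prompt(focus_scores, threshold=FOCUS_THRESHOLD, window=3):
    """Trigger a re-engagement prompt when the rolling average of the
    last `window` focus readings drops below the threshold."""
    if len(focus_scores) < window:
        return False  # not enough data yet to judge
    recent = focus_scores[-window:]
    return sum(recent) / window < threshold

# Example: a reader's focus drifting downward over a session
readings = [0.8, 0.7, 0.6, 0.4, 0.3, 0.35]
print(should_prompt(readings))  # True: recent average 0.35 < 0.5
```

In a real system the scores would stream from the EEG headset and a `True` result would hand a prompt to the voice layer; smoothing over a short window avoids reacting to momentary dips.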
Challenges we ran into
The Meta Ray-Bans don't have an SDK, so we had no direct programmatic access to the glasses.
Accomplishments that we're proud of
Getting the full pipeline working end to end.
What we learned
Future of Learning
What's next for SixthSense - Enhance Physical Knowledge with Neurofeedback
Winning the hackathon + launching it
