Project Inspiration

Our team was inspired by a personal connection to the challenges faced by individuals with dementia. Motivated to leverage technology for meaningful assistance, we developed Clearview, an augmented reality tool designed to alleviate the cognitive load associated with social interactions and memory recall.
Core Functionality

Clearview operates as a real-time cognitive assistant through a wearable AR interface. Its primary functions are:
Facial Recognition: The system identifies individuals in the user's field of view.
Contextual Display: Upon recognition, it discreetly displays the person's name and relationship to the user within the AR display.
Live Transcription: The device captures and transcribes the ongoing conversation, providing a real-time textual record to aid comprehension and short-term memory.
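The recognition result and the rolling transcript come together in the AR overlay. A minimal sketch of how that per-frame composition could work (the function names, the tuple-based identity, and the fixed-size buffer are illustrative assumptions, not the actual Clearview code):

```python
from collections import deque
from typing import Deque, List, Optional, Tuple


def make_transcript_buffer(max_lines: int = 4) -> Deque[str]:
    """Keep only the most recent transcript lines so the overlay stays readable."""
    return deque(maxlen=max_lines)


def compose_overlay(identity: Optional[Tuple[str, str]],
                    transcript: Deque[str]) -> List[str]:
    """Build the text lines rendered in the AR display for the current frame.

    identity is (name, relationship) when a face was recognized, else None.
    """
    lines = []
    if identity is not None:
        name, relationship = identity
        lines.append(f"{name} ({relationship})")
    lines.extend(transcript)  # most recent speech below the name tag
    return lines
```

Because the buffer is bounded, older speech scrolls out automatically, which matches the short-term-memory use case: the display always shows who is speaking and the last few things said.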
Technical Implementation

The project was built on a multi-layered technology stack:
Front-End: We utilized Lens Studio to create the immersive and interactive augmented reality experience.
Machine Learning: Python was employed to develop and train our custom facial recognition model, as SnapML lacked native support for this feature.
User Interface and Logic: TypeScript was used to build the responsive UI for the transcription service and to manage the real-time data flow between the recognition and display modules.
Challenges and Accomplishments

A significant technical hurdle was the absence of built-in facial recognition capabilities in the target AR platform. We successfully overcame this by engineering a custom machine learning pipeline. Our key accomplishments include the successful training and deployment of this bespoke recognition model and the design of an intuitive, accessible user interface for the live transcription feature.
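A custom recognition pipeline of this kind typically has two stages: an enrollment step that averages several face embeddings per person into a template, and a nearest-neighbor match against those templates at runtime. The sketch below illustrates that pattern under those assumptions; the embedding model itself is abstracted away, and none of these names come from the Clearview codebase:

```python
from typing import Dict, List, Optional

import numpy as np


def enroll(samples: Dict[str, List[np.ndarray]]) -> Dict[str, np.ndarray]:
    """Average several embeddings per person into one L2-normalized template."""
    templates = {}
    for name, embeddings in samples.items():
        mean = np.mean(embeddings, axis=0)
        templates[name] = mean / np.linalg.norm(mean)
    return templates


def recognize(embedding: np.ndarray, templates: Dict[str, np.ndarray],
              threshold: float = 0.7) -> Optional[str]:
    """Nearest-neighbor match by cosine similarity; None if nothing clears the threshold."""
    query = embedding / np.linalg.norm(embedding)
    scores = {name: float(query @ template) for name, template in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```

Returning None below a similarity threshold matters here: showing no name tag is far better for the user than showing a wrong one.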
Key Learnings

This project provided our team with invaluable experience in the end-to-end machine learning lifecycle, from data gathering and model training to deployment in a real-world application. Furthermore, we gained a deep appreciation for the principles of user-centric design in creating AR solutions that offer tangible benefits to users with specific accessibility needs.
Future Roadmap

The next phase for Clearview involves two primary goals:
Persistent Memory Integration: We plan to incorporate persistent storage to create a longitudinal memory aid, allowing users to recall past conversations and interactions.
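One way such a longitudinal memory aid could be prototyped (purely illustrative; this is not part of Clearview today) is a small SQLite log of transcript lines keyed by person and timestamp:

```python
import sqlite3


def open_memory(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the conversation log."""
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS utterances (
        ts TEXT NOT NULL,       -- ISO-8601 timestamp of the utterance
        person TEXT NOT NULL,   -- who the user was speaking with
        text TEXT NOT NULL)""")
    return conn


def remember(conn: sqlite3.Connection, ts: str, person: str, text: str) -> None:
    """Append one transcribed line to the log."""
    conn.execute("INSERT INTO utterances VALUES (?, ?, ?)", (ts, person, text))


def recall(conn: sqlite3.Connection, person: str, limit: int = 5):
    """Most recent lines from conversations with this person, newest first."""
    return conn.execute(
        "SELECT ts, text FROM utterances WHERE person = ? ORDER BY ts DESC LIMIT ?",
        (person, limit)).fetchall()
```

When the facial recognition step identifies someone, a `recall` query like this could surface "last time you spoke, you discussed..." context in the overlay.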
Intent Recognition: We will explore advanced Natural Language Processing (NLP) to integrate intent recognition, enabling the assistant to provide proactive and more context-aware support.
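Before committing to a full NLP model, intent recognition can be prototyped with simple keyword rules. The intent labels and phrases below are invented for illustration only, not a design Clearview has settled on:

```python
from typing import Optional

# Illustrative keyword rules; a real system would use a trained NLP model.
INTENT_KEYWORDS = {
    "ask_name": ("who are you", "your name", "who is this"),
    "ask_schedule": ("appointment", "what time", "schedule"),
    "ask_memory": ("last time", "remember when", "did we"),
}


def detect_intent(utterance: str) -> Optional[str]:
    """Return the first intent whose keyword phrase appears in the utterance."""
    lowered = utterance.lower()
    for intent, phrases in INTENT_KEYWORDS.items():
        if any(phrase in lowered for phrase in phrases):
            return intent
    return None
```

Even a rule-based stub like this lets the assistant route a detected `ask_memory` intent to the stored-conversation lookup proactively, which is the behavior the roadmap describes.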
Built With
- lensstudio
- snap
- snapml