Inspiration
The inspiration for Pedagoggles came from the desire to create a more inclusive and efficient learning experience for students, particularly those who are deaf or hard of hearing, as well as anyone looking for a better way to process and review class material. As technology and AI continue to evolve, we saw an opportunity to bring these advances into the classroom in support of both educational enrichment and accessibility. We were excited by the potential of augmented reality (AR), specifically Snap Inc.'s Spectacles platform, as an accessible, hands-free tool for capturing live classroom lectures, transcribing them in real time, and providing instant summaries and Q&A. Our team was inspired by the idea of a tool that empowers students to learn more effectively, regardless of their background or needs, and that blends seamlessly into the modern learning environment.
What it does
Pedagoggles is an AI-powered class learning assistant designed for Snap Inc.'s Spectacles, an experimental augmented reality (AR) device. The Lens provides a suite of features aimed at improving learning and accessibility: live captioning of lectures or presentations, AI-generated summary notes that synthesize key points from the material, and an AI-powered question-and-answer feature to aid understanding and retention. Pedagoggles works both as an accessibility tool for deaf and hard-of-hearing students and as an enhancement for anyone seeking to learn more effectively. It runs on Spectacles and uses a Gemini-powered backend written in TypeScript, React, and Python to display real-time captions, summaries, and answers directly in the user's line of sight.
How we built it
We built Pedagoggles using Lens Studio, a visual development environment for creating AR experiences for Snapchat and Spectacles. Although the platform's documentation was sparse, we used Lens Studio as the foundation of our Lens and integrated several features: live captioning, AI summary generation, and Q&A. The backend, written in TypeScript, React, and Python, uses Gemini to process the captured audio and text.
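As a rough sketch of the backend's Gemini-facing step (the function names and the transcription stub here are illustrative assumptions, not our exact code):

```python
# Sketch of how the backend turns captured lecture audio into prompts
# for the summary and Q&A features. transcribe_audio is a placeholder
# for the real speech-to-text step; in production, the prompt strings
# are sent to Gemini and the responses are rendered in the Lens.

def transcribe_audio(audio_bytes: bytes) -> str:
    """Placeholder: the real backend sends audio to a speech-to-text service."""
    return "<transcript>"

def build_summary_prompt(transcript: str) -> str:
    """Wrap the lecture transcript in a prompt asking for key-point notes."""
    return (
        "Summarize the following lecture transcript into concise, "
        "bulleted study notes:\n\n" + transcript
    )

def build_qa_prompt(transcript: str, question: str) -> str:
    """Ground a student's question in the captured transcript."""
    return (
        "Using only this lecture transcript, answer the question.\n\n"
        f"Transcript:\n{transcript}\n\nQuestion: {question}"
    )
```

In the real project, FastAPI routes and MongoDB persistence sit around this core, and the Spectacles Lens displays whatever text comes back.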
Challenges we ran into
The development process came with several challenges, largely due to the experimental nature of the Spectacles platform. One major hurdle was the sparse documentation Snap provides for developing AR experiences specifically for Spectacles, which created many obstacles for first-time developers on the platform. For example, we struggled to implement a scroll bar on a UI element until we visited the Snap team's expert desk. The Snap engineer we spoke to told us he had no idea how to implement one either, and in that moment all of us, Snap employee included, discovered a template example for a scrollable UI. That template, and others like it, helped our project advance much faster than we expected on this platform.
Accomplishments that we're proud of
We are very proud to have built something with bleeding-edge technology. Learning a platform like this in 36 hours demanded quick thinking, and it is easily one of our greatest feats of the weekend. The backend was a completely separate beast: integrating outside audio into our project required capturing the audio, converting it, and processing it on our backend server before the Spectacles could use it.
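One step in that pipeline, wrapping raw microphone samples in a standard container before downstream processing, can be illustrated with Python's standard library. The sample rate and helper name here are assumptions for the sketch, not our exact code:

```python
import io
import wave

def pcm_to_wav(pcm: bytes, sample_rate: int = 16000) -> bytes:
    """Wrap raw 16-bit mono PCM samples in a WAV container, in memory."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as wav:
        wav.setnchannels(1)        # mono microphone feed
        wav.setsampwidth(2)        # 16-bit samples (2 bytes each)
        wav.setframerate(sample_rate)
        wav.writeframes(pcm)
    return buf.getvalue()
```

Once in a standard format, the audio can be handed to a speech-to-text step and the resulting transcript forwarded to the AI features.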
What we learned
Throughout the development of Pedagoggles, we learned a tremendous amount about augmented reality, working with experimental platforms like Spectacles, and integrating AI-powered features into an immersive experience. The lack of comprehensive documentation forced us to rely heavily on experimentation and creative problem-solving.
What's next for Pedagoggles
We're looking forward to expanding Pedagoggles by processing incoming multimedia and opening up more ways to interact with an AR-enhanced environment. Pedagoggles could ship as a standalone app in the Snapchat ecosystem. The Lens is also positioned to take advantage of future advancements in Spectacles technology, such as the ability to process audio from other speakers and an expanded viewport.
Built With
- fastapi
- gemini
- javascript
- lensstudio
- mongodb
- python
- react
- spectacles
- typescript
