Inspiration
Traditional campus tours usually require scheduling ahead of time, and alternatives like virtual phone tours are clunky. So we wanted to create a hands-free experience that lets users explore freely while still receiving location-based insights. With Meta Glasses, we saw an opportunity to leverage cutting-edge wearable technology to deliver real-time, geofenced audio narration, so users stay fully immersed in their surroundings while effortlessly accessing relevant information—no screens, no interruptions, just a natural, guided exploration.
What it does
LocalLens provides a hands-free, location-aware campus tour using Meta Glasses (currently configured for Purdue, but the same approach works in any other "tour" setting). As users walk around campus, the app detects their position and plays relevant audio snippets about landmarks, historical sites, and hidden gems—without requiring them to pull out their phone.
How we built it
Built with React Native and Expo, our app uses Expo Router for navigation and the Expo Location API for geofencing. We use Selenium to automate the login process and join the Instagram livestream, where the system posts location-based comments. The setup works as follows:

1. A livestream is started from the Meta Glasses, keeping the user hands-free.
2. The phone continuously sends positional data to an Express.js server, which processes the user's location and matches it to predefined geofenced points.
3. When the user enters a geofence, the server retrieves the associated fact and automatically posts it as a comment on the livestream.
4. Meta's built-in text-to-speech feature then reads the comment aloud, delivering an immersive, real-time guided tour without requiring the user to look at a screen.
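The server-side matching step can be sketched as a haversine distance check against a list of geofenced points. This is a minimal illustration, not our exact code—the landmark names, coordinates, and 50 m radius here are placeholder values:

```javascript
// Illustrative geofence data; names, coordinates, and radii are examples only.
const GEOFENCES = [
  { name: "Bell Tower", lat: 40.4273, lon: -86.9139, radiusM: 50,
    fact: "The Bell Tower is one of Purdue's best-known landmarks." },
  { name: "Engineering Fountain", lat: 40.4287, lon: -86.9137, radiusM: 50,
    fact: "The Engineering Fountain is a popular campus meeting spot." },
];

// Haversine great-circle distance between two lat/lon points, in meters.
function distanceMeters(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Return the first geofence containing the given position, or null.
function matchGeofence(lat, lon) {
  return (
    GEOFENCES.find((g) => distanceMeters(lat, lon, g.lat, g.lon) <= g.radiusM) ??
    null
  );
}
```

In the full pipeline, an Express route would receive the phone's coordinates, call a function like `matchGeofence`, and hand the matched fact to the Selenium worker that posts it as a livestream comment.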
Challenges we ran into
We encountered numerous challenges along the way, primarily due to the closed-source nature of Meta Glasses, which made direct integration difficult. To work around this, we reverse-engineered a solution using the livestream feature to deliver audio-based tour information. However, this approach led to rate-limiting issues and other restrictions that complicated our implementation. We also faced difficulties connecting the app to the server and ensuring smooth data transmission between the mobile device, the geofencing system, and the livestream comments. Initially, we attempted to train a model to analyze real-time images from the glasses and provide highly specific facts based on what the user was looking at; this turned out to be far more complex than anticipated, and we pivoted to a more practical solution within the given timeframe.
Accomplishments that we're proud of
We are proud to have built a working system that tracks real-time locations and posts updates to an Instagram Live stream. Getting the API integration and authentication to function correctly was a key achievement.
What we learned
Throughout this project, we gained valuable experience in integrating real-time location tracking with a social media platform. Specifically, we learned:

- Geolocation Tracking: using the expo-location package to continuously track a user's position and implement geofencing logic.
- React Native & Expo: building a mobile-friendly app with Expo Router, handling navigation, and optimizing UI components.
- Backend API & Automation: developing a backend server using Express.js and Selenium WebDriver to automate interactions with Instagram Live.
- Handling Authentication: managing session persistence and authentication with Instagram, including debugging challenges with dynamic page elements.
What's next for LocalLens
Since the glasses are already connected through the livestream, we have access to both the audio and the camera feed. That opens up possibilities like answering users' questions aloud, or analyzing what the user is looking at to deliver a directly relevant prompt. This project has many areas where it can be expanded.