Inspiration

The inspiration for IntroSpect came from a conversation I had with my mom. She works closely with children and youth and often shares stories about their uncertainty when asked, “What do you want to do when you get older?” Many respond with fear or hesitation. Through these stories, and through additional research, I learned how significant the barriers are for neurodivergent and disabled individuals who want to pursue careers they’re genuinely passionate about. With IntroSpect, we aim to help remove those barriers and make these professions more accessible.

What it does

IntroSpect is an assistive technology that acts like a sixth sense for neurodivergent or visually impaired individuals. It leverages modern and still-emerging AI technologies to deliver a seamless experience that provides users with deeper context during conversations. By translating non-verbal cues into accessible audio or text feedback, IntroSpect helps users better understand human emotion and navigate challenging interactions with confidence.

How we built it

We built IntroSpect using a SwiftUI frontend and a TypeScript backend. The two are connected through a Cloudflare Worker, which handles data flow between the client and server. Our backend processes visual inputs using the Gemini API, and then optionally converts insights into audio through ElevenLabs. This ensures that users, especially those who are visually impaired, can receive real-time non-verbal cue information in an accessible and intuitive way.
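The flow above can be sketched roughly as follows. This is a simplified, hypothetical sketch of the Worker-side logic, not our actual code: the field names (`frameBase64`, `outputMode`), the `CueInsight` shape, and the helper functions are illustrative stand-ins for the real Gemini and ElevenLabs calls.

```typescript
// Hypothetical request shape sent from the SwiftUI client to the Worker.
interface CueRequest {
  frameBase64: string;            // a captured video frame, base64-encoded
  outputMode: "text" | "audio";   // audio is routed through ElevenLabs
}

// Hypothetical shape of an insight extracted from a Gemini response.
interface CueInsight {
  emotion: string;     // e.g. "engaged", "confused"
  confidence: number;  // 0..1
  summary: string;     // short description of the non-verbal cue
}

// Turn a Gemini-style analysis into a short, accessible sentence.
function describeCue(insight: CueInsight): string {
  const qualifier = insight.confidence > 0.7 ? "clearly" : "possibly";
  return `The speaker ${qualifier} appears ${insight.emotion}: ${insight.summary}`;
}

// Decide whether the text summary should also be converted to speech.
function needsAudio(req: CueRequest): boolean {
  return req.outputMode === "audio";
}
```

In the real Worker, `describeCue` would run on the parsed Gemini output, and `needsAudio` would gate a follow-up call to ElevenLabs before the response is returned to the client.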

Challenges we ran into

One of our biggest challenges was developing an idea that was both unique and genuinely impactful, with real-world use cases. We also ran into technical hurdles; most notably, the Presage SDK was unable to retrieve one of the microExpression fields we needed. After experimenting with various workarounds, we ultimately pivoted to an approach that leveraged Gemini’s processing power to fill in the gaps and produce reliable results.
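The fallback idea can be sketched like this: keep whatever fields the SDK does return, and let a Gemini-inferred result fill in anything missing. The field names and types below are hypothetical placeholders, not the actual Presage SDK schema.

```typescript
// Hypothetical micro-expression fields; the real SDK schema differs.
type MicroExpressions = {
  brow?: string;
  mouth?: string;
  gaze?: string;
};

// Prefer SDK-provided values; fall back to Gemini-inferred ones for any
// field the SDK failed to retrieve.
function fillGaps(
  fromSdk: MicroExpressions,
  fromGemini: MicroExpressions
): MicroExpressions {
  const merged: MicroExpressions = { ...fromGemini };
  for (const [key, value] of Object.entries(fromSdk)) {
    if (value !== undefined) {
      merged[key as keyof MicroExpressions] = value;
    }
  }
  return merged;
}
```

The design choice here is that the on-device SDK remains the source of truth when it succeeds, and the model only supplies the fields it dropped.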

Accomplishments that we're proud of

As a team, we’re proud of creating a comprehensive project that can scale to support many different types of users and adapt to a wide range of real-world scenarios. We’re also especially proud that this was the first hackathon for two of our members, yet together we were able to brainstorm, collaborate, and turn an idea we had at the start of the weekend into a fully functioning tool by the end. Building something meaningful, accessible, and technically ambitious in such a short time is an achievement we’re genuinely proud of.

What we learned

IntroSpect taught us a lot, not just about ourselves as developers and tech enthusiasts, but also about the real, systemic barriers faced by neurodivergent and disabled individuals. Through this project, we gained a deeper understanding of how technology can meaningfully bridge those gaps in a way that isn’t invasive to the user.

Working together to freely build, prototype, and experiment made the entire experience incredibly rewarding. It pushed us to learn new languages, frameworks, and implementation strategies much faster than we expected. Collaborating under pressure reminded us how creative, adaptable, and motivated we are when working toward a mission that matters.

What's next for IntroSpect

IntroSpect has no real limits: we envision it scaling into a tool that can support hundreds of thousands of users across countless fields. From 1:1 coaching and mentoring to interview preparation and client-facing professions, the potential applications are wide and continually growing.

We also see opportunities to integrate additional AI-driven features, such as Gemini-powered note-taking and audio transcription for visually impaired users. Looking ahead, we’re excited about incorporating more precise biometric insights through wearable devices like smartwatches or smart rings.

IntroSpect is only at the beginning of its journey, and we’re committed to expanding it into a fully accessible, multi-layered support system for anyone navigating complex human interactions.
