Inspiration
As websites compete ever harder for our attention, our team considered the experience of users with disabilities who may struggle to navigate sites that are poorly designed for accessibility. This can include images without alternative text, low-contrast screen elements, or excessive graphics that can be overstimulating. We saw this as an opportunity to create tools for those who face visual, auditory, or motor challenges in their daily digital lives. We also explored expanding the utility of earphones for specific audio use cases for people with auditory sensitivity. Neurodivergent people, including those with autism spectrum disorder or ADHD, can often experience sensory overload in loud environments. Given the prevalence of wireless earphones with adaptive, dynamic active noise cancellation driven by external microphones, we set out to build a solution that harnesses this capability in technology already available to us.
What it does
Opensight takes a URL or QR code linked to a website that a user may find less accessible and scrapes the site for the vital information and interactive elements the user needs, identifying the most important actions they might want to take. It then generates executable action buttons the user can tap to be redirected directly, without having to wade through the webpage's unnecessary text. Sift provides its user a “Sensory Shield” that, when activated with a specific environment preset, filters outside noise using noise-cancelling earphones. It also lets the user fine-tune specific frequencies they find disturbing for more targeted audio scenarios. Both tools use colorblind-friendly UI, large buttons, and sans-serif fonts to maximize readability.
How we built it
Opensight: Making the Web Readable
Opensight is a digital translator that turns cluttered websites into simple, clear information. The Python and Flask backend acts as a smart filter, fetching complex pages and stripping away "junk" such as ads and confusing layouts. On the front end, we use React to display that information in a clean, high-contrast format with extra-large text. This creates a "Simplified View" that works well for people with visual impairments, or anyone who feels overwhelmed by busy screens, ensuring they get the facts they need without the headache.
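As a rough illustration of the filtering idea, here is a minimal, self-contained sketch using only Python's standard-library `html.parser` (our actual backend and tag list may differ; `JUNK_TAGS` and `simplify` are illustrative names, not the real implementation):

```python
from html.parser import HTMLParser

# Illustrative set of "junk" tags to skip; the real filter is more involved.
JUNK_TAGS = {"script", "style", "nav", "aside", "footer", "iframe"}

class SimplifyingParser(HTMLParser):
    """Collects visible text and link targets, skipping junk tags."""
    def __init__(self):
        super().__init__()
        self._skip_depth = 0   # >0 while inside a junk subtree
        self.text_parts = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag in JUNK_TAGS:
            self._skip_depth += 1
        elif tag == "a" and self._skip_depth == 0:
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)  # candidate for an action button

    def handle_endtag(self, tag):
        if tag in JUNK_TAGS and self._skip_depth > 0:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0 and data.strip():
            self.text_parts.append(data.strip())

def simplify(html: str):
    """Return (visible text, link targets) for a simplified view."""
    parser = SimplifyingParser()
    parser.feed(html)
    return " ".join(parser.text_parts), parser.links
```

The collected links are what the app could turn into large, executable action buttons, while the stripped-down text feeds the high-contrast Simplified View.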
Sensory Shield: A Personal Noise Filter
Sensory Shield is an app designed to protect users from the "sensory scream" of loud city life. We customized the Librepods API to create a smart audio-filtering system that can tell the difference between disruptive noise and important sound. Instead of blocking everything like ordinary headphones, it uses AI to quiet distressing sounds like train screeches while making sure you can still hear a friend talking or an emergency siren. The app’s design is purposefully calm and minimal, using soft visuals so that the interface itself never causes more stress for the user.
Challenges we ran into
During the hackathon, we ran into several distinct challenges:

- We struggled to get the AirPods API working correctly on an Android emulator, which slowed down our initial development and testing process.
- Creating the app and converting it into an APK was difficult, as this was our first time navigating the full Android build and deployment workflow.
- Midway through the hackathon, we decided to switch projects, which required us to quickly adapt to new goals, tools, and technical requirements under time constraints.
- Connecting AirPods to the app for testing was hard, since hardware-dependent features were difficult to validate without consistent device compatibility.
- Integrating AI into Opensight for the first time presented a learning curve, particularly in understanding how to properly incorporate AI functionality into an existing application.
- We had trouble selecting the right AI model for website scanning and developing effective prompts that produced accurate, reliable results.
- Building action buttons that work across websites was a major challenge, since each site structures its links differently.
Accomplishments We’re Proud Of
We successfully built a mobile app that gives users instant access to everything they need: a reliable, all-in-one hub that brings our services directly to their pocket, ensuring that support and connectivity are always just one tap away. Our team is dedicated to inclusive design, and we created an interface that works for everyone. We believe great technology should be accessible to all people, regardless of their tech skills or abilities, so we built a visual experience that is welcoming, clear, and easy for every member of our community to navigate. We also take pride in how simple and stress-free the platform is to use: by removing unnecessary steps and focusing on a smooth, natural flow, we made sure that users can finish their tasks quickly and get back to their day.
What we learned
- Android Studio
- React
- Swift
- API use
- AI APIs and how to integrate them into projects
- How each website structures its links differently
What's next for Opensight and Sift
We would like to expand the available features, including converting Opensight into a mobile app that more naturally uses the QR code scanning feature, and providing guided audio descriptions during user interactions. For Sift, we would like to implement more audio presets to further customize the user experience, as well as additional features such as a white noise library. We would also like to properly integrate the AirPods API to take full advantage of the hardware capabilities it offers. Finally, a next step would be to consult directly with people with auditory sensitivity to tailor our application to their specific needs and pain points.