Inspiration

Many individuals with disabilities face significant barriers when interacting with computers. These challenges include physical limitations, lack of accessibility features, and difficulty navigating complex interfaces. This lack of inclusive design often leads to frustration, reduced independence, and limited opportunities. Motivated by this gap in accessibility, we set out to create a tool that helps people with disabilities navigate the computer with ease, enhancing their ability to engage with digital environments efficiently and independently.

What it does

Our project provides an intuitive solution designed to support individuals with disabilities in navigating their computers. By using motion detection, voice commands, and gaze tracking, the system allows users to interact with their devices more effectively, offering alternatives to traditional input methods such as keyboard and mouse. The goal is to create an inclusive, accessible computing environment that adapts to users' needs, providing them with greater control and independence.
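As a rough illustration of how gaze input can stand in for a mouse, normalized gaze coordinates from a tracker can be mapped to screen pixels. This is a minimal Python sketch; the function name, the normalized-coordinate interface, and the screen size are assumptions for illustration, not the project's actual API:

```python
def gaze_to_screen(gaze_x, gaze_y, screen_w=1920, screen_h=1080):
    """Map normalized gaze coordinates in [0, 1] to screen pixels,
    clamping out-of-range readings so the cursor stays on screen."""
    x = min(max(gaze_x, 0.0), 1.0) * (screen_w - 1)
    y = min(max(gaze_y, 0.0), 1.0) * (screen_h - 1)
    return int(x), int(y)
```

Clamping matters in practice: gaze estimators routinely report values slightly outside the calibrated range, and an unclamped cursor would jump off screen.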

How we built it

We took a structured approach to create this accessible tool:

  • Feature Planning: We defined the core functionalities, prioritizing accessible input methods and overall usability.
  • Technology Integration: We integrated AI for gesture recognition, gaze tracking, and voice commands to facilitate hands-free control.
  • User-Centered Design: We focused on building an interface that was simple and intuitive, ensuring accessibility for all users.
  • Testing & Iteration: We continuously tested the system with real users, gathering feedback to improve the accessibility features.
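The voice-command side of this integration can be sketched as a simple phrase-to-action dispatch: the speech recognizer produces a transcript, and the first known phrase found in it selects an action. The phrases and action names below are hypothetical, not the project's actual command set:

```python
# Hypothetical command table: phrase fragments mapped to action names.
COMMANDS = {
    "open browser": "launch_browser",
    "scroll down": "scroll_down",
    "click": "left_click",
}

def match_command(transcript):
    """Return the action for the first known phrase found in the
    recognized transcript, or None if nothing matches."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None
```

A substring match like this tolerates recognizer filler ("please scroll down a bit"), which is one reason rigid exact-match grammars tend to frustrate users.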

Challenges we ran into

Developing this project came with a number of challenges:

  • Hardware & Integration: Integrating motion detection and gaze tracking with different devices required extensive testing and optimization for accurate response.
  • User Diversity: Addressing the varied needs of users with different disabilities meant constantly adjusting the interface and controls to make sure everyone could benefit.
  • Real-Time Processing: Ensuring the system could process input and give feedback in real time, without noticeable latency, was a significant technical challenge.
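One common pattern for keeping perceived latency bounded when processing can't keep up with capture is to drop stale frames and always work on the newest one. A minimal sketch of that idea, assuming a simple producer/consumer setup (not necessarily how our pipeline is implemented):

```python
from collections import deque

class LatestFrameBuffer:
    """Single-slot buffer: the producer overwrites stale frames so the
    consumer always processes the most recent input, keeping perceived
    latency bounded even when processing is slower than capture."""
    def __init__(self):
        self._buf = deque(maxlen=1)  # older frames are silently dropped

    def push(self, frame):
        self._buf.append(frame)

    def pop_latest(self):
        return self._buf.pop() if self._buf else None
```

The trade-off is deliberate: dropped frames cost nothing for cursor control, whereas a growing queue would make the cursor lag further and further behind the user's gaze.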

Accomplishments that we're proud of

Despite the challenges, we accomplished several key milestones:

  • AI Integration: Successfully integrating AI-driven tools like gesture and voice recognition to enhance user experience.
  • Usability Testing: Testing the system with users to ensure that it truly met accessibility needs.
  • Innovative Design: Building a solution that enables individuals with disabilities to navigate their devices independently.
  • Teamwork & Problem Solving: Overcoming technical hurdles and collaborating effectively to create an impactful solution.

What we learned

Throughout the project, we gained valuable insights:

  • Adaptive Technology: The importance of designing systems that can adapt to a wide range of abilities and needs.
  • Collaboration: Working as a team to solve complex problems and integrate various technologies seamlessly.
  • User-Centered Design: The significance of feedback-driven design in making technology truly accessible.

What's next for SeeSay

Looking ahead, we plan to scale this project into a browser or desktop extension that can access the operating system, allowing users to navigate their desktops with ease. Key future improvements include:

  • Browser/Desktop Extension: Developing a cross-platform extension that integrates seamlessly with both browsers and desktop environments to provide hands-free navigation.
  • Enhanced Accessibility Features: Implementing additional control options, such as eye-tracking and more sophisticated voice commands, to further improve accessibility.
  • Personalized User Experience: Using machine learning to customize the interface and controls based on individual user needs and preferences.
  • OS Integration: Expanding functionality to allow direct interaction with system settings, files, and applications for complete desktop navigation.
  • Collaboration with Accessibility Experts: Partnering with experts and organizations to validate the solution’s impact and ensure it meets the highest accessibility standards.
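As one possible take on the personalization idea above, a dwell-click threshold could adapt to each user via an exponential moving average of their observed dwell times. This is a hypothetical sketch under assumed parameter values, not a committed design:

```python
class DwellClickAdapter:
    """Adapt a dwell-click threshold to the individual user with an
    exponential moving average (EMA) of observed dwell times."""
    def __init__(self, initial_ms=800.0, alpha=0.2, floor_ms=300.0):
        self.threshold_ms = initial_ms
        self.alpha = alpha        # weight given to each new observation
        self.floor_ms = floor_ms  # lower bound, to avoid accidental clicks

    def observe(self, dwell_ms):
        """Blend a new dwell-time sample into the threshold and return it."""
        ema = (1 - self.alpha) * self.threshold_ms + self.alpha * dwell_ms
        self.threshold_ms = max(ema, self.floor_ms)
        return self.threshold_ms
```

The floor is the safety-relevant piece: without it, a run of fast dwells could shrink the threshold until every glance registers as a click.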

Our team members!

  • Thu Nguyen: AI Integration (gaze detection)
  • Ethan Do: AI Integration (voice detection)
  • Alex Tran: Frontend Development (UI/UX improvement)
  • Han Le: Figma Design & Frontend Support (animation effects)
