Inspiration

Our inspiration stems from the fact that over one billion people worldwide live with some form of physical disability. For many, accessing digital tools is either impossible or prohibitively expensive. We noticed that existing commercial solutions (like Tobii) cost between $2,000 and $10,000 and require specialized hardware. This makes life-changing technology accessible to only 10% of those in need. We wanted to create a solution that is 100% free, highly accurate, and accessible to anyone with a basic webcam.

What it does

Hedmouse is an AI-powered assistive system that allows users to control computers and smartphones entirely through head movement, facial gestures, and voice.

Movement: Tracks nose and head movement to control the cursor.

Actions: Uses winks for left/right clicks and mouth opening to scroll through pages.

Voice Control: Features a built-in smart assistant that executes complex commands via voice in both Arabic and English.

Together, these features provide complete digital independence for people with paralysis, fractures, or amputations.
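The nose-to-cursor movement above can be sketched as a small mapping step. This is a minimal illustration, not Hedmouse's actual code: it assumes MediaPipe-style normalized landmark coordinates in [0, 1], and the class name and smoothing factor are our own choices.

```python
# Sketch: map a normalized nose position (as produced by a face-mesh
# tracker, coordinates in [0, 1]) onto screen pixels, with exponential
# smoothing to damp frame-to-frame jitter. Names and alpha are illustrative.

class CursorMapper:
    def __init__(self, screen_w, screen_h, alpha=0.3):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.alpha = alpha      # smoothing factor: lower = smoother but laggier
        self._sx = None         # smoothed x position, in pixels
        self._sy = None

    def update(self, nose_x, nose_y):
        """Take a normalized (x, y) nose position, return smoothed pixel coords."""
        px = nose_x * self.screen_w
        py = nose_y * self.screen_h
        if self._sx is None:    # first frame: no history to smooth against
            self._sx, self._sy = px, py
        else:
            self._sx += self.alpha * (px - self._sx)
            self._sy += self.alpha * (py - self._sy)
        return round(self._sx), round(self._sy)
```

In a real pipeline the returned coordinates would feed a cursor API such as pyautogui's `moveTo`; the smoothing factor trades responsiveness against stability on shaky low-end webcams.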

How we built it

We developed the desktop version using Python and the mobile version using Kotlin. Key technologies include:

MediaPipe (Google): For high-fidelity tracking of 468 facial landmarks.

OpenCV: For real-time video processing and computer vision.

VOSK: To enable offline voice recognition, ensuring privacy and accessibility without an internet connection.

Groq AI: To power the intelligent chatbot assistant.

PyQt5 & Material Design: To create modern, intuitive user interfaces for both desktop and mobile.
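To show how the face-mesh landmarks above drive a gesture, here is a sketch of the mouth-open scroll trigger. The landmark indices are the standard MediaPipe face-mesh ones (inner lips 13/14, mouth corners 61/291); the ratio threshold and function names are our own illustrative choices, not Hedmouse's implementation.

```python
# Sketch of the mouth-open scroll gesture: compute a mouth aspect ratio
# (vertical opening / mouth width) from four face-mesh landmarks and
# compare it against a threshold. The 0.5 threshold is illustrative.
import math

UPPER_LIP, LOWER_LIP = 13, 14       # inner-lip landmarks (MediaPipe face mesh)
LEFT_CORNER, RIGHT_CORNER = 61, 291  # mouth-corner landmarks

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mouth_aspect_ratio(landmarks):
    """landmarks: dict of index -> (x, y) in normalized image coordinates."""
    opening = _dist(landmarks[UPPER_LIP], landmarks[LOWER_LIP])
    width = _dist(landmarks[LEFT_CORNER], landmarks[RIGHT_CORNER])
    return opening / width

def mouth_is_open(landmarks, threshold=0.5):
    """True when the mouth is open wide enough to trigger scrolling."""
    return mouth_aspect_ratio(landmarks) > threshold
```

Normalizing by mouth width makes the gesture roughly invariant to how far the user sits from the camera.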

Challenges we ran into

The biggest challenge was balancing high precision with low resource consumption so the software runs smoothly on low-end devices. We also spent significant time fine-tuning the "Wink-to-Click" algorithm to distinguish a deliberate wink from natural blinking, eventually settling on a 520 ms hold threshold that maximized accuracy while minimizing accidental clicks.
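The wink-versus-blink disambiguation above can be sketched as a small per-frame state machine: a click fires only when one eye stays closed past the 520 ms hold while the other stays open, so brief, symmetric natural blinks never trigger it. The eye-aspect-ratio threshold and all names here are illustrative assumptions, not Hedmouse's actual code.

```python
# Sketch: a click fires only when ONE eye has stayed closed for at least
# HOLD_MS while the other eye stays open. Natural blinks close both eyes
# and last far less than the hold, so they are rejected on both counts.

EAR_CLOSED = 0.20   # eye aspect ratio below this counts as "closed" (illustrative)
HOLD_MS = 520       # hold duration separating a deliberate wink from a blink

class WinkDetector:
    def __init__(self):
        self._closed_since = {"left": None, "right": None}

    def update(self, left_ear, right_ear, now_ms):
        """Feed per-frame eye aspect ratios; return 'left', 'right', or None."""
        closed = {"left": left_ear < EAR_CLOSED, "right": right_ear < EAR_CLOSED}
        if closed["left"] and closed["right"]:
            # Both eyes shut: treat as a natural blink and reset any pending wink.
            self._closed_since = {"left": None, "right": None}
            return None
        for side in ("left", "right"):
            if not closed[side]:
                self._closed_since[side] = None
            elif self._closed_since[side] is None:
                self._closed_since[side] = now_ms          # closure just started
            elif now_ms - self._closed_since[side] >= HOLD_MS:
                self._closed_since[side] = None            # fire once, then re-arm
                return side
        return None
```

The eye aspect ratio itself can be computed from face-mesh landmarks in the same spirit as the mouth ratio: vertical eyelid distance divided by eye width.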

Accomplishments that we're proud of

Achieving a tracking accuracy of 90%+, which rivals expensive commercial hardware.

Building a 100% offline mode that protects user privacy.

Developing a system that is fully optimized for the Arabic language, filling a major gap in the global assistive technology market.

Keeping the total cost to the user at $0.

What we learned

We learned that impactful assistive technology doesn't always require expensive sensors; it requires smart, empathetic engineering. We gained deep expertise in computer vision and UI/UX design specifically tailored for accessibility, realizing that small adjustments in software can make a massive difference in a user's daily life.

What's next for Hedmouse

iOS Support: Launching the official version for iPhone users.

Eye-Gaze Tracking: Implementing advanced eye-tracking to supplement head movements for even higher precision.

Enhanced AI: Integrating larger language models (LLMs) to allow users to dictate emails and documents more naturally.

Global Outreach: Partnering with rehabilitation centers and NGOs to put Hedmouse in the hands of those who need it most worldwide.
