Inspiration

We wanted to build something that lets people use a computer without their hands. Existing tools fall short: they are too expensive, too slow, or both. We asked ourselves: what if someone could move the mouse with their head, click with their eyes, and talk to their computer, all at the same time?

That question led to Aimless, a hands-free computer control system for people with motor impairments and anyone else who needs an alternative way to interact with their machine.

What it does

Aimless lets users control a computer entirely hands-free: move the cursor with head movement, click by blinking, and issue voice commands, all at the same time. It is built for people with motor impairments and for anyone who needs an alternative input method.

How we built it

Building Aimless was not easy. Making a computer fully controllable without hands meant running three engines side by side: head tracking to drive the cursor, blink detection to click, and speech recognition for voice commands. On top of those we layered text-to-speech feedback and a per-platform accessibility layer so the same experience works across operating systems.

Challenges we ran into

  • Keeping head tracking steady without making the cursor feel slow or jumpy.

  • Making face and blink detection robust across lighting conditions and face shapes.

  • Getting speech recognition to work reliably in some environments.

  • Building a platform accessibility layer that behaves the same way on every platform.

  • Preventing voice commands from cutting off text-to-speech playback.

  • Running three engines in real time without lag or blocking.

  • Designing a user interface that exposes many tuning parameters without overwhelming users.

For each of these challenges we had to rethink our architecture, optimize further, and build fallbacks.
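To give a flavor of the jitter problem: one common way to steady a head-tracked cursor is exponential smoothing plus a small dead zone. The sketch below is a minimal illustration of that idea (the class name and constants are made up for the example), not the exact filter Aimless ships:

```python
# Hypothetical sketch: exponential smoothing with a dead zone to steady a
# head-tracked cursor without making it feel laggy.

class CursorSmoother:
    def __init__(self, alpha=0.35, dead_zone=2.0):
        self.alpha = alpha          # 0..1: higher = more responsive, more jitter
        self.dead_zone = dead_zone  # pixels of movement to treat as noise
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend a new raw position into the smoothed one."""
        if self.x is None:
            self.x, self.y = float(raw_x), float(raw_y)
            return self.x, self.y
        dx, dy = raw_x - self.x, raw_y - self.y
        # Ignore sub-dead-zone movement so the cursor holds still at rest.
        if (dx * dx + dy * dy) ** 0.5 < self.dead_zone:
            return self.x, self.y
        self.x += self.alpha * dx
        self.y += self.alpha * dy
        return self.x, self.y
```

Tuning `alpha` and `dead_zone` is exactly the smooth-vs-jumpy tradeoff described above.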

Accomplishments that we're proud of

  • A hands-free cursor that feels smooth, responsive, and natural.

  • Blink-based clicking with clear feedback.

  • A voice command system that lets users dictate, repeat themselves, and ask questions to the built-in AI.

  • A snap-assist engine that makes navigating the user interface far easier.

  • A desktop dashboard with real-time telemetry.

  • A web dashboard alternative.

  • A system that actually works end to end, not just a demo.

Aimless is more than a prototype; it is a real tool that makes computing more accessible.
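For readers curious how blink clicking can work: a standard trick is the eye aspect ratio (EAR), which drops sharply when the eye closes. This is a minimal, self-contained sketch of that idea, using a generic six-landmark eye layout and an example threshold rather than Aimless's actual tuned detector:

```python
# Hypothetical sketch: blink detection via the eye aspect ratio (EAR).
# An eye is described by six (x, y) landmarks: two horizontal corners
# (p1, p4) and two vertical pairs (p2/p6, p3/p5). EAR falls sharply
# when the eye closes.

def eye_aspect_ratio(eye):
    """eye: list of six (x, y) points [p1..p6]."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def is_blinking(eye, threshold=0.21):
    """Below the threshold the eye is treated as closed."""
    return eye_aspect_ratio(eye) < threshold
```

In practice the six points would come from a face-mesh model such as MediaPipe, and the threshold would be tuned per user, as noted above.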

What we learned

  • Multimodal interfaces need vision, audio, and user interface layers that cooperate cleanly.

  • Accessibility APIs differ dramatically across operating systems.

  • Camera quality and lighting have a real impact on blink detection.

  • Voice user experience is less about perfect transcription and more about recovering gracefully from mistakes.

  • Users need adjustable settings to tune the system to their own bodies.

  • AI integration is most powerful when it is voice-driven and interruptible.

Above all, we learned that bringing computer vision, speech, and accessibility technology together into one experience has enormous potential.
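The "engines that cooperate without blocking" lesson can be sketched as worker threads feeding one shared event queue. The structure below is a hypothetical simplification (the producers are stand-ins), not our actual pipeline:

```python
# Hypothetical sketch: independent engines (vision, speech) each on their
# own thread, all feeding one event queue so no engine can block another.
import queue
import threading

def run_engine(name, produce, events, stop):
    # Generic worker loop: poll the engine and push events until stopped.
    while not stop.is_set():
        event = produce()
        if event is not None:
            events.put((name, event))

events = queue.Queue()
stop = threading.Event()

# Stand-in producers; the real ones would wrap a camera, a microphone, etc.
producers = {
    "vision": lambda: "frame",
    "speech": lambda: "utterance",
}
threads = [
    threading.Thread(target=run_engine, args=(n, p, events, stop), daemon=True)
    for n, p in producers.items()
]
for t in threads:
    t.start()

# The main loop drains events without caring which engine produced them.
received = [events.get(timeout=1.0) for _ in range(4)]
stop.set()
```

The key design point is that engines only ever publish events; nothing in the vision loop can stall the speech loop or the UI.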

What's next for Aimless

We are planning a set of improvements:

  • Absolute gaze-based cursor control (iris tracking).

  • On-device speech recognition for offline use.

  • Blink thresholds that adapt to each user's behavior.

  • Full-screen user interface highlighting to make snapping easier.

  • A mobile version built on Android accessibility services.

  • A plugin system for custom voice macros.

  • Cloud-synced profiles so users can carry their settings across devices.

Aimless is just getting started. Our goal is to make hands-free computing fast, easy to use, and available to everyone.
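As an illustration of the adaptive-threshold idea on the roadmap: instead of one fixed blink cutoff, a detector could track a slow baseline of the user's open-eye measurements and flag values well below it. The sketch below is hypothetical; names and constants are made up for the example:

```python
# Hypothetical sketch of an adaptive blink threshold: track a running
# baseline of the user's open-eye EAR and treat values well below that
# baseline as blinks, instead of one fixed cutoff for everyone.

class AdaptiveBlinkThreshold:
    def __init__(self, ratio=0.6, warmup=30):
        self.ratio = ratio      # blink = EAR below this fraction of baseline
        self.warmup = warmup    # samples to observe before activating
        self.baseline = None
        self.samples = 0

    def update(self, ear):
        """Feed one EAR sample; return True if it looks like a blink."""
        if self.baseline is None:
            self.baseline = ear
        blink = self.samples >= self.warmup and ear < self.ratio * self.baseline
        if not blink:
            # Only open-eye samples update the baseline (slow moving average).
            self.baseline = 0.95 * self.baseline + 0.05 * ear
        self.samples += 1
        return blink
```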

Built With

  • applicationservicesapi
  • at-spiapi
  • comtypes
  • customtkinter
  • elevenlabsapi
  • flask
  • googlespeechrecognition
  • mediapipe
  • mistralai
  • noisereduce
  • numpy
  • opencv
  • pyatspi
  • pyautogui
  • pygame
  • pyobjc
  • pyttsx3
  • sounddevice
  • soundfile
  • speech-recognition
  • uiautomationapi