Inspiration
As COVID-19 forced education online to flatten the curve, many disabled people had to rely on others for their education. With this product we hope that disabled people will gain autonomy and be able to learn by themselves at their own pace.
What it does
The program allows disabled people to gain autonomy and control the computer mouse simply by moving their head and making facial expressions.
How we built it
In Python, we wrote a program that pairs a face-recognition model with a linear-algebra mapping to track the user's head, which in turn moves the mouse. Additionally, we implemented a feature that lets the user simulate a mouse-click event by opening their mouth.
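The core idea described above can be sketched in two small pieces: a linear mapping from the detected face position in the camera frame to screen coordinates, and a simple mouth-aspect-ratio test for the click gesture. This is a minimal illustration, not the team's actual code; the function names, the Haar-cascade detector, and the 0.6 open-mouth threshold are our own assumptions.

```python
def face_to_screen(cx, cy, frame_w, frame_h, screen_w, screen_h):
    """Map the face centre (cx, cy) in a camera frame to screen coordinates.

    The x axis is mirrored so that moving the head to the right moves the
    cursor to the right (webcam images are mirrored relative to the user).
    """
    nx = 1.0 - cx / frame_w   # normalised, mirrored horizontal position
    ny = cy / frame_h         # normalised vertical position
    return int(nx * screen_w), int(ny * screen_h)


def mouth_is_open(top_y, bottom_y, left_x, right_x, threshold=0.6):
    """Treat the mouth as 'open' when its vertical opening is large
    relative to its width (a mouth aspect ratio). The 0.6 threshold is
    an illustrative value that would need tuning per user."""
    width = max(right_x - left_x, 1)  # guard against division by zero
    return (bottom_y - top_y) / width > threshold


if __name__ == "__main__":
    # Hypothetical capture loop using OpenCV face detection and pyautogui,
    # shown only as a sketch of how the pieces could fit together.
    import cv2
    import pyautogui

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    screen_w, screen_h = pyautogui.size()
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces:
            sx, sy = face_to_screen(x + w // 2, y + h // 2,
                                    frame.shape[1], frame.shape[0],
                                    screen_w, screen_h)
            pyautogui.moveTo(sx, sy)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

In this sketch the mouth landmarks would come from a facial-landmark model (the Haar cascade alone only finds the face box), so `mouth_is_open` is shown as a standalone helper.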
Challenges we ran into
Initially, we used the camera to track eye movement to move the mouse; however, we quickly found that eye movement is too sporadic and unreliable for cursor control. Additionally, in low-light situations (i.e. nighttime) it was impossible to get accurate imaging of the eye. This sent our team back to the drawing board to come up with this new and improved product.
Accomplishments that we're proud of
We are really proud of this project and the social good it could do in helping underprivileged groups, and we hope to improve education for everyone.
What we learned
We significantly improved our understanding of facial-recognition algorithms, and developed a much more nuanced sense of the limitations and problems such algorithms face.
What's next for "Simulating Mouse Events With Users Head Movements"
A current pitfall of the application is that you still need a traditional mouse setup to launch the program. In the future, we hope a version of the program can be integrated into operating systems, along with a virtual keyboard, to remove this limitation.
Built With
- cv2
- pyautogui
- python