Inspiration

Our goal was to create a product that increases accessibility for individuals with mobility challenges. Many people with limited mobility rely on caretakers for daily tasks such as controlling home devices like lights and air conditioners. With independent living and quality of life in mind, our product provides a hands-free way to control different elements of the home.

What it does

Like a smart home app, the website lets users control lighting, but through facial expressions instead of touch.

Key features:

  • Facial Expression Control: The website uses real-time facial recognition to interpret a user’s facial expressions and map them to actions like turning lights on/off, adjusting brightness, or changing color temperature.

  • Hands-Free Interaction: Users can control their environment without needing to touch anything, offering a more accessible solution for those with limited mobility. A smile could increase the brightness, while a frown could dim the lights, for instance.

How we built it

The laptop webcam captures the user's facial expression, and MediaPipe classifies it as a smile or a frown, which corresponds to turning the brightness up or down. The change in brightness is simulated on the website using HTML, CSS, and JavaScript.
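The expression-to-brightness mapping described above could look roughly like the sketch below. It assumes MediaPipe's Face Landmarker blendshape output (category names such as `mouthSmileLeft` come from its ARKit-style blendshape set); the thresholds, step size, and function names are illustrative, not the project's actual implementation.

```javascript
// Illustrative thresholds and step size (assumptions, not tuned values).
const SMILE_THRESHOLD = 0.5;
const FROWN_THRESHOLD = 0.5;
const STEP = 10; // brightness change per detection, in percent

// blendshapes: object of { categoryName: score } with scores in [0, 1],
// as produced by MediaPipe's Face Landmarker blendshape output (assumed).
function classifyExpression(blendshapes) {
  const smile = (blendshapes.mouthSmileLeft + blendshapes.mouthSmileRight) / 2;
  const frown = (blendshapes.browDownLeft + blendshapes.browDownRight) / 2;
  if (smile > SMILE_THRESHOLD && smile > frown) return "smile";
  if (frown > FROWN_THRESHOLD) return "frown";
  return "neutral";
}

// Map the detected expression to a brightness change, clamped to [0, 100].
function updateBrightness(current, expression) {
  if (expression === "smile") return Math.min(100, current + STEP);
  if (expression === "frown") return Math.max(0, current - STEP);
  return current;
}
```

In the real pipeline, `classifyExpression` would run on each webcam frame's detection result and the returned brightness would drive the page's simulated light (e.g. a CSS `opacity` or `filter: brightness()` value).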

Challenges we ran into

  • Having an ambitious idea
  • Unfamiliarity with frontend backend communication
  • Real-time computing

Accomplishments that we're proud of

  • Persevering through an ambitious idea while learning new languages

What's next for Faceable

  • Implement different variables to control (temperature, volume)
  • Control physical devices (smart light, AC, heater, TV)
  • Allow users to set their own facial expressions for controls

Built With

  • MediaPipe
  • HTML
  • CSS
  • JavaScript
