Inspiration

We began this project as a journey into the world of machine learning and deep learning. Along the way, we discovered the power of landmark detection systems like MediaPipe and realized that we could bypass complex training processes by directly working with keypoint data. This insight led us to build a lightweight, real-time sign language recognition system — one that doesn’t rely on large datasets or heavy models.

What it does

Gestura is a cross-platform, accessible tool for practicing Indian Sign Language (ISL). It lets users perform and practice signs live via camera input, and test their comprehension through sentence-based playgrounds. It does not run any trained ML/DL model in the background; recognition works purely on sequential matching logic.

How we built it

We used MediaPipe to extract and normalize hand landmarks, and applied FastDTW to perform efficient dynamic gesture matching without any need for deep learning models.
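The pipeline above can be sketched roughly as follows. This is a minimal illustration, not our exact code: it normalizes each frame's hand landmarks to be translation- and scale-invariant (wrist-relative, scaled by hand span), then compares two gesture sequences with a plain dynamic-time-warping routine standing in for FastDTW. Names like `normalize_frame`, `dtw_distance`, and `match_gesture` are ours for illustration.

```python
import math

def normalize_frame(landmarks):
    """Translate landmarks so the wrist (index 0) is the origin,
    then scale so the farthest landmark lies at distance 1.
    `landmarks` is a list of (x, y) tuples, as MediaPipe Hands produces."""
    wx, wy = landmarks[0]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    span = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / span, y / span) for x, y in shifted]

def frame_distance(a, b):
    """Mean Euclidean distance between corresponding landmarks."""
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b)) / len(a)

def dtw_distance(seq_a, seq_b):
    """Classic O(n*m) dynamic time warping over landmark frames.
    FastDTW approximates the same alignment in near-linear time."""
    n, m = len(seq_a), len(seq_b)
    cost = [[float("inf")] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_distance(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],
                                 cost[i][j - 1],
                                 cost[i - 1][j - 1])
    return cost[n][m]

def match_gesture(live_seq, templates):
    """Return the template label with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(live_seq, templates[label]))
```

In the real system the frames come from MediaPipe's Hands solution and the sequence comparison uses the `fastdtw` package; the overall structure is the same.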

To make this experience widely accessible, we built:

  • A web interface for browser-based ISL practice sessions in classrooms and community centers.
  • An Android app that enables ISL users to form sentences without touching the screen — making mobile accessibility seamless.
  • A desktop integration module for gesture-based PC control, enabling typing, navigation, and basic automation via signs.

These cross-platform integrations make Gestura usable in varied settings — from rural schools to urban homes — without demanding expensive hardware or internet connectivity.

Challenges we ran into

  • Designing lightweight, real-time recognition without CNN/LSTM models.
  • Normalizing gestures across varying angles, hand sizes, and lighting conditions.
  • Creating an intuitive, accessible UI for diverse user groups.
  • Ensuring community relevance by testing and incorporating real feedback from local schools and users.

Accomplishments that we're proud of

  • Built a fully working prototype that detects dynamic signs in real time — without any model training.
  • Successfully deployed and tested in collaboration with a government school.
  • Received strong academic and community feedback.

What we learned

We learned that impactful, inclusive technology doesn't need to be computationally heavy. Smart use of existing lightweight tools can create powerful, real-time solutions. Most importantly, we discovered how community engagement — especially in education — shapes truly meaningful products.

What's next for Gestura

We're working toward:

  • Expanding support to ASL and other global sign languages.
  • Partnering with educators, NGOs, and accessibility advocates.
  • Improving gesture accuracy under angular and environmental variations.
  • Scaling outreach to more schools, public institutions, and underserved communities.

Gestura isn’t just a tech project — it’s a step toward a more inclusive, sign-accessible world.
