Inspiration

My inspiration for BlinkTalk came from a poignant moment in Grey's Anatomy: after a tragic accident, a character could communicate only 'Yes' and 'No' through eye blinks, unable to say 'I love you' to his loved one in his final moments. The scene resonated deeply with me as I imagined the profound emotional turmoil of being unable to speak to loved ones at critical moments. It also reminded me of historical instances such as U.S. Navy pilot Jeremiah Denton, who, while held as a prisoner of war, blinked 'TORTURE' in Morse code during a televised interview, letting the world know what was actually happening. As someone experienced in software development, I realized I could apply my coding skills to create BlinkTalk, a solution that detects eye blinks and translates them into Morse code and then into English text. This technology aims to empower individuals with limited mobility to communicate effectively and independently.

What it does

BlinkTalk is a computer vision solution that helps quadriplegic patients communicate. By detecting eye blinks, our technology translates blink patterns into Morse code, which is then converted into English text. This approach empowers individuals with limited mobility to express themselves more independently and effectively. BlinkTalk will be delivered as a smartphone app, ensuring efficient translation of eye blinks into actionable communication and making it easily available to a broad user base.

How we built it

BlinkTalk uses computer vision techniques to detect and interpret eye blinks. With the OpenCV and dlib libraries, we track facial landmarks and detect blinks from the eye aspect ratio (EAR), classifying them by duration. Morse code translation is achieved through predefined mappings that convert blink sequences into readable English text. The system is being developed as a mobile application for seamless accessibility and user-friendliness.
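The two core pieces of this pipeline can be sketched as follows. This is a minimal illustration, not our production code: the camera capture and dlib landmark extraction are omitted, `eye` stands for the six (x, y) landmark points dlib's 68-point model returns per eye, and the blink-duration threshold is an illustrative placeholder rather than a tuned value.

```python
import math

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).

    `eye` is six (x, y) landmark points around one eye. The ratio of
    vertical to horizontal eye opening drops sharply during a blink.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

# A short blink is a dot, a long blink a dash. This cutoff (seconds)
# is a hypothetical placeholder, not a calibrated value.
DOT_MAX_SECONDS = 0.4

# Standard international Morse code, letters only for brevity.
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def blinks_to_text(blink_durations):
    """Translate blink durations (grouped per letter) into text."""
    letters = []
    for letter in blink_durations:
        code = "".join("." if d <= DOT_MAX_SECONDS else "-" for d in letter)
        letters.append(MORSE_TO_CHAR.get(code, "?"))
    return "".join(letters)
```

In the full app, the per-frame EAR is compared against a threshold to detect when the eyes close and reopen, and the elapsed closed time is fed into `blinks_to_text`; for example, three short blinks followed by three long blinks would decode to "SO".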

Challenges we ran into

Developing accurate and real-time blink detection posed significant challenges, requiring fine-tuning of parameters and algorithms to ensure reliable performance across different lighting conditions and facial variations. Integrating Morse code translation seamlessly into the application while maintaining responsiveness also presented technical hurdles that required iterative testing and optimization.

Accomplishments that we're proud of

We're proud of developing a solution that enhances communication for individuals with limited mobility through an innovative combination of computer vision and Morse code translation.

What we learned

Through BlinkTalk, we gained insights into the complexities of real-time computer vision applications and the importance of robust algorithm development. We honed our skills in integrating diverse technologies to create a cohesive solution that addresses real-world communication challenges. Additionally, we deepened our understanding of user-centered design principles in developing accessible technologies for diverse user groups.

What's next for BlinkTalk

In the future, BlinkTalk aims to expand its functionality by enhancing accuracy and reliability in blink detection through machine learning advancements. We plan to optimize the mobile application for seamless integration with assistive technologies and explore partnerships for broader accessibility. Continuous user feedback and iterative development will drive our efforts to empower more individuals with limited mobility worldwide.
