Inspiration
India is home to over 1.3 billion people, a significant portion of whom belong to low-income communities where access to essential services like mental healthcare is severely limited. Farmers in particular face immense economic and social pressures, leading to a distressing rise in suicides. Millions in rural and underprivileged regions have no access to affordable mental health support. Compounding this problem, a large segment of the population, including people with hearing impairments, remains underserved. While technology for American Sign Language (ASL) has advanced globally, tools and platforms supporting Indian Sign Language (ISL) are still severely lacking. However, the widespread availability of affordable internet in India presents an opportunity to bridge this gap. There is a critical need for a solution that leverages modern language processing to provide mental health support in local languages, making help accessible to everyone, regardless of economic status or communication abilities.
What it does
Our software addresses this challenge with a platform that converts input from any language into English, generates personalized mental health advice, and translates it back into the user's language, ensuring that people from diverse backgrounds and abilities can receive the support they need through an accessible, low-cost, user-friendly interface. This solution provides inclusive mental health care, empowers underserved communities, and helps alleviate the mental health crisis, especially in rural India, where it is needed most.
How we built it
This project was built in Python and integrated with Google's Gemini LLM and speech recognition frameworks. It runs as a single pipeline of language detection and translation steps, so it can serve users who speak many different languages: the core processing happens in English, and responses are translated back into whatever language the user speaks or selects. The Gemini LLM powers the mental health chatbot and was iteratively instruction-tuned to make its responses more empathetic and compassionate. The team devised a novel strategy of CONTEXTUAL-REFINEMENT, in which model responses are pre-processed, summarized, and contextualized using previous chat history, all in an effort to make them more human-like.
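The conversational turn of the pipeline can be sketched as below. This is a minimal illustration, not our production code: `summarize_history`, `build_prompt`, and `respond` are hypothetical names, and `query_llm` stands in for the Gemini API call.

```python
# Minimal sketch of one chat turn with CONTEXTUAL-REFINEMENT-style
# context handling: recent history is condensed and folded into the prompt.

def summarize_history(history, max_turns=3):
    """Keep only the most recent turns as lightweight context
    (a stand-in for the summarization/contextualization step)."""
    recent = history[-max_turns:]
    return " ".join(f"{role}: {text}" for role, text in recent)

def build_prompt(user_message, history):
    """Combine summarized history with the new message so the model
    answers in context rather than in isolation."""
    context = summarize_history(history)
    return (
        "You are an empathetic mental-health support assistant.\n"
        f"Conversation so far: {context}\n"
        f"User: {user_message}\nAssistant:"
    )

def respond(user_message, history, query_llm):
    """One pipeline turn: contextualize, query the model, record the turn."""
    prompt = build_prompt(user_message, history)
    reply = query_llm(prompt)  # in production, a Gemini API call
    history.append(("User", user_message))
    history.append(("Assistant", reply))
    return reply
```

In the full system, the user's message would first be translated into English and the reply translated back into their language before being shown.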
Challenges we ran into
Integrating ISL into our product was the most challenging part. There is no readily usable Indian Sign Language dictionary dataset for training models that take ISL as input. While we did integrate ISL character input, which is progress over what already exists in the market, its real-world utility is limited because no one spells out entire sentences character by character in sign language. Even in ASL, the datasets available for whole words (not just characters) require very high compute and cannot be used to build a market-ready product.
Moreover, while developing the mental health bot, it took multiple rounds of trial and error to find the precise set of instructions that generates responses that sound empathetic and human-like. Making the product recognize any input language also took a while.
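Recognizing arbitrary input languages requires detecting the language before translating to English. One lightweight heuristic (a sketch under assumed simplifications, not our exact implementation, which relied on speech recognition frameworks) is to route text by its Unicode script range:

```python
# Hedged sketch: coarse script detection via Unicode code-point ranges.
# Covers Devanagari (Hindi/Marathi) and Tamil as examples; a production
# system would use a proper language-detection library instead.

def detect_script(text):
    """Return a coarse script label based on the first letter found."""
    for ch in text:
        cp = ord(ch)
        if 0x0900 <= cp <= 0x097F:   # Devanagari block
            return "devanagari"
        if 0x0B80 <= cp <= 0x0BFF:   # Tamil block
            return "tamil"
        if ch.isalpha():             # fall back to Latin-script text
            return "latin"
    return "unknown"
```

Script detection alone cannot distinguish languages that share a script (e.g. Hindi and Marathi), which is one reason a dedicated detection step is needed in practice.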
Accomplishments that we're proud of
- Firstly, we are proud of identifying a gap in the market and developing an idea that caters to a large underrepresented audience.
- We are proud of learning about ISL and ASL, and spending multiple hours trying to find an implementable solution.
- We are happy that we were able to divide the work equally and, through great teamwork, implement an idea with multiple features in just 32 hours.
What we learned
We learnt that LLMs cannot give clinical advice. They can serve as a first point of contact in communities that lack mental healthcare and can help people feel supported; however, using them to diagnose people is tricky and potentially dangerous. We also learnt that there is no Indian Sign Language dataset that can be utilised to build accessible products, and compiling one would make for a novel research topic. Developing code that can take ASL input in video format (beyond just letters) would also make for a very relevant product that is not readily available in the market.
What's next for CareBridge
CareBridge can scale by adding more regional languages and continuing to improve ISL gesture recognition, making the product more accessible. On the machine learning side, the chatbot can be trained on datasets of mental health conversations to recognise issues and provide better suggestions. With these improvements, the product will be market-ready and can enter proper testing with research groups to validate the results.