Inspiration
The main inspiration for our project was a problem we often face in our daily lives: we ignore small or recurring health issues because visiting a doctor and paying the fees for a regular checkup feels inconvenient. And when we search our symptoms on Google, it usually gives only one answer, which may not be accurate. Recently, an aunt of mine had a low pulse rate; Google suggested it was bradycardia, but it was actually hypothyroidism.
What it does
Our platform has two main features. First, a signed-in user can use our health checkup feature: they enter the symptoms they are facing, and our ML model suggests the top 5 most likely health problems. Second, based on the user's location, the platform finds the nearest health centres using the OpenStreetMap API and marks them on a map.
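The top-5 step can be sketched as a sort over the model's class scores. This is only an illustrative sketch: the real condition names and scores come from our Python model, so the `scores` object and `topFive` helper below are placeholders, not the actual model output.

```javascript
// Sketch: pick the top 5 conditions from a map of model scores.
// Condition names and scores here are illustrative, not real model output.
function topFive(scores) {
  return Object.entries(scores)
    .sort(([, a], [, b]) => b - a) // highest score first
    .slice(0, 5)
    .map(([condition]) => condition);
}

// Example usage with made-up scores:
const suggestions = topFive({
  "Common cold": 0.31,
  "Flu": 0.27,
  "Allergy": 0.15,
  "Bronchitis": 0.1,
  "Migraine": 0.09,
  "Sinusitis": 0.05,
});
```

The same idea works whether the model returns probabilities or raw logits, since only the ranking matters for the suggestion list.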
How we built it
We used Next.js for the frontend and backend, MongoDB for the user database, and NextAuth for Google authentication. We built the landing page, the dashboard, and the features separately, then integrated the frontend with the Python script, which was deployed on Hugging Face. Finally, we merged the frontend and backend and conditionally rendered the user dashboard so that it is visible only when the user is signed in.
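The sign-in gate boils down to checking the NextAuth session. In the real app this lives inside a React component using `useSession()`; here the logic is reduced to a plain hypothetical function so the idea is easy to see.

```javascript
// Sketch of the sign-in gate: decide which view to show from the
// session object NextAuth provides. In the actual app this check runs
// inside a React component via useSession(); this is a simplification.
function viewFor(session) {
  // NextAuth yields null (no session) when the user is not signed in.
  if (!session || !session.user) {
    return "landing"; // show the landing page with a sign-in button
  }
  return "dashboard"; // signed-in users see their dashboard
}
```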
Challenges we ran into
The very first challenge was the accuracy of our ML model. It took multiple datasets and deployments to reach our current 95% accuracy, and we expect it to cross 98% with more training. Next came the health check app, where the model had to be integrated with the frontend. And lastly, showing the user's distance from the health centres was a bit of a challenge, but we pulled it off. Beyond that, all of us kept learning along the way and did our best to contribute to the project.
Accomplishments that we're proud of
It was our first time working with an ML model, so successfully integrating it and displaying its results felt like a big achievement. Pulling off the dynamic map with a current-location indicator was also pretty cool. Overall, we are very happy with how our project turned out, and there are many features we intend to add in the coming days.
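The user-to-centre distance can be computed with the standard haversine formula on the coordinates OpenStreetMap returns. This is a sketch under the usual assumption of a mean Earth radius of 6371 km; the centre names and coordinates are illustrative.

```javascript
// Great-circle distance in km between two lat/lon points (haversine formula).
function haversineKm(lat1, lon1, lat2, lon2) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km (assumption)
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Sort health centres by distance from the user (illustrative data):
const userLat = 22.58, userLon = 88.37;
const centres = [
  { name: "Centre A", lat: 22.57, lon: 88.36 },
  { name: "Centre B", lat: 22.60, lon: 88.40 },
];
centres.sort(
  (p, q) =>
    haversineKm(userLat, userLon, p.lat, p.lon) -
    haversineKm(userLat, userLon, q.lat, q.lon)
);
```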
What we learned
We learnt dynamic routing in Next.js, Google authentication with NextAuth, connecting a Python script hosted on Hugging Face to the frontend, fetching a user's live location and dynamically displaying the relevant data, and collaborating with and helping one another throughout the work.
What's next for Chikitshalay
Next, we plan to build a mobile version of this application. We also plan to add a Vertex AI model that can recognise doctors' handwriting and, from the data it extracts, suggest where to go next and what steps to take. Finally, we plan to add a list of doctors and recommend the best ones based on the health checkup results.
Built With
- google-auth
- javascript
- mapbox
- mongodb
- next-auth
- next.js
- openstreetmap
- python
- react.js
- tensorflow