The landing page for our web application.
The American Heart Association published the first Scientific Statement on Acute Myocardial Infarction in Women. Among its findings:
- Heart disease kills more women than any other cause, including cancer.
- Women with heart disease are seven times more likely than men to be misdiagnosed.
- Women present with “atypical” symptoms more often than men.
- Even high-risk ECG abnormalities are missed by emergency medicine physicians.
In “When Doctors Don’t Listen”, Dr. Kosowsky argues that active patient participation can prevent deadly mistakes. Hera, named after the Greek goddess of women's health, is an application that empowers women to confidently address their heart health concerns and fight gender inequality in healthcare.
What it does
We use deep learning to predict the probability that a patient has heart disease based on their ECG results. The user uploads a photo of their ECG on our web or mobile application, and our model analyzes the image to estimate that probability.
How I built it
The deep learning model is a CNN built in Python. It consists of four convolutional layers and two fully connected layers. Training the model required the input data in an acceptable format: we downloaded the ECG dataset from the PhysioNet website and preprocessed it so that the CNN, built with the Keras library, could read it. The dataset was then split into training and test sets to measure accuracy.
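The architecture above (four convolutional layers plus two fully connected layers, ending in a probability) can be sketched in Keras roughly as follows. The input size, filter counts, and other hyperparameters here are illustrative assumptions, not our exact configuration:

```python
# Sketch of a 4-conv / 2-dense CNN for ECG images in Keras.
# Input shape and layer widths are assumptions for illustration.
from tensorflow.keras import layers, models

def build_ecg_cnn(input_shape=(128, 128, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Four convolutional layers, each followed by max pooling
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        # Two fully connected layers; the sigmoid output is the
        # predicted probability of heart disease
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

The sigmoid output maps naturally onto the probability the app reports to the user.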
For the Android app, we developed the application using Xamarin and Firebase. The web application is built on a CherryPy server. Both applications have a simple, easy-to-use UI and a feature to upload ECG images. We save these images on the server with the goal of using the collected data to improve our model.
Challenges I ran into
It was incredibly difficult to find a suitable dataset in such a short period of time, and that greatly impacted our application. Initially, we planned to build a simple mobile application with swipe functionality, where users could swipe left or right depending on whether or not they had certain symptoms. Unfortunately, we could not find a compatible dataset, so we decided to use an image-processing approach instead, which changed our UI and entire tech stack. We also had difficulty establishing a connection between Firebase and our Android application due to issues with our build automation tool, Gradle. Finally, because the dataset we used was extremely large, our model took a very long time to train.
Accomplishments that I'm proud of
In just 36 hours, we built two applications (mobile and web) on top of a very large dataset. Our idea addresses a serious real-world problem. All of our members were either first-time hackers or people without much experience in the technologies we worked with. We stayed up all night to polish our application!
What I learned
Technically, we all learned something new because we pushed ourselves: it was our first time diving into Android app development and setting up our own Python server, our first time at a hackathon, and some of us had never used Firebase, or even Git, before.
We also learned the importance of teamwork and communication! On Saturday night, we were all ready to give up because the app was so challenging and we had made so many mistakes. However, we motivated each other to keep working on the application, inspired by each other's determination to finish the project on time.
What's next for Hera
We have already measured accuracy, but in machine learning and deep learning accuracy alone does not tell the whole story, especially on imbalanced medical data. Next, we will measure the F-score, precision, and recall. We also plan to improve our dataset, which should improve accuracy and those other metrics. All images uploaded by users are saved on our server so we can keep improving the model.
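The metrics we plan to add are one-liners with scikit-learn. The labels and predictions below are made-up examples, not real model output:

```python
# Precision, recall, and F1 on toy labels, using scikit-learn.
# y_true / y_pred here are illustrative, not real model results.
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # ground-truth diagnoses
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # model predictions

precision = precision_score(y_true, y_pred)  # of predicted positives, how many were right
recall = recall_score(y_true, y_pred)        # of actual positives, how many we caught
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall
```

Recall matters most for us: a missed heart condition (a false negative) is far more costly than a false alarm.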