Inspiration
Access to healthcare facilities is a significant challenge for many individuals in rural areas, where the nearest clinic might be hundreds of kilometres away, often in a city centre.
While minor injuries might not always require an immediate visit to a clinic, proper wound care remains essential. In rural communities, however, people often rely on traditional remedies to treat injuries, and these can be ineffective or, in some cases, cause more harm than good.
What it does
“You ok anot” is a mobile app designed to help users identify and treat wounds effectively using their smartphone camera.
The app analyzes a photo of the wound and classifies it into categories such as abrasions (commonly caused by falling), lacerations (from cuts), bruises, or burns. Based on the classification, it provides step-by-step instructions for proper wound care.
To ensure accessibility in rural areas where internet connectivity may be unreliable, the AI model is deployed directly on the device, allowing the app to function offline.
For more serious injuries, the app also shows the nearest clinics or hospitals, which is particularly useful in regions where medical facilities might be hundreds of kilometres away.
Additionally, the app features a first aid guide, providing users with detailed instructions for treating wounds without the need to take a photo to classify the injury.
“You ok anot” is designed to empower rural communities by making wound care resources accessible and easy to use.
How we built it
Our mobile app is built with the Android SDK and Jetpack Compose. The AI model was built in TensorFlow and ported to Android using TFLite. We utilised transfer learning: starting from an EfficientNetB0 base model pretrained on the ImageNet dataset, we added our own custom layers on top while keeping the pretrained weights, then trained the model on our own dataset. We also used Mapbox to get the user's location and display the nearest clinics on a map.
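As a rough sketch, the transfer-learning setup described above might look like this in Keras. The head layers and hyperparameters here are illustrative guesses, not our exact configuration:

```python
import tensorflow as tf

NUM_CLASSES = 5  # five wound classes

def build_model(img_size=224, weights="imagenet"):
    # EfficientNetB0 backbone pretrained on ImageNet, classifier head removed
    base = tf.keras.applications.EfficientNetB0(
        include_top=False, weights=weights,
        input_shape=(img_size, img_size, 3))
    base.trainable = False  # keep the pretrained weights frozen

    # Custom classification head on top of the frozen backbone
    inputs = tf.keras.Input(shape=(img_size, img_size, 3))
    x = base(inputs, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.2)(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training is then a standard `model.fit(train_ds, validation_data=val_ds)` call over the image dataset.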
Challenges we ran into
Our model initially had very poor accuracy (20-30%), and we spent hours tweaking the layers and experimenting with various data augmentation techniques. We suspected a lack of diversity in the dataset or a class imbalance. After extensive trial and error, however, we discovered the root cause was the choice of base model: MobileNetV2, which we started with, simply did not perform well for our task. Once we explored other base models, we landed on EfficientNetB0, which far exceeded our expectations.
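Swapping the backbone becomes a one-line change when it is a parameter; a hypothetical sketch (the helper names are ours, not from our codebase). One pitfall worth noting when comparing Keras backbones: MobileNetV2 expects inputs scaled to [-1, 1] via its `preprocess_input`, while the Keras EfficientNet models normalise internally and take raw [0, 255] pixels, so feeding the wrong range can quietly hurt accuracy:

```python
import tensorflow as tf

# Candidate backbones paired with their own input preprocessing
BACKBONES = {
    "mobilenet_v2": (tf.keras.applications.MobileNetV2,
                     tf.keras.applications.mobilenet_v2.preprocess_input),
    "efficientnet_b0": (tf.keras.applications.EfficientNetB0,
                        tf.keras.applications.efficientnet.preprocess_input),
}

def build_candidate(name, num_classes=5, img_size=224, weights="imagenet"):
    backbone_cls, preprocess = BACKBONES[name]
    base = backbone_cls(include_top=False, weights=weights,
                        input_shape=(img_size, img_size, 3))
    base.trainable = False
    inputs = tf.keras.Input((img_size, img_size, 3))
    x = preprocess(inputs)          # each backbone's own input scaling
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs, name=name)
```

Each candidate can then be trained on the same datasets and compared on validation accuracy.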
Accomplishments that we're proud of
We reached 97% accuracy on our validation dataset across 5 wound classes. We're also very happy that we managed to deploy the TFLite model on-device on Android.
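For the on-device part, the Keras-to-TFLite conversion step can be sketched as below. This is a minimal sketch rather than our exact converter configuration; the quantisation flag is optional but helps shrink the model for mobile:

```python
import tensorflow as tf

def convert_to_tflite(model, path="wound_classifier.tflite"):
    """Convert a trained Keras model to a .tflite file for on-device inference."""
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    # Optional: dynamic-range quantisation to reduce model size and latency
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_bytes = converter.convert()
    with open(path, "wb") as f:
        f.write(tflite_bytes)
    return tflite_bytes
```

The resulting `.tflite` file is bundled as an Android asset and run with the TFLite `Interpreter`, which is what lets the classifier work fully offline.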
What we learned
- Experiment with different base models before committing to one.
- Learnt how to use Mapbox to simplify getting the user's location and rendering the map
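Mapbox handles location and map rendering, but picking the nearest clinic from a list of coordinates ultimately reduces to a great-circle distance computation. A minimal sketch, with made-up clinic data for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # Earth radius ~6371 km

def nearest_clinic(user, clinics):
    """user: (lat, lon); clinics: list of (name, lat, lon) tuples."""
    return min(clinics, key=lambda c: haversine_km(user[0], user[1], c[1], c[2]))
```

This is the standard haversine formula; for distances of hundreds of kilometres it is accurate to well under a percent, which is plenty for ranking clinics.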
What's next for You ok anot
Currently, our app supports classification for only 5 wound types. Moving forward, we aim to expand its capabilities to include a wider variety of wounds. For example, while we can already classify burns, we acknowledge the need to differentiate between the degrees of burns, as each requires specific treatments.
For instance:
- Second-degree burns require cool water, but not warm water.
- Third-degree burns should not be treated with either cold or warm water; they require immediate medical attention.
To achieve this level of detail, we would need a significantly more comprehensive dataset that accounts for varying degrees and types of wounds. This expansion will enable "You ok anot" to provide more accurate and tailored treatment guidance for users in diverse situations.
