Inspiration
We are inspired by the important role technology and data play in improving our health.
What it does
It predicts and visualizes pollution/allergy triggers on a simple heat-map.
These triggers can be measured by various sources, such as sensors and cameras. The sensor data is visualized against recommended medical thresholds (e.g. CO2 levels). Image information is extracted by our neural-network model, which can handle cellphone pictures, security footage, vehicle dash-cams, drones, and satellites. For each image, the model outputs the probability that a pollution/allergy trigger is present. These triggers are dynamic and predefined by the user; examples include specific flower species (pollen), construction sites (sand), and animals (fur).
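The per-image output described above can be sketched as a multi-label scoring step: one independent probability per user-defined trigger, obtained by applying a sigmoid to each raw model score. The trigger names and scores below are illustrative, not the team's actual configuration.

```python
import math

# Hypothetical user-defined trigger list (the real list is configurable in the app).
TRIGGERS = ["pollen", "construction_dust", "animal_fur"]

def trigger_probabilities(logits):
    """Map one raw model score per trigger to an independent probability.

    A sigmoid per trigger (rather than a softmax across triggers) lets an
    image contain several triggers at once, e.g. pollen AND dust.
    """
    return {t: 1 / (1 + math.exp(-z)) for t, z in zip(TRIGGERS, logits)}

# Example: strong pollen signal, weak dust signal, borderline fur signal.
probs = trigger_probabilities([2.0, -1.0, 0.3])
```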
All of this information is projected onto a city map, forming a single heat-map that visualizes hundreds to thousands of data points in a simple app. Such a heat-map can be used to avoid pollution/allergy triggers when navigating from point A to B, or for apartment hunting (personalizing the area you live in).
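One way to condense hundreds to thousands of readings into heat-map intensities, as described above, is to bin weighted detections into grid cells. This is a minimal sketch under assumed details (grid resolution in degrees, severity/probability weights); the app's actual rendering goes through Mapbox.

```python
import math
from collections import Counter

def heatmap_bins(points, cell=0.01):
    """Aggregate (lat, lon, weight) readings into grid cells.

    Each cell's total weight becomes the heat-map intensity for that area.
    `cell` is the grid resolution in degrees (an assumption for this sketch).
    """
    bins = Counter()
    for lat, lon, weight in points:
        key = (math.floor(lat / cell), math.floor(lon / cell))
        bins[key] += weight
    return bins

# Mixed sensor and image detections, weighted by severity/probability (made up).
readings = [(52.5200, 13.4050, 1.0), (52.5201, 13.4049, 0.5), (48.1370, 11.5750, 2.0)]
bins = heatmap_bins(readings)
```

Nearby readings fall into the same cell and reinforce each other, which is what makes a cluster of detections show up as a hot spot.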
How we built it
For the frontend, we set up a local React project and connected it to the UI library of Amplify Studio, which let us design the UI in Figma. We visualized the data points as a heat-map using the Mapbox API.
We created the backend in Amplify Studio by defining a data model and using its built-in connection to AWS data storage. For image recognition we used a large-scale pre-trained neural network, which we set up with the PyTorch library. In addition, we used the Captum package to explain our model's decisions (e.g. why an image was marked as a pollen trigger).
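The "recycled" transfer-learning setup mentioned above and in the Environment Friendly section can be sketched as follows: freeze the pre-trained backbone's weights and train only a small head that scores the triggers. The tiny stand-in backbone and trigger list here are illustrative, not the team's actual model.

```python
import torch
import torch.nn as nn

TRIGGERS = ["pollen", "construction_dust", "animal_fur"]  # hypothetical trigger list

def build_trigger_classifier(backbone: nn.Module, feature_dim: int) -> nn.Module:
    # Freeze every pre-trained weight, so the original costly training run
    # is reused rather than repeated (transfer learning).
    for p in backbone.parameters():
        p.requires_grad = False
    # Only this small head is trained on the user-defined trigger labels.
    head = nn.Linear(feature_dim, len(TRIGGERS))
    return nn.Sequential(backbone, head)

# Stand-in backbone; in the real app this would be the large pre-trained model.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 16), nn.ReLU())
model = build_trigger_classifier(backbone, feature_dim=16)

with torch.no_grad():
    probs = torch.sigmoid(model(torch.randn(2, 3, 8, 8)))  # per-trigger probabilities
```

Because only the head's weight and bias are trainable, fine-tuning takes minutes instead of the weeks (and CO2) that training the backbone from scratch would cost.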
Deployment is done through AWS, connected to a GitHub repository.
Challenges we ran into
Narrowing down our idea for the challenge was not easy: we had several directions to pick from and finally converged on the one that excited all of us.
The art of visualizing data points as a heat-map was a challenge in itself (colors, shapes, sizes, interactions, etc.).
What we learned
Beyond the technical skills gained through typical application development, we learned to think and work together and to properly leverage each team member's strengths to achieve our goal. The language gaps and multinational team on top of that made it even more interesting, fun, and challenging.
Environment Friendly
In recent years, big-tech companies have trained huge neural-network models on massive datasets. Our neural network is one of these models, trained on ~500M images. Each such training run leaves a serious CO2 footprint due to its huge electricity consumption. To prevent further waste, we "recycled" an existing neural network (so-called "transfer learning"), adapting it to our domain instead of training a new one.
What’s next for the project
We hope to keep working on this project because of its huge potential impact on everyone's life. We may not be able to cure allergies, but we can help people avoid them. One small step for tech, one giant leap for health!
Built With
- adobe-after-effects
- adobe-illustrator
- ai
- amazon-web-services
- amplify
- dynamodb
- figma
- github
- javascript
- machine-learning
- photoshop
- python
- pytorch
- react