It is a special hackathon for all of us because of the global pandemic, and so we think COVID-19 is a suitable topic to explore and build something meaningful that can serve the community.

What it does

We integrate a computer vision model into our mobile app to detect every pedestrian on the road and label each of them with two indices: whether they are wearing a mask, and their distance from the user. If a pedestrian is too close to the user and is not wearing a mask, the UI changes and a warning sound is triggered to notify the blind user.
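The warning rule described above can be sketched in a few lines of Python. The `Detection` structure, the field names, and the distance threshold are all illustrative assumptions, not the app's actual code; the mask flag and distance estimate are assumed to come from the CV model.

```python
from dataclasses import dataclass

SAFE_DISTANCE_M = 2.0  # hypothetical social-distancing threshold in meters


@dataclass
class Detection:
    wearing_mask: bool
    distance_m: float  # estimated distance between the pedestrian and the user


def should_warn(detections):
    """Trigger the warning if any pedestrian is both unmasked and too close."""
    return any(
        not d.wearing_mask and d.distance_m < SAFE_DISTANCE_M
        for d in detections
    )


# A masked pedestrian nearby and an unmasked one far away: no warning.
print(should_warn([Detection(True, 1.0), Detection(False, 5.0)]))  # False
# An unmasked pedestrian within the threshold: warning.
print(should_warn([Detection(False, 1.5)]))  # True
```

The app would call a check like this on every frame's detections and switch the UI and play the warning sound when it returns true.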

How we built it

We use Flask as the backend, with PyTorch to construct and train the CV model. For the frontend, we use React Native to build the mobile app.
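A minimal sketch of how the mobile app could talk to the Flask backend, assuming a hypothetical `/detect` endpoint: the app POSTs a camera frame, and the server runs the model and returns the detections as JSON. `run_model` here is a stub standing in for the real PyTorch inference code.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def run_model(image_bytes):
    # Placeholder for the PyTorch model: in the real app this would decode
    # the frame, run the pedestrian/mask detector, and estimate distances.
    return [{"wearing_mask": False, "distance_m": 1.5}]


@app.route("/detect", methods=["POST"])
def detect():
    frame = request.get_data()  # raw image bytes sent by the mobile app
    detections = run_model(frame)
    return jsonify({"detections": detections})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The React Native app would then poll this endpoint with camera frames and drive the UI from the JSON response.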

Challenges we ran into

At first we wanted it to be a full-stack JavaScript project, so we decided to use React Native with TensorFlow.js to train the model. However, because of our inexperience with React Native, we ran into several bugs that prevented us from loading the TFJS model; it was the most frustrating time of the entire hackathon and cost us around 5 hours of debugging. In the end we could not solve the problem, which forced us to switch from full-stack JS to using Flask as a backend serving a RESTful API.

Accomplishments that we're proud of

Each of us has done something we had never done before: Steven did React Native, Prince did the backend, and Richard did machine learning, all for the first time.

What's next for FiatLux

We plan to beautify our UI and add more functionality, such as direction notifications for blind users and detection of obstacles in the path.
