Team Members

Sun Yitao ID 784
Noel Kwan Zhi Kai ID 799
Ng Wan Ying ID 113

Inspiration

It is quite common to see people walking while using their phones during a commute, ourselves included. However, since only peripheral vision is left to watch the path ahead, doing so can lead to tripping, bumping into obstacles, or not checking for traffic when crossing a road. Some friends we spoke to also expressed frustration with other people nearly walking into them while on their phones.

What it does

Surf & Stroll is a fully functional mobile browser that sends a visual and audio alert when there is an obstacle ahead, so the user does not walk into it or trip over it. It also has GPS functionality that lets the user see where they are walking and route a path from point A to point B.

How we built it

We started by collecting data with our phones in video format and extracting frames as JPEGs at 5 frames per second. We then trained a ResNet18 model in Keras, which reached 96% validation accuracy; we chose ResNet18 because our limited dataset caused larger models to overfit. After that, we converted the .h5 checkpoint to Core ML format using coremltools. We then added the camera capture and image recognition to an open-source browser called Brave (unfortunately, iOS does not allow expensive computation or camera access while the app is in the background). We chose to build on a web browser because it retains most of the smartphone's functionality, including writing emails, browsing social media, and using various other web apps. The GPS routing feature was written with Apple's MapKit framework.
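To illustrate the on-device inference step, the sketch below shows one way camera frames could be classified against the converted model using Apple's Vision framework. The model class name ObstacleClassifier and the "obstacle" label are placeholders rather than the project's actual identifiers, so treat this as a sketch, not the app's real code.

```swift
import AVFoundation
import CoreML
import Vision

// Minimal sketch: classify each camera frame with the converted Core ML model.
// "ObstacleClassifier" and the "obstacle" label are placeholder names.
final class ObstacleDetector {
    private let request: VNCoreMLRequest

    init?(onObstacle: @escaping (String, Float) -> Void) {
        guard let mlModel = try? ObstacleClassifier(configuration: MLModelConfiguration()).model,
              let visionModel = try? VNCoreMLModel(for: mlModel) else { return nil }
        request = VNCoreMLRequest(model: visionModel) { request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            // Only alert when the model is reasonably confident an obstacle is ahead.
            if top.identifier == "obstacle" && top.confidence > 0.8 {
                onObstacle(top.identifier, top.confidence)
            }
        }
        request.imageCropAndScaleOption = .centerCrop
    }

    // Call this for every pixel buffer delivered by AVCaptureVideoDataOutput.
    func classify(_ pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right)
        try? handler.perform([request])
    }
}
```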

Accomplishments that we're proud of

We finished the model training well ahead of schedule, which freed us to start work on the app early (we allocated 12 hours but finished in less than half that time). We then had some difficulty getting the asynchronous image recognition to run smoothly without too many alerts popping up at once, but we solved it with a counter that stops certain threads from executing, as sketched below. After that we worked on our stretch goal of a GPS routing feature. We could not find good code examples for it, which led to almost 4 hours of debugging because the app would crash as soon as the map loaded. At one point we almost decided to rewrite the entire app from scratch, since the existing app was built programmatically, which we were not familiar with. After rewriting the MapViewController a few times, we finally got it to work!
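A rough sketch of that counter idea, assuming frames arrive on a single serial capture queue; the names here are illustrative, not the app's actual code:

```swift
import Foundation

// Illustrative sketch of the counter described above: skip frames while a
// classification is still in flight, and space alerts apart in time.
final class AlertThrottler {
    private var inFlight = 0                     // classifications currently running
    private var lastAlert = Date.distantPast     // time of the most recent alert
    private let minInterval: TimeInterval = 2.0  // minimum seconds between alerts

    // Returns true if this frame should be classified; drops the frame when a
    // previous classification has not finished yet.
    func shouldProcessFrame() -> Bool {
        guard inFlight == 0 else { return false }
        inFlight += 1
        return true
    }

    func finishedProcessing() {
        inFlight = max(0, inFlight - 1)
    }

    // Returns true if enough time has passed since the last alert was shown.
    func shouldAlertNow() -> Bool {
        let now = Date()
        guard now.timeIntervalSince(lastAlert) >= minInterval else { return false }
        lastAlert = now
        return true
    }
}
```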

What we learned

We learnt how to do things in Swift without using storyboards, and how to persevere when things looked dire. We also became more proficient with machine learning and with resolving git merge conflicts.
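For anyone curious what a storyboard-free map screen with walking directions can look like, here is a minimal sketch using MKDirections. The class name, coordinates, and layout are illustrative placeholders, not the app's actual implementation.

```swift
import MapKit
import UIKit

// Illustrative sketch of a map view controller built entirely in code
// (no storyboards) that requests a walking route with MKDirections.
// Coordinates and names are placeholders.
final class MapViewController: UIViewController, MKMapViewDelegate {
    private let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        mapView.delegate = self
        mapView.showsUserLocation = true
        view.addSubview(mapView)

        routeWalk(from: CLLocationCoordinate2D(latitude: 1.3521, longitude: 103.8198),
                  to: CLLocationCoordinate2D(latitude: 1.2966, longitude: 103.7764))
    }

    private func routeWalk(from start: CLLocationCoordinate2D, to end: CLLocationCoordinate2D) {
        let request = MKDirections.Request()
        request.source = MKMapItem(placemark: MKPlacemark(coordinate: start))
        request.destination = MKMapItem(placemark: MKPlacemark(coordinate: end))
        request.transportType = .walking

        MKDirections(request: request).calculate { [weak self] response, _ in
            guard let self = self, let route = response?.routes.first else { return }
            self.mapView.addOverlay(route.polyline)
            self.mapView.setVisibleMapRect(route.polyline.boundingMapRect,
                                           edgePadding: UIEdgeInsets(top: 40, left: 40, bottom: 40, right: 40),
                                           animated: true)
        }
    }

    // Draw the returned route as a line overlay on the map.
    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        let renderer = MKPolylineRenderer(overlay: overlay)
        renderer.strokeColor = .systemBlue
        renderer.lineWidth = 4
        return renderer
    }
}
```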

What's next for Surf & Stroll

We plan to have the app detect when the user is crossing a road and keep browsing paused until the user is off the road. We also want to build an Android version to make the app more accessible.

Built With

brave, coremltools, keras, mapkit, swift