Inspiration

With convenience driving (no pun intended) consumers to choose whatever software is easiest for them, cAR was inspired by the lack of readily accessible information in the transportation industry.

What it does

Point the phone's camera at a car and the app predicts what car it is. Then, when you tap the screen, it adds an augmented reality label anchored to the car.
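As a rough sketch of how that tap-to-anchor step could work (not the project's actual code): assuming a view controller that holds an ARSCNView and the latest classifier output, a tap is raycast onto detected plane geometry and a 3D text node is pinned at the hit point.

```swift
import ARKit
import SceneKit
import UIKit

final class CarLabelViewController: UIViewController {
    // Assumed properties: the AR view and the most recent prediction string.
    let sceneView = ARSCNView()
    var currentPrediction = "Unknown car"

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // Raycast from the tap point onto detected plane geometry.
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .existingPlaneGeometry,
                                                 alignment: .any),
              let hit = sceneView.session.raycast(query).first else { return }

        // Build a small 3D text label and pin it where the ray hit the plane.
        let text = SCNText(string: currentPrediction, extrusionDepth: 0.5)
        text.font = UIFont.systemFont(ofSize: 10)

        let labelNode = SCNNode(geometry: text)
        labelNode.scale = SCNVector3(0.005, 0.005, 0.005) // shrink to real-world scale
        labelNode.simdTransform = hit.worldTransform      // anchor at the hit point
        sceneView.scene.rootNode.addChildNode(labelNode)
    }
}
```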

How I built it

I trained a CoreML model on Google Images of various cars, producing a neural net that predicts which car the camera is seeing. The app then uses the Liberty Mutual Shine API to retrieve the vehicle's MPG.
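Here is a minimal sketch of how the prediction step could be wired up with Vision; "CarClassifier" is a stand-in name for the trained model's generated class, not necessarily the one in the project. The predicted label would then be used to look up MPG via the Liberty Mutual Shine API (endpoint details omitted here).

```swift
import Vision
import CoreML

// Illustrative sketch: "CarClassifier" stands in for the Xcode-generated class
// of the trained Core ML model; the real class name may differ.
func classifyCar(in pixelBuffer: CVPixelBuffer,
                 completion: @escaping (String?) -> Void) {
    guard let coreMLModel = try? CarClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the highest-confidence classification as the predicted car.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```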

Challenges I ran into

Creating the CoreML model took a lot longer than I expected. The AR labels can also get buggy when the app struggles to detect a plane to anchor them to.

Accomplishments that I'm proud of

I think the AR component makes the app really interesting and opens up a lot of possibilities. I'm also proud of the CoreML model, as I had never built anything like it before. The app could be especially useful on real cars rather than just pictures of them: at a car dealership, a visitor could map out the lot by anchoring an AR label to each car and easily compare each car's MPG.

What I learned

I learned how to develop a CoreML model, which taught me a lot about neural nets and machine learning.

What's next for cAR

I believe car dealerships could use this app, as it would let visitors learn about cars easily and efficiently without doing prior research. I also think that if people knew more about the environmental effects of cars with high carbon emissions, they would purchase a car with the environment and climate in mind.
