Why'd we make ChopChop?
The inspiration for our application came from several occasions when we ordered food from an online delivery service and were deeply disappointed with the meal we received. We realized this was an issue many other people our age had faced: they would purchase food from a delivery service assuming the item was a full, satisfying meal, only for the final product to fall short of expectations. We thought an application that used AR to eliminate this communication error would be the best concept for a mobile app.
What it does
ChopChop is an app designed to show you the item you are purchasing before you actually order it. With our app, a consumer can simply tap a button to see an AR model of the item they are considering. A restaurant, for example, could provide an AR model for every item on its menu, helping consumers understand the scale and quality of the food they are getting. The AR feature is our unique aspect, but the foundation of the app works just as a delivery app is expected to: the consumer enters their address, picks a restaurant, and orders their food.
We used NCR Design Systems to map out our application design, Swift to build it, and then embedded EchoAR into the app for AR functionality.
Challenges we ran into
The biggest challenge we ran into was developing the AR viewer for the application. SwiftUI does not have EchoAR support, so we had to figure out a way to implement our main feature while still maintaining our stylistic vision.
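As a rough sketch of this kind of workaround (not our exact code), SwiftUI can wrap UIKit's `WKWebView` and point it at a model's EchoAR web-viewer link; the `modelURL` here is a hypothetical placeholder for the share URL EchoAR generates per model:

```swift
import SwiftUI
import WebKit

// Sketch: SwiftUI has no native EchoAR view, so we bridge UIKit's
// WKWebView into SwiftUI and load the model's EchoAR web-viewer page.
struct EchoARViewer: UIViewRepresentable {
    // Hypothetical placeholder: the web link EchoAR provides per model.
    let modelURL: URL

    func makeUIView(context: Context) -> WKWebView {
        WKWebView()
    }

    func updateUIView(_ webView: WKWebView, context: Context) {
        webView.load(URLRequest(url: modelURL))
    }
}
```

A view like `EchoARViewer(modelURL: url)` can then be composed into any SwiftUI screen, which keeps the surrounding navigation and styling in SwiftUI.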
Accomplishments that we're proud of
We are incredibly proud of how quickly we came up with our idea, and of how we translated that rough concept into Figma and eventually SwiftUI. None of us had any prior experience with EchoAR, so it was a weight off our shoulders when we saw the final product. The all-nighters were definitely worth the effort.
What we learned
We all left this Hackathon with skills we did not have when we started. We learned how to navigate Figma, how to design an app using SwiftUI, and how to embed EchoAR.
What's next for ChopChop
With more time and labor, we envision the AR viewer being embedded directly into the app rather than opening in a web browser. We would also invest in a higher-quality scanner to more accurately capture the scale and condition of each item we scan.
If you want to try ChopChop on your own iOS device, the Xcode project files are in the GitHub repo linked on this post.