Our teammate Joon was hungrily scrolling through Google Images of food and thought to himself: if only he could visualize food before ordering it online. Thus the idea was born.

What it does

Bite-by-Byte is a service that uses the iPhone's depth-sensing camera to 3D-scan a food item and upload it to the cloud. Business owners can upload realistic 3D models of their food, in the exact size, shape, and color, into the app. On the user side, customers can download a model and load it into an AR environment to visualize the dish, decide whether they want to buy it, or compare the model with the real thing.
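The viewing step on the user side can be sketched in Swift with ARKit and SceneKit. This is a minimal illustration, not the app's actual code: it assumes the model has already been downloaded as a `.usdz` file, and the `FoodPreviewController` class and `burger.usdz` file name are hypothetical placeholders.

```swift
import ARKit
import SceneKit
import UIKit

// Hypothetical sketch: load a downloaded .usdz food model and
// place it at the origin of an ARKit scene.
final class FoodPreviewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // "burger.usdz" stands in for a model fetched from the cloud.
        let modelURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("burger.usdz")
        if let scene = try? SCNScene(url: modelURL, options: nil) {
            // Scans are captured at real-world scale, so the model
            // can be added without rescaling.
            sceneView.scene.rootNode.addChildNode(scene.rootNode)
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking lets the user walk around the dish.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)
    }
}
```

Because the scan preserves real-world dimensions, the AR preview can sit on the user's actual table at the size the dish would arrive.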

How we built it

The six of us split into three teams: one focused on Android development, one on iOS development, and one on the business side. We used libraries such as ARKit and ARCore, along with the API and templates from StandardCyborg, for the app's 3D-scanner function. We also delegated a teammate to work on our cloud backend (Firebase) to upload and download our food models.
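The upload/download flow through the cloud backend can be sketched with the Firebase Storage SDK for Swift. This is an illustrative sketch under assumed names, not the project's actual code: the storage path `models/burger.usdz` and both function names are placeholders.

```swift
import FirebaseStorage
import Foundation

// Hypothetical sketch of the cloud side: a business owner uploads a
// scanned model to Firebase Storage, and a customer's device
// downloads it again before AR preview.

func uploadModel(localFile: URL, completion: @escaping (Error?) -> Void) {
    // Path under the Storage bucket is a placeholder.
    let ref = Storage.storage().reference().child("models/burger.usdz")
    ref.putFile(from: localFile, metadata: nil) { _, error in
        completion(error)
    }
}

func downloadModel(to localFile: URL, completion: @escaping (URL?, Error?) -> Void) {
    let ref = Storage.storage().reference().child("models/burger.usdz")
    // Writes the model to a local file the AR view can load from.
    ref.write(toFile: localFile) { url, error in
        completion(url, error)
    }
}
```

Keeping models in a shared Storage bucket is what lets the business-owner upload on one device appear in a customer's app on another.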

Challenges we ran into

We certainly faced a lot of challenges. The first was our limited knowledge of app development, since we come from electrical and mechatronics engineering backgrounds, but that did not deter us in the slightest. 24 hours ago, we could not even code in Android Studio (Java) or Xcode (Swift), and most of our time was spent working through tutorials to get started and familiarize ourselves with each platform. Coming from an engineering background, our biggest strength, the ability to learn quickly, drove what we accomplished. Going cross-platform and using cloud computing was very ambitious at the start, but in the end we have two working apps (Android and iOS). We also had only one Mac to code the iOS app on, bottlenecking the team to just two people on the main app.

Accomplishments that we're proud of

The app we came up with was no easy feat technically. We used libraries and APIs that are very new, so documentation is very limited. We also hit a lot of miscellaneous technical issues, such as importing files and converting between file formats, which ate up a lot of precious hours. And somehow we survived.

What we learned

How to stay awake for more than 24 hours. We also got more familiar with AR and its libraries, learned how to code in Swift and Java, and learned how to use Xcode and Android Studio. Finally, we improved our business skills by pitching and working out the business potential of the app.

What's next for Bite-by-Byte

We will be waiting for these APIs to mature so that our apps and service can improve in performance and functionality.

Also, certainly another hackathon.

Built With

ARKit, ARCore, Firebase, Java, StandardCyborg, Swift
