Inspiration

We're a startup in the customer service space looking for ways to eliminate the often frustrating experience of contacting customer service. An AR-based solution potentially removes the need to contact an agent and puts the power back in the consumer's hands.

What it does

A customer would access this service from the support section of either the company website or app. They would then be asked to point the camera at the IoT-enabled device, which would then perform a health check. If a self-serviceable issue were found, the app would display on-screen instructions and markers on the object to guide the user through resolving the issue.
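The triage logic behind that flow could be sketched as below. This is an illustrative sketch only: the names (`HealthReport`, `triage`, the issue codes, the marker anchors) are hypothetical, not the app's actual API.

```typescript
// Hypothetical triage flow: a health check returns a report, and the app
// decides whether to show AR-guided steps, hand off to an agent, or do nothing.

type IssueCode = "CLOGGED_FILTER" | "EMPTY_RESERVOIR" | "PUMP_FAILURE";

interface HealthReport {
  issue: IssueCode | null; // null means the device reports healthy
  selfServiceable: boolean; // can the user fix this without an agent?
}

interface InstructionStep {
  text: string; // on-screen instruction
  markerAnchor: string; // named point on the scanned object to highlight
}

// Map self-serviceable issues to the AR-guided steps shown to the user.
const playbooks: Record<IssueCode, InstructionStep[]> = {
  CLOGGED_FILTER: [
    { text: "Open the filter compartment", markerAnchor: "filter_door" },
    { text: "Rinse the filter under warm water", markerAnchor: "filter" },
  ],
  EMPTY_RESERVOIR: [
    { text: "Lift the reservoir lid", markerAnchor: "reservoir_lid" },
    { text: "Fill to the MAX line", markerAnchor: "max_line" },
  ],
  PUMP_FAILURE: [], // not user-serviceable
};

// Decide what the app should do after the health check.
function triage(
  report: HealthReport
): InstructionStep[] | "contact_agent" | "healthy" {
  if (report.issue === null) return "healthy";
  if (!report.selfServiceable) return "contact_agent";
  return playbooks[report.issue];
}
```

For a clogged filter, `triage({ issue: "CLOGGED_FILTER", selfServiceable: true })` would return the two-step playbook, while a pump failure would route straight to an agent.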

How we built it

We created our scans of a coffee maker using the ARKit Scanner app, which allowed us to recognize the object inside an AR-enabled app view. The AR components were provided by ViroReact, with the overlays, state management, and navigation built in React Native.

Challenges we ran into

ViroReact is a relatively new library, so the documentation was a bit spotty, specifically around nesting React Native components inside ViroReact components and adding animations to our overlaid instructional images. Working in Xcode was a new experience for some of the team, so there was significant ramp-up time there as well.

Accomplishments that we're proud of

This being everyone on the team's first foray into augmented reality applications, almost every step of the way was satisfying. Seeing the object (a coffee maker) recognized, tracked, and marked with the appropriate overlays was a very cool experience.

What we learned

Some do's and don'ts of ViroReact and Xcode development (when in doubt, clear the cache, then clean and rebuild!), the importance of high-quality scans of any objects you want the app to track and recognize, and the upper limits of caffeine's effect on developer productivity.

What's next for Brewmaster Self Service AR

We have several features in mind that were out of scope given the time constraints of the hackathon. First, one of the best use cases for taking advantage of 5G would be the option to bring an agent on the line if self-service were unsuccessful; the agent would see a live feed of your camera and provide additional support via voice and on-screen interactions with the object. Second, in our current iteration, the user has to manually move to the next step of instructions. With proper tracking, we believe we could recognize when a user has completed each step (assuming there are enough moving parts) and trigger the change automatically. Lastly, assuming an IoT-enabled device, we could perform a health check up front to give the user a diagnostic of the entire device, including the option to order replacement parts or contact an agent directly if self-service is impossible or inadvisable.
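The automatic step advancement idea could look something like the sketch below. All names here (`TrackedState`, `StepDefinition`, the part names and thresholds) are hypothetical; a real implementation would feed this from the AR tracking layer's pose data on each frame.

```typescript
// Observed state of the tracked object's moving parts, e.g. a hinge angle
// in degrees or a presence flag, keyed by part name.
type TrackedState = Record<string, number>;

interface StepDefinition {
  text: string;
  // Predicate over the tracked state: true once the user has completed the step.
  isComplete: (state: TrackedState) => boolean;
}

// Example steps for the coffee-maker playbook (thresholds are illustrative).
const repairSteps: StepDefinition[] = [
  {
    text: "Open the filter compartment",
    isComplete: (s) => (s["filter_door_angle"] ?? 0) > 70,
  },
  {
    text: "Remove the filter",
    isComplete: (s) => (s["filter_present"] ?? 1) === 0,
  },
];

// Called on each tracking update; returns the index of the step the UI
// should currently display, skipping any steps already completed.
function advance(current: number, state: TrackedState): number {
  let i = current;
  while (i < repairSteps.length && repairSteps[i].isComplete(state)) i++;
  return i;
}
```

With this shape, opening the filter door past 70 degrees would move the UI from step 0 to step 1 without the user tapping anything, which is the behavior we'd want in place of the manual "next" button.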
