We inspire a healthy lifestyle by making it accessible to everyone. There are many new approaches to diets out there, and people are becoming more educated and understand that macronutrients are a major factor in reaching weight goals. Even though all of this information is available on the web, it can be difficult to find valid nutritional information. We want to simplify that process by incorporating IBM Watson technology to help you achieve the BetterYou that you desire.
What it does
BetterYou is an innovative nutrition app that uses IBM Watson APIs to make nutritional information accessible to a wider user base. BetterYou pulls nutritional information based on your input in multiple ways. The application can read aloud what is on the screen and explain how to interact with it by voice command, which expands accessibility for users with vision challenges. To look up nutritional information, the user can either use the camera to identify a food item or tell the application what they are looking for (e.g., "nutrition facts for strawberries").
How we built it
We built a native iOS app in Swift that classifies images of food. We integrated IBM Watson's Text to Speech, Speech to Text, and image classification services.
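As a rough illustration, the integration can be wired up with the open-source IBM Watson Swift SDK. This is a minimal sketch, not our production code: the module names, API version string, voice ID, and helper functions shown here are assumptions based on the public SDK and may differ from the exact versions we used.

```swift
// Sketch of the Watson integration (hypothetical wiring; API key,
// version string, and voice ID are placeholders, not real values).
import UIKit
import TextToSpeechV1        // IBM Watson Swift SDK modules
import VisualRecognitionV3

let authenticator = WatsonIAMAuthenticator(apiKey: "YOUR_API_KEY")

// Visual Recognition: classify a photo of a food item.
let visualRecognition = VisualRecognition(version: "2018-03-19",
                                          authenticator: authenticator)

func classifyFood(_ image: UIImage) {
    guard let jpeg = image.jpegData(compressionQuality: 0.8) else { return }
    visualRecognition.classify(imagesFile: jpeg,
                               imagesFilename: "food.jpg") { response, error in
        guard let classes = response?.result?.images.first?
                .classifiers.first?.classes else {
            print(error?.localizedDescription ?? "no classification")
            return
        }
        // Use the top class (e.g. "strawberry") to drive the nutrition lookup.
        if let top = classes.first {
            speak("Nutrition facts for \(top.className)")
        }
    }
}

// Text to Speech: read results aloud for users with vision challenges.
let textToSpeech = TextToSpeech(authenticator: authenticator)

func speak(_ text: String) {
    textToSpeech.synthesize(text: text,
                            accept: "audio/wav",
                            voice: "en-US_AllisonV3Voice") { response, _ in
        guard let audio = response?.result else { return }
        // Play `audio` with AVAudioPlayer (omitted for brevity).
        _ = audio
    }
}
```

Speech to Text follows the same pattern in reverse: microphone audio is streamed to the service and the transcript (e.g. "nutrition facts for strawberries") is parsed into a search query.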
Challenges we ran into
Because time was a major constraint, we approached development with a UX-driven Agile process. Our team was able to draw on qualitative user research from a recent Keto product design project. Our IBM Watson visual recognition model also needs more development time: without enough training data, it is not yet as effective as it could be.
Accomplishments that we are proud of
We are very proud that we were able to integrate image recognition, and that usability tests showed the working product functions in the app are easy for users to follow.
What we learned
We learned how to trust each other in development. Each team member was empowered to accomplish a task with an understanding of our overall plan, and the result was an accessibility-focused MVP.
What's next for BetterYou
For our next steps, we would like to integrate an AR feature that educates users about their food options. For example, if the user were viewing an apple, alternative varieties of apples would appear on the screen. We would also like to optimize diet recommendations, notify users when an item conflicts with their specific dietary restrictions, and conduct further usability tests to observe how users interact with our product.