Cooking can be challenging because it draws on multiple skills at once: prepping ingredients, managing time and temperature, and remembering sequences of steps. It is especially tricky to follow a recipe you have never cooked before. An assistant that helps with those tasks would relieve the cook's mental burden and make the process less stressful.
What it does
This is an augmented reality app that guides a cook through a new recipe. It breaks the recipe down into its essential sequence of steps and provides guidance with simplified, animated visual imagery overlaid on a real-life view of the cooking environment. A helpful narrator offers subtle cooking hints. The app also assists with tasks that are challenging for humans: judging timing, with floating timers and alarms, and, with the addition of wireless thermometers, monitoring temperature.
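To make the idea concrete, here is a minimal sketch of how a recipe might be modeled as a sequence of steps, where each step optionally carries a duration (driving a floating timer) or a target temperature (checked against a wireless thermometer). The `Step` class, field names, and sample recipe are all hypothetical illustrations, not the app's actual data model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    """One guided step of a recipe, as the app might model it."""
    instruction: str                        # short text the narrator reads aloud
    duration_s: Optional[int] = None        # drives a floating count-down timer
    target_temp_c: Optional[float] = None   # checked against a wireless thermometer

# A recipe reduced to its essential sequence of steps.
recipe = [
    Step("Dice the onion"),
    Step("Saute the onion until translucent", duration_s=300),
    Step("Sear the chicken", duration_s=240, target_temp_c=74.0),
]

def needs_timer(step: Step) -> bool:
    """True if the AR view should overlay a count-down timer for this step."""
    return step.duration_s is not None

print([needs_timer(s) for s in recipe])  # → [False, True, True]
```

Keeping each step small and self-describing is what lets the app overlay exactly the right visual aid at the right moment.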
How we built it
We built it using both Mac and Windows machines, plus pencil and paper. The technologies we used included Unity, the Microsoft HoloLens and its SDKs, Maya, Adobe Photoshop, and GitHub.
Challenges we ran into
There were both application-domain challenges and technology challenges.
Figuring out how to represent a recipe visually in a way that was both simple and expressive.
Reducing a wordy recipe to its essential components.
Reducing the scope of the project to something achievable in a short time with limited resources.
Balancing what would make a reasonable present-time project, with futuristic aspirations.
Working in a mixed Mac and Windows environment: some aspects of the system, such as the HoloLens functions, could only be implemented on Windows, but several team members had Macs.
Installing compatible versions of software on different platforms.
Learning to use the tools and technologies.
Coordinating a small team with various skills on an intensive, short term project.
What's next for Fast Foodie
There is a lot of potential for future enhancement, starting with parts of the original vision that were eliminated due to limitations on time and resources, and extending to future features that were inspired by working with the concept.
- We would like to add more animation to the visual graphics, and anchor them to the physical objects they are associated with.
- We would like to add real-time indicators such as count-down timers and temperature read-outs (using wireless sensors), giving the cook abilities, like precise timing and temperature sensing, that humans are normally not good at.
- We would like to add the flexibility to customize recipes, by omitting or adding optional ingredients, or by scaling the recipe to the number of servings.
- We would like to add a recipe-selection process in which the computer recommends recipes and the user browses the suggestions and chooses one.
- We would like to automate the process of turning a recipe into an AR experience, by using natural-language analysis to translate the recipe into known concepts, then mapping those concepts onto visual modules representing ingredients and actions, which can be automatically assembled into an AR experience.
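The last idea, mapping recipe text onto known concepts and then onto visual modules, can be sketched with simple keyword matching. This is only an illustration of the pipeline shape under assumed module names (`ChopAnimation`, `OnionModel`, etc.); a real implementation would use proper natural-language analysis rather than a lookup table.

```python
import re

# Hypothetical vocabulary mapping recipe verbs and nouns to visual AR modules.
ACTION_MODULES = {"dice": "ChopAnimation", "saute": "PanAnimation", "boil": "PotAnimation"}
INGREDIENT_MODULES = {"onion": "OnionModel", "chicken": "ChickenModel", "water": "WaterModel"}

def parse_step(sentence: str):
    """Map one recipe sentence onto (action module, ingredient modules)."""
    words = re.findall(r"[a-z]+", sentence.lower())
    action = next((ACTION_MODULES[w] for w in words if w in ACTION_MODULES), None)
    ingredients = [INGREDIENT_MODULES[w] for w in words if w in INGREDIENT_MODULES]
    return action, ingredients

print(parse_step("Dice the onion and saute in butter."))
# → ('ChopAnimation', ['OnionModel'])
```

Each (action, ingredients) pair would then be handed to the AR layer, which assembles the corresponding animation and 3D models into a guided step.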