Inspiration

https://www.youtube.com/watch?v=VqwL7GnxiI0

What it does

^O^...grub (“O-Grub”) is designed to help feed the physically challenged, using ROS motion planning and 3D face detection. It lets the user choose from a food menu with hands-free gestures: tilting the head left or right moves the food selection left or right, and nodding confirms it. Our robot arm then scoops the selected food and delivers it to the user’s mouth. Using facial expression recognition of the user’s reaction to the food, the system also lets the user upvote or downvote a particular menu item, signaling the user’s taste preferences to the chef.
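A minimal sketch of that gesture-to-menu mapping, assuming the Kinect face tracker already reports head roll and pitch in degrees; the thresholds, class name, and menu items below are illustrative placeholders, not our actual values:

    TILT_THRESHOLD_DEG = 15.0  # head roll beyond this counts as a deliberate tilt
    NOD_THRESHOLD_DEG = 12.0   # downward pitch beyond this counts as a nod

    class MenuController:
        """Tracks the highlighted menu item and reacts to head-pose readings."""

        def __init__(self, items):
            self.items = items
            self.index = 0

        def handle_pose(self, roll_deg, pitch_deg):
            """Map one head-pose sample to a menu action."""
            if pitch_deg > NOD_THRESHOLD_DEG:
                return ("confirm", self.items[self.index])  # nod confirms
            if roll_deg < -TILT_THRESHOLD_DEG:
                self.index = (self.index - 1) % len(self.items)  # tilt left, wraps around
            elif roll_deg > TILT_THRESHOLD_DEG:
                self.index = (self.index + 1) % len(self.items)  # tilt right, wraps around
            return ("highlight", self.items[self.index])

    menu = MenuController(["rice", "soup", "vegetables"])
    print(menu.handle_pose(roll_deg=20.0, pitch_deg=0.0))  # ('highlight', 'soup')
    print(menu.handle_pose(roll_deg=0.0, pitch_deg=15.0))  # ('confirm', 'soup')

A real controller would also want to debounce readings over several frames so that a brief head wobble does not flip the selection.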

Challenges we ran into

  1. Some struggles with ROS motion planning (see the MoveIt sketch after this list)
  2. Finding the right components for our arm
  3. Figuring out how to build the robot arm structure and the “spoon” from raw materials like sheet metal, penetrating lubricant, and wood
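For challenge 1, the sketch below shows the kind of planning call involved, following the standard moveit_commander tutorial pattern; the planning-group name "arm" and the target pose values are placeholders rather than our real configuration:

    import sys
    import rospy
    import moveit_commander
    from geometry_msgs.msg import Pose

    # Standard moveit_commander setup: bring up ROS and attach to a planning group.
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("ograb_planner", anonymous=True)
    arm = moveit_commander.MoveGroupCommander("arm")  # group name is a placeholder

    # Target pose near the user's mouth, e.g. as reported by the Kinect face tracker.
    mouth_pose = Pose()
    mouth_pose.position.x = 0.4
    mouth_pose.position.y = 0.0
    mouth_pose.position.z = 0.5
    mouth_pose.orientation.w = 1.0  # identity orientation

    arm.set_pose_target(mouth_pose)
    arm.go(wait=True)         # plan and execute in one call
    arm.stop()                # guard against residual motion
    arm.clear_pose_targets()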

Accomplishments that we're proud of

  1. Software: used Microsoft Kinect and EmoPy to detect face position and facial expression (see the EmoPy sketch after this list)
  2. Hardware: finished the whole structure with 6 servos and developed a spoon design that scoops food properly
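For the EmoPy side of accomplishment 1, usage follows EmoPy's published FERModel interface; the emotion subset here is taken from EmoPy's documentation, the image path is a placeholder, and treating a happy expression as an upvote is our convention, not EmoPy's:

    from EmoPy.src.fermodel import FERModel

    # FERModel is pretrained on specific emotion subsets; this one comes from EmoPy's docs.
    model = FERModel(target_emotions=["calm", "anger", "happiness"], verbose=True)

    # predict() prints the predicted emotion for a single face image.
    model.predict("frames/user_reaction.png")  # path is a placeholder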

What's next for O-Grub

  1. Develop multi-user support via facial recognition
  2. Physically build it at a larger scale
  3. Change the spoon’s materials so it can be replaced easily
