People worry about what is happening inside their house when they are not home. We want to give people the ability not only to see inside their home, but also to control an avatar that completes tasks that were previously impossible unless they were physically there.

What it does

With VR goggles, you can take over Pepper's full body and control its movements, arms, and vision. With this avatar, you can complete tasks at home using Pepper from anywhere in the world. From doing housework, to checking the stove, to even emergency first response, Pepper becomes an extension of you while you are away.

How we built it

Tech we used:

  1. Pepper
  2. Leap Motion
  3. Android
  4. Oculus
  5. Unity
  6. Salt

How it works

Pepper's API allows only a limited amount of control over the robot: you can make and open a fist, rotate the palm/wrist area, and rotate the elbow and shoulder on two axes each. The two biggest problems were the shoulder and the elbow. How were we going to turn three-axis rotation into two-axis rotation? And since the Leap Motion does not track the upper arm, how were we going to realistically rotate a shoulder joint?
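Pepper's joints are driven through NAOqi's `ALMotion` service. A minimal sketch of setting the two shoulder and two elbow axes is below; the joint names are NAOqi's, but the limit values here are approximations (check the official NAOqi docs before driving a real robot):

```python
# Approximate limits (radians) for Pepper's left-arm joints.
# Assumption: these are rough values, not the exact NAOqi limits.
JOINT_LIMITS = {
    "LShoulderPitch": (-2.08, 2.08),
    "LShoulderRoll":  (0.0087, 1.56),
    "LElbowYaw":      (-2.08, 2.08),
    "LElbowRoll":     (-1.56, -0.0087),
}

def clamp(joint, angle):
    """Clamp a requested angle to the joint's safe range."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, angle))

def send_angles(motion, angles, speed=0.2):
    """Send a dict of {joint: angle} to Pepper via an ALMotion proxy."""
    names = list(angles)
    values = [clamp(j, a) for j, a in angles.items()]
    motion.setAngles(names, values, speed)  # non-blocking NAOqi call

# Usage (requires the NAOqi Python SDK and a reachable robot):
# from naoqi import ALProxy
# motion = ALProxy("ALMotion", "<pepper-ip>", 9559)
# send_angles(motion, {"LShoulderPitch": 0.5, "LElbowRoll": -0.7})
```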

Because FabCafe did not bring any Myo armbands (even though they said they would bring one or more), we had to improvise, and the only other devices we had access to with accelerometers were phones. We managed to acquire two Android phones at 1 a.m. on Sunday and wrote a simple Android app that just sends the phone's acceleration data to a simple web server on the computer running Unity (the socket-server computer). Every time a new set of values arrived, it was written to a file on that machine, one file per phone. The Unity code would then read from the files every frame and do some fancy black-magic math that translated the readings into the rotational values we sent to Pepper.
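The write-up doesn't spell out the "black-magic math", but a standard way to turn a phone's acceleration vector into an arm rotation is a tilt estimate from gravity. A sketch under that assumption, including a parser for the one-line-per-reading file format described above (the `ax,ay,az` format is itself an assumption):

```python
import math

def parse_line(line):
    """Parse one 'ax,ay,az' line as written by the web server."""
    ax, ay, az = (float(v) for v in line.strip().split(","))
    return ax, ay, az

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (radians) from the gravity vector.

    Standard accelerometer tilt formulas; only valid when the phone is
    roughly still, so measured acceleration is approximately gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

With the phone strapped to the upper arm, these two angles can stand in for the shoulder pitch/roll that the Leap Motion cannot see.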

The next part was the elbow, and it was a bit of a pain as well. Pepper rotates its elbow in two ways: inward (like the movements in the video) and along the forward direction of the arm. That second rotation is also how the wrist rotates, so we had to disable it for the elbow: whenever we sent Pepper new elbow data, we just sent the same constant rotational value for that axis. For the inward rotation, we simply took the hand.Arm.Direction.x value from the Leap and sent that to Pepper.
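Leap's direction components are normalized to roughly -1..1, while Pepper's elbow-roll joint covers a different range, so some rescaling is needed. A hypothetical linear mapping (the output range endpoints and the exact scaling used in the original project are assumptions):

```python
def elbow_roll_from_direction(direction_x, out_min=-1.5, out_max=-0.01):
    """Map Leap's hand.Arm.Direction.x (roughly -1..1) linearly onto
    an elbow-roll angle range. Endpoints approximate Pepper's limits."""
    x = max(-1.0, min(1.0, direction_x))       # clamp the Leap input
    t = (x + 1.0) / 2.0                        # normalize to 0..1
    return out_min + t * (out_max - out_min)   # linear interpolation
```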

The wrist was pretty simple: we just sent hand.PalmNormal.z to Pepper, and it worked fine. The same went for closing and opening a fist. Each Leap hand has a property that gives a value between 0 and 1 indicating how closed the fist is. However, Pepper would sometimes refuse to make a fist for no apparent reason.
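These last two mappings can be sketched as small conversion functions. The wrist-yaw limit is an approximation of Pepper's range, and the inversion assumes Leap's grab value runs 0 (open) to 1 (fist) while Pepper's hand joint runs 0 (closed) to 1 (open):

```python
def wrist_yaw_from_palm(palm_normal_z, limit=1.82):
    """Scale Leap's hand.PalmNormal.z (-1..1) to a wrist-yaw angle.
    The +/-1.82 rad limit approximates Pepper's WristYaw range."""
    z = max(-1.0, min(1.0, palm_normal_z))
    return z * limit

def hand_openness(grab_strength):
    """Leap's grab value is 0 (open) to 1 (fist); Pepper's hand joint
    takes 0 (closed) to 1 (open), so invert and clamp it."""
    g = max(0.0, min(1.0, grab_strength))
    return 1.0 - g
```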

Challenges we ran into

  1. Pepper

Accomplishments that we're proud of

We got it to work.

What we learned


What's next for Salt and Pepper

As the global VR market is expected to grow at a profound CAGR of 96% by 2019, we want to be at the forefront of finding practical and useful applications for this new technology. In addition, as Pepper's own capabilities increase in movement, strength, and dexterity, we will let our customers complete more tasks with our technology while also giving Pepper more automated tasks to handle on its own.

We plan to end "worrying" in every family in the world when they leave the house. Starting in Japan, we will enter the Asian market, then the European and US markets thereafter.
