Inspiration

We started with an interest in robots that behave as more than just functional boxes: robots that experience and exhibit empathy. Drawn to Jibo's mission of "helping the user with a robot that has personality," we leveraged Jibo's hardware and SDK to push the boundaries of human-robot communication through natural language.

What it does

Webo detects movement and facial expressions using the Microsoft Emotion API. For each detected face, we take the emotion label with the highest confidence score and match it to happy, sad, or neutral. After this matching, Webo replies in a way that fits the tone of natural human conversation.
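
As a rough illustration of the matching step, here is a minimal TypeScript sketch. The `scores` shape follows the Emotion API's documented response; the grouping of its eight emotions into happy, sad, and neutral is our own illustrative mapping, not necessarily the exact one we shipped.

```typescript
// Minimal sketch of the mood-matching step. The `scores` shape follows the
// Emotion API's documented response; the grouping of its eight emotions into
// three moods below is illustrative, not necessarily our exact mapping.
interface EmotionScores {
  anger: number; contempt: number; disgust: number; fear: number;
  happiness: number; neutral: number; sadness: number; surprise: number;
}

type Mood = "happy" | "sad" | "neutral";

function matchMood(scores: EmotionScores): Mood {
  // Pick the emotion with the highest confidence score.
  const top = (Object.keys(scores) as (keyof EmotionScores)[])
    .reduce((a, b) => (scores[a] >= scores[b] ? a : b));
  switch (top) {
    case "happiness":
    case "surprise":
      return "happy";
    case "sadness":
    case "anger":
    case "fear":
    case "disgust":
    case "contempt":
      return "sad";
    default:
      return "neutral";
  }
}

// Example: a strongly happy face maps to the "happy" mood.
// matchMood({ anger: 0, contempt: 0, disgust: 0, fear: 0,
//             happiness: 0.92, neutral: 0.08, sadness: 0, surprise: 0 });
```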

How we built it

We worked with the Jibo SDK to expand its set of demo packages by integrating Microsoft Cognitive Services' Emotion API. By running Jibo's camera output through the API, we added sentiment analysis on top of face detection, making the robot more socially aware in home conversational settings.
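
Below is a simplified sketch of how a camera frame can be posted to the Emotion API from Node.js. The endpoint URL and header names follow Microsoft's public documentation for the API at the time; the `recognizeEmotions` function and the key handling are illustrative, not our exact implementation.

```typescript
// Simplified sketch of posting a camera frame to the Emotion API from Node.
// Endpoint and headers follow Microsoft's public docs for the API at the
// time; the function name and key handling here are illustrative.
import * as https from "https";

const EMOTION_ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize";

function recognizeEmotions(frame: Buffer, apiKey: string): Promise<unknown> {
  return new Promise((resolve, reject) => {
    const req = https.request(
      EMOTION_ENDPOINT,
      {
        method: "POST",
        headers: {
          "Content-Type": "application/octet-stream",
          "Ocp-Apim-Subscription-Key": apiKey,
          "Content-Length": frame.length,
        },
      },
      res => {
        let body = "";
        res.on("data", chunk => (body += chunk));
        // The API returns a JSON array of faces, each with emotion scores.
        res.on("end", () => resolve(JSON.parse(body)));
      }
    );
    req.on("error", reject);
    req.end(frame); // raw image bytes captured from Jibo's camera
  });
}
```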

Challenges we ran into

The Jibo SDK is still in beta, so we ran into various unresolved bugs and methods without sample code, and working through small problems cost us a lot of time. Even more challenging, we were unable to test our product on an actual hardware unit. We also had limited access to Jibo engineers who could have pointed us in the right implementation direction.

Accomplishments that we're proud of

We successfully integrated Microsoft's Emotion API and used the data it returns to improve Jibo's sense of empathy. By putting human experience first, we took a step toward redesigning the subservient nature of current human-computer communication.

What we learned

Technically, we learned to build within the Jibo SDK using its documentation and API references. At a more fundamental level, we came to understand that moving human-computer interaction forward primarily requires better design of the human experience.

What's next for Webo

We plan to develop more modes of personal robot language that empathize with human experience. By engaging robots and humans in more consistent, regular conversation, we aim to define a new set of capabilities for a computer co-inhabited world.

The original GitHub repository is private because it contains our Microsoft API key: https://github.com/chrisMYchen/weboInMotion. [UPDATE] A new link is posted.

Built With

Jibo SDK, Microsoft Cognitive Services Emotion API
