Inspiration

The inspiration behind this project was to make JIBO do cool things, like interacting with other smart (IoT) devices, while becoming more useful to humans and increasing its direct interaction with them.

Hardware Used:

OpenBCI starter kit (3D-printed sensor mount, circuit board, USB dongle, 4-pack of 1.5V AA batteries), Myo wristband (with its data cable and Bluetooth dongle), JIBO robot, and processing units with 8 GB of RAM (laptops, etc.).

Setting up the web service hosting JIBO:

After installing the crowd-sourced jibo-sdk package, we implemented a RESTful architecture inside JIBO by setting up a web service with Express on Node.js. This enabled JIBO to receive API calls over the internet and perform the relevant tasks based on remote human input. We used Express so that the setup would be compatible with the real robots brought to the hackathon. Once a JIBO could respond to API calls, it was fun to see it dance around in real life.
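As an illustration, here is a minimal sketch of what such an Express endpoint could look like. The route name, port, and `playAnimation()` helper are hypothetical stand-ins, not the actual jibo-sdk API; the real animation calls would be wired in where indicated.

```javascript
// Minimal sketch of the Express web service running on JIBO.
// Route, port, and playAnimation() are illustrative placeholders.
const express = require('express');
const app = express();
app.use(express.json());

// Hypothetical helper that would call into the jibo-sdk to play an animation.
function playAnimation(name) {
  console.log(`Playing JIBO animation: ${name}`);
  // e.g. trigger the corresponding jibo-sdk behavior here
}

// Remote clients (Myo handler, BCI pipeline) POST the action they want JIBO to perform.
app.post('/jibo/action', (req, res) => {
  const { action } = req.body;          // e.g. "dance", "wave", "speak"
  if (!action) {
    return res.status(400).json({ error: 'missing action' });
  }
  playAnimation(action);
  res.json({ status: 'ok', action });
});

app.listen(3000, () => console.log('JIBO web service listening on port 3000'));
```

With something like this running on the robot, both the Myo handler and the BCI pipeline only need to issue HTTP POSTs from the laptop.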

Setting up the Myo wristband:

We set up the Myo wristband following the instructions on the official website and used it with Myo Connect on Windows. Apart from the SDK, Myo Connect has Node.js bindings that let us change the handler functions for gesture events. Since our JIBO web service is also hosted with Node.js, sending REST calls is straightforward. The Myo wristband detects five hand motions: wave-in, wave-out, spread-fingers, make-fist, and double-tap. Each motion is mapped to a specific JIBO animation/text output.
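A rough sketch of that mapping layer is below. The JIBO action names and the robot's address are assumptions carried over from the previous sketch, and the wiring to Myo Connect (e.g. via the community myo.js bindings) is left out and simply assumed to call `onMyoPose()`.

```javascript
// Sketch of the gesture -> JIBO mapping layer. The gesture names follow the
// five poses the Myo reports; the action names and host are illustrative.
const http = require('http');

const gestureToAction = {
  wave_in: 'look_left',
  wave_out: 'look_right',
  fingers_spread: 'dance',
  fist: 'stop',
  double_tap: 'greet',
};

// POST the mapped action to the JIBO Express web service from the previous sketch.
function sendToJibo(action) {
  const body = JSON.stringify({ action });
  const req = http.request({
    host: 'jibo.local',   // hypothetical address of the robot on the hackathon network
    port: 3000,
    path: '/jibo/action',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(body),
    },
  });
  req.end(body);
}

// Called from the Myo event handler whenever a pose is recognized.
function onMyoPose(pose) {
  const action = gestureToAction[pose];
  if (action) sendToJibo(action);
}

// Example: onMyoPose('fingers_spread') would make JIBO dance.
```

Changing which animation a gesture triggers is then a one-line edit in the mapping table.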

Setting up OpenBCI:

We used EEG signals from the human forehead to detect eye blinks and the closing and opening of the eyes. We set up the Processing tool provided by OpenBCI to view our brain waves with their setup. The raw EEG output was very noisy, so we filtered the input data to smooth the curve. We then applied thresholding and wrote peak-detection routines to identify the user's actions. A bit of signal analysis let us distinguish three different types of actions: closing the eyes, opening the eyes, and blinking (quickly). This signal processing was done by modifying the available OpenBCI codebase. Once the actions were distinguishable with a reasonable degree of accuracy, we sent the appropriate calls over REST to control JIBO.
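To give a flavor of the approach, here is a simplified sketch of moving-average smoothing plus threshold-based detection on a single channel. It is written in Node.js for consistency with the rest of the project, not taken from the modified OpenBCI code, and the window size and thresholds are placeholder values that would need tuning against real signals.

```javascript
// Simplified sketch of the smoothing + thresholding idea on one EEG channel.
// Window sizes and thresholds are placeholders, not tuned values.

// Moving-average filter to smooth the noisy raw samples.
function smooth(samples, windowSize = 10) {
  const out = [];
  for (let i = 0; i < samples.length; i++) {
    const start = Math.max(0, i - windowSize + 1);
    const window = samples.slice(start, i + 1);
    out.push(window.reduce((a, b) => a + b, 0) / window.length);
  }
  return out;
}

// Very rough classifier: a short spike above the threshold counts as a blink,
// a sustained rise as the eyes closing, and the later drop as the eyes opening.
function classify(smoothed, threshold = 50, blinkMaxSamples = 40) {
  const events = [];
  let aboveSince = -1;      // sample index where the signal crossed the threshold
  let eyesClosed = false;

  for (let i = 0; i < smoothed.length; i++) {
    const above = smoothed[i] > threshold;

    if (above && aboveSince === -1) {
      aboveSince = i;                                  // rising edge: blink or eyes closing
    } else if (above && !eyesClosed && i - aboveSince > blinkMaxSamples) {
      eyesClosed = true;                               // stayed high long enough: eyes closed
      events.push({ i, type: 'eyes_closed' });
    } else if (!above && aboveSince !== -1) {
      if (eyesClosed) {
        events.push({ i, type: 'eyes_opened' });       // falling edge after a long high period
      } else {
        events.push({ i, type: 'blink' });             // short spike: quick blink
      }
      aboveSince = -1;
      eyesClosed = false;
    }
  }
  return events;
}
```

Each detected event then triggers the corresponding REST call to JIBO, just like the Myo gestures.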

Challenges we ran into

Identifying peaks in the different channels of the BCI interface and then mapping them to a particular user action. Building all of JIBO's code in ECMAScript 6, and then finding that the actual robots brought to the hackathon were older versions that did not support hapi servers; we had to shift the whole thing to Express.

Accomplishments that we're proud of

Making very different types of technologies talk to each other, and thereby delving into the field of IoT. Exposing JIBO to a plethora of other smart devices. With the combination of the BCI and the Myo wristband, a human can remotely dictate more complex tasks to JIBO.

What we learned

That simple concepts can be used to integrate very different technologies and build something substantial. In the long run, coupling brain waves with JIBO could let the robot speak for a speech-challenged person.

What's next for JIBO-HCI

The three products belong to three different companies, so apart from giving the demo and presentations, hopefully something else cool is coming very soon.

Built With

node.js, express, jibo-sdk, myo, openbci
