Inspiration

Personal assistants are well known on smartphones. However, extra capabilities such as interpreting and interacting with objects in the environment, as well as locomotion, are desirable features that could enhance how machines and humans interact. As mini computers such as the Raspberry Pi become gradually cheaper, exploring these devices as the main processor for low-cost applications is a good starting point. Since robotic mobile platforms are generally expensive and limited in function, we propose a low-cost, open-source personal assistant robot capable of moving through space, interpreting object shapes and the human hand via camera, and being directly controlled by voice (everything offline) using STT and TTS engines. Our final design is estimated at US$150.

What it does

The personal mobile assistant robot waits for a voice input from the user and acts according to the recognized command. It can perform basic functions such as telling the time and reading the weather and news, as well as object shape and color recognition, finger counting, and voice-controlled navigation.
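The listen-and-act loop above can be sketched as a simple keyword dispatcher. This is a minimal illustration, not the actual Dbot code: the handler names and keywords are assumptions, and on the robot the transcript would come from the offline STT engine rather than a string argument.

```python
from datetime import datetime

def tell_time():
    # Format the current time as a spoken-style sentence for the TTS engine.
    return datetime.now().strftime("It is %H:%M")

def count_fingers():
    # Placeholder: on the robot this would run the camera pipeline.
    return "Counting fingers with the camera."

# Map a keyword in the transcript to its handler (illustrative subset).
COMMANDS = {
    "time": tell_time,
    "fingers": count_fingers,
}

def dispatch(transcript):
    """Route a speech-to-text transcript to the first matching handler."""
    for keyword, handler in COMMANDS.items():
        if keyword in transcript.lower():
            return handler()
    return "Sorry, I did not understand."
```

In the real system the returned string would be passed to the TTS engine for playback.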

How I built it

I combined laser-cut parts and a travel-luggage handle with the electrical components (DC motors and servomotors) and electronics (Raspberry Pi, L298N motor driver). Any smartphone can be hooked up to the robot to serve as a display.
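The L298N driver sets each motor's direction from two GPIO input pins (IN1/IN2 for one channel, IN3/IN4 for the other). The sketch below shows that logic only; the pin numbers are assumed placeholders, the enable pins are assumed tied high, and a small stub stands in for `RPi.GPIO` so the logic can be checked without hardware.

```python
class GpioStub:
    """Records pin levels so the direction logic runs without a Pi."""
    def __init__(self):
        self.pins = {}

    def output(self, pin, level):
        self.pins[pin] = level

# Assumed BCM pin numbers for the two L298N channels (placeholders).
IN1, IN2, IN3, IN4 = 17, 27, 23, 24

def drive(gpio, left, right):
    """Set motor directions; left/right in {-1, 0, 1} = back, stop, forward."""
    gpio.output(IN1, left > 0)
    gpio.output(IN2, left < 0)
    gpio.output(IN3, right > 0)
    gpio.output(IN4, right < 0)

gpio = GpioStub()
drive(gpio, 1, -1)  # spin in place: left wheel forward, right wheel backward
```

On the robot, `RPi.GPIO` (with the pins set up as outputs) would replace the stub, and PWM on the enable pins would add speed control.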

Challenges I ran into

Tuning the voice recognition engine, cutting all the parts, putting everything together, and the usual last-minute bugs.

Accomplishments that I'm proud of

The robot does what I envisioned, which I find extremely satisfying.

What I learned

More Python, OpenCV, and scikit-learn, and a bit about STT and TTS engines.

What's next for Dbot

Improve its capabilities and make the hardware more robust.

Built With

Python, OpenCV, scikit-learn, Raspberry Pi