What it does

The bionic robotic hand responds to EMG signals and speech commands, translating them into hand movements.

How I built it

We first designed the 3D-printed hand, which consists of more than 20 pieces. We then set up the framework for signal and speech recognition, using the boards we soldered on site and a machine learning script written from scratch. Finally, we assembled everything into a bionic robotic arm.
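The speech-recognition step above can be sketched in Python. This is a minimal, illustrative stand-in: the gesture vocabulary and function name are hypothetical, not the project's actual model, which was trained from scratch.

```python
# Hypothetical sketch: map a recognized speech transcript to a hand gesture.
# The gesture vocabulary below is illustrative, not the project's real
# command set, and a real system would use the trained ML model instead
# of keyword matching.

GESTURES = {"open", "close", "point", "grip"}

def parse_command(transcript):
    """Return the first known gesture word found in a transcript, else None."""
    for word in transcript.lower().split():
        if word in GESTURES:
            return word
    return None

print(parse_command("please close the hand"))  # close
print(parse_command("hello there"))            # None
```

In the full pipeline, the returned gesture would then be translated into servo positions.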

Challenges I ran into

We planned to use EMG signals as the input, feeding them through a simple machine learning model whose output would drive the servos that move the hand. However, this didn't work as expected: soldering damaged our chips, and we were unable to get the small circuit working.
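The intended EMG pipeline can be sketched as follows. Since the original model never reached working hardware, this is a simplified stand-in: it replaces the machine learning model with a fixed amplitude threshold, and the threshold and sample values are illustrative.

```python
# Simplified stand-in for the intended EMG pipeline: classify a window of
# EMG samples as "contract" or "relax" by mean rectified amplitude.
# The 0.5 threshold is an illustrative value; the project planned to use
# a learned model here instead.

def classify_emg(window, threshold=0.5):
    """Mean absolute amplitude over a sample window -> contract/relax."""
    mean_abs = sum(abs(s) for s in window) / len(window)
    return "contract" if mean_abs > threshold else "relax"

print(classify_emg([0.9, -0.8, 0.7, -0.95]))    # contract
print(classify_emg([0.05, -0.02, 0.03, 0.01]))  # relax
```

The decision would then select a servo pose, e.g. closing the fingers on "contract".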

Accomplishments that I'm proud of

We were able to program the hand via a Raspberry Pi 3B and make it move in response to our commands.
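Driving hobby servos from a Raspberry Pi typically means generating 50 Hz PWM, where a 1–2 ms pulse corresponds to 0–180 degrees. The helper below, a sketch under those standard assumptions, converts an angle to a duty-cycle percentage; on the Pi the result would be fed to a PWM output (e.g. RPi.GPIO's `ChangeDutyCycle()`, setup not shown).

```python
# Map a servo angle to a PWM duty-cycle percentage.
# Assumes the common hobby-servo convention: 50 Hz PWM (20 ms period),
# 1 ms pulse = 0 degrees, 2 ms pulse = 180 degrees.

def angle_to_duty(angle, freq_hz=50, min_ms=1.0, max_ms=2.0):
    """Convert 0-180 degrees to a duty-cycle percentage."""
    angle = max(0, min(180, angle))               # clamp to servo range
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180
    period_ms = 1000 / freq_hz                    # 20 ms at 50 Hz
    return pulse_ms / period_ms * 100

print(angle_to_duty(0))    # 5.0  (1 ms pulse)
print(angle_to_duty(90))   # 7.5  (1.5 ms pulse)
print(angle_to_duty(180))  # 10.0 (2 ms pulse)
```

Calling `angle_to_duty` once per servo, then sleeping briefly, is enough to step each finger to a commanded pose.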

What I learned

  • How to use a machine learning algorithm to recognize commands
  • How to use Python to receive voice commands and convert them into signals that drive the servos
  • How to 3D-print a hand
  • How to assemble a hand
  • How to deal with pressure and stress

What's next for AI Bionic-Robotic Hand
  • A lower-budget 3D-printed hand
  • Higher-precision movement
  • Recognizing EMG signals from the upper arm
  • A bionic hand that lets the patient 'feel'

Built With
