Inspiration
We wanted to learn gesture control, so we asked a surgeon friend how it could be useful in their work.
What it does
It takes speech or arm/hand gestures as input to pan and zoom an operating room display and to switch between displays.
How we built it
We used an IMU accelerometer and EMG electrodes with an amplifier circuit to detect arm movements and muscle contractions. This data is sent to an ESP32 microcontroller for processing, which relays commands to the user's computer over a Bluetooth connection to control their operating room display. A foot pedal activates the user's microphone, so commands can also be issued through speech recognition.
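A minimal sketch of the gesture pipeline described above, assuming a simple threshold scheme (all names, thresholds, and the command strings here are hypothetical, not the team's actual firmware): the mean rectified EMG amplitude gates whether a gesture is intentional, and the accelerometer tilt picks the display action.

```python
# Hypothetical gesture classifier: EMG contraction gates the gesture,
# IMU tilt selects the command. Thresholds are illustrative only.
EMG_THRESHOLD = 0.6   # assumed normalized muscle-activation level
TILT_THRESHOLD = 20.0  # assumed tilt angle in degrees

def mean_rectified(emg_window):
    """Mean absolute value (MAV), a common EMG activation feature."""
    return sum(abs(sample) for sample in emg_window) / len(emg_window)

def classify(emg_window, tilt_deg):
    """Map one window of EMG samples plus an IMU tilt reading to a
    display command string, or None if the muscle is relaxed."""
    if mean_rectified(emg_window) < EMG_THRESHOLD:
        return None  # no contraction: ignore incidental arm movement
    if tilt_deg > TILT_THRESHOLD:
        return "PAN_RIGHT"
    if tilt_deg < -TILT_THRESHOLD:
        return "PAN_LEFT"
    return "ZOOM_IN"
```

In a real build, the classified command would be sent from the ESP32 to the computer over Bluetooth; gating on EMG activation is what keeps ordinary arm motion from triggering spurious pans.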
Challenges we ran into
We had trouble with inconsistent Bluetooth connections and noisy EMG readings.
Accomplishments that we're proud of
We successfully read output from both the EMG electrodes and the accelerometer.
What's next for The Surgeon Sidekick
- Expand the gesture vocabulary
- Add a real-time feedback overlay
- Integrate with hospital PACS systems
- Improve gesture-detection accuracy
