Surgeons rely on precision, but what if that precision could reach across distance? We wanted to explore how gesture-controlled robotics could make remote procedures safer and more intuitive. With the rise of telemedicine and robotic assistance, we built TeleFlex, a glove-driven robotic arm prototype that brings natural hand control to remote manipulation.

TeleFlex uses five flex sensors embedded in a glove to capture finger movement and translate it into servo motion. Each finger's bend maps to an axis of control, letting the user intuitively guide the arm's pitch and yaw. The result feels like an extension of your own hand: move your fingers, and the robotic arm mirrors your gestures in real time.
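The core of that mapping is a linear translation from a calibrated bend range to a servo angle. A minimal sketch in plain C++ (the `RAW_STRAIGHT`/`RAW_BENT` bounds are illustrative assumptions, not the project's actual calibration values, which come from per-user tuning):

```cpp
#include <algorithm>

// Hypothetical calibration bounds for one flex sensor, in raw counts
// from a 10-bit Arduino ADC (0-1023). Real values are set per user.
constexpr int RAW_STRAIGHT = 250;  // finger fully extended
constexpr int RAW_BENT     = 750;  // finger fully bent

// Linearly map a raw flex-sensor reading to a servo angle in [0, 180],
// clamping readings that fall outside the calibrated range.
int bendToAngle(int raw) {
    raw = std::clamp(raw, RAW_STRAIGHT, RAW_BENT);
    return (raw - RAW_STRAIGHT) * 180 / (RAW_BENT - RAW_STRAIGHT);
}
```

With one such mapping per axis, the index finger can drive pitch and the middle finger yaw, each servo simply following its finger's bend.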

While our demo focuses on two degrees of freedom, the framework is modular and ready to scale to more servos, grippers, or tools. The concept could one day assist in remote surgery, hazardous environment operation, or training simulations.

We wired five flex sensors to the Arduino's analog inputs, using resistor voltage dividers to convert each sensor's changing resistance into a readable voltage and to reduce noise. Each reading is filtered, normalized, and mapped to a servo angle through custom logic that adds dead zones, smoothing, and incremental position changes for stable, precise motion.

The robotic arm itself is a simple dual-servo rig designed for demonstration, but the control architecture supports modular expansion. Real-time serial output helps with calibration and data visualization.
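Since Arduino's `Serial.print` isn't available off-device, the telemetry formatting can be illustrated in plain C++. A hypothetical per-sensor CSV line (field layout assumed for illustration) is easy to stream to the Serial Plotter or capture with a host-side logging script:

```cpp
#include <cstdio>
#include <string>

// Format one telemetry line as CSV: sensor index, raw ADC value,
// filtered value, and commanded servo angle. Suitable for the Arduino
// Serial Plotter or a simple host-side logger.
std::string telemetryLine(int sensor, int raw, float filtered, int angle) {
    char buf[64];
    std::snprintf(buf, sizeof(buf), "%d,%d,%.1f,%d",
                  sensor, raw, filtered, angle);
    return std::string(buf);
}
```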

TeleFlex proves that intuitive teleoperation doesn’t need complex or expensive hardware. With basic sensors and smart control logic, we can bring human-like precision to affordable robotics, opening possibilities in remote assistance, education, and low-cost surgical simulation.
