Inspiration

The grippers used in industry today are not adaptable enough to handle a wide range of objects, especially those with complex shapes, varying sizes, or mixed surface types. Traditional grippers either rely on rigid mechanisms that fail on irregular geometries or require manual reconfiguration between tasks. We wanted to build a gripper that could reason about what it is grabbing and adapt both mechanically and intelligently.

What it does

The Multimodal Compliant Gripper is a self-adaptive robotic grasper that combines mechanical compliance with computer vision and force feedback to reliably grasp a wide variety of objects. An Intel RealSense D435if camera feeds into a vision pipeline using SAM3 (Segment Anything Model 3) for object segmentation, fused with depth data to distinguish between hollow and flat objects — allowing the gripper to choose the right grasping strategy. It switches between a compliant jaw mechanism for irregularly shaped objects and vacuum suction cups for large, smooth surfaces, while Hall effect sensors provide real-time force feedback to prevent crushing or slipping.
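One simple way to fuse a segmentation bitmask with the depth matrix is to look at the spread of depth values inside the mask: a hollow object (cup, bowl) spans a wide range of depths, while a flat surface is nearly constant. The sketch below is a minimal illustration of that idea; the function name and the `hollow_std_mm` threshold are illustrative assumptions, not our exact pipeline values.

```python
import numpy as np

def classify_object(mask: np.ndarray, depth: np.ndarray,
                    hollow_std_mm: float = 15.0) -> str:
    """Classify a segmented object as 'hollow' or 'flat'.

    mask:  boolean bitmask from the segmentation model.
    depth: depth image in millimetres, aligned to the mask.

    A hollow object shows a large spread of depth values inside
    its mask; a flat one does not. `hollow_std_mm` is an
    illustrative threshold, tuned per camera in practice.
    """
    vals = depth[mask]
    vals = vals[vals > 0]          # drop invalid (zero-depth) pixels
    if vals.size == 0:
        return "unknown"
    return "hollow" if float(vals.std()) > hollow_std_mm else "flat"
```

In practice the mask and depth frames must first be aligned to the same viewpoint (the RealSense SDK provides depth-to-color alignment), and dropping zero-depth pixels also discards many of the reflective or transparent regions where the sensor returns no data.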

How we built it

• Designed a compliant mechanism (CM)-based gripper that passively self-adapts to object geometry without complex actuation
• Integrated vacuum suction cups for handling large, flat, or smooth-surfaced objects
• Mounted an Intel RealSense D435if camera directly on the gripper for close-range perception
• Built a vision pipeline using SAM3 to generate segmentation bitmasks, fused with the camera's depth matrix to classify object type (hollow vs. flat)
• Used an Arduino Uno Q as the microcontroller for all electronics control — sensors, actuators, and suction
• Set up a Linux-based Python server as a bridge: it receives commands and relays them to the Arduino, while a client interface on a separate laptop communicates with the server over MQTT
• Implemented force feedback with a Hall effect sensor and a linearly moving magnet: displacement was measured and mapped to force via a linear regression based on F = kx, with a known spring constant
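The displacement-to-force mapping can be sketched in a few lines: calibrate the raw Hall-sensor reading against known magnet displacements, fit a line through the calibration points, then scale displacement by the spring constant per F = kx. All of the numbers below are illustrative placeholders, not our measured calibration values.

```python
import numpy as np

# Calibration table: raw Hall-sensor readings paired with known magnet
# displacements (e.g. measured with calipers). Illustrative values only.
hall_raw = np.array([512.0, 544.0, 576.0, 608.0, 640.0])  # ADC counts
disp_mm  = np.array([0.0, 1.0, 2.0, 3.0, 4.0])            # displacement

# Linear regression: raw reading -> displacement in mm.
slope, intercept = np.polyfit(hall_raw, disp_mm, 1)

SPRING_K = 0.8  # N/mm — assumed spring constant for this sketch

def grip_force(raw: float) -> float:
    """Estimate grip force from a Hall-sensor reading via F = k * x."""
    x = slope * raw + intercept
    return SPRING_K * max(x, 0.0)   # clamp: no negative displacement
```

On the real hardware the same fit runs once at calibration time and only the two fitted coefficients live on the microcontroller, so the per-reading cost is a multiply and an add.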

Challenges we ran into

• Accurately fusing SAM3 segmentation outputs with the RealSense depth matrix, especially for objects with reflective or transparent surfaces
• Designing a compliant mechanism flexible enough for adaptability yet rigid enough for reliable force transfer
• Calibrating the Hall effect sensor and keeping the linear displacement-to-force mapping consistent across different grasp conditions
• Managing communication reliably across the Arduino Q, the Python bridge server, and the MQTT client
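Much of the communication reliability came down to keeping the bridge-to-Arduino protocol trivially simple: JSON commands arriving over MQTT are re-framed as one-line ASCII messages the firmware can parse. The sketch below shows such a translation layer in pure Python; the topic name, JSON fields, and frame format are assumptions for illustration. In the real bridge, the MQTT side would be a client library such as paho-mqtt subscribed to the command topic, and the encoded frame would be written to the Arduino over a serial port.

```python
import json

CMD_TOPIC = "gripper/command"   # assumed MQTT topic name

def encode_for_arduino(mqtt_payload: bytes) -> bytes:
    """Turn an MQTT JSON payload, e.g. b'{"mode": "suction", "state": 1}',
    into a newline-terminated CSV frame for the firmware to parse."""
    cmd = json.loads(mqtt_payload)
    return f'{cmd["mode"]},{int(cmd["state"])}\n'.encode("ascii")

def decode_on_arduino(frame: bytes) -> tuple[str, int]:
    """Mirror of the parsing the firmware performs on each serial line."""
    mode, state = frame.decode("ascii").strip().split(",")
    return mode, int(state)
```

A strict frame format like this makes malformed messages fail loudly at the bridge (a `KeyError` or `ValueError`) instead of silently confusing the firmware.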

Accomplishments that we’re proud of

• Combining passive mechanical compliance with active vision-based object classification in a single gripper
• Getting SAM3-plus-depth fusion to differentiate hollow from flat objects reliably in real time
• Building a working force feedback system from first principles using F = kx, a Hall effect sensor, and linear regression
• Setting up a clean three-layer communication architecture — Arduino Q, Python bridge, and MQTT client — that ties all the subsystems together
• Delivering a hardware and software build that handles diverse objects without manual reconfiguration

What we learned

• How to bridge vision model outputs (bitmasks) to real physical actuation decisions
• How to build a force estimation system from basic physics and a Hall effect sensor
• How to manage multi-layer communication between a microcontroller, a Linux server, and a remote client over MQTT
• The design trade-offs in compliant mechanisms: stiffness vs. adaptability

What’s next for Multimodal Compliant Gripper With Force Feedback

• Expanding object classification beyond hollow/flat to include material type (soft, rigid, fragile) for smarter grasp planning
• Training a lightweight grasp policy model that uses the vision and force feedback loop to autonomously optimize grip strength and position
• Testing on a full robotic arm in an unstructured industrial or warehouse environment
• Exploring tactile sensors alongside Hall effect sensors for richer contact feedback
• Publishing the compliant mechanism design as an open-source hardware reference for the robotics community
