Inspiration
In industrial fabrication and space exploration, certain environments are too hazardous for human presence but still demand the nuanced dexterity of a human hand. Drawing on our industry experience, we wanted to apply our skills to improve the "Dirty, Dark, Dull, and Dangerous" conditions many workers face. We are also deeply interested in robotics, controls, and ML, and wanted a project that exercised all three. Finally, we wanted to replace the traditional rigid "leader-follower" arm setup with a wearable, intuitive "sensor sleeve" that makes controlling a robot feel natural.
What it does
Ghost Arm is a high-fidelity teleoperation system consisting of the following:
• The Sensor Sleeve: A wearable array of three MPU IMUs that captures full arm kinematics and translates them into control signals.
• The Rail System: A linear roller base that provides the robotic assembly (LeRobot Arm) with lateral mobility along a rail, expanding its work envelope.
• The Haptic Feedback Loop: An integrated haptic motor in the sleeve that provides tactile cues to the operator, signaling physical constraints or contact in the robot's environment.
• Intelligence: A machine learning model that processes the follower arm's data to perform high-level tasks without manual human control.
How we built it
• Sensor Fusion & Wearables: Our goal was a wearable that could be tracked without a camera and give the operator more intuitive control. We built a sensor sleeve using multiple ESP32s communicating over Wi-Fi to stream data from three MPU accelerometer/gyroscope modules, and used the Madgwick AHRS algorithm to transform the raw readings into stable quaternions.
• Mechanical Mobility: A custom-built rail system, modeled after industrial gantry systems, lets the LeRobot Arm translate along a linear axis. We modeled it in Creo Parametric and fabricated one prototype on a 3D printer.
• Embedded Haptics: We implemented a haptic feedback system to move beyond visual-only control, allowing for "blind" operation in low-visibility environments.
• Networking: Data transmission was handled via UDP JSON packets to keep latency under 20 ms, which is critical for preventing operator disorientation during teleoperation.
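The orientation step above can be sketched as a single-IMU Madgwick update: integrate the gyroscope rates into the quaternion, then apply a gradient-descent correction toward the accelerometer's gravity vector. This is a minimal Python illustration of the published algorithm, not our ESP32 firmware; the `beta` gain and variable names are illustrative.

```python
import math

def madgwick_update(q, gyro, accel, beta=0.1, dt=0.01):
    """One Madgwick AHRS step (gyro + accel only, no magnetometer).

    q     : (w, x, y, z) unit quaternion, sensor frame relative to earth frame
    gyro  : angular rates in rad/s
    accel : accelerometer reading (any scale; normalized internally)
    beta  : filter gain trading gyro trust against accel correction
    dt    : timestep in seconds (100 Hz -> 0.01)
    """
    q0, q1, q2, q3 = q
    gx, gy, gz = gyro

    # Rate of change of quaternion from gyroscope: qdot = 0.5 * q (x) (0, w)
    qd0 = 0.5 * (-q1 * gx - q2 * gy - q3 * gz)
    qd1 = 0.5 * ( q0 * gx + q2 * gz - q3 * gy)
    qd2 = 0.5 * ( q0 * gy - q1 * gz + q3 * gx)
    qd3 = 0.5 * ( q0 * gz + q1 * gy - q2 * gx)

    ax, ay, az = accel
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    if norm > 0.0:
        ax, ay, az = ax / norm, ay / norm, az / norm
        # Objective: difference between predicted gravity direction and measurement
        f1 = 2.0 * (q1 * q3 - q0 * q2) - ax
        f2 = 2.0 * (q0 * q1 + q2 * q3) - ay
        f3 = 2.0 * (0.5 - q1 * q1 - q2 * q2) - az
        # Gradient = J^T f (Jacobian of the objective w.r.t. the quaternion)
        s0 = -2.0 * q2 * f1 + 2.0 * q1 * f2
        s1 =  2.0 * q3 * f1 + 2.0 * q0 * f2 - 4.0 * q1 * f3
        s2 = -2.0 * q0 * f1 + 2.0 * q3 * f2 - 4.0 * q2 * f3
        s3 =  2.0 * q1 * f1 + 2.0 * q2 * f2
        snorm = math.sqrt(s0 * s0 + s1 * s1 + s2 * s2 + s3 * s3)
        if snorm > 0.0:
            # Step against the normalized gradient, scaled by beta
            qd0 -= beta * s0 / snorm
            qd1 -= beta * s1 / snorm
            qd2 -= beta * s2 / snorm
            qd3 -= beta * s3 / snorm

    # Integrate and renormalize
    q0 += qd0 * dt; q1 += qd1 * dt; q2 += qd2 * dt; q3 += qd3 * dt
    n = math.sqrt(q0 * q0 + q1 * q1 + q2 * q2 + q3 * q3)
    return (q0 / n, q1 / n, q2 / n, q3 / n)
```

With a level, stationary IMU (zero gyro, gravity on +z) the quaternion holds steady at identity; the accelerometer term corrects the roll/pitch drift that pure gyro integration accumulates.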
Challenges we ran into
• Sensor Fusion & Wearables: One of the biggest challenges was that the team couldn't obtain a pin multiplexer (pin mux), the component that multiplexes serial connections so a single microcontroller can address multiple identical sensors on one bus. After hours of troubleshooting and attempted workarounds, the only viable solution was to use multiple independent microcontrollers. We also ran into software bugs with the new Arduino Uno Q.
• Mechanical Mobility: The single prototype revealed roller balance and clearance problems. Because high demand and long print times limited us to one print over the hackathon, we couldn't refine the design based on those results and finish a polished rail system.
• Embedded Haptics: We initially had a haptic motor, but due to power-draw limitations it couldn't be driven in the final product.
Accomplishments that we're proud of
The team is very proud of everything we accomplished in under two days. From meeting each other for the first time to getting a working product three hours before the deadline, we overcame a lot of challenges we didn't expect.
Sensor Fusion & Wearables: We built a working three-IMU kinematic chain from scratch despite losing the pin mux — pivoting to multiple independent microcontrollers mid-hackathon. The Madgwick filter produces stable quaternions at 100 Hz on a first-generation Arduino Uno Q, and we demonstrated live wearable teleoperation driving the LeRobot arm without a leader-follower rig. As far as we know, this is one of the first end-to-end wearable teleop demos on the Uno Q platform.
Mechanical Mobility: We went from CAD in Creo Parametric to a physical roller assembly in a single print cycle — learning what every hackathon hardware team eventually learns, that print queues are the real constraint. The mounting geometry for the LeRobot arm on the rail chassis was successful, and the iteration we couldn't finish is now a clear design file waiting for revision.
Embedded Haptics: We designed, wired, and tested the haptic feedback loop end-to-end before hitting the power-draw wall. The architecture is sound and the code path is live — next iteration will use a dedicated haptic driver IC and a separate power rail.
Networking: We achieved sub-20ms UDP JSON telemetry between the wearable and the control station, fast enough that teleoperation feels responsive to the human operator rather than laggy and disorienting.
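The telemetry path can be sketched as a UDP JSON sender/receiver pair. This is a simplified illustration of the approach, not our actual packet schema: the field names (`t`, `imu`) are made up for the example, and the latency figure assumes sender and receiver share a clock (exact on loopback, approximate across machines without time sync).

```python
import json
import socket
import time

def send_pose(sock, addr, quats):
    """Send one timestamped pose packet (a list of per-IMU quaternions) as JSON over UDP."""
    packet = {"t": time.time(), "imu": quats}
    sock.sendto(json.dumps(packet).encode("utf-8"), addr)

def recv_pose(sock):
    """Receive one packet; return (payload, age in milliseconds on this clock)."""
    data, _ = sock.recvfrom(2048)
    packet = json.loads(data.decode("utf-8"))
    return packet, (time.time() - packet["t"]) * 1000.0

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))  # let the OS pick a free port
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Three identity quaternions standing in for the three sleeve IMUs
    send_pose(tx, rx.getsockname(), [[1.0, 0.0, 0.0, 0.0]] * 3)
    packet, latency_ms = recv_pose(rx)
    print(f"{len(packet['imu'])} IMUs, latency {latency_ms:.2f} ms")
```

UDP trades delivery guarantees for latency, which suits teleoperation: a lost pose packet is immediately superseded by the next one, whereas TCP retransmission would stall the stream.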
Systems Integration: We unified the whole system — wearable, arm, cameras, and telemetry — behind Viam mission control with custom sensor modules exposing both operator input and robot state. Judges and teammates can watch the entire pipeline live on one dashboard.
ML Pipeline: We stood up the full ACT policy training workflow on AMD Instinct MI300X cloud infrastructure, from dataset upload through checkpoint export, proving the path from human demonstration to autonomous replay works end-to-end on AMD silicon.