Inspiration

The hackathon was organized by the robotics team, so we decided to look into real engineering problems in space systems. We started researching the ISS and found that one recurring issue was MMOD impacts: micrometeoroids and orbital debris hitting the structure and causing internal damage that nobody can detect until something actually fails. That became our starting point. Then, talking with sponsors at the hackathon, we realized the exact same problem exists in utility poles, aircraft fuselages, and civil infrastructure: the same invisible internal damage, the same lack of tools to find it before it's too late. So we built one solution that works for all of them.

What it does

Space Pecker is an autonomous robot that taps a surface and analyzes the acoustic response to detect internal structural damage. A solenoid strikes the surface, two sensors simultaneously capture the acoustic and vibrational signals, a camera captures an image of the defect, and an onboard AI model classifies the result. A green LED means the structure is healthy; a red LED means something is wrong.
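The inspection cycle above can be sketched roughly as follows. This is a minimal illustration, not our firmware: the feature set (spectral centroid, energy, decay ratio), the sample rate, and the function names are all assumptions made for the example, and any classifier with a scikit-learn-style `predict` method would slot in as `model`.

```python
import numpy as np

def extract_features(signal, fs=10_000):
    """Toy spectral features from one tap response (illustrative only)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))  # "brightness" of the ring
    energy = float(np.sum(signal ** 2))                            # total tap energy
    half = len(signal) // 2
    decay = float(np.mean(np.abs(signal[half:])) /
                  (np.mean(np.abs(signal[:half])) + 1e-12))        # how fast the ring dies out
    return [centroid, energy, decay]

def classify_tap(signal, model):
    """One inspection cycle: tap response -> features -> model -> LED colour."""
    label = model.predict([extract_features(signal)])[0]
    return "green" if label == 0 else "red"   # 0 = healthy, anything else = damage
```

In the real device the equivalent of `classify_tap` runs after every solenoid strike and drives the LED directly, with no human in the loop.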

How we built it

We live close to each other, so before the hackathon we held meetings during design week to agree on the concept, figure out which components we had available, and do the CAD design. We had to model the enclosure without knowing the exact size of every component, which meant the design kept changing as we went. At the hackathon we split into three parallel workstreams: soldering the circuit, building and training the model, and preparing the business case. We used AMD Developer Cloud to train a Random Forest on material datasheets and on a dataset we built ourselves from a physical aluminum sheet. We also learned platforms we had never touched before, like Viam AI, QTex PartWise, and ESP-DSP, all during the hackathon itself.
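A Random Forest trained on per-tap features might look like the sketch below. The data here is synthetic stand-in data, not our aluminum-sheet dataset, and the feature count and hyperparameters are placeholders; it only shows the shape of the training step.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in dataset: one row of tap features per strike,
# label 0 = healthy, 1 = internal damage (synthetic, for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

A Random Forest is a reasonable fit for this setup: it trains quickly on a small hand-collected dataset and its inference is cheap enough to run against features streamed from a microcontroller.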

Challenges we ran into

Getting clean signals from a solenoid tap is really hard. The raw acoustic readings are extremely noisy, and we spent a lot of time iterating with ESP-DSP IIR filters to clean them up before feeding the classifier. We also had no labeled dataset for our setup, so we created one from scratch, tapping a physical aluminum sheet hundreds of times in different configurations. And designing the CAD without knowing the final size of all components was a constant challenge: the accelerometer needed to be as close as possible to the solenoid but also stay connected to the ESP32-S3, which forced several redesigns.
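The kind of IIR cleanup described above can be prototyped offline before porting the coefficients to biquad sections on the microcontroller. The sketch below uses SciPy rather than ESP-DSP, and the sample rate, filter order, and band edges are placeholder assumptions, not our actual tuning.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 10_000  # assumed sample rate in Hz
# 4th-order Butterworth band-pass around the tap's ring-down frequencies
# (cutoffs are placeholders; ESP-DSP implements equivalent biquad cascades).
sos = butter(4, [200, 3000], btype="bandpass", fs=fs, output="sos")

t = np.arange(0, 0.1, 1 / fs)
tap = np.exp(-40 * t) * np.sin(2 * np.pi * 1000 * t)        # idealised tap ring-down
noisy = tap + 0.3 * np.random.default_rng(1).normal(size=t.size)  # broadband sensor noise
clean = sosfilt(sos, noisy)                                  # out-of-band noise attenuated
```

Designing in second-order sections (`output="sos"`) rather than a single high-order transfer function keeps the filter numerically stable, which matters once the coefficients are quantised for fixed-point DSP on the ESP32.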

Accomplishments that we're proud of

The whole pipeline works end to end with no human intervention, from the solenoid strike to the LED output. We hand-soldered the dual-sensor circuit ourselves during the hackathon. We trained a model on a dataset we built from scratch. And we did all of this as a multidisciplinary team of embedded engineers, an ML developer, and a business person who genuinely learned from each other throughout the process.

What we learned

We learned that signal quality matters more than model complexity. Most of our time went into cleaning and fusing the two physical signals before they reached the classifier, not into the AI itself. We also learned what it means to work across disciplines under a real deadline: the embedded engineers had to understand what the ML pipeline needed, the ML developer had to understand what the hardware could actually run, and we all had to agree on what "good enough" meant with the clock ticking. On top of that, we learned AMD Developer Cloud, Viam AI, QTex PartWise, and ESP-DSP from scratch this weekend.

What's next for Space Pecker

We want to build wheels that can adhere to vertical and curved surfaces so the robot can inspect utility poles and aircraft fuselages on its own. We also want to expand the training dataset to more materials and damage types, and use Viam to coordinate multiple robots inspecting the same structure at once.
