Inspiration
Our inspiration to tackle the challenge of building machines in space stems from the limitations of current space exploration methods. While rockets have propelled us to the stars, their payloads are finite and costly, constraining the expansion of our presence in space. Relying solely on human labor for assembly tasks is not sustainable; it is inherently risky and logistically complex. Our motivation is to use fully autonomous technology to transcend these limitations. By developing a system that combines precision with reliability, we aim to minimize human intervention in space assembly, paving the way for more ambitious and cost-effective missions beyond Earth's orbit.
What it does
Our solution uses AprilTags to locate both our robot and the crucial parts with pinpoint accuracy. A compact camera streams real-time frames to our computer vision module, and a pathfinding algorithm uses that position data to navigate the robot and guide components precisely into place.
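The exact pathfinding algorithm isn't spelled out above, but its role can be sketched with a standard A* search over a grid of floor cells, where the robot's cell and the target cell come from tag-based localization. The function name, grid representation, and obstacle set here are illustrative assumptions, not our actual implementation.

```python
from heapq import heappush, heappop

def plan_path(start, goal, obstacles, width, height):
    """A* search on a grid; returns a list of (x, y) cells from start to goal.
    `obstacles` is a set of blocked cells (hypothetical representation)."""
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, current = heappop(frontier)
        if current == goal:
            # Walk the parent links back to the start, then reverse.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        x, y = current
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue
            if nxt in obstacles:
                continue
            new_cost = cost[current] + 1
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = current
                heappush(frontier, (new_cost + h(nxt), nxt))
    return None  # no route exists
```

In a setup like ours, `start` would be the robot's tag-estimated cell, `goal` the crate's target cell, and the path would be converted into motor commands step by step.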
How we built it
The robot used for the demonstration is a modified version of the ELEGOO Smart Robot Car Kit V4.0, a basic robotics platform built around the Arduino Uno R3, which was provided by TechSpark. The Uno R3 firmware is written in C++. The computer vision and AprilTag position estimation algorithms are written in Python, with calibration and tuning done in OpenCV.
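With the vision stack on a laptop and the motors on the Uno, the two halves have to talk over a serial link. The message format below is a hypothetical illustration of that host-to-firmware boundary (our actual wire protocol isn't documented here): a one-line ASCII command carrying left/right wheel speeds, clamped to the Uno's 8-bit PWM range.

```python
def drive_command(left, right):
    """Encode a differential-drive command as a one-line ASCII message.
    The 'M <left> <right>' format is an illustrative convention; speeds
    are clamped to the signed 8-bit PWM range the Uno firmware expects."""
    def clamp(v):
        return max(-255, min(255, int(v)))
    return f"M {clamp(left)} {clamp(right)}\n"
```

On the host side this string would be written to the USB serial port (e.g. with pyserial), and the C++ firmware would parse it and set the motor driver's PWM duty cycles.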
Challenges we ran into
We faced several difficult challenges along the way. Reverse-engineering the motor controller board schematic required meticulous attention to detail. In the realm of computer vision, crafting overlays and visualizations using OpenCV pushed us to create original solutions to seemingly simple problems. Moreover, navigating the occasional inaccuracies in position estimations within OpenCV posed a formidable puzzle, compelling us to refine our algorithms and fine-tune our approach.
Accomplishments that we're proud of
We employed statistical filtering techniques to discern outliers and rectify misaligned objects with remarkable precision. Our full pose estimation using AprilTags consistently achieved measurements within one inch of the true position, which required solving the perspective-n-point (PnP) problem. Finally, our adaptive path generation algorithm proficiently realigned misplaced crates in various configurations.
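One common way to do the kind of outlier rejection described above is a median-absolute-deviation (MAD) test over repeated pose samples: keep readings close to the median and drop the rest. This is a minimal sketch of that idea, not our exact filter; the function name and the cutoff `k` are assumptions.

```python
import statistics

def reject_outliers(samples, k=3.0):
    """Keep samples within k scaled MADs of the median; drop the rest.
    Robust to occasional wild pose estimates, unlike a mean/stddev test."""
    med = statistics.median(samples)
    mad = statistics.median(abs(s - med) for s in samples)
    if mad == 0:
        # More than half the samples agree exactly; keep only those.
        return [s for s in samples if s == med]
    scale = 1.4826 * mad  # scales MAD to approximate a standard deviation
    return [s for s in samples if abs(s - med) <= k * scale]
```

Applied to, say, a list of estimated x-coordinates from successive frames, this discards a single wildly wrong PnP solution while keeping the tight cluster of good readings.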
What we learned
We delved into electronics, using essential tools like multimeters and oscilloscopes to find and fix faults. We embraced the intricacies of integrated circuitry, implementing control logic based on data-sheet analysis. On the computer vision side, we developed a keen eye for debugging, particularly in the pursuit of high-precision pose estimation across the various layers of our technology stack.
What's next for Crosshair
The future applications of our product are boundless. By integrating a Raspberry Pi, we can make our robot completely autonomous, freeing it from the need for a continuous computer connection and enhancing its adaptability (and removing the need for a really long cable). Moreover, the technology behind our system can be extended to various robot types, including those equipped with mechanical arms, so our idea can expand to an enormous range of fields from manufacturing to healthcare. As we look toward the cosmos, our innovation also addresses the unique challenges of zero-gravity environments, enabling more complex robots to manipulate and transport objects with precision. And finally, with our sights set on Mars, our technology becomes a cornerstone in the construction of the Red Planet's future infrastructure, facilitating mankind's next giant leap in space exploration and colonization.