Inspiration
One of our teammates' moms, after a brutal Achilles tear and weeks in bed, re-tore her Achilles in a fall from a mobility scooter, a device similar to our walker. Luckily, she was at home when the re-injury occurred and could get help from family. Not everyone is so lucky, and this incident got us thinking about accessibility tools such as scooters, wheelchairs, and walkers, and how they could be improved to assist when no one else is around.
When someone who is physically impaired falls, they are often unable to get back to their mobility aid, or severely restricted in doing so. Surely, it would be helpful if their walker came back for them?
What it does
This walker is equipped with a camera, a large motor for power, and two smaller motors to pull the brakes. Using a computer vision algorithm that we trained using YOLOv8, the walker detects when someone has fallen and raised their hand, requesting assistance. The walker then motors toward the fallen user to help them.
How we built it
This project combined complex hardware, mechanical design, and software components:
- On the hardware side, a Raspberry Pi handles our hand-detection algorithm and communicates with an ESC controlling the motors.
- The main motor spins a wheel for power, while two smaller motors apply pressure as brakes, enabling the walker to stop and turn in the direction of its user.
- On the mechanical design side, key mounts, brakes, and housing for the servo motors were all custom-designed and 3D printed over the weekend!
- For software, we trained a YOLOv8 computer vision model on hand-annotated pictures of raised hands. The model was hosted on a web server, the Raspberry Pi communicated with the ESC via UART, and the Pi drove the servos via GPIO.
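A rough sketch of how the Pi-side loop could turn a detection result into a motor command. The JSON fields, thresholds, and command strings here are our own illustration, not the exact code we ran:

```python
# Hypothetical Pi-side decision logic: the inference server replies with
# JSON like {"raised_hand": true, "x_center": 0.31}, where x_center is
# the hand's normalized horizontal position in the frame (0.0 = far
# left, 1.0 = far right). All field names and thresholds are assumptions.
import json

def steer_command(raised_hand: bool, x_center: float) -> str:
    """Map a detection to a command: brake the side we turn toward."""
    if not raised_hand:
        return "STOP"          # no request for help: stay put
    if x_center < 0.4:
        return "BRAKE_LEFT"    # user is to the left: drag the left brake
    if x_center > 0.6:
        return "BRAKE_RIGHT"   # user is to the right: drag the right brake
    return "FORWARD"           # roughly centered: drive straight

def parse_detection(payload: str) -> str:
    """Turn the server's JSON reply into a motor command string."""
    data = json.loads(payload)
    return steer_command(data.get("raised_hand", False),
                         data.get("x_center", 0.5))
```

In the real walker, "FORWARD" would become a throttle packet sent to the ESC over UART, and the brake commands would move the corresponding servo via GPIO.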
Challenges we ran into
After training our model, we found that it wasn't running as we wanted on the Raspberry Pi due to power limitations. We had to quickly pivot and upload the model to a web server, developing our own API for the Raspberry Pi to upload images and receive the necessary information.
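The upload-and-reply API described above can be sketched end to end with Python's standard library. The model call is stubbed out with a canned reply, and the route, port, and JSON fields are illustrative, not the exact API we wrote:

```python
# Self-contained sketch of a frame-upload API: the Pi POSTs a JPEG and
# the server replies with a JSON detection. A real server would run the
# YOLOv8 model on the frame; here it returns a canned result.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class DetectHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the uploaded frame (a real server would run inference here).
        length = int(self.headers.get("Content-Length", 0))
        frame = self.rfile.read(length)
        reply = json.dumps({"raised_hand": len(frame) > 0,
                            "x_center": 0.5}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):  # keep the demo quiet
        pass

def start_server(port: int = 8765) -> HTTPServer:
    """Start the stub detection server on a background thread."""
    server = HTTPServer(("127.0.0.1", port), DetectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def upload_frame(jpeg_bytes: bytes, port: int = 8765) -> dict:
    """What the Pi does each cycle: POST a frame, parse the JSON reply."""
    req = Request(f"http://127.0.0.1:{port}/detect", data=jpeg_bytes,
                  headers={"Content-Type": "image/jpeg"})
    with urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())
```

Offloading inference this way trades latency for accuracy: the Pi only needs enough power to capture frames and make HTTP requests.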
UART was really finicky to work with: after writing code that worked literally hours before, our team was disappointed to find that it no longer worked and had to be fixed.
None of our team had worked with a Raspberry Pi before, making setup and configuration quite challenging.
We didn’t get all the crucial hardware we requested, but our team adapted and altered the project accordingly.
Accomplishments that we're proud of
Training the model in such a short time frame was a huge win, especially with the help of Roboflow.
Writing the UART communication code was difficult but extremely satisfying once completed.
Our team maintained a resourceful and positive attitude, overcoming every obstacle.
What we learned
We learned a lot of new technical skills: new Python libraries, Linux commands, CAD designs, 3D printing settings, and hardware techniques. Working with a Raspberry Pi was an eye-opening experience, especially with UART communication.
Additionally, we gained valuable teamwork lessons, including how much sleep deprivation can affect problem-solving, but also the extent to which persistence can lead to solutions!
What's next for Moonwalk?
Who knows? We may continue working on this after the weekend. It’s too early to say on this very sleep-deprived Sunday morning...