Inspiration
In a year of social interactions driven by online experiences, we were craving some sort of in-person game. This gave us the idea to build a robot so that we could play chess 'in person' with each other while miles apart.
What it does
Our robot supports several game modes:
- Human vs. AI: a human player moves pieces on a physical chess board. The robot uses a camera to detect the location of every piece on the board, an AI engine to choose its reply (see the sketch below), and a claw mechanism to make that move.
- Human vs. Twitch chat: using the twitch_plays_hackru library, a human player moves pieces on the physical board and the robot photographs the board to calculate the legal moves. Twitch chat then votes on which move to make, and the robot's claw mechanism carries out the winning play.
- Human vs. online player: through a web interface we developed, which shows a livestream of the robot's camera feed alongside an interactive chess board that updates as the robot scans the physical board. An online player drags and drops pieces on the interface to make a move, which the robot then carries out.
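As a rough illustration of the Human vs. AI mode, here is a minimal sketch of how a reply could be chosen with Stockfish. It assumes the python-chess library and a local Stockfish binary, neither of which is necessarily how our program is wired up; the FEN string stands in for the board state produced by the scanning step.

```python
import chess
import chess.engine

# Board state produced by the camera/board-scanning step (placeholder position shown here).
detected_fen = "rnbqkbnr/pppppppp/8/8/4P3/8/PPPP1PPP/RNBQKBNR b KQkq - 0 1"
board = chess.Board(detected_fen)

# Ask Stockfish for its move, limited to half a second of thinking time.
with chess.engine.SimpleEngine.popen_uci("stockfish") as engine:
    result = engine.play(board, chess.engine.Limit(time=0.5))

print(result.move)  # e.g. e7e5 -- the move the claw would then carry out
```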
How we built it
We built our robot on top of a Da Vinci 2.0 Duo 3D printer. A custom-designed, 3D-printed claw mounts to the extruder assembly and is driven by a servo connected to a Raspberry Pi 4. The claw also holds a Pi Camera V2, which is wired directly into the Raspberry Pi. On the software side, a laptop is connected to the printer over a USB serial connection and has a remote connection to the Raspberry Pi. With this, our Python programs can position the claw and close its gripper (see the sketch below). The laptop also performs image analysis on the captured photos to identify the location of every piece on the board, hosts the web interface (pushing updates to the virtual chess board and accepting moves from online players), and can run the twitch_plays_hackru service to gather votes from Twitch chat.
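To give a sense of how the claw could be driven, here is a minimal sketch assuming the printer accepts standard G-code over the USB serial link (via pyserial) and that the gripper servo sits on a GPIO pin of the Pi (driven with gpiozero). The port name, pin number, and angles are placeholders rather than our actual configuration, and in practice the two halves run on different machines (laptop and Pi); they are combined here for brevity.

```python
import serial
from gpiozero import AngularServo

# Laptop side: jog the print head (and the claw mounted on it) with G-code.
printer = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)  # port is a placeholder

def move_claw(x_mm, y_mm, feedrate=3000):
    """Move the claw over a board square by jogging the extruder carriage."""
    printer.write(f"G1 X{x_mm:.1f} Y{y_mm:.1f} F{feedrate}\n".encode())

# Pi side: open and close the gripper with a hobby servo.
gripper = AngularServo(18, min_angle=0, max_angle=90)  # pin and angles are placeholders

def close_gripper():
    gripper.angle = 60  # gripping angle would be tuned to the piece diameter

def open_gripper():
    gripper.angle = 0
```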
Challenges we ran into
We ran into challenges in every aspect of this project; here are a few notable ones. Designing and 3D printing a complicated claw mechanism in 24 hours is difficult, but we managed. Our biggest issue here was that our first claw was off by 2 millimeters in one dimension, so another 3 hours had to be spent printing a second. The image analysis needed to locate the chess pieces on the board was extremely challenging. We experimented with a wide-angle lens and compensating for its fisheye distortion, but ended up sticking with the standard lens and taking four pictures, one of each quadrant of the board, to get an accurate reading. Calibrating the image recognition software to identify the six colors used to distinguish the six types of chess pieces was a delicate process. Finally, we ran into many issues setting up our web interface, from formatting the live video feed properly to building a decent drag-and-drop interface that could pass moves back to our Python program.
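To illustrate the color-calibration challenge, here is a simplified sketch of classifying a single square crop with OpenCV by counting the pixels that fall inside per-piece HSV ranges. The ranges, piece-to-color mapping, and pixel threshold shown are placeholders; the real values had to be tuned by hand for the lighting at the venue.

```python
import cv2
import numpy as np

# Placeholder HSV ranges for the six marker colours (one per piece type);
# the real ranges had to be calibrated for the actual lighting conditions.
COLOR_RANGES = {
    "pawn":   ((35, 80, 80),  (85, 255, 255)),
    "rook":   ((100, 80, 80), (130, 255, 255)),
    "knight": ((140, 80, 80), (170, 255, 255)),
    "bishop": ((20, 80, 80),  (35, 255, 255)),
    "queen":  ((0, 80, 80),   (10, 255, 255)),
    "king":   ((170, 80, 80), (180, 255, 255)),
}

def classify_square(square_bgr):
    """Return the piece type whose colour dominates this square crop, or None if empty."""
    hsv = cv2.cvtColor(square_bgr, cv2.COLOR_BGR2HSV)
    best_piece, best_count = None, 0
    for piece, (lo, hi) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best_piece, best_count = piece, count
    return best_piece if best_count > 50 else None  # pixel threshold is a placeholder
```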
Accomplishments that we're proud of
We're incredibly proud of the capabilities of our robot at this stage. We weren't sure how far we would get and how many of our goals we could accomplish, but we're glad to say that we surpassed our goals and that the robot is performing quite well. This project involved several first-time accomplishments for us, and we learned many new skills in the process. Most of all, we feel accomplished that we were able to create a working prototype of a robot with so many independent aspects.
What we learned
We learned a lot about robotics, web design, artificial intelligence, image processing, and simply how to learn new things under time pressure. It was an amazing experience to blend such varied fields into one project.
What's next for light-blue
We have a few stretch goals that we'd like to accomplish in the near future. First, we'd like to let players on the online interface play against the AI, or even against other online players, with the robot making both players' moves. Next, we'd like to better tune the image analysis and color recognition used to locate pieces on the board, since it can easily make mistakes. We'd also like to improve the overall experience of the web interface and add functionality like starting and configuring games, rather than only making moves. Finally, we'd like to test twitch_plays_hackru with a proper stream and a live audience voting on moves, and to let Twitch chat play against the AI rather than only a human opponent. Thank you for learning about team light-blue, and be sure to check out our GitHub! https://github.com/Michael73MGD/light-blue
Built With
- bootstrap
- html
- javascript
- opencv
- python
- stockfish