Inspiration
With the start of quarantine, I had a lot more free time, which I wanted to spend playing Madden 21, a football game on the Xbox. Part of the game is leveling up and building a really strong team to play against others; however, leveling up takes a very long time, so I wanted a way to gain experience points without having to grind it out. Since I'm not supposed to mess with the game's software, I created Solo Slayer, a robot that can play as me when I'm not playing.
What it does
Solo Slayer is a robot that presses buttons on an Xbox controller in a specific sequence, mimicking how a real human interacts with a controller. It has two separate arms, one for the down button on the D-Pad and one for the A button. By rotating these arms at the right time intervals, Solo Slayer can complete a solo challenge repeatedly, which rewards XP.
How we built it
Solo Slayer was built with the Mindstorms EV3 kit and programmed with the EV3 software. I tested it multiple times to find the ideal waiting times so it could complete the challenge repeatedly without human interference.
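The core idea is a fixed loop of timed presses. Here is a minimal sketch of that loop in Python; the arm names, step order, and every timing value below are hypothetical placeholders, not the tuned values from the actual EV3 program.

```python
import time

# Each step: (arm, seconds to hold the button, seconds to wait afterwards).
# These numbers are illustrative guesses, not the real tuned timings.
PRESS_SEQUENCE = [
    ("dpad_down", 0.2, 1.0),   # scroll down to the solo challenge
    ("a_button",  0.2, 5.0),   # select it, then wait for it to load
    ("a_button",  0.2, 30.0),  # confirm, then wait for the challenge to finish
]

def cycle_time(sequence):
    """Total seconds for one full pass through the sequence."""
    return sum(hold + wait for _, hold, wait in sequence)

def run_cycle(sequence, press_arm):
    """Run one cycle, calling press_arm(arm, hold_seconds) for each step.

    On the real robot, press_arm would rotate the EV3 motor driving that
    arm onto the button and back.
    """
    for arm, hold, wait in sequence:
        press_arm(arm, hold)
        time.sleep(wait)
```

Keeping the sequence as data makes it easy to re-tune the waits after each test run without touching the loop itself.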
Challenges we ran into
A challenge I ran into at first was that the controller would constantly shift and move around, making the button presses inconsistent. This was remedied by placing the controller in a box, which was then held in place by a jig to keep it from moving. Another challenge was the arms pressing the button inconsistently; I fixed this by reinforcing the arms and building a backstop to keep them from overshooting the button.
Accomplishments that we're proud of
It works! Solo Slayer has a 95% success rate, so it can run while I'm doing something else.
What we learned
I learned the importance of testing your code repeatedly to double-check for any errors that could occur.
What's next for SoloSlayer
I want to add a color sensor to detect what's on the screen instead of relying on waiting a fixed amount of time. I also want to add a touch sensor so I can start the robot with one touch instead of digging through a program list. A loftier goal is to rebuild the robot with an Arduino or Raspberry Pi so that computer vision can detect what's on the screen and drive the robot's moves.
Built With
- ev3
- mindstorms