Inspiration

Gaming is something that brings people together. It allows us to experience things we may never otherwise have the opportunity to do and to connect with people across the globe. However, not everyone can play using a standard controller. While there are accessible options on the market, they can be very expensive. I wanted to build a low-cost prototype that people could use as inspiration for controllers that fit their individual needs.

What it does

The project lets the user choose how they would like to interact with their computer. Based on that choice, different triggers simulate keyboard presses so that the user can interact with the game.
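The core idea (pick an input mode, then map each trigger in that mode to a simulated key press) can be sketched as a small dispatch table. The mode names and key bindings below are hypothetical examples, not the project's actual configuration:

```python
from typing import Optional

# Hypothetical mode -> trigger -> key bindings; the real project would
# build these from the user's selection in the GUI.
KEY_BINDINGS = {
    "touch": {"pad_1": "w", "pad_2": "a", "pad_3": "s", "pad_4": "d"},
    "voice": {"jump": "space", "left": "a", "right": "d"},
}

def resolve_key(mode: str, trigger: str) -> Optional[str]:
    """Return the key a trigger should simulate in the selected mode."""
    return KEY_BINDINGS.get(mode, {}).get(trigger)
```

A trigger with no binding in the current mode simply resolves to None and can be ignored.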

How we built it

This project brings together several technologies. The hardware controller uses an Arduino Uno and various sensors; note that you can substitute other sensors you have on hand if they suit your needs better. The main Python program communicates with the hardware controller over the serial port. The voice controller uses the PyAudio and SpeechRecognition libraries to recognize spoken commands, and the video controller calls a MATLAB function that performs image classification with GoogLeNet. All of these inputs are turned into simulated keyboard presses with the pynput library. Lastly, the GUI was developed using the Eel library.
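As one illustration of the serial link, the sketch below parses a simple "sensor:value" line protocol and maps active sensors to simulated key presses with pynput. The line format, port name, and key binding are assumptions for illustration; the real firmware and program may use a different protocol:

```python
def parse_sensor_line(line: str):
    """Parse one serial line like 'touch:1' into a (sensor, active) pair.

    Returns None for malformed or partial lines. The 'sensor:value'
    format is an assumed protocol, not necessarily the project's own.
    """
    sensor, _, value = line.strip().partition(":")
    if not sensor or value not in ("0", "1"):
        return None
    return sensor, value == "1"

def run_controller(port_name: str = "COM3"):
    """Read sensor events from the Arduino and simulate key presses.

    Not invoked here; requires the pyserial and pynput packages and real
    hardware. The port name and key binding are assumptions.
    """
    import serial                          # pyserial
    from pynput.keyboard import Controller

    keyboard = Controller()
    keys = {"touch": "w"}                  # hypothetical sensor -> key binding
    with serial.Serial(port_name, 9600, timeout=1) as port:
        while True:
            event = parse_sensor_line(port.readline().decode("ascii", "ignore"))
            if event and event[1] and event[0] in keys:
                keyboard.tap(keys[event[0]])  # simulate the mapped key press
```

Keeping the parsing separate from the hardware loop makes the protocol easy to test without an Arduino attached.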

Challenges we ran into

Unfortunately, I could not use most of the hacking time due to other commitments, which drastically reduced the time I had to work on the project. I also had many problems getting the MATLAB program to communicate correctly with the main Python program. Additionally, there were many instances where images were incorrectly classified, which made the video controller harder to use.

Accomplishments that we're proud of

Despite the limited time to work on the project, I am proud of the result. Many aspects can be improved upon, but all of the basic functionality was implemented.

What we learned

* The Eel library
* The PyAudio library
* The SpeechRecognition library
* The Arduino CapacitiveSensor library
* Python-MATLAB connections

What's next for EveryoneCanPlay

I hope that this project inspires others to make their own controllers. Some ideas include using different sensors, making the system wireless, supporting different devices, and reducing the latency of the speech and camera inputs.
