Inspiration

According to healthdata.org, around 57.7 million people worldwide are living with limb amputation. For many of them, even the most basic interactions with a computer are a struggle. We felt compelled to build something that makes computer interaction more personalized and accessible for these users.

What it does

Our project uses image processing with OpenCV to track the motion of an AprilTag in 3D space through a webcam or external camera. That real-world movement is then translated into precise mouse-pointer coordinates on the screen. We demonstrate the technology with a Space Game that we created in Python using the pygame module.
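As a rough illustration of the mapping step (the function, parameter names, and constants below are our own sketch, not the project's actual code), a detected tag center in the camera frame can be normalized, mirrored so the cursor follows the hand, scaled to the screen resolution, and smoothed to damp detection jitter:

```python
def tag_to_screen(tag_x, tag_y, frame_w, frame_h, screen_w, screen_h,
                  prev=None, alpha=0.35):
    """Map the tag's pixel position in the camera frame to screen coordinates.

    `alpha` blends the new position with the previous one (an exponential
    moving average) so small detection noise does not make the pointer shake.
    """
    # Normalize to [0, 1]; flip x so moving the tag right moves the cursor right.
    nx = 1.0 - tag_x / frame_w
    ny = tag_y / frame_h
    x, y = nx * screen_w, ny * screen_h
    if prev is not None:
        px, py = prev
        x = alpha * x + (1 - alpha) * px
        y = alpha * y + (1 - alpha) * py
    return x, y
```

For example, a tag at the center of a 640x480 frame lands at the center of a 1920x1080 screen: `tag_to_screen(320, 240, 640, 480, 1920, 1080)` returns `(960.0, 540.0)`.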

How we built it

The first step is to calibrate the user's camera with a 7x7 chessboard pattern. This removes lens distortion from the images and produces an accurate camera matrix for that particular camera. The values from the camera matrix are then passed into an AprilTag detection algorithm that tracks the tag's motion in real time. This 3D movement is mapped to mouse-pointer coordinates according to the screen's resolution. The demonstration is delivered through a fun Space Game that we built in Python using the pygame module; through the game we showcase key features such as a clicking mechanism that needs no physical buttons. The entire codebase is written in Python, and all dependencies have been pushed to GitHub.

Challenges we ran into

The journey toward completing our project had a few obstacles. The first was deciding what to build for the hackathon. Our initial plan was to sketch two projects, then use a pros-and-cons list to narrow them down to one. After some solid research we settled on the more creative, efficient, and impactful of the two, which ensured a smooth start to the project.

We then hit a complication: we wanted to avoid making the user's device capture many separate pictures for calibration. After team discussions and research into better approaches, we implemented real-time camera calibration, in which the camera tracks the checkerboard's orientation as it moves and calibrates on the fly.

We also implemented a feature that simulates a click in the game when the cursor is held over an object for a certain period of time. This was a very big challenge for the team, as there were very limited resources on simulating a click without physical buttons.

Finally, the game has a moving background to emphasize the space theme, and it was challenging to keep the same image repeating without visible seams. While these challenges cost the team a good amount of time, they pushed us to research more efficient solutions and widened our thinking, resulting in a much better overall product than the initial model we had in mind.
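The dwell-click idea can be sketched as a small state machine: if the cursor stays near where it settled for long enough, a click fires exactly once. The class name and thresholds below are illustrative, not the project's actual code:

```python
class DwellClicker:
    """Fire a 'click' when the cursor stays within `radius` pixels of the
    point where it settled for at least `hold_s` seconds."""

    def __init__(self, hold_s=1.5, radius=25):
        self.hold_s = hold_s
        self.radius = radius
        self._anchor = None   # (x, y) where the current dwell started
        self._since = None    # timestamp of that moment

    def update(self, x, y, now):
        """Call once per frame with the cursor position and current time.
        Returns True exactly once per completed dwell."""
        if self._anchor is None:
            self._anchor, self._since = (x, y), now
            return False
        ax, ay = self._anchor
        if (x - ax) ** 2 + (y - ay) ** 2 > self.radius ** 2:
            # Cursor moved away: restart the dwell timer at the new spot.
            self._anchor, self._since = (x, y), now
            return False
        if now - self._since >= self.hold_s:
            self._since = float("inf")  # block repeat clicks until the cursor moves
            return True
        return False
```

In the game loop, `update` would be fed the mapped pointer position every frame, and a `True` return is treated as a mouse click on whatever the cursor is over.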

Accomplishments that we're proud of

With challenges came solutions, and with solutions came the remarkable feeling of achievement. The team is very proud of the product that emerged from a rigorous planning process. The simplest yet most effective accomplishment was staying on track: each team member carried out the tasks assigned during the planning sessions, so there was no distraction or deviation from the primary goal of completing the project within the time constraints. One of the very first breakthroughs was figuring out how to implement real-time camera calibration, which serves the cause we had in mind: a product that lets everyone enjoy the same experience of interacting with a computer. We are very proud of how we turned our ideas into a product and brought that product into a real-world application, and of the efficient solutions we found for every obstacle that interrupted our path. We are determined to continue working on this technology and make it more efficient.

What we learned

We acquired immense knowledge about computer vision, image processing, and game development. Empathetic thinking and team-building were essential life skills we picked up as well. All in all, we had some of the most fun, tedious, enthralling, and challenging 36 hours here at SpartaHack.

What's next for Pointer Tracking Using Image Processing

Perfecting a technology that completely changes the way human-computer interaction takes place requires more than 36 hours. We want to refine this technology to drastically improve tracking accuracy and bring it to day-to-day use on common operating systems. Instead of targeting specific programs, we want it to simulate keystrokes and mouse-pointer movement so that it can be used with any software out there.
