Originally we wanted to create a physical, cross-platform volume mixer that could control the levels of individual processes without leaving a full-screen application. The project turned out to be far more challenging than expected, but we still achieved the basics.

What it does

When the Python script is executed, it reads touch input from the screen, recognizes gestures, and translates them into commands based on the user's input.
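As a rough sketch of that gesture-to-command dispatch, here is how recognized gestures might be mapped to per-process volume commands. The gesture names and the simple volume model are illustrative assumptions, not the project's actual API:

```python
# Hypothetical dispatcher: maps a recognized gesture name to a volume
# command for one process. Gesture names and the dict-based volume model
# are assumptions for illustration, not the project's real interface.

def dispatch(gesture, volumes, process, step=5):
    """Apply a gesture to a process's volume level (0-100)."""
    level = volumes.get(process, 50)
    if gesture == "swipe_up":
        level = min(100, level + step)
    elif gesture == "swipe_down":
        level = max(0, level - step)
    elif gesture == "double_tap":
        level = 0  # treat a double tap as mute
    volumes[process] = level
    return level

# Example usage:
volumes = {"music_player": 50}
dispatch("swipe_up", volumes, "music_player")    # -> 55
dispatch("double_tap", volumes, "music_player")  # -> 0 (muted)
```

Keeping the step size and default level as parameters rather than constants matches the project's stated goal of avoiding hardcoded values.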

How I built it

With a lot of help from Synaptics. The device is built around a large, 15" touchscreen they manufacture, which came with a single example program and very little documentation.

Challenges I ran into

Making this device was a challenge. Just getting started, it was hard to configure and install the drivers correctly simply to interact with the board, and beyond that very little documentation was available. Essentially, we were handed the board and sent on our way. However, Synaptics provided a lot of mentoring and help whenever I needed it.

Accomplishments that I'm proud of

While it is fairly basic and not especially efficient, I'm proud of the algorithm I wrote for detecting the motion of the finger, and of the design choices I made throughout the code. Very few values are hardcoded, in an effort to create something that could actually be reused later.
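A minimal sketch of this kind of finger-motion detection, assuming the screen reports a sequence of (x, y) touch samples (the sampling format and threshold value are assumptions, not details from the project):

```python
# Illustrative swipe classifier over successive touch samples.
# The distance threshold is a parameter rather than a hardcoded constant,
# mirroring the design choice described above.

def classify_motion(points, min_dist=30):
    """Classify a list of (x, y) touch samples as a swipe or a tap.

    Returns 'up', 'down', 'left', 'right', or 'tap' when the total
    movement stays below min_dist pixels.
    """
    if len(points) < 2:
        return "tap"
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_dist:
        return "tap"
    if abs(dy) >= abs(dx):
        # Screen y-coordinates typically grow downward.
        return "up" if dy < 0 else "down"
    return "right" if dx > 0 else "left"

# Example usage: a mostly-vertical drag toward the top of the screen.
print(classify_motion([(100, 300), (102, 180)]))  # -> up
```

Comparing only the endpoints keeps the classifier simple; a more robust version could average intermediate samples to reject jitter.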

What's next for Touchy-Feely Screen

More touching and feeling
