I wanted to build my first hardware hack, and interacting with touch technologies seemed like the best way to do it. I also wanted to work with computer vision.

What it does

Draw symbols, and the computer speaks their shape names back to you.

How we built it

We wrote Python scripts to generate baseline values for the LCD gray-scale gradient. From the values relative to that averaged baseline, we determined the x, y coordinates of the touched areas: touched areas registered as high values (below 255 RGB), and from there we recorded midpoints to pick the points we wanted selected. Finally, we post-processed the image with OpenCV to analyze curvature and edges, which let us flatten the matrices and identify basic shapes.
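The baseline-and-midpoint step above can be sketched roughly like this. This is a hypothetical illustration, not the actual project code: the frame sizes, threshold, and helper names are assumptions, and a real touchpad would stream sensor frames rather than toy arrays.

```python
import numpy as np

def record_baseline(frames):
    """Average several idle frames into per-pixel baseline values."""
    return np.mean(np.stack(frames), axis=0)

def touch_midpoint(frame, baseline, threshold=40):
    """Return the (x, y) midpoint of pixels that rise above the baseline."""
    delta = frame.astype(float) - baseline
    ys, xs = np.nonzero(delta > threshold)  # pixels flagged as touched
    if xs.size == 0:
        return None
    # Midpoint of the touched area, as described above
    return (int(xs.mean()), int(ys.mean()))

# Toy 8x8 sensor: three idle frames, then one frame with a touch at (5, 2)
baseline = record_baseline([np.zeros((8, 8)) for _ in range(3)])
frame = np.zeros((8, 8))
frame[2, 5] = 200                           # simulated touch
print(touch_midpoint(frame, baseline))      # -> (5, 2)
```

In practice you would collect one midpoint per frame while the user draws, accumulating the stroke as a list of points before handing it to the shape-analysis stage.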

Challenges we ran into

Getting the drivers installed for the Synaptic touch board.

Accomplishments that we're proud of

That it works and speaks to you.

What we learned

We learned how to geometrically define shape boundaries using OpenCV, and how to scale images so they represent the larger tablet surface.

What's next for ShapeLanguage

Identify any type of symbol
