Neural network code, a live video feed, and automated responses to detected hand gestures.
This project uses MATLAB's Computer Vision Toolbox and Neural Network Toolbox to robustly recognize a hand held in a rock, paper, or scissors configuration.
Training data was collected by applying a Gaussian convolution to captured images and then extracting features with MATLAB's edge detection function. To reduce the number of points to analyze, the edge map was passed through Harris-Stephens corner detection from the Computer Vision Toolbox. Each image was then divided into 16 equal sections (a 4x4 grid), and the number of detected points in each section was counted, yielding a 16-element feature vector. These vectors were used to train a neural network with nprtool. The trained network classifies webcam input as rock, paper, or scissors, and the computer plays the corresponding counter move.
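The feature-extraction pipeline above can be sketched outside MATLAB as well. The following is a minimal, hypothetical re-creation in Python using NumPy and SciPy: Gaussian smoothing, a hand-rolled Harris-Stephens corner response standing in for the toolbox functions, and per-cell corner counts over a 4x4 grid. All parameter values (`sigma`, `k`, the relative threshold) are illustrative assumptions, not the project's actual settings.

```python
import numpy as np
from scipy import ndimage

def grid_corner_features(img, grid=4, sigma=1.0, k=0.04, thresh_rel=0.01):
    """Sketch of the pipeline: Gaussian blur -> Harris-Stephens corner
    response -> count strong corners in each cell of a grid x grid split.
    Parameter defaults are illustrative, not the project's real values."""
    img = img.astype(float)
    smoothed = ndimage.gaussian_filter(img, sigma)   # Gaussian convolution
    # Image gradients (stand-in for MATLAB's edge/corner machinery)
    Ix = ndimage.sobel(smoothed, axis=1)
    Iy = ndimage.sobel(smoothed, axis=0)
    # Harris-Stephens response: R = det(M) - k * trace(M)^2
    Ixx = ndimage.gaussian_filter(Ix * Ix, sigma)
    Iyy = ndimage.gaussian_filter(Iy * Iy, sigma)
    Ixy = ndimage.gaussian_filter(Ix * Iy, sigma)
    R = (Ixx * Iyy - Ixy ** 2) - k * (Ixx + Iyy) ** 2
    if R.max() > 0:
        corners = R > thresh_rel * R.max()
    else:
        corners = np.zeros_like(R, dtype=bool)
    # Count corner points in each of the grid*grid sections
    h, w = img.shape
    feats = np.zeros(grid * grid, dtype=int)
    for i in range(grid):
        for j in range(grid):
            cell = corners[i * h // grid:(i + 1) * h // grid,
                           j * w // grid:(j + 1) * w // grid]
            feats[i * grid + j] = cell.sum()
    return feats

# Counter-move lookup: play whatever beats the detected gesture
COUNTER = {"rock": "paper", "paper": "scissors", "scissors": "rock"}
```

The 16-element vector returned by `grid_corner_features` corresponds to the per-section point counts fed to the neural network; a trained classifier's label would then index into `COUNTER` to pick the winning move.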