Inspiration
We wanted a fun way to mix palm reading with modern AI. Palm features like the life line, head line, and sun line tell stories about personality and fate - but not everyone knows how to read them. So we built a web app that spots and reads these features for you and returns its interpretation of your palm!
How We Built It
We built our project using Python, leveraging a stack of computer vision and deep learning libraries to create a pipeline that moves from raw image data to semantic interpretation.
Hand Tracking & Isolation: We utilized MediaPipe to detect the hand landmarks in real-time. This allowed us to isolate the Region of Interest by extracting the bounding box coordinates based on the palm's key points.
Feature Extraction: Once the hand was isolated, we used OpenCV to preprocess the image. We applied a Gaussian blur to reduce noise and adaptive thresholding to make the major palm lines stand out against the skin texture.
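A sketch of the preprocessing step; the kernel size and threshold parameters shown here are representative values, not necessarily the ones we tuned:

import cv2

def extract_palm_lines(palm_bgr):
    """Blur to suppress skin texture, then adaptive-threshold to pull out line-like creases."""
    gray = cv2.cvtColor(palm_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Inverted adaptive threshold makes the dark palm lines appear white on black.
    lines = cv2.adaptiveThreshold(
        blurred, 255,
        cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
        cv2.THRESH_BINARY_INV,
        11, 2,
    )
    return lines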
Classification Model: For the interpretation logic, we explored Convolutional Neural Networks (CNNs) using PyTorch and Keras. The model analyzes the extracted features to classify distinct palm attributes.
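A minimal sketch of the kind of CNN we experimented with, shown in PyTorch; the layer sizes, input resolution, and number of classes are placeholders rather than our final architecture:

import torch
import torch.nn as nn

class PalmLineClassifier(nn.Module):
    def __init__(self, num_classes=4):  # e.g. categories of line depth/length (placeholder)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        # x: (batch, 1, 224, 224) thresholded palm-line image
        return self.classifier(self.features(x))

# Example: run a single preprocessed 224x224 palm image through the model.
model = PalmLineClassifier()
dummy = torch.randn(1, 1, 224, 224)
logits = model(dummy)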