Inspiration

We wanted to simplify the maker journey. Too often, people have parts lying around but no clear idea of what to build. We were inspired by the idea of “vibe coding” for hardware: turning a pile of random components into exciting, tangible projects.

What it does

HackStack takes a photo of your electronic components, identifies them, and instantly suggests project ideas you can build with what you already have. It removes the guesswork and sparks creativity.
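The suggestion step described above can be sketched as a lookup from recognized components to buildable projects. This is an illustrative sketch only: the project library, component names, and `suggest_projects` function are hypothetical stand-ins, not HackStack's actual data or API.

```python
# Hypothetical sketch: match recognized components against a small
# project library and return every idea the user can build right now.
PROJECT_LIBRARY = {
    frozenset({"led", "resistor"}): "Blinking LED night light",
    frozenset({"ultrasonic sensor", "buzzer"}): "Parking distance alarm",
    frozenset({"servo", "photoresistor"}): "Light-tracking solar mount",
}

def suggest_projects(components):
    """Return every project whose required parts are all on hand."""
    on_hand = {c.lower() for c in components}
    return [idea for parts, idea in PROJECT_LIBRARY.items()
            if parts <= on_hand]

print(suggest_projects(["LED", "Resistor", "Buzzer"]))
# → ['Blinking LED night light']
```

In practice the component list would come from the vision model rather than user input, and partial matches could be ranked instead of filtered.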

How we built it

We used PyTorch for computer vision, Flask for the backend, Next.js for the frontend, and integrated with Arduino and Wokwi for hardware simulations. scikit-learn helped us with data processing and model evaluation.
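A minimal sketch of how the Flask backend could tie these pieces together: one endpoint accepts a component photo and returns recognized parts as JSON. The route name, field name, and the stubbed `classify_components` function are assumptions for illustration; the real model call is omitted.

```python
# Minimal Flask backend sketch: upload a photo, get component labels back.
# The PyTorch model is stubbed out; endpoint/field names are assumptions.
import io
from flask import Flask, jsonify, request

app = Flask(__name__)

def classify_components(image_bytes):
    # Placeholder for the vision model; would return detected part labels.
    return ["led", "resistor"]

@app.route("/api/scan", methods=["POST"])
def scan():
    photo = request.files["photo"].read()
    components = classify_components(photo)
    return jsonify({"components": components})
```

The Next.js frontend would POST the camera image to this endpoint and render the returned labels alongside project suggestions.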

Challenges we ran into

We faced challenges with limited training data, ensuring accurate component recognition, and designing smooth wire connections in simulations. Balancing speed, accuracy, and usability under hackathon time constraints was tough.

Accomplishments that we're proud of

We created a working prototype that can scan components, recognize them, and suggest real projects. We also integrated circuit simulations and demonstrated end-to-end usability.

What we learned

We learned how to optimize vision models with small datasets, connect multiple frameworks into a single workflow, and design with the maker community in mind. Collaboration and quick problem-solving were key.

What's next for HackStack

We plan to grow the dataset, expand the project library, and integrate auto-generated circuit diagrams and code. Our long-term goal is to make HackStack the go-to platform for makers, educators, and students worldwide.

Built With

arduino, flask, next.js, pytorch, scikit-learn, wokwi
