In ECE lab last semester, a classmate blew a fuse while conducting an experiment and, unfortunately, lost a $250 device. He needed a 5.2 MΩ resistor but grabbed a 52 Ω one that had been carelessly tossed into the 5.2 MΩ drawer. Because he is colorblind, he didn't notice the difference in the tiny color bands on the resistor. We want to make this process far easier.

What it does

ResistMe lets you use your phone's camera to discover the value of a resistor in real time. It uses computer vision to analyze the color bands on the resistor and then calculates an accurate result, presented in a beautiful UI.
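The value calculation itself follows the standard 4-band resistor color code: the first two bands are digits and the third is a power-of-ten multiplier. A minimal sketch of that step (the function name `decodeResistor` is ours for illustration, not the app's actual API):

```cpp
#include <cmath>
#include <map>
#include <string>
#include <vector>

// Standard 4-band color code: bands[0] and bands[1] are digits,
// bands[2] is a power-of-ten multiplier (tolerance band ignored here).
double decodeResistor(const std::vector<std::string>& bands) {
    static const std::map<std::string, int> code = {
        {"black", 0}, {"brown", 1}, {"red", 2},    {"orange", 3},
        {"yellow", 4}, {"green", 5}, {"blue", 6},  {"violet", 7},
        {"grey", 8},   {"white", 9}};
    int d1 = code.at(bands[0]);
    int d2 = code.at(bands[1]);
    int mult = code.at(bands[2]);
    return (d1 * 10 + d2) * std::pow(10.0, mult);
}
```

This also shows how small the visual difference behind the story above is: green-red-black reads as 52 Ω, while green-red-green reads as 5.2 MΩ; only the multiplier band differs.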

How I built it

Using Swift 3 and Xcode 8, we built the app's screens and animations. The app passes camera data into Objective-C++ code that uses OpenCV to identify the band colors, with a small Objective-C wrapper serving as the bridge between the C++ OpenCV code and Swift.

Challenges I ran into

Filtering colors reliably under varying lighting can be quite challenging. In addition, Swift cannot call C++ directly, which introduces surprisingly difficult integration challenges when working with the Swift compiler.
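One common way to reduce sensitivity to lighting is to classify pixels by hue rather than raw RGB, since hue stays roughly constant as brightness changes. A self-contained sketch of the standard RGB-to-hue conversion (this mirrors what an HSV conversion in OpenCV provides; it is not the app's exact code):

```cpp
#include <algorithm>
#include <cmath>

// Convert RGB (0-255) to hue in degrees (0-360). Hue is largely
// invariant to brightness, so thresholding on hue instead of raw RGB
// makes color detection less sensitive to lighting conditions.
double rgbToHue(int r, int g, int b) {
    double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
    double mx = std::max({rf, gf, bf});
    double mn = std::min({rf, gf, bf});
    double d = mx - mn;
    if (d == 0) return 0.0;  // achromatic (black/grey/white)
    double h;
    if (mx == rf)      h = std::fmod((gf - bf) / d, 6.0);
    else if (mx == gf) h = (bf - rf) / d + 2.0;
    else               h = (rf - gf) / d + 4.0;
    h *= 60.0;
    if (h < 0) h += 360.0;
    return h;
}
```

Note that a bright red pixel and a dim red pixel produce the same hue, which is exactly the property that makes this useful under uneven lighting; saturation and value still need separate handling for black, white, and grey bands.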

Accomplishments that I'm proud of

We are able to perform all of these calculations, filtering full images at more than 10 Hz, which effectively makes the app real time.

What I learned

Integrating OpenCV on a mobile phone: we had previously only used its Python and web interfaces. We also learned a lot about hue filtering and live video stream processing.
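One piece of that live-stream processing can be sketched as grouping per-pixel color labels along a scanline into candidate bands, dropping runs too short to be real bands. This is an illustrative heuristic under our own assumptions (the name `segmentBands` and the run-length approach are ours), not the app's actual pipeline:

```cpp
#include <string>
#include <vector>

// Group a scanline of per-pixel color labels into candidate bands:
// each maximal run of identical labels becomes one band, and runs
// shorter than minRun pixels are discarded as noise.
std::vector<std::string> segmentBands(const std::vector<std::string>& px,
                                      int minRun) {
    std::vector<std::string> bands;
    int i = 0, n = static_cast<int>(px.size());
    while (i < n) {
        int j = i;
        while (j < n && px[j] == px[i]) ++j;  // extend the current run
        if (j - i >= minRun) bands.push_back(px[i]);
        i = j;
    }
    return bands;
}
```

In a real pipeline the runs matching the resistor's body color would then be filtered out, leaving only the colored bands in order.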

What's next for ResistMe

Use machine learning to better assess resistor values, building on the filtering heuristic we discovered at this hackathon.

Built With

swift, objective-c++, opencv, xcode
