Inspiration

I came across a home body 3D scanner from a company that claims to be the first in the world to deliver detailed health stats, like body fat percentage and body measurements, from the comfort of your home. But it costs around $1500, putting it out of reach for most people in the world. What if you could do the same with just a smartphone app?

What it does

You take a picture of yourself with your smartphone camera, and the app takes care of the rest: you get your body fat percentage and other body measurements, like waist size. The best part is that it uses only a single picture. No stereo cameras or other depth sensors. This makes it cheap and accessible to anyone who wants to keep track of their health.

How we built it

It uses a monocular depth estimation network to produce a pixel-level depth map, based on the CVPR 2019 paper 'Learning the Depths of Moving People by Watching Frozen People'. In parallel, a RetinaNet was fine-tuned to locate your body parts in the image. PyTorch was used for both networks, and it was extremely Pythonic and intuitive to use. The depth map and body part locations are then combined, together with camera intrinsics read from the photo's EXIF data, to calculate your body measurements, and body fat percentage is computed from those measurements with the U.S. Navy body fat formula.
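
To give a feel for the depth step, here is a minimal sketch using the publicly available MiDaS model from torch.hub as a stand-in for the Mannequin Challenge ('frozen people') network we actually used; the file name is a placeholder:

```python
import torch
import cv2

# Stand-in monocular depth model: MiDaS from torch.hub. Not the exact
# network from the paper, but the same idea: one RGB image in, one
# per-pixel depth map out.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS")
model.eval()

transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms.default_transform

img = cv2.cvtColor(cv2.imread("selfie.jpg"), cv2.COLOR_BGR2RGB)
batch = transform(img)  # resize + normalize -> (1, 3, H', W')

with torch.no_grad():
    prediction = model(batch)  # (1, H', W') relative inverse depth
    # Upsample back to the original image resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze()

# Note: the output is relative, not metric; absolute scale has to be
# recovered separately (e.g. from a known reference such as user height).
```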
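
The body part detector can be sketched with torchvision's RetinaNet and its standard detection training API (assuming torchvision >= 0.13). The class list and the single dummy training example below are hypothetical placeholders; fine-tuning on real annotations follows the same pattern:

```python
import torch
from torchvision.models.detection import retinanet_resnet50_fpn

# Hypothetical label set: 4 body part classes + background,
# per torchvision's num_classes convention.
NUM_CLASSES = 5

model = retinanet_resnet50_fpn(
    weights=None,                # detection heads randomly initialized
    weights_backbone="DEFAULT",  # ImageNet-pretrained ResNet-50 backbone
    num_classes=NUM_CLASSES,
)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# One dummy training example in torchvision's detection format.
images = [torch.rand(3, 512, 512)]
targets = [{
    "boxes": torch.tensor([[100.0, 150.0, 300.0, 250.0]]),  # (x1, y1, x2, y2)
    "labels": torch.tensor([2]),                            # e.g. "waist"
}]

model.train()
loss_dict = model(images, targets)  # classification + box regression losses
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()

# At inference time, model(images) in eval mode returns per-image dicts
# with 'boxes', 'scores' and 'labels' for the detected body parts.
```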
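
Combining the two outputs works roughly like this: the pinhole camera model turns a pixel span at a known depth into centimeters using the focal length recovered from EXIF, and the Navy formula turns circumferences into a body fat estimate. The sensor width default and the circular cross-section approximation below are simplifying assumptions, not what a production version would ship; the Navy formula constants are the standard published (metric, log10) ones:

```python
import math
from PIL import Image

def focal_length_px(path, sensor_width_mm=6.17):
    """Focal length in pixels via the pinhole model.

    sensor_width_mm is an assumption (a common 1/2.3" phone sensor);
    a real app would look this up per device model."""
    img = Image.open(path)
    exif_ifd = img.getexif().get_ifd(0x8769)  # Exif sub-IFD
    f_mm = float(exif_ifd[0x920A])            # 0x920A = FocalLength tag
    return f_mm * img.width / sensor_width_mm

def span_cm(pixel_span, depth_m, f_px):
    """Real-world size of a pixel span at a known depth (pinhole model)."""
    return pixel_span * depth_m / f_px * 100.0

def circumference_cm(front_width_cm):
    """Crude simplification: treat the body cross-section as a circle."""
    return math.pi * front_width_cm

def navy_body_fat_pct(waist_cm, neck_cm, height_cm, hip_cm=None, male=True):
    """U.S. Navy circumference formula, metric (log10) version."""
    if male:
        return 495.0 / (1.0324
                        - 0.19077 * math.log10(waist_cm - neck_cm)
                        + 0.15456 * math.log10(height_cm)) - 450.0
    return 495.0 / (1.29579
                    - 0.35004 * math.log10(waist_cm + hip_cm - neck_cm)
                    + 0.22100 * math.log10(height_cm)) - 450.0

# Example: waist 90 cm, neck 38 cm, height 178 cm -> roughly 20% body fat.
print(navy_body_fat_pct(waist_cm=90, neck_cm=38, height_cm=178))
```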

Challenges we ran into

Running the networks (fine-tuning and evaluation) on Google Colab proved challenging since we did not have access to local GPUs. Figuring out how to combine all the information into an accurate estimate also required a lot of pre-project research.

Accomplishments that we're proud of

The fact that it works and gives a decent estimate is something we're proud of.

What we learned

Actual engineering and putting everything together was a great learning experience.

What's next for BodyScan

Evaluate the model with more data and make it more accurate.

Built With

python
pytorch
