We present some updates to YOLO! We made a bunch of little design changes to make it better. We also trained this new network that’s pretty swell. It’s a little bigger than last time but more accurate. It’s still fast though, don’t worry. At 320 × 320 YOLOv3 runs in 22 ms at 28.2 mAP, as accurate as SSD but three times faster. When we look at the old .5 IOU mAP detection metric YOLOv3 is quite good. It achieves 57.9 AP50 in 51 ms on a Titan X, compared to 57.5 AP50 in 198 ms by RetinaNet, similar performance but 3.8× faster.
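Since the AP50 metric above counts a detection as correct when its overlap with the ground truth reaches 0.5 IoU, here is a minimal sketch of the intersection-over-union computation itself. The corner-coordinate box format `(x1, y1, x2, y2)` is an assumption for illustration.

```python
# Minimal sketch of the intersection-over-union (IoU) computation that
# underlies the AP50 metric (a detection counts as a true positive when
# IoU >= 0.5). Boxes are assumed to be (x1, y1, x2, y2) corner coordinates.

def iou(box_a, box_b):
    """Return the intersection-over-union of two axis-aligned boxes."""
    # Corners of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])

    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


print(iou((0, 0, 10, 10), (0, 0, 10, 10)))    # identical boxes -> 1.0
print(iou((0, 0, 10, 10), (20, 20, 30, 30)))  # disjoint boxes -> 0.0
```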

What it does

Uses pretrained weights to make predictions on images. The table below lists inference times for input images scaled to 256x256. The ResNet backbone measurements are taken from the YOLOv3 paper; the marked Darknet-53 measurement shows the inference time of this implementation on my 1080 Ti.
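As a rough illustration of how such per-image timings can be collected, here is a minimal timing harness. `fake_model` is a stand-in for the real detector (not shown here); the pattern that carries over is the warm-up runs followed by averaged timed runs.

```python
import time

def time_inference(model, batch, warmup=3, runs=10):
    """Return the mean per-call latency of model(batch) in milliseconds."""
    for _ in range(warmup):          # warm-up: exclude one-off setup costs
        model(batch)
    start = time.perf_counter()
    for _ in range(runs):
        model(batch)
    elapsed = time.perf_counter() - start
    return elapsed / runs * 1000.0   # milliseconds per call

def fake_model(batch):
    # Placeholder standing in for a forward pass on a 256x256 input.
    return [sum(row) for row in batch]

batch = [[float(i + j) for j in range(256)] for i in range(256)]
print(f"{time_inference(fake_model, batch):.3f} ms")
```

With a real GPU model you would also need to synchronize the device before reading the clock, otherwise the measured time only covers kernel launch.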

How I built it

numpy
torch>=1.0
torchvision
matplotlib
tensorflow
tensorboard
terminaltables
pillow
tqdm

Challenges I ran into

One of the challenges was using complex mathematical formulae in Python to derive certain parameters. However, the Python Package on A2019 made it really easy to call Python functions inside the code and get the output. Creating a waypoint plan for the Autopilot was also a challenge at first, but using the product documentation I was able to create the flight plans with the Log to File package, making all the values dynamic.
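The waypoint plan itself was produced with A2019's Log to File action, but the underlying idea, computing waypoint values dynamically and writing them out one line at a time, can be sketched in plain Python. The field names, the serpentine grid pattern, and the coordinates below are illustrative assumptions, not the real flight plan.

```python
import csv
import io

def make_waypoints(lat0, lon0, rows, cols, step_deg):
    """Generate a simple serpentine survey grid of (lat, lon) waypoints.

    Hypothetical helper: rows alternate direction so the path zigzags.
    """
    waypoints = []
    for r in range(rows):
        col_order = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in col_order:
            waypoints.append((lat0 + r * step_deg, lon0 + c * step_deg))
    return waypoints

def log_to_file(waypoints, stream):
    """Write the waypoints as CSV lines, one per waypoint."""
    writer = csv.writer(stream)
    writer.writerow(["index", "latitude", "longitude"])
    for i, (lat, lon) in enumerate(waypoints):
        writer.writerow([i, f"{lat:.6f}", f"{lon:.6f}"])

# Example: a 3x4 grid starting from an arbitrary origin.
plan = make_waypoints(12.9716, 77.5946, rows=3, cols=4, step_deg=0.0005)
buf = io.StringIO()
log_to_file(plan, buf)
print(buf.getvalue().splitlines()[0])  # header line
```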

Accomplishments that I'm proud of

I am happy that I was able to combine my academic knowledge of ML, my work expertise in AI/ML, and my curiosity to explore new possibilities in creating this solution, which can help the world in this traumatic situation. I am also happy that I was able to bring out the true sense of the Torch in the name "PyTorch".

What I learned

Developing this project was a great experience, involving a lot of research, trial and error, and new learning. It gave me a start in ML, and I'm looking forward to trying out further possibilities.

What's next for PyTorch-YOLOv3


Built With
