👋 Hello @wangp22, thank you for reaching out and using Ultralytics 🚀! We appreciate your thoughtful questions about interpreting validation metrics in YOLOv9. For new users, we recommend checking out the Docs, where you'll find Python and CLI examples plus detailed explanations of results and metrics. If this is a 🐛 Bug Report, please provide a minimum reproducible example to help us investigate further. If this is a custom training ❓ Question, please include as many details as possible, such as dataset sample images and training logs, and confirm you're following our Tips for Best Training Results. Join the Ultralytics community for support and discussion.

**Upgrade**

Please ensure you are running the latest version: `pip install -U ultralytics`

**Environments**

You can run YOLO in any of these verified environments (all dependencies, CUDA/CUDNN, Python, and PyTorch preinstalled).

**Status**

If this badge is green, all Ultralytics CI tests are currently passing. CI tests verify correct operation of all YOLO Modes and Tasks across macOS, Windows, and Ubuntu every 24 hours and on every commit.

This is an automated response 🦾. An Ultralytics engineer will review your question and assist you soon!
Hi,
After running a Python script containing model.val() to validate a model on a test dataset, I get metrics printed to the terminal as rows of six numbers, like:
```
all        1372   1586   0.857   0.733   0.815   0.754
class1        8      9   0.985   1       0.995   0.917
class2        4      5   0.943   0.8     0.817   0.671
```
and so on...
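For context, here's a minimal sketch of the kind of script I'm running (the checkpoint name and dataset YAML are placeholders; I believe the `metrics.box` attributes correspond to the printed "all" row):

```python
from ultralytics import YOLO

# Placeholder checkpoint and dataset YAML -- substitute your own files.
model = YOLO("yolov9c.pt")
metrics = model.val(data="data.yaml", split="test")

# Aggregate values that I think match the "all" row printed above:
print(metrics.box.mp)     # mean precision over all classes
print(metrics.box.mr)     # mean recall over all classes
print(metrics.box.map50)  # mAP at IoU threshold 0.50
print(metrics.box.map)    # mAP averaged over IoU 0.50:0.95
```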
I would like to think these are the number of images, number of instances, precision, recall, mAP50, and mAP50-95, respectively. But I'm not sure how to interpret mAP50 and mAP50-95 on a per-class row, since I thought those are averages across all classes.
Also, precision and recall can vary depending on the IoU (intersection over union) threshold used to match ground truth with detections. I think a range of IoU values is used, from 0.5 to 0.95 in increments of 0.05, so I'm not sure how to interpret the single precision/recall value shown. Do they correspond to a particular IoU threshold?
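For reference, this is the COCO-style IoU threshold set I'm assuming:

```python
import numpy as np

# Assumed COCO-style IoU thresholds: 0.50 to 0.95 in steps of 0.05.
iou_thresholds = np.linspace(0.5, 0.95, 10)
print(iou_thresholds)  # [0.5  0.55 0.6  0.65 0.7  0.75 0.8  0.85 0.9  0.95]
```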
Could you please help clear up my confusion?
Thanks