I have searched the YOLOv8 issues and discussions and found no similar questions.
Question
The P and R values reported by val() are the precision and recall at the confidence value that maximizes the F1 curve, not the P and R at the conf threshold passed in, e.g. val(conf=0.25). In other words, YOLO reports the best P and R achievable (at the F1-maximizing confidence) among the predictions that survive the specified conf filter, rather than the plain P and R computed by counting TP, FP, and FN against the ground truth at that threshold. Is my understanding correct? I could not find a precise explanation of this in the official documentation. In my project I used the P and R printed to the terminal directly as my model's metrics, which is incorrect; the correct P and R at a fixed threshold can be obtained by modifying the following code:
ultralytics/utils/metrics.py, in ap_per_class():
i = smooth(f1_curve.mean(0), 0.1).argmax()  # max F1 index
p, r, f1 = p_curve[:, 0], r_curve[:, 0], f1_curve[:, 0]  # i -> 0
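For reference, the "simple" P and R described above can be computed directly from matched predictions. This is only a minimal sketch; the pr_at_conf helper and the sample arrays below are hypothetical and not part of the Ultralytics API:

```python
import numpy as np

def pr_at_conf(tp, conf, n_gt, conf_thres=0.25):
    """Precision/recall at a fixed confidence threshold.

    tp:   boolean array, True where a prediction matched a ground-truth box
    conf: confidence score of each prediction
    n_gt: total number of ground-truth boxes
    """
    keep = conf >= conf_thres               # drop predictions below the threshold
    tp_count = int(tp[keep].sum())          # true positives among kept predictions
    precision = tp_count / max(int(keep.sum()), 1)
    recall = tp_count / max(n_gt, 1)
    return precision, recall

# Made-up matches and scores for illustration:
tp = np.array([True, True, False, True, False])
conf = np.array([0.9, 0.8, 0.6, 0.3, 0.1])
p, r = pr_at_conf(tp, conf, n_gt=4, conf_thres=0.25)  # 3 TP out of 4 kept, 4 GT
```

Here precision and recall both come out to 0.75: four predictions survive the 0.25 threshold, three of them match a ground-truth box, and there are four ground-truth boxes in total.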
Have I understood this correctly? Thanks for your answer.
Additional
No response
Hello! Thanks for your detailed observation. You're correct in your understanding of how precision (P) and recall (R) are being calculated in relation to the confidence (conf) threshold and the F1 score. The values reported during validation (val()) are indeed the maximum P and R associated with the highest F1 score, not directly at the explicit conf threshold such as conf=0.25.
To get P and R at a specific confidence threshold, adjust the calculation as you've outlined in ap_per_class() by setting the index i to 0. Because predictions below the conf passed to val() are already filtered out before the curves are built, the first index of the curves reflects all remaining predictions, i.e. P and R at your threshold, without any F1-score maximization.
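To illustrate what the index change does, here is a toy sketch with synthetic curves shaped like the (num_classes, 1000) arrays built in ap_per_class(). The curve shapes are invented for illustration and the smooth() call is omitted:

```python
import numpy as np

# Hypothetical precision/recall curves over 1000 confidence sample points,
# mimicking the (num_classes, 1000) arrays in ap_per_class().
x = np.linspace(0, 1, 1000)
r_curve = np.clip(1.0 - x, 0, 1)[None, :]        # recall falls as conf rises
p_curve = np.clip(0.5 + 0.5 * x, 0, 1)[None, :]  # precision rises with conf
f1_curve = 2 * p_curve * r_curve / (p_curve + r_curve + 1e-16)

# Default behavior: pick the index where mean F1 peaks.
i = f1_curve.mean(0).argmax()
p_default, r_default = p_curve[:, i], r_curve[:, i]

# Proposed change: take the first index, i.e. all predictions that
# survived the conf filter applied before the curves were built.
p_fixed, r_fixed = p_curve[:, 0], r_curve[:, 0]
```

With these synthetic curves the F1 peak sits at an interior index, so p_default and r_default differ from p_fixed and r_fixed, which is exactly the discrepancy described in the question.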
Feel free to test these changes, and let us know if this aligns with the results you need! Keep in mind any variations might occur due to the specifics of your dataset and detection thresholds. Happy coding! 🚀
@zwcity you're welcome! I'm glad to hear that the explanation was helpful. If you need further clarification or run into any more questions as you adjust the code or process your results, feel free to reach out. Happy modeling! 😊