Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Inner-IoU: More Effective Intersection over Union Loss with Auxiliary Bounding Box

Hao Zhang, Cong Xu, Shuaijie Zhang

2023-11-06 · regression · Object Detection
Paper · PDF · Code (official)

Abstract

With the rapid development of detectors, the Bounding Box Regression (BBR) loss function has been constantly updated and optimized. However, existing IoU-based BBR methods still focus on accelerating convergence by adding new loss terms, ignoring the limitations of the IoU loss term itself. Although the IoU loss can, in theory, effectively describe the state of bounding box regression, in practical applications it cannot adjust itself to different detectors and detection tasks, and it does not generalize strongly. Based on the above, we first analyze the BBR model and conclude that distinguishing different regression samples and using auxiliary bounding boxes of different scales to calculate losses can effectively accelerate the bounding box regression process. For high-IoU samples, using smaller auxiliary bounding boxes to calculate losses accelerates convergence, while larger auxiliary bounding boxes are suitable for low-IoU samples. We then propose the Inner-IoU loss, which calculates the IoU loss through auxiliary bounding boxes. For different datasets and detectors, we introduce a scaling factor, ratio, to control the scale of the auxiliary bounding boxes used to calculate losses. Finally, we integrate Inner-IoU into existing IoU-based loss functions for simulation and comparative experiments. The experimental results demonstrate a further enhancement in detection performance with the method proposed in this paper, verifying the effectiveness and generalization ability of the Inner-IoU loss. Code is available at https://github.com/malagoutou/Inner-IoU.
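The core idea described in the abstract can be sketched in a few lines: scale both boxes about their centers by a factor ratio to obtain auxiliary ("inner") boxes, then take 1 minus the IoU of the auxiliary boxes as the loss. This is a minimal plain-Python sketch of that idea; the function names and the box format (x1, y1, x2, y2) are illustrative assumptions, not the authors' API — see the official repository for the exact implementation.

```python
def scale_box(box, ratio):
    # Auxiliary box: shrink (ratio < 1) or enlarge (ratio > 1) about the center.
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    hw, hh = (x2 - x1) * ratio / 2, (y2 - y1) * ratio / 2
    return (cx - hw, cy - hh, cx + hw, cy + hh)

def iou(a, b):
    # Standard IoU of two axis-aligned boxes (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def inner_iou_loss(pred, target, ratio=0.75):
    # Inner-IoU loss: 1 - IoU of the ratio-scaled auxiliary boxes.
    # Per the abstract, ratio < 1 (smaller auxiliary boxes) suits
    # high-IoU samples; ratio > 1 suits low-IoU samples.
    return 1.0 - iou(scale_box(pred, ratio), scale_box(target, ratio))
```

In practice the paper plugs this term into existing IoU-based losses (e.g. replacing the plain IoU term), with ratio tuned per detector and dataset; the sketch above only shows the auxiliary-box computation itself.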

Results

Task              Dataset          Metric     Value  Model
Object Detection  AI-TOD           mAP@50     43.77  YOLOv5+Inner-IoU
Object Detection  AI-TOD           mAP@50-95  18.23  YOLOv5+Inner-IoU
Object Detection  PASCAL VOC 2007  mAP@50     64.44  YOLOv7+Inner-IoU
Object Detection  PASCAL VOC 2007  mAP@50-95  38.52  YOLOv7+Inner-IoU

Related Papers

Language Integration in Fine-Tuning Multimodal Large Language Models for Image-Based Regression (2025-07-20)
A Real-Time System for Egocentric Hand-Object Interaction Detection in Industrial Domains (2025-07-17)
RS-TinyNet: Stage-wise Feature Fusion Network for Detecting Tiny Objects in Remote Sensing Images (2025-07-17)
Decoupled PROB: Decoupled Query Initialization Tasks and Objectness-Class Learning for Open World Object Detection (2025-07-17)
Dual LiDAR-Based Traffic Movement Count Estimation at a Signalized Intersection: Deployment, Data Collection, and Preliminary Analysis (2025-07-17)
Neural Network-Guided Symbolic Regression for Interpretable Descriptor Discovery in Perovskite Catalysts (2025-07-16)
Imbalanced Regression Pipeline Recommendation (2025-07-16)
Second-Order Bounds for [0,1]-Valued Regression via Betting Loss (2025-07-16)