Seulki Park, Jongin Lim, Younghan Jeon, Jin Young Choi
In this paper, we propose a balancing training method to address the problems of learning from imbalanced data. To this end, we derive a new loss, used in the balancing training phase, that alleviates the influence of samples that cause an overfitted decision boundary. The proposed loss efficiently improves the performance of any type of imbalanced learning method. In experiments on multiple benchmark datasets, we demonstrate the validity of our method and show that the proposed loss outperforms state-of-the-art cost-sensitive loss methods. Furthermore, since our loss is not restricted to a specific task, model, or training method, it can easily be combined with other recent re-sampling, meta-learning, and cost-sensitive learning methods for class-imbalance problems.
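The abstract does not give the exact formulation, but one minimal reading of the idea is to scale each sample's cross-entropy term by the inverse of a per-sample influence proxy, so that samples that would push the decision boundary hardest contribute less during the balancing phase. The sketch below assumes a PyTorch setting; the function name `influence_balanced_loss`, the choice of proxy (output-error norm times feature norm), and the `eps`/`class_weights` parameters are all illustrative assumptions, not the paper's released code or API.

```python
import torch.nn.functional as F

def influence_balanced_loss(logits, features, targets, class_weights=None, eps=1e-3):
    """Cross-entropy re-weighted by the inverse of a per-sample influence proxy.

    The proxy used here is ||softmax(logits) - onehot(targets)||_1 * ||features||_1:
    samples whose gradients would pull hardest on the decision boundary are
    down-weighted. Illustrative sketch only, not the authors' exact method.
    """
    probs = F.softmax(logits, dim=1)
    onehot = F.one_hot(targets, num_classes=logits.size(1)).float()
    # Per-sample influence proxy: output-error magnitude times feature magnitude.
    influence = (probs - onehot).abs().sum(dim=1) * features.abs().sum(dim=1)
    # Per-sample cross-entropy, optionally with class-wise weights.
    ce = F.cross_entropy(logits, targets, weight=class_weights, reduction="none")
    # Down-weight high-influence samples; eps guards against division by zero.
    return (ce / (influence + eps)).mean()
```

In a two-phase setup like the one the abstract describes, one would train with plain cross-entropy first and then switch to `loss = influence_balanced_loss(logits, feats, y)` in the balancing phase, where `feats` denotes the penultimate-layer activations of the network.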
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Long-tail Learning | CIFAR-10-LT (ρ=10) | Error Rate | 12.93 | IBLLoss |
| Long-tail Learning | CIFAR-100-LT (ρ=100) | Error Rate | 61.52 | IBLLoss |