Karsten Roth, Latha Pemula, Joaquin Zepeda, Bernhard Schölkopf, Thomas Brox, Peter Gehler
Being able to spot defective parts is a critical component of large-scale industrial manufacturing. A particular challenge that we address in this work is the cold-start problem: fit a model using nominal (non-defective) example images only. While handcrafted solutions per class are possible, the goal is to build systems that work well on many different tasks automatically. The best performing approaches combine embeddings from ImageNet models with an outlier detection model. In this paper, we extend this line of work and propose **PatchCore**, which uses a maximally representative memory bank of nominal patch features. PatchCore offers competitive inference times while achieving state-of-the-art performance for both detection and localization. On the challenging, widely used MVTec AD benchmark, PatchCore achieves an image-level anomaly detection AUROC of up to 99.6%, more than halving the error of the next best competitor. We further report competitive results on two additional datasets, as well as in the few-shot regime. (\* Work done during a research internship at Amazon AWS.) Code: github.com/amazon-research/patchcore-inspection.
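The core idea the abstract describes, a coreset-subsampled memory bank of nominal patch features, scored by nearest-neighbour distance at test time, can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: random vectors stand in for the CNN patch embeddings, and the score omits the paper's neighbourhood re-weighting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for nominal patch features; in PatchCore these come from
# mid-level layers of an ImageNet-pretrained backbone.
nominal_patches = rng.normal(size=(1000, 64))

def greedy_coreset(features, n_select):
    """Greedy k-center (minimax facility location) selection, the
    subsampling strategy used to shrink the memory bank."""
    selected = [0]
    # Distance of every feature to its closest selected center so far.
    dists = np.linalg.norm(features - features[0], axis=1)
    for _ in range(n_select - 1):
        idx = int(np.argmax(dists))        # farthest point joins the coreset
        selected.append(idx)
        new_d = np.linalg.norm(features - features[idx], axis=1)
        dists = np.minimum(dists, new_d)
    return features[selected]

memory_bank = greedy_coreset(nominal_patches, n_select=100)

def anomaly_score(test_patches, bank):
    """Image-level score: largest nearest-neighbour distance of any
    test patch to the memory bank (simplified)."""
    d = np.linalg.norm(test_patches[:, None, :] - bank[None, :, :], axis=-1)
    return float(d.min(axis=1).max())

normal_score = anomaly_score(rng.normal(size=(50, 64)), memory_bank)
shifted_score = anomaly_score(rng.normal(loc=3.0, size=(50, 64)), memory_bank)
```

Patches drawn from the nominal distribution land close to the memory bank, while shifted (anomalous) patches score much higher, which is the signal PatchCore thresholds for detection.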
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Anomaly Detection | MPDD | Detection AUROC | 82.12 | PatchCore |
| Anomaly Detection | MPDD | Segmentation AUROC | 95.66 | PatchCore |
| Anomaly Detection | AeBAD-V | Detection AUROC | 70.7 | PatchCore |
| Anomaly Detection | AeBAD-S | Detection AUROC | 71.0 | PatchCore |
| Anomaly Detection | AeBAD-S | Segmentation AUPRO | 87.8 | PatchCore |
| Anomaly Detection | MVTec AD | Detection AUROC | 99.6 | PatchCore Large |
| Anomaly Detection | MVTec AD | FPS | 5.88 | PatchCore Large |
| Anomaly Detection | MVTec AD | Segmentation AUPRO | 93.5 | PatchCore Large |
| Anomaly Detection | MVTec AD | Segmentation AUROC | 98.2 | PatchCore Large |
| Anomaly Detection | MVTec AD | Detection AUROC | 99.2 | PatchCore |
| Anomaly Detection | MVTec AD | Segmentation AUROC | 98.4 | PatchCore |
| Anomaly Detection | MVTec AD | Detection AUROC | 95.4 | PatchCore(16shot) |
| Anomaly Detection | MVTec LOCO AD | Avg. Detection AUROC | 80.3 | PatchCore |
| Anomaly Detection | MVTec LOCO AD | Detection AUROC (only logical) | 75.8 | PatchCore |
| Anomaly Detection | MVTec LOCO AD | Segmentation AU-sPRO (until FPR 5%) | 39.7 | PatchCore |
| Anomaly Detection | MVTec LOCO AD | Avg. Detection AUROC | 79.4 | PatchCore Ensemble |
| Anomaly Detection | MVTec LOCO AD | Detection AUROC (only logical) | 71.0 | PatchCore Ensemble |
| Anomaly Detection | MVTec LOCO AD | Detection AUROC (only structural) | 87.7 | PatchCore Ensemble |
| Anomaly Detection | MVTec LOCO AD | Segmentation AU-sPRO (until FPR 5%) | 36.5 | PatchCore Ensemble |
| Anomaly Detection | GoodsAD | AUPR | 86.1 | PatchCore-100% |
| Anomaly Detection | GoodsAD | AUROC | 85.5 | PatchCore-100% |
| Anomaly Detection | GoodsAD | AUPR | 83.3 | PatchCore-1% |
| Anomaly Detection | GoodsAD | AUROC | 81.4 | PatchCore-1% |
| 3D Anomaly Detection | Real 3D-AD | Mean Performance of P. and O. | 0.687 | PatchCore (FPFH+Raw) |
| 3D Anomaly Detection | Real 3D-AD | Object AUROC | 0.682 | PatchCore (FPFH+Raw) |
| 3D Anomaly Detection | Real 3D-AD | Point AUROC | 0.692 | PatchCore (FPFH+Raw) |
| 3D Anomaly Detection | Real 3D-AD | Mean Performance of P. and O. | 0.614 | PatchCore (PointMAE) |
| 3D Anomaly Detection | Real 3D-AD | Object AUROC | 0.594 | PatchCore (PointMAE) |
| 3D Anomaly Detection | Real 3D-AD | Point AUROC | 0.634 | PatchCore (PointMAE) |
| 3D Anomaly Detection | Real 3D-AD | Mean Performance of P. and O. | 0.5925 | PatchCore (FPFH) |
| 3D Anomaly Detection | Real 3D-AD | Object AUROC | 0.593 | PatchCore (FPFH) |
| 3D Anomaly Detection | Real 3D-AD | Point AUROC | 0.592 | PatchCore (FPFH) |
| 3D Anomaly Detection | Anomaly-ShapeNet10 | O-AUROC | 0.884 | PatchCore (FPFH) |
| 3D Anomaly Detection | Anomaly-ShapeNet10 | P-AUROC | 0.923 | PatchCore (FPFH) |
| 3D Anomaly Detection | Anomaly-ShapeNet10 | O-AUROC | 0.574 | PatchCore (PointMAE) |
| 3D Anomaly Detection | Anomaly-ShapeNet10 | P-AUROC | 0.645 | PatchCore (PointMAE) |