Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Zero-Shot Semantic Segmentation on COCO-Stuff

Metric: Inductive Setting hIoU (higher is better)
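In the zero-shot segmentation literature, hIoU is the harmonic mean of the mean IoU over seen (training) classes and unseen (held-out) classes, so a model cannot score well by fitting seen classes alone. A minimal sketch of the metric, assuming per-class IoU scores have already been computed (the function name and inputs are illustrative, not from this site):

```python
def hiou(seen_ious, unseen_ious):
    """Harmonic mean of mean IoU over seen and unseen classes.

    seen_ious / unseen_ious: per-class IoU values in [0, 1].
    """
    miou_seen = sum(seen_ious) / len(seen_ious)
    miou_unseen = sum(unseen_ious) / len(unseen_ious)
    if miou_seen + miou_unseen == 0:
        return 0.0
    # Harmonic mean penalizes a large gap between the two groups.
    return 2 * miou_seen * miou_unseen / (miou_seen + miou_unseen)
```

For example, a model with 0.6 mIoU on seen classes but only 0.3 on unseen classes gets hIoU 0.4, below the arithmetic mean of 0.45.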


Results

| # | Model | Inductive Setting hIoU | Extra Data | Paper | Date | Code |
|---|-------|------------------------|------------|-------|------|------|
| 1 | OTSeg+ | 41.5 | No | OTSeg: Multi-prompt Sinkhorn Attention for Zero-... | 2024-03-21 | Code |
| 2 | OTSeg | 41.4 | No | OTSeg: Multi-prompt Sinkhorn Attention for Zero-... | 2024-03-21 | Code |
| 3 | CLIP-RC | 41.2 | No | - | - | Code |
| 4 | ZegCLIP | 40.8 | No | ZegCLIP: Towards Adapting CLIP for Zero-shot Sem... | 2022-12-07 | Code |
| 5 | DeOP | 38.2 | No | Open-Vocabulary Semantic Segmentation with Decou... | 2023-04-03 | Code |
| 6 | zsseg | 36.3 | No | A Simple Baseline for Open-Vocabulary Semantic S... | 2021-12-29 | Code |
| 7 | ZegFormer | 33.2 | No | Decoupling Zero-Shot Semantic Segmentation | 2021-12-15 | Code |
| 8 | SIGN | 20.9 | No | SIGN: Spatial-information Incorporated Generativ... | 2021-08-27 | - |
| 9 | CaGNet | 18.2 | No | Context-aware Feature Generation for Zero-shot S... | 2020-08-16 | Code |
| 10 | ZS5 | 15.0 | No | Zero-Shot Semantic Segmentation | 2019-06-03 | Code |
| 11 | SPNet | 14.0 | No | - | - | Code |