Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


MagicBrush: A Manually Annotated Dataset for Instruction-Guided Image Editing

Kai Zhang, Lingbo Mo, Wenhu Chen, Huan Sun, Yu Su

2023-06-16 · NeurIPS 2023 · Image Editing
Paper · PDF · Code (official)

Abstract

Text-guided image editing is widely needed in daily life, ranging from personal use to professional applications such as Photoshop. However, existing methods are either zero-shot or trained on an automatically synthesized dataset, which contains a high volume of noise. Thus, they still require lots of manual tuning to produce desirable outcomes in practice. To address this issue, we introduce MagicBrush (https://osu-nlp-group.github.io/MagicBrush/), the first large-scale, manually annotated dataset for instruction-guided real image editing that covers diverse scenarios: single-turn, multi-turn, mask-provided, and mask-free editing. MagicBrush comprises over 10K manually annotated triplets (source image, instruction, target image), which supports training large-scale text-guided image editing models. We fine-tune InstructPix2Pix on MagicBrush and show that the new model can produce much better images according to human evaluation. We further conduct extensive experiments to evaluate current image editing baselines from multiple dimensions including quantitative, qualitative, and human evaluations. The results reveal the challenging nature of our dataset and the gap between current baselines and real-world editing needs.
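The abstract describes each MagicBrush annotation as a (source image, instruction, target image) triplet, optionally with an edit mask, and notes that turns can chain into multi-turn sessions. A minimal sketch of that structure, assuming illustrative field names (`source_image`, `instruction`, `target_image`, `mask` are not taken from the dataset's actual schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EditTurn:
    """One annotation: a (source image, instruction, target image) triplet."""
    source_image: str            # image before editing (path or URL)
    instruction: str             # natural-language edit instruction
    target_image: str            # manually edited result (path or URL)
    mask: Optional[str] = None   # edit-region mask for the mask-provided scenario

def build_session(turns: list[EditTurn]) -> list[EditTurn]:
    """Chain turns for multi-turn editing: each turn starts from the previous result."""
    for prev, cur in zip(turns, turns[1:]):
        assert cur.source_image == prev.target_image, "multi-turn chain is broken"
    return turns

# Single-turn, mask-free example (file names are placeholders):
first = EditTurn("img_0.png", "let the dog wear a red hat", "img_1.png")
second = EditTurn("img_1.png", "add a bone next to it", "img_2.png")
session = build_session([first, second])
print(len(session))  # 2
```

The chaining assertion captures the multi-turn setup from the abstract: a session is valid only when every instruction edits the image produced by the previous turn.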

Results

| Task          | Dataset      | Metric     | Value | Model      |
|---------------|--------------|------------|-------|------------|
| Image Editing | ImgEdit-Data | Action     | 1.22  | MagicBrush |
| Image Editing | ImgEdit-Data | Add        | 2.84  | MagicBrush |
| Image Editing | ImgEdit-Data | Adjust     | 1.58  | MagicBrush |
| Image Editing | ImgEdit-Data | Background | 1.75  | MagicBrush |
| Image Editing | ImgEdit-Data | Extract    | 1.51  | MagicBrush |
| Image Editing | ImgEdit-Data | Hybrid     | 1.62  | MagicBrush |
| Image Editing | ImgEdit-Data | Overall    | 1.83  | MagicBrush |
| Image Editing | ImgEdit-Data | Remove     | 1.58  | MagicBrush |
| Image Editing | ImgEdit-Data | Replace    | 1.97  | MagicBrush |
| Image Editing | ImgEdit-Data | Style      | 2.38  | MagicBrush |

Related Papers

- NoHumansRequired: Autonomous High-Quality Image Editing Triplet Mining (2025-07-18)
- UniWorld-V1: High-Resolution Semantic Encoders for Unified Visual Understanding and Generation (2025-06-03)
- ImgEdit: A Unified Image Editing Dataset and Benchmark (2025-05-26)
- Emerging Properties in Unified Multimodal Pretraining (2025-05-20)
- Step1X-Edit: A Practical Framework for General Image Editing (2025-04-24)
- AnyEdit: Edit Any Knowledge Encoded in Language Models (2025-02-08)
- UltraEdit: Instruction-based Fine-Grained Image Editing at Scale (2024-07-07)
- In-Context Editing: Learning Knowledge from Self-Induced Distributions (2024-06-17)