Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Table annotation

21 benchmarks · 31 papers

Table annotation is the task of annotating a table with terms/concepts from a knowledge graph or database schema. It is typically broken down into the following five subtasks:

  1. Cell Entity Annotation (CEA)
  2. Column Type Annotation (CTA)
  3. Column Property Annotation (CPA)
  4. Table Type Detection
  5. Row Annotation

The SemTab challenge is closely related to the table annotation problem. It is a yearly challenge that focuses on the first three subtasks, and its purpose is to benchmark different table annotation systems against one another.
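To make the first three subtasks concrete, here is a minimal, illustrative sketch of what a table annotation system produces. The toy table, the entity/type lookup dictionaries, and the fixed property assignment are all assumptions standing in for a real system, which would query a knowledge graph such as Wikidata or DBpedia and disambiguate among candidate matches.

```python
# Toy relational table (illustrative data only): subject column 0 (country),
# object column 1 (capital city).
table = [
    ["France", "Paris"],
    ["Japan", "Tokyo"],
]

# Hypothetical knowledge-graph lookups; a real system would retrieve and
# rank candidate entities from the KG rather than use a fixed dictionary.
ENTITY_INDEX = {
    "France": "wd:Q142", "Paris": "wd:Q90",
    "Japan": "wd:Q17", "Tokyo": "wd:Q1490",
}
TYPE_OF = {
    "wd:Q142": "wd:Q6256", "wd:Q17": "wd:Q6256",    # country
    "wd:Q90": "wd:Q5119", "wd:Q1490": "wd:Q5119",   # capital city
}

def annotate(table):
    """Return (CEA, CTA, CPA) annotations for a relational table."""
    # CEA: link each cell to a KG entity.
    cea = {(r, c): ENTITY_INDEX.get(cell)
           for r, row in enumerate(table)
           for c, cell in enumerate(row)}
    # CTA: majority-vote over the types of the linked entities per column.
    cta = {}
    for c in range(len(table[0])):
        types = [TYPE_OF[cea[(r, c)]]
                 for r in range(len(table)) if cea[(r, c)]]
        cta[c] = max(set(types), key=types.count) if types else None
    # CPA: relate the subject column to each other column; here a fixed,
    # assumed property ("capital", wdt:P36) stands in for real KG matching.
    cpa = {(0, 1): "wdt:P36"}
    return cea, cta, cpa
```

For the toy table above, `annotate(table)` links the cell (0, 0) to `wd:Q142`, types column 0 as `wd:Q6256` (country), and relates columns 0 and 1 via `wdt:P36` (capital). Real systems must additionally handle noisy cell values, missing entities, and ambiguous candidates, which is what the benchmarks below measure.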

Benchmarks

  - BiodivTab: F1 (%)
  - ToughTables-DBP: F1 (%)
  - ToughTables-WD: F1 (%)
  - WDC SOTAB V2: Micro F1
  - T2Dv2: F1 (%), Accuracy (%)
  - WDC SOTAB: Micro F1, Weighted F1
  - GitTables-SemTab-DBP: F1 (%)
  - VizNet-Sato-Full: Macro-F1, Weighted-F1
  - GitTables-SemTab-SCH: F1 (%)
  - VizNet-Sato-MultiColumn: Macro-F1, Weighted-F1
  - WikiTables-TURL-CPA: F1 (%), Macro-F1
  - WikiTables-TURL-CTA: F1 (%), Macro-F1
  - WikipediaGS-CTA: Accuracy (%)
  - WikiTables-TURL-CEA: F1 (%)
  - WikipediaGS: F1 (%)