Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


WebQuestions

Modality: Texts · License: Unknown · Introduced: 2013-01-01

The WebQuestions dataset is a question answering dataset that uses Freebase as its knowledge base and contains 6,642 question-answer pairs. It was created by crawling questions through the Google Suggest API and then obtaining answers via Amazon Mechanical Turk. The original split uses 3,778 examples for training and 2,032 for testing. All answers are defined as Freebase entities.

Example questions (answers) in the dataset include “Where did Edgar Allan Poe died?” (baltimore) and “What degrees did Barack Obama get?” (bachelor_of_arts, juris_doctor).
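The record structure and scoring can be pictured with a short sketch. The field names below are illustrative, not an official schema, and the normalization step is an assumption; since answers are Freebase entities, a prediction is typically counted correct if it matches any of the gold entities:

```python
# Illustrative WebQuestions-style record and an exact-match check.
# Field names ("question", "answers") are assumptions for illustration.

def normalize(answer: str) -> str:
    """Lowercase and strip whitespace before comparison (assumed policy)."""
    return answer.strip().lower()

def exact_match(prediction: str, gold_answers: list) -> bool:
    """Score 1 if the prediction matches any gold Freebase entity."""
    return normalize(prediction) in {normalize(a) for a in gold_answers}

example = {
    "question": "What degrees did Barack Obama get?",
    "answers": ["bachelor_of_arts", "juris_doctor"],  # Freebase entities
}

print(exact_match("Juris_Doctor", example["answers"]))  # True
```

Because many questions have multiple gold entities, evaluation scripts usually take a match against any one of them, as above.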

Source: Question Answering with Subgraph Embeddings
Image Source: Berant et al.

Benchmarks

Data-to-Text Generation/BLEU
Data-to-Text Generation/METEOR
Data-to-Text Generation/ROUGE
KG-to-Text Generation/BLEU
KG-to-Text Generation/METEOR
KG-to-Text Generation/ROUGE
Open-Domain Question Answering/Exact Match
Question Answering/EM
Question Answering/F1
Question Answering/Exact Match
Text Generation/BLEU
Text Generation/METEOR
Text Generation/ROUGE
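Several of the benchmarks above report token-level F1 alongside exact match. A common formulation (assumed here, in the style of SQuAD-like QA evaluation) computes bag-of-tokens overlap between prediction and gold answer, taking the best score over the gold answers:

```python
# Token-level F1 between a prediction and gold answer strings.
# This is an assumed, SQuAD-style formulation, not the official
# evaluation script for any specific leaderboard above.
from collections import Counter

def token_f1(prediction: str, gold: str) -> float:
    """Harmonic mean of token precision and recall."""
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

def best_f1(prediction: str, gold_answers: list) -> float:
    """Score the prediction against its best-matching gold answer."""
    return max(token_f1(prediction, g) for g in gold_answers)

print(round(best_f1("baltimore maryland", ["baltimore"]), 3))  # 0.667
```

Taking the max over gold answers rewards matching any one correct entity; averaging over them instead would penalize questions with many valid answers.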

Related Benchmarks

WebQuestionsSP/Question Answering/Accuracy
WebQuestionsSP/Question Answering/F1
WebQuestionsSP/Question Answering/Hits@1
WebQuestionsSP/Semantic Parsing/Accuracy

Statistics

Papers: 241
Benchmarks: 13

Links

Homepage

Tasks

Data-to-Text Generation
KG-to-Text Generation
Knowledge Base Question Answering
Open-Domain Question Answering
Question Answering
Text Generation