Text Summarization on PubMed
Metric: ROUGE-L (higher is better)
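The ROUGE-L numbers below are taken from the respective papers. As a point of reference, the sketch below shows how a ROUGE-L F1 score can be computed with Google's `rouge-score` package; this is only an illustration, since individual papers may use different implementations, tokenization, or the summary-level "ROUGE-Lsum" variant, so self-computed scores are not guaranteed to match the leaderboard exactly.

```python
# Minimal ROUGE-L scoring sketch using the `rouge-score` package
# (pip install rouge-score). Strings below are toy examples, not PubMed data.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)

reference = "the patients showed significant improvement after treatment"
candidate = "patients improved significantly after treatment"

# score(target, prediction) returns precision, recall, and F1 per metric;
# leaderboards typically report the F1 (fmeasure) value.
scores = scorer.score(reference, candidate)
print(scores["rougeL"].fmeasure)
```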
Leaderboard
| # | Model | ROUGE-L | Extra Data | Paper | Date | Code available |
|---|-------|---------|------------|-------|------|----------------|
| 1 | LongT5 | 46.67 | No | LongT5: Efficient Text-To-Text Transformer for Long Sequences | 2021-12-15 | Yes |
| 2 | Top Down Transformer (AdaPool) (464M) | 46.47 | No | Long Document Summarization with Top-down and Bottom-up Inference | 2022-03-15 | Yes |
| 3 | eyeglaxs | 45.96 | No | Scaling Up Summarization: Leveraging Large Langu… | 2024-08-28 | – |
| 4 | GoSum (extractive) | 45.1 | No | GoSum: Extractive Summarization of Long Documen… | 2022-11-18 | Yes |
| 5 | Lodoss-full-large (extractive) | 44.84 | No | Toward Unifying Text Segmentation and Long Document Summarization | 2022-10-28 | Yes |
| 6 | MemSum (extractive) | 44.42 | No | MemSum: Extractive Summarization of Long Documents with Multi-step Episodic Markov Decision Processes | 2021-07-19 | Yes |
| 7 | Lodoss-full-base (extractive) | 44.4 | No | Toward Unifying Text Segmentation and Long Document Summarization | 2022-10-28 | Yes |
| 8 | FactorSum | 43.76 | No | Factorizing Content and Budget Decisions in Abstractive Summarization of Long Documents | 2022-05-25 | Yes |
| 9 | GRETEL | 43.16 | No | GRETEL: Graph Contrastive Topic Enhanced Language Model for Long Document Extractive Summarization | 2022-08-21 | – |
| 10 | DANCER PEGASUS | 42.42 | Yes | A Divide-and-Conquer Approach to the Summarization of Long Documents | 2020-04-13 | Yes |
| 11 | BigBird-Pegasus | 42.33 | Yes | Big Bird: Transformers for Longer Sequences | 2020-07-28 | Yes |
| 12 | HiStruct+ | 42.11 | No | HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information | 2022-03-17 | – |
| 13 | ExtSum-LG+MMR-Select+ | 40.99 | No | Systematically Exploring Redundancy Reduction in Summarizing Long Documents | 2020-11-30 | Yes |
| 14 | ExtSum-LG+RdLoss | 40.95 | No | Systematically Exploring Redundancy Reduction in Summarizing Long Documents | 2020-11-30 | Yes |
| 15 | DANCER LSTM | 40.27 | No | A Divide-and-Conquer Approach to the Summarization of Long Documents | 2020-04-13 | Yes |
| 16 | DANCER RUM | 40.25 | No | A Divide-and-Conquer Approach to the Summarization of Long Documents | 2020-04-13 | Yes |
| 17 | GenCompareSum | 38.25 | No | – | – | Yes |
| 18 | MatchSum (BERT-base) | 36.75 | No | Extractive Summarization as Text Matching | 2020-04-19 | Yes |
| 19 | HAT-BART | 36.69 | Yes | Hierarchical Learning for Generation with Long Source Sequences | 2021-04-15 | – |
| 20 | Fastformer | 34.81 | No | Fastformer: Additive Attention Can Be All You Need | 2021-08-20 | Yes |
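For readers who want to reproduce a ROUGE-L number like those above, the sketch below outlines an evaluation loop over the PubMed test split. It assumes the `ccdv/pubmed-summarization` mirror on the Hugging Face Hub with `article`/`abstract` fields and a placeholder `summarize` function; the dataset id, column names, and summarizer are assumptions to be replaced with the actual model under evaluation.

```python
# Hypothetical evaluation loop over the PubMed summarization test set.
# Assumes the "ccdv/pubmed-summarization" dataset id and "article"/"abstract"
# columns; verify these before relying on the script.
from datasets import load_dataset
from rouge_score import rouge_scorer

dataset = load_dataset("ccdv/pubmed-summarization", split="test")
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)

def summarize(article: str) -> str:
    # Placeholder: plug in the model under evaluation here.
    return article[:512]

f1_scores = []
for example in dataset:
    prediction = summarize(example["article"])
    score = scorer.score(example["abstract"], prediction)
    f1_scores.append(score["rougeL"].fmeasure)

print(f"ROUGE-L F1: {100 * sum(f1_scores) / len(f1_scores):.2f}")
```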