DoT: An efficient Double Transformer for NLP tasks with tables
Syrine Krichene, Thomas Müller, Julian Martin Eisenschlos
arXiv 2106.00479, 1 June 2021
Papers citing "DoT: An efficient Double Transformer for NLP tasks with tables" (3 of 3 shown):
Table Retrieval May Not Necessitate Table-specific Model Design
Zhiruo Wang, Zhengbao Jiang, Eric Nyberg, Graham Neubig
LMTD, 19 May 2022
Table Pre-training: A Survey on Model Architectures, Pre-training Objectives, and Downstream Tasks
Haoyu Dong, Zhoujun Cheng, Xinyi He, Mengyuan Zhou, Anda Zhou, Fan Zhou, Ao Liu, Shi Han, Dongmei Zhang
LMTD, 24 Jan 2022
MATE: Multi-view Attention for Table Transformer Efficiency
Julian Martin Eisenschlos, Maharshi Gor, Thomas Müller, William W. Cohen
LMTD, 09 Sep 2021