arXiv: 2106.00479
DoT: An efficient Double Transformer for NLP tasks with tables
1 June 2021
Syrine Krichene, Thomas Müller, Julian Martin Eisenschlos
Papers citing "DoT: An efficient Double Transformer for NLP tasks with tables" (8 of 8 papers shown)

TableLoRA: Low-rank Adaptation on Table Structure Understanding for Large Language Models
Xinyi He, Yihao Liu, Mengyu Zhou, Yeye He, Haoyu Dong, Shi Han, Zejian Yuan, Dongmei Zhang
06 Mar 2025

Training Table Question Answering via SQL Query Decomposition
Raphael Mouravieff, Benjamin Piwowarski, Sylvain Lamprier
19 Feb 2024

CABINET: Content Relevance based Noise Reduction for Table Question Answering
Sohan Patnaik, Heril Changwal, Milan Aggarwal, Sumit Bhatia, Yaman Kumar, Balaji Krishnamurthy
02 Feb 2024

Around the GLOBE: Numerical Aggregation Question-Answering on Heterogeneous Genealogical Knowledge Graphs with Deep Neural Networks
Omri Suissa, M. Zhitomirsky-Geffet, Avshalom Elmalech
30 Jul 2023

Table Retrieval May Not Necessitate Table-specific Model Design
Zhiruo Wang, Zhengbao Jiang, Eric Nyberg, Graham Neubig
19 May 2022

TIE: Topological Information Enhanced Structural Reading Comprehension on Web Pages
Zihan Zhao, Lu Chen, Ruisheng Cao, Hongshen Xu, Xingyu Chen, Kai Yu
13 May 2022

Table Pre-training: A Survey on Model Architectures, Pre-training Objectives, and Downstream Tasks
Haoyu Dong, Zhoujun Cheng, Xinyi He, Mengyu Zhou, Anda Zhou, Fan Zhou, Ao Liu, Shi Han, Dongmei Zhang
24 Jan 2022

MATE: Multi-view Attention for Table Transformer Efficiency
Julian Martin Eisenschlos, Maharshi Gor, Thomas Müller, William W. Cohen
09 Sep 2021