arXiv: 2012.02469
RPT: Relational Pre-trained Transformer Is Almost All You Need towards Democratizing Data Preparation
4 December 2020
Nan Tang, Ju Fan, Fangyi Li, Jianhong Tu, Xiaoyong Du, Guoliang Li, Samuel Madden, M. Ouzzani
Papers citing "RPT: Relational Pre-trained Transformer Is Almost All You Need towards Democratizing Data Preparation" (5 / 5 papers shown):

- "Table Transformers for Imputing Textual Attributes". Ting-Ruen Wei, Yuan Wang, Yoshitaka Inoue, Hsin-Tai Wu, Yi Fang. 04 Aug 2024. [LMTD]
- "VerifAI: Verified Generative AI". Nan Tang, Chenyu Yang, Ju Fan, Lei Cao, Yuyu Luo, Alon Halevy. 06 Jul 2023.
- "DeepJoin: Joinable Table Discovery with Pre-trained Language Models". Yuyang Dong, Chuan Xiao, Takuma Nozawa, Masafumi Enomoto, M. Oyamada. 15 Dec 2022.
- "FORTAP: Using Formulas for Numerical-Reasoning-Aware Table Pretraining". Zhoujun Cheng, Haoyu Dong, Ran Jia, Pengfei Wu, Shi Han, Fan Cheng, Dongmei Zhang. 15 Sep 2021. [AIMat, ReLM, LMTD, LRM]
- "Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference". Timo Schick, Hinrich Schütze. 21 Jan 2020.