arXiv: 2307.09249
UniTabE: A Universal Pretraining Protocol for Tabular Foundation Model in Data Science
18 July 2023 · Yazheng Yang, Yuqi Wang, Guangyi Liu, Ledell Yu Wu, Qi Liu · LMTD
Papers citing "UniTabE: A Universal Pretraining Protocol for Tabular Foundation Model in Data Science" (30 / 30 papers shown)
Group-in-Group Policy Optimization for LLM Agent Training (16 May 2025)
Lang Feng, Zhenghai Xue, Tingcong Liu, Bo An · OffRL · 195 / 2 / 0

Zero-Shot Decision Tree Construction via Large Language Models (28 Jan 2025)
Lucas Carrasco, Felipe Urrutia, Andrés Abeliuk · 142 / 0 / 0

Llama 2: Open Foundation and Fine-Tuned Chat Models (18 Jul 2023)
Hugo Touvron, Louis Martin, Kevin R. Stone, Peter Albert, Amjad Almahairi, ..., Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom · AI4MH, ALM · 290 / 11,828 / 0

PTab: Using the Pre-trained Language Model for Modeling Tabular Data (15 Sep 2022)
Guangyi Liu, Jie Yang, Ledell Yu Wu · LMTD · 96 / 36 / 0

Efficient Training of Language Models to Fill in the Middle (28 Jul 2022)
Mohammad Bavarian, Heewoo Jun, Nikolas Tezak, John Schulman, C. McLeavey, Jerry Tworek, Mark Chen · 63 / 195 / 0

GANDALF: Gated Adaptive Network for Deep Automated Learning of Features (18 Jul 2022)
Manu Joseph, Harsh Raj · 31 / 10 / 0

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second (05 Jul 2022)
Noah Hollmann, Samuel G. Müller, Katharina Eggensperger, Frank Hutter · 99 / 306 / 0

Multilingual Neural Machine Translation with Deep Encoder and Multiple Shallow Decoders (05 Jun 2022)
Xiang Kong, Adithya Renduchintala, James Cross, Yuqing Tang, Jiatao Gu, Xian Li · 59 / 32 / 0

TransTab: Learning Transferable Tabular Transformers Across Tables (19 May 2022)
Zifeng Wang, Jimeng Sun · LMTD · 63 / 145 / 0

On Embeddings for Numerical Features in Tabular Deep Learning (10 Mar 2022)
Yura Gorishniy, Ivan Rubachev, Artem Babenko · LMTD · 84 / 172 / 0

MATE: Multi-view Attention for Table Transformer Efficiency (09 Sep 2021)
Julian Martin Eisenschlos, Maharshi Gor, Thomas Müller, William W. Cohen · LMTD · 89 / 96 / 0

Revisiting Deep Learning Models for Tabular Data (22 Jun 2021)
Yu. V. Gorishniy, Ivan Rubachev, Valentin Khrulkov, Artem Babenko · LMTD · 112 / 749 / 0

Instantaneous Grammatical Error Correction with Shallow Aggressive Decoding (09 Jun 2021)
Xin Sun, Tao Ge, Furu Wei, Houfeng Wang · 56 / 63 / 0

SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training (02 Jun 2021)
Gowthami Somepalli, Micah Goldblum, Avi Schwarzschild, C. Bayan Bruss, Tom Goldstein · LMTD · 85 / 326 / 0

An Efficient Transformer Decoder with Compressed Sub-layers (03 Jan 2021)
Yanyang Li, Ye Lin, Tong Xiao, Jingbo Zhu · 53 / 29 / 0

TabTransformer: Tabular Data Modeling Using Contextual Embeddings (11 Dec 2020)
Xin Huang, A. Khetan, Milan Cvitkovic, Zohar Karnin · ViT, LMTD · 197 / 453 / 0

TUTA: Tree-based Transformers for Generally Structured Table Pre-training (21 Oct 2020)
Zhiruo Wang, Haoyu Dong, Ran Jia, Jia Li, Zhiyi Fu, Shi Han, Dongmei Zhang · LMTD · 60 / 146 / 0

Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation (18 Jun 2020)
Jungo Kasai, Nikolaos Pappas, Hao Peng, James Cross, Noah A. Smith · 69 / 138 / 0

What Makes for Good Views for Contrastive Learning? (20 May 2020)
Yonglong Tian, Chen Sun, Ben Poole, Dilip Krishnan, Cordelia Schmid, Phillip Isola · SSL · 90 / 1,329 / 0

TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data (17 May 2020)
Pengcheng Yin, Graham Neubig, Wen-tau Yih, Sebastian Riedel · RALM, LMTD · 81 / 598 / 0

Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting (27 Apr 2020)
Sanyuan Chen, Yutai Hou, Yiming Cui, Wanxiang Che, Ting Liu, Xiangzhan Yu · KELM, CLL · 97 / 224 / 0

Supervised Contrastive Learning (23 Apr 2020)
Prannay Khosla, Piotr Teterwak, Chen Wang, Aaron Sarna, Yonglong Tian, Phillip Isola, Aaron Maschinot, Ce Liu, Dilip Krishnan · SSL · 145 / 4,537 / 0

TAPAS: Weakly Supervised Table Parsing via Pre-training (05 Apr 2020)
Jonathan Herzig, Pawel Krzysztof Nowak, Thomas Müller, Francesco Piccinno, Julian Martin Eisenschlos · LMTD, RALM · 88 / 651 / 0

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (29 Oct 2019)
M. Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdel-rahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer · AIMat, VLM · 246 / 10,819 / 0

Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data (13 Sep 2019)
Sergei Popov, S. Morozov, Artem Babenko · LMTD · 132 / 313 / 0

TabNet: Attentive Interpretable Tabular Learning (20 Aug 2019)
Sercan O. Arik, Tomas Pfister · LMTD · 183 / 1,349 / 0

AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks (29 Oct 2018)
Weiping Song, Chence Shi, Zhiping Xiao, Zhijian Duan, Yewen Xu, Ming Zhang, Jian Tang · CML · 80 / 856 / 0

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (11 Oct 2018)
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova · VLM, SSL, SSeg · 1.7K / 94,770 / 0

Attention Is All You Need (12 Jun 2017)
Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin · 3DV · 687 / 131,526 / 0

XGBoost: A Scalable Tree Boosting System (09 Mar 2016)
Tianqi Chen, Carlos Guestrin · 787 / 38,735 / 0