arXiv: 2312.07753
Polynomial-based Self-Attention for Table Representation learning
12 December 2023
Jayoung Kim, Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
Tags: LMTD
Papers citing "Polynomial-based Self-Attention for Table Representation learning" (4 papers)
| Title | Authors | Tags | Counts | Date |
|---|---|---|---|---|
| Learning Advanced Self-Attention for Linear Transformers in the Singular Value Domain | Hyowon Wi, Jeongwhan Choi, Noseong Park | | 33 / 0 / 0 | 13 May 2025 |
| Graph Convolutions Enrich the Self-Attention in Transformers! | Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park | | 30 / 4 / 0 | 07 Dec 2023 |
| Masked Autoencoders Are Scalable Vision Learners | Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick | ViT, TPM | 305 / 7,443 / 0 | 11 Nov 2021 |
| TabTransformer: Tabular Data Modeling Using Contextual Embeddings | Xin Huang, A. Khetan, Milan Cvitkovic, Zohar S. Karnin | ViT, LMTD | 157 / 417 / 0 | 11 Dec 2020 |