Attention-based Random Forest and Contamination Model

Lev V. Utkin, A. Konstantinov
arXiv:2201.02880, 8 January 2022
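
For context on the paper this citation page indexes: the attention-based random forest (ABRF) replaces a forest's uniform averaging of tree predictions with Nadaraya-Watson-style attention weights, and Huber's epsilon-contamination model is what makes those weights trainable (learned weights are mixed with softmax kernel weights). The sketch below is a minimal illustration of the inference side only, under stated assumptions: the leaf-centroid keys, the temperature `tau`, the mixing rate `eps`, and the stand-in uniform "trained" weights are illustrative choices, not the paper's exact formulation.

```python
# Minimal sketch of attention-based random forest inference:
# Nadaraya-Watson attention over per-tree predictions, with
# epsilon-contamination mixing. Hyperparameters are illustrative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=5, noise=0.5, random_state=0)
forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Key for every leaf of every tree: the mean of the training instances
# routed to that leaf (one common choice of "key"; an assumption here).
leaf_keys = []
for tree in forest.estimators_:
    leaves = tree.apply(X)  # leaf index of each training instance
    leaf_keys.append({lf: X[leaves == lf].mean(axis=0) for lf in np.unique(leaves)})

def abrf_predict(x, tau=1.0, eps=0.1, trained_w=None):
    """Attention-weighted aggregation of tree predictions for one instance x."""
    x2 = x.reshape(1, -1)
    preds, dists = [], []
    for tree, keys in zip(forest.estimators_, leaf_keys):
        leaf = tree.apply(x2)[0]
        preds.append(tree.predict(x2)[0])
        dists.append(np.sum((x - keys[leaf]) ** 2))
    soft = np.exp(-np.array(dists) / tau)  # softmax kernel over distances
    soft /= soft.sum()
    if trained_w is None:
        # Stand-in for the weights the paper fits by optimization.
        trained_w = np.full_like(soft, 1.0 / len(soft))
    alpha = (1.0 - eps) * soft + eps * trained_w  # epsilon-contamination mixing
    return float(np.dot(alpha, preds))

print(abrf_predict(X[0]))
```

In the paper, the trainable part of the weights is fit by solving an optimization problem over the training set; that step is omitted here, so with uniform stand-in weights this sketch reduces to softmax attention blended toward plain forest averaging.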

Papers citing "Attention-based Random Forest and Contamination Model"

19 papers shown:
 1. "A Multi-Head Attention Soft Random Forest for Interpretable Patient No-Show Prediction" (22 May 2025). Ninda Nurseha Amalina, Kwadwo Boateng Ofori-Amanfo, Heungjo An.
 2. "Neural Attention Models in Deep Learning: Survey and Taxonomy" (11 Dec 2021). Alana de Santana Correia, Esther Colombini.
 3. "Hybrid Random Features" (08 Oct 2021). K. Choromanski, Haoxian Chen, Han Lin, Yuanzhe Ma, Arijit Sehanobish, ..., Andy Zeng, Valerii Likhosherstov, Dmitry Kalashnikov, Vikas Sindhwani, Adrian Weller.
 4. "Dive into Deep Learning" (21 Jun 2021). Aston Zhang, Zachary Chase Lipton, Mu Li, Alexander J. Smola.
 5. "A Survey of Transformers" (08 Jun 2021). Tianyang Lin, Yuxin Wang, Xiangyang Liu, Xipeng Qiu.
 6. "Tabular Data: Deep Learning is Not All You Need" (06 Jun 2021). Ravid Shwartz-Ziv, Amitai Armon.
 7. "Luna: Linear Unified Nested Attention" (03 Jun 2021). Xuezhe Ma, Xiang Kong, Sinong Wang, Chunting Zhou, Jonathan May, Hao Ma, Luke Zettlemoyer.
 8. "Attention, please! A survey of Neural Attention Models in Deep Learning" (31 Mar 2021). Alana de Santana Correia, Esther Luna Colombini.
 9. "Random Feature Attention" (03 Mar 2021). Hao Peng, Nikolaos Pappas, Dani Yogatama, Roy Schwartz, Noah A. Smith, Lingpeng Kong.
10. "Linear Transformers Are Secretly Fast Weight Programmers" (22 Feb 2021). Imanol Schlag, Kazuki Irie, Jürgen Schmidhuber.
11. "Rethinking Attention with Performers" (30 Sep 2020). K. Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, ..., Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy J. Colwell, Adrian Weller.
12. "Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond" (23 Apr 2020). Fanghui Liu, Xiaolin Huang, Yudong Chen, Johan A. K. Suykens.
13. "TabNet: Attentive Interpretable Tabular Learning" (20 Aug 2019). Sercan O. Arik, Tomas Pfister.
14. "An Attentive Survey of Attention Models" (05 Apr 2019). S. Chaudhari, Varun Mithal, Gungor Polatkan, R. Ramanath.
15. "A weighted random survival forest" (01 Jan 2019). Lev V. Utkin, A. Konstantinov, V. Chukanov, M. V. Kots, M. Ryabinin, A. Meldo.
16. "Attention Is All You Need" (12 Jun 2017). Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin.
17. "Deep Forest" (28 Feb 2017). Zhi Zhou, Ji Feng.
18. "Effective Approaches to Attention-based Neural Machine Translation" (17 Aug 2015). Thang Luong, Hieu H. Pham, Christopher D. Manning.
19. "Neural Machine Translation by Jointly Learning to Align and Translate" (01 Sep 2014). Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio.