ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Fraternal Dropout (arXiv:1711.00066)

31 October 2017
Konrad Zolna, Devansh Arpit, Dendi Suhubdy, Yoshua Bengio

Papers citing "Fraternal Dropout" (17 papers)
Generative Adversarial Training Can Improve Neural Language Models
  Sajad Movahedi, A. Shakery (02 Nov 2022) [GAN, AI4CE]

Information Geometry of Dropout Training
  Masanari Kimura, H. Hino (22 Jun 2022)

Augmentation-induced Consistency Regularization for Classification
  Jianguo Wu, Shijing Si, Jianzong Wang, Jing Xiao (25 May 2022)

A Survey on Dropout Methods and Experimental Verification in Recommendation
  Yong Li, Weizhi Ma, C. L. Philip Chen, Hao Fei, Yiqun Liu, Shaoping Ma, Yue Yang (05 Apr 2022)

Dependency-based Mixture Language Models
  Zhixian Yang, Xiaojun Wan (19 Mar 2022)

Preventing posterior collapse in variational autoencoders for text generation via decoder regularization
  Alban Petit, Caio Corro (28 Oct 2021) [DRL]

R-Drop: Regularized Dropout for Neural Networks
  Xiaobo Liang, Lijun Wu, Juntao Li, Yue Wang, Qi Meng, Tao Qin, Wei Chen, Hao Fei, Tie-Yan Liu (28 Jun 2021)

Not Enough Data? Deep Learning to the Rescue!
  Ateret Anaby-Tavor, Boaz Carmeli, Esther Goldbraich, Amir Kantor, George Kour, Segev Shlomov, N. Tepper, Naama Zwerdling (08 Nov 2019)

On the Regularization Properties of Structured Dropout
  Ambar Pal, Connor Lane, René Vidal, B. Haeffele (30 Oct 2019)

Alleviating Sequence Information Loss with Data Overlapping and Prime Batch Sizes
  Noémien Kocher, Christian Scuito, Lorenzo Tarantino, Alexandros Lazaridis, Andreas Fischer, C. Musat (18 Sep 2019)

Gmail Smart Compose: Real-Time Assisted Writing
  Mengzhao Chen, Benjamin Lee, G. Bansal, Yuan Cao, Shuyuan Zhang, ..., Yinan Wang, Andrew M. Dai, Zhehuai Chen, Timothy Sohn, Yonghui Wu (17 May 2019)

Survey of Dropout Methods for Deep Neural Networks
  Alex Labach, Hojjat Salehinejad, S. Valaee (25 Apr 2019)

Breaking the Softmax Bottleneck via Learnable Monotonic Pointwise Non-linearities
  O. Ganea, Sylvain Gelly, Gary Bécigneul, Aliaksei Severyn (21 Feb 2019)

Co-regularized Alignment for Unsupervised Domain Adaptation
  Abhishek Kumar, P. Sattigeri, Kahini Wadhawan, Leonid Karlinsky, Rogerio Feris, William T. Freeman, G. Wornell (13 Nov 2018) [OOD]

Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks
  Xiaodong Cui, Wei Zhang, Zoltán Tüske, M. Picheny (16 Oct 2018) [ODL]

Direct Output Connection for a High-Rank Language Model
  Sho Takase, Jun Suzuki, Masaaki Nagata (30 Aug 2018)

Recent Advances in Deep Learning: An Overview
  Matiur Rahman Minar, Jibon Naher (21 Jul 2018) [VLM]