Improved optimization strategies for deep Multi-Task Networks
arXiv:2109.11678 · 21 September 2021
Lucas Pascal, Pietro Michiardi, Xavier Bost, B. Huet, Maria A. Zuluaga
Papers citing "Improved optimization strategies for deep Multi-Task Networks" (4 / 4 papers shown)

| Title | Authors | Metrics | Date |
|---|---|---|---|
| ATE-SG: Alternate Through the Epochs Stochastic Gradient for Multi-Task Neural Networks | Stefania Bellavia, Francesco Della Santa, Alessandra Papini | 35 / 0 / 0 | 26 Dec 2023 |
| Examining Common Paradigms in Multi-Task Learning | Cathrin Elich, Lukas Kirchdorfer, Jan M. Kohler, Lukas Schott | 29 / 0 / 0 | 08 Nov 2023 |
| FAMO: Fast Adaptive Multitask Optimization | B. Liu, Yihao Feng, Peter Stone, Qian Liu | 33 / 30 / 0 | 06 Jun 2023 |
| Alternating Gradient Descent and Mixture-of-Experts for Integrated Multimodal Perception (tags: VLM, MoE) | Hassan Akbari, Dan Kondratyuk, Yin Cui, Rachel Hornung, Haoran Wang, Hartwig Adam | 30 / 11 / 0 | 10 May 2023 |