Improved optimization strategies for deep Multi-Task Networks

21 September 2021
Lucas Pascal, Pietro Michiardi, Xavier Bost, B. Huet, Maria A. Zuluaga
arXiv: 2109.11678 (PDF · HTML)

Papers citing "Improved optimization strategies for deep Multi-Task Networks"

ATE-SG: Alternate Through the Epochs Stochastic Gradient for Multi-Task Neural Networks
Stefania Bellavia, Francesco Della Santa, Alessandra Papini
26 Dec 2023

Examining Common Paradigms in Multi-Task Learning
Cathrin Elich, Lukas Kirchdorfer, Jan M. Kohler, Lukas Schott
08 Nov 2023

FAMO: Fast Adaptive Multitask Optimization
B. Liu, Yihao Feng, Peter Stone, Qiang Liu
06 Jun 2023

Alternating Gradient Descent and Mixture-of-Experts for Integrated Multimodal Perception
Hassan Akbari, Dan Kondratyuk, Yin Cui, Rachel Hornung, H. Wang, Hartwig Adam
Tags: VLM, MoE
10 May 2023