arXiv: 2410.01335 (v2, latest)
Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models
2 October 2024
Lucas Bandarkar, Benjamin Muller, Pritish Yuvraj, Rui Hou, Nayan Singhal, Hongjiang Lv, Bing-Quan Liu
Papers citing "Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models" (4 of 54 shown):

MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder
30 Apr 2020

Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
05 Nov 2019

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
11 Oct 2018

Averaging Weights Leads to Wider Optima and Better Generalization
Pavel Izmailov, Dmitrii Podoprikhin, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson
14 Mar 2018