On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons

3 April 2024
Takeshi Kojima, Itsuki Okimura, Yusuke Iwasawa, Hitomi Yanaka, Yutaka Matsuo
MILM, LRM

Papers citing "On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons"

3 / 3 papers shown

ShifCon: Enhancing Non-Dominant Language Capabilities with a Shift-based Contrastive Framework
Hengyuan Zhang, Chenming Shang, Sizhe Wang, Dongdong Zhang, Feng Yao, Renliang Sun, Yiyao Yu, Yujiu Yang, Furu Wei
25 Oct 2024

CiMaTe: Citation Count Prediction Effectively Leveraging the Main Text
Jun Hirako, Ryohei Sasano, Koichi Takeda
06 Oct 2024

Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models
Lucas Bandarkar, Benjamin Muller, Pritish Yuvraj, Rui Hou, Nayan Singhal, Hongjiang Lv, Bing-Quan Liu
KELM, LRM, MoMe
02 Oct 2024