Curriculum Recommendations Using Transformer Base Model with InfoNCE Loss And Language Switching Method

18 January 2024
Xiaonan Xu, Bin Yuan, Yongyao Mo, Tianbo Song, Shulin Li

Papers citing "Curriculum Recommendations Using Transformer Base Model with InfoNCE Loss And Language Switching Method"

3 of 3 papers shown

  1. Rethinking InfoNCE: How Many Negative Samples Do You Need? (Chuhan Wu, Fangzhao Wu, Yongfeng Huang; 27 May 2021)
  2. Language-agnostic BERT Sentence Embedding (Fangxiaoyu Feng, Yinfei Yang, Daniel Cer, N. Arivazhagan, Wei Wang; 03 Jul 2020)
  3. Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (Nils Reimers, Iryna Gurevych; 27 Aug 2019)