Cross-Thought for Sentence Encoder Pre-training

7 October 2020
Shuohang Wang
Yuwei Fang
Siqi Sun
Zhe Gan
Yu Cheng
Jing Jiang
Jingjing Liu
LRM

Papers citing "Cross-Thought for Sentence Encoder Pre-training"

6 papers shown

  • Generative or Contrastive? Phrase Reconstruction for Better Sentence Representation Learning
    Bohong Wu, Hai Zhao · SSL · 20 Apr 2022
  • Enhancing Natural Language Representation with Large-Scale Out-of-Domain Commonsense
    Wanyun Cui, Xingran Chen · 06 Sep 2021
  • ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer
    Yuanmeng Yan, Rumei Li, Sirui Wang, Fuzheng Zhang, Wei Wu, Weiran Xu · SSL · 25 May 2021
  • REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
    Fangkai Jiao, Yangyang Guo, Yilin Niu, Feng Ji, Feng-Lin Li, Liqiang Nie · LRM · 10 May 2021
  • Revealing the Importance of Semantic Retrieval for Machine Reading at Scale
    Yixin Nie, Songhe Wang, Mohit Bansal · RALM · 17 Sep 2019
  • GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
    Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · ELM · 20 Apr 2018