Build a Robust QA System with Transformer-based Mixture of Experts

Yu Qing Zhou, Xuyang Liu, Yu Dong
20 March 2022
arXiv: 2204.09598
Tags: MoE

Papers citing "Build a Robust QA System with Transformer-based Mixture of Experts"

4 papers shown:

1. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
   Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
   02 Oct 2019

2. EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks
   Jason W. Wei, Kai Zou
   31 Jan 2019

3. NewsQA: A Machine Comprehension Dataset
   Adam Trischler, Tong Wang, Xingdi Yuan, Justin Harris, Alessandro Sordoni, Philip Bachman, Kaheer Suleman
   29 Nov 2016

4. SQuAD: 100,000+ Questions for Machine Comprehension of Text
   Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang
   Tags: RALM
   16 Jun 2016