EncT5: A Framework for Fine-tuning T5 as Non-autoregressive Models

16 October 2021
Frederick Liu, T. Huang, Shihang Lyu, Siamak Shakeri, Hongkun Yu, Jing Li

Papers citing "EncT5: A Framework for Fine-tuning T5 as Non-autoregressive Models"

5 / 5 papers shown

The Ultimate Cookbook for Invisible Poison: Crafting Subtle Clean-Label Text Backdoors with Style Attributes
Wencong You, Daniel Lowd
24 Apr 2025

Punctuation Restoration Improves Structure Understanding Without Supervision
Junghyun Min, Minho Lee, Woochul Lee, Yeonsoo Lee
13 Feb 2024

Boot and Switch: Alternating Distillation for Zero-Shot Dense Retrieval
Fan Jiang, Qiongkai Xu, Tom Drummond, Trevor Cohn
27 Nov 2023

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM
20 Apr 2018