Do not Mask Randomly: Effective Domain-adaptive Pre-training by Masking In-domain Keywords

14 July 2023
Shahriar Golchin, Mihai Surdeanu, N. Tavabi, A. Kiapour

Papers citing "Do not Mask Randomly: Effective Domain-adaptive Pre-training by Masking In-domain Keywords"

4 / 4 papers shown

TransformLLM: Adapting Large Language Models via LLM-Transformed Reading Comprehension Text
Iftach Arbel, Yehonathan Refael, Ofir Lindenbaum
28 Oct 2024

On the Rigour of Scientific Writing: Criteria, Analysis, and Insights
Joseph James, Chenghao Xiao, Yucheng Li, Chenghua Lin
07 Oct 2024

Boosting Low-Resource Biomedical QA via Entity-Aware Masking Strategies
Gabriele Pergola, E. Kochkina, Lin Gui, Maria Liakata, Yulan He
16 Feb 2021

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018