BERTweet: A pre-trained language model for English Tweets

Dat Quoc Nguyen, Thanh Tien Vu, A. Nguyen
20 May 2020 · arXiv:2005.10200 · VLM

Papers citing "BERTweet: A pre-trained language model for English Tweets"

22 of 22 papers shown

LLM-Enhanced Multiple Instance Learning for Joint Rumor and Stance Detection with Social Context Information
Ruichao Yang, Jing Ma, Wei Gao, Hongzhan Lin
13 Feb 2025

Improving and Assessing the Fidelity of Large Language Models Alignment to Online Communities
Minh Duc Hoang Chu, Zihao He, Rebecca Dorn, Kristina Lerman
18 Aug 2024

A Syntax-Injected Approach for Faster and More Accurate Sentiment Analysis
Muhammad Imran, Olga Kellert, Carlos Gómez-Rodríguez
21 Jun 2024

Are You Robert or RoBERTa? Deceiving Online Authorship Attribution Models Using Neural Text Generators
Keenan I. Jones, Jason R. C. Nurse, Shujun Li
18 Mar 2022 · DeLMO

WNUT-2020 Task 2: Identification of Informative COVID-19 English Tweets
Dat Quoc Nguyen, Thanh Tien Vu, A. Rahimi, M. Dao, L. T. Nguyen, Long Doan
16 Oct 2020

Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith
23 Apr 2020 · VLM, AI4CE, CLL

SemEval-2017 Task 4: Sentiment Analysis in Twitter
Sara Rosenthal, N. Farra, Preslav Nakov
02 Dec 2019 · VLM

Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
05 Nov 2019

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
26 Jul 2019 · AIMat

A Multi-task Approach for Named Entity Recognition in Social Media Data
Gustavo Aguilar, Suraj Maharjan, Adrian Pastor Lopez-Monroy, Thamar Solorio
10 Jun 2019

fairseq: A Fast, Extensible Toolkit for Sequence Modeling
Myle Ott, Sergey Edunov, Alexei Baevski, Angela Fan, Sam Gross, Nathan Ng, David Grangier, Michael Auli
01 Apr 2019 · VLM, FaML

SciBERT: A Pretrained Language Model for Scientific Text
Iz Beltagy, Kyle Lo, Arman Cohan
26 Mar 2019

BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
25 Jan 2019 · OOD

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
11 Oct 2018 · VLM, SSL, SSeg

Parsing Tweets into Universal Dependencies
Yijia Liu, Yi Zhu, Wanxiang Che, Bing Qin, Nathan Schneider, Noah A. Smith
23 Apr 2018

NTUA-SLP at SemEval-2018 Task 3: Tracking Ironic Tweets using Ensembles of Word and Character Level Attentive RNNs
Christos Baziotis, Athanasiou Nikolaos, Pinelopi Papalampidi, Athanasia Kolovou, Georgios Paraskevopoulos, Nikolaos Ellinas, Alexandros Potamianos
18 Apr 2018

Attention Is All You Need
Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin
12 Jun 2017 · 3DV

BB_twtr at SemEval-2017 Task 4: Twitter Sentiment Analysis with CNNs and LSTMs
M. Cliche
20 Apr 2017

Bag of Tricks for Efficient Text Classification
Armand Joulin, Edouard Grave, Piotr Bojanowski, Tomas Mikolov
06 Jul 2016 · VLM

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
Xuezhe Ma, Eduard H. Hovy
04 Mar 2016

Neural Machine Translation of Rare Words with Subword Units
Rico Sennrich, Barry Haddow, Alexandra Birch
31 Aug 2015

Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
22 Dec 2014 · ODL