Public Sentiment Toward Solar Energy: Opinion Mining of Twitter Using a Transformer-Based Language Model
Serena Y Kim, K. Ganesan, P. Dickens, S. Panda
arXiv: 2007.13306 · 27 July 2020

Papers citing "Public Sentiment Toward Solar Energy: Opinion Mining of Twitter Using a Transformer-Based Language Model" (12 papers)
Curating corpora with classifiers: A case study of clean energy sentiment online
M. V. Arnold, P. Dodds, C. Danforth · 04 May 2023

Modeling Label Semantics for Predicting Emotional Reactions
Radhika Gaonkar, Heeyoung Kwon, Mohaddeseh Bastan, Niranjan Balasubramanian, Nathanael Chambers · 09 Jun 2020

SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics
Da Yin, Tao Meng, Kai-Wei Chang · 08 May 2020

The Ivory Tower Lost: How College Students Respond Differently than the General Public to the COVID-19 Pandemic
Viet-An Duong, Phu Pham, Tongyu Yang, Yu Wang, Jiebo Luo · 21 Apr 2020 · [AI4CE]

A Transformer-based approach to Irony and Sarcasm detection
Rolandos Alexandros Potamias, Georgios Siolas, A. Stafylopatis · 23 Nov 2019

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf · 02 Oct 2019

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut · 26 Sep 2019 · [SSL, AIMat]

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov · 26 Jul 2019 · [AIMat]

XLNet: Generalized Autoregressive Pretraining for Language Understanding
Zhilin Yang, Zihang Dai, Yiming Yang, J. Carbonell, Ruslan Salakhutdinov, Quoc V. Le · 19 Jun 2019 · [AI4CE]

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova · 11 Oct 2018 · [VLM, SSL, SSeg]

Attention Is All You Need
Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin · 12 Jun 2017 · [3DV]

Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba · 22 Dec 2014 · [ODL]