Spanish Pre-trained BERT Model and Evaluation Data

6 August 2023
J. Cañete
Gabriel Chaperon
Rodrigo Fuentes
Jou-Hui Ho
Hojin Kang
Jorge Pérez

Papers citing "Spanish Pre-trained BERT Model and Evaluation Data"

13 / 63 papers shown
Cross-lingual Emotion Detection
Sabit Hassan
Shaden Shaar
Kareem Darwish
10 Jun 2021
MergeDistill: Merging Pre-trained Language Models using Distillation
Simran Khanuja
Melvin Johnson
Partha P. Talukdar
05 Jun 2021
Uncovering Constraint-Based Behavior in Neural Models via Targeted Fine-Tuning
Forrest Davis
Marten van Schijndel
AI4CE
02 Jun 2021
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
Aina Garí Soler
Marianna Apidianaki
MILM
29 Apr 2021
AMMU : A Survey of Transformer-based Biomedical Pretrained Language Models
Katikapalli Subramanyam Kalyan
A. Rajasekharan
S. Sangeetha
LM&MA
MedIm
16 Apr 2021
Multilingual Language Models Predict Human Reading Behavior
Nora Hollenstein
Federico Pirovano
Ce Zhang
Lena Jäger
Lisa Beinborn
VLM
12 Apr 2021
Bertinho: Galician BERT Representations
David Vilares
Marcos Garcia
Carlos Gómez-Rodríguez
25 Mar 2021
Pre-Training BERT on Arabic Tweets: Practical Considerations
Ahmed Abdelali
Sabit Hassan
Hamdy Mubarak
Kareem Darwish
Younes Samih
21 Feb 2021
EstBERT: A Pretrained Language-Specific BERT for Estonian
Hasan Tanvir
Claudia Kittask
Sandra Eiche
Kairit Sirts
09 Nov 2020
Unsupervised Word Polysemy Quantification with Multiresolution Grids of Contextual Embeddings
Christos Xypolopoulos
A. Tixier
Michalis Vazirgiannis
23 Mar 2020
MLQA: Evaluating Cross-lingual Extractive Question Answering
Patrick Lewis
Barlas Oğuz
Ruty Rinott
Sebastian Riedel
Holger Schwenk
ELM
16 Oct 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang
Amanpreet Singh
Julian Michael
Felix Hill
Omer Levy
Samuel R. Bowman
ELM
20 Apr 2018
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu
M. Schuster
Zhifeng Chen
Quoc V. Le
Mohammad Norouzi
...
Alex Rudnick
Oriol Vinyals
G. Corrado
Macduff Hughes
J. Dean
AIMat
26 Sep 2016