MEDBERT.de: A Comprehensive German BERT Model for the Medical Domain

14 March 2023
Keno K. Bressem, Jens-Michalis Papaioannou, Paul Grundmann, Florian Borchert, Lisa Christine Adams, Leonhard Liu, Felix Busch, Lina Xu, J. P. Loyen, S. Niehues, Moritz Augustin, Lennart Grosser, Marcus R. Makowski, Hugo J. W. L. Aerts, Alexander Löser
    AI4MH
arXiv: 2303.08179

Papers citing "MEDBERT.de: A Comprehensive German BERT Model for the Medical Domain"

15 / 15 papers shown
Spanish Pre-trained BERT Model and Evaluation Data
J. Cañete, Gabriel Chaperon, Rodrigo Fuentes, Jou-Hui Ho, Hojin Kang, Jorge Pérez
06 Aug 2023

How much pretraining data do language models need to learn syntax?
Laura Pérez-Mayos, Miguel Ballesteros, Leo Wanner
07 Sep 2021

Deduplicating Training Data Makes Language Models Better
Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini
SyDa
14 Jul 2021

How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
Phillip Rust, Jonas Pfeiffer, Ivan Vulić, Sebastian Ruder, Iryna Gurevych
31 Dec 2020

GottBERT: a pure German Language Model
Raphael Scheible, Fabian Thomczyk, P. Tippmann, V. Jaravine, M. Boeker
VLM
03 Dec 2020

German's Next Language Model
Branden Chan, Stefan Schweter, Timo Möller
21 Oct 2020

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
Yu Gu, Robert Tinn, Hao Cheng, Michael R. Lucas, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, Hoifung Poon
LM&MA, AI4CE
31 Jul 2020

GGPONC: A Corpus of German Medical Text with Rich Metadata Based on Clinical Practice Guidelines
Florian Borchert, Christina Lohr, Luise Modersohn, T. Langer, M. Follmann, J. Sachs, U. Hahn, M. Schapranow
LM&MA, AI4MH
13 Jul 2020

A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages
Pedro Ortiz Suarez, Laurent Romary, Benoît Sagot
11 Jun 2020

Med-BERT: pre-trained contextualized embeddings on large-scale structured electronic health records for disease prediction
L. Rasmy, Yang Xiang, Z. Xie, Cui Tao, Degui Zhi
AI4MH, LM&MA
22 May 2020

CamemBERT: a Tasty French Language Model
Louis Martin, Benjamin Muller, Pedro Ortiz Suarez, Yoann Dupont, Laurent Romary, Eric Villemonte de la Clergerie, Djamé Seddah, Benoît Sagot
10 Nov 2019

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
AIMat
26 Jul 2019

Large Batch Optimization for Deep Learning: Training BERT in 76 minutes
Yang You, Jing Li, Sashank J. Reddi, Jonathan Hseu, Sanjiv Kumar, Srinadh Bhojanapalli, Xiaodan Song, J. Demmel, Kurt Keutzer, Cho-Jui Hsieh
ODL
01 Apr 2019

BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
OOD
25 Jan 2019

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
VLM, SSL, SSeg
11 Oct 2018