WisPerMed at "Discharge Me!": Advancing Text Generation in Healthcare with Large Language Models, Dynamic Expert Selection, and Priming Techniques on MIMIC-IV

arXiv:2405.11255 · 18 May 2024

Hendrik Damm, T. M. G. Pakull, Bahadir Eryilmaz, Helmut Becker, Ahmad Idrissi-Yaghir, Henning Schäfer, Sergej Schultenkämper, Christoph M. Friedrich

Papers citing "WisPerMed at "Discharge Me!": Advancing Text Generation in Healthcare with Large Language Models, Dynamic Expert Selection, and Priming Techniques on MIMIC-IV" (5 papers)
Chatbot Arena: An Open Platform for Evaluating LLMs by Human Preference
Wei-Lin Chiang, Lianmin Zheng, Ying Sheng, Anastasios Nikolas Angelopoulos, Tianle Li, ..., Hao Zhang, Banghua Zhu, Michael I. Jordan, Joseph E. Gonzalez, Ion Stoica
07 Mar 2024
Experimental Standards for Deep Learning in Natural Language Processing Research
Dennis Ulmer, Elisa Bassignana, Max Müller-Eberstein, Daniel Varab, Mike Zhang, Rob van der Goot, Christian Hardmeier, Barbara Plank
13 Apr 2022
SummaC: Re-Visiting NLI-based Models for Inconsistency Detection in Summarization
Philippe Laban, Tobias Schnabel, Paul N. Bennett, Marti A. Hearst
18 Nov 2021
Quantifying the Carbon Emissions of Machine Learning
Alexandre Lacoste, A. Luccioni, Victor Schmidt, Thomas Dandres
21 Oct 2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
02 Oct 2019