arXiv:2502.15418
MHQA: A Diverse, Knowledge Intensive Mental Health Question Answering Challenge for Language Models
24 February 2025
Suraj Racha, Prashant Joshi, Anshika Raman, Nikita Jangid, Mridul Sharma, Ganesh Ramakrishnan, Nirmal Punjabi
Community: AI4MH
Papers citing "MHQA: A Diverse, Knowledge Intensive Mental Health Question Answering Challenge for Language Models" (3 papers)
- RoBERTa: A Robustly Optimized BERT Pretraining Approach (26 Jul 2019)
  Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
  Tags: AIMat | Metrics: 467 · 24,160 · 0

- BioBERT: a pre-trained biomedical language representation model for biomedical text mining (25 Jan 2019)
  Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
  Tags: OOD | Metrics: 127 · 5,579 · 0

- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (11 Oct 2018)
  Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
  Tags: VLM, SSL, SSeg | Metrics: 1.2K · 93,936 · 0