arXiv:2311.13878
Minimizing Factual Inconsistency and Hallucination in Large Language Models
23 November 2023
Authors: Muneeswaran Irulandi, Shreya Saxena, Siva Prasad, M. V. Sai Prakash, Advaith Shankar, V. Varun, Vishal Vaddina, Saisubramaniam Gopalakrishnan
Topic: HILM
Papers citing "Minimizing Factual Inconsistency and Hallucination in Large Language Models" (3 of 3 papers shown):
A Scoping Review of Natural Language Processing in Addressing Medically Inaccurate Information: Errors, Misinformation, and Hallucination
Zhaoyi Sun, Wen-wai Yim, Özlem Uzuner, Fei Xia, Meliha Yetisgen
16 Apr 2025
Graph Machine Learning in the Era of Large Language Models (LLMs)
Wenqi Fan, Shijie Wang, Jiani Huang, Zhikai Chen, Yu Song, ..., Haitao Mao, Hui Liu, Xiaorui Liu, Dawei Yin, Qing Li
Topic: AI4CE
23 Apr 2024
MyVLM: Personalizing VLMs for User-Specific Queries
Yuval Alaluf, Elad Richardson, Sergey Tulyakov, Kfir Aberman, Daniel Cohen-Or
Topics: MLLM, VLM
21 Mar 2024