

A Survey of NL2SQL with Large Language Models: Where are we, and where are we going?

9 August 2024
Xinyu Liu
Shuyu Shen
Boyan Li
Peixian Ma
Runzhi Jiang
Yuxin Zhang
Ju Fan
Guoliang Li
Nan Tang
Yuyu Luo
Main text: 17 pages, 14 figures, 4 tables; bibliography: 3 pages
Abstract

Translating users' natural language queries (NL) into SQL queries (i.e., NL2SQL, a.k.a. Text-to-SQL) can significantly reduce barriers to accessing relational databases and support various commercial applications. The performance of NL2SQL has been greatly enhanced with the emergence of Large Language Models (LLMs). In this survey, we provide a comprehensive review of LLM-powered NL2SQL techniques, covering the entire lifecycle from four aspects: (1) Model: NL2SQL translation techniques that tackle not only NL ambiguity and under-specification but also properly map NL to database schemas and instances; (2) Data: from the collection of training data, through data synthesis to address training-data scarcity, to NL2SQL benchmarks; (3) Evaluation: evaluating NL2SQL methods from multiple angles using different metrics and granularities; and (4) Error Analysis: analyzing NL2SQL errors to find their root causes and guide NL2SQL models to evolve. Moreover, we provide a rule of thumb for developing NL2SQL solutions. Finally, we discuss the research challenges and open problems of NL2SQL in the LLM era.
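To make the task and the evaluation aspect concrete, the sketch below illustrates two ideas common in LLM-based NL2SQL work: grounding a prompt in a serialized database schema, and scoring a predicted query by execution match (two queries count as equivalent if they return the same rows). This is a minimal illustration against a toy SQLite database, not any specific method from the survey; the prompt template and table are invented for the example, and the "predicted" SQL is hand-written where an LLM's output would go.

```python
import sqlite3

def serialize_schema(conn):
    """Serialize the schema as CREATE TABLE statements, a common way
    to ground an NL2SQL prompt in the target database."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
    ).fetchall()
    return "\n".join(r[0] for r in rows)

def build_prompt(schema, question):
    """Assemble a zero-shot NL2SQL prompt (template is illustrative)."""
    return (
        "Given the database schema:\n"
        f"{schema}\n\n"
        f"Translate into SQL: {question}\nSQL:"
    )

def execution_match(conn, predicted_sql, gold_sql):
    """Execution accuracy: queries match if they return the same rows."""
    try:
        pred = conn.execute(predicted_sql).fetchall()
    except sqlite3.Error:
        return False  # invalid SQL is an error, not a match
    gold = conn.execute(gold_sql).fetchall()
    return sorted(map(tuple, pred)) == sorted(map(tuple, gold))

# Toy database standing in for a benchmark instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Ada", 120), ("Bob", 90)])

prompt = build_prompt(serialize_schema(conn), "Who earns more than 100?")

# The predicted query (hand-written here, where an LLM's output would go)
# is scored against the gold query by comparing execution results.
ok = execution_match(conn,
                     "SELECT name FROM employees WHERE salary > 100",
                     "SELECT name FROM employees WHERE salary >= 101")
print(ok)  # True: both queries return the same rows
```

Execution match is more permissive than exact string match: syntactically different queries that compute the same result are scored as correct, which is why execution-based metrics are widely used in NL2SQL evaluation.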

@article{liu2025_2408.05109,
  title={A Survey of Text-to-SQL in the Era of LLMs: Where are we, and where are we going?},
  author={Xinyu Liu and Shuyu Shen and Boyan Li and Peixian Ma and Runzhi Jiang and Yuxin Zhang and Ju Fan and Guoliang Li and Nan Tang and Yuyu Luo},
  journal={arXiv preprint arXiv:2408.05109},
  year={2025}
}