msaneme/text2sql-evaluation-artifacts: Text-to-SQL Evaluation Artifacts (v1.0.1)
Published December 28, 2025 | Version v1.0.1 | Software (Open Access)
Description
This release provides the experimental artifacts supporting the evaluation of large language models for Text-to-SQL translation, as reported in the associated IEEE Access manuscript.
The release includes:
- Complete prompt instances for each evaluated natural language query (NLQ).
- Expert-defined reference SQL queries used for execution-based evaluation.
- The full relational database schema employed in the experiments.
- Shared LLM generation parameters ensuring consistent and reproducible evaluation.
- Security-oriented prompt templates used in the preliminary validation of the proposed security framework.
These materials are made publicly available to support transparency and reproducibility of the experimental setup.
Files
| Name | Size | Checksum |
|---|---|---|
| msaneme/text2sql-evaluation-artifacts-v1.0.1.zip | 43.1 kB | md5:d5741df76c7b6ffa702fe6327e0e5f99 |
Additional details
Related works
- Is supplement to (Software): https://github.com/msaneme/text2sql-evaluation-artifacts/tree/v1.0.1
Software
- Repository URL: https://github.com/msaneme/text2sql-evaluation-artifacts