What Makes a Good Natural Language Prompt?

Abstract

As large language models (LLMs) have progressed towards more human-like behavior and human–AI communication has become prevalent, prompting has emerged as a decisive component. However, there is limited conceptual consensus on what exactly constitutes a good natural language prompt. We attempt to address this question through a meta-analysis surveying 150+ prompting-related papers from leading NLP and AI conferences (2022–2025), as well as blogs. We propose a property- and human-centric framework for evaluating prompt quality, encompassing 21 properties categorized into six dimensions. We then examine how existing studies assess their impact on LLMs, revealing imbalanced support across models and tasks and substantial research gaps. Further, we analyze correlations among properties in high-quality natural language prompts, deriving prompting recommendations. Finally, we explore multi-property prompt enhancements in reasoning tasks, observing that single-property enhancements often have the greatest impact. Our findings establish a foundation for property-centric prompt evaluation and optimization, bridging gaps in human–AI communication and opening new prompting research directions.

Publication
In the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vienna, Austria, July 27–August 1, 2025
Xuan Long Do
A*STAR Doctoral Student (Aug ‘23)
Co-Supervised by Kenji Kawaguchi


Duy C. Dinh
Research Intern (Jan ‘25)

My name is Duy. I am currently working as an AI Engineer at Creative Force, and I graduated from Hanoi University of Science and Technology (HUST). With a strong foundation in machine learning research and a growing passion for Generative AI, I seek opportunities to contribute to meaningful and impactful research.

Hai N. Nguyen
Research Intern (Jan ‘25)

My name is Hai, and I am currently an AI Research Resident at VinAI. I graduated from the University of Science, Vietnam National University (Vietnam). I am interested in Optimization, Optimal Transport, and Large Language Models.

Min-Yen Kan
Associate Professor

WING lead; interests include Digital Libraries, Information Retrieval and Natural Language Processing.