Conference Paper (preprint): “Revisiting Prompt Engineering: A Comprehensive Evaluation for LLM-based Personalized Recommendation”
The preprint linked below was recently shared on arXiv.
Title
Revisiting Prompt Engineering: A Comprehensive Evaluation for LLM-based Personalized Recommendation
Authors
Genki Kusano
NEC Corporation
Kosuke Akimoto
NEC Corporation
Kunihiro Takeoka
NEC Corporation
Source
via arXiv
Abstract
Large language models (LLMs) can perform recommendation tasks by taking prompts written in natural language as input. Compared to traditional methods such as collaborative filtering, LLM-based recommendation offers advantages in handling cold-start, cross-domain, and zero-shot scenarios, as well as supporting flexible input formats and generating explanations of user behavior. In this paper, we focus on a single-user setting, where no information from other users is used. This setting is practical for privacy-sensitive or data-limited applications. In such cases, prompt engineering becomes especially important for controlling the output generated by the LLM. We conduct a large-scale comparison of 23 prompt types across 8 public datasets and 12 LLMs. We use statistical tests and linear mixed-effects models to evaluate both accuracy and inference cost. Our results show that for cost-efficient LLMs, three types of prompts are especially effective: those that rephrase instructions, consider background knowledge, and make the reasoning process easier to follow. For high-performance LLMs, simple prompts often outperform more complex ones while reducing cost. In contrast, commonly used prompting styles in natural language processing, such as step-by-step reasoning, or the use of reasoning models often lead to lower accuracy. Based on these findings, we provide practical suggestions for selecting prompts and LLMs depending on the required balance between accuracy and cost.
This paper has been accepted to ACM RecSys 2025. Please cite it appropriately after September 22, 2025.
Direct to Abstract and Link to Full Text
Filed under: Data Files, Journal Articles, News, Patrons and Users
About Gary Price
Gary Price (gprice@gmail.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. He earned his MLIS degree from Wayne State University in Detroit. Price has won several awards, including the SLA Innovations in Technology Award and Alumnus of the Year from the Wayne State University Library and Information Science Program. From 2006 to 2009 he was Director of Online Information Services at Ask.com.