How Well Can Large Language Models Reflect? A Human Evaluation of LLM-generated Reflections for Motivational Interviewing Dialogues

Open Access
Authors
  • Erkan Başar
  • Xin Sun
  • Iris Hendrickx
  • Jan de Wit
Publication date 2025
Host editors
  • Owen Rambow
  • Leo Wanner
  • Marianna Apidianaki
  • Hend Al-Khalifa
  • Barbara Di Eugenio
  • Steven Schockaert
Book title The 31st International Conference on Computational Linguistics: Proceedings of the Main Conference
Book subtitle COLING 2025: January 19-24, 2025
ISBN (electronic)
  • 9798891761964
Event 31st International Conference on Computational Linguistics, COLING 2025
Pages (from-to) 1964-1982
Number of pages 19
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Faculty of Social and Behavioural Sciences (FMG) - Psychology Research Institute (PsyRes)
Abstract

Motivational Interviewing (MI) is a counseling technique that promotes behavioral change through reflective responses that mirror or refine client statements. While advanced Large Language Models (LLMs) can generate engaging dialogues, challenges remain in applying them to a sensitive context such as MI. This work assesses the potential of three LLMs, GPT-4, Llama-2, and BLOOM, to generate MI reflections, and explores the effects of dialogue context size and the integration of MI strategies on reflection generation. We conduct evaluations using both automatic metrics and human judges on four criteria: appropriateness, relevance, engagement, and naturalness, to assess whether these LLMs can accurately generate the nuanced therapeutic communication required in MI. While we demonstrate the LLMs' potential to generate MI reflections comparable to those of human therapists, content analysis shows that significant challenges remain. By identifying the strengths and limitations of LLMs in generating empathetic and contextually appropriate reflections in MI, this work contributes to the ongoing dialogue on enhancing the role of LLMs in therapeutic counseling.

Document type Conference contribution
Language English
Published at https://aclanthology.org/2025.coling-main.135/
Other links https://www.scopus.com/pages/publications/85218499725