More Than Justifications: An Analysis of Information Needs in Explanations and Motivations to Disable Personalization

Open Access
Publication date 2025
Journal Journalism Studies
Volume 26, Issue 11
Pages (from-to) 1304-1312
Organisations
  • Faculty of Law (FdR) - Institute for Information Law (IViR)
  • Faculty of Social and Behavioural Sciences (FMG) - Amsterdam School of Communication Research (ASCoR)
Abstract
There is consensus that algorithmic news recommenders should be explainable to inform news readers of potential risks. However, debates continue over which information users need and which stakeholders should have access to this information. As the debate continues, researchers also call for more user control over algorithmic news recommender systems, for example, by allowing users to turn off personalized recommendations. Despite this call, it is unclear to what extent news readers would actually use such a feature. To add nuance to the discussion, we analyzed 586 responses to two open-ended questions: (i) which information needs contribute to trustworthiness perceptions of news recommendations, and (ii) whether people want the ability to turn off personalization. Our results indicate that most participants found knowing the sources of news items important for trusting a recommender system. Additionally, more than half of the participants were inclined to disable personalization. The most common reasons for turning off personalization included concerns about bias or filter bubbles and a preference for consuming generalized news. These findings suggest that news readers have different information needs for explanations when interacting with an algorithmic news recommender and that many news readers prefer to disable personalized news recommendations.
Document type Article
Language English
Published at https://doi.org/10.1080/1461670X.2025.2505001
Other links https://www.scopus.com/pages/publications/105005514569