Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/44979
Title: Why should I trust you? Influence of explanation design on consumer behavior in AI-based services
Authors: Nizette, Florence
Hammedi, Wafa
Van Riel, Allard
Steils, Nadia
Issue Date: 2024
Publisher: EMERALD GROUP PUBLISHING LTD
Source: Journal of Service Management
Status: Early view
Abstract:
Purpose – This study explores how the format of explanations used in artificial intelligence (AI)-based services affects consumer behavior, specifically the effects of explanation detail (low vs high) and consumer control (automatic vs on demand) on trust and acceptance. The aim is to provide service providers with insights into how to optimize the format of explanations to enhance consumer evaluations of AI-based services.
Design/methodology/approach – Drawing on the literature on explainable AI (XAI) and information overload theory, a conceptual model is developed. To empirically test the conceptual model, two between-subjects experiments were conducted wherein the level of detail and level of control were manipulated, taking AI-based recommendations as a use case. The data were analyzed via partial least squares (PLS) regressions.
Findings – The results reveal significant positive correlations between level of detail and perceived understanding and between level of detail and perceived assurance. The level of control negatively moderates the relationship between the level of detail and perceived understanding. Further analyses revealed that the perceived competence and perceived integrity of AI systems positively and significantly influence the acceptance and purchase intentions of AI-based services.
Practical implications – This research offers service providers key insights into how tailored explanations, and maintaining a balance between detail and control, build consumer trust and enhance AI-based service outcomes.
Originality/value – This article elucidates the nuanced interplay between the level of detail and control over explanations for non-expert consumers in high-credence service sectors. The findings offer insights into the design of more consumer-centric explanations to increase the acceptance of AI-based services.
Notes: Nizette, F (corresponding author), Univ Namur, Res Ctr Mkt & Serv Management, UNamur Sch Business, NADI CeRCLe, Namur, Belgium.; Nizette, F (corresponding author), Hasselt Univ, Fac Business Econ, Hasselt, Belgium.
florence.nizette@unamur.be; wafa.hammedi@unamur.be;
allard.vanriel@uhasselt.be; nadia.steils@uliege.be
Keywords: Design; Recommendations; Credence service; AI; XAI; Explanations
Document URI: http://hdl.handle.net/1942/44979
ISSN: 1757-5818
e-ISSN: 1757-5826
DOI: 10.1108/JOSM-05-2024-0223
ISI #: 001380500200001
Rights: Emerald Publishing Limited
Category: A1
Type: Journal Contribution
Appears in Collections:Research publications

Files in This Item:
File: JOSM-05-2024-0223_proof 1..25.pdf (Restricted Access)
Description: Early view
Size: 1.13 MB
Format: Adobe PDF

File: xx.pdf (Embargoed until 2025-12-20)
Description: Peer-reviewed author version
Size: 1.34 MB
Format: Adobe PDF