When, What, and How Should Generative Artificial Intelligence Explain to Users?
- Authors
- Jang, Soobin; Lee, Haeyoon; Kim, Yujin; Lee, Daeho; Shin, Jungwoo; Nam, Jungwoo
- Issue Date
- Sep-2024
- Publisher
- Elsevier Ltd
- Keywords
- Conjoint analysis; Conversational user interface; Explainable AI; Generative AI
- Citation
- Telematics and Informatics, v.93
- Indexed
- SSCI; SCOPUS
- Journal Title
- Telematics and Informatics
- Volume
- 93
- URI
- https://scholarx.skku.edu/handle/2021.sw.skku/111939
- DOI
- 10.1016/j.tele.2024.102175
- ISSN
- 0736-5853
- Abstract
- With the commercialization of ChatGPT, generative artificial intelligence (AI) has been applied almost everywhere in our lives. However, even though generative AI has become an everyday technology that anyone can use, most non-expert users need to know the process behind and the reasons for its results, because it can be misused due to insufficient knowledge and misunderstanding. Therefore, this study investigated users’ preferences for when, what, and how generative AI should explain the process of generating results and the reasoning behind them, using a conjoint method and mixed logit analysis. The results show that users are most sensitive to the timing of providing eXplainable AI (XAI), and that users want additional information only when they ask for explanations while using generative AI. These findings will help shape the XAI design of future generative AI from a user perspective and improve its usability. © 2024 Elsevier Ltd
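- Methodological note: the abstract names choice-based conjoint analysis estimated with a mixed logit model. As a minimal sketch only, not the authors' actual specification, the standard random-coefficients mixed logit can be written as follows; the attribute labels mentioned in the comments (explanation timing, content, format) are assumptions for illustration.

```latex
% Random-utility specification: respondent n, alternative j, choice task t.
% The attribute vector x_{njt} would encode conjoint attributes such as
% explanation timing, content, and format (assumed labels, not from the paper).
U_{njt} = \beta_n^{\top} x_{njt} + \varepsilon_{njt},
\qquad \beta_n \sim N(b, W),
\qquad \varepsilon_{njt} \ \text{i.i.d. type-I extreme value}

% Choice probability: a logit kernel mixed over the distribution of the random
% coefficients, typically approximated by simulated maximum likelihood.
P_{nit} = \int \frac{\exp\!\left(\beta^{\top} x_{nit}\right)}
                    {\sum_{j} \exp\!\left(\beta^{\top} x_{njt}\right)}
          \, f(\beta \mid b, W) \, d\beta
```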
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- Graduate School > Interaction Science > 1. Journal Articles
- Computing and Informatics > Convergence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.