Explainable AI and User Experience. Prototyping and Evaluating an UX-Optimized XAI Interface in Computer Vision


Master's Thesis, 2023

151 Pages, Grade: 1,0


Abstract or Introduction

This thesis presents a toolkit of 17 user experience (UX) principles, categorized according to their relevance to Explainable AI (XAI).

The goal of Explainable AI has been widely associated in the literature with the dimensions of comprehensibility, usefulness, trust, and acceptance. Moreover, authors in academia argue that research should focus on the development of holistic explanation interfaces rather than on single visual explanations. Consequently, XAI research should concentrate more on potential users and their needs than on the purely technical aspects of XAI methods. Based on these three observations, the author of this thesis argues that valuable insights from the research areas of User Interface (UI) and User Experience design can be carried over into XAI research. In essence, UX is concerned with the design and evaluation of the pragmatic and hedonic aspects of a user's interaction with a system in a given context.

These principles are taken into account in the subsequent prototyping of a custom XAI system called the Brain Tumor Assistant (BTA). Here, a pre-trained EfficientNetB0 is used as a Convolutional Neural Network that classifies x-ray images of the human brain into four classes with an overall accuracy of 98%. To generate factual explanations, Local Interpretable Model-agnostic Explanations (LIME) is subsequently applied as the XAI method. The evaluation of the BTA is based on the User Experience Questionnaire (UEQ) according to Laugwitz et al. (2008), with individual items of the questionnaire adapted to the specific context of XAI. Quantitative data from a study with 50 participants in each of the control and treatment groups is used to present a standardized way of quantifying the dimensions of usability and UX specifically for XAI systems. Furthermore, an A/B test provides evidence that visual explanations have a significant (α = 0.05) positive effect on the dimensions of attractiveness, usefulness, controllability, and trustworthiness. In summary, this thesis shows that explanations in computer vision have a significant positive effect not only on trustworthiness but also on other dimensions.
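As an illustration of the approach described above, the following sketch shows how a fine-tuned EfficientNetB0 classifier could be combined with the LIME image explainer to highlight the superpixels that support a prediction. The model file name, the placeholder input image, and the preprocessing steps are assumptions made for demonstration purposes and are not taken from the thesis itself.

    # Hypothetical sketch: LIME explanation for a fine-tuned EfficientNetB0.
    # The model file, the random input image, and the class handling are assumptions.
    import numpy as np
    import tensorflow as tf
    from lime import lime_image
    from skimage.segmentation import mark_boundaries

    # Assumed: an EfficientNetB0 fine-tuned with a four-class softmax head, saved beforehand.
    model = tf.keras.models.load_model("bta_efficientnetb0.h5")

    def predict_fn(images):
        # LIME passes a batch of perturbed RGB images; return class probabilities.
        batch = tf.keras.applications.efficientnet.preprocess_input(np.array(images))
        return model.predict(batch, verbose=0)

    # Placeholder stand-in for a single 224x224 brain scan (pixel values in 0..255).
    image = np.random.randint(0, 256, size=(224, 224, 3)).astype("double")

    explainer = lime_image.LimeImageExplainer()
    explanation = explainer.explain_instance(
        image, predict_fn, top_labels=1, hide_color=0, num_samples=1000
    )

    # Keep only the superpixels that speak in favour of the predicted class.
    label = explanation.top_labels[0]
    temp, mask = explanation.get_image_and_mask(
        label, positive_only=True, num_features=5, hide_rest=False
    )
    overlay = mark_boundaries(temp / 255.0, mask)
    print("Predicted class index:", label)

The resulting overlay is the kind of factual, visual explanation an interface such as the BTA can present to its users; in a real pipeline the placeholder image would be replaced by an actual scan from the test set.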

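In the same spirit, the statistical comparison between the control group (interface without visual explanations) and the treatment group (interface with explanations) can be sketched with a Mann-Whitney U test and Cohen's d, both of which appear among the thesis keywords. All scores below are randomly generated placeholders, not data from the study.

    # Hypothetical sketch: one-sided Mann-Whitney U test plus Cohen's d for an
    # A/B comparison of UEQ-style ratings. All numbers are simulated placeholders.
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(42)
    control = rng.normal(loc=0.8, scale=1.0, size=50)    # assumed mean ratings, control group
    treatment = rng.normal(loc=1.5, scale=1.0, size=50)  # assumed mean ratings, treatment group

    # Does the treatment group rate the dimension significantly higher?
    u_stat, p_value = mannwhitneyu(treatment, control, alternative="greater")

    # Cohen's d with a pooled standard deviation as an effect-size estimate.
    pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
    cohens_d = (treatment.mean() - control.mean()) / pooled_sd

    alpha = 0.05
    print(f"U = {u_stat:.1f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
    print("significant" if p_value < alpha else "not significant", "at alpha =", alpha)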
Details

Title: Explainable AI and User Experience. Prototyping and Evaluating an UX-Optimized XAI Interface in Computer Vision
College: University of Regensburg (Professur für Wirtschaftsinformatik, insb. Internet Business & Digitale Soziale Medien)
Grade: 1,0
Author: Georg Dedikov
Year: 2023
Pages: 151
Catalog Number: V1356885
ISBN (eBook): 9783346873859
ISBN (Book): 9783346874191
Language: English
Keywords: Explainable AI, XAI, UX, UI, Computer Vision, User-centered Design, Figma, EfficientNetB0, LIME, Local interpretable model-agnostic explanations, Master Thesis, Literature Review, Hypothesis Test, Mann-Whitney U Test, Cohen's d, Cronbach's alpha, AI, Machine Learning, Convolutional Neural Networks, CNN, ML, Deep Learning, DL, Medicine, Healthcare, high-stake, UX principles, UEQ, User Experience Questionnaire, Brain Tumor, X-ray images, Prototyping, Prototype
Quote paper: Georg Dedikov (Author), 2023, Explainable AI and User Experience. Prototyping and Evaluating an UX-Optimized XAI Interface in Computer Vision, Munich, GRIN Verlag, https://www.grin.com/document/1356885
