Do We Need Explainable AI in Companies? Investigation of Challenges, Expectations, and Chances from Employees' Perspective

7 Oct 2022  ·  Katharina Weitz, Chi Tai Dang, Elisabeth André ·

Companies' adoption of artificial intelligence (AI) is increasingly essential to business success. However, using AI poses new requirements for companies and their employees, including the transparency and comprehensibility of AI systems. The field of Explainable AI (XAI) aims to address these issues. Yet, current research consists primarily of laboratory studies, and the applicability of their findings to real-world situations remains limited. This project report therefore investigates employees' perspectives on (X)AI, providing insights into their needs and attitudes. Our findings suggest that AI and XAI are well-known terms that employees perceive as important. This recognition is a critical first step for XAI to drive successful AI usage by providing comprehensible insights into AI technologies. In a lessons-learned section, we discuss the open questions identified and suggest future research directions for developing human-centered XAI designs for companies. By providing insights into employees' needs and attitudes towards (X)AI, our project report contributes to the development of XAI solutions that meet the requirements of companies and their employees, ultimately supporting the successful adoption of AI technologies in the business context.
