Authors: Fahse, Tobias; Blohm, Ivo; Hruby, Richard; van Giffen, Benjamin
Date accessioned: 2023-04-13
Date available: 2023-04-13
Date issued: 2022-06-18
URI: https://www.alexandria.unisg.ch/handle/20.500.14171/108601

Abstract: Algorithmic forecasts outperform human forecasts in many tasks, and state-of-the-art machine learning (ML) algorithms have widened that gap even further. Since sales forecasting plays a key role in business profitability, ML-based sales forecasting can offer significant advantages. However, individuals are resistant to using algorithmic forecasts. Explainable AI (XAI), where an explanation interface (XI) provides model predictions and explanations to the user, can help overcome this algorithm aversion. However, current XAI techniques are incomprehensible to laymen. Despite the economic relevance of sales forecasting, there is no significant research effort towards helping non-expert users make better decisions with ML forecasting systems by designing appropriate XIs. We address this research gap by designing a model-agnostic XI for laymen. We propose a design theory for XIs, instantiate our theory, and report initial formative evaluation results. A real-world evaluation context is used: a medium-sized Swiss bakery chain provides past sales data and human forecasts.

Language: en
Keywords: Forecasting; Explainable AI; XAI; Design Science
Title: Explanation Interfaces for Sales Forecasting
Type: conference paper