
Recent research efforts strive to aid in designing explainable systems. Nevertheless, a systematic and overarching approach to ensuring explainability by design is still missing. Often it is not even clear what precisely is meant when explainability is demanded. To address this challenge, we investigate the elicitation, specification, and verification of explainability as a Non-Functional Requirement (NFR), with the long-term vision of establishing a standardized certification process for the explainability of software-driven systems, in tandem with appropriate development techniques. In this work, we carve out the different notions of explainability and the high-level requirements people have in mind when demanding it, and sketch how explainability concerns may be approached in a hypothetical hiring scenario. We provide a conceptual analysis that unifies these different notions of explainability and the corresponding explainability demands.
