Understanding why a model makes a certain prediction can be as crucial as the prediction's accuracy in many applications.
This problem is especially hard for time series classification and regression in industrial applications such as predictive maintenance or production line optimization, where each label or regression target is associated with several time series and meta-information simultaneously.
In this paper, a new model named FiBiNET, short for Feature Importance and Bilinear feature Interaction NETwork, is proposed to dynamically learn feature importance and fine-grained feature interactions.
Differing notations and terminology complicate the understanding, discussion, and study of model-agnostic interpretation techniques in machine learning.
Official code for using / reproducing ACD (ICLR 2019) from the paper "Hierarchical interpretations for neural network predictions" (https://arxiv.org/abs/1806.05337)
Deep neural networks (DNNs) have achieved impressive predictive performance due to their ability to learn complex, non-linear relationships between variables.