TY - CONF
T1 - Explanations for Occluded Images
AU - Chockler, Hana
AU - Kroening, Daniel
AU - Sun, Youcheng
PY - 2022/2/28
Y1 - 2022/2/28
AB - Existing algorithms for explaining the output of image classifiers perform poorly on inputs where the object of interest is partially occluded. We present a novel, black-box algorithm for computing explanations that uses a principled approach based on causal theory. We have implemented the method in the DEEPCOVER tool. We obtain explanations that are much more accurate than those generated by the existing explanation tools on images with occlusions and observe a level of performance comparable to the state of the art when explaining images without occlusions.
UR - https://pure.qub.ac.uk/en/publications/321558a2-186c-4b17-a6d3-01424b808333
U2 - 10.1109/ICCV48922.2021.00127
DO - 10.1109/ICCV48922.2021.00127
M3 - Conference contribution
BT - Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
ER -