Self-Ensembling Attention Networks: Addressing Domain Shift for Semantic Segmentation

Authors

  • Yonghao Xu — Wuhan University
  • Bo Du — Wuhan University
  • Lefei Zhang — Wuhan University
  • Qian Zhang — Horizon Robotics
  • Guoli Wang — Horizon Robotics
  • Liangpei Zhang — Wuhan University

DOI:

https://doi.org/10.1609/aaai.v33i01.33015581

Abstract

Recent years have witnessed the great success of deep learning models in semantic segmentation. Nevertheless, these models may not generalize well to unseen image domains due to the phenomenon of domain shift. Since pixel-level annotations are laborious to collect, developing algorithms that can adapt labeled data from the source domain to the target domain is of great significance. To this end, we propose self-ensembling attention networks to reduce the domain gap between different datasets. To the best of our knowledge, the proposed method is the first attempt to introduce the self-ensembling model to domain adaptation for semantic segmentation, which provides a different view on how to learn domain-invariant features. Besides, since different regions in an image usually correspond to different levels of domain gap, we introduce the attention mechanism into the proposed framework to generate attention-aware features, which are further utilized to guide the calculation of the consistency loss in the target domain. Experiments on two benchmark datasets demonstrate that the proposed framework can yield competitive performance compared with state-of-the-art methods.
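The two core ingredients described above — a self-ensembling (exponential-moving-average teacher) model and an attention-weighted consistency loss on target-domain predictions — can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the function names, the EMA decay value, and the attention map shape (H × W) are illustrative assumptions.

```python
import numpy as np

def ema_update(teacher_params, student_params, alpha=0.99):
    """Self-ensembling: the teacher's weights are an exponential moving
    average of the student's weights (decay `alpha` is an assumed value)."""
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_params, student_params)]

def attention_consistency_loss(student_prob, teacher_prob, attention, eps=1e-8):
    """Attention-weighted consistency loss on the target domain.

    student_prob, teacher_prob: per-pixel class probabilities, shape (H, W, C).
    attention: hypothetical attention map, shape (H, W), weighting regions
    by their estimated domain gap so that the consistency penalty focuses
    on regions the attention module deems important.
    """
    pixel_mse = np.mean((student_prob - teacher_prob) ** 2, axis=-1)  # (H, W)
    return float(np.sum(attention * pixel_mse) / (np.sum(attention) + eps))
```

In training, the student would be updated by backpropagating the segmentation loss on source data plus this consistency loss on target data, while `ema_update` refreshes the teacher after each step without gradients — the standard mean-teacher pattern that self-ensembling builds on.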

Published

2019-07-17

How to Cite

Xu, Y., Du, B., Zhang, L., Zhang, Q., Wang, G., & Zhang, L. (2019). Self-Ensembling Attention Networks: Addressing Domain Shift for Semantic Segmentation. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 5581-5588. https://doi.org/10.1609/aaai.v33i01.33015581

Section

AAAI Technical Track: Machine Learning