Unsupervised Domain Adaptation with Transformer-Based GAN for Semantic Segmentation of High-Resolution Remote Sensing Images
Cross-domain semantic segmentation of remote sensing (RS) imagery based on unsupervised domain adaptation (UDA) has become a research hotspot in geoscience. Recently, transformers, with their versatile architecture, have been successfully applied to a wide range of RS tasks. Despite some attempts to integrate transformers with convolutional neural networks (CNNs) in UDA, existing works fail to exploit transformer structures effectively, as evidenced by the large performance gap between these UDA methods and supervised learning-based methods. In this work, we thoroughly redesign the UDA framework around transformer structures, bridging the gap across domains with a generative adversarial network (GAN). To this end, we first present a transformer-based generator (TransG) consisting of two components: a transformer-based encoder (TransEn) and a transformer-based decoder (TransDe) equipped with global-local transformer blocks (GLTB). The TransEn extracts contextual details from high-resolution RS images more efficiently than CNN-based encoders, while the TransDe restores accurate spatial and local details during segmentation. Furthermore, a novel transformer-based discriminator (TransDx) is proposed to encourage TransG to learn better domain-invariant representations. Together, TransG and TransDx constitute the proposed transformer-based GAN, named DA-TransGAN, which provides an easily comprehensible UDA framework that extends straightforwardly to general UDA tasks. Extensive experiments on two large-scale fine-resolution benchmark datasets, ISPRS Potsdam and Vaihingen, highlight the effectiveness and superiority of the proposed DA-TransGAN over state-of-the-art UDA methods.
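The adversarial scheme described above, in which a generator is trained on labeled source data while a discriminator pushes it toward domain-invariant outputs, can be sketched with its two loss terms. This is a minimal NumPy illustration of the generic GAN-based UDA objective, not the paper's implementation; the function names, the binary source/target labels, and the adversarial weight `lam` are all assumptions for illustration.

```python
import numpy as np

def cross_entropy(probs, labels):
    """Supervised per-pixel cross-entropy on labeled SOURCE predictions.
    probs: (N, C) class probabilities, labels: (N,) integer class ids."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12)))

def bce(pred, target):
    """Binary cross-entropy for the domain discriminator (1 = source, 0 = target)."""
    pred = np.clip(pred, 1e-12, 1 - 1e-12)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def uda_losses(src_probs, src_labels, d_src, d_tgt, lam=0.01):
    """Generic GAN-based UDA objective (assumed form):
    - generator loss = segmentation loss on source
      + lam * adversarial loss asking the discriminator to call TARGET 'source';
    - discriminator loss = classify source as 1 and target as 0."""
    seg = cross_entropy(src_probs, src_labels)
    adv = bce(d_tgt, np.ones_like(d_tgt))            # fool the discriminator on target
    d_loss = bce(d_src, np.ones_like(d_src)) + bce(d_tgt, np.zeros_like(d_tgt))
    return seg + lam * adv, d_loss
```

As the generator learns domain-invariant representations, the discriminator's outputs on target data drift toward the source label, lowering the generator's adversarial term while raising the discriminator's loss.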
Email Address of Submitting Author: xianpingma@link.cuhk.edu.cn
ORCID of Submitting Author: 0000-0002-2180-2964
Submitting Author's Institution: The Chinese University of Hong Kong, Shenzhen
Submitting Author's Country: China