Automatic and periodic recompilation of building databases with up-to-date high-resolution images has become a critical requirement for rapidly developing urban environments. However, most existing architectures for change extraction attempt to learn change-related features while ignoring building-related objectives. This inevitably leads to significant pseudo-changes, caused by factors such as seasonal variation between images and the inclination of building façades. To alleviate these problems, we developed a contrastive learning approach that validates historical building footprints against single up-to-date remotely sensed images. This contrastive learning strategy injects building semantics into the change detection pipeline by increasing the distinguishability of building features from those of non-buildings. In addition, to reduce the effects of inconsistencies between historical building polygons and buildings in up-to-date images, we employed a deformable convolutional neural network to learn offsets intuitively. Building on these components, we formulated a multi-branch building extraction method that separately identifies newly constructed and removed buildings. To validate our method, we conducted comparative experiments on the public Wuhan University building change detection dataset and on a more practical dataset, SI-BU, that we established. Our method achieved F1 scores of 93.99% and 70.74% on these datasets, respectively. Moreover, when the public dataset was split in the same manner as in previous related studies, our method achieved an F1 score of 94.63%, surpassing the state-of-the-art method.
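The abstract does not spell out the contrastive objective, but the idea of pulling building features together while pushing non-building features away is commonly realized with an InfoNCE-style loss. The sketch below is a minimal, dependency-free illustration of that idea only; all function names are hypothetical and it is not the authors' implementation.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss (illustrative): pull the anchor feature toward
    the positive (another building feature) and push it away from the
    negatives (non-building features), sharpening class separability."""
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return -math.log(exps[0] / sum(exps))

# A well-separated anchor yields a much lower loss than a confused one:
good = contrastive_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
bad = contrastive_loss([0.0, 1.0], [1.0, 0.0], [[0.0, 1.0]])
```

Minimizing such a loss over building/non-building feature pairs is one standard way to make the two classes more distinguishable in the embedding space, which is the effect the paper attributes to its contrastive strategy.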
Results on the SI-BU dataset
Results on the WHU-CD dataset
The authors would like to thank the editor, associate editor, and reviewers for their helpful comments and advice. This work was supported in part by the National Natural Science Foundation of China (Project Nos. 42230102, 42071355, and 41871291), the Sichuan Science and Technology Fund for Distinguished Young Scholars (22JCQN0110), and the Cultivation Program for the Excellent Doctoral Dissertation of Southwest Jiaotong University (2020YBPY09).