Segment Any Change

15 Feb 2025 | Zhuo Zheng, Yanfei Zhong, Liangpei Zhang, Stefano Ermon
The paper introduces the Segment Any Change (AnyChange) model, a novel approach for zero-shot change detection in remote sensing images. AnyChange is built on the Segment Anything Model (SAM), a promptable image segmentation model, and leverages bitemporal latent matching to enable zero-shot change detection without additional training. The method exploits semantic similarities in SAM's latent space to identify changes between two time points, making it capable of detecting unseen change types and data distributions. The paper also introduces a point query mechanism to enable object-centric change detection, allowing users to click on objects to focus on specific changes.

Extensive experiments on various datasets demonstrate the effectiveness of AnyChange, achieving state-of-the-art results in zero-shot change detection and outperforming strong baselines in unsupervised and supervised change detection tasks. AnyChange sets a new record on the SECOND benchmark for unsupervised change detection and achieves comparable accuracy to supervised methods with minimal manual annotations. The code for AnyChange is available at https://github.com/Z-Zheng/pytorch-change-models.
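The core idea of bitemporal latent matching can be illustrated with a minimal sketch: pool a latent feature per region (in the actual method, a mask-pooled embedding from SAM's image encoder) at each time point, then use the negated cosine similarity between the two features as a change score. The function name, feature shapes, and threshold below are hypothetical illustrations, not the paper's exact implementation.

```python
import numpy as np

def change_score(emb_t1: np.ndarray, emb_t2: np.ndarray) -> np.ndarray:
    """Negated cosine similarity between bitemporal region embeddings.

    emb_t1, emb_t2: (N, D) arrays of per-region latent features, e.g.
    mask-pooled SAM encoder embeddings for the same N region proposals
    at times t1 and t2. Higher score means more likely changed.
    """
    a = emb_t1 / np.linalg.norm(emb_t1, axis=1, keepdims=True)
    b = emb_t2 / np.linalg.norm(emb_t2, axis=1, keepdims=True)
    cosine = np.sum(a * b, axis=1)   # semantic similarity in latent space
    return -cosine                    # low similarity -> high change score

# Toy demo: region 0 keeps the same feature (unchanged),
# region 1 gets an orthogonal feature (changed).
t1 = np.array([[1.0, 0.0], [1.0, 0.0]])
t2 = np.array([[1.0, 0.0], [0.0, 1.0]])
scores = change_score(t1, t2)
changed = scores > -0.5              # hypothetical decision threshold
```

In this sketch the unchanged region scores -1.0 (identical features) and the changed region scores 0.0 (orthogonal features), so thresholding the score separates the two; the zero-shot property comes from reusing SAM's pretrained latent space rather than training a change classifier.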