This paper introduces HEAL, an extensible framework for open heterogeneous collaborative perception. HEAL addresses the challenge of integrating new heterogeneous agent types into a collaborative perception system while maintaining high performance and low integration cost. The framework first establishes a unified feature space with a multi-scale, foreground-aware Pyramid Fusion network. When new heterogeneous agents later emerge with previously unseen modalities or models, they are aligned to this established unified space through a backward alignment mechanism that requires only individual training of the new agent type, keeping training costs extremely low and making the framework highly extensible. Because the alignment happens at the feature level, new agent types can join the collaboration without retraining existing agents, which also preserves model details and data privacy.
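To make the two core ideas concrete, the following is a minimal PyTorch sketch, not the authors' implementation (which is available in the codebase linked below). The module and function names, the sigmoid-based foreground weighting, and the freezing scheme are illustrative assumptions inferred from the description above; only the overall structure (multi-scale foreground-aware fusion, then training a new encoder against a frozen fusion network) follows the paper's summary.

```python
# Illustrative sketch only: PyramidFusion, backward_alignment_step, and the
# foreground-weighting details are assumptions, not HEAL's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidFusion(nn.Module):
    """Toy multi-scale, foreground-aware fusion over agents' BEV feature maps."""

    def __init__(self, channels: int = 64, num_scales: int = 3):
        super().__init__()
        self.num_scales = num_scales
        # One foreground-score head per scale; its sigmoid output weights each
        # agent's contribution so foreground regions dominate the fused map.
        self.fg_heads = nn.ModuleList(
            nn.Conv2d(channels, 1, kernel_size=1) for _ in range(num_scales)
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (num_agents, C, H, W), all agents already projected into the
        # shared (unified) feature space and warped to the ego frame.
        fused_scales = []
        for s in range(self.num_scales):
            scaled = F.avg_pool2d(feats, kernel_size=2 ** s) if s > 0 else feats
            weights = torch.sigmoid(self.fg_heads[s](scaled))       # (A, 1, h, w)
            weights = weights / weights.sum(dim=0, keepdim=True).clamp(min=1e-6)
            fused = (weights * scaled).sum(dim=0, keepdim=True)     # (1, C, h, w)
            fused_scales.append(F.interpolate(fused, size=feats.shape[-2:]))
        return torch.stack(fused_scales).sum(dim=0)                 # (1, C, H, W)


def backward_alignment_step(new_encoder, frozen_fusion, frozen_head,
                            optimizer, sensor_input, targets, loss_fn):
    """One training step for a newly joining agent type.

    Only `new_encoder` is trained: it learns to emit features that the frozen,
    previously established fusion network and detection head already understand,
    so the rest of the collaboration never needs retraining.
    """
    frozen_fusion.requires_grad_(False)
    frozen_head.requires_grad_(False)

    feats = new_encoder(sensor_input)            # (1, C, H, W), single new agent
    fused = frozen_fusion(feats)                 # align to the unified space
    loss = loss_fn(frozen_head(fused), targets)  # supervised by the detection loss

    optimizer.zero_grad()
    loss.backward()                              # gradients reach only new_encoder
    optimizer.step()
    return loss.item()
```

Under this reading, integration cost scales with a single agent type rather than with the whole collective, since gradients only touch the new encoder; this is consistent with the large reduction in training parameters reported below.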
To enrich data heterogeneity, the authors also introduce OPV2V-H, a new large-scale dataset covering more diverse sensor types. Extensive experiments on OPV2V-H and the real-world DAIR-V2X dataset show that HEAL outperforms state-of-the-art methods while reducing training parameters by 91.5% when integrating three new agent types, and that it maintains the best performance and the lowest training cost across various agent-type combinations. The method also shows clear advantages in model size, FLOPs, training time, and memory usage, and remains robust to pose errors and feature compression. Together, these results validate the efficiency and effectiveness of HEAL for open heterogeneous collaborative perception. A comprehensive codebase is available at https://github.com/yifanlu0227/HEAL.