This paper presents a fully automated method for panoramic image stitching using invariant local features. The authors formulate the stitching problem as a multi-image matching task, leveraging SIFT features to find matches between images despite variations in rotation, zoom, and illumination. The method is insensitive to the ordering, orientation, scale, and illumination of the input images, and can recognize multiple panoramas in unordered datasets. The paper extends previous work by introducing gain compensation and automatic straightening steps, enhancing the quality of the output panoramas. The authors also describe an efficient bundle adjustment implementation and a multi-band blending technique to handle illumination differences and preserve high-frequency details. The system is demonstrated on various datasets, showing robust performance even with changes in camera settings and illumination. Future work includes addressing camera and scene motion, advanced camera modeling, and photometric modeling.
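
To make the pipeline concrete, the sketch below illustrates the kind of invariant-feature matching the paper relies on, using OpenCV (the file names, the 0.75 ratio threshold, and the use of pairwise homographies are illustrative assumptions; the paper itself estimates camera rotations and focal lengths via bundle adjustment rather than planar homographies, and this is not the authors' implementation). It assumes an OpenCV build with SIFT support (version 4.4 or later) and NumPy.

```python
# Minimal sketch of invariant-feature matching for stitching (illustrative only).
import cv2
import numpy as np

# Hypothetical file names for an unordered set of overlapping images.
images = [cv2.imread(name) for name in ["img1.jpg", "img2.jpg", "img3.jpg"]]

# High-level route: OpenCV's Stitcher runs a pipeline (feature matching,
# bundle adjustment, warping, blending) broadly similar to the one summarized above.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(images)
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)

# Explicit route for one image pair: SIFT features, nearest-neighbour matching
# with a ratio test, and RANSAC to reject outlier correspondences.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(images[0], None)
kp2, des2 = sift.detectAndCompute(images[1], None)

matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test

if len(good) >= 4:
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # Pairwise geometry via RANSAC; shown here as a homography for simplicity.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 4.0)
```

The explicit route shows only the pairwise matching and geometric verification stage; the gain compensation, automatic straightening, and multi-band blending steps described in the paper operate afterwards on the jointly optimized set of camera parameters.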