XGBoost is a highly scalable and effective tree boosting system designed to achieve state-of-the-art results on a variety of machine learning challenges. The paper introduces several key innovations, including a novel sparsity-aware algorithm for handling sparse data, a weighted quantile sketch for approximate tree learning, and insights on cache access patterns, data compression, and sharding. Together, these innovations enable XGBoost to scale beyond billions of examples using significantly fewer resources than existing systems. The system's performance is demonstrated through extensive experiments on large datasets, showing superior speed and accuracy in both single-machine and distributed settings. XGBoost has been widely adopted in real-world applications, as evidenced by its use in numerous machine learning competitions and production pipelines. The paper also provides a detailed overview of the system's design, implementation, and evaluation, highlighting its contributions to the field of machine learning.
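As a rough illustration rather than anything taken from the paper, the minimal sketch below uses the xgboost Python package to train on sparse input. The synthetic dataset, parameter values, and the choice of tree_method="approx" are assumptions made for demonstration; they are meant only to touch on the sparsity-aware handling of missing entries and the approximate, quantile-sketch-based split finding summarized above.

```python
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

# Synthetic, mostly-empty feature matrix stored in CSR format
# (values and shape are arbitrary, chosen only for illustration).
rng = np.random.default_rng(0)
X = sp.random(1000, 50, density=0.1, format="csr", random_state=0)
y = rng.integers(0, 2, size=1000)

# DMatrix accepts scipy sparse matrices directly; absent entries are
# treated as missing and routed along each node's learned default direction,
# which is the behavior the sparsity-aware algorithm provides.
dtrain = xgb.DMatrix(X, label=y)

params = {
    "objective": "binary:logistic",
    "max_depth": 4,
    "eta": 0.3,
    "tree_method": "approx",   # approximate split finding via quantile sketch
    "eval_metric": "logloss",
}

bst = xgb.train(params, dtrain, num_boost_round=20)
preds = bst.predict(dtrain)
print(preds[:5])
```

In practice the same DMatrix/train interface scales from a laptop to the distributed setting described in the paper; only the data loading and cluster configuration change, not the modeling code.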