The paper introduces Conifer, a novel instruction-tuning dataset designed to enhance large language models (LLMs) in following multi-level instructions with complex constraints. Conifer is constructed using GPT-4 to generate and refine instructions, ensuring high quality. The dataset is organized under a progressive learning scheme that emphasizes an easy-to-hard progression and learning from process feedback. Experiments on instruction-following benchmarks, including IFEval, FollowBench, and InFoBench, show that models trained with Conifer achieve significant improvements, outperforming state-of-the-art open-source models at the 70B scale on certain metrics. The main contributions are the Conifer dataset for complex, constrained instruction following and the progressive learning scheme that improves LLMs' ability to interpret and follow complex constraints.
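To make the easy-to-hard progression concrete, the minimal sketch below orders training examples by a simple difficulty proxy: the number of constraints attached to each instruction. The `Example` schema, its field names, and the constraint-count heuristic are illustrative assumptions for this sketch, not Conifer's actual data format or curriculum.

```python
# Minimal sketch of an easy-to-hard curriculum over instruction-tuning data.
# Assumption: difficulty is proxied by how many constraints an instruction
# carries; real curricula may use richer difficulty signals.
from dataclasses import dataclass, field


@dataclass
class Example:
    # Hypothetical schema: an instruction plus its attached constraints.
    instruction: str
    constraints: list[str] = field(default_factory=list)


def curriculum_order(examples: list[Example]) -> list[Example]:
    """Return examples sorted so few-constraint instructions come first."""
    return sorted(examples, key=lambda ex: len(ex.constraints))


if __name__ == "__main__":
    data = [
        Example("Summarize the article.",
                ["at most 50 words", "use bullet points", "avoid jargon"]),
        Example("Translate this sentence to French."),
        Example("Write a haiku.", ["about autumn"]),
    ]
    # Training would consume examples in this order: unconstrained first,
    # heavily constrained last.
    for ex in curriculum_order(data):
        print(len(ex.constraints), ex.instruction)
```

Under this scheme, a trainer would feed batches in the sorted order (or sample with a difficulty-weighted schedule), so the model masters simple instruction following before confronting multi-constraint prompts.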