This paper introduces DEA$^{2}$H$^{2}$, a novel differential evolution (DE) architecture-based hyper-heuristic algorithm for continuous optimization. The algorithm consists of two main components: a low-level component and a high-level component. The low-level component uses ten DE-derived search operators as low-level heuristics (LLHs). The high-level component incorporates a success-history-based mechanism inspired by success-history based adaptive DE (SHADE): if a search operator successfully evolves an offspring, it is preserved; otherwise, it is replaced by random initialization. The algorithm is validated through experiments on the CEC2020 and CEC2022 benchmark functions, as well as eight engineering problems. It is compared against fifteen well-known metaheuristic algorithms (MAs), and ablation experiments are conducted to assess the effectiveness of the high-level component. The results confirm the superiority and robustness of DEA$^{2}$H$^{2}$ across diverse optimization tasks, highlighting its potential as an effective tool for continuous optimization. The source code is available at https://github.com/RuiZhong961230/DEA2H2.
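The success-history rule described above can be sketched as a short selection loop. This is a minimal illustrative sketch, not the authors' implementation: the function name, the per-individual operator assignment, and the uniform re-sampling are assumptions made for clarity.

```python
import random

NUM_OPERATORS = 10  # ten DE-derived low-level heuristics (LLHs)

def update_operator_assignments(assignments, improved):
    """Success-history rule (illustrative): an operator that produced an
    improved offspring is preserved for the next generation; a failed one
    is replaced by a randomly re-initialized operator index."""
    next_assignments = []
    for op, success in zip(assignments, improved):
        if success:
            next_assignments.append(op)  # keep the successful operator
        else:
            # random re-initialization: re-sample an LLH uniformly
            next_assignments.append(random.randrange(NUM_OPERATORS))
    return next_assignments
```

For example, `update_operator_assignments([3, 5, 7], [True, False, True])` keeps operators 3 and 7 and re-samples a replacement for operator 5.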
The paper discusses concerns within the MA community about the proliferation of metaphor-based algorithms, which may lack scientific rigor. The hyper-heuristic (HH) framework offers an alternative that avoids overstated metaphors: whereas traditional MAs apply predefined operators in fixed sequences, HH dynamically generates or selects LLHs, allowing the framework to learn from past optimization experience and adapt its selection process. In recent years, HH has gained widespread attention for its adaptability in solving complex real-world problems; continuous optimization, however, has received comparatively little attention. This research proposes DEA$^{2}$H$^{2}$, a DE-architecture-based adaptive hyper-heuristic algorithm for continuous optimization. The LLH module incorporates representative search operators from DE, and, inspired by SHADE, success-history knowledge is used to dynamically adjust the optimization sequence. The main contributions include the proposal of DEA$^{2}$H$^{2}$, the use of DE-derived search operators as LLHs, the design of a success-history-based mechanism for the high-level component, and comprehensive performance evaluations on benchmark functions and engineering problems.