21 February 2024 | Kyle Pretorius¹ · Nelishia Pillay¹
This study explores the use of genetic programming (GP) to automatically evolve crossover operators for neural network (NN) weights in genetic algorithms (GAs). The research addresses the challenge of designing effective crossover operators for NNs, where crossover is often considered destructive and detrimental to GA performance. The study proposes a novel GP approach to evolve both reusable and disposable crossover operators, and compares their efficiency and effectiveness. Experiments show that GP-evolved disposable crossover operators significantly outperform traditional methods, leading to better GA results. The results demonstrate that GP-evolved crossover operators can be more effective than commonly used human-designed operators, and that including them in GAs that optimize NN weights improves performance, particularly in the case of disposable operators. The study also discusses the convergence argument, suggesting that the permutation problem may be less significant once populations converge. The research contributes to the field of neuroevolution by showing that GP can evolve crossover operators tailored to specific problem domains, enhancing the performance of GAs in NN weight optimization.
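To make the idea concrete, the sketch below shows what a crossover operator for NN weight vectors can look like when treated as an evolvable per-gene expression, as opposed to a fixed human-designed rule. This is a hypothetical illustration under stated assumptions, not the paper's actual operators: `uniform_crossover` stands in for a common human-designed baseline, and `evolved_blend` is an invented example of the kind of arithmetic combination a GP run could in principle discover.

```python
import random

# Per-gene terminals a GP-evolved operator might use at each weight position:
#   w1, w2 : the two parents' weights at that position
#   r      : a fresh random number in [0, 1]

def uniform_crossover(w1, w2, r):
    # Human-designed baseline: pick either parent's weight at random.
    return w1 if r < 0.5 else w2

def evolved_blend(w1, w2, r):
    # Hypothetical example of an arithmetic blend GP could evolve:
    # a randomly weighted mix of the first parent and the parents' average.
    return r * w1 + (1.0 - r) * 0.5 * (w1 + w2)

def crossover(parent1, parent2, operator, rng=random.random):
    # Apply the operator independently at every weight position,
    # producing one child weight vector from two parents.
    return [operator(w1, w2, rng()) for w1, w2 in zip(parent1, parent2)]

p1 = [0.2, -1.5, 0.7]
p2 = [1.0, 0.3, -0.4]
child = crossover(p1, p2, evolved_blend)
```

In this framing, a "reusable" operator would be a single evolved expression applied across many GA runs, while a "disposable" one is evolved afresh for a specific run or problem instance and then discarded.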