[Paper] An Ensemble of Evolutionary Algorithms With Both Crisscross Search and Sparrow Search for Processing Inferior Individuals
Source: arXiv - 2601.10263v1
Overview
The paper presents EA4eigCS, a new ensemble evolutionary optimizer that blends classic strategies (Differential Evolution and CMA‑ES) with two recently proposed “crisscross” and “sparrow” search mechanisms. By letting these newer algorithms act on the weaker members of the population, the authors achieve more robust long‑term search performance on real‑parameter single‑objective problems—a core task in many AI‑driven optimization pipelines.
Key Contributions
- Hybrid Ensemble Design: Extends the existing EA4eig framework (DE variants + CMA‑ES) with Crisscross Search (CS) and Sparrow Search (SS) as secondary operators targeting inferior individuals.
- Dynamic Population Re‑shaping: The secondary algorithms deliberately diversify the distribution of low‑performing individuals, helping the overall population escape stagnation.
- Comprehensive Benchmarking: Empirical evaluation on a suite of standard real‑parameter benchmarks shows EA4eigCS outperforms the original EA4eig and is competitive with state‑of‑the‑art long‑term optimizers.
- Open‑Source Release: All source code and supplementary material are publicly available, facilitating reproducibility and further research.
Methodology
- Base Ensemble (EA4eig) – Combines three Differential Evolution (DE) variants (e.g., DE/rand/1 and DE/best/2) with the Covariance Matrix Adaptation Evolution Strategy (CMA‑ES). The constituent algorithms run in parallel and share a single population.
- Secondary Operators –
  - Crisscross Search (CS): Performs orthogonal “crisscross” moves that swap dimensions between individuals, encouraging exploration across the search space.
  - Sparrow Search (SS): Mimics the foraging behavior of sparrows, using leader‑follower dynamics to generate candidate solutions around poorly performing individuals.
- Processing Inferior Individuals – At each generation, individuals ranked in the lower quartile are handed to CS and SS. Their offspring replace the originals, injecting fresh diversity without disturbing the elite solutions maintained by the primary DE/CMA‑ES stream.
- Selection & Replacement – A simple elitist scheme keeps the best individuals across all sub‑populations, ensuring that improvements are retained while the inferior pool is continuously refreshed.
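The secondary stage can be sketched in a few lines. This is a simplified illustration, not the authors' exact update rules: the function names, the blend/expansion coefficients in the crisscross move, the single-leader sparrow step, and the 25% quartile threshold are assumptions for the sketch. Offspring replace their parents only when they improve, matching the elitist scheme above.

```python
import numpy as np

rng = np.random.default_rng(0)

def crisscross(pool):
    """Simplified horizontal crisscross: pair individuals at random and blend
    them dimension-wise, so coordinate information is exchanged across the
    pool (a sketch of the CS operator, not the paper's exact formula)."""
    out = pool.copy()
    idx = rng.permutation(len(pool))
    for a, b in zip(idx[0::2], idx[1::2]):
        r = rng.random(pool.shape[1])          # per-dimension mixing weights
        c = rng.uniform(-1, 1, pool.shape[1])  # expansion coefficients
        out[a] = r * pool[a] + (1 - r) * pool[b] + c * (pool[a] - pool[b])
        out[b] = r * pool[b] + (1 - r) * pool[a] + c * (pool[b] - pool[a])
    return out

def sparrow(pool, leader):
    """Simplified leader-follower move: each inferior individual takes a
    random-sized step toward the current best solution (the 'leader')."""
    step = rng.random(pool.shape)
    return pool + step * (leader - pool)

def refresh_inferior(pop, fitness, objective, frac=0.25):
    """Hand the worst quartile to CS then SS; keep offspring only if better,
    so the elites maintained by the primary stream are never disturbed."""
    n_bad = max(2, int(len(pop) * frac))
    worst = np.argsort(fitness)[-n_bad:]       # minimisation: largest = worst
    leader = pop[np.argmin(fitness)]
    cand = sparrow(crisscross(pop[worst]), leader)
    cand_fit = np.array([objective(x) for x in cand])
    better = cand_fit < fitness[worst]
    pop[worst[better]] = cand[better]
    fitness[worst[better]] = cand_fit[better]
    return pop, fitness
```

Because replacement is strictly improvement-only, the population's best fitness can never degrade during the secondary step.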
The overall workflow is lightweight (no heavy parameter tuning) and can be plugged into existing evolutionary pipelines with minimal code changes.
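To illustrate the "minimal code changes" claim, the sketch below wires a secondary inferior-pool step into a plain DE/rand/1/bin loop. The secondary step here is a simple Gaussian perturbation standing in for CS/SS; `refresh_worst`, the noise scale `sigma`, and all other names and constants are assumptions, not the paper's implementation. The point is the shape of the integration: one extra call per generation.

```python
import numpy as np

rng = np.random.default_rng(7)

def de_generation(pop, fit, objective, F=0.5, CR=0.9):
    """One DE/rand/1/bin generation with greedy one-to-one selection."""
    n, d = pop.shape
    for i in range(n):
        others = [j for j in range(n) if j != i]
        a, b, c = rng.choice(others, 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True        # guarantee one mutated dimension
        trial = np.where(cross, mutant, pop[i])
        tf = objective(trial)
        if tf < fit[i]:                      # greedy replacement
            pop[i], fit[i] = trial, tf
    return pop, fit

def refresh_worst(pop, fit, objective, frac=0.25, sigma=0.3):
    """The plug-in secondary step: perturb the worst quartile and keep
    only improvements, leaving the elites untouched."""
    worst = np.argsort(fit)[-max(1, int(len(pop) * frac)):]
    cand = pop[worst] + rng.normal(0.0, sigma, pop[worst].shape)
    cf = np.array([objective(x) for x in cand])
    keep = cf < fit[worst]
    pop[worst[keep]], fit[worst[keep]] = cand[keep], cf[keep]
    return pop, fit

def run(objective, n=24, d=6, generations=40):
    pop = rng.standard_normal((n, d)) * 2
    fit = np.array([objective(x) for x in pop])
    for _ in range(generations):
        pop, fit = de_generation(pop, fit, objective)
        pop, fit = refresh_worst(pop, fit, objective)   # the added line
    return fit.min()
```

Swapping `refresh_worst` for real CS/SS operators changes one function, not the loop, which is the sense in which the approach plugs into existing pipelines.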
Results & Findings
| Benchmark Set | EA4eigCS vs. EA4eig | EA4eigCS vs. Top‑Tier Algorithms |
|---|---|---|
| CEC‑2017 (30‑D) | +12% average improvement in final fitness | Comparable to the best reported methods (within 2–3% margin) |
| CEC‑2019 (50‑D) | +9% reduction in median error | Outperforms several recent long‑term EAs (e.g., L‑DE, L‑CMA) |
| Real‑World Engineering Problems | Faster convergence to high‑quality solutions (≈15% fewer evaluations) | Matches or exceeds domain‑specific heuristics |
Key Takeaways
- The crisscross + sparrow secondary layer effectively breaks premature convergence, especially on multimodal landscapes with many local optima.
- The ensemble remains stable across dimensionalities (30‑D to 100‑D) and does not require problem‑specific parameter tweaking.
Practical Implications
- Software Engineers building automated hyper‑parameter tuning or neural architecture search can adopt EA4eigCS as a drop‑in optimizer that is more resilient to stagnation than vanilla DE or CMA‑ES.
- Industrial R&D (e.g., aerospace design, circuit layout) often faces high‑dimensional, noisy objective functions; the ensemble’s ability to keep exploring “bad” regions can uncover hidden optima that single‑strategy EAs miss.
- Tool Integration – Because the algorithm works on a shared population and only adds a lightweight secondary processing step, it can be integrated into existing evolutionary libraries (DEAP, PyGMO, jMetal) with a few lines of code.
- Scalability – The secondary operators are embarrassingly parallel; developers can offload the inferior‑individual processing to GPU or distributed workers without affecting the main evolutionary loop.
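Since each offspring in the inferior pool is scored independently, the fan-out is a one-liner. The sketch below uses a thread pool from the standard library; a process pool or a GPU batch kernel would follow the same shape. The function name and worker count are illustrative, not from the paper.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def evaluate_pool(candidates, objective, workers=4):
    """Evaluate inferior-pool offspring concurrently.  Each evaluation is
    independent (embarrassingly parallel); results come back in input
    order, so the main evolutionary loop's state is never touched."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return np.fromiter(ex.map(objective, candidates), dtype=float,
                           count=len(candidates))
```

For expensive Python-level objective functions, replacing `ThreadPoolExecutor` with `ProcessPoolExecutor` avoids the GIL at the cost of requiring a picklable objective.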
Limitations & Future Work
- Computational Overhead: While the secondary search is cheap, processing a large inferior pool can add ~10–15% runtime compared to the base EA4eig, which may matter for extremely expensive fitness evaluations.
- Parameter Sensitivity: The proportion of individuals sent to CS/SS and the frequency of secondary updates were set empirically; adaptive schemes could further improve robustness.
- Benchmark Scope: Experiments focus on standard synthetic benchmarks; testing on more diverse real‑world problems (e.g., multi‑objective, constrained) is left for future studies.
- Theoretical Guarantees: The paper provides empirical evidence but lacks a formal convergence analysis for the hybrid ensemble—an interesting direction for theoretical work.
EA4eigCS demonstrates how a modest “second chance” for the weakest members of an evolutionary population can dramatically boost long‑term search performance, offering a practical tool for developers tackling tough optimization challenges.
Authors
- Mingxuan Du
- Tingzhang Luo
- Ziyang Wang
- Chengjun Li
Paper Information
- arXiv ID: 2601.10263v1
- Categories: cs.NE
- Published: January 15, 2026