Abstract: Multi-task optimization is an emerging research topic in the computational intelligence community. In this paper, we propose a novel evolutionary framework, the many-task evolutionary algorithm (MaTEA), for many-task optimization. In the proposed MaTEA, an adaptive selection mechanism is designed to choose a suitable "assisted" task for a given task by considering both the similarity between tasks and the accumulated rewards of knowledge transfer during evolution. In addition, a crossover-based knowledge transfer scheme is adopted to exchange information among tasks and improve search efficiency. Furthermore, to facilitate measuring the similarity between tasks and transferring knowledge among tasks that arrive at different time instances, multiple archives are integrated into the proposed MaTEA. Experiments on both single-objective and multi-objective optimization problems demonstrate that the proposed MaTEA outperforms state-of-the-art multi-task evolutionary algorithms in terms of search efficiency and solution accuracy. Moreover, the proposed MaTEA is also capable of solving dynamic many-task optimization problems where tasks arrive at different time instances.
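The abstract describes two core ideas: adaptively selecting an "assisted" task from a score that mixes inter-task similarity with accumulated transfer rewards, and transferring knowledge between tasks via crossover. The following Python sketch illustrates these ideas only at a conceptual level; all names (`select_assisted_task`, `transfer_crossover`, `similarity`, `reward`, `alpha`) are hypothetical and are not taken from the paper, which should be consulted for the actual mechanism.

```python
import random

def select_assisted_task(target, tasks, similarity, reward, alpha=0.5):
    """Pick the candidate task whose mixed score (similarity plus accumulated
    transfer reward) with respect to the target task is highest.
    `alpha` balances the two terms; both tables are assumed to be maintained
    online during the evolution."""
    candidates = [t for t in tasks if t != target]
    return max(
        candidates,
        key=lambda t: alpha * similarity[(target, t)]
                      + (1 - alpha) * reward[(target, t)],
    )

def transfer_crossover(parent_target, parent_assisted, rate=0.5):
    """Uniform crossover used as the knowledge-transfer operator: each gene of
    the target-task parent is replaced by the corresponding gene from the
    assisted-task parent with probability `rate`."""
    return [a if random.random() < rate else b
            for b, a in zip(parent_target, parent_assisted)]

if __name__ == "__main__":
    tasks = ["T1", "T2", "T3"]
    # Toy similarity/reward tables; in practice these would be estimated online.
    similarity = {("T1", "T2"): 0.8, ("T1", "T3"): 0.3}
    reward = {("T1", "T2"): 0.2, ("T1", "T3"): 0.6}
    helper = select_assisted_task("T1", tasks, similarity, reward)
    child = transfer_crossover([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])
    print(helper, child)
```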
Related papers:
Y. Chen, J. Zhong*, L. Feng, and J. Zhang, “An Adaptive Archive-based Evolutionary Framework for Many-task Optimization,” IEEE Transactions on Emerging Topics in Computational Intelligence, 2019, Accepted.
S. Huang, J. Zhong*, and W. Yu, “Surrogate-Assisted Evolutionary Framework with Adaptive Knowledge Transfer for Multi-task Optimization,” IEEE Transactions on Emerging Topics in Computing, vol. 9, no. 4, pp. 1930-1944, Oct.-Dec. 2021, doi: 10.1109/TETC.2019.2945775.
T. Wei, S. Wang, J. Zhong*, D. Liu, and J. Zhang, “A Review on Evolutionary Multi-Task Optimization: Trends and Challenges,” IEEE Transactions on Evolutionary Computation, 2021, Accepted.
T. Wei and J. Zhong*, “Towards Generalized Resource Allocation on Evolutionary Multitasking for Multi-Objective Optimization,” IEEE Computational Intelligence Magazine, vol. 16, no. 4, pp. 20-37, Nov. 2021, doi: 10.1109/MCI.2021.3108310. [link] [Code]