
Моделирование и анализ информационных систем


The "One-fifth Rule" with Rollbacks for Self-Adjustment of the Population Size in the (1 + (λ,λ)) Genetic Algorithm

https://doi.org/10.18255/1818-1015-2020-4-488-508


Abstract

It is known that parameter control can significantly improve the running time of evolutionary algorithms. A prominent example is the (1 + (λ,λ)) genetic algorithm, where self-adjustment of the population size during the run makes it possible to achieve linear running time on the OneMax problem. However, when the properties of the problem being solved conflict with the working principles of the parameter control method in use, the performance of the evolutionary algorithm can degrade substantially. This happens, for instance, when the one-fifth rule is applied in the above algorithm to problems with a weak correlation between fitness and distance to the optimum.
This paper proposes a modification of the one-fifth rule that substantially reduces the negative effects of its use when such effects are present. We show that this modification also achieves linear running time on the OneMax problem, while its use improves performance on linear pseudo-Boolean functions with random weights, as well as on a certain class of MAX-3SAT instances.
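The mechanism discussed in the abstract — adapting the population size λ of the (1 + (λ,λ)) GA with the one-fifth success rule — can be sketched as follows. This is a minimal illustrative implementation of the standard algorithm with the plain one-fifth rule on OneMax, not the rollback modification proposed in the paper; the update factor F = 1.5 and all function and parameter names are assumptions for the sake of the sketch.

```python
import random

def one_max(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)

def opll_ga(n, f, budget=200_000, F=1.5, rng=None):
    """Sketch of the (1 + (lambda, lambda)) GA with the plain one-fifth
    success rule adjusting lambda. Returns (best string, its fitness)."""
    rng = rng or random.Random()
    x = [rng.randrange(2) for _ in range(n)]
    fx = f(x)
    lam = 1.0                                  # real-valued lambda
    evals = 0
    while fx < n and evals < budget:
        L = max(1, round(lam))                 # integer population size
        # Mutation phase: sample strength ell ~ Bin(n, lambda/n), then
        # create L mutants of x, each flipping ell distinct random bits.
        ell = sum(rng.random() < L / n for _ in range(n))
        best_mut, best_mut_f = None, -1
        for _ in range(L):
            y = x[:]
            for i in rng.sample(range(n), ell):
                y[i] ^= 1
            fy = f(y); evals += 1
            if fy > best_mut_f:
                best_mut, best_mut_f = y, fy
        # Crossover phase: L offspring take each bit from the best mutant
        # with probability 1/lambda, otherwise from the parent x.
        best_off, best_off_f = best_mut, best_mut_f
        for _ in range(L):
            y = [b if rng.random() < 1.0 / L else a
                 for a, b in zip(x, best_mut)]
            fy = f(y); evals += 1
            if fy > best_off_f:
                best_off, best_off_f = y, fy
        # Elitist selection and the one-fifth rule: shrink lambda on a
        # strict improvement, grow it slowly (by F^(1/4)) otherwise.
        if best_off_f > fx:
            x, fx = best_off, best_off_f
            lam = max(lam / F, 1.0)
        else:
            if best_off_f == fx:
                x = best_off
            lam = min(lam * F ** 0.25, float(n))
    return x, fx
```

On problems where fitness correlates poorly with distance to the optimum, the failure branch above can drive λ up and keep it there, which is exactly the pathology the paper's rollback modification targets.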

About the Authors

Anton Olegovich Bassin
ITMO University
Russia

PhD Student



Maxim Viktorovich Buzdalov
ITMO University
Russia

Researcher, Cand. Sc. (Engineering)



Anatoly Abramovich Shalyto
ITMO University
Russia

Chief Researcher, Dr. Sc. (Engineering)






For citation:


Bassin A.O., Buzdalov M.V., Shalyto A.A. The “One-fifth Rule” with Rollbacks for Self-Adjustment of the Population Size in the (1 + (λ,λ)) Genetic Algorithm. Modeling and Analysis of Information Systems. 2020;27(4):488-508. (In Russ.) https://doi.org/10.18255/1818-1015-2020-4-488-508



Content is available under a Creative Commons Attribution 4.0 License.


ISSN 1818-1015 (Print)
ISSN 2313-5417 (Online)