DOI: 10.17587/prin.17.67-82
Optimization of Hyperparameters of Recommendation Algorithms through a Single Evaluation Metric
R. S. Kulshin, Postgraduate Student, Assistant, roman.s.kulshin@tusur.ru,
A. A. Sidorov, Head of Department, anatolii.a.sidorov@tusur.ru,
Tomsk State University of Control Systems and Radioelectronics, Tomsk, 634050, Russian Federation
Corresponding author: Roman S. Kulshin, Postgraduate Student, Assistant, Tomsk State University of Control Systems and Radioelectronics, Tomsk, 634050, Russian Federation, E-mail: roman.s.kulshin@tusur.ru
Received on May 14, 2025
Accepted on August 13, 2025
The paper considers an approach to optimizing hyperparameters of recommendation algorithms using an integral assessment that combines several metrics into four key subindexes: accuracy, ranking, diversity, and resource intensity. This method enables more balanced model tuning, improving overall recommendation quality without sacrificing individual characteristics. It is shown that different algorithms respond differently to the chosen optimization strategy. The developed methodology can be applied not only to recommendation systems, but also to other multi-criteria optimization tasks.
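The abstract describes collapsing several metrics into four subindexes and then into a single integral score that can drive hyperparameter search. The exact aggregation formula is not given here, so the sketch below uses an illustrative weighted arithmetic mean of normalized subindexes (with resource intensity inverted so that lower cost yields a higher score); the subindex names follow the abstract, while the weights, values, and normalization are assumptions.

```python
# Sketch: combining metric subindexes into one integral score for
# hyperparameter optimization. The weighting scheme is an illustrative
# assumption, not the paper's exact formula.

def integral_score(subindexes, weights=None):
    """Weighted arithmetic mean of subindexes, each assumed to lie in
    [0, 1] with 'higher is better' (resource intensity pre-inverted)."""
    if weights is None:
        weights = {k: 1.0 for k in subindexes}
    total = sum(weights.values())
    return sum(subindexes[k] * weights[k] for k in subindexes) / total

# Hypothetical subindex values for one hyperparameter configuration;
# resource intensity is inverted so that a cheaper run scores higher.
scores = {
    "accuracy": 0.82,
    "ranking": 0.74,
    "diversity": 0.61,
    "resource": 1.0 - 0.35,  # 0.35 is a hypothetical normalized cost
}
print(round(integral_score(scores), 3))
```

Such a scalar objective can then be handed to any single-objective optimizer, e.g., random search or a tree-structured Parzen estimator as discussed in the cited hyperparameter-optimization literature.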
Keywords: optimization, algorithms, hyperparameters, recommendation systems, integral indicator
pp. 67—82
For citation:
Kulshin R. S., Sidorov A. A. Optimization of Hyperparameters of Recommendation Algorithms through a Single Evaluation Metric, Programmnaya Ingeneria, 2026, vol. 17, no. 2, pp. 67—82. DOI: 10.17587/prin.17.67-82. (in Russian).
References:
- Kulshin R. S., Sidorov A. A. An Entropy-Based Composite Indicator for Evaluating the Effectiveness of Recommender System Algorithms, Control Sciences, 2024, no. 4, pp. 44—60. DOI: 10.25728/pu.2024.4.4 (in Russian).
- Del Buono N., Esposito F., Selicato L. Methods for hyperparameters optimization in learning approaches: an overview, Machine Learning, Optimization, and Data Science: 6th International Conference, LOD 2020, Siena, Italy, July 19—23, 2020, Revised Selected Papers, Part I 6, Springer International Publishing, 2020, pp. 100—112. DOI: 10.1007/978-3-030-64583-0_11.
- Anafiev A. S., Karyuk A. S. Overview of approaches to solving the hyperparameters optimization problem for the Machine Learning algorithms, Taurida Journal of Computer Science Theory and Mathematics, 2022, no. 2 (55), pp. 30—37 (in Russian).
- Bischl B., Binder M., Lang M. et al. Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges, Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2023, vol. 13, no. 2, article e1484. DOI: 10.1002/widm.1484.
- Bergstra J., Bardenet R., Bengio Y. et al. Algorithms for hyper-parameter optimization, Advances in neural information processing systems, 2011, vol. 24.
- Bergstra J., Bengio Y. Random search for hyper-parameter optimization, The journal of machine learning research, 2012, vol. 13, no. 1, pp. 281—305.
- Belete D. M., Huchaiah M. D. Grid search in hyperparameter optimization of machine learning models for prediction of HIV/AIDS test results, International Journal of Computers and Applications, 2022, vol. 44, no. 9, pp. 875—886. DOI: 10.1080/1206212X.2021.1974663.
- Ros R., Hansen N. A simple modification in CMA-ES achieving linear time and space complexity, International conference on parallel problem solving from nature, Berlin, Heidelberg, Springer Berlin Heidelberg, 2008, pp. 296—305. DOI: 10.1007/978-3-540-87700-4_30.
- Bergstra J., Yamins D., Cox D. Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures, International conference on machine learning, PMLR, 2013, vol. 28, no. 1, pp. 115—123.
- Ozaki Y., Tanigaki Y., Watanabe S., Onishi M. Multiobjective tree-structured Parzen estimator for computationally expensive optimization problems, Proceedings of the 2020 genetic and evolutionary computation conference, 2020, pp. 533—541. DOI: 10.1145/3377930.3389817.
- Snoek J., Larochelle H., Adams R. Practical Bayesian optimization of machine learning algorithms, Advances in neural information processing systems, 2012, vol. 25. DOI: 10.48550/arXiv.1206.2944.
- Deshpande M., Karypis G. Item-based top-n recommendation algorithms, ACM Transactions on Information Systems (TOIS), 2004, vol. 22, no. 1, pp. 143—177. DOI: 10.1145/963770.963776.
- Ning X., Karypis G. Slim: Sparse linear methods for top-n recommender systems, 2011 IEEE 11th international conference on data mining, IEEE, 2011, pp. 497—506. DOI: 10.1109/ICDM.2011.134.
- Wu Y., DuBois C., Zheng A. X. et al. Collaborative denoising auto-encoders for top-n recommender systems, Proceedings of the ninth ACM international conference on web search and data mining, 2016, pp. 153—162. DOI: 10.1145/2835776.2835837.
- Tang J., Qu M., Wang M. et al. Line: Large-scale information network embedding, Proceedings of the 24th international conference on world wide web, 2015, pp. 1067—1077. DOI: 10.1145/2736277.2741093.
- Harper F., Konstan J. The movielens datasets: History and context, ACM Transactions on Interactive Intelligent Systems (TiiS), 2015, vol. 5, no. 4, pp. 1—19. DOI: 10.1145/2827872.
- Hou Y., Li J., He Z. et al. Bridging language and items for retrieval and recommendation, 2024, arXiv: 2403.03952.