Research on Machine Learning Optimization Algorithm Based on QUBO Model
2025 · Open Access · DOI: https://doi.org/10.54097/mjmj0580 · OpenAlex: W4411721356
This paper investigates machine learning optimization algorithms based on the QUBO (Quadratic Unconstrained Binary Optimization) model. The approach has three steps: first, discretize the continuous parameters of the machine learning model into binary variables and construct the QUBO objective function; next, optimize the QUBO model with a quantum annealing algorithm; finally, solve for the globally optimal solution by tuning the annealing parameters, namely the initial temperature, the temperature-reduction coefficient, and the number of iterations. The QUBO model can discretize the continuous variables (such as weights and biases) of machine learning models including AR, SVM, and CNN into binary variables, making nonlinear relationships easier to handle and reducing the computational complexity of training. The QUBO formulation also allows tighter control of the regularization term, helping to avoid overfitting. Moreover, the QUBO model can be solved in parallel by quantum computing or by the simulated annealing algorithm, which significantly improves computational efficiency on large-scale data.
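As a minimal sketch of the first step, the snippet below binary-encodes the weights of a least-squares regression with a fixed-point expansion w_j = Σ_k 2^{-k} q_{jk} and assembles the corresponding QUBO matrix. This is an illustration under stated assumptions, not the paper's implementation: the names `precision_vector` and `regression_qubo`, the bit width `n_bits`, and the `scale` parameter are all chosen for the example, and this particular encoding covers only nonnegative weights.

```python
import numpy as np

def precision_vector(n_bits, scale=1.0):
    """Fixed-point coefficients: w ≈ scale * sum_k 2^(-k) q_k with q_k in {0, 1}.

    This encoding represents only values in [0, scale); a sign or offset
    bit would be needed for signed weights.
    """
    return scale * 2.0 ** -np.arange(1, n_bits + 1)

def regression_qubo(X, y, n_bits=4, scale=1.0):
    """Build the QUBO matrix for least squares ||Xw - y||^2 with each
    weight binary-encoded into n_bits variables."""
    d = X.shape[1]
    c = precision_vector(n_bits, scale)
    # P maps the d * n_bits binary variables back to the d weights: w = P @ q.
    P = np.kron(np.eye(d), c)
    A = X @ P                          # effective design matrix over binary variables
    Q = A.T @ A                        # quadratic part of ||Aq - y||^2
    # Because q_i^2 = q_i for binary variables, the linear term -2 y^T A q
    # folds into the diagonal; the constant ||y||^2 is dropped.
    Q[np.diag_indices_from(Q)] -= 2.0 * (A.T @ y)
    return Q, P
```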
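The annealing step can likewise be sketched with classical simulated annealing, which exposes exactly the three parameters the paper tunes: the initial temperature, the temperature-reduction coefficient, and the number of iterations. The single-bit-flip update and the default values below (`t0`, `cooling`, `n_iters`) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def simulated_annealing_qubo(Q, t0=10.0, cooling=0.999, n_iters=20000, rng=None):
    """Minimise the QUBO energy q^T Q q by single-bit-flip simulated annealing.

    t0      -- initial temperature (assumed default)
    cooling -- multiplicative temperature-reduction coefficient (assumed default)
    n_iters -- number of iterations (assumed default)
    Q is assumed symmetric, as produced by regression_qubo above.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = Q.shape[0]
    q = rng.integers(0, 2, size=n)
    energy = float(q @ Q @ q)
    best_q, best_e = q.copy(), energy
    t = t0
    for _ in range(n_iters):
        i = rng.integers(n)
        # Exact energy change of flipping bit i, using q_i^2 = q_i.
        delta = (1 - 2 * q[i]) * (Q[i, i] + 2.0 * (Q[i] @ q - Q[i, i] * q[i]))
        # Metropolis rule: always accept improvements; accept uphill moves
        # with probability exp(-delta / t).
        if delta <= 0 or rng.random() < np.exp(-delta / t):
            q[i] ^= 1
            energy += delta
            if energy < best_e:
                best_q, best_e = q.copy(), energy
        t = max(t * cooling, 1e-12)   # geometric cooling, floored to avoid divide-by-zero
    return best_q, best_e
```

Under these assumptions, a quick end-to-end check with weights that are exactly representable at 4 bits:

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([0.75, 0.25, 0.5])               # weights representable at 4 bits
Q, P = regression_qubo(X, y, n_bits=4)
q, energy = simulated_annealing_qubo(Q, t0=5.0, cooling=0.999, n_iters=20000, rng=rng)
print("recovered weights:", P @ q)                # expected ≈ [0.75, 0.25, 0.5]
```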