This thesis proposes an adaptive linearized alternating direction method of multipliers (ADMM) that uses adaptive techniques to dynamically select the proximal regularization coefficient, thereby improving the convergence rate of the algorithm. The novelty of the method is that it exploits information at the current iterate to choose suitable parameters adaptively, which enlarges the admissible step size of the subproblems and accelerates convergence while still guaranteeing it.

The advantage of this approach is that it speeds up the algorithm as much as possible without sacrificing convergence. This matters because the traditional linearized ADMM faces a trade-off in choosing the proximal coefficient: a larger coefficient guarantees convergence but typically forces small step sizes, while a smaller coefficient allows larger iterative steps but may cause the algorithm to diverge. Selecting the parameter adaptively handles this balance better and improves the overall efficiency of the algorithm.

Overall, the method proposed in this thesis is of value to the field of matrix optimization and has a positive effect on the convergence speed and efficiency of the algorithm. It is hoped that this adaptive idea will bring new inspiration to the development of matrix optimization and promote research and applications in related fields.
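To make the trade-off concrete, the following sketch applies linearized ADMM to the basis-pursuit problem min ||x||_1 s.t. Ax = b. The classical choice fixes the proximal coefficient at the worst-case bound tau = beta * ||A||_2^2, which guarantees convergence but yields conservative steps; the adaptive variant shown here instead estimates tau from local curvature along the last step, safeguarded from below. This rule, and all names in the code, are illustrative assumptions for exposition, not the exact scheme of the thesis:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_admm(A, b, beta=1.0, iters=1000, adaptive=True):
    """Linearized ADMM for  min ||x||_1  s.t.  Ax = b.

    With adaptive=True, the proximal coefficient tau_k is estimated from the
    current iterate (a local curvature estimate along the last step) rather
    than fixed at the worst-case bound beta * ||A||_2^2.  This is only an
    illustrative adaptive rule, not the thesis's actual parameter scheme.
    """
    m, n = A.shape
    lip = beta * np.linalg.norm(A, 2) ** 2  # worst-case proximal coefficient
    x = np.zeros(n)
    y = np.zeros(m)        # multiplier for the constraint Ax = b
    x_prev = x.copy()
    tau = lip
    for k in range(iters):
        # Gradient of the smooth augmented-Lagrangian part at x:
        #   h(x) = y^T (Ax - b) + (beta/2) ||Ax - b||^2
        grad = A.T @ (y + beta * (A @ x - b))
        if adaptive and k > 0:
            s = x - x_prev
            denom = s @ s
            if denom > 1e-12:
                # Curvature of h along the last step; always <= lip.
                est = beta * np.linalg.norm(A @ s) ** 2 / denom
                # Conservative safeguard to keep the step size bounded.
                tau = min(max(est, 0.1 * lip), lip)
        x_prev = x
        # Linearized x-subproblem: prox step with step size 1/tau.
        x = soft_threshold(x - grad / tau, 1.0 / tau)
        # Multiplier (dual) update.
        y = y + beta * (A @ x - b)
    return x
```

A larger `tau` shrinks the prox step `1/tau` (safe but slow), while the adaptive estimate lets `tau` drop toward the curvature actually observed, enlarging the step where the worst-case bound is loose.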