|
DOI: |
CLC number: O221.2
Funding: National Natural Science Foundation of China (11101248); Natural Science Foundation of Shandong Province (ZR2010AQ026).
|
A DESCENT METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS |
DONG Li, ZHOU Jin-chuan
|
Abstract: |
This paper studies the unconstrained optimization problem. A new descent method is proposed that generates each new iterate from the information at the current and previous iterates by means of a curve search rule. Global convergence is proved under mild conditions, and a linear convergence rate is established when the objective function is uniformly convex. Preliminary numerical results show that the new method is efficient in practical computation.
Key words: unconstrained optimization; memory gradient method; curve search; convergence
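
The abstract only outlines the method, so the following Python sketch illustrates the general idea of a memory gradient iteration rather than the paper's actual algorithm: the combination coefficient beta, the descent safeguard, and the Armijo-type backtracking step (which stands in here for the paper's curve search rule) are illustrative assumptions, and the names memory_gradient, rho, and sigma are hypothetical.

# Minimal sketch of a memory-gradient-type descent method.
# Assumptions (not from the paper): a generic direction d_k = -g_k + beta*d_{k-1}
# with a safeguard keeping d_k a descent direction, and Armijo backtracking
# in place of the paper's curve search rule.
import numpy as np

def memory_gradient(f, grad, x0, tol=1e-6, max_iter=1000, rho=0.5, sigma=1e-4):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking search along d: shrink alpha until the Armijo
        # sufficient-decrease condition holds.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + sigma * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        # Memory term: mix the new negative gradient with the previous
        # direction; beta is damped so that d tends to remain a descent direction.
        beta = 0.5 * np.linalg.norm(g_new)**2 / (np.linalg.norm(g)**2 + 1e-16)
        d_new = -g_new + beta * d
        if g_new.dot(d_new) >= 0:            # safeguard: fall back to -g
            d_new = -g_new
        g, d = g_new, d_new
    return x

# Example usage: minimize a simple convex quadratic.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(memory_gradient(f, grad, np.array([5.0, -3.0])))

The reuse of the previous direction d is what distinguishes a memory gradient step from plain steepest descent; the specific choice of beta and of the curve search in the paper would determine the global convergence and linear-rate results stated in the abstract.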