A New Class of Memory Gradient Methods for Unconstrained Optimization Problems
TANG Jing-yong, HE Guo-ping, DONG Li

1. College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000, Henan, China; Department of Mathematics, Shanghai Jiaotong University, Shanghai 200240, China; 2. College of Information Science and Engineering, Shandong University of Science and Technology, Qingdao 266510, Shandong, China; 3. College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000, Henan, China
Abstract:
This paper studies the unconstrained optimization problem. By using information from the current and previous iterates to generate a descent direction and an Armijo line search to determine the step size, a new class of memory gradient methods is obtained. Global convergence and a linear convergence rate of the algorithm are proved under rather mild conditions. Numerical experiments show that the algorithm is effective.
Keywords: unconstrained optimization; memory gradient method; global convergence; linear convergence rate
CLC number: O221.2
Funding: Supported by the National Natural Science Foundation of China, the Natural Science Foundation of Shandong Province, and the Youth Scientific Research Foundation of Xinyang Normal University
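For orientation, the two ingredients named in the abstract, a memory gradient direction built from the current gradient and the previous search direction, and an Armijo line search for the step size, typically take the generic form shown below. This is only a sketch of the standard framework: the particular coefficient \beta_k analyzed in the paper and the conditions imposed on it are not reproduced here, and s, \rho, \sigma are placeholder parameters.

\[
d_k =
\begin{cases}
 -g_k, & k = 0,\\
 -g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\qquad
f(x_k + \alpha_k d_k) \le f(x_k) + \sigma \alpha_k g_k^{\mathrm{T}} d_k,
\]

where g_k = \nabla f(x_k), x_{k+1} = x_k + \alpha_k d_k, and \alpha_k is the largest value in \{s, s\rho, s\rho^2, \dots\} (s > 0, \rho, \sigma \in (0,1)) that satisfies the Armijo inequality.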
|
A NEW CLASS OF MEMORY GRADIENT METHODS FOR UNCONSTRAINED OPTIMIZATION PROBLEMS |
TANG Jing-yong, HE Guo-ping, DONG Li
TANG Jing-yong^(1,2), HE Guo-ping^3, DONG Li^1
(1. College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000, China; 2. Department of Mathematics, Shanghai Jiaotong University, Shanghai 200240, China; 3. College of Information Science and Engineering, Shandong University of Science and Technology, Qingdao 266510, China)
Abstract:
In this article, the unconstrained optimization problem is considered. By using information from the current and previous iterates to generate a descent direction and applying an Armijo line search to determine the step size, a new class of memory gradient methods is presented. Global convergence and a linear convergence rate are proved under some mild conditions. Numerical experiments show that the new method is efficient in practical computation.
Key words: unconstrained optimization; memory gradient method; global convergence; linear convergence rate
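As a concrete illustration of the procedure described in the abstract, the following Python sketch implements a generic memory gradient iteration with Armijo backtracking. It is not the paper's algorithm: the particular rule for the memory coefficient beta, the parameter names gamma, sigma and rho, and the stopping tolerance are assumptions chosen only to make the sketch self-contained and runnable.

```python
import numpy as np

def memory_gradient(f, grad, x0, gamma=0.5, sigma=1e-4, rho=0.5,
                    tol=1e-6, max_iter=1000):
    """Generic memory gradient method with an Armijo backtracking line search.

    Illustrative sketch only; the direction coefficient below is a common
    textbook choice, not the specific rule analyzed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo line search: shrink alpha until sufficient decrease holds
        alpha, fx, gTd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + sigma * alpha * gTd:
            alpha *= rho
        x = x + alpha * d
        g = grad(x)
        # Memory gradient direction: combine -g with the previous direction,
        # scaled so that g @ d <= -(1 - gamma) * ||g||^2 < 0 (descent).
        beta = gamma * np.linalg.norm(g) / max(np.linalg.norm(d), 1e-12)
        d = -g + beta * d
    return x

if __name__ == "__main__":
    # Usage example on a simple strictly convex quadratic, minimizer at 0.
    f = lambda x: 0.5 * float(x @ x)
    grad = lambda x: x
    print(memory_gradient(f, grad, np.array([3.0, -4.0])))
```

Scaling beta by ||g_k|| / ||d_{k-1}|| keeps the combined direction a descent direction for every gamma in (0, 1), which mirrors the kind of condition on the memory coefficient that descent and convergence arguments for such methods rely on.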