Mathematics and Statistics - The Conjugate Gradient Method

This article comes from the WeChat public account 科文路. You're welcome to follow and interact; please credit the source when reposting.

From the quick-reference handbook "Gradient Method / Conjugate Gradient Method / Line Search"; this article is Part 2.

2.2 Conjugate Gradient Method

This method builds on gradient descent.

Gradient descent always moves along the negative gradient, and that direction is not guaranteed to be the best one. The conjugate gradient method instead builds each descent direction, from the second step onward, out of the current negative gradient and the previous search direction, so that successive directions are conjugate to one another; this greatly improves the efficiency of the descent.
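
For a quadratic model $f(x) = \frac{1}{2}x^T A x - b^T x$ with $A$ symmetric positive definite, "conjugate" has a precise meaning: two search directions $d^i$ and $d^j$ are conjugate with respect to $A$ when

$$(d^i)^T A\, d^j = 0, \qquad i \neq j$$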


$$\begin{cases} x^{k+1} = x^k + \alpha d^k \\ d^k = -g^k + \beta d^{k-1} \\ \beta = \dfrac{(g^k)^T g^k}{(g^{k-1})^T g^{k-1}} \end{cases}$$

where $g^k = \nabla f(x^k)$ and the first direction is $d^0 = -g^0$. This choice of $\beta$ is the Fletcher-Reeves formula.
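
In more than one dimension, the same ratio for $\beta$ appears in the classic linear conjugate gradient algorithm, which minimizes a quadratic $f(x) = \frac{1}{2}x^T A x - b^T x$ (equivalently, solves $Ax = b$) and can use the exact, closed-form step size along each direction. Below is a minimal NumPy sketch; the matrix A, the vector b, and the tolerance are illustrative assumptions, not values from this series.

import numpy as np

# Illustrative SPD system (an assumption for this sketch):
# minimizing f(x) = 0.5 x^T A x - b^T x is the same as solving A x = b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.zeros(2)
r = b - A @ x                  # residual = negative gradient at x
d = r.copy()                   # first direction: steepest descent
for k in range(len(b)):        # exact CG finishes in at most n steps
    Ad = A @ d
    alpha = (r @ r) / (d @ Ad)          # exact minimizer along d
    x = x + alpha * d
    r_new = r - alpha * Ad
    if np.linalg.norm(r_new) < 1e-10:   # gradient (residual) vanished
        break
    beta = (r_new @ r_new) / (r @ r)    # the same Fletcher-Reeves ratio
    d = r_new + beta * d                # conjugate direction update
    r = r_new

print("minimizer:", x)         # x satisfies A x ≈ b

The one-dimensional script below applies the same recurrence to a non-quadratic function, with a fixed step size instead of the exact one.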

x_k = 6                    # the algorithm starts at x = 6
alpha = 0.01               # step size multiplier
precision = 0.00001        # stop once steps get smaller than this
previous_step_size = 1
max_iters = 10000          # maximum number of iterations
iters = 0                  # iteration counter
beta = 0

df = lambda x: 4 * x**3 - 9 * x**2   # derivative of f(x) = x**4 - 3*x**3

g_k = df(x_k)

while previous_step_size > precision and iters < max_iters:
    prev_x_k = x_k
    prev_g_k = g_k
    if iters == 0:
        d_k = -g_k                               # first step: steepest descent
    else:
        g_k = df(x_k)                            # gradient at the current point
        beta = g_k * g_k / (prev_g_k * prev_g_k) # Fletcher-Reeves beta
        d_k = -g_k + beta * d_k                  # conjugate direction update
    x_k += alpha * d_k
    previous_step_size = abs(x_k - prev_x_k)
    iters += 1

print("The local minimum occurs at", x_k)
print("The number of iterations is", iters)
The local minimum occurs at 2.2500110335395793
The number of iterations is 22

As the output shows, the result is about as accurate as before while the number of iterations drops sharply; in other words, the amount of computation is significantly reduced.
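
For reference, a plain gradient-descent loop on the same function, with the same starting point, step size, and stopping rule (presumably the setup used in the first part of this series), looks like this; running both and comparing the iteration counts makes the saving concrete.

x_k = 6                    # same starting point
alpha = 0.01               # same step size
precision = 0.00001
previous_step_size = 1
max_iters = 10000
iters = 0

df = lambda x: 4 * x**3 - 9 * x**2

while previous_step_size > precision and iters < max_iters:
    prev_x_k = x_k
    x_k -= alpha * df(x_k)             # step along the negative gradient only
    previous_step_size = abs(x_k - prev_x_k)
    iters += 1

print("The local minimum occurs at", x_k)
print("The number of iterations is", iters)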

Next time we will go one step further and look at line search.

Since you've read this far, why not follow 科文路 for its daily posts and join the conversation~

数学与统计-共轭梯度法

https://xlindo.com/kewenlu2022/posts/344d3a55/

Author

xlindo

Posted on

2022-04-21

Updated on

2023-05-10

Licensed under

Comments