AI Basics (1): Gradient, Jacobian matrix and Hessian matrix
This article comes from the WeChat public account "科文路". You are welcome to follow and interact. Please credit the source when reposting.
Gradient, Jacobian matrix and Hessian matrix
Over the past two weeks I sat on the interviewer's side of several interviews. My strongest impression: fresh graduates have serious gaps in their foundational math knowledge. So I decided to walk through the AI fundamentals I consider important.

These notes are written in English (of modest quality) to help you get familiar with the terminology.
1 Gradient
The gradient of a scalar-valued function $f(x_1, \dots, x_n)$ is the vector of its partial derivatives:

$$\nabla f = \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right)$$

e.g. in the Cartesian coordinate system,

$$\nabla f = \frac{\partial f}{\partial x}\mathbf{i} + \frac{\partial f}{\partial y}\mathbf{j} + \frac{\partial f}{\partial z}\mathbf{k}$$

The directional derivative of $f$ along the $x$ axis is exactly the $x$-axis component of the gradient, $\partial f / \partial x$.

Attention: note the relationship to the ordinary derivative. For a function of a single variable, the gradient reduces to the derivative $f'(x)$.
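The definition above can be checked numerically with central differences. A minimal sketch using NumPy; the function $f(x, y) = x^2 + 3y$ is an illustrative choice, not from the article:

```python
import numpy as np

def f(v):
    # illustrative scalar field f(x, y) = x**2 + 3*y; analytic gradient is (2x, 3)
    x, y = v
    return x**2 + 3*y

def numerical_gradient(f, v, h=1e-6):
    """Central-difference approximation of the gradient of f at point v."""
    v = np.asarray(v, dtype=float)
    grad = np.zeros_like(v)
    for i in range(v.size):
        e = np.zeros_like(v)
        e[i] = h  # perturb only coordinate i
        grad[i] = (f(v + e) - f(v - e)) / (2 * h)
    return grad

g = numerical_gradient(f, [1.0, 2.0])
# at (1, 2) the analytic gradient is (2, 3)
```

Each component of the result is the directional derivative along one coordinate axis, which is exactly the point made above.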
2 Jacobian matrix
The importance of the Jacobian matrix is that it gives the best linear approximation of a differentiable map near a given point. In this sense, the Jacobian is the analogue of the derivative for multivariate functions.
The Jacobian of a vector-valued function in several variables generalizes the gradient of a scalar-valued function in several variables, which in turn generalizes the derivative of a scalar-valued function of a single variable. If $f: \mathbb{R}^n \to \mathbb{R}^m$, its Jacobian is the $m \times n$ matrix $J$ with entries $J_{ij} = \partial f_i / \partial x_j$, and near a point $x_0$ it gives the best linear approximation

$$f(x) \approx f(x_0) + J(x_0)\,(x - x_0)$$

This approximation specializes to the approximation of a scalar function of a single variable by its Taylor polynomial of degree one, namely

$$f(x) \approx f(x_0) + f'(x_0)\,(x - x_0)$$

The Jacobian matrix represents the differential of $f$ at every point where $f$ is differentiable.
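Both the Jacobian and its role as a linear approximation can be verified numerically. A sketch with NumPy; the vector field $f(x, y) = (x^2 y,\ 5x + \sin y)$ is an illustrative choice, not from the article:

```python
import numpy as np

def f(v):
    # illustrative vector field f(x, y) = (x**2 * y, 5*x + sin(y))
    # analytic Jacobian: [[2*x*y, x**2], [5, cos(y)]]
    x, y = v
    return np.array([x**2 * y, 5*x + np.sin(y)])

def numerical_jacobian(f, v, h=1e-6):
    """Central-difference approximation of the m x n Jacobian, J[i, j] = d f_i / d x_j."""
    v = np.asarray(v, dtype=float)
    cols = []
    for j in range(v.size):
        e = np.zeros_like(v)
        e[j] = h  # perturb only input coordinate j
        cols.append((f(v + e) - f(v - e)) / (2 * h))
    return np.stack(cols, axis=1)

x0 = np.array([1.0, 2.0])
J = numerical_jacobian(f, x0)

# best linear approximation near x0: f(x0 + dx) ~ f(x0) + J @ dx
dx = np.array([1e-3, -1e-3])
approx = f(x0) + J @ dx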
3 Hessian
The Hessian of a scalar-valued function $f(x_1, \dots, x_n)$ is the square matrix of its second-order partial derivatives. That is, for every pair of directions of change $(x_i, x_j)$, entry $(i, j)$ records how the slope along $x_i$ changes as we move along $x_j$:

$$H_{ij} = \frac{\partial^2 f}{\partial x_i\, \partial x_j}$$

When the second partial derivatives are continuous, the Hessian is symmetric.
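The Hessian can likewise be approximated with finite differences, one second-order difference per pair of directions. A sketch with NumPy; the function $f(x, y) = x^3 + 2xy + y^2$ is an illustrative choice, not from the article:

```python
import numpy as np

def f(v):
    # illustrative scalar field f(x, y) = x**3 + 2*x*y + y**2
    # analytic Hessian: [[6*x, 2], [2, 2]]
    x, y = v
    return x**3 + 2*x*y + y**2

def numerical_hessian(f, v, h=1e-4):
    """Central-difference approximation of H[i, j] = d^2 f / (dx_i dx_j)."""
    v = np.asarray(v, dtype=float)
    n = v.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # second difference along directions i and j
            H[i, j] = (f(v + ei + ej) - f(v + ei - ej)
                       - f(v - ei + ej) + f(v - ei - ej)) / (4 * h**2)
    return H

H = numerical_hessian(f, [1.0, 1.0])
# at (1, 1) the analytic Hessian is [[6, 2], [2, 2]], and it is symmetric
```

The symmetry of the result reflects the equality of mixed partials for smooth functions, noted above.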