Automatic Differentiation: Gradient Cheatsheet

The rust autograd package brings TensorFlow- and PyTorch-style automatic differentiation facilities to Rust. I am still familiarizing myself with the project, so I am writing down the calculus rules it relies on to understand the math a little better:

f(x)          f'(x)
sin(g(x))     g'(x)*cos(g(x))
cos(g(x))     -g'(x)*sin(g(x))
abs(g(x))     g(x)*g'(x)/abs(g(x))
c*g(x)        c*g'(x)
-g(x)         -g'(x)
g(x)^c        c*(g(x)^(c-1))*g'(x)
g(x)^2        2*g(x)*g'(x)
g(x)^-1       -g'(x)/(g(x)^2)
sqrt(g(x))    g'(x)/(2*sqrt(g(x)))
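To check my understanding of how these chain-rule entries compose, here is a minimal forward-mode autodiff sketch using dual numbers in plain Rust. It is not the autograd crate's API, just a self-contained illustration of the same rules; the `Dual` type and its methods are my own names:

```rust
// Forward-mode autodiff with dual numbers: each value carries (g(x), g'(x)).
#[derive(Clone, Copy, Debug)]
struct Dual {
    v: f64, // value g(x)
    d: f64, // derivative g'(x)
}

impl Dual {
    // Seed the input variable x with derivative 1.
    fn var(x: f64) -> Dual {
        Dual { v: x, d: 1.0 }
    }

    // sin(g(x)) -> g'(x)*cos(g(x))
    fn sin(self) -> Dual {
        Dual { v: self.v.sin(), d: self.d * self.v.cos() }
    }

    // cos(g(x)) -> -g'(x)*sin(g(x))
    fn cos(self) -> Dual {
        Dual { v: self.v.cos(), d: -self.d * self.v.sin() }
    }

    // sqrt(g(x)) -> g'(x)/(2*sqrt(g(x)))
    fn sqrt(self) -> Dual {
        Dual { v: self.v.sqrt(), d: self.d / (2.0 * self.v.sqrt()) }
    }

    // g(x)^c -> c*(g(x)^(c-1))*g'(x)
    fn powi(self, c: i32) -> Dual {
        Dual { v: self.v.powi(c), d: (c as f64) * self.v.powi(c - 1) * self.d }
    }
}

fn main() {
    let x = Dual::var(0.5);

    // d/dx sin(x^2) = 2x*cos(x^2): the power rule and sine rule compose
    // automatically because each method applies its own chain-rule factor.
    let y = x.powi(2).sin();
    println!("value = {:.6}, derivative = {:.6}", y.v, y.d);

    // Cross-check against a central finite difference.
    let f = |t: f64| (t * t).sin();
    let h = 1e-6;
    let fd = (f(0.5 + h) - f(0.5 - h)) / (2.0 * h);
    assert!((y.d - fd).abs() < 1e-6);
}
```

Each method multiplies by `self.d`, which is exactly where the `g'(x)` factor in the table comes from: composing operations composes the chain rule.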