Python – tensorflow.GradientTape.gradient()
TensorFlow is an open-source Python library designed by Google to develop machine learning models and deep learning neural networks.
gradient() is used to compute the gradient using operations recorded in the context of this tape.
Syntax: gradient(target, sources, output_gradients, unconnected_gradients)
Parameters:
- target: It is a Tensor or a list of Tensors to be differentiated.
- sources: It is a Tensor or a list of Tensors. The target values are differentiated against the sources.
- output_gradients: It is a list of gradients, one for each element of target. Default value is None.
- unconnected_gradients: Its value can be either 'none' or 'zero', with default value 'none'. It controls what is returned when a source is not connected to the target (see the sketch after this list).
Returns: It returns a list or nested structure of Tensors, one for each element in sources.
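Since sources may be a list, gradient() then returns a list with one gradient per source, and unconnected_gradients decides what to return for a source that does not contribute to the target. Below is a minimal sketch of these two parameters, assuming TensorFlow 2.x; the tensor names x, z and y are illustrative.
Python3
# Importing the library
import tensorflow as tf

x = tf.constant(2.0)
z = tf.constant(5.0)

# persistent=True allows gradient() to be called more than once on the same tape
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    tape.watch(z)
    y = x * x  # y depends on x but not on z

# A list of sources returns a list of gradients: [dy/dx, dy/dz]
grads = tape.gradient(y, [x, z])
print(grads)  # dy/dx is 4.0; z is unconnected, so its gradient is None

# unconnected_gradients='zero' returns a zero tensor instead of None
grad_z = tape.gradient(y, z, unconnected_gradients=tf.UnconnectedGradients.ZERO)
print(grad_z)  # tf.Tensor(0.0, shape=(), dtype=float32)

del tape  # release the resources held by the persistent tape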
Example 1:
Python3
# Importing the library
import tensorflow as tf

x = tf.constant(4.0)

# Using GradientTape
with tf.GradientTape() as gfg:
    gfg.watch(x)
    y = x * x * x

# Computing gradient
res = gfg.gradient(y, x)

# Printing result
print("res: ", res)
Output:
res: tf.Tensor(48.0, shape=(), dtype=float32)
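Here y = x³, so res = dy/dx = 3x² = 3 · (4.0)² = 48.0.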
Example 2:
Python3
# Importing the library
import tensorflow as tf

x = tf.constant(4.0)

# Using GradientTape
with tf.GradientTape() as gfg:
    gfg.watch(x)

    # Using nested GradientTape for
    # calculating higher order derivative
    with tf.GradientTape() as gg:
        gg.watch(x)
        y = x * x * x

    # Computing first order gradient
    first_order = gg.gradient(y, x)

# Computing second order gradient
second_order = gfg.gradient(first_order, x)

# Printing result
print("first_order: ", first_order)
print("second_order: ", second_order)
Output:
first_order: tf.Tensor(48.0, shape=(), dtype=float32)
second_order: tf.Tensor(24.0, shape=(), dtype=float32)
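For y = x³, the first-order derivative is 3x² = 48.0 and the second-order derivative is 6x = 24.0 at x = 4.0, which matches the printed tensors.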