2017-07-16

Error when computing second-order derivatives in TensorFlow

I am training a model that needs to compute second-order derivatives (i.e., the gradient of a gradient). Here is a small snippet of what it does:

mapping_loss = tf.losses.sparse_softmax_cross_entropy(
    1 - adversary_label, adversary_logits) 
adversary_loss = tf.losses.sparse_softmax_cross_entropy(
    adversary_label, adversary_logits) 

# Doesn't work with tf.nn.softmax_cross_entropy_with_logits either:
'''
mapping_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(1 - adversary_label, 2), logits=adversary_logits, name='loss1'))
adversary_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(adversary_label, 2), logits=adversary_logits, name='loss2'))
'''

grads_target = tf.gradients(mapping_loss, list(target_vars.values()))
grads_adv = tf.gradients(adversary_loss, list(adversary_vars.values()))

grads_all = grads_target + grads_adv

reg = 0.5 * sum(tf.reduce_sum(tf.square(g)) for g in grads_all)
Jgrads_target = tf.gradients(reg, list(target_vars.values()))
Jgrads_adv = tf.gradients(reg, list(adversary_vars.values()))
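For reference, the double `tf.gradients` pattern itself (gradient penalty `reg` built from first-order gradients, then differentiated again) is fine whenever every op in the graph has a registered second-order gradient. A minimal self-contained sketch with a toy quadratic loss (variable names here are illustrative, not from the question; `tf.compat.v1` is used so it runs in graph mode):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Toy variable and a quadratic loss: loss = sum(w^2).
w = tf.Variable(np.array([3.0], dtype=np.float32))
loss = tf.reduce_sum(tf.square(w))

grads = tf.gradients(loss, [w])                  # d(loss)/dw = 2w
reg = 0.5 * tf.reduce_sum(tf.square(grads[0]))   # 0.5 * (2w)^2 = 2w^2
second = tf.gradients(reg, [w])                  # d(reg)/dw = 4w

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    second_val = sess.run(second[0])             # 4 * 3.0 = [12.]
```

The error in the question arises only because the gradient op of the fused cross-entropy kernel emits a `PreventGradient`, which blocks exactly this second `tf.gradients` call.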

I get the following error:

Traceback (most recent call last): 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 455, in gradients 
    grad_fn = ops.get_gradient_function(op) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1682, in get_gradient_function 
    return _gradient_registry.lookup(op_type) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/registry.py", line 93, in lookup 
    "%s registry has no entry for: %s" % (self._name, name)) 
LookupError: gradient registry has no entry for: PreventGradient 

During handling of the above exception, another exception occurred: 

Traceback (most recent call last): 
    File "tools/train_adda.py", line 215, in <module> 
    main() 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 722, in __call__ 
    return self.main(*args, **kwargs) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 697, in main 
    rv = self.invoke(ctx) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 895, in invoke 
    return ctx.invoke(self.callback, **ctx.params) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 535, in invoke 
    return callback(*args, **kwargs) 
    File "tools/train_adda.py", line 137, in main 
    Jgrads_target = tf.gradients(reg, list(target_vars.values())) 
    File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 459, in gradients 
    (op.name, op.type)) 
LookupError: No gradient defined for operation 'gradients_1/sparse_softmax_cross_entropy_loss_1/xentropy/xentropy_grad/PreventGradient' (op type: PreventGradient) 

Answer
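(The answer body is not included in this extract.) One workaround commonly suggested for this class of error, sketched here as an assumption rather than as the original answer: build the cross-entropy manually from `tf.nn.log_softmax`, whose gradient graph is made of ordinary differentiable ops, so a second `tf.gradients` call goes through without hitting `PreventGradient`. Variable names mirror the question; `tf.compat.v1` keeps the snippet in graph mode:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Stand-in values for the question's tensors (illustrative only).
adversary_logits = tf.Variable(np.array([[2.0, -1.0]], dtype=np.float32))
adversary_label = tf.constant([0])

# Manual cross-entropy from log-softmax: log_softmax, multiply and
# reduce_sum all have gradients that are themselves differentiable.
labels_one_hot = tf.one_hot(adversary_label, 2)
log_probs = tf.nn.log_softmax(adversary_logits)
adversary_loss = -tf.reduce_mean(
    tf.reduce_sum(labels_one_hot * log_probs, axis=-1))

grads = tf.gradients(adversary_loss, [adversary_logits])
reg = 0.5 * tf.reduce_sum(tf.square(grads[0]))
second = tf.gradients(reg, [adversary_logits])   # no PreventGradient error

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    second_val = sess.run(second[0])
```

The same substitution applies to `mapping_loss` with `1 - adversary_label`. (Later TensorFlow releases also reworked the fused xentropy gradient so that second-order derivatives are supported directly.)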