
Here is my code — tf.nn.sparse_softmax_cross_entropy_with_logits raises a rank error:

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([0, 0, 1])
    x = tf.constant([0, 1, 0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())

It produces the following error:

ValueError        Traceback (most recent call last) 
<ipython-input-10-28a8854a9457> in <module>() 
     4  y = tf.constant([0,0,1]) 
     5  x = tf.constant([0,1,0]) 
----> 6  r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x) 
     7  sess.run() 
     8  print(r.eval()) 

~\AppData\Local\conda\conda\envs\tensorflow\lib\site-packages\tensorflow\python\ops\nn_ops.py in sparse_softmax_cross_entropy_with_logits(_sentinel, labels, logits, name) 
    1687  raise ValueError("Rank mismatch: Rank of labels (received %s) should " 
    1688      "equal rank of logits minus 1 (received %s)." % 
-> 1689      (labels_static_shape.ndims, logits.get_shape().ndims)) 
    1690  # Check if no reshapes are required. 
    1691  if logits.get_shape().ndims == 2: 

ValueError: Rank mismatch: Rank of labels (received 1) should equal rank of logits minus 1 (received 1). 

Can someone help me understand what is going on here? Computing the softmax and the cross-entropy by hand is fairly straightforward.

Also, how would I use this function if I need to feed batches into it (2-D arrays)?
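To illustrate the shapes the function expects (logits of shape [batch, num_classes], integer class-index labels of shape [batch]) and the math it performs, here is a plain-NumPy sketch — not TensorFlow's implementation, just the same computation:

```python
import numpy as np

def sparse_softmax_cross_entropy(logits, labels):
    """Manual sketch of what sparse softmax cross-entropy computes.

    logits: shape [batch, num_classes], unnormalized scores.
    labels: shape [batch], integer class indices.
    Returns one loss value per example: -log(softmax(logits_i)[label_i]).
    """
    logits = np.asarray(logits, dtype=np.float64)
    # Numerically stable log-softmax: shift by the row max before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Pick out -log p(correct class) for each row.
    return -log_probs[np.arange(len(labels)), labels]

# A batch of two examples, three classes each.
losses = sparse_softmax_cross_entropy([[0.0, 1.0, 0.0], [2.0, 0.0, 0.0]], [1, 0])
print(losses)  # approximately [0.5514, 0.2395]
```

Note that the labels are class indices (rank 1), not one-hot vectors — that is exactly why the original `y = tf.constant([0,0,1])` fails the rank check.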

UPDATE

I also tried:

import tensorflow as tf 

with tf.Session() as sess: 
    y = tf.constant([1]) 
    x = tf.constant([0,1,0]) 
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x) 
    sess.run() 
    print(r.eval()) 

and it produces the same error.


Not sure, but have you tried it with a rank-2 tensor? Softmax is usually used for multi-class problems. –

Answer


Fixed it. x needs to be a rank-2 tensor (a batch of logit vectors):

with tf.Session() as sess:
    y = tf.constant([1])                                 # shape [1]: one integer class index
    x = tf.expand_dims(tf.constant([0.0, 1.0, 0.0]), 0)  # shape [1, 3]: a batch of one logit vector
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    print(r.eval())