I want to understand how tf.nn.sparse_softmax_cross_entropy_with_logits works. The documentation says: "A common use case is to have logits of shape [batch_size, num_classes] and labels of shape [batch_size]. But higher dimensions are supported."
I need to convert my one-hot encodings into classes represented by unique integers. The one-hot encoding was created with the following code:

from sklearn.preprocessing import OneHotEncoder
enc = OneHotEncoder()
labels = [[1],[2],[3]]
enc.fit(labels)
for x in [1,2,3]:
    print(enc.transform([[x]]).toarray())
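A minimal sketch of the conversion I have in mind (the one-hot matrix below is a hypothetical stand-in for the encoder's output; classes 1, 2, 3 map to columns 0, 1, 2): np.argmax along axis 1 recovers the column index of the single 1 in each row, which gives one integer per sample, the shape-[batch_size] form that sparse_softmax_cross_entropy_with_logits expects for labels.

```python
import numpy as np

# Hypothetical one-hot rows like those produced by the OneHotEncoder above.
one_hot = np.array([
    [1., 0., 0.],   # class 1
    [0., 1., 0.],   # class 2
    [0., 0., 1.],   # class 3
])

# argmax over axis 1 returns the index of the 1 in each row,
# i.e. an integer class id per sample.
labels = np.argmax(one_hot, axis=1)
print(labels)  # → [0 1 2]
```

Note that argmax yields 0-based column indices, not the original labels 1..3; if the original label values matter, they would have to be looked up from the fitted encoder's categories.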