I want to do the following in Theano: compute the gradient with respect to only a part of a shared variable array.
import theano
import numpy
import theano.tensor as T

a = T.fvector('a')
w = theano.shared(numpy.array([1, 2, 3, 4], dtype=theano.config.floatX))
w_sub = w[1]
b = T.sum(a * w)
grad = T.grad(b, w_sub)  # raises DisconnectedInputError (full traceback below)
Here, w_sub is for example w[1], but I do not want to explicitly write b as a function of w_sub. I have gone through http://deeplearning.net/software/theano/tutorial/faq_tutorial.html and other related questions, but I have not been able to solve it.
This is just to demonstrate my problem. What I actually want to do is sparse convolution with Lasagne. The zero entries of the weight matrix do not need to be updated, so there is no need to compute the gradient of w for those entries.
Kind regards, and thanks in advance!
Jeroen
PS: here is the full error message:
Traceback (most recent call last):
File "D:/Jeroen/Project_Lasagne_General/test_script.py", line 9, in <module>
grad = T.grad(b, w_sub)
File "C:\Anaconda2\lib\site-packages\theano\gradient.py", line 545, in grad
handle_disconnected(elem)
File "C:\Anaconda2\lib\site-packages\theano\gradient.py", line 532, in handle_disconnected
raise DisconnectedInputError(message)
theano.gradient.DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: Subtensor{int64}.0
Backtrace when the node is created:
File "D:/Jeroen/Project_Lasagne_General/test_script.py", line 6, in <module>
w_sub = w[1]