
Neural Network Initialization - Nguyen Widrow Implementation?

I've implemented the Nguyen-Widrow algorithm (below) and it appears to work correctly, but I have a few follow-up questions:

  • Does this look like a correct implementation?

  • Does Nguyen-Widrow initialization apply to any network topology/size? (e.g. a 5-layer AutoEncoder)

  • Does Nguyen-Widrow initialization apply to any input range? (0/1, -1/+1, etc.)

  • Does Nguyen-Widrow initialization apply to any activation function? (e.g. logistic, tanh, linear)

The code below assumes the network weights have already been randomly initialized in the -1/+1 range:

    ' Calculate the number of hidden neurons
    Dim HiddenNeuronsCount As Integer = Me.TotalNeuronsCount - (Me.InputsCount + Me.OutputsCount)

    ' Calculate the Beta value for all hidden layers
    Dim Beta As Double = 0.7 * Math.Pow(HiddenNeuronsCount, 1.0 / Me.InputsCount)

    ' Loop through each layer in the neural network, skipping the input layer
    For i As Integer = 1 To Layers.GetUpperBound(0)

        ' Loop through each neuron in the layer
        For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

            Dim InputsNorm As Double = 0

            ' Loop through each weight in the neuron's inputs, add the squared weight value to InputsNorm
            For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                InputsNorm += Layers(i).Neurons(j).ConnectionWeights(k) * Layers(i).Neurons(j).ConnectionWeights(k)
            Next

            ' Add the squared bias value to InputsNorm
            InputsNorm += Layers(i).Neurons(j).Bias * Layers(i).Neurons(j).Bias

            ' Finalize the Euclidean norm calculation
            InputsNorm = Math.Sqrt(InputsNorm)

            ' Loop through each weight in the neuron's inputs, scale the weight based on the Euclidean norm and Beta
            For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                Layers(i).Neurons(j).ConnectionWeights(k) = (Beta * Layers(i).Neurons(j).ConnectionWeights(k)) / InputsNorm
            Next

            ' Scale the bias based on the Euclidean norm and Beta
            Layers(i).Neurons(j).Bias = (Beta * Layers(i).Neurons(j).Bias) / InputsNorm

        Next

    Next
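
For reference, this is the scaling the code above is meant to perform, written out as a short sketch of the usual Nguyen-Widrow formulas (h and n are my own symbols for HiddenNeuronsCount and Me.InputsCount, and w_j is the incoming weight vector of neuron j, with the bias folded into the norm as in the code):

    \beta = 0.7 \cdot h^{1/n}
    w_j \leftarrow \beta \cdot w_j / \lVert w_j \rVert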

Answer


In their paper, Nguyen & Widrow assume that the inputs are between -1 and +1. Nguyen-Widrow initialization is valid for any activation function that is finite in length. Again, their paper only discusses a 2-layer NN, so I'm not sure about a 5-layer one.
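
If your inputs are in some other range, such as 0/1, a minimal rescaling sketch in VB.NET would look like the following (the Inputs array name is an assumption for illustration, not part of the question's code):

    ' Map inputs from the 0/1 range into the -1/+1 range assumed by Nguyen & Widrow
    For n As Integer = 0 To Inputs.GetUpperBound(0)
        Inputs(n) = (Inputs(n) * 2.0) - 1.0
    Next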



If you have more layers, just apply the algorithm to each layer. Check [this question](https://stackoverflow.com/questions/13689765/weight-initialisation). – Luis 2016-06-27 17:49:56
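
A minimal sketch of what "apply the algorithm to each layer" could look like with the question's own Layers/Neurons/ConnectionWeights structures, where each layer gets its own Beta computed from that layer's neuron count and fan-in; this is my reading of the suggestion above, not code from either post:

    ' Per-layer Nguyen-Widrow: recompute Beta for every layer from its own
    ' neuron count and fan-in, then rescale that layer's weights and biases.
    For i As Integer = 1 To Layers.GetUpperBound(0)

        ' Layer size and fan-in (assumes every neuron in the layer has the same fan-in)
        Dim NeuronsInLayer As Integer = Layers(i).Neurons.GetUpperBound(0) + 1
        Dim FanIn As Integer = Layers(i).Neurons(0).ConnectionWeights.GetUpperBound(0) + 1

        ' Beta for this layer only
        Dim LayerBeta As Double = 0.7 * Math.Pow(NeuronsInLayer, 1.0 / FanIn)

        For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

            ' Euclidean norm of this neuron's incoming weights (bias included, as in the question)
            Dim Norm As Double = 0
            For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                Norm += Layers(i).Neurons(j).ConnectionWeights(k) ^ 2
            Next
            Norm += Layers(i).Neurons(j).Bias ^ 2
            Norm = Math.Sqrt(Norm)

            ' Rescale the weights and bias with this layer's Beta
            For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                Layers(i).Neurons(j).ConnectionWeights(k) = LayerBeta * Layers(i).Neurons(j).ConnectionWeights(k) / Norm
            Next
            Layers(i).Neurons(j).Bias = LayerBeta * Layers(i).Neurons(j).Bias / Norm

        Next

    Next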