Neural network problem: the backpropagation algorithm
-
I don't know where to post this, but I think this is the right place. How do I calculate the error of a hidden node once I've calculated the error of the output layer and adjusted its weights? I don't know how to do this. Please help me. If you want I could post all the classes too. This is just part of the learning algorithm; the main loop that feeds in the data comes before this, and the float array answerOfData refers to the expected output of each output node.
public int BackpropagationLA3(float[] answerOfData)
{
    this.CalcNet();
    // last layer is at index Length - 1
    NeuralNetworkLayer outputLayer = this.Layers[this.Layers.Length - 1];

    // Err_i = T_i - O_i  (target minus actual output)
    float[] Erre = new float[outputLayer.neurons.Length];
    for (int i = 0; i < Erre.Length; i++)
    {
        Erre[i] = answerOfData[i] - outputLayer[i].output;
    }

    // W_j,i = W_j,i + alfa * a_j * Err_i * g'(in_i)
    for (int i = 0; i < Erre.Length; i++)
    {
        for (int j = 0; j < outputLayer[i].weights.Length; j++)
        {
            // outputLayer[i][j].Output is meant to be a_j, the j-th input activation of neuron i
            outputLayer[i].weights[j] += alfa * outputLayer[i][j].Output * Erre[i]
                                       * Neuron.deriavateActivationFunc(outputLayer[i].Sum());
        }
    }

    // delta_j = g'(in_j) * Sigma_i W_j,i * delta_i
    for (int l = Layers.Length - 2; l > 0; l--)
    {
        [red]// I'm stuck here -- how do I get the error/delta for the hidden neurons?[/red]
        // Neuron.deriavateActivationFunc(...) gives me g'(in_j), but what do I sum over?
    }
    return 0;
}
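Here is roughly what I think the hidden-layer pass should look like, written against the same names as above (neurons, weights, output, Sum(), deriavateActivationFunc, alfa). This is only a sketch: it assumes Layers[0] is the input layer, Layers[Layers.Length - 1] is the output layer, and that neurons[j].weights[k] is the weight coming from neuron k of the layer below; if the class layout is different this won't drop straight in.

// Sketch of the hidden-layer backward pass, meant to replace the loop where I'm stuck.
// First the output deltas: delta_i = Err_i * g'(in_i)
float[] delta = new float[outputLayer.neurons.Length];
for (int i = 0; i < delta.Length; i++)
{
    delta[i] = Erre[i] * Neuron.deriavateActivationFunc(outputLayer.neurons[i].Sum());
}

// Walk backwards over the hidden layers (assuming Layers[0] is the input layer)
for (int l = Layers.Length - 2; l > 0; l--)
{
    NeuralNetworkLayer hidden = Layers[l];
    NeuralNetworkLayer above = Layers[l + 1];
    float[] hiddenDelta = new float[hidden.neurons.Length];

    for (int j = 0; j < hidden.neurons.Length; j++)
    {
        // Sigma_i W_j,i * delta_i : sum over every neuron i in the layer ABOVE,
        // using the weight that connects hidden neuron j to it
        float sum = 0f;
        for (int i = 0; i < above.neurons.Length; i++)
        {
            sum += above.neurons[i].weights[j] * delta[i];
        }
        // delta_j = g'(in_j) * Sigma_i W_j,i * delta_i
        hiddenDelta[j] = Neuron.deriavateActivationFunc(hidden.neurons[j].Sum()) * sum;
    }

    // W_k,j = W_k,j + alfa * a_k * delta_j  (a_k = output of neuron k in the layer below)
    for (int j = 0; j < hidden.neurons.Length; j++)
    {
        for (int k = 0; k < hidden.neurons[j].weights.Length; k++)
        {
            hidden.neurons[j].weights[k] += alfa * Layers[l - 1].neurons[k].output * hiddenDelta[j];
        }
    }

    // the hidden deltas become the "layer above" deltas for the next iteration down
    delta = hiddenDelta;
}

One thing to be careful about: strictly speaking, a layer's deltas should be computed from the weights above it as they were before that layer's update, so it may be cleaner to compute all the deltas in one backward sweep first and only then apply the weight updates.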
Niklas Ulvinge aka IDK