I went through forward and back propagation once more to firm up my understanding. Forward is the process of using the current network to predict one data instance; BackPropagation is the process of adjusting the network weights according to the error.

The complete code is in the Day 72 post; only the forward and backPropagation methods are pasted here.

```java
/**
 *********************
 * Forward prediction.
 *
 * @param paraInput The input data of one instance.
 * @return The data at the output end.
 *********************
 */
public double[] forward(double[] paraInput) {
    // Initialize the input layer.
    for (int i = 0; i < layerNodeValues[0].length; i++) {
        layerNodeValues[0][i] = paraInput[i];
    } // of for i

    // Calculate the node values of each layer.
    double z;
    for (int l = 1; l < numLayers; l++) {
        for (int j = 0; j < layerNodeValues[l].length; j++) {
            // Initialize according to the offset, which is always +1.
            // The bias node of layer (l - 1) sits at index layerNodeValues[l - 1].length
            // and points to node j of layer l; z starts as the weight of that edge.
            // The third dimension of edgeWeights indexes the node of the next layer.
            z = edgeWeights[l - 1][layerNodeValues[l - 1].length][j];

            // Weighted sum on all edges for this node.
            for (int i = 0; i < layerNodeValues[l - 1].length; i++) {
                z += edgeWeights[l - 1][i][j] * layerNodeValues[l - 1][i];
            } // of for i (loop over the nodes of the previous layer)

            // Sigmoid activation.
            // This line should be changed for other activation functions.
            layerNodeValues[l][j] = 1 / (1 + Math.exp(-z));
        } // of for j (loop over the nodes of the current layer)
    } // of for l (loop over the layers of the network)

    return layerNodeValues[numLayers - 1];
} // of forward

/**
 *********************
 * Back propagation and change the edge weights.
 *
 * @param paraTarget For 3-class data, it is [0, 0, 1], [0, 1, 0] or [1, 0, 0].
 *********************
 */
public void backPropagation(double[] paraTarget) {
    // Step 1. Initialize the output layer error.
    int l = numLayers - 1; // Layer l is the output layer.
    for (int j = 0; j < layerNodeErrors[l].length; j++) {
        // Output-layer error formula: the error passed from upstream,
        // (paraTarget[j] - y), times the derivative of the activation function;
        // for sigmoid the derivative is y * (1 - y).
        layerNodeErrors[l][j] = layerNodeValues[l][j] * (1 - layerNodeValues[l][j])
                * (paraTarget[j] - layerNodeValues[l][j]);
    } // of for j

    // Step 2. Back-propagation even for l == 0.
    while (l > 0) {
        l--;
        // Layer l, for each node.
        for (int j = 0; j < layerNumNodes[l]; j++) {
            double z = 0.0;
            // For each node of the next layer.
            for (int i = 0; i < layerNumNodes[l + 1]; i++) {
                if (l > 0) {
                    // Error of node i in layer (l + 1) times the weight of the edge
                    // from node j in layer l to node i in layer (l + 1).
                    z += layerNodeErrors[l + 1][i] * edgeWeights[l][j][i];
                } // of if

                // Weight adjusting.
                edgeWeightsDelta[l][j][i] = mobp * edgeWeightsDelta[l][j][i]
                        + learningRate * layerNodeErrors[l + 1][i] * layerNodeValues[l][j];
                edgeWeights[l][j][i] += edgeWeightsDelta[l][j][i];
                if (j == layerNumNodes[l] - 1) {
                    // Weight adjusting for the offset part. The bias node is not counted
                    // in the per-layer node count, hence the index j + 1.
                    edgeWeightsDelta[l][j + 1][i] = mobp * edgeWeightsDelta[l][j + 1][i]
                            + learningRate * layerNodeErrors[l + 1][i];
                    edgeWeights[l][j + 1][i] += edgeWeightsDelta[l][j + 1][i];
                } // of if
            } // of for i

            // Record the error according to the differential of Sigmoid.
            // This line should be changed for other activation functions.
            layerNodeErrors[l][j] = layerNodeValues[l][j] * (1 - layerNodeValues[l][j]) * z;
        } // of for j
    } // of while
} // of backPropagation
```
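To make the update rule concrete, here is a small self-contained sketch of one weight update step, separate from the class above. The class name OneStepUpdateDemo and all the numbers are made up for illustration; mobp and learningRate are fields of the real class, with values set in the Day 72 code.

```java
// A worked one-step example of the update rule in backPropagation
// (values are illustrative only).
public class OneStepUpdateDemo {
    public static void main(String[] args) {
        double mobp = 0.9;          // momentum coefficient (assumed value)
        double learningRate = 0.1;  // learning rate (assumed value)

        double y = 0.7;  // output of an output-layer node
        double t = 1.0;  // its target value
        double x = 0.5;  // value of the upstream node feeding it

        // Output-layer error: sigmoid derivative y * (1 - y) times (t - y).
        double delta = y * (1 - y) * (t - y); // 0.7 * 0.3 * 0.3 = 0.063

        // Momentum update of the edge weight, as in backPropagation:
        // deltaW <- mobp * deltaW + learningRate * delta * x
        double deltaW = 0.0; // previous adjustment, starts at 0
        deltaW = mobp * deltaW + learningRate * delta * x; // 0.1 * 0.063 * 0.5 = 0.00315
        double w = 0.2;      // current edge weight
        w += deltaW;         // 0.20315

        System.out.println("delta = " + delta + ", deltaW = " + deltaW + ", w = " + w);
    }
}
```

The mobp term carries over a fraction of the previous adjustment, so the updates are smoothed across iterations (momentum).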
When the two three-dimensional arrays edgeWeights and edgeWeightsDelta are declared, the size of their second dimension is already layerNumNodes[l] + 1. So the bias node is not included in layerNumNodes[l], which is why the second-dimension index is j + 1 when adjusting the bias weights.
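As a sanity check on that layout, here is a minimal sketch of how such declarations would look. The class name BiasRowDemo and the layer sizes are illustrative; the actual declarations live in the Day 72 code.

```java
// A small shape check for the bias row (illustrative names and sizes only).
public class BiasRowDemo {
    public static void main(String[] args) {
        int[] layerNumNodes = {4, 8, 3}; // example layer sizes
        int numLayers = layerNumNodes.length;
        double[][][] edgeWeights = new double[numLayers - 1][][];
        double[][][] edgeWeightsDelta = new double[numLayers - 1][][];
        for (int l = 0; l < numLayers - 1; l++) {
            // One extra row beyond layerNumNodes[l], reserved for the bias node.
            edgeWeights[l] = new double[layerNumNodes[l] + 1][layerNumNodes[l + 1]];
            edgeWeightsDelta[l] = new double[layerNumNodes[l] + 1][layerNumNodes[l + 1]];
        } // of for l
        // Row layerNumNodes[l] holds the bias weights, which is why the bias update
        // in backPropagation uses index j + 1 when j == layerNumNodes[l] - 1.
        System.out.println("Rows in edgeWeights[0]: " + edgeWeights[0].length); // 5 = 4 + 1
    }
}
```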