
English translation
Learning in such a network proceeds the same way as for perceptrons: example inputs are presented to the network, and if the network computes an output vector that matches the target, nothing is done. If there is an error (a difference between the output and the target), then the weights are adjusted to reduce this error. The trick is to assess the blame for an error and divide it among the contributing weights. In perceptrons, this is easy because there is only one weight between each input and the output. But in multilayer networks, there are many weights connecting each input to an output, and each of these weights contributes to more than one output.
[Machine-translated Chinese version included with the question, rendered literally in English:] In such a network learning gains, the perceptron same way: for example inputs submitted to the network, if the output vector the network computes matches the target, no action is taken. If there is an error (a difference between output and target), then the weights adjustec reduce this error. The trick is to assess the error take-blame gulf in the causing weight. In perception this is easy, because there is only one weight between each input and the output. But in multilayer networks. There are connecting each input to the output many weights, and these weights each help more than one output.
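To make the quoted paragraph concrete, here is a minimal sketch of the error-driven perceptron update it describes: present an example, and change the weights only when the computed output differs from the target. This is not taken from the quoted textbook; the function names (step, predict, train_epoch) and the learning rate alpha are illustrative assumptions.

```python
def step(x):
    # Hard threshold activation used by the classic perceptron.
    return 1.0 if x >= 0 else 0.0

def predict(weights, inputs):
    # Weighted sum of the inputs passed through the threshold.
    return step(sum(w * xi for w, xi in zip(weights, inputs)))

def train_epoch(weights, examples, alpha=0.1):
    """One pass over (inputs, target) pairs; weights is a list updated in place."""
    for inputs, target in examples:
        error = target - predict(weights, inputs)
        if error != 0:  # if the output matches the target, nothing is done
            for j, xj in enumerate(inputs):
                # One weight per input, so assigning blame for the error is easy.
                weights[j] += alpha * error * xj
    return weights
```

For instance, train_epoch([0.0, 0.0], [([0.0, 1.0], 1.0), ([1.0, 0.0], 0.0)]) leaves the weights alone on the first example, whose output already matches the target, and nudges them on the second.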
The back-propagation algorithm is a sensible approach to dividing the contribution of each weight. As in the perceptron learning algorithm, we try to minimize the error between each target output and the output actually computed by the network. At the output layer, the weight update rule is very similar to the rule for the perceptron. There are two differences: the activation of the hidden unit a_j is used instead of the input value, and the rule contains a term for the gradient of the activation function. If Err_i is the error (T_i - O_i) at the output node, then the weight update rule for the link from unit j to unit i is
[The machine translation continues:] The back-propagation algorithm is a sensible method to divide up, each weight's contribution. As in the perception learning algorithm, we try to reduce the error between each target's output and the output actually computed by the network. At the output layer, the weight update rule is very similar to the perception rule. There are two differences: the hidden unit Auxerre activation, rather than the input value, is used; and the rule contains a term of the activation-function gradient. If Erri is the error at the output node (titanium love), then from the unit weight link j to my unit update rule
W_{j,i} ← W_{j,i} + α × a_j × Err_i × g'(in_i)
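For readers who want to see the output-layer rule in code, here is a small sketch assuming a sigmoid activation g, for which g'(in) = g(in)(1 - g(in)). The names (update_output_weights, a_hidden, W) and the use of NumPy are assumptions made for illustration; only the form of the update, alpha * a_j * Err_i * g'(in_i), follows the description in the passage above.

```python
import numpy as np

def g(x):
    # Sigmoid activation; its gradient is g(x) * (1 - g(x)).
    return 1.0 / (1.0 + np.exp(-x))

def update_output_weights(W, a_hidden, target, alpha=0.1):
    """One application of the output-layer rule.

    W[j, i] is the weight on the link from hidden unit j to output unit i,
    a_hidden holds the hidden-unit activations a_j, and target holds T_i.
    """
    in_out = a_hidden @ W               # weighted input in_i reaching each output unit
    output = g(in_out)                  # O_i
    err = target - output               # Err_i = T_i - O_i
    grad = output * (1.0 - output)      # g'(in_i) for the sigmoid
    # W[j, i] += alpha * a_j * Err_i * g'(in_i)
    W += alpha * np.outer(a_hidden, err * grad)
    return W
```

Called as update_output_weights(np.zeros((3, 2)), np.array([0.2, 0.7, 1.0]), np.array([1.0, 0.0])), it returns the hidden-to-output weights after one correction step.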
Category: English · Popularity: 128 · Posted: 2019-12-12 00:38:53
Best answer
I can see this is about BP; as expected, it should be neural-network material. Don't trust translation software; I've retranslated it for you. Terminology: weight = 權(quán)重, hidden unit = 隱層. Learning in such a network proceeds the same way as for perceptrons: example inputs are prese...