English translation
Learning in such a network proceeds the same way as for perceptrons: example inputs are presented to the network, and if the network computes an output vector that matches the target, nothing is done. If there is an error (a difference between the output and target), then the weights are adjusted to reduce this error. The trick is to assess the blame for an error and divide it among the contributing weights. In perceptrons, this is easy because there is only one weight between each input and the output. But in multilayer networks, there are many weights connecting each input to an output, and each of these weights contributes to more than one output.
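To make the single-layer case concrete, here is a minimal sketch of the perceptron step the passage describes; the function name, threshold activation, and learning rate are illustrative assumptions, not taken from the passage:

```python
import numpy as np

def perceptron_update(weights, x, target, alpha=0.1):
    """One perceptron learning step: each input reaches the output
    through exactly one weight, so the blame for an error maps
    directly onto those weights."""
    output = 1.0 if np.dot(weights, x) > 0 else 0.0  # threshold activation
    error = target - output               # zero when output matches the target
    if error != 0.0:                      # nothing is done on a correct output
        weights = weights + alpha * error * x  # adjust weights to reduce the error
    return weights
```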
The back-propagation algorithm is a sensible approach to dividing the contribution of each weight. As in the perceptron learning algorithm, we try to minimize the error between each target output and the output actually computed by the network. At the output layer, the weight update rule is very similar to the rule for the perceptron. There are two differences: the activation of the hidden unit a_j is used instead of the input value, and the rule contains a term for the gradient of the activation function. If Err_i is the error (T_i − O_i) at the output node, then the weight update rule for the link from unit j to unit i is
W_j,i ← W_j,i + α × a_j × Err_i × g′(in_i)
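Here α is the learning rate and g′ is the derivative of the activation function. A short sketch of this update in code may help; it assumes a sigmoid activation g (so g′(in) = g(in)(1 − g(in))), and the function and variable names are illustrative rather than from the passage:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def output_layer_update(W, a_hidden, target, alpha=0.1):
    """Apply W[j,i] <- W[j,i] + alpha * a_j * Err_i * g'(in_i) to every
    link from hidden unit j to output unit i (W has shape [hidden, output])."""
    in_i = a_hidden @ W            # weighted input reaching each output unit
    O = sigmoid(in_i)              # actual outputs O_i
    Err = target - O               # Err_i = T_i - O_i
    grad = O * (1.0 - O)           # g'(in_i) for the sigmoid
    # the outer product gives one increment per (hidden j, output i) weight
    return W + alpha * np.outer(a_hidden, Err * grad)
```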
Best answer
I can see "BP" here, so as expected this should be neural-network material. Don't trust translation software; I've re-translated it for you. Terminology: weight 權(quán)重; hidden unit 隱層. Learning in such a network proceeds the same way as for perceptrons: example inputs are prese...