Posted to commits@hama.apache.org by Apache Wiki <wi...@apache.org> on 2013/06/27 03:56:37 UTC

[Hama Wiki] Trivial Update of "MultiLayerPerceptron" by YexiJiang

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hama Wiki" for change notification.

The "MultiLayerPerceptron" page has been changed by YexiJiang:
https://wiki.apache.org/hama/MultiLayerPerceptron?action=diff&rev1=22&rev2=23

  
  For each layer except the input layer, the value of each neuron is calculated by taking a linear combination of the values output by the neurons of the previous layer, where each weight determines the contribution of a previous-layer neuron to the current neuron (as shown in equation (1)). Given the linear combination result z, a non-linear squashing function is applied to constrain the output to a restricted range (as shown in equation (2)). Typically, the sigmoid or tanh function is used.
  
- {{http://bit.ly/19oTgzk}}
+ {{https://docs.google.com/file/d/0B7jrHSGlHowOVkM5Z3dtZ0ZSQ1U/edit?usp=sharing}}
  
- {{http://bit.ly/19om4bo}}
+ {{https://docs.google.com/file/d/0B7jrHSGlHowOM1UzeGdjRnMyMjg/edit?usp=sharing}}
  
  For each step of feed-forward, the calculated results are propagated one layer closer to the output layer.
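
The per-layer computation described above can be sketched as follows. This is a minimal illustrative example, not the actual Hama implementation; the class and method names are hypothetical, and it assumes the sigmoid squashing function.

```java
import java.util.Arrays;

public class FeedForwardSketch {

  // Sigmoid squashing function, constraining the output to (0, 1)
  // (the role of equation (2)).
  static double sigmoid(double z) {
    return 1.0 / (1.0 + Math.exp(-z));
  }

  // One feed-forward step for a single layer: for each neuron j,
  // z_j = bias[j] + sum_i weights[j][i] * input[i]  (the linear
  // combination of equation (1)), followed by the squashing function.
  static double[] forwardLayer(double[][] weights, double[] bias, double[] input) {
    double[] out = new double[weights.length];
    for (int j = 0; j < weights.length; j++) {
      double z = bias[j];
      for (int i = 0; i < input.length; i++) {
        z += weights[j][i] * input[i];
      }
      out[j] = sigmoid(z);
    }
    return out;
  }

  public static void main(String[] args) {
    // Toy example: two inputs feeding a layer of two neurons.
    double[][] w = {{0.5, -0.5}, {1.0, 1.0}};
    double[] b = {0.0, -1.0};
    double[] x = {1.0, 2.0};
    System.out.println(Arrays.toString(forwardLayer(w, b, x)));
  }
}
```

Calling `forwardLayer` once per hidden and output layer, each time feeding the previous layer's output in as input, realizes the feed-forward pass.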