Posted to dev@mahout.apache.org by "Trevor Grant (JIRA)" <ji...@apache.org> on 2017/05/28 16:10:04 UTC

[jira] [Created] (MAHOUT-1990) Implement Multilayer Perceptron

Trevor Grant created MAHOUT-1990:
------------------------------------

             Summary: Implement Multilayer Perceptron
                 Key: MAHOUT-1990
                 URL: https://issues.apache.org/jira/browse/MAHOUT-1990
             Project: Mahout
          Issue Type: Improvement
          Components: Algorithms
    Affects Versions: 0.13.2
            Reporter: Trevor Grant
            Assignee: Trevor Grant


The following strategy is proposed.

It should:
1. Implement in-core MLPs which can be 'plugged together' for purposes of backpropagation (this makes for easy extension into more complex networks); see the first sketch after this list.
2. Implement a common distributed MLP which maps out in-core MLPs and then averages their parameters; see the second sketch below.
3. Implement regression and classifier wrappers around the base MLP to reduce code duplication; see the third sketch below.
4. It would also be nice to define distributed and in-core neural network traits so that all future neural networks share a consistent API.
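A rough sketch of what item 1's 'pluggable' in-core layers could look like. Everything here is hypothetical (IncoreLayer, DenseLayer, IncoreMLP, fitOne, etc. are not existing Mahout classes), and plain Scala arrays stand in for Mahout's in-core matrix and vector types purely to keep the example self-contained:

{code:scala}
// Hypothetical: a layer exposes forward() and backward(), so layers can be
// chained and the gradient from one layer feeds the layer before it.
trait IncoreLayer {
  def forward(input: Array[Double]): Array[Double]
  // Takes the gradient w.r.t. this layer's output, updates local parameters,
  // and returns the gradient w.r.t. this layer's input.
  def backward(outputGrad: Array[Double], learningRate: Double): Array[Double]
}

class DenseLayer(nIn: Int, nOut: Int) extends IncoreLayer {
  private val rng = new scala.util.Random(1234)
  // weights(j)(i): weight from input i to output j
  var weights: Array[Array[Double]] = Array.fill(nOut, nIn)(rng.nextGaussian() * 0.01)
  private var lastInput: Array[Double] = _

  def forward(input: Array[Double]): Array[Double] = {
    lastInput = input
    weights.map(row => row.zip(input).map { case (w, x) => w * x }.sum)
  }

  def backward(outputGrad: Array[Double], learningRate: Double): Array[Double] = {
    // gradient w.r.t. this layer's input, computed with the pre-update weights
    val inputGrad = Array.tabulate(nIn) { i =>
      weights.indices.map(j => weights(j)(i) * outputGrad(j)).sum
    }
    // plain SGD step on the local parameters
    for (j <- weights.indices; i <- 0 until nIn)
      weights(j)(i) -= learningRate * outputGrad(j) * lastInput(i)
    inputGrad
  }
}

// Layers "plugged together": forward runs left to right, backprop right to left.
class IncoreMLP(layers: Seq[IncoreLayer]) {
  def predict(x: Array[Double]): Array[Double] =
    layers.foldLeft(x)((a, layer) => layer.forward(a))

  def fitOne(x: Array[Double], y: Array[Double], lr: Double): Unit = {
    val out = predict(x)
    val grad = out.zip(y).map { case (o, t) => o - t } // squared-error gradient
    layers.reverse.foldLeft(grad)((g, layer) => layer.backward(g, lr))
    ()
  }
}
{code}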

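A similarly hypothetical sketch of item 2's distributed step: each partition of the data trains its own in-core model, and the driver averages the learned parameters. In Mahout this would map over the blocks of a DRM; plain Scala Seqs stand in for the partitions here:

{code:scala}
// Hypothetical: "map out" one in-core MLP per partition, train it locally,
// then average the per-partition parameters into a single model.
object DistributedMLPSketch {

  // Element-wise average of equally-shaped weight matrices.
  def averageWeights(all: Seq[Array[Array[Double]]]): Array[Array[Double]] = {
    val n = all.size.toDouble
    all.reduce { (a, b) =>
      a.zip(b).map { case (ra, rb) => ra.zip(rb).map { case (x, y) => x + y } }
    }.map(_.map(_ / n))
  }

  def trainDistributed(partitions: Seq[Seq[(Array[Double], Array[Double])]],
                       nIn: Int, nOut: Int, lr: Double): DenseLayer = {
    val localLayers = partitions.map { part =>
      val layer = new DenseLayer(nIn, nOut)
      val mlp = new IncoreMLP(Seq(layer))
      part.foreach { case (x, y) => mlp.fitOne(x, y, lr) }
      layer
    }
    val averaged = new DenseLayer(nIn, nOut)
    averaged.weights = averageWeights(localLayers.map(_.weights))
    averaged
  }
}
{code}

Parameter averaging is only one way to combine the per-partition models; the exact combination strategy would be settled during implementation.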

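And a minimal sketch of items 3 and 4, assuming the hypothetical IncoreMLP above: a shared trait plus thin regression/classifier wrappers so future networks can expose the same API:

{code:scala}
// Hypothetical trait shared by all current and future neural networks.
trait NeuralNetwork {
  def predict(x: Array[Double]): Array[Double]
}

class MLPRegressor(mlp: IncoreMLP) extends NeuralNetwork {
  def predict(x: Array[Double]): Array[Double] = mlp.predict(x)
}

class MLPClassifier(mlp: IncoreMLP) extends NeuralNetwork {
  // One-hot of the arg-max output; a real classifier would also apply e.g. softmax.
  def predict(x: Array[Double]): Array[Double] = {
    val out = mlp.predict(x)
    val best = out.indexOf(out.max)
    Array.tabulate(out.length)(i => if (i == best) 1.0 else 0.0)
  }
}
{code}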


--
This message was sent by Atlassian JIRA
(v6.3.15#6346)