Posted to issues@systemml.apache.org by "Mike Dusenberry (JIRA)" <ji...@apache.org> on 2016/03/16 22:32:33 UTC

[jira] [Commented] (SYSTEMML-540) Deep Learning

    [ https://issues.apache.org/jira/browse/SYSTEMML-540?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15198220#comment-15198220 ] 

Mike Dusenberry commented on SYSTEMML-540:
------------------------------------------

Update: I'm working on an experimental, layers-based framework written directly in DML that contains layer abstractions with simple forward/backward APIs for affine, convolution (starting with 2D), max-pooling, non-linearities (ReLU, sigmoid, softmax, etc.), dropout, loss functions, and other layers.  As part of this experiment, I'm starting by implementing as much as possible in DML, and will then move to built-in functions as necessary.
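To illustrate the shape of the simple forward/backward API described above, here is a minimal NumPy sketch of an affine layer. This is illustrative only (the actual layers would be written in DML, and all function names here are hypothetical), but a DML version would expose the same two functions per layer.

```python
import numpy as np

def affine_forward(X, W, b):
    """Forward pass: out = X @ W + b."""
    return X @ W + b

def affine_backward(dout, X, W, b):
    """Backward pass: gradients w.r.t. the input and the parameters."""
    dX = dout @ W.T        # gradient w.r.t. input X
    dW = X.T @ dout        # gradient w.r.t. weights W
    db = dout.sum(axis=0)  # gradient w.r.t. bias b
    return dX, dW, db
```

Each layer type (convolution, max-pooling, dropout, etc.) would follow the same pattern: a forward function producing the output, and a backward function consuming the upstream gradient and producing gradients for the layer's inputs and parameters.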

> Deep Learning
> -------------
>
>                 Key: SYSTEMML-540
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-540
>             Project: SystemML
>          Issue Type: Epic
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry
>
> This epic covers the addition of deep learning to SystemML, including:
> * Core DML layer abstractions for deep (convolutional, recurrent) neural nets, with simple forward/backward API: affine, convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, softmax), dropout, loss functions.
> * Modularized DML optimizers: (mini-batch, stochastic) gradient descent (w/ momentum, etc.).
> * Additional DML language support as necessary (tensors, built-in functions such as convolution, function pointers, list structures, etc.).
> * Integration with other deep learning frameworks (Caffe, Torch, Theano, TensorFlow, etc.) via automatic DML code generation.
> * etc.
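As a sketch of what a "modularized optimizer" with a uniform update API might look like (again illustrative NumPy rather than DML, and the function name is hypothetical), here is mini-batch SGD with momentum, where the velocity state is passed in and returned so the optimizer itself stays stateless:

```python
import numpy as np

def sgd_momentum_update(W, dW, v, lr=0.01, mu=0.9):
    """One SGD-with-momentum step; v is the velocity carried between calls."""
    v = mu * v - lr * dW  # update velocity with gradient
    W = W + v             # apply velocity to the parameters
    return W, v
```

Plain SGD, Nesterov momentum, etc. would each be a drop-in replacement with the same signature, so training scripts can swap optimizers without other changes.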



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)