Posted to issues@mxnet.apache.org by "Piyush Ghai (JIRA)" <ji...@apache.org> on 2019/01/03 21:46:00 UTC

[jira] [Commented] (MXNET-1279) Test out an end-to-end Scala Training Example Using FP64.

    [ https://issues.apache.org/jira/browse/MXNET-1279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16733550#comment-16733550 ] 

Piyush Ghai commented on MXNET-1279:
------------------------------------

Tested training of LeNet, MLP, and ResNet with synthetic data for both Float64 and Float32.

The models train fine, and I was able to load the trained models back afterwards.

Refer to this commit for training in FP64 with synthetic data: https://github.com/apache/incubator-mxnet/pull/13678/commits/b1926ad687293cf25b3d6ded0dd597392a3d05c4
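
For reference, below is a minimal, untested sketch of what FP64 training on synthetic data can look like with the Scala package. It is not taken from the commit above; the Symbol.api operator signatures, NDArrayIter.Builder, and Module.fit/saveCheckpoint calls are written from memory of the 1.x Scala API, so exact parameter names may differ slightly between versions.

import org.apache.mxnet._
import org.apache.mxnet.io.NDArrayIter
import org.apache.mxnet.module.Module

object Fp64TrainingSketch {
  def main(args: Array[String]): Unit = {
    val ctx = Context.cpu()

    // Small MLP built from the generated Symbol.api operators.
    val data = Symbol.Variable("data")
    val fc1 = Symbol.api.FullyConnected(data = Some(data), num_hidden = 128, name = "fc1")
    val act1 = Symbol.api.Activation(data = Some(fc1), act_type = "relu", name = "relu1")
    val fc2 = Symbol.api.FullyConnected(data = Some(act1), num_hidden = 10, name = "fc2")
    val mlp = Symbol.api.SoftmaxOutput(data = Some(fc2), name = "softmax")

    // Synthetic Float64 inputs: the dtype of the bound NDArrays is what makes
    // the forward/backward pass run in FP64 instead of the default FP32.
    val numSamples = 256
    val x = NDArray.ones(Shape(numSamples, 784), ctx, DType.Float64)
    val y = NDArray.zeros(Shape(numSamples), ctx, DType.Float64)
    val trainIter = new NDArrayIter.Builder()
      .addData("data", x)
      .addLabel("softmax_label", y)
      .setBatchSize(32)
      .build()

    // Train briefly, save a checkpoint, and load it back to confirm that a
    // Float64 model round-trips through save/load.
    // Label name is passed explicitly so it matches the iterator's label.
    val mod = new Module(mlp, labelNames = IndexedSeq("softmax_label"),
                         contexts = Array(ctx))
    mod.fit(trainIter, numEpoch = 2)
    mod.saveCheckpoint("mlp-fp64", 2)
    val restored = Module.loadCheckpoint("mlp-fp64", 2)
    // restored can now be bound again and used for FP64 inference.
  }
}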

> Test out an end-to-end Scala Training Example Using FP64.
> ---------------------------------------------------------
>
>                 Key: MXNET-1279
>                 URL: https://issues.apache.org/jira/browse/MXNET-1279
>             Project: Apache MXNet
>          Issue Type: Sub-task
>          Components: Apache MXNet Scala API
>            Reporter: Piyush Ghai
>            Assignee: Piyush Ghai
>            Priority: Major
>
> Compare on:
>  # Training Accuracy
>  # Loss precision values after every epoch
>  # Training time 
>  # Memory used
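
For the training-time and memory-used comparisons listed above, a plain JVM-level helper could be wrapped around the FP32 and FP64 runs. The sketch below uses only standard JVM calls (System.nanoTime, Runtime), so it only captures Java heap rather than MXNet's native memory; trainEpochs is a hypothetical stand-in for whichever training entry point is being measured.

object TrainMetrics {
  // Runs `body`, reporting wall-clock time and the change in used JVM heap.
  // Note: NDArrays live mostly in native memory, so this understates the
  // true footprint; it is only a rough, repeatable comparison point.
  def measure[T](label: String)(body: => T): T = {
    val rt = Runtime.getRuntime
    rt.gc()
    val heapBefore = rt.totalMemory() - rt.freeMemory()
    val start = System.nanoTime()
    val result = body
    val elapsedSec = (System.nanoTime() - start) / 1e9
    val heapDeltaMb = (rt.totalMemory() - rt.freeMemory() - heapBefore) / (1024.0 * 1024.0)
    println(f"$label%s: $elapsedSec%.2f s, ~$heapDeltaMb%.1f MB extra JVM heap")
    result
  }
}

// Hypothetical usage around the two runs being compared:
// TrainMetrics.measure("MLP FP32") { trainEpochs(DType.Float32) }
// TrainMetrics.measure("MLP FP64") { trainEpochs(DType.Float64) }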


