Posted to issues@madlib.apache.org by "Frank McQuillan (JIRA)" <ji...@apache.org> on 2019/05/29 23:22:00 UTC

[jira] [Closed] (MADLIB-1338) DL: Add support for reporting various metrics in fit/evaluate

     [ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Frank McQuillan closed MADLIB-1338.
-----------------------------------

> DL: Add support for reporting various metrics in fit/evaluate
> -------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>
> The current `madlib_keras.fit()` code reports accuracy as the only metric, along with the loss value. However, different metrics can be requested in the compile params (`mae`, `binary_accuracy`, etc.), in which case `Keras.evaluate()` returns `loss` (by default) plus the requested metric, e.g. `mean_absolute_error` or `binary_accuracy`.
> This JIRA requests support for reporting any one of these metrics in the output table (see the Keras sketch after this message).
>  Other requirements:
> 1. Remove the training loss/accuracy computation from `fit_transition` and instead use the evaluate function to calculate the training loss/metric. See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.
> 2. The metric param can be optional.
> 3. Maybe we should rename all of the related output columns to `metric` instead of `metrics`.
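
As a rough illustration of the Keras behavior this ticket relies on (this is not MADlib code; the tiny model, the random data, and the choice of `mae` are placeholders standing in for whatever a user passes via compile params), the following sketch shows that once a non-default metric is compiled in, `evaluate()` returns the loss plus that metric, and `metrics_names` says what each returned value is:

import numpy as np
from tensorflow import keras

# Tiny regression model; the architecture is illustrative only.
model = keras.Sequential([
    keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    keras.layers.Dense(1),
])

# 'mae' stands in for any metric requested in the compile params.
model.compile(optimizer='adam', loss='mse', metrics=['mae'])

x = np.random.rand(32, 4).astype('float32')
y = np.random.rand(32, 1).astype('float32')
model.fit(x, y, epochs=1, verbose=0)

# evaluate() returns [loss, <metric>]; metrics_names maps positions to names.
results = model.evaluate(x, y, verbose=0)
for name, value in zip(model.metrics_names, results):
    print(name, value)

Keying the reported column off `model.metrics_names` rather than a hard-coded accuracy value is one way a generic fit/evaluate path could report whichever single metric was compiled, which is what the requirements above ask for.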



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)