Posted to issues@spark.apache.org by "Yanbo Liang (JIRA)" <ji...@apache.org> on 2016/11/29 09:14:59 UTC

[jira] [Comment Edited] (SPARK-18618) SparkR model predict should support type as an argument

    [ https://issues.apache.org/jira/browse/SPARK-18618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15704741#comment-15704741 ] 

Yanbo Liang edited comment on SPARK-18618 at 11/29/16 9:14 AM:
---------------------------------------------------------------

Currently ML glm supports {{type = c("link", "response")}}. This is not limited to {{spark.glm}}; other algorithms such as {{spark.survreg}} should also match their corresponding R function, e.g. {{predict.survreg}} (https://stat.ethz.ch/R-manual/R-devel/library/survival/html/predict.survreg.html).
I think SparkR can support only a subset of the native R prediction types initially and add more later. Thanks.
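For reference, a minimal sketch in native R of the {{type}} values exposed by {{predict.glm}} and {{predict.survreg}} (standard {{stats}} and {{survival}} usage, shown only to illustrate the interface SparkR would mirror):

{code:r}
# Native R GLM: predict() exposes a 'type' argument.
fit <- glm(mpg ~ wt + hp, data = mtcars, family = gaussian())
head(predict(fit, type = "link"))      # scale of the linear predictor
head(predict(fit, type = "response"))  # scale of the response variable

# Native R parametric survival regression: predict.survreg supports
# further types such as "lp", "quantile", and "terms".
library(survival)
sfit <- survreg(Surv(time, status) ~ age + sex, data = lung)
head(predict(sfit, type = "response"))
head(predict(sfit, type = "quantile", p = c(0.1, 0.5, 0.9)))
{code}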


was (Author: yanboliang):
Currently ML glm supports {{type = c("link", "response")}}. This is not limited to {{spark.glm}}; other algorithms such as {{spark.survreg}} should also match their corresponding R function, e.g. {{predict.survreg}} (https://stat.ethz.ch/R-manual/R-devel/library/survival/html/predict.survreg.html). Thanks.

> SparkR model predict should support type as an argument
> -------------------------------------------------------
>
>                 Key: SPARK-18618
>                 URL: https://issues.apache.org/jira/browse/SPARK-18618
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, SparkR
>            Reporter: Yanbo Liang
>
> SparkR model {{predict}} should support {{type}} as an argument. This will make it consistent with native R predict functions such as https://stat.ethz.ch/R-manual/R-devel/library/stats/html/predict.glm.html .
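
A hypothetical sketch of what the proposed SparkR call could look like once {{type}} is supported ({{spark.glm}} and {{predict}} on SparkR models already exist; the {{type}} argument shown in the last line is the proposed addition, not the current API):

{code:r}
library(SparkR)
sparkR.session()

df <- createDataFrame(mtcars)
model <- spark.glm(df, mpg ~ wt + hp, family = "gaussian")

# Current API: predictions on the response scale only.
head(collect(predict(model, df)))

# Proposed (not yet implemented): choose the prediction scale,
# mirroring predict.glm's type = c("link", "response").
# head(collect(predict(model, df, type = "link")))
{code}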



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org