Posted to issues@spark.apache.org by "holdenk (JIRA)" <ji...@apache.org> on 2016/10/06 17:01:21 UTC

[jira] [Commented] (SPARK-15130) PySpark shared params should include default values to match Scala

    [ https://issues.apache.org/jira/browse/SPARK-15130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15552513#comment-15552513 ] 

holdenk commented on SPARK-15130:
---------------------------------

Now that we've had 2.0.1 go out maybe we should take the time to figure out our story around this for 2.1? (cc [~mlnick] & [~josephkb] & [~sethah])?

> PySpark shared params should include default values to match Scala
> ------------------------------------------------------------------
>
>                 Key: SPARK-15130
>                 URL: https://issues.apache.org/jira/browse/SPARK-15130
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, ML, PySpark
>            Reporter: holdenk
>            Priority: Minor
>
> While checking the documentation in SPARK-14813, I noticed that PySpark decision tree params do not include their default values (unlike the Scala ones). Although the existing Scala default values are still applied, this information is worth exposing in the Python docs.
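
For context, one way the issue could be addressed is to embed each shared param's default into its doc string and into a default-param map, mirroring the Scala traits. The following is a minimal pure-Python sketch of that pattern, assuming a simplified stand-in for pyspark.ml.param.Param and a hypothetical HasMaxDepth mixin with a made-up default of 5 -- it is not the actual PySpark implementation:

```python
class Param:
    """Simplified stand-in for pyspark.ml.param.Param: a named parameter
    with a doc string (assumption: not the real PySpark class)."""
    def __init__(self, name, doc):
        self.name = name
        self.doc = doc


class HasMaxDepth:
    """Hypothetical shared-param mixin. The default value is stated in the
    param's doc string, so generated docs show it, and is also recorded in
    a default-param map, so getters fall back to it."""
    _default_max_depth = 5  # hypothetical default, mirroring the Scala trait

    maxDepth = Param(
        "maxDepth",
        "Maximum depth of the tree. (>= 0) (default: %d)" % _default_max_depth,
    )

    def __init__(self):
        self._paramMap = {}  # explicitly-set values
        self._defaultParamMap = {self.maxDepth: self._default_max_depth}

    def getMaxDepth(self):
        # An explicitly-set value wins; otherwise fall back to the default.
        return self._paramMap.get(self.maxDepth,
                                  self._defaultParamMap[self.maxDepth])


print(HasMaxDepth.maxDepth.doc)      # doc string now surfaces the default
print(HasMaxDepth().getMaxDepth())   # getter falls back to the default
```

The point of the sketch is that the default lives in exactly one place (the class attribute), so the doc string and the runtime fallback cannot drift apart -- which is the kind of consistency with Scala this ticket asks for.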



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org