Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/08/09 09:48:11 UTC

[jira] [Resolved] (SPARK-2861) Doc comment of DoubleRDDFunctions.histogram is incorrect

     [ https://issues.apache.org/jira/browse/SPARK-2861?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-2861.
------------------------------------

       Resolution: Fixed
    Fix Version/s: 1.1.0

Issue resolved by pull request 1786
[https://github.com/apache/spark/pull/1786]

> Doc comment of DoubleRDDFunctions.histogram is incorrect
> --------------------------------------------------------
>
>                 Key: SPARK-2861
>                 URL: https://issues.apache.org/jira/browse/SPARK-2861
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Chandan Kumar
>            Priority: Trivial
>             Fix For: 1.1.0
>
>
> The documentation comment of the histogram method of the DoubleRDDFunctions class, in the source file DoubleRDDFunctions.scala, is inconsistent with the method's actual behavior. This might confuse somebody reading the documentation.
> Comment in question:
> {code}
>   /**
>    * Compute a histogram using the provided buckets. The buckets are all open
>    * to the left except for the last which is closed
>    *  e.g. for the array
>    *  [1, 10, 20, 50] the buckets are [1, 10) [10, 20) [20, 50]
>    *  e.g 1<=x<10 , 10<=x<20, 20<=x<50
>    *  And on the input of 1 and 50 we would have a histogram of 1, 0, 0
> {code}
> The buckets are all open to the right (NOT left) except for the last, which is closed.
> For the example quoted, the last bucket should be 20<=x<=50.
> Also, the histogram result for the input values 1 and 50 would be 1, 0, 1 (NOT 1, 0, 0). Spark computes the histogram correctly; only the doc comment is wrong.
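> For illustration, a minimal spark-shell sketch of the corrected semantics (assuming the usual sc SparkContext is available):
> {code}
> // Buckets defined by the array [1, 10, 20, 50]: [1, 10), [10, 20), [20, 50].
> // Every bucket is open to the right except the last, which is closed on both ends.
> val rdd = sc.parallelize(Seq(1.0, 50.0))
>
> // 1 falls in the first bucket and 50 falls in the last (closed) bucket.
> val counts = rdd.histogram(Array(1.0, 10.0, 20.0, 50.0))
> // counts: Array[Long] = Array(1, 0, 1)
> {code}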


