Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/04/23 12:04:12 UTC

[jira] [Commented] (SPARK-14873) Java sampleByKey methods take ju.Map but with Scala Double values; results in type Object

    [ https://issues.apache.org/jira/browse/SPARK-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15255221#comment-15255221 ] 

Apache Spark commented on SPARK-14873:
--------------------------------------

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/12637

> Java sampleByKey methods take ju.Map but with Scala Double values; results in type Object
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-14873
>                 URL: https://issues.apache.org/jira/browse/SPARK-14873
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Java API, Spark Core
>    Affects Versions: 1.6.1
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Minor
>
> There's this odd bit of code in {{JavaStratifiedSamplingExample}}:
> {code}
>     // specify the exact fraction desired from each key Map<K, Object>
>     ImmutableMap<Integer, Object> fractions =
>       ImmutableMap.of(1, (Object)0.1, 2, (Object) 0.6, 3, (Object) 0.3);
>     // Get an approximate sample from each stratum
>     JavaPairRDD<Integer, Character> approxSample = data.sampleByKey(false, fractions);
> {code}
> It highlights a problem like the one in https://issues.apache.org/jira/browse/SPARK-12604: Scala primitive types are used where Java requires an object, so a signature that logically takes Double objects takes Object in the Java API. It's an easy, similar fix.
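For reference, a minimal sketch of how the call site might read once the Java signature takes java.lang.Double values (as the linked PR proposes; the {{data}} RDD of Integer/Character pairs is assumed from the original example):

{code}
import java.util.Map;
import com.google.common.collect.ImmutableMap;
import org.apache.spark.api.java.JavaPairRDD;

// With sampleByKey(boolean, Map<K, Double>), the Object casts disappear;
// autoboxing turns the double literals into java.lang.Double values
Map<Integer, Double> fractions = ImmutableMap.of(1, 0.1, 2, 0.6, 3, 0.3);
JavaPairRDD<Integer, Character> approxSample = data.sampleByKey(false, fractions);
{code}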


