Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/11/22 23:21:12 UTC

[jira] [Updated] (SPARK-4557) Spark Streaming's foreachRDD method should accept a VoidFunction<...>, not a Function<..., Void>

     [ https://issues.apache.org/jira/browse/SPARK-4557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4557:
-----------------------------
      Priority: Minor  (was: Major)
    Issue Type: Improvement  (was: Bug)

(I don't think this is a bug, really.) Yes, it's possible VoidFunction didn't exist when this API was defined. It can't be changed now without breaking API compatibility, but AFAICT VoidFunction would be more appropriate. Maybe this can happen along with some other related Java API rationalization in Spark 2.x.
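
For context, the two interface shapes at issue look roughly like this (a simplified sketch of org.apache.spark.api.java.function.Function and VoidFunction as they appear in Spark 1.x):

{code:java}
// Sketch of the Spark 1.x Java function interfaces (simplified).
// A lambda implementing Function<T, Void> must end in "return null;",
// while a lambda implementing VoidFunction<T> is a plain statement body.
public interface Function<T1, R> extends java.io.Serializable {
  R call(T1 v1) throws Exception;
}

public interface VoidFunction<T> extends java.io.Serializable {
  void call(T t) throws Exception;
}
{code}

Since Void has no instances other than null, the compiler forces the explicit "return null;" in the Function-based variant, which is exactly the verbosity described below.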

> Spark Streaming's foreachRDD method should accept a VoidFunction<...>, not a Function<..., Void>
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4557
>                 URL: https://issues.apache.org/jira/browse/SPARK-4557
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 1.1.0
>            Reporter: Alexis Seigneurin
>            Priority: Minor
>
> In *Java*, using Spark Streaming's foreachRDD function is quite verbose. You have to write:
> {code:java}
>                 .foreachRDD(items -> {
>                     ...;
>                     return null;
>                 });
> {code}
> Instead of:
> {code:java}
>                 .foreachRDD(items -> ...);
> {code}
> This is because the foreachRDD method accepts a Function<JavaRDD<...>, Void> instead of a VoidFunction<JavaRDD<...>>. It would make sense to change it to a VoidFunction since, elsewhere in Spark's API, the foreach method already accepts a VoidFunction.
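> For illustration, assuming a hypothetical JavaDStream<String> named lines, the difference would be:
> {code:java}
> // Current signature: Function<JavaRDD<String>, Void> forces an explicit return null
> lines.foreachRDD(rdd -> {
>     System.out.println("batch size = " + rdd.count());
>     return null;
> });
>
> // With a hypothetical VoidFunction<JavaRDD<String>> overload, the same logic fits on one line
> lines.foreachRDD(rdd -> System.out.println("batch size = " + rdd.count()));
> {code}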


