Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2014/09/27 06:31:33 UTC
[jira] [Resolved] (SPARK-2895) Support mapPartitionsWithContext in Spark Java API
[ https://issues.apache.org/jira/browse/SPARK-2895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-2895.
--------------------------------
Resolution: Fixed
Fix Version/s: 1.2.0
Closing this since we resolved SPARK-3543.
> Support mapPartitionsWithContext in Spark Java API
> --------------------------------------------------
>
> Key: SPARK-2895
> URL: https://issues.apache.org/jira/browse/SPARK-2895
> Project: Spark
> Issue Type: New Feature
> Components: Java API
> Reporter: Chengxiang Li
> Assignee: Chengxiang Li
> Labels: hive
> Fix For: 1.2.0
>
>
> This is a requirement from Hive on Spark: mapPartitionsWithContext exists only in the Spark Scala API, and we would like to be able to access it from the Spark Java API as well.
> For HIVE-7627 and HIVE-7843, Hive operators that are invoked inside the mapPartitions closure need access to the task ID.
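> A rough sketch of how this works after SPARK-3543: instead of a dedicated mapPartitionsWithContext wrapper, the static TaskContext.get() accessor (added in Spark 1.2) can be called from inside a plain Java mapPartitions closure to read the current task's metadata. This is an illustrative, unverified snippet against the Spark 1.x Java API (FlatMapFunction returning Iterable); exact TaskContext accessor names vary across Spark versions, and rdd is assumed to be an existing JavaRDD<String>.
>
> {code:java}
> import java.util.ArrayList;
> import java.util.Iterator;
> import java.util.List;
>
> import org.apache.spark.TaskContext;
> import org.apache.spark.api.java.JavaRDD;
> import org.apache.spark.api.java.function.FlatMapFunction;
>
> JavaRDD<String> tagged = rdd.mapPartitions(
>     new FlatMapFunction<Iterator<String>, String>() {
>       @Override
>       public Iterable<String> call(Iterator<String> it) {
>         // SPARK-3543: TaskContext is reachable from Java without a
>         // mapPartitionsWithContext variant in the Java API.
>         TaskContext ctx = TaskContext.get();
>         int partition = ctx.partitionId();  // per-task metadata Hive needs
>         List<String> out = new ArrayList<String>();
>         while (it.hasNext()) {
>           out.add("partition " + partition + ": " + it.next());
>         }
>         return out;
>       }
>     });
> {code}
>
> The same pattern gives Hive operators the task-level identifiers they need without any new Java API surface, which is presumably why this issue could be closed once SPARK-3543 landed.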
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org