Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2014/09/16 06:54:34 UTC
[jira] [Comment Edited] (SPARK-3543) Write TaskContext in Java and expose it through a static accessor
[ https://issues.apache.org/jira/browse/SPARK-3543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14134955#comment-14134955 ]
Reynold Xin edited comment on SPARK-3543 at 9/16/14 4:53 AM:
-------------------------------------------------------------
FYI, you can define the method that takes the current Scala closure type in Java too:
{code}
public TaskContext addTaskCompletionListener(scala.Function1<TaskContext, scala.Unit> f) {
// register f to be invoked when the task completes
return this;
}
{code}
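On the calling side, a Java client can satisfy that signature by subclassing scala.runtime.AbstractFunction1, which fills in the compose/andThen boilerplate from Function1. A minimal sketch, assuming a Scala 2.10/2.11 classpath; the wrapper class and the println are illustrative only:
{code}
import scala.runtime.AbstractFunction1;

// Illustrative caller, not part of the proposed API.
class CompletionListenerExample {
  static TaskContext register(TaskContext context) {
    return context.addTaskCompletionListener(new AbstractFunction1<TaskContext, scala.Unit>() {
      @Override
      public scala.Unit apply(TaskContext ctx) {
        System.out.println("task completed, releasing resources");
        // scala.Unit has no Java-accessible instance; returning null is the
        // usual workaround, and Scala callers discard the value anyway.
        return null;
      }
    });
  }
}
{code}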
was (Author: rxin):
FYI you can define the current Scala closure in Java too
{code}
public TaskContext addTaskCompletionListener(Function1<TaskContext, scala.Unit> f) {
// do whatever
return this;
}
{code}
> Write TaskContext in Java and expose it through a static accessor
> -----------------------------------------------------------------
>
> Key: SPARK-3543
> URL: https://issues.apache.org/jira/browse/SPARK-3543
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: Patrick Wendell
> Assignee: Prashant Sharma
> Priority: Critical
>
> Right now we have these xWithContext methods, and it's a bit awkward (for instance, we don't support accessing the TaskContext from a normal map or filter operation). I'd propose the following:
> 1. Re-write TaskContext in Java - it's a simple class. It can still refer to the scala version of TaskMetrics.
> 2. Have a static method `TaskContext.get()` that returns the current in-scope TaskContext. Under the hood this uses a thread-local variable, similar to SparkEnv, that the Executor sets (see the sketch after this list).
> 3. Deprecate all of the existing xWithContext methods.
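> For point 2, a minimal sketch of what the thread-local plumbing could look like (the setTaskContext/unset names here are illustrative, not a committed API):
> {code}
> public class TaskContext {
>   // One slot per executor thread; the Executor fills it in before running a task.
>   private static final ThreadLocal<TaskContext> taskContext =
>       new ThreadLocal<TaskContext>();
>
>   static void setTaskContext(TaskContext tc) { taskContext.set(tc); }
>
>   // Clear the slot after the task finishes so pooled threads don't leak stale contexts.
>   static void unset() { taskContext.remove(); }
>
>   /** Returns the TaskContext of the task running on this thread, or null if none. */
>   public static TaskContext get() { return taskContext.get(); }
> }
> {code}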
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)