Posted to issues@spark.apache.org by "Xianyang Liu (Jira)" <ji...@apache.org> on 2019/10/24 02:50:00 UTC
[jira] [Created] (SPARK-29582) Unify the behavior of pyspark.TaskContext with spark core
Xianyang Liu created SPARK-29582:
------------------------------------
Summary: Unify the behavior of pyspark.TaskContext with spark core
Key: SPARK-29582
URL: https://issues.apache.org/jira/browse/SPARK-29582
Project: Spark
Issue Type: Bug
Components: PySpark
Affects Versions: 2.4.4
Reporter: Xianyang Liu
In Spark core, `TaskContext` is a singleton object. A task context instance, which may be either a `TaskContext` or a `BarrierTaskContext`, is set before the task function starts and unset to none after the function ends, so the current context can be retrieved through both `TaskContext` and `BarrierTaskContext`. In PySpark, however, the barrier context can only be retrieved through `BarrierTaskContext`; calling `TaskContext.get` in a barrier stage returns `None`.
This patch unifies the behavior of PySpark's `TaskContext` with Spark core. This is useful when people switch from normal code to barrier code, since only a small update is needed.
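A minimal Python sketch of the singleton pattern described above (hypothetical, not the actual PySpark internals): a single class-level slot holds the current context, so after unification `TaskContext.get` returns the `BarrierTaskContext` instance when the task runs in a barrier stage instead of `None`.

```python
# Hypothetical sketch of the unified singleton behavior; names and
# structure are assumptions, not PySpark's real implementation.

class TaskContext:
    _taskContext = None  # class-level singleton slot shared with subclasses

    @classmethod
    def get(cls):
        # Return whatever context was set for the current task, or None.
        return TaskContext._taskContext

    @classmethod
    def _setTaskContext(cls, ctx):
        # Called (conceptually) before the task function starts.
        TaskContext._taskContext = ctx

    @classmethod
    def _unsetTaskContext(cls):
        # Called (conceptually) after the task function ends.
        TaskContext._taskContext = None


class BarrierTaskContext(TaskContext):
    @classmethod
    def get(cls):
        # Only return the context when it really is a barrier context.
        ctx = TaskContext._taskContext
        return ctx if isinstance(ctx, BarrierTaskContext) else None


# With a shared slot, both accessors see the same instance in a barrier
# stage, which is the unified behavior this issue asks for:
TaskContext._setTaskContext(BarrierTaskContext())
assert TaskContext.get() is BarrierTaskContext.get()
TaskContext._unsetTaskContext()
assert TaskContext.get() is None
```

Before the change, PySpark kept the barrier context in a separate slot, which is why `TaskContext.get` returned `None` inside a barrier stage even though a context was active.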
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org