Posted to issues@spark.apache.org by "holdenk (JIRA)" <ji...@apache.org> on 2016/11/24 10:10:59 UTC

[jira] [Created] (SPARK-18576) Expose basic TaskContext info in PySpark

holdenk created SPARK-18576:
-------------------------------

             Summary: Expose basic TaskContext info in PySpark
                 Key: SPARK-18576
                 URL: https://issues.apache.org/jira/browse/SPARK-18576
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
            Reporter: holdenk
            Priority: Minor


Currently, TaskContext info isn't exposed in PySpark. While we don't need to expose the full TaskContext information in Python, it would make sense to expose the public APIs in Python for users who are doing custom logging or job handling with the task ID or retry attempt number.
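A rough sketch of what such a Python API might look like, assuming the method names simply mirror the public Scala TaskContext (stageId, partitionId, attemptNumber, taskAttemptId); the actual PySpark interface, once implemented, may differ. The worker process would populate a per-thread context before invoking user code, and user functions would retrieve it via a static get():

```python
import threading


class TaskContext:
    """Sketch of a Python-side TaskContext mirroring the public Scala API.
    Names and structure here are assumptions, not the final PySpark design."""

    _local = threading.local()  # one context per worker thread

    def __init__(self, stage_id, partition_id, attempt_number, task_attempt_id):
        self._stage_id = stage_id
        self._partition_id = partition_id
        self._attempt_number = attempt_number
        self._task_attempt_id = task_attempt_id

    @classmethod
    def get(cls):
        """Return the context of the currently running task, or None when
        called outside a task (e.g. on the driver)."""
        return getattr(cls._local, "context", None)

    @classmethod
    def _set(cls, ctx):
        # The Python worker would call this before running user code.
        cls._local.context = ctx

    def stageId(self):
        return self._stage_id

    def partitionId(self):
        return self._partition_id

    def attemptNumber(self):
        """How many times this task has been attempted (0 for the first try)."""
        return self._attempt_number

    def taskAttemptId(self):
        return self._task_attempt_id


# Example use case from the description: custom logging keyed on the
# task's partition and retry attempt (values here are illustrative).
TaskContext._set(TaskContext(stage_id=1, partition_id=3,
                             attempt_number=0, task_attempt_id=42))
ctx = TaskContext.get()
print("partition=%d attempt=%d" % (ctx.partitionId(), ctx.attemptNumber()))
```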



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org