Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/12/08 09:59:58 UTC

[jira] [Assigned] (SPARK-18576) Expose basic TaskContext info in PySpark

     [ https://issues.apache.org/jira/browse/SPARK-18576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18576:
------------------------------------

    Assignee: Apache Spark

> Expose basic TaskContext info in PySpark
> ----------------------------------------
>
>                 Key: SPARK-18576
>                 URL: https://issues.apache.org/jira/browse/SPARK-18576
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: holdenk
>            Assignee: Apache Spark
>            Priority: Minor
>
> Currently, TaskContext info isn't exposed in PySpark. While we don't need to expose the full TaskContext in Python, it would make sense to expose its public APIs for users who are doing custom logging or job handling and need the task ID or retry attempt number.
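As a rough illustration of what such an exposure could look like, here is a minimal sketch of a Python-side TaskContext: a per-task object installed by the worker and retrieved via a classmethod. The class shape and getter names (`get`, `stageId`, `partitionId`, `attemptNumber`, `taskAttemptId`) are modeled on the Scala TaskContext and are assumptions about the eventual PySpark surface, not the actual implementation:

```python
import threading


class TaskContext:
    """Hypothetical sketch of a Python-side TaskContext.

    Getter names mirror the Scala TaskContext; the thread-local
    storage and _set hook are illustrative assumptions about how
    a PySpark worker could install the context before user code runs.
    """

    _local = threading.local()

    def __init__(self, stage_id, partition_id, attempt_number, task_attempt_id):
        self._stage_id = stage_id
        self._partition_id = partition_id
        self._attempt_number = attempt_number
        self._task_attempt_id = task_attempt_id

    @classmethod
    def get(cls):
        # Return the context for the current task, or None when called
        # outside a task (e.g. on the driver).
        return getattr(cls._local, "context", None)

    @classmethod
    def _set(cls, ctx):
        # Internal: the worker would call this before invoking user code.
        cls._local.context = ctx

    def stageId(self):
        return self._stage_id

    def partitionId(self):
        return self._partition_id

    def attemptNumber(self):
        # 0 for the first attempt; incremented on task retries.
        return self._attempt_number

    def taskAttemptId(self):
        return self._task_attempt_id


# Usage sketch: the worker installs a context, then user code (e.g. a
# custom logging helper inside a mapPartitions function) reads it.
TaskContext._set(TaskContext(stage_id=1, partition_id=3,
                             attempt_number=0, task_attempt_id=42))
ctx = TaskContext.get()
print("partition=%d attempt=%d" % (ctx.partitionId(), ctx.attemptNumber()))
```

Because the values are plain getters, user code only needs read access; the worker side owns construction and installation, which keeps the exposed Python API small.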



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org