Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/05/01 16:50:02 UTC

[jira] [Assigned] (SPARK-30621) Dynamic Pruning thread propagates the localProperties to task

     [ https://issues.apache.org/jira/browse/SPARK-30621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-30621:
------------------------------------

    Assignee: Apache Spark

> Dynamic Pruning thread propagates the localProperties to task
> -------------------------------------------------------------
>
>                 Key: SPARK-30621
>                 URL: https://issues.apache.org/jira/browse/SPARK-30621
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Ajith S
>            Assignee: Apache Spark
>            Priority: Major
>
> Local properties set via SparkContext are not available as TaskContext properties when parallel jobs are executed and the thread pool contains idle threads.
> Explanation:
> When parallel jobs are executed via SubqueryBroadcastExec, the {{relationFuture}} is evaluated on a separate thread. These threads inherit the {{localProperties}} from the SparkContext because they are its child threads.
> The threads are managed by the executionContext (thread pools). Each thread pool keeps idle threads alive for a default {{keepAliveSeconds}} of 60 seconds.
> When an idle thread from the pool is reused for a subsequent query, it does not re-inherit the thread-local properties from the SparkContext (thread properties are inherited only at thread creation), so it ends up with stale or missing properties. As a result, task-set properties are missing when the child thread propagates them to the tasks.
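The inheritance-only-at-creation pitfall described above can be reproduced with plain JVM primitives, since Spark's localProperties is backed by an InheritableThreadLocal. This is a minimal sketch, not Spark's actual code: the class name, the LOCAL_PROPS field, and the propsSeenBy helper are hypothetical stand-ins.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadLocalReuseDemo {
    // Hypothetical stand-in for SparkContext's localProperties,
    // which is an InheritableThreadLocal in Spark.
    static final InheritableThreadLocal<String> LOCAL_PROPS =
        new InheritableThreadLocal<>() {
            @Override protected String initialValue() { return "unset"; }
        };

    // Submit a "job" to the pool and return the properties its worker thread sees.
    static String propsSeenBy(ExecutorService pool) throws Exception {
        return pool.submit(LOCAL_PROPS::get).get();
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(1); // one reusable thread

        LOCAL_PROPS.set("query-1");            // parent sets properties for query 1
        System.out.println(propsSeenBy(pool)); // worker created now: inherits "query-1"

        LOCAL_PROPS.set("query-2");            // parent sets new properties for query 2
        System.out.println(propsSeenBy(pool)); // idle worker reused: still sees "query-1"

        pool.shutdown();
    }
}
```

The second submission reuses the idle worker thread, so the value set for "query-2" never reaches it; this mirrors how a reused Dynamic Pruning thread carries old or empty properties into the task set.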



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org