Posted to issues@spark.apache.org by "Marco Gaido (JIRA)" <ji...@apache.org> on 2019/02/26 09:17:00 UTC

[jira] [Commented] (SPARK-26988) Spark overwrites spark.scheduler.pool if set in configs

    [ https://issues.apache.org/jira/browse/SPARK-26988?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16777755#comment-16777755 ] 

Marco Gaido commented on SPARK-26988:
-------------------------------------

This does indeed seem to be an issue for any property set via `sc.setLocalProperty`, as such properties are not tracked in the session state. This may actually cause regressions. cc [~cloud_fan], who worked on this. I cannot think of a good solution right now. A possible approach would be to introduce a kind of callback that puts the configs set directly on the SparkContext into the session state, but that is not a clean solution.
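A rough sketch of the callback idea (plain Python, not actual Spark code; the class and method names here are purely illustrative): local-property writes are mirrored into the session state, so a later propagation pass would see the override instead of clobbering it.

```python
# Hypothetical sketch of the callback approach, not Spark's real API.
class SessionState:
    """Stand-in for the SQL session state's config map."""
    def __init__(self):
        self.confs = {}

class SparkContextSketch:
    """Stand-in for SparkContext with per-thread local properties."""
    def __init__(self):
        self.local_props = {}
        self._callbacks = []

    def register_local_property_callback(self, cb):
        # Callback fires whenever a local property is set.
        self._callbacks.append(cb)

    def set_local_property(self, key, value):
        self.local_props[key] = value
        for cb in self._callbacks:
            cb(key, value)

# Wire the callback so the session state learns about local overrides.
state = SessionState()
sc = SparkContextSketch()
sc.register_local_property_callback(lambda k, v: state.confs.__setitem__(k, v))

sc.set_local_property("spark.scheduler.pool", "override_pool")
print(state.confs["spark.scheduler.pool"])  # the session state now sees the override
```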

> Spark overwrites spark.scheduler.pool if set in configs
> -------------------------------------------------------
>
>                 Key: SPARK-26988
>                 URL: https://issues.apache.org/jira/browse/SPARK-26988
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 2.4.0
>            Reporter: Dave DeCaprio
>            Priority: Minor
>
> If you set a default spark.scheduler.pool in your configuration when you create a SparkSession and then you attempt to override that configuration by calling setLocalProperty on a SparkSession, as described in the Spark documentation - [https://spark.apache.org/docs/latest/job-scheduling.html#fair-scheduler-pools] - it won't work.
> Spark will go with the original pool name.
> I've traced this down to SQLExecution.withSQLConfPropagated, which copies any key that starts with "spark" from the session state to the local properties. This can end up overwriting the scheduler pool, which is set via spark.scheduler.pool.
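The overwrite described above can be sketched with plain Python dicts (not Spark APIs; the variable names are illustrative): copying every "spark."-prefixed key from the session state into the thread's local properties silently discards a pool set via setLocalProperty.

```python
# Session-level configs present when the SparkSession was created.
session_confs = {
    "spark.scheduler.pool": "default_pool",
    "spark.app.name": "demo",
}

# Local properties for the current thread; the user overrides the pool here,
# mirroring sc.setLocalProperty("spark.scheduler.pool", "override_pool").
local_props = {"spark.scheduler.pool": "override_pool"}

# Roughly what the propagation step does: copy every key starting with
# "spark" from the session state into the local properties.
for key, value in session_confs.items():
    if key.startswith("spark"):
        local_props[key] = value  # silently clobbers the local override

print(local_props["spark.scheduler.pool"])  # the override is lost
```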



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org