Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/07/08 19:38:00 UTC

[jira] [Commented] (SPARK-24761) Check modifiability of config parameters

    [ https://issues.apache.org/jira/browse/SPARK-24761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16536391#comment-16536391 ] 

Apache Spark commented on SPARK-24761:
--------------------------------------

User 'MaxGekk' has created a pull request for this issue:
https://github.com/apache/spark/pull/21730

> Check modifiability of config parameters
> ----------------------------------------
>
>                 Key: SPARK-24761
>                 URL: https://issues.apache.org/jira/browse/SPARK-24761
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Maxim Gekk
>            Priority: Minor
>
> Our customers and support team continuously run into situations where setting a config parameter via *spark.conf.set()* has no effect. It is not clear from a parameter's name whether it is a static parameter or one that can be set at runtime for the current session state. It would be useful to have a method on *RuntimeConfig* that tells the user whether changing the given parameter in the spark-shell or a running notebook would affect current behavior. The method could have the following signature:
> {code:scala}
> def isModifiable(key: String): Boolean
> {code}
> Any config parameter can be checked using syntax like this:
> {code:scala}
> scala> spark.conf.isModifiable("spark.sql.sources.schemaStringLengthThreshold")
> res0: Boolean = false
> {code}
> or for a Spark Core parameter:
> {code:scala}
> scala> spark.conf.isModifiable("spark.task.cpus")
> res1: Boolean = false
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org