Posted to reviews@spark.apache.org by xuanyuanking <gi...@git.apache.org> on 2018/06/30 07:50:16 UTC

[GitHub] spark pull request #21648: [SPARK-24665][PySpark] Use SQLConf in PySpark to ...

Github user xuanyuanking commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21648#discussion_r199316323
  
    --- Diff: python/pyspark/sql/conf.py ---
    @@ -64,6 +64,97 @@ def _checkType(self, obj, identifier):
                                 (identifier, obj, type(obj).__name__))
     
     
    +class ConfigEntry(object):
    +    """An entry containing all the meta information for a configuration"""
    +
    +    def __init__(self, confKey):
    +        """Create a new ConfigEntry with the given config key"""
    +        self.confKey = confKey
    +        self.converter = None
    +        self.default = _NoValue
    +
    +    def boolConf(self):
    +        """Designate the current config entry as a boolean config"""
    +        self.converter = lambda x: str(x).lower() == "true"
    +        return self
    +
    +    def intConf(self):
    +        """Designate the current config entry as an integer config"""
    +        self.converter = lambda x: int(x)
    +        return self
    +
    +    def stringConf(self):
    +        """Designate the current config entry as a string config"""
    +        self.converter = lambda x: str(x)
    +        return self
    +
    +    def withDefault(self, default):
    +        """Set a default value for the current config entry; the default
    +        remains _NoValue when no value is given"""
    +        self.default = default
    +        return self
    +
    +    def read(self, ctx):
    +        """Read the value of this config entry through the SQL context"""
    +        return self.converter(ctx.getConf(self.confKey, self.default))
    +
    +
    +class SQLConf(object):
    +    """A class for reading SQL config parameters in PySpark"""
    +
    +    REPL_EAGER_EVAL_ENABLED = ConfigEntry("spark.sql.repl.eagerEval.enabled")\
    --- End diff --
    
    Looking at the current implementation, maybe we can directly use the SQLConf in SessionState? I did this in the last commit 453004a. Please have a look when you have time, thanks :)
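
For reference, the builder pattern in the diff above can be exercised with a minimal, self-contained sketch. The `FakeContext` class below is a stand-in for the real SQL context (only the `getConf(key, default)` call is assumed) and is not part of the proposed patch; the two config keys are the REPL eager-eval settings discussed in this PR.

```python
_NoValue = object()  # sentinel for "no default", mirroring pyspark's internal _NoValue


class ConfigEntry(object):
    """An entry containing the meta information for one configuration key."""

    def __init__(self, confKey):
        self.confKey = confKey
        self.converter = None
        self.default = _NoValue

    def boolConf(self):
        # Interpret the raw string value as a boolean.
        self.converter = lambda x: str(x).lower() == "true"
        return self

    def intConf(self):
        # Interpret the raw string value as an integer.
        self.converter = lambda x: int(x)
        return self

    def withDefault(self, default):
        self.default = default
        return self

    def read(self, ctx):
        # Fetch the raw value (or the default) and convert it.
        return self.converter(ctx.getConf(self.confKey, self.default))


class FakeContext(object):
    """Stand-in for the SQL context: returns stored values or the given default."""

    def __init__(self, confs):
        self.confs = confs

    def getConf(self, key, default):
        return self.confs.get(key, default)


ctx = FakeContext({"spark.sql.repl.eagerEval.enabled": "true"})
eager = ConfigEntry("spark.sql.repl.eagerEval.enabled").boolConf().withDefault("false")
max_rows = ConfigEntry("spark.sql.repl.eagerEval.maxNumRows").intConf().withDefault("20")

print(eager.read(ctx))     # value set in the fake context
print(max_rows.read(ctx))  # falls back to the default
```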


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org