Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2018/09/10 17:50:00 UTC

[jira] [Updated] (SPARK-25397) SparkSession.conf fails when given default value with Python 3

     [ https://issues.apache.org/jira/browse/SPARK-25397?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Joseph K. Bradley updated SPARK-25397:
--------------------------------------
    Priority: Minor  (was: Major)

> SparkSession.conf fails when given default value with Python 3
> --------------------------------------------------------------
>
>                 Key: SPARK-25397
>                 URL: https://issues.apache.org/jira/browse/SPARK-25397
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Joseph K. Bradley
>            Priority: Minor
>
> Spark 2.3.1 has a Python 3 incompatibility when requesting a conf value from SparkSession with a non-string default value. Reproduce via the SparkSession call:
> {{spark.conf.get("myConf", False)}}
> This gives the error:
> {code}
> >>> spark.conf.get("myConf", False)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 51, in get
>     self._checkType(default, "default")
>   File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 62, in _checkType
>     if not isinstance(obj, str) and not isinstance(obj, unicode):
> *NameError: name 'unicode' is not defined*
> {code}
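> As a workaround (my suggestion, not part of the original report), passing a string default and converting afterwards sidesteps the broken check, e.g. {{spark.conf.get("myConf", "false") == "true"}}.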
> The offending line is in branch-2.3's https://github.com/apache/spark/blob/branch-2.3/python/pyspark/sql/conf.py, whose {{_checkType}} helper references the name {{unicode}}, a builtin that does not exist in Python 3.
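> A minimal sketch of a version-agnostic check (my assumption about one possible fix, not necessarily the patch adopted upstream) would only reference {{unicode}} when the interpreter actually defines it:
> {code}
> import sys
>
> def _checkType(self, obj, identifier):
>     """Raise TypeError unless obj is a string (str, or additionally unicode on Python 2)."""
>     # Python 3 has no separate 'unicode' builtin, so only reference it on Python 2;
>     # the unevaluated branch of the conditional expression is never executed on Python 3.
>     string_types = (str, unicode) if sys.version_info[0] < 3 else (str,)
>     if not isinstance(obj, string_types):
>         raise TypeError("expected %s '%s' to be a string (was '%s')"
>                         % (identifier, obj, type(obj).__name__))
> {code}
> Whether the check should instead be relaxed to accept non-string defaults (so that {{spark.conf.get("myConf", False)}} works as-is) is a separate design question for the maintainers.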


