Posted to issues@spark.apache.org by "Haejoon Lee (Jira)" <ji...@apache.org> on 2023/03/03 16:35:00 UTC

[jira] [Updated] (SPARK-42663) Fix `default_session` to work properly

     [ https://issues.apache.org/jira/browse/SPARK-42663?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Haejoon Lee updated SPARK-42663:
--------------------------------
    Summary: Fix `default_session` to work properly  (was: Fix `SparkSession.conf.get` to work properly)

> Fix `default_session` to work properly
> --------------------------------------
>
>                 Key: SPARK-42663
>                 URL: https://issues.apache.org/jira/browse/SPARK-42663
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect, ps
>    Affects Versions: 3.5.0
>            Reporter: Haejoon Lee
>            Priority: Major
>
> Currently, `default_session` does not work properly on Spark Connect, because `SparkSession.conf.get` is not working as expected:
> {code:python}
> >>> spark = default_session()
> >>> spark.conf.set("default_index_type", "sequence")
> >>> spark.conf.get("default_index_type")
> 'sequence'
> >>>
> >>> spark = default_session()
> >>> spark.conf.get("default_index_type")
> Traceback (most recent call last):
> ...
> pyspark.errors.exceptions.connect.SparkConnectGrpcException: (java.util.NoSuchElementException) default_index_type
> {code}
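> As a possible interim workaround (a sketch only, not the fix tracked by this ticket, and assuming `conf.get` accepts an explicit default on the Connect client as it does in regular PySpark), callers can pass a fallback to `conf.get` so a missing key does not raise:
> {code:python}
> from pyspark.pandas.utils import default_session
>
> spark = default_session()
> # Ask for the key with an explicit fallback instead of letting a missing
> # key raise java.util.NoSuchElementException.
> index_type = spark.conf.get("default_index_type", None)
> if index_type is None:
>     # Key not visible to this session: (re)set it explicitly.
>     spark.conf.set("default_index_type", "sequence")
>     index_type = "sequence"
> {code}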
> In regular PySpark (without Spark Connect), the same sequence works as expected:
> {code:python}
> >>> spark = default_session()
> >>> spark.conf.set("default_index_type", "sequence")
> >>> spark.conf.get("default_index_type")
> 'sequence'
> >>>
> >>> spark = default_session()
> >>> spark.conf.get("default_index_type")
> 'sequence'{code}
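> Once fixed, a minimal regression check could look like the sketch below (the helper name is an assumption for illustration; only the set/get round-trip comes from this ticket):
> {code:python}
> from pyspark.pandas.utils import default_session
>
> def check_default_session_conf_roundtrip():
>     # A config value set through one default_session() call...
>     spark = default_session()
>     spark.conf.set("default_index_type", "sequence")
>     # ...must stay visible after re-acquiring the session, since
>     # default_session() should return the same underlying session.
>     assert default_session().conf.get("default_index_type") == "sequence"
> {code}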



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org