Posted to issues@spark.apache.org by "Chungmin Lee (Jira)" <ji...@apache.org> on 2023/10/05 03:44:00 UTC
[jira] [Updated] (SPARK-45417) Make InheritableThread inherit active session
[ https://issues.apache.org/jira/browse/SPARK-45417?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chungmin Lee updated SPARK-45417:
---------------------------------
Summary: Make InheritableThread inherit active session (was: InheritableThread doesn't inherit active session)
> Make InheritableThread inherit active session
> ---------------------------------------------
>
> Key: SPARK-45417
> URL: https://issues.apache.org/jira/browse/SPARK-45417
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 3.5.0
> Reporter: Chungmin Lee
> Priority: Major
>
> Repro:
> {code:python}
> from multiprocessing.pool import ThreadPool
> from pyspark import inheritable_thread_target
> from pyspark.sql import SparkSession
> spark = SparkSession.builder.appName("Test").getOrCreate()
> spark.sparkContext.setLogLevel("ERROR")
> def f(i, spark):
>     print(f"{i} spark = {spark}")
>     print(f"{i} active session = {SparkSession.getActiveSession()}")
>     print(f"{i} local property foo = {spark.sparkContext.getLocalProperty('foo')}")
>     spark = SparkSession.builder.appName("Test").getOrCreate()
>     print(f"{i} spark = {spark}")
>     print(f"{i} active session = {SparkSession.getActiveSession()}")
> pool = ThreadPool(4)
> spark.sparkContext.setLocalProperty("foo", "bar")
> pool.starmap(inheritable_thread_target(f), [(i, spark) for i in range(4)]){code}
> {{getOrCreate()}} doesn't set the active session in the worker thread either. The only workaround is to call the Java method directly: {{spark._jsparkSession.setActiveSession(spark._jsparkSession)}}.
>
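For context, the fix being requested follows the same capture-and-restore pattern that {{inheritable_thread_target}} already uses for local properties. The sketch below is illustrative only, not PySpark's actual implementation: it uses a plain {{threading.local}} slot as a stand-in for the active-session state, captures it in the parent thread at wrap time, and restores it inside the child thread. All names here ({{set_active_session}}, {{inheritable_target}}, etc.) are hypothetical.

```python
# Illustrative sketch (NOT PySpark's implementation): the capture-and-restore
# pattern behind inheritable_thread_target, applied to a hypothetical
# thread-local "active session" slot.
import threading

_local = threading.local()  # stand-in for the active-session thread-local

def set_active_session(session):
    _local.session = session

def get_active_session():
    return getattr(_local, "session", None)

def inheritable_target(f):
    # Runs in the parent thread: snapshot its active session now.
    captured = get_active_session()
    def wrapped(*args, **kwargs):
        # Runs in the child thread: restore the parent's session first.
        set_active_session(captured)
        return f(*args, **kwargs)
    return wrapped

# The parent thread sets an active "session"; an unwrapped child sees None,
# while a wrapped child inherits the parent's value.
set_active_session("parent-session")
seen = {}

def f(key):
    seen[key] = get_active_session()

t1 = threading.Thread(target=f, args=("plain",))
t2 = threading.Thread(target=inheritable_target(f), args=("inherited",))
t1.start(); t2.start()
t1.join(); t2.join()
# seen == {"plain": None, "inherited": "parent-session"}
```

The key detail is that the snapshot happens when the target is wrapped in the parent thread, not when the child runs; this is why wrapping must occur before handing the function to the pool, as in the repro above.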
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org