Posted to issues@spark.apache.org by "Faiz Halde (Jira)" <ji...@apache.org> on 2023/11/22 03:35:00 UTC

[jira] [Updated] (SPARK-46046) Isolated classloader per spark session

     [ https://issues.apache.org/jira/browse/SPARK-46046?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Faiz Halde updated SPARK-46046:
-------------------------------
    Description: 
Hello,

We use Spark 3.5.0 and were wondering if the following is achievable using spark-core.

Our use case involves spinning up a Spark cluster wherein the driver application loads user jars on the fly (the user jar is not the Spark driver/application itself, merely a catalog of transformations). A single Spark application can load multiple jars in its lifetime, with the potential for classpath conflicts if care is not taken by the framework.

The driver needs to load the jar, add the jar to the executors, and call a predefined class.method to trigger the transformation.
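
For concreteness, here is roughly what our driver does today (a sketch only; the jar path and the com.example.Transform entry point are made-up names, not our real ones):

    import java.net.URLClassLoader
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.getOrCreate()

    // Hypothetical user jar handed to the driver at runtime.
    val jarPath = "/tmp/user-transforms.jar"

    // Ship the jar to the executors, and make it loadable on the driver.
    spark.sparkContext.addJar(jarPath)
    val loader = new URLClassLoader(
      Array(new java.io.File(jarPath).toURI.toURL),
      getClass.getClassLoader)

    // Invoke the predefined entry point, assumed here to be a static
    // com.example.Transform.run(SparkSession) method.
    val cls = loader.loadClass("com.example.Transform")
    cls.getMethod("run", classOf[SparkSession]).invoke(null, spark)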

Each transformation runs in its own Spark session inside the same Spark application.
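
That is, something along these lines (sketch), which is exactly where the classpath problem comes from:

    // One session per transformation, all inside the same application.
    // newSession() gives each transformation its own SQL conf, temp views
    // and UDF registry, but all sessions share one SparkContext, so jars
    // added via sparkContext.addJar land on a single, shared executor
    // classpath.
    val sessionA = spark.newSession()
    val sessionB = spark.newSession()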

AFAIK, on the executor side, an isolated classloader per session is only possible when using the Spark Connect facilities. Is it possible to do this without using Spark Connect?

Spark Connect is the only facility that adds the jar into a per-session (sessionUUID) directory on the executor; when an executor runs a job from that session, it installs a child-first classloader pointing at that sessionUUID directory.
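
To make the mechanism we are after concrete, a simplified sketch of such a child-first loader (Spark ships a similar class, org.apache.spark.util.ChildFirstURLClassLoader; the code below is illustrative, not Spark's actual implementation):

    import java.net.{URL, URLClassLoader}

    // Child-first delegation: look in the session's own jars first and
    // only fall back to the default (parent) classloader on a miss. Our
    // understanding is that Spark Connect executors set up a loader like
    // this over the sessionUUID directory.
    class ChildFirstLoader(sessionJars: Array[URL], fallback: ClassLoader)
        extends URLClassLoader(sessionJars, null) {
      override def loadClass(name: String, resolve: Boolean): Class[_] =
        try super.loadClass(name, resolve)
        catch { case _: ClassNotFoundException => fallback.loadClass(name) }
    }

    // Usage (hypothetical helper for the session's jar URLs):
    //   new ChildFirstLoader(jarsInSessionDir, getClass.getClassLoader)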

 

Thank you

> Isolated classloader per spark session
> --------------------------------------
>
>                 Key: SPARK-46046
>                 URL: https://issues.apache.org/jira/browse/SPARK-46046
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 3.5.0
>            Reporter: Faiz Halde
>            Priority: Major
>


