Posted to issues@spark.apache.org by "Naved Alam (JIRA)" <ji...@apache.org> on 2019/04/22 16:12:00 UTC

[jira] [Created] (SPARK-27541) Refresh class definitions for jars added via addJar()

Naved Alam created SPARK-27541:
----------------------------------

             Summary: Refresh class definitions for jars added via addJar()
                 Key: SPARK-27541
                 URL: https://issues.apache.org/jira/browse/SPARK-27541
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 2.2.3
            Reporter: Naved Alam


Currently, once a class has been loaded by an executor, its definition cannot be updated, because a classloader will not load an already-loaded class a second time. For use cases with long-running SparkContexts, this becomes a problem whenever the definition of one of these classes needs to change.
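
A minimal sketch (plain JVM code, not Spark internals) of why this happens: a URLClassLoader caches every class it has loaded, so repeated loads return the same Class object and a rebuilt jar on disk is never picked up by that loader. The jar path and class name below are hypothetical.

    import java.net.{URL, URLClassLoader}

    object ClassLoaderCaching {
      def main(args: Array[String]): Unit = {
        // Hypothetical jar containing com.example.MyUdf
        val loader = new URLClassLoader(
          Array(new URL("file:/tmp/udfs-v1.jar")), getClass.getClassLoader)

        val first  = loader.loadClass("com.example.MyUdf")
        val second = loader.loadClass("com.example.MyUdf")

        // loadClass consults the loader's cache of already-loaded classes first,
        // so both calls return the identical Class object; replacing the jar on
        // disk afterwards has no effect on this loader.
        assert(first eq second)
      }
    }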

There should be a Spark property which, when enabled, allows executors to refresh the definitions of these classes when an updated jar containing them is added via the addJar API.
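
A sketch of how the proposed behavior might be used from a long-running driver. The property name spark.executor.refreshClassDefinitions and the jar paths are made up for illustration; no such property exists today.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("long-running-context")
      // Proposed (hypothetical) flag: let executors reload class definitions
      // from jars added later via addJar.
      .config("spark.executor.refreshClassDefinitions", "true")
      .getOrCreate()

    // First version of the jar is shipped to the executors.
    spark.sparkContext.addJar("/jars/my-udfs-v1.jar")

    // ... later, after the jar has been rebuilt with updated classes ...
    // With the proposed behavior, executors would refresh the definitions
    // instead of reusing the classes already loaded from v1.
    spark.sparkContext.addJar("/jars/my-udfs-v2.jar")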

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org