Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/10/17 19:00:02 UTC

[jira] [Assigned] (SPARK-21840) Allow multiple SparkSubmit invocations in same JVM without polluting system properties

     [ https://issues.apache.org/jira/browse/SPARK-21840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-21840:
------------------------------------

    Assignee: Apache Spark

> Allow multiple SparkSubmit invocations in same JVM without polluting system properties
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-21840
>                 URL: https://issues.apache.org/jira/browse/SPARK-21840
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Marcelo Vanzin
>            Assignee: Apache Spark
>            Priority: Minor
>
> Filing this as a sub-task of SPARK-11035; this feature was discussed as part of the PR currently attached to that bug.
> Basically, to allow the launcher library to run applications in-process, the easiest way is for it to run the {{SparkSubmit}} class. But that class currently propagates configuration to applications by modifying system properties.
> That means that when launching multiple applications in that manner in the same JVM, the configuration of the first application may leak into the second application (or to any other invocation of {{new SparkConf()}}, for that matter).
> This feature is about breaking out the fix for this particular issue from the PR linked to SPARK-11035. With the changes in SPARK-21728, the implementation can even be further enhanced by providing an actual {{SparkConf}} instance to the application, instead of opaque maps.
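The leak described above can be illustrated without Spark on the classpath. The sketch below is hypothetical (the class and helper names are invented for illustration); it only mimics the relevant behavior of {{SparkConf}}, which by default loads any JVM system property whose key starts with {{spark.}}. Once a first in-process "submit" mutates system properties, a later, unrelated configuration load in the same JVM silently inherits those settings.

```java
import java.util.HashMap;
import java.util.Map;

public class ConfLeakSketch {

    // Stand-in for SparkConf's default behavior of loading all
    // "spark."-prefixed system properties (hypothetical helper).
    static Map<String, String> loadConfFromSystemProps() {
        Map<String, String> conf = new HashMap<>();
        for (String key : System.getProperties().stringPropertyNames()) {
            if (key.startsWith("spark.")) {
                conf.put(key, System.getProperty(key));
            }
        }
        return conf;
    }

    public static void main(String[] args) {
        // First in-process "submit" propagates its config by mutating
        // global system properties, as SparkSubmit does today.
        System.setProperty("spark.app.name", "first-app");

        // A second, unrelated configuration load in the same JVM now
        // picks up the first application's setting.
        Map<String, String> secondConf = loadConfFromSystemProps();
        System.out.println(secondConf.get("spark.app.name"));
    }
}
```

Passing an explicit {{SparkConf}} (or an equivalent key/value map) to each application, instead of relying on the process-global property table, avoids this cross-contamination.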



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org