Posted to user@spark.apache.org by Philip Ogren <ph...@oracle.com> on 2014/07/11 00:27:49 UTC

Multiple SparkContexts with different configurations in same JVM

In several previous versions of Spark (and, I believe, the current
version, 1.0.0, as well) we have noticed that it does not seem possible
to have, in the same JVM, both a "local" SparkContext and a SparkContext
connected to a cluster, whether through the Spark standalone resource
manager or through YARN.  Is this a known issue?  If not, I would be
happy to write up a bug report describing the bad/unexpected behavior
and how to reproduce it.  If so, are there plans to fix it?  Perhaps
there is a JIRA issue I could be pointed to, or other related
discussion.  A minimal sketch of what we are attempting is included
below.
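For reference, here is roughly what we are attempting, reduced to a
minimal sketch.  The object name, app names, and the cluster master URL
are placeholders rather than our actual code, and a YARN master would be
used the same way:

    import org.apache.spark.{SparkConf, SparkContext}

    object MultipleContextsRepro {
      def main(args: Array[String]): Unit = {
        // First context: plain local mode. On its own this works as expected.
        val localConf = new SparkConf()
          .setAppName("local-context")
          .setMaster("local[2]")
        val localSc = new SparkContext(localConf)

        // Second context in the same JVM, pointed at a cluster. The master
        // URL here is a placeholder for a standalone or YARN master. This
        // is where we see the bad/unexpected behavior.
        val clusterConf = new SparkConf()
          .setAppName("cluster-context")
          .setMaster("spark://master-host:7077")
        val clusterSc = new SparkContext(clusterConf)

        // ... submit work to both contexts ...

        clusterSc.stop()
        localSc.stop()
      }
    }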

Thanks,
Philip