Posted to commits@spark.apache.org by me...@apache.org on 2017/03/27 15:53:46 UTC
spark git commit: [SPARK-20088] Do not create new SparkContext in SparkR createSparkContext
Repository: spark
Updated Branches:
refs/heads/master 890493458 -> 0588dc7c0
[SPARK-20088] Do not create new SparkContext in SparkR createSparkContext
## What changes were proposed in this pull request?
Instead of unconditionally creating a new `JavaSparkContext`, we use `SparkContext.getOrCreate`, which reuses the existing `SparkContext` if one is already running.
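The get-or-create pattern this patch relies on can be sketched in plain Scala. The `Context` class below is a hypothetical stand-in, not Spark's actual implementation (which additionally merges configuration and guards against partially constructed contexts); it only illustrates why the second caller gets the already-active instance back instead of hitting the one-context-per-JVM restriction.

```scala
// Minimal sketch of a getOrCreate-style singleton (hypothetical Context class,
// standing in for SparkContext.getOrCreate; not Spark's real implementation).
class Context(val conf: Map[String, String])

object Context {
  private var active: Option[Context] = None

  def getOrCreate(conf: Map[String, String]): Context = synchronized {
    active match {
      case Some(ctx) => ctx           // an instance already exists: reuse it
      case None =>
        val ctx = new Context(conf)   // first call: create and record it
        active = Some(ctx)
        ctx
    }
  }
}

object Demo extends App {
  val c1 = Context.getOrCreate(Map("spark.app.name" -> "demo"))
  val c2 = Context.getOrCreate(Map("spark.app.name" -> "other"))
  // Both calls return the same instance; the second conf map is ignored,
  // mirroring how SparkContext.getOrCreate behaves for a second caller.
  println("same instance: " + (c1 eq c2))
}
```

In the patched `RRDD.scala`, wrapping `SparkContext.getOrCreate(sparkConf)` in a `JavaSparkContext` has the same effect: SparkR attaches to whatever context is already active rather than failing by constructing a second one.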
## How was this patch tested?
Existing tests
Author: Hossein <ho...@databricks.com>
Closes #17423 from falaki/SPARK-20088.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0588dc7c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0588dc7c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0588dc7c
Branch: refs/heads/master
Commit: 0588dc7c0a9f3180dddae0dc202a6d41eb43464f
Parents: 8904934
Author: Hossein <ho...@databricks.com>
Authored: Mon Mar 27 08:53:45 2017 -0700
Committer: Xiangrui Meng <me...@databricks.com>
Committed: Mon Mar 27 08:53:45 2017 -0700
----------------------------------------------------------------------
core/src/main/scala/org/apache/spark/api/r/RRDD.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/0588dc7c/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
index 72ae034..295355c 100644
--- a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
+++ b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
@@ -136,7 +136,7 @@ private[r] object RRDD {
.mkString(File.separator))
}
- val jsc = new JavaSparkContext(sparkConf)
+ val jsc = new JavaSparkContext(SparkContext.getOrCreate(sparkConf))
jars.foreach { jar =>
jsc.addJar(jar)
}