Posted to reviews@spark.apache.org by rekhajoshm <gi...@git.apache.org> on 2018/01/22 03:00:07 UTC
[GitHub] spark pull request #20347: [SPARK-20129][Core] JavaSparkContext should use S...
GitHub user rekhajoshm opened a pull request:
https://github.com/apache/spark/pull/20347
[SPARK-20129][Core] JavaSparkContext should use SparkContext.getOrCreate
## What changes were proposed in this pull request?
Use SparkContext.getOrCreate() instead of always creating a new SparkContext in JavaSparkContext.
## How was this patch tested?
Existing tests
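The proposed change can be sketched with a toy model of the getOrCreate pattern. `Ctx` and `JavaCtx` below are hypothetical stand-ins for `SparkContext` and `JavaSparkContext`, not real Spark classes; this is only an illustration of the semantics under discussion:

```java
// Toy model of the proposed change; Ctx and JavaCtx are hypothetical
// stand-ins for SparkContext and JavaSparkContext, not Spark API.
class Ctx {
    private static volatile Ctx active = null;

    // Return the active context if one exists; otherwise create and register one.
    static synchronized Ctx getOrCreate() {
        if (active == null) {
            active = new Ctx();
        }
        return active;
    }
}

class JavaCtx {
    final Ctx sc;

    JavaCtx() {
        // Before the PR the wrapper's constructor always built a fresh context;
        // with the change it delegates to getOrCreate and reuses any active one.
        this.sc = Ctx.getOrCreate();
    }
}
```

Under this model, two `new JavaCtx()` calls end up sharing one underlying context, which is the behavior change the reviewers debate below.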
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rekhajoshm/spark SPARK-20129
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20347.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #20347
----
commit e3677c9fa9697e0d34f9df52442085a6a481c9e9
Author: Rekha Joshi <re...@...>
Date: 2015-05-05T23:10:08Z
Merge pull request #1 from apache/master
Pulling functionality from apache spark
commit 106fd8eee8f6a6f7c67cfc64f57c1161f76d8f75
Author: Rekha Joshi <re...@...>
Date: 2015-05-08T21:49:09Z
Merge pull request #2 from apache/master
pull latest from apache spark
commit 0be142d6becba7c09c6eba0b8ea1efe83d649e8c
Author: Rekha Joshi <re...@...>
Date: 2015-06-22T00:08:08Z
Merge pull request #3 from apache/master
Pulling functionality from apache spark
commit 6c6ee12fd733e3f9902e10faf92ccb78211245e3
Author: Rekha Joshi <re...@...>
Date: 2015-09-17T01:03:09Z
Merge pull request #4 from apache/master
Pulling functionality from apache spark
commit b123c601e459d1ad17511fd91dd304032154882a
Author: Rekha Joshi <re...@...>
Date: 2015-11-25T18:50:32Z
Merge pull request #5 from apache/master
pull request from apache/master
commit c73c32aadd6066e631956923725a48d98a18777e
Author: Rekha Joshi <re...@...>
Date: 2016-03-18T19:13:51Z
Merge pull request #6 from apache/master
pull latest from apache spark
commit 7dbf7320057978526635bed09dabc8cf8657a28a
Author: Rekha Joshi <re...@...>
Date: 2016-04-05T20:26:40Z
Merge pull request #8 from apache/master
pull latest from apache spark
commit 5e9d71827f8e2e4d07027281b80e4e073e7fecd1
Author: Rekha Joshi <re...@...>
Date: 2017-05-01T23:00:30Z
Merge pull request #9 from apache/master
Pull apache spark
commit 63d99b3ce5f222d7126133170a373591f0ac67dd
Author: Rekha Joshi <re...@...>
Date: 2017-09-30T22:26:44Z
Merge pull request #10 from apache/master
pull latest apache spark
commit a7fc787466b71784ff86f9694f617db0f1042da8
Author: Rekha Joshi <re...@...>
Date: 2018-01-21T00:17:58Z
Merge pull request #11 from apache/master
Apache spark pull latest
commit b1ae5125f65e0d8a59a4006a9777ed5185a758c9
Author: rjoshi2 <re...@...>
Date: 2018-01-22T02:53:06Z
[SPARK-20129][Core] JavaSparkContext should use SparkContext.getOrCreate
----
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20347
Can you please explain why we need to change to `getOrCreate`?
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20347
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/84/
Test PASSed.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/20347
@rekhajoshm I think maybe the right resolution here is to do nothing. I haven't heard from @mengxr on his old JIRA about making this change. Thank you for chasing down open JIRAs like this, of course.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by rekhajoshm <gi...@git.apache.org>.
Github user rekhajoshm commented on the issue:
https://github.com/apache/spark/pull/20347
Thank you @srowen, I admire the work you do across all the JIRAs/PRs I have studied and followed up on.
If it's ok, I will keep this PR open for a few days, and close it if the JIRA moves to a 'Not an issue'/'Won't fix' state. Thanks.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20347
Using `getOrCreate` in the constructor seems to change the semantics. Maybe we can add a new static method to `JavaSparkContext` for such usage.
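One way to read that suggestion, as a hypothetical sketch: leave the constructor's semantics untouched and expose the reuse behavior as a separate factory method. All names below are toy stand-ins for illustration, not Spark API:

```java
// Toy sketch of the alternative design: constructor semantics unchanged,
// getOrCreate added as a static factory. Ctx/JavaCtx are hypothetical
// stand-ins, not real Spark classes.
class Ctx {
    private static volatile Ctx active = null;

    static synchronized Ctx getOrCreate() {
        if (active == null) {
            active = new Ctx();
        }
        return active;
    }
}

class JavaCtx {
    final Ctx sc;

    // Constructor stays as it is today: wrap exactly the context it is given.
    JavaCtx(Ctx sc) {
        this.sc = sc;
    }

    // Hypothetical new static entry point that reuses the active context.
    static JavaCtx getOrCreate() {
        return new JavaCtx(Ctx.getOrCreate());
    }
}
```

Callers who want reuse opt in via the factory; existing constructor callers keep today's behavior.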
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/20347
@mengxr suggested this in the JIRA originally -- what was the reasoning? It makes some sense, but so does leaving the current behavior, where a constructor calls a constructor. It's a behavior change, albeit a slight one.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/20347
**[Test build #86456 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86456/testReport)** for PR 20347 at commit [`b1ae512`](https://github.com/apache/spark/commit/b1ae5125f65e0d8a59a4006a9777ed5185a758c9).
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20347
Merged build finished. Test PASSed.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/20347
**[Test build #86456 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86456/testReport)** for PR 20347 at commit [`b1ae512`](https://github.com/apache/spark/commit/b1ae5125f65e0d8a59a4006a9777ed5185a758c9).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.
---
[GitHub] spark pull request #20347: [SPARK-20129][Core] JavaSparkContext should use S...
Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/20347
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20347
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/86456/
Test PASSed.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/20347
Yes, you can already get the new semantics here with `new JavaSparkContext(SparkContext.getOrCreate())`.
Yes, probably better to add a new method, or else decide that it's not worth a new API method just as a shortcut for the above. Maybe that's the right conclusion, unless @mengxr comes back with a particular reason to change the behavior slightly.
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on the issue:
https://github.com/apache/spark/pull/20347
My major concern is that, if there is an existing `SparkContext`, some confs you set may not take effect, as described in `SparkContext.getOrCreate()`. It's hard to enumerate the use cases, but I'm sure there are some that pass specific confs to create a new `JavaSparkContext`, so I tend to keep the current behavior here.
On the other hand, the following comment is copied from the doc comment of the class `JavaSparkContext`:
```
* Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before
* creating a new one. This limitation may eventually be removed; see SPARK-2243 for more details.
```
If that is the case, there should be no active `SparkContext` before we initiate the `JavaSparkContext`, so the change doesn't bring any advantage in that regard.
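The conf concern can be made concrete with a toy model of the getOrCreate semantics; `Ctx` here is a hypothetical stand-in, not Spark API. Once a context is active, a second caller's configuration is silently ignored:

```java
import java.util.Map;

// Toy illustration of the conf-dropping concern: the first caller's conf
// wins, later confs are ignored. Ctx is a hypothetical stand-in for
// SparkContext, not real Spark code.
class Ctx {
    final Map<String, String> conf;
    private static volatile Ctx active = null;

    private Ctx(Map<String, String> conf) {
        this.conf = conf;
    }

    static synchronized Ctx getOrCreate(Map<String, String> conf) {
        if (active == null) {
            active = new Ctx(conf);  // first caller's conf is applied
        }
        return active;               // later confs are silently ignored
    }
}
```

For example, a second call with `"spark.executor.memory" -> "8g"` after a `"2g"` context already exists returns the first instance, and the 8g setting is never applied.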
---
[GitHub] spark issue #20347: [SPARK-20129][Core] JavaSparkContext should use SparkCon...
Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20347
Merged build finished. Test PASSed.
---