Posted to reviews@spark.apache.org by lgrcyanny <gi...@git.apache.org> on 2017/08/29 15:04:11 UTC

[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

GitHub user lgrcyanny opened a pull request:

    https://github.com/apache/spark/pull/19079

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode

    ## What changes were proposed in this pull request?
    When using SparkFiles.get to access a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
    The exception only happens on the driver; SparkFiles.get on executors works fine.
    The bug can be reproduced as follows:
    ```scala
    import java.io.File
    import scala.io.Source
    import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

    val conf = new SparkConf().setAppName("SparkFilesTest")
    val sc = new SparkContext(conf)

    def testOnDriver(fileName: String): Unit = {
      val file = new File(SparkFiles.get(fileName))
      if (!file.exists()) {
        println(s"$file does not exist")
      } else {
        // print the file content on the driver
        val content = Source.fromFile(file).getLines().mkString("\n")
        println(s"File content: $content")
      }
    }

    testOnDriver("README.md")
    // on the driver the output is "... does not exist"
    ```
    
    ```python
    import os
    from pyspark import SparkConf, SparkContext, SparkFiles

    conf = SparkConf().setAppName("SparkFilesTest")
    sc = SparkContext(conf=conf)

    def test_on_driver(filename):
        path = SparkFiles.get(filename)
        print("file path: {}".format(path))
        if os.path.exists(path):
            with open(path) as f:
                print(f.readlines())
        else:
            print("file doesn't exist")
            os.system("ls .")

    test_on_driver("README.md")
    ```
    In both cases the output reports that the file does not exist.
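    For reference, a rough sketch of where the driver-side lookup points (paraphrased, not the exact Spark source): `SparkFiles.get` simply joins the file name onto `SparkFiles.getRootDirectory()`, so the resulting path does not exist when that directory on the driver does not actually contain the files passed with `--files`.
    ```scala
    import java.io.File
    import org.apache.spark.SparkFiles

    // Where the driver-side lookup resolves to; "README.md" is just an example name.
    val root = SparkFiles.getRootDirectory()
    val resolved = new File(root, "README.md")
    println(s"$resolved exists: ${resolved.exists()}")
    ```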
    
    ## How was this patch tested?
    
    Tested with integration tests and manual tests.
    Submitted the demo case in yarn-cluster and yarn-client mode and verified the results.
    The testing commands are:
    ```
    ./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
    ./bin/spark-submit --master yarn-client --files README.md test_get_files.py
    ```


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/lgrcyanny/spark fix-yarn-files-problem

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19079.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19079
    
----
commit 3f0e4a88bdb7156b5db7cfb56cd079d4b0de3a5b
Author: lgrcyanny <lg...@gmail.com>
Date:   2017-05-07T12:51:55Z

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode
    
    When using SparkFiles.get to access a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
    The exception only happens on the driver; SparkFiles.get on executors works fine.
    The bug can be reproduced as follows:
    ```scala
    import java.io.File
    import scala.io.Source
    import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

    val conf = new SparkConf().setAppName("SparkFilesTest")
    val sc = new SparkContext(conf)

    def testOnDriver(fileName: String): Unit = {
      val file = new File(SparkFiles.get(fileName))
      if (!file.exists()) {
        println(s"$file does not exist")
      } else {
        // print the file content on the driver
        val content = Source.fromFile(file).getLines().mkString("\n")
        println(s"File content: $content")
      }
    }

    testOnDriver("README.md")
    // on the driver the output is "... does not exist"
    ```
    
    ```python
    import os
    from pyspark import SparkConf, SparkContext, SparkFiles

    conf = SparkConf().setAppName("SparkFilesTest")
    sc = SparkContext(conf=conf)

    def test_on_driver(filename):
        path = SparkFiles.get(filename)
        print("file path: {}".format(path))
        if os.path.exists(path):
            with open(path) as f:
                print(f.readlines())
        else:
            print("file doesn't exist")
            os.system("ls .")

    test_on_driver("README.md")
    ```
    In both cases the output reports that the file does not exist.
    
    Tested with integration tests and manual tests.
    Submitted the demo case in yarn-cluster and yarn-client mode and verified the results.
    
    ```
    ./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
    ./bin/spark-submit --master yarn-client --files README.md test_get_files.py
    ```
    
    Change-Id: I22034f99f571a451b862c1806b7f9350c6133c95

----




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny closed the pull request at:

    https://github.com/apache/spark/pull/19079




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135973949
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    Also, as @vanzin mentioned, the fact that `SparkFiles.get` works in yarn-cluster mode is not by design, so to fix this issue I think you need a more solid patch.




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Please close this PR @lgrcyanny thanks!




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r136055854
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    Hi @jerryshao, regarding using remote files to handle the yarn-client files problem, is there a JIRA that explains the design? We can wait for a version that resolves the problem.
    
    I think my fix solves the problem in a simple way; do you have any other idea for solving it more elegantly?




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135973543
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    I'm not saying `SparkFiles.get` is not useful; I'm saying your fix is not correct, because the changes here will break the original semantics. Also, we recently added support for remote files; to handle this scenario we should think about how to address the problem for all the cluster managers, not only yarn-client mode.




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135968835
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    I have met some users who complained about the weird behavior of SparkFiles.get in yarn-client and yarn-cluster mode. SparkFiles.get is a very easy way for users to get a file path, so why not keep the same behavior in yarn-cluster and yarn-client mode?
    Meanwhile, spark.yarn.dist.files is not very easy for users, since files must be uploaded to HDFS in advance. To make Spark on YARN more usable, SparkFiles.get is better.
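    
    For context, a hedged sketch of the `spark.yarn.dist.files` route being contrasted here (the path is illustrative): the value is a comma-separated list of URIs that the YARN client ships through the distributed cache.
    ```scala
    import org.apache.spark.SparkConf

    // Illustrative only: pre-distribute a file via the YARN-specific property instead of --files.
    val conf = new SparkConf()
      .setAppName("DistFilesExample")
      .set("spark.yarn.dist.files", "hdfs:///user/someuser/README.md")
    ```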




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by brad-kaiser <gi...@git.apache.org>.
Github user brad-kaiser commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135846355
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
    @@ -393,7 +393,7 @@ object SparkEnv extends Logging {
         // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
         // called, and we only need to do it for driver. Because driver may run as a service, and if we
         // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
    -    if (isDriver) {
    +    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
    --- End diff --
    
    Tiny nitpick, this might be simpler.
    
    ```scala
    if (isDriver && conf.get("spark.submit.deployMode", "client") == "client") {
    ```




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135943288
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
    @@ -393,7 +393,7 @@ object SparkEnv extends Logging {
         // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
         // called, and we only need to do it for driver. Because driver may run as a service, and if we
         // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
    -    if (isDriver) {
    +    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
    --- End diff --
    
    Originally, my version was:
    ```
    conf.get("spark.submit.deployMode", "client") == "client"
    ```
    Then I referred to the SparkContext#deployMode method, which uses:
    ```
    conf.getOption("spark.submit.deployMode").getOrElse("client")
    ```
    I just wanted to keep the same style as SparkContext.
    Which one is better?
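    
    For reference, a minimal sketch (assuming a `SparkConf` in scope) showing the two styles return the same value:
    ```scala
    import org.apache.spark.SparkConf

    val conf = new SparkConf()  // deploy mode not set, so both forms fall back to "client"
    val viaGetOption = conf.getOption("spark.submit.deployMode").getOrElse("client")
    val viaDefault = conf.get("spark.submit.deployMode", "client")
    assert(viaGetOption == viaDefault)
    ```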




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Close this @lgrcyanny 




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Please file PRs against the master branch.




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r136056972
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    May I ask why the OptionAssigner for "spark.files" works for local, standalone and mesos but not for yarn? Is there any doc explaining the design purpose, or maybe this really is an issue?
    ```
    OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES, "spark.files")
    ```




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135942521
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
    @@ -393,7 +393,7 @@ object SparkEnv extends Logging {
         // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
         // called, and we only need to do it for driver. Because driver may run as a service, and if we
         // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
    -    if (isDriver) {
    +    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
    --- End diff --
    
    Ok, thanks, I will change it.




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Hi @vanzin, I have submitted a PR based on the master branch; please review it, thank you:
    https://github.com/apache/spark/pull/19102




[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135958924
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
             sysProp = "spark.executor.memory"),
           OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
             sysProp = "spark.cores.max"),
    -      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
    +      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
    --- End diff --
    
    The change here is not correct. For YARN applications we use `spark.yarn.dist.files` to handle files, and those files are added to the distributed cache. Your change breaks that current behavior.
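    
    For context, a hedged, runnable check of the routing described above (run on the driver; the exact properties present can vary by submit mode):
    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = SparkContext.getOrCreate(new SparkConf().setAppName("FilesConfCheck"))
    // Under YARN, --files typically surfaces as spark.yarn.dist.files rather than spark.files.
    println(sc.getConf.getOption("spark.files"))
    println(sc.getConf.getOption("spark.yarn.dist.files"))
    ```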




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    Currently, for a Spark yarn-client application, we don't support fetching files on the driver with the `SparkFiles.get` API shown above. Since you already know where the file is in client mode, maybe you don't need to call `SparkFiles.get`.
    
    Fixing this properly actually requires several changes, including supporting remote files.
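    
    A hedged workaround sketch along those lines (the file name is just an example, and `sc` is an existing SparkContext): read the local path directly on the driver, and keep `SparkFiles.get` for code that runs on executors, where it already resolves correctly.
    ```scala
    import scala.io.Source
    import org.apache.spark.SparkFiles

    // Driver side (yarn-client): the --files path is local to the submitting machine.
    val driverContent = Source.fromFile("README.md").getLines().mkString("\n")

    // Executor side: SparkFiles.get resolves to the per-executor copy shipped by --files.
    val executorPaths = sc.parallelize(1 to 2).map(_ => SparkFiles.get("README.md")).collect()
    ```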




[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19079
  
    (Unless the issue does not exist in the master branch, in which case please call that out explicitly.)

