Posted to reviews@spark.apache.org by lgrcyanny <gi...@git.apache.org> on 2017/08/29 08:47:23 UTC

[GitHub] spark pull request #19076: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

GitHub user lgrcyanny opened a pull request:

    https://github.com/apache/spark/pull/19076

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode

    ## What changes were proposed in this pull request?
    When using SparkFiles.get to read a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
    The exception only occurs on the driver; getting files on executors works fine.

    We can reproduce the bug as follows:
    ```scala
    import java.io.File
    import scala.io.Source
    import org.apache.spark.SparkFiles

    def testOnDriver(fileName: String): Unit = {
      val file = new File(SparkFiles.get(fileName))
      if (!file.exists()) {
        println(s"$file does not exist")
      } else {
        // print file content on driver
        val content = Source.fromFile(file).getLines().mkString("\n")
        println(s"File content: $content")
      }
    }
    // the output will be "file does not exist"
    ```
    
    ```python
    import os
    from pyspark import SparkConf, SparkContext, SparkFiles

    conf = SparkConf().setAppName("test files")
    sc = SparkContext(conf=conf)

    def test_on_driver(filename):
        path = SparkFiles.get(filename)
        print("file path: {}".format(path))
        if os.path.exists(path):
            with open(path) as f:
                lines = f.readlines()
            print(lines)
        else:
            print("file doesn't exist")
            os.system("ls .")  # list the working directory for debugging
    ```
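    For context, PySpark's SparkFiles.get essentially joins the files root directory with the file name; it does not verify that the file was ever downloaded there. The simplified mimic below (an illustration, not the real implementation) shows why the returned path can point at a directory the driver never populated:

    ```python
    import os

    # Simplified mimic of SparkFiles.get: join the files root directory
    # with the file name and return the absolute path. Note there is no
    # existence check anywhere in this lookup.
    def spark_files_get(filename, root_dir):
        return os.path.abspath(os.path.join(root_dir, filename))

    # On the driver in yarn-cluster/yarn-client mode, the root directory can
    # be a temp dir that --files never populated (the path below is made up
    # for illustration), so the returned path does not exist on disk.
    print(spark_files_get("README.md", "/tmp/spark-abc123/userFiles"))
    ```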
    ## How was this patch tested?
    Tested with integration tests and manual tests.
    Submitted the demo cases in yarn-cluster and yarn-client mode and verified the results.
    The integration test commands are as follows:
    
    ```shell
    ./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
    ./bin/spark-submit --master yarn-client --files README.md test_get_files.py
    ```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/lgrcyanny/spark fix-yarn-files-problem

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19076.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19076
    
----
commit 5af0b9f5d0b226891410b00ab75327b61b96dcdd
Author: lgrcyanny <lg...@gmail.com>
Date:   2017-05-07T12:51:55Z

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode
    
    Change-Id: Ice7d43fc5ac18fbc229911533d06063ea1f17c5b

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #19076: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/19076
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #19076: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

Posted by lgrcyanny <gi...@git.apache.org>.
Github user lgrcyanny closed the pull request at:

    https://github.com/apache/spark/pull/19076

