Posted to issues@spark.apache.org by "Ram Venkatesh (JIRA)" <ji...@apache.org> on 2015/10/25 23:10:28 UTC

[jira] [Created] (SPARK-11304) SparkR in yarn-client mode fails creating sparkr.zip

Ram Venkatesh created SPARK-11304:
-------------------------------------

             Summary: SparkR in yarn-client mode fails creating sparkr.zip
                 Key: SPARK-11304
                 URL: https://issues.apache.org/jira/browse/SPARK-11304
             Project: Spark
          Issue Type: Bug
          Components: SparkR
    Affects Versions: 1.5.1
            Reporter: Ram Venkatesh


If you run SparkR in yarn-client mode and the Spark installation directory is not writable by the current user, it fails with:

Exception in thread "main" java.io.FileNotFoundException: /usr/hdp/2.3.2.1-12/spark/R/lib/sparkr.zip (Permission denied)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.spark.deploy.RPackageUtils$.zipRLibraries(RPackageUtils.scala:215)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:371)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The behavior is the same with the pre-built spark-1.5.1-bin-hadoop2.6 bits.

We need to either reuse an existing sparkr.zip if one is already present in the R/lib directory, or create the archive in a location that is writable by the submitting user, such as a per-user temporary directory. A sketch of that fallback logic is below.
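A minimal sketch of the proposed behavior, as a standalone Scala helper (the function name and shape are illustrative only, not the actual patch; the real fix would live in SparkSubmit/RPackageUtils):

    import java.io.File
    import java.nio.file.Files

    // Decide where sparkr.zip should live. Reuse a packaged zip if present,
    // create it in place only if R/lib is writable, otherwise fall back to
    // a per-user temp directory instead of the read-only install dir.
    def rLibZipLocation(rLibDir: String): File = {
      val packaged = new File(rLibDir, "sparkr.zip")
      if (packaged.isFile) {
        packaged                      // an existing sparkr.zip: use as-is
      } else if (new File(rLibDir).canWrite) {
        packaged                      // writable install dir: create in place
      } else {
        // read-only install dir: build the zip somewhere the user owns
        val tmpDir = Files.createTempDirectory("sparkr").toFile
        new File(tmpDir, "sparkr.zip")
      }
    }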

Temporary hack workaround: pre-create a world-writable file named sparkr.zip under R/lib. This will still fail if multiple users submit jobs at the same time, since every submission rewrites the same file.
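For reference, the workaround amounts to a touch plus chmod 666; an equivalent using java.nio (the path is the one from the trace above, adjust for your installation):

    import java.nio.file.{Files, Paths}
    import java.nio.file.attribute.PosixFilePermissions

    // Pre-create sparkr.zip so spark-submit can overwrite it later.
    val zip = Paths.get("/usr/hdp/2.3.2.1-12/spark/R/lib/sparkr.zip")
    if (!Files.exists(zip)) Files.createFile(zip)
    // Set permissions explicitly, since createFile honors the process umask.
    Files.setPosixFilePermissions(zip, PosixFilePermissions.fromString("rw-rw-rw-"))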



