Posted to user@spark.apache.org by Shushant Arora <sh...@gmail.com> on 2015/08/03 10:34:05 UTC

spark --files permission error

Is there any setting that allows --files to copy jars from the driver to the
executor nodes?

When I pass some jar files to the executors using --files and add them to the
executor classpath, it throws a FileNotFoundException:

15/08/03 07:59:50 WARN TaskSetManager: Lost task 8.0 in stage 0.0 (TID 8, ip):
java.io.FileNotFoundException: ./jar (Permission denied)
        at java.io.FileOutputStream.open(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
        at org.spark-project.guava.common.io.Files$FileByteSink.openStream(Files.java:223)


I am running the program as:

spark-submit --class classname --files "externaljarname.jar" \
  --driver-class-path "externaljarname.jar" \
  --conf "spark.executor.extraClassPath=externaljarname.jar" \
  mainjar.jar