Posted to user@spark.apache.org by style95 <st...@gmail.com> on 2014/09/17 07:23:20 UTC

permission denied on local dir

I am running Spark on a shared YARN cluster.
My user ID is "online", but I found that when I run my Spark application,
the local directories are created under the "yarn" user ID.
As a result I am unable to delete those local directories, and the
application eventually fails.

Please refer to my log below:

14/09/16 21:59:02 ERROR DiskBlockManager: Exception while deleting local
spark dir:
/hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7
java.io.IOException: Failed to list files for dir:
/hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7/3a
        at org.apache.spark.util.Utils$.listFilesSafely(Utils.scala:580)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:592)
        at
org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:593)
        at
org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:592)
        at
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at
scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:592)
        at
org.apache.spark.storage.DiskBlockManager$$anonfun$stop$1.apply(DiskBlockManager.scala:163)
        at
org.apache.spark.storage.DiskBlockManager$$anonfun$stop$1.apply(DiskBlockManager.scala:160)
        at
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at
scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at
org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:160)
        at
org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:153)
        at
org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:151)
        at
org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:151)
        at
org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at
org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:151)


I am unable to access
"/hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7".
For example, "ls
/hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7"
fails with a "permission denied" error.

I am using Spark 1.0.0 and YARN 2.4.0.

Thanks in advance.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/permission-denied-on-local-dir-tp14422.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: permission denied on local dir

Posted by style95 <st...@gmail.com>.
To be clear, the YARN cluster is managed by another team,
which means I do not have permission to change anything on the system side.

If required, I can send them a request,
but as of now I only have permission to manage my own Spark application.

So if there is any way to solve this problem by changing configuration in
the Spark application, that would be best.
However, if a change at the YARN layer is required, I will need to ask the
managing team.
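
For example, is something like spark.local.dir the right knob on the
application side? A minimal sketch of what I have in mind (the scratch
path is only illustrative, and I am not sure whether YARN overrides this
setting with its own directories):

import org.apache.spark.{SparkConf, SparkContext}

object LocalDirSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("local-dir-sketch")
      // Illustrative path: a scratch directory that my user ("online")
      // can read and write. On YARN the NodeManager may hand Spark its
      // own directories instead, so this might be ignored there.
      .set("spark.local.dir", "/tmp/spark-scratch")

    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}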

Thanks
Regards
Dongkyoung.






--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Permission-denied-on-local-dir-tp14422p14492.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: permission denied on local dir

Posted by style95 <st...@gmail.com>.
However, in my application there is no logic that accesses local files,
so I assume Spark is internally using the local file system to cache
RDDs.

As per the log, the error occurred in Spark's internal logic rather than
in my business logic: it is trying to delete local directories, and they
look like cache directories:

/hadoop02/hadoop/yarn/local/*usercache/online/appcache*/application_1410795082830_3994/spark-local-20140916215842-6fe7

Is there any way to avoid using the cache or local directories, or a way
to access the directories created by the "yarn" user from "online", my
Spark user?
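
For example, would caching with a memory-only storage level keep Spark
away from those directories? A minimal sketch of what I mean (the input
path is only illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object MemoryOnlySketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("memory-only-sketch"))

    // Illustrative input path.
    val data = sc.textFile("hdfs:///some/input")

    // MEMORY_ONLY (the default for cache()) keeps blocks on the JVM heap
    // and recomputes partitions that do not fit, instead of spilling
    // them to the spark-local-* directories. Shuffle files would still
    // go to local disk, though.
    val cached = data.persist(StorageLevel.MEMORY_ONLY)
    println(cached.count())

    sc.stop()
  }
}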

Thanks
Regards
Dongkyoung.




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Permission-denied-on-local-dir-tp14422p14453.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: permission denied on local dir

Posted by Sean Owen <so...@cloudera.com>.
Yes, that is how it is supposed to work. Apps run as "yarn" and are not
generally expected to depend on local file state that is created
externally.

This directory should be owned by yarn though, right? Your error does not
show "permission denied"; it looks like you are unable to list a yarn dir
as your user, but that's expected. What are you expecting to do?
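
For what it's worth, the "Failed to list files" IOException appears to
come from java.io.File#listFiles returning null when a directory cannot
be read. A minimal sketch of what Spark's helper seems to do internally
(the path is illustrative):

import java.io.{File, IOException}

object ListFilesSketch {
  // Roughly what Spark's Utils.listFilesSafely does: File#listFiles
  // returns null (not an empty array) when the JVM cannot read a
  // directory, and Spark turns that null into the IOException you see.
  def listFilesSafely(dir: File): Seq[File] = {
    val files = dir.listFiles()
    if (files == null) {
      throw new IOException("Failed to list files for dir: " + dir)
    }
    files
  }

  def main(args: Array[String]): Unit = {
    // Illustrative path: a directory owned by another user, mode 0700.
    listFilesSafely(new File("/tmp/owned-by-someone-else")).foreach(println)
  }
}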