Posted to user@mahout.apache.org by Darius Miliauskas <da...@gmail.com> on 2013/09/13 16:12:52 UTC

java.io.IOException: Failed to set permissions of path

Dear All,

since I am using Windows 7 (with NetBeans 7), all my attempts to run any
Mahout code through the Java API fail with the following error:

java.io.IOException: Failed to set permissions of path:
\tmp\hadoop-DARIUS\mapred\staging\DARIUS1904012429\.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:942)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:93)
    at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:257)
    at org.apache.mahout.mahoutnewsrecommender2.Recommender.myRecommender(Recommender.java:99)
    at org.apache.mahout.mahoutnewsrecommender2.App.main(App.java:26)

BUILD SUCCESSFUL (total time: 8 seconds)

I checked online for possible solutions. There is a patch
(https://github.com/congainc/patch-hadoop_7682-1.0.x-win), but it does not
fix the problem. Does anyone have experience working around this issue? Is
it possible to modify a Hadoop class to avoid the permission check? And can
it be fixed without installing Hadoop, using just the Hadoop libraries on
the classpath?
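For reference, one workaround that is often suggested for this Hadoop-on-Windows
failure is to subclass the local filesystem and swallow the chmod error, then tell
Hadoop to use that class via the `fs.file.impl` property. The class name
`WinLocalFileSystem` is my own invention, and this is an untested sketch assuming
Hadoop 1.x (the version the patch above targets), not a verified fix:

```java
import java.io.IOException;

import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

/**
 * Local filesystem that ignores setPermission failures, which occur on
 * Windows because Hadoop 1.x shells out to POSIX chmod (HADOOP-7682).
 * Hypothetical sketch; name and approach are assumptions, not Mahout API.
 */
public class WinLocalFileSystem extends LocalFileSystem {

    @Override
    public boolean mkdirs(Path path, FsPermission permission) throws IOException {
        boolean created = super.mkdirs(path);
        // Try to apply the permission, but do not fail the job if it cannot be set.
        setPermission(path, permission);
        return created;
    }

    @Override
    public void setPermission(Path path, FsPermission permission) {
        try {
            super.setPermission(path, permission);
        } catch (IOException e) {
            // Ignore "Failed to set permissions of path ... to 0700" on Windows.
            System.err.println("Ignoring setPermission failure on " + path);
        }
    }
}
```

It would then be registered on the job's Configuration before Mahout submits
anything, e.g. `conf.set("fs.file.impl", WinLocalFileSystem.class.getName());`.
The class must be on the classpath; whether this works without a full Hadoop
install I cannot say.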


Thanks for any detailed suggestions,

Darius