Posted to mapreduce-issues@hadoop.apache.org by "Alina Danila (Commented) (JIRA)" <ji...@apache.org> on 2011/12/25 21:02:30 UTC

[jira] [Commented] (MAPREDUCE-3555) hadoop 0.20.205.0 Eclipse Plugin does not work with Eclipse, there are two problems with it.

    [ https://issues.apache.org/jira/browse/MAPREDUCE-3555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13175848#comment-13175848 ] 

Alina Danila commented on MAPREDUCE-3555:
-----------------------------------------

How / Where do you add commons-lang-2.4.jar and commons-configuration-1.6.jar to the plugin build path?
                
> hadoop 0.20.205.0 Eclipse Plugin does not work with Eclipse, there are two problems with it.
> --------------------------------------------------------------------------------------------
>
>                 Key: MAPREDUCE-3555
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-3555
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: contrib/eclipse-plugin
>    Affects Versions: 0.20.205.0
>         Environment: windows7,Eclipse3.7.1,hadoop-0.20.205.0 on CentOs6.0
>            Reporter: Storm Lee
>              Labels: hadoop
>
> I found two problems in the Eclipse plugin.
> 1. The plugin's build path is missing jars; when I use the DFS view, it reports java.lang.NoClassDefFoundError exceptions as follows:
>     {quote}java.lang.NoClassDefFoundError: org/apache/commons/lang/StringUtils{quote}
>     {quote}java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration{quote}
>   I added commons-lang-2.4.jar and commons-configuration-1.6.jar, and it worked.
> 2. The job configuration may be overridden by the default conf when the job is submitted. Run log as follows:
>   {quote}11/12/14 10:55:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Exception in thread "main" java.io.IOException: Failed to set permissions of path: \usr\local\hadoop\hadooptmp205\mapred\staging\storm-75040524\.staging to 0700
> 	at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
> 	at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:484)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:319)
> 	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
> 	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
> 	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
> 	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:842)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Unknown Source)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
> 	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:842)
> 	at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
> {quote}
> The point is that the job resolved the wrong file system: it used LocalFileSystem.
> According to my configuration the job should run on HDFS, but the property named "mapred.job.tracker" was overridden by its default value.
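
For the second problem, a common workaround (my assumption, not something confirmed in this issue) is to make sure the client-side configuration on the submitting machine's classpath states the cluster addresses explicitly, so the job does not fall back to the local-mode defaults. A sketch of the relevant properties for 0.20.x, with placeholder host names and ports:

```xml
<!-- Placeholder hosts/ports; substitute your cluster's actual addresses. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9001</value>
  </property>
</configuration>
```

If mapred.job.tracker is left unset, the 0.20.x default is "local", which is consistent with the LocalFileSystem staging-directory failure in the log above.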

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira