Posted to common-user@hadoop.apache.org by chris collins <ch...@scoutlabs.com> on 2008/06/07 19:54:57 UTC

Couple of basic hdfs starter issues

Sorry in advance if these "challenges" are covered in a document somewhere.

I have setup hadoop on a centos 64 bit Linux box.  I have verified that it is up and running only through seeing the java processes running and that I can access it from the admin ui.

The Hadoop version is 0.17.0, but I also tried 0.16.4 and hit the same issue:

From a Mac OS X box running Java 1.5, I am trying to run the following:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

String home = "hdfs://linuxbox:9000";
URI uri = new URI(home);
Configuration conf = new Configuration();

FileSystem fs = FileSystem.get(uri, conf);

The call to FileSystem.get throws an IOException stating that there is a login error, with the message "whoami".

When I single-step through the code, Hadoop tries to determine which user is running the process by launching a ProcessBuilder with "whoami".  This fails with a "not found" error.  I believe this is because ProcessBuilder needs a fully qualified path to the executable on the Mac?
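For what it's worth, the failure can be reproduced outside Hadoop.  This is a minimal sketch (class name is mine, commands are illustrative) showing that ProcessBuilder resolves a bare program name against the PATH of the launching process, so an IDE started with a stripped-down environment can fail where a terminal succeeds:

```java
import java.io.IOException;

public class WhoamiCheck {
    // Tries to launch a command; returns false if ProcessBuilder
    // cannot find the executable on this process's PATH.
    static boolean canLaunch(String... cmd) {
        try {
            Process p = new ProcessBuilder(cmd).start();
            p.waitFor();
            return true;
        } catch (IOException e) {
            // e.g. "Cannot run program "whoami": error=2, No such file or directory"
            return false;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        // Bare name: resolved against the PATH this JVM inherited.
        System.out.println("whoami via PATH: " + canLaunch("whoami"));
    }
}
```

If the bare "whoami" fails inside the IDE but a fully qualified "/usr/bin/whoami" works, the JVM's inherited PATH is the problem, not the Mac itself.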

I also verified that my hadoop-default.xml and hadoop-site.xml are in fact found on the classpath.
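For reference, the relevant client-side setting is fs.default.name.  This is a sketch of a hadoop-site.xml matching the URI in my code above (hostname and port are just my example values):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Must match the NameNode host:port used in the client URI. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://linuxbox:9000</value>
  </property>
</configuration>
```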

All of this is being attempted in a debug session in the IntelliJ IDE.

Any ideas on what I am doing wrong?  I am sure it is a configuration blunder on my part.

Further, we used to use an old copy of Nutch.  Of course, the Hadoop part of Nutch is now its own jar file, so I upgraded the Nutch jars too.  We were using a few things within the Nutch project that seem to have gone away:

The net.sf incarnation of the Snowball stemmer (I fixed this by pulling the source directly from the author).
Language identification... any idea where it went?
Carrot2 clustering... any idea where that went?

Thanks in advance.

Chris

RE: Couple of basic hdfs starter issues

Posted by chris collins <ch...@scoutlabs.com>.
I should chalk this up to stupidity on my part (though the hidden shell execution inside the client, whose error gets masked, is somewhat fickle).  Of course, if I start the thing not from the IDE but from the command line, it gets past this problem (it then hits a security issue, but that one is probably a more obvious thing to sort out).
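For anyone hitting the same thing: launching from a shell gives the JVM a normal PATH, so Hadoop's "whoami" lookup succeeds.  A sketch (jar names and class are placeholders for your own client):

```shell
# Confirm whoami is resolvable via this shell's PATH.
command -v whoami

# Then launch the client from the same shell so the JVM inherits that PATH,
# e.g. (paths and class name are placeholders):
#   java -cp hadoop-core.jar:conf HdfsTest
```

The IDE fix is equivalent: make sure the run configuration's environment includes a PATH containing /usr/bin.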

Still, if anyone has an idea what happened to the language identification and Carrot2 stuff inside Nutch, that would be appreciated.

C


-----Original Message-----
From: chris collins [mailto:chris@scoutlabs.com]
Sent: Sat 6/7/2008 10:54 AM
To: core-user@hadoop.apache.org
Subject: Couple of basic hdfs starter issues
 