Posted to common-dev@hadoop.apache.org by Giridharan Kesavan <gk...@yahoo-inc.com> on 2009/11/10 23:02:00 UTC

Publishing hadoop artifacts - Apache Nexus Repo

The Hadoop-Common-trunk-Commit and Hadoop-Hdfs-trunk-Commit jobs on Hudson are configured to publish the core/core-test and hdfs/hdfs-test jars, respectively, to the Apache Nexus snapshot repository.

This means HDFS will always be built against the latest Common jars published to the Apache Nexus snapshot repo.
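For a downstream build that wants to resolve these snapshots itself, a pom.xml fragment along these lines should work. This is a hypothetical sketch, not taken from the Hadoop build: the repository id is arbitrary, and the URL assumes the standard Apache Nexus snapshot location.

    <!-- Hypothetical pom.xml fragment: lets a Maven build resolve the
         snapshot jars published by the Hudson commit jobs. The id is
         arbitrary; the URL is the usual Apache Nexus snapshot location. -->
    <repositories>
      <repository>
        <id>apache.snapshots</id>
        <name>Apache Nexus Snapshots</name>
        <url>https://repository.apache.org/content/repositories/snapshots</url>
        <releases><enabled>false</enabled></releases>
        <snapshots><enabled>true</enabled></snapshots>
      </repository>
    </repositories>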

Thanks,
Giri

Exception in checkPath() in FileSystem.java

Posted by arun kumar <ar...@yahoo.com>.
Hi All,

I am trying to run the example WordCount application on a cluster in my university's lab. Since I don't have write access to the /etc/hosts file (and the admin won't allow me to add entries for each node in the cluster), I am using the IP address of each node in all of Hadoop's configuration files. Copying the input files into HDFS works fine, but when I start the job I get this error:
Error initializing attempt_200911102009_0001_m_000002_1:
java.lang.IllegalArgumentException: Wrong FS: hdfs://128.226.118.98:54310/var/work/aselvan1/hadoop-tmp/mapred/system/job_200911102009_0001/job.xml, expected: hdfs://node22.cs.binghamton.edu:54310
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:327)

I am using release 0.18.3. I tried to find the cause of the error, and I think the authority check fails (probably because I specify IP addresses instead of hostnames in the configuration files). I am stuck here; any help is highly appreciated.
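For reference, the failing check boils down to a string comparison between the URI authority (host:port) of the path and that of the filesystem. Below is a minimal sketch of that idea, not the actual 0.18.3 FileSystem.checkPath source; the class and method names are made up for illustration. It shows why an IP address and a hostname that resolve to the same machine still fail:

    import java.net.URI;

    // Minimal sketch of the authority comparison behind "Wrong FS":
    // scheme and authority are compared as strings, so an IP address
    // never matches a hostname, even if both resolve to the same host.
    public class CheckPathSketch {
        static void checkPath(URI fsUri, URI path) {
            if (path.getScheme() == null) {
                return; // relative paths carry no scheme and are not checked
            }
            boolean sameScheme = path.getScheme().equalsIgnoreCase(fsUri.getScheme());
            boolean sameAuthority = String.valueOf(path.getAuthority())
                    .equalsIgnoreCase(String.valueOf(fsUri.getAuthority()));
            if (!sameScheme || !sameAuthority) {
                throw new IllegalArgumentException(
                    "Wrong FS: " + path + ", expected: " + fsUri);
            }
        }

        public static void main(String[] args) {
            URI fs = URI.create("hdfs://node22.cs.binghamton.edu:54310");
            URI jobXml = URI.create(
                "hdfs://128.226.118.98:54310/var/work/aselvan1/hadoop-tmp"
                + "/mapred/system/job_200911102009_0001/job.xml");
            checkPath(fs, jobXml); // throws IllegalArgumentException:
                                   // "128.226.118.98" != "node22.cs.binghamton.edu"
        }
    }

In other words, the check never does a DNS lookup, so the usual way out is to use the exact same authority string (either the IP or the hostname, consistently) everywhere the filesystem URI appears, e.g. in fs.default.name.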

Thank you,
Arun