Posted to common-user@hadoop.apache.org by Chirag Dewan <ch...@ericsson.com> on 2014/02/24 13:36:36 UTC
Wrong FS hdfs:/localhost:9000 ;expected file///
Hi All,
I am new to Hadoop. I am using Hadoop 2.2.0. I have a simple client that reads a file from HDFS on a single-node cluster. When I run my code using java -jar mytest.jar, it throws the error Wrong FS hdfs://localhost.
When I run the same code with hadoop jar test.jar, it works just fine.
I have my core-site.xml with fs.default.name set to hdfs://localhost.
Am I missing some classpath dependency here?
Thanks in advance.
Chirag Dewan
RE: Wrong FS hdfs:/localhost:9000 ;expected file///
Posted by Chirag Dewan <ch...@ericsson.com>.
Hi All,
Thanks a lot everyone for the quick response.
I got it working by putting HADOOP_CONF_DIR on the classpath.
Thanks.
Chirag Dewan
From: Shumin Guo [mailto:gsmsteve@gmail.com]
Sent: Wednesday, February 26, 2014 7:50 AM
To: user@hadoop.apache.org
Subject: Re: Wrong FS hdfs:/localhost:9000 ;expected file///
The value should be hdfs://localhost:<port>
On Feb 24, 2014 6:37 AM, "Chirag Dewan" <ch...@ericsson.com> wrote:
Hi All,
I am new to hadoop. I am using hadoop 2.2.0. I have a simple client code which reads a file from HDFS on a single node cluster. Now when I run my code using java -jar mytest.jar it throws the error Wrong FS hdfs://localhost.
When I run the same code with hadoop jar test.jar it works just fine.
I have my core-site.xml with fs.default.name as hdfs://localhost
Am I missing some classpath dependency here?
Thanks in advance.
Chirag Dewan
Re: Wrong FS hdfs:/localhost:9000 ;expected file///
Posted by Shumin Guo <gs...@gmail.com>.
The value should be hdfs://localhost:<port>
On Feb 24, 2014 6:37 AM, "Chirag Dewan" <ch...@ericsson.com> wrote:
> Hi All,
>
> I am new to hadoop. I am using hadoop 2.2.0. I have a simple client code
> which reads a file from HDFS on a single node cluster. Now when I run my
> code using java -jar mytest.jar it throws the error Wrong FS
> hdfs://localhost.
>
> When I run the same code with hadoop jar test.jar it works just fine.
>
> I have my core-site.xml with fs.default.name as hdfs://localhost
>
> Am I missing some classpath dependency here?
>
> Thanks in advance.
>
> Chirag Dewan
Re: Wrong FS hdfs:/localhost:9000 ;expected file///
Posted by Mohammad Tariq <do...@gmail.com>.
Hi Chirag,
Alternatively, you can add the following two lines to your code to make
it work without having to worry about the classpath:
conf.addResource(new Path("/path/to/core-site.xml"));
conf.addResource(new Path("/path/to/hdfs-site.xml"));
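Put together, a minimal client along these lines might look like the sketch below. The config paths, the class name MyHdfsClient, and the idea of reading the target file from args[0] are illustrative placeholders, not details from the thread:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MyHdfsClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Load the cluster configuration explicitly, so the client does not
        // depend on HADOOP_CONF_DIR being on the classpath.
        conf.addResource(new Path("/path/to/core-site.xml"));
        conf.addResource(new Path("/path/to/hdfs-site.xml"));

        // With fs.default.name/fs.defaultFS loaded, this returns the HDFS
        // file system rather than the default local one (file:///).
        FileSystem fs = FileSystem.get(conf);

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(new Path(args[0]))))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```

This variant trades portability for convenience: the config file paths are baked into the code, so it runs under plain java -jar, but moving the client to another machine means updating those paths.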
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Feb 25, 2014 at 9:42 PM, Vinayakumar B <vi...@huawei.com> wrote:
> Hi Chirag,
>
> Hadoop expects core-site.xml to be in the classpath, which in fact will be
> present in HADOOP_CONF_DIR.
>
> When you run hadoop jar test.jar, the hadoop script will take care of
> adding all dependencies to the CLASSPATH, including HADOOP_CONF_DIR, and
> your client will run successfully.
>
> When you run using java -jar test.jar, the classpath will not be set: the
> -jar option of java ignores the classpath set either via the CLASSPATH
> env variable or the -cp argument. That means your test.jar should be a
> complete runnable JAR with all dependencies, including the conf files.
>
> Please verify by running in the following way, constructing a CLASSPATH
> which includes HADOOP_CONF_DIR:
>
> java -cp <CLASSPATH> <MAIN-CLASS> <args>
>
> or simply use hadoop jar test.jar
>
> Cheers,
> Vinayakumar B
>
> From: Chris Mawata [mailto:chris.mawata@gmail.com]
> Sent: 25 February 2014 20:08
> To: user@hadoop.apache.org
> Subject: Re: Wrong FS hdfs:/localhost:9000 ;expected file///
>
> The hadoop command gives you a configuration object with the
> configurations that are in your XML files. In your Java code you are
> probably getting your FileSystem object from a blank Configuration when
> you don't use the hadoop command.
> Chris
>
> On Feb 24, 2014 7:37 AM, "Chirag Dewan" <ch...@ericsson.com> wrote:
>
> Hi All,
>
> I am new to hadoop. I am using hadoop 2.2.0. I have a simple client code
> which reads a file from HDFS on a single node cluster. Now when I run my
> code using java -jar mytest.jar it throws the error Wrong FS
> hdfs://localhost.
>
> When I run the same code with hadoop jar test.jar it works just fine.
>
> I have my core-site.xml with fs.default.name as hdfs://localhost
>
> Am I missing some classpath dependency here?
>
> Thanks in advance.
>
> Chirag Dewan
RE: Wrong FS hdfs:/localhost:9000 ;expected file///
Posted by Vinayakumar B <vi...@huawei.com>.
Hi Chirag,
Hadoop expects core-site.xml to be in the classpath, which in fact will be present in HADOOP_CONF_DIR.
When you run hadoop jar test.jar, the hadoop script will take care of adding all dependencies to the CLASSPATH, including HADOOP_CONF_DIR, and your client will run successfully.
When you run using java -jar test.jar, the classpath will not be set: the -jar option of java ignores the classpath set either via the CLASSPATH env variable or the -cp argument. That means your test.jar should be a complete runnable JAR with all dependencies, including the conf files.
Please verify by running in the following way, constructing a CLASSPATH which includes HADOOP_CONF_DIR:
java -cp <CLASSPATH> <MAIN-CLASS> <args>
or
simply use hadoop jar test.jar
Cheers,
Vinayakumar B
From: Chris Mawata [mailto:chris.mawata@gmail.com]
Sent: 25 February 2014 20:08
To: user@hadoop.apache.org
Subject: Re: Wrong FS hdfs:/localhost:9000 ;expected file///
The hadoop command gives you a configuration object with the configurations that are in your XML files. In your Java code you are probably getting your FileSystem object from a blank Configuration when you don't use the hadoop command.
Chris
On Feb 24, 2014 7:37 AM, "Chirag Dewan" <ch...@ericsson.com> wrote:
Hi All,
I am new to hadoop. I am using hadoop 2.2.0. I have a simple client code which reads a file from HDFS on a single node cluster. Now when I run my code using java -jar mytest.jar it throws the error Wrong FS hdfs://localhost.
When I run the same code with hadoop jar test.jar it works just fine.
I have my core-site.xml with fs.default.name as hdfs://localhost
Am I missing some classpath dependency here?
Thanks in advance.
Chirag Dewan
Re: Wrong FS hdfs:/localhost:9000 ;expected file///
Posted by Chris Mawata <ch...@gmail.com>.
The hadoop command gives you a configuration object with the configurations
that are in your XML files. In your Java code you are probably getting
your FileSystem object from a blank Configuration when you don't use the
hadoop command.
Chris
On Feb 24, 2014 7:37 AM, "Chirag Dewan" <ch...@ericsson.com> wrote:
> Hi All,
>
> I am new to hadoop. I am using hadoop 2.2.0. I have a simple client code
> which reads a file from HDFS on a single node cluster. Now when I run my
> code using java -jar mytest.jar it throws the error Wrong FS
> hdfs://localhost.
>
> When I run the same code with hadoop jar test.jar it works just fine.
>
> I have my core-site.xml with fs.default.name as hdfs://localhost
>
> Am I missing some classpath dependency here?
>
> Thanks in advance.
>
> Chirag Dewan
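Chris's diagnosis can be checked directly with a few lines of Java. This is a sketch that assumes only the hadoop-common 2.2.0 jar on the classpath; fs.defaultFS is the Hadoop 2.x name for the older fs.default.name property:

```java
import org.apache.hadoop.conf.Configuration;

public class DefaultFsCheck {
    public static void main(String[] args) {
        // A Configuration built without core-site.xml on the classpath
        // only sees Hadoop's built-in defaults from core-default.xml...
        Configuration conf = new Configuration();
        // ...where the default file system is the local one, file:/// --
        // which is exactly why the client complains "expected file:///"
        // when handed an hdfs:// path.
        System.out.println(conf.get("fs.defaultFS", "file:///"));
    }
}
```

Run under plain java -jar this prints the local-file-system URI; run with HADOOP_CONF_DIR on the classpath (or via the hadoop command) it prints the hdfs:// URI from core-site.xml instead.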