Posted to hdfs-user@hadoop.apache.org by Rajat Jain <ra...@gmail.com> on 2014/06/03 00:53:02 UTC

Re: Problems Starting NameNode on hadoop-2.2.0

Have you tried setting fs.defaultFS with the same value?
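
In Hadoop 2.x, fs.default.name is deprecated in favor of fs.defaultFS (the old key is still read, but mixed or conflicting settings are a common source of this error). A minimal core-site.xml sketch carrying the same value — hostname and port here are just the ones from your file, adjust to your setup:

```xml
<configuration>
  <property>
    <!-- Hadoop 2.x key for the default filesystem URI; the
         deprecated fs.default.name maps onto this one. -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```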


On Sat, May 31, 2014 at 11:22 AM, ishan patwa <ri...@gmail.com> wrote:

> Hi,
> I recently installed hadoop-2.2.0 on my machine running Linux
> livingstream 3.2.0-29
>
> However, I am unable to start the NameNode using
>
> *hadoop-daemon.sh start namenode*
> In the log files I can see the following errors:
> +++++++++++
> 2014-05-31 14:03:12,844 ERROR
> org.apache.hadoop.hdfs.server.namenode.NameNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:///
>     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:212)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:244)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:280)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
>
> 2014-05-31 14:03:12,845 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at livingstream/127.0.1.1
> ************************************************************/
>
> ++++++++++++++++
>
> I googled around a bit, and people mentioned that it might be
> because I haven't set fs.default.name in my core-site.xml.
> I checked my core-site.xml, and it looks fine:
>
> +++++++
> <configuration>
> <property>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
> </property>
> </configuration>
> +++++++
>
> Do you have any suggestions as to what else might cause this?
>
> Regards,
> Ishan
>
>
> --
> Ishan Patwa
> Software Developer
> Zynga Game Network Pvt Ltd.
> Bangalore
>

Re: Problems Starting NameNode on hadoop-2.2.0

Posted by Stanley Shi <ss...@gopivotal.com>.
Another possible reason is that you are not using the correct conf file;
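
A quick way to rule this out is to confirm that the core-site.xml the daemon scripts read actually contains the URI you expect. The sketch below is self-contained for illustration (the temp directory stands in for your conf dir; on a stock 2.2.0 tarball the daemons read $HADOOP_HOME/etc/hadoop unless HADOOP_CONF_DIR overrides it):

```shell
# Stand-in for $HADOOP_CONF_DIR -- on a real install, point at the
# directory your hadoop-daemon.sh actually uses.
CONF_DIR="$(mktemp -d)"

# A core-site.xml like the one in the thread:
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# Pull the filesystem URI out the way you would sanity-check a real conf dir:
grep -o 'hdfs://[^<]*' "$CONF_DIR/core-site.xml"
```

On a live install you can also ask Hadoop which value it resolved with `hdfs getconf -confKey fs.defaultFS`; if that prints file:///, the daemons are not reading the file you edited.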

Regards,
*Stanley Shi,*



On Tue, Jun 3, 2014 at 6:53 AM, Rajat Jain <ra...@gmail.com> wrote:

> Have you tried setting fs.defaultFS with the same value?
>
>
> On Sat, May 31, 2014 at 11:22 AM, ishan patwa <ri...@gmail.com>
> wrote:
>
>> Hi,
>> I recently installed hadoop-2.2.0 on my machine running Linux
>> livingstream 3.2.0-29
>>
>> However, I am unable to start the NameNode using
>>
>> *hadoop-daemon.sh start namenode*
>> In the log files I can see the following errors:
>> +++++++++++
>> 2014-05-31 14:03:12,844 ERROR
>> org.apache.hadoop.hdfs.server.namenode.NameNode:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: file:///
>>     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:212)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:244)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:280)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
>>
>> 2014-05-31 14:03:12,845 INFO
>> org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at livingstream/127.0.1.1
>> ************************************************************/
>>
>> ++++++++++++++++
>>
>> I googled around a bit, and people mentioned that it might be
>> because I haven't set fs.default.name in my core-site.xml.
>> I checked my core-site.xml, and it looks fine:
>>
>> +++++++
>> <configuration>
>> <property>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>> </property>
>> </configuration>
>> +++++++
>>
>> Do you have any suggestions as to what else might cause this?
>>
>> Regards,
>> Ishan
>>
>>
>> --
>> Ishan Patwa
>> Software Developer
>> Zynga Game Network Pvt Ltd.
>> Bangalore
>>
>
>
