Posted to user@hadoop.apache.org by Mohammad Alkahtani <m....@gmail.com> on 2013/03/17 14:32:49 UTC

Hadoop Debian Package

Hi to all users of Hadoop,

I installed Hadoop from the .deb file on Ubuntu 12.04, but I may not have
configured it correctly. The conf dir is under templates in /usr/shar/hadoop. I
edited the core-site.xml and mapred-site.xml files to set
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
and for mapred
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
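
For reference, these snippets sit inside complete site files; a minimal
sketch for Hadoop 1.x with the values above (each file needs the
<configuration> root element):

<?xml version="1.0"?>
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<?xml version="1.0"?>
<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

(Hadoop's built-in defaults for these two properties are file:/// and local,
the exact values that appear in the errors below.)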

but I get these errors. I assume the problem is that Hadoop cannot read the
configuration files.
I changed hadoop-env.sh to
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
but that doesn't solve the problem.

ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)

________________________________

FATAL org.apache.hadoop.mapred.JobTracker:
java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
        at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
        at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
        at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
        at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
        at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
        at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
        at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
        at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)

________________________________

ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)

________________________________

Exception in thread "main" java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)

________________________________

ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because
java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
        at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
        at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)


Regards,
Mohammad Alkahtani

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I use hadoop-1.1.2. I will try it and get back to you.

Regards,
Mohammad Alkahtani


On 17 Mar 2013, at 08:37 PM, Luangsay Sourygna <lu...@gmail.com> wrote:

> Hi,
> 
> What is the version of Hadoop you use?
> 
> Try using fs.defaultFS instead of fs.default.name (see the list of all
> the deprecated properties here:
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
> I remember I once had a similar error message and it was due to the
> change in properties names.
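> 
> For example, the renamed property in core-site.xml would be:
> <property>
>   <name>fs.defaultFS</name>
>   <value>hdfs://localhost:9000</value>
> </property>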
> 
> Regards,
> 
> Sourygna
> 

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
Hello Mohammad,

      Have you set your HADOOP_HOME properly?
Please check it once.

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> The problem is that I tried to make Hadoop read the configuration files by
> changing
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but I think Hadoop doesn't get the configuration from this dir. I tried and
> searched the system for a conf dir, and the only one is this, which I changed.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
> dwivedishashwat@gmail.com> wrote:
>
>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>> is taking its configuration from some other location...
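>>
>> For example (a sketch; with the Hadoop 1.x scripts, the conf directory in
>> use is the first entry of the classpath that bin/hadoop builds):
>>
>> hadoop classpath | tr ':' '\n' | head -n 1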
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I tried echo $HADOOP_HOME and the output was blank.

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:37 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Mohammad,
> I still get the same error, with this message:
>
> localhost: Warning: $HADOOP_HOME is deprecated.
>
> I searched ~/.bashrc, but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can do that using these commands:
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line:
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that, source the file to apply the changes:
>> source ~/.bashrc
>>
>> to check it:
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I don't find
>>> it in hadoop-env.sh.
>>>
>>> Thank you Shashwat,
>>> this is the output; the files are there and already configured, but Hadoop
>>> doesn't read the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you wherever those files are.
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
good to hear that :)

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 1:58 AM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Tariq, I removed the .deb and downloaded
> hadoop-1.0.4.tar.gz <http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz>
> instead, and it worked very well.
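>
> For the record, the tarball route is roughly (a sketch of a fresh
> single-node setup; paths are illustrative):
>
> tar xzf hadoop-1.0.4.tar.gz
> export HADOOP_HOME=$PWD/hadoop-1.0.4
> export HADOOP_CONF_DIR=$HADOOP_HOME/conf
> $HADOOP_HOME/bin/hadoop namenode -format
> $HADOOP_HOME/bin/start-all.sh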
>
> Thank you again
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you have to use upper case 'HADOOP_HOME' (never mind if that was just a
>> typo in your mail). Do you have proper permission to read these files?
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> The files from hadoop-x.x.x/bin are in the /usr/bin dir. I tried to set
>>> HADOOP_HOME to /usr but still get the errors; I also tried /etc/hadoop and
>>> got the error.
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> set these properties in the configuration files present in your /etc
>>>> directory. HADOOP_HOME is the parent of the bin directory that holds the
>>>> Hadoop scripts, so set it accordingly in your .bashrc file.
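>>>>
>>>> For example (a sketch, assuming the Debian layout described in this
>>>> thread, where /usr/bin/hadoop is the packaged launcher and the XML
>>>> files live in /etc/hadoop):
>>>>
>>>> # in ~/.bashrc: HADOOP_HOME is the parent of the bin/ dir holding the
>>>> # real hadoop script, resolved through the /usr/bin entry
>>>> export HADOOP_HOME=$(dirname "$(dirname "$(readlink -f /usr/bin/hadoop)")")
>>>> export HADOOP_CONF_DIR=/etc/hadoop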
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Thank you Mohammad Tariq
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> I tried all of the Hadoop home dirs, but it didn't work.
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> OK, what should the Hadoop home be on Ubuntu? The binary files are in
>>>>>>> /usr/bin, hadoop-env.sh and the other XML files are in /etc/hadoop, and
>>>>>>> the conf files are in /usr/share/hadoop/templates/conf.
>>>>>>>
>>>>>>> Shall I use /usr as the Hadoop path, since it is the dir that contains
>>>>>>> the bin files?
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> log out from the user. log in again and see if it works.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <
>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> you can avoid the warning by setting the following property in the
>>>>>>>>> hadoop-env.sh file:
>>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
good to hear that :)

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 1:58 AM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Tariq, I removed the .deb
>  and downloaded the source file hadoop-1.0.4.tar.gz<http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz>
>  and it worked very well.
>
> Thank you again
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you have to use upper case 'HADOOP_HOME' (never mind if it's just a typo). do
>> you have proper permission to read these files?
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to set
>>> the Hadoop_Home to /usr but still get the errors; I tried /etc/hadoop also and
>>> I got the error.
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> set these properties in the configuration files present in your /etc
>>>> directory. and HADOOP_HOME is the parent directory of the hadoop bin
>>>> directory that holds the Hadoop scripts. so, set that accordingly in
>>>> the .bashrc file.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Thank you Mohammad Tariq
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> I tried all of the hadoop home dirs but it didn't work
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> OK, what should the Hadoop home be in Ubuntu? Because the binary files
>>>>>>> are in /usr/bin,
>>>>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>>>>
>>>>>>> Shall I use /usr as the hadoop path, because it is the dir that contains
>>>>>>> the bin files?
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> log out from the user. log in again and see if it works.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <
>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> you can avoid the warning by setting the following prop to true in
>>>>>>>>> the hadoop-env.sh file :
>>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Thank you Mohammad
>>>>>>>>>> I still get the same error with this msg
>>>>>>>>>>
>>>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>> P.O.Box 102275
>>>>>>>>>> Riyadh 11675
>>>>>>>>>> Saudi Arabia
>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> you can do that using these commands :
>>>>>>>>>>>
>>>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>>>
>>>>>>>>>>> then go to the end of the file and add this line :
>>>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>>>
>>>>>>>>>>> after that use it to freeze the changes :
>>>>>>>>>>> source ~/.bashrc
>>>>>>>>>>>
>>>>>>>>>>> to check it :
>>>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>>>
>>>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>>>
>>>>>>>>>>> HTH
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME
>>>>>>>>>>>> because I don't find it in the hadoop-env.sh
>>>>>>>>>>>>
>>>>>>>>>>>> Thank you Shashwat
>>>>>>>>>>>> this is the output and it is already configured, but hadoop doesn't read the configuration from here.
>>>>>>>>>>>>
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>>>
>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> try
>>>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>>>> it will show you where ever those files are..
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> ∞
>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> The problem is: I tried to make it read the configuration file by
>>>>>>>>>>>>>> changing
>>>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this
>>>>>>>>>>>>>> dir; I tried and searched the system for a conf dir, and the only dir is this one,
>>>>>>>>>>>>>> which I changed.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check
>>>>>>>>>>>>>>> if it is taking its configuration from some other location...
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> ∞
>>>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>>>>>> list of all
>>>>>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>>>>>> ).
>>>>>>>>>>>>>>>> I remember I once had a similar error message, and it was
>>>>>>>>>>>>>>>> due to the
>>>>>>>>>>>>>>>> change in property names.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I
>>>>>>>>>>>>>>>> > might not have
>>>>>>>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>>>>>>>> > /usr/shar/hadoop. I
>>>>>>>>>>>>>>>> > edited the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > but I get these errors; I assume that the problem is that
>>>>>>>>>>>>>>>> > Hadoop cannot read
>>>>>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>>>>>>>>>> > export
>>>>>>>>>>>>>>>> > HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>>>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>>>>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
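
As a concrete illustration of the advice in the thread above (put the two
properties into the config files under /etc/hadoop): a minimal sketch,
assuming the Debian package layout discussed here and reusing the host:port
values from the original post:

<!-- /etc/hadoop/core-site.xml (sketch) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- /etc/hadoop/mapred-site.xml (sketch) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

With values like these actually being read, the "Does not contain a valid
host:port authority: file:///" and "authority: local" errors should
disappear: file:/// and local are the built-in defaults that the daemons
fall back to (and then fail to parse as host:port) when the edited files
are never picked up.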

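Sourygna's point about renamed properties can be illustrated the same way.
A minimal sketch, assuming a Hadoop version new enough that fs.default.name
is deprecated in favour of fs.defaultFS (on 1.0.4 the old name still works):

<!-- core-site.xml entry with the non-deprecated property name (sketch) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
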
Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Thank you Tariq, I removed the .deb
 and downloaded the source file
hadoop-1.0.4.tar.gz<http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz>
and it worked very well.

Thank you again

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <do...@gmail.com> wrote:

> you have to use upper case 'HADOOP_HOME' (never mind if it's just a typo). do
> you have proper permission to read these files?
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <m.alkahtani@gmail.com
> > wrote:
>
>> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to set the
>> Hadoop_Home to /usr but still get the errors; I tried /etc/hadoop also and I
>> got the error.
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> set these properties in the configuration files present in your /etc
>>> directory. and HADOOP_HOME is the parent directory of the hadoop bin
>>> directory that holds the Hadoop scripts. so, set that accordingly in
>>> the .bashrc file.
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Thank you Mohammad Tariq
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> I tried all of the hadoop home dirs but it didn't work
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> OK, what should the Hadoop home be in Ubuntu? Because the binary files
>>>>>> are in /usr/bin,
>>>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>>>
>>>>>> Shall I use /usr as the hadoop path, because it is the dir that contains
>>>>>> the bin files?
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> log out from the user. log in again and see if it works.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> https://mtariq.jux.com/
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>> > wrote:
>>>>>>>
>>>>>>>> you can avoid the warning by setting the following prop to true in
>>>>>>>> the hadoop-env.sh file :
>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Thank you Mohammad
>>>>>>>>> I still get the same error with this msg
>>>>>>>>>
>>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> you can do that using these commands :
>>>>>>>>>>
>>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> then go to the end of the file and add this line :
>>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>>
>>>>>>>>>> after that use it to freeze the changes :
>>>>>>>>>> source ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> to check it :
>>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>>
>>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>>
>>>>>>>>>> HTH
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME
>>>>>>>>>>> because I don't find it in the hadoop-env.sh
>>>>>>>>>>>
>>>>>>>>>>> Thank you Shashwat
>>>>>>>>>>> this is the output and it is already configured, but hadoop doesn't read the configuration from here.
>>>>>>>>>>>
>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>>
>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> try
>>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>>> it will show you where ever those files are..
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ∞
>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> The problem is: I tried to make it read the configuration file by
>>>>>>>>>>>>> changing
>>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir;
>>>>>>>>>>>>> I tried and searched the system for a conf dir, and the only dir is this one, which
>>>>>>>>>>>>> I changed.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check
>>>>>>>>>>>>>> whether it is picking up its configuration from another location...
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ∞
>>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>>>>> list of all
>>>>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>>>>> ).
>>>>>>>>>>>>>>> I remember I once had a similar error message and it was due
>>>>>>>>>>>>>>> to the
>>>>>>>>>>>>>>> change in properties names.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I
>>>>>>>>>>>>>>> might not have
>>>>>>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>>>>>> > edited the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > but I get these errors; I assume that there is a problem,
>>>>>>>>>>>>>>> Hadoop cannot read
>>>>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>>>>>>>>> > export
>>>>>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Exception in thread "main"
>>>>>>>>>>>>>>> java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>>>>> task tracker
>>>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>> contain a valid
>>>>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Thank you Tariq, I removed the .deb and downloaded the tarball
hadoop-1.0.4.tar.gz<http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz>
and it worked very well.
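
For anyone who finds this thread later, this is roughly what the tarball
setup looks like (a minimal sketch, assuming a single-node
pseudo-distributed install of Hadoop 1.0.4; the unpack location and PATH
handling are illustrative, not the only way to do it):

  tar -xzf hadoop-1.0.4.tar.gz -C /usr/local     # unpack the release tarball
  export HADOOP_HOME=/usr/local/hadoop-1.0.4     # point HADOOP_HOME at the unpacked dir
  export PATH=$PATH:$HADOOP_HOME/bin             # put the hadoop scripts on the PATH

  # The tarball keeps its config in $HADOOP_HOME/conf, so the <property>
  # blocks from the first mail go into conf/core-site.xml
  # (fs.default.name = hdfs://localhost:9000) and conf/mapred-site.xml
  # (mapred.job.tracker = localhost:9001).

  hadoop namenode -format                        # format HDFS once, before the first start
  start-all.sh                                   # start NameNode, DataNode, JobTracker, TaskTracker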

Thank you again

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <do...@gmail.com> wrote:

> you have to use upper case 'HADOOP_HOME' (never mind if it's a typo). do
> you have proper permission to read these files?
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <m.alkahtani@gmail.com
> > wrote:
>
>> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to
>> set Hadoop_Home to /usr but still get the errors; I tried /etc/hadoop and
>> also got the error.
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>>> set these properties in the configuration files present in your /etc
>>> directory. and HADOOP_HOME is the parent directory of the hadoop bin
>>> directory that holds the Hadoop scripts. so, set that accordingly in
>>> your .bashrc file.
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Thank you Mohammad Tariq
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> I tried all of the hadoop home dirs but it didn't work
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> OK, what should the Hadoop home be on Ubuntu? Because the binary
>>>>>> files are in /usr/bin,
>>>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>>>
>>>>>> shall I use /usr as the hadoop path because it is the dir that
>>>>>> contains the bin files?
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>>>
>>>>>>> log out from the user. log in again and see if it works.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> https://mtariq.jux.com/
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>> > wrote:
>>>>>>>
>>>>>>>> you can avoid the warning by setting the following prop to true in
>>>>>>>> the hadoop-env.sh file :
>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Thank you Mohammad
>>>>>>>>> I still get the same error with this msg
>>>>>>>>>
>>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> you can do that using these commands :
>>>>>>>>>>
>>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> then go to the end of the file and add this line :
>>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>>
>>>>>>>>>> after that, run this to apply the changes :
>>>>>>>>>> source ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> to check it :
>>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>>
>>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>>
>>>>>>>>>> HTH
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME
>>>>>>>>>>> because I don't find it in the hadoop-env.sh
>>>>>>>>>>>
>>>>>>>>>>> Thank you Shashwat
>>>>>>>>>>> this is the output and it is already configured, but hadoop doesn't read the configuration from here.
>>>>>>>>>>>
>>>>>>>>>>> /usr/share/maven-repo/org/apache
>>>>>>>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>>>>>>>> /commons-parent-debian-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian
>>>>>>>>>>> -site.xml
>>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>>
>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> try
>>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>>> it will show you wherever those files are.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ∞
>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> The problem is I tried to make Hadoop read the configuration
>>>>>>>>>>>>> file by changing
>>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>>>>>>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir;
>>>>>>>>>>>>> I tried and searched the system for a conf dir, and the only one
>>>>>>>>>>>>> is this one, which I changed.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check
>>>>>>>>>>>>>> whether it is picking up its configuration from another location...
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ∞
>>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>>>>> list of all
>>>>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>>>>> ).
>>>>>>>>>>>>>>> I remember I once had a similar error message and it was due
>>>>>>>>>>>>>>> to the
>>>>>>>>>>>>>>> change in properties names.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I
>>>>>>>>>>>>>>> might not have
>>>>>>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>>>>>> > edited the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > but I get these errors; I assume that there is a problem,
>>>>>>>>>>>>>>> Hadoop cannot read
>>>>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>>>>>>>>> > export
>>>>>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Exception in thread "main"
>>>>>>>>>>>>>>> java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>>>>> task tracker
>>>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>> contain a valid
>>>>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you have to use upper case 'HADOOP_HOME' (never mind if it's a typo). do
you have proper permission to read these files?
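
A quick way to check is something like this (a rough sketch, using the
config paths already mentioned in this thread; adjust them to wherever
your files actually live):

  ls -l /etc/hadoop/*.xml /usr/share/hadoop/templates/conf/*.xml  # owner and mode of the config files
  cat /usr/share/hadoop/templates/conf/core-site.xml >/dev/null \
    && echo readable || echo "not readable"                       # can the current user read them?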

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani
<m....@gmail.com> wrote:

> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to
> set Hadoop_Home to /usr but still get the errors; I tried /etc/hadoop and
> also got the error.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> set these properties in the configuration files present in your /etc
>> directory. and HADOOP_HOME is the parent directory of the hadoop bin
>> directory that holds the Hadoop scripts. so, set that accordingly in
>> your .bashrc file.
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Thank you Mohammad Tariq
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> I tried all of the hadoop home dirs but it didn't work
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> OK, what should the Hadoop home be on Ubuntu? Because the binary
>>>>> files are in /usr/bin,
>>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>>
>>>>> shall I use /usr as the hadoop path because it is the dir that
>>>>> contains the bin files?
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> log out from the user. log in again and see if it works.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> https://mtariq.jux.com/
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> you can avoid the warning by setting the following prop to true in
>>>>>>> the hadoop-env.sh file :
>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> https://mtariq.jux.com/
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>
>>>>>>>> Thank you Mohammad
>>>>>>>> I still get the same error with this msg
>>>>>>>>
>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>
>>>>>>>>
>>>>>>>> Mohammad Alkahtani
>>>>>>>> P.O.Box 102275
>>>>>>>> Riyadh 11675
>>>>>>>> Saudi Arabia
>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> you can do that using these commands :
>>>>>>>>>
>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>
>>>>>>>>> then go to the end of the file and add this line :
>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>
>>>>>>>>> after that use it to freeze the changes :
>>>>>>>>> source ~/.bashrc
>>>>>>>>>
>>>>>>>>> to check it :
>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>
>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>
>>>>>>>>> HTH
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME,
>>>>>>>>>> because I can't find it in the hadoop-env.sh
>>>>>>>>>>
>>>>>>>>>> Thank you Shashwat
>>>>>>>>>> this is the output and it is already configured, but hadoop doesn't
>>>>>>>>>> read the configuration from here.
>>>>>>>>>>
>>>>>>>>>> /usr/share/maven-repo/org/apache
>>>>>>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>>>>>>> /commons-parent-debian-site.xml
>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian
>>>>>>>>>> -site.xml
>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>
>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>> P.O.Box 102275
>>>>>>>>>> Riyadh 11675
>>>>>>>>>> Saudi Arabia
>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> try
>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>> it will show you wherever those files are...
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ∞
>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> The problem is that I tried to make it read the configuration file by
>>>>>>>>>>>> changing
>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir;
>>>>>>>>>>>> I tried and searched the system for a conf dir, and the only dir is this one, which
>>>>>>>>>>>> I changed.
>>>>>>>>>>>>
>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>
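
A side note on the ${VAR:-default} form used in the quoted export: the path
after ":-" is only taken when HADOOP_CONF_DIR is not already set, so any
earlier definition (for instance, one hypothetically made by the packaging's
own scripts) silently wins. A minimal sketch of that shell behaviour:

HADOOP_CONF_DIR=/etc/hadoop   # hypothetical earlier assignment
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/share/hadoop/templates/conf"}
echo $HADOOP_CONF_DIR         # prints /etc/hadoop, not the templates path

Note also that the quoted export spells the path /usr/shar/... while the find
output above shows /usr/share/...; worth ruling out as a typo.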
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if
>>>>>>>>>>>>> it is taking the configuration from some other location...
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> ∞
>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>>>> list of all
>>>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>>>> ).
>>>>>>>>>>>>>> I remember I once had a similar error message and it was due
>>>>>>>>>>>>>> to the
>>>>>>>>>>>>>> change in properties names.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>>
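
For reference, the suggested property would look like this in core-site.xml (a
sketch only; the host:port is the one used earlier in this thread, and on 1.x
the old fs.default.name form is still accepted, just deprecated):

<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>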
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I
>>>>>>>>>>>>>> might could not
>>>>>>>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > but i get these errors, I assume that there is problem,
>>>>>>>>>>>>>> Hadoop cannot read
>>>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>>>>>>>> > export
>>>>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>> > but dosen't solve the problem.
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>>>>>>>>>>>>> at
>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>>>>>>>>>>>>> at
>>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>>>>>>>>>>>>> at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>>>>> valid host:port
>>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > Exception in thread "main"
>>>>>>>>>>>>>> java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>>>>> > at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>>>> task tracker
>>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>> contain a valid
>>>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> >
>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to set
HADOOP_HOME to /usr but still get the errors; I tried /etc/hadoop as well and
got the same error.
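
A quick sanity check on that layout (a minimal sketch; the /usr/bin location
for the launcher is an assumption based on the .deb layout described in this
thread):

which hadoop                    # expected: /usr/bin/hadoop for the .deb install
readlink -f "$(which hadoop)"   # resolve any symlink to the real script
hadoop version                  # confirms the launcher itself runs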

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <do...@gmail.com> wrote:

> Set these properties in the configuration files present in your /etc
> directory. HADOOP_HOME is the parent directory of the hadoop bin
> directory that holds the Hadoop scripts, so set it accordingly in your
> .bashrc file.
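
As a concrete sketch of that advice (assuming the package really reads its
live config from /etc/hadoop, and reusing the localhost:9000 and
localhost:9001 addresses quoted later in this thread), the two files could be
written like this; note that tee overwrites them:

sudo tee /etc/hadoop/core-site.xml > /dev/null <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

sudo tee /etc/hadoop/mapred-site.xml > /dev/null <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF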
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> Thank you Mohammad Tariq
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> I tried all of the hadoop home dirs but it didn't work
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> OK, what should the Hadoop home be on Ubuntu? The binary files are in
>>>> /usr/bin,
>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>
>>>> Shall I use /usr as the hadoop path, because it is the dir that contains
>>>> the bin files?
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>
>>>>> Log out from the user, log in again, and see if it works.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> https://mtariq.jux.com/
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>>
>>>>>> You can avoid the warning by setting the following prop to true in
>>>>>> the hadoop-env.sh file:
>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
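
A hedged one-liner version of that, assuming the env file the .deb installs
is /etc/hadoop/hadoop-env.sh (the location this thread says holds
hadoop-env.sh):

# silence the "$HADOOP_HOME is deprecated" warning printed by 1.x-era scripts
echo 'export HADOOP_HOME_WARN_SUPPRESS=true' | sudo tee -a /etc/hadoop/hadoop-env.sh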
>>>>>>
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> https://mtariq.jux.com/
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> Thank you Mohammad
>>>>>>> I still get the same error, with this msg:
>>>>>>>
>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>>>>
>>>>>>>> You can do that using these commands:
>>>>>>>>
>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>
>>>>>>>> then go to the end of the file and add this line:
>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>
>>>>>>>> after that, run this to apply the changes:
>>>>>>>> source ~/.bashrc
>>>>>>>>
>>>>>>>> to check it:
>>>>>>>> echo $HADOOP_HOME
>>>>>>>>
>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>
>>>>>>>> HTH
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME, because
>>>>>>>>> I don't find it in the hadoop-env.sh?
>>>>>>>>>
>>>>>>>>> Thank you Shashwat,
>>>>>>>>> this is the output; it is already configured, but hadoop doesn't
>>>>>>>>> read the configuration from here.
>>>>>>>>>
>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> try
>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>> it will show you where those files are.
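
Beyond locating the files, it helps to see which configuration the hadoop
command actually resolves. A small sketch (dumping the effective settings by
running the Configuration class this way is a 1.x-era behaviour, so treat it
as an assumption for this install):

# print the effective configuration as Hadoop itself resolves it
hadoop org.apache.hadoop.conf.Configuration | grep -E 'fs.default.name|mapred.job.tracker'

# and check which conf dir the current shell session exports
echo "$HADOOP_CONF_DIR"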
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ∞
>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> The problem is, I tried to make it read the configuration file by changing
>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir. I
>>>>>>>>>>> tried and searched the system for a conf dir; the only one is this,
>>>>>>>>>>> which I changed.
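
One detail worth pausing on: the find output above lists the templates under
/usr/share/hadoop/templates/conf, while the export quoted here says /usr/shar
(no "e"). If it really was typed that way, HADOOP_CONF_DIR points at a
directory that does not exist, so Hadoop falls back to its built-in defaults
(fs.default.name=file:/// and mapred.job.tracker=local), which is exactly
what the errors in this thread show. A minimal sketch of a safer setup, where
/etc/hadoop-conf is a hypothetical target path chosen for illustration:

# copy the packaged templates somewhere canonical and point Hadoop at them
sudo cp -r /usr/share/hadoop/templates/conf /etc/hadoop-conf
export HADOOP_CONF_DIR=/etc/hadoop-conf

# restart the daemons so they re-read the config (1.x-style helper scripts,
# assuming they are on the PATH for this .deb install)
stop-all.sh
start-all.sh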
>>>>>>>>>>>
>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check whether
>>>>>>>>>>>> it is taking its configuration from some other location...
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ∞
>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>
>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>
>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
>>>>>>>>>>>>> I remember I once had a similar error message, and it was due to the
>>>>>>>>>>>>> change in property names.
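
For reference, the renamed forms from that table would look like this in the
config files (a sketch for a 2.x-era install; on a 1.x install the original
names above are the correct ones; mapreduce.jobtracker.address is the renamed
form of mapred.job.tracker per that deprecation table):

<!-- core-site.xml -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapreduce.jobtracker.address</name>
  <value>localhost:9001</value>
</property>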
>>>>>>>>>>>>>
>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>
>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might
>>>>>>>>>>>>> could not
>>>>>>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > but i get these errors, I assume that there is problem,
>>>>>>>>>>>>> Hadoop cannot read
>>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>>>>>>> > export
>>>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>> > but dosen't solve the problem.
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>> host:port
>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>> host:port
>>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>> host:port
>>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > Exception in thread "main"
>>>>>>>>>>>>> java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>>>>>>>> > at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>>> task tracker
>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain
>>>>>>>>>>>>> a valid
>>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532)
>>>>>>>>>>>>> at
>>>>>>>>>>>>> >
>>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>> >
>>>>>>>>>>>>> >
>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
Set these properties in the configuration files present in your /etc
directory. HADOOP_HOME is the parent directory of the hadoop bin
directory that holds the Hadoop scripts, so set it accordingly in your
.bashrc file.

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani
<m....@gmail.com> wrote:

> Thank you Mohammad Tariq
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> I tried all of the hadoop home dirs but it didn't work
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> OK, what should the Hadoop home be on Ubuntu? The binary files are in
>>> /usr/bin,
>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>
>>> Shall I use /usr as the hadoop path, because it is the dir that contains
>>> the bin files?
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>
>>>> Log out from the user, log in again, and see if it works.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> you can avoid the warning by setting the following property to true in
>>>>> the hadoop-env.sh file:
>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> https://mtariq.jux.com/
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> Thank you Mohammad
>>>>>> I still get the same error with this msg
>>>>>>
>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> you can do that using these commands:
>>>>>>>
>>>>>>> sudo gedit ~/.bashrc
>>>>>>>
>>>>>>> then go to the end of the file and add this line :
>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>
>>>>>>> after that use it to freeze the changes :
>>>>>>> source ~/.bashrc
>>>>>>>
>>>>>>> to check it :
>>>>>>> echo $HADOOP_HOME
>>>>>>>
>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>
>>>>>>> HTH
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> https://mtariq.jux.com/
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>>>>>>>> don't find it in the hadoop-env.sh.
>>>>>>>>
>>>>>>>> Thank you Shashwat.
>>>>>>>> This is the output, and it is already configured, but Hadoop doesn't
>>>>>>>> read the configuration from here.
>>>>>>>>
>>>>>>>> /usr/share/maven-repo/org/apache
>>>>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>>>>> /commons-parent-debian-site.xml
>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.
>>>>>>>> xml
>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>
>>>>>>>> Mohammad Alkahtani
>>>>>>>> P.O.Box 102275
>>>>>>>> Riyadh 11675
>>>>>>>> Saudi Arabia
>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> try
>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>> it will show you wherever those files are.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ∞
>>>>>>>>> Shashwat Shriparv
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> The problem is that I tried to make Hadoop read the configuration
>>>>>>>>>> file by changing
>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir. I
>>>>>>>>>> tried and searched the system for a conf dir; the only dir is this
>>>>>>>>>> one, which I changed.
>>>>>>>>>>
>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>> P.O.Box 102275
>>>>>>>>>> Riyadh 11675
>>>>>>>>>> Saudi Arabia
>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if
>>>>>>>>>>> it is taking its configuration from some other location...
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ∞
>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi,
>>>>>>>>>>>>
>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>
>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>> list of all
>>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>>
>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>> ).
>>>>>>>>>>>> I remember I once had a similar error message, and it was due
>>>>>>>>>>>> to the change in property names.
>>>>>>>>>>>>
>>>>>>>>>>>> Regards,
>>>>>>>>>>>>
>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>> >
>>>>>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might
>>>>>>>>>>>> could not
>>>>>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>>>>>> > <property>
>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>> > </property>
>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>> > <property>
>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>> > </property>
>>>>>>>>>>>> >
>>>>>>>>>>>> > but i get these errors, I assume that there is problem,
>>>>>>>>>>>> Hadoop cannot read
>>>>>>>>>>>> > the configuration file.
>>>>>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>>>>>> > export
>>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>> > but dosen't solve the problem.
>>>>>>>>>>>> >
>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>> host:port
>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>> > at
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>> > at
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>> >
>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>> >
>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>> host:port
>>>>>>>>>>>> > authority: local at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>> > at
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>> >
>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>> >
>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>> host:port
>>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>> > at
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>> > at
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>> >
>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>> >
>>>>>>>>>>>> > Exception in thread "main"
>>>>>>>>>>>> java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>>>>>>> > at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>> >
>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>> >
>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>> task tracker
>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain
>>>>>>>>>>>> a valid
>>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532)
>>>>>>>>>>>> at
>>>>>>>>>>>> >
>>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>> >
>>>>>>>>>>>> >
>>>>>>>>>>>> > Regards,
>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
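
For reference, the fs.defaultFS change suggested by Sourygna above would
look like this in core-site.xml (a sketch assuming a release where
fs.default.name is deprecated; on older 1.x releases the old name still
works):

<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>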

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Thank you Mohammad Tariq

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> I tried all of the Hadoop home dirs but none of them worked
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <m.alkahtani@gmail.com
> > wrote:
>
>> OK, what should the Hadoop home be on Ubuntu? The binary files are in
>> /usr/bin,
>> the hadoop-env.sh and other xml files in /etc/hadoop,
>> and the conf files in /usr/share/hadoop/templates/conf.
>>
>> Shall I use /usr as the Hadoop path, because it is the dir that contains
>> the bin files?
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> Log out from the user, log in again, and see if it works.
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> you can avoid the warning by setting the following property to true in
>>>> the hadoop-env.sh file:
>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Thank you, Mohammad.
>>>>> I still get the same error, with this msg:
>>>>>
>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> you can do that using these commands:
>>>>>>
>>>>>> sudo gedit ~/.bashrc
>>>>>>
>>>>>> then go to the end of the file and add this line :
>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>
>>>>>> after that use it to freeze the changes :
>>>>>> source ~/.bashrc
>>>>>>
>>>>>> to check it :
>>>>>> echo $HADOOP_HOME
>>>>>>
>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> https://mtariq.jux.com/
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>>>>>>> don't find it in the hadoop-env.sh.
>>>>>>>
>>>>>>> Thank you Shashwat.
>>>>>>> This is the output, and it is already configured, but Hadoop doesn't
>>>>>>> read the configuration from here.
>>>>>>>
>>>>>>> /usr/share/maven-repo/org/apache
>>>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>>>> /commons-parent-debian-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.
>>>>>>> xml
>>>>>>> /usr/share/compiz/composite.xml
>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>
>>>>>>>> try
>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>> it will show you wherever those files are.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ∞
>>>>>>>> Shashwat Shriparv
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> The problem is that I tried to make Hadoop read the configuration
>>>>>>>>> file by changing
>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir. I
>>>>>>>>> tried and searched the system for a conf dir; the only dir is this
>>>>>>>>> one, which I changed.
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if
>>>>>>>>>> it is taking its configuration from some other location...
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ∞
>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi,
>>>>>>>>>>>
>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>
>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list
>>>>>>>>>>> of all
>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>>
>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>> ).
>>>>>>>>>>> I remember I once had a similar error message, and it was due
>>>>>>>>>>> to the change in property names.
>>>>>>>>>>>
>>>>>>>>>>> Regards,
>>>>>>>>>>>
>>>>>>>>>>> Sourygna
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>> >
>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might
>>>>>>>>>>> not have
>>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>> > edited the core-site.xml and mapred-site.xml files to give
>>>>>>>>>>> > <property>
>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>> > </property>
>>>>>>>>>>> > and for mapred
>>>>>>>>>>> > <property>
>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>> > </property>
>>>>>>>>>>> >
>>>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>>>> cannot read
>>>>>>>>>>> > the configuration file.
>>>>>>>>>>> > I changed hadoop-env.sh to
>>>>>>>>>>> > export
>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: local at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>>>> Does not
>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>>>> tracker
>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>> valid
>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>> >
>>>>>>>>>>> >
>>>>>>>>>>> > Regards,
>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

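For reference, here are Tariq's steps from the quoted exchange above,
collected into one sketch. The HADOOP_HOME value is an assumption, not
something the thread settles: this deb splits things across /usr/bin,
/etc/hadoop, and /usr/share/hadoop/templates/conf, so adjust the path to
whatever your layout turns out to be.

  # append to ~/.bashrc (the path is an assumed example)
  export HADOOP_HOME=/usr/share/hadoop
  export PATH=$PATH:$HADOOP_HOME/bin

  # then reload the file in the running shell and verify
  source ~/.bashrc
  echo $HADOOP_HOME

  # optional, in hadoop-env.sh: suppresses the "$HADOOP_HOME is
  # deprecated" warning that comes up later in the thread
  export HADOOP_HOME_WARN_SUPPRESS=true
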
Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Thank you Mohammad Tariq

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> I tried all of the hadoop home dirs but didn't work
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <m.alkahtani@gmail.com
> > wrote:
>
>> OK, what should the Hadoop home be on Ubuntu? The binary files are in
>> /usr/bin,
>> the hadoop-env.sh and the other xml files are in /etc/hadoop,
>> and the conf files are in /usr/share/hadoop/templates/conf.
>>
>> Shall I use /usr as the hadoop path, because it is the dir that contains the
>> bin files?
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> log out from the user. log in again and see if it works.
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> you can avoid the warning by setting the following prop to true in the
>>>> hadoop-env.sh file :
>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Thank you Mohammad
>>>>> I still get the same error, with this message:
>>>>>
>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> you can do that using these commands :
>>>>>>
>>>>>> sudo gedit ~/.bashrc
>>>>>>
>>>>>> then go to the end of the file and add this line :
>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>
>>>>>> after that, run this to apply the changes :
>>>>>> source ~/.bashrc
>>>>>>
>>>>>> to check it :
>>>>>> echo $HADOOP_HOME
>>>>>>
>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> https://mtariq.jux.com/
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>>>>>>> don't find it in hadoop-env.sh.
>>>>>>>
>>>>>>> Thank you, Shashwat.
>>>>>>> This is the output, and it is already configured, but Hadoop doesn't
>>>>>>> read the configuration from here.
>>>>>>>
>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>> /usr/share/compiz/composite.xml
>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>
>>>>>>>> try
>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>> it will show you wherever those files are.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ∞
>>>>>>>> Shashwat Shriparv
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> The problem is that I tried to make it read the configuration file by changing
>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir. I
>>>>>>>>> tried and searched the system for a conf dir; the only dir is this one, which I
>>>>>>>>> changed.
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://. Just check if it
>>>>>>>>>> is picking up its configuration from some other location...
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ∞
>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi,
>>>>>>>>>>>
>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>
>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html ).
>>>>>>>>>>> I remember I once had a similar error message, and it was due to
>>>>>>>>>>> the change in property names.
>>>>>>>>>>>
>>>>>>>>>>> Regards,
>>>>>>>>>>>
>>>>>>>>>>> Sourygna
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>> >
>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might
>>>>>>>>>>> not have
>>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>>> > edited the core-site.xml and mapred-site.xml files to give
>>>>>>>>>>> > <property>
>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>> > </property>
>>>>>>>>>>> > and for mapred
>>>>>>>>>>> > <property>
>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>> > </property>
>>>>>>>>>>> >
>>>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>>>> cannot read
>>>>>>>>>>> > the configuration file.
>>>>>>>>>>> > I changed hadoop-env.sh to
>>>>>>>>>>> > export
>>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: local at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>> host:port
>>>>>>>>>>> > authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>> > at
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>>>> Does not
>>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>> > at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>> >
>>>>>>>>>>> > ________________________________
>>>>>>>>>>> >
>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>>>> tracker
>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>>> valid
>>>>>>>>>>> > host:port authority: local at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>>> >
>>>>>>>>>>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>> >
>>>>>>>>>>> >
>>>>>>>>>>> > Regards,
>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

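One detail worth pulling out of the exchange above: the find output lists
the config files under /usr/share/hadoop/templates/conf, while the exported
HADOOP_CONF_DIR uses /usr/shar/... . If that is a typo rather than a real
path, the export silently points at a directory that does not exist. A
minimal sketch of both checks, with the corrected spelling as an assumption:

  # narrower than 'find /': search only the likely install locations
  find /etc /usr/share -type f -iname "*site.xml" 2>/dev/null

  # note 'share', not 'shar' (assumed correction)
  export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/share/hadoop/templates/conf"}

And since the deb also puts xml files in /etc/hadoop, whichever of the two
directories the startup scripts actually read is the one to edit; the
templates directory may only hold reference copies.
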
Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I tried all of the hadoop home dirs but didn't work

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> OK, what should the Hadoop home be on Ubuntu? The binary files are in
> /usr/bin,
> the hadoop-env.sh and the other xml files are in /etc/hadoop,
> and the conf files are in /usr/share/hadoop/templates/conf.
>
> Shall I use /usr as the hadoop path, because it is the dir that contains the bin
> files?
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> log out from the user. log in again and see if it works.
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> you can avoid the warning by setting the following prop to true in the
>>> hadoop-env.sh file :
>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Thank you Mohammad
>>>> I still get the same error, with this message:
>>>>
>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> you can do that using these commands :
>>>>>
>>>>> sudo gedit ~/.bashrc
>>>>>
>>>>> then go to the end of the file and add this line :
>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>
>>>>> after that, run this to apply the changes :
>>>>> source ~/.bashrc
>>>>>
>>>>> to check it :
>>>>> echo $HADOOP_HOME
>>>>>
>>>>> This will permanently set your HADOOP_HOME.
>>>>>
>>>>> HTH
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> https://mtariq.jux.com/
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>>>>>> don't find it in hadoop-env.sh.
>>>>>>
>>>>>> Thank you, Shashwat.
>>>>>> This is the output, and it is already configured, but Hadoop doesn't
>>>>>> read the configuration from here.
>>>>>>
>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>> /usr/share/compiz/composite.xml
>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>
>>>>>>> try
>>>>>>> find / -type f -iname "*site.xml"
>>>>>>> it will show you wherever those files are.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ∞
>>>>>>> Shashwat Shriparv
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>
>>>>>>>> The problem is that I tried to make it read the configuration file by changing
>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>> but I think Hadoop doesn't get the configuration from this dir. I
>>>>>>>> tried and searched the system for a conf dir; the only dir is this one, which I
>>>>>>>> changed.
>>>>>>>>
>>>>>>>> Mohammad Alkahtani
>>>>>>>> P.O.Box 102275
>>>>>>>> Riyadh 11675
>>>>>>>> Saudi Arabia
>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://. Just check if it
>>>>>>>>> is picking up its configuration from some other location...
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ∞
>>>>>>>>> Shashwat Shriparv
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>
>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>>>>>>> the deprecated properties here:
>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html ).
>>>>>>>>>> I remember I once had a similar error message, and it was due to
>>>>>>>>>> the change in property names.
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>>
>>>>>>>>>> Sourygna
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>> >
>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might
>>>>>>>>>> not have
>>>>>>>>>> > configured it right. The conf dir is under templates in
>>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>>> > edited the core-site.xml and mapred-site.xml files to give
>>>>>>>>>> > <property>
>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>> > </property>
>>>>>>>>>> > and for mapred
>>>>>>>>>> > <property>
>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>> > </property>
>>>>>>>>>> >
>>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>>> cannot read
>>>>>>>>>> > the configuration file.
>>>>>>>>>> > I changed hadoop-env.sh to
>>>>>>>>>> > export
>>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: local at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>>> Does not
>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>>> tracker
>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>> valid
>>>>>>>>>> > host:port authority: local at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>> >
>>>>>>>>>> >
>>>>>>>>>> > Regards,
>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

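To make Sourygna's suggestion concrete, here is a hedged sketch of the two
property blocks (hedged because the thread never confirms which Hadoop
version the deb ships; on releases that still read the old names,
fs.default.name keeps working as-is). The host:port values are the ones
used earlier in the thread:

  <!-- core-site.xml: fs.defaultFS is the newer name for fs.default.name -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>

  <!-- mapred-site.xml: unchanged -->
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>

Either way, "Does not contain a valid host:port authority: file:///" (and
"local" on the MapReduce side) means the daemons started with the built-in
defaults, i.e. they never saw the edited files at all, which points back at
the conf directory that gets picked up rather than at the property names
alone.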
Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I tried all of the hadoop home dirs but didn't worke

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> OK what the Hadoop home should be in ubuntu because the binary files in
> /usr/bin
> the hadoop-env.sh and othe xml file in /etc/hadoop
> the conf files in /usr/share/hadoop/templates/conf
>
> shall I use /usr as hadoop path because it is the dir that contain the bin
> files
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> log out from the user. log in again and see if it works.
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> you can avoid the warning by setting the following property to true in
>>> the hadoop-env.sh file:
>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Thank you Mohammad
>>>> I still get the same error, with this msg:
>>>>
>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> you can do that using these commands:
>>>>>
>>>>> sudo gedit ~/.bashrc
>>>>>
>>>>> then go to the end of the file and add this line:
>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>
>>>>> after that, source the file to apply the changes:
>>>>> source ~/.bashrc
>>>>>
>>>>> to check it:
>>>>> echo $HADOOP_HOME
>>>>>
>>>>> This will permanently set your HADOOP_HOME.
>>>>>
>>>>> HTH
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> https://mtariq.jux.com/
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I can't
>>>>>> find it in the hadoop-env.sh.
>>>>>>
>>>>>> Thank you Shashwat,
>>>>>> this is the output; it is already configured, but Hadoop doesn't
>>>>>> read the configuration from here.
>>>>>>
>>>>>> /usr/share/maven-repo/org/apache
>>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>>> /commons-parent-debian-site.xml
>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>> /usr/share/compiz/composite.xml
>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>
>>>>>>> try
>>>>>>> find / -type f -iname "*site.xml"
>>>>>>> it will show you wherever those files are.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ∞
>>>>>>> Shashwat Shriparv
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>
>>>>>>>> The problem is I tried to make it read the configuration file by changing
>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>> but I think Hadoop doesn't get the configuration from this dir; I
>>>>>>>> tried and searched the system, and the only conf dir is this one,
>>>>>>>> which I changed.
>>>>>>>>
>>>>>>>> Mohammad Alkahtani
>>>>>>>> P.O.Box 102275
>>>>>>>> Riyadh 11675
>>>>>>>> Saudi Arabia
>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if it
>>>>>>>>> is picking up its configuration from some other location...
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ∞
>>>>>>>>> Shashwat Shriparv
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>
>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list
>>>>>>>>>> of all
>>>>>>>>>> the deprecated properties here:
>>>>>>>>>>
>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>> ).
>>>>>>>>>> I remember I once had a similar error message and it was due to
>>>>>>>>>> the
>>>>>>>>>> change in properties names.
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>>
>>>>>>>>>> Sourygna
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>> >
>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I might
>>>>>>>>>> > not have configured it right. The conf dir is under templates in
>>>>>>>>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>>>>> > files to give
>>>>>>>>>> > <property>
>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>> > </property>
>>>>>>>>>> > and for mapred
>>>>>>>>>> > <property>
>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>> > </property>
>>>>>>>>>> >
>>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>>> > cannot read the configuration file.
>>>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: local at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>> host:port
>>>>>>>>>> > authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>> > at
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>>> Does not
>>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>> > at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>> >
>>>>>>>>>> > ________________________________
>>>>>>>>>> >
>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>>> tracker
>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>>> valid
>>>>>>>>>> > host:port authority: local at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>>> >
>>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>> >
>>>>>>>>>> >
>>>>>>>>>> > Regards,
>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
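
Putting the thread's suggestions together, a minimal sketch of the ~/.bashrc
additions for this package layout (the HADOOP_HOME value is a guess from the
paths above; note also that the export quoted in this thread spells the conf
dir /usr/shar, while the find output shows /usr/share, so the misspelled path
alone could keep Hadoop from ever reading the edited files):

export HADOOP_HOME=/usr/share/hadoop
export HADOOP_CONF_DIR=/usr/share/hadoop/templates/conf
export HADOOP_HOME_WARN_SUPPRESS=true

Then source ~/.bashrc (or log out and back in) and check echo $HADOOP_CONF_DIR
before restarting the daemons.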

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
OK, what should the Hadoop home be on Ubuntu? The binary files are in
/usr/bin, the hadoop-env.sh and other xml files are in /etc/hadoop, and
the conf files are in /usr/share/hadoop/templates/conf.

Shall I use /usr as the Hadoop path, because it is the dir that contains
the bin files?

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:

> Log out from the user, log in again, and see if it works.
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can avoid the warning by setting the following property to true in
>> the hadoop-env.sh file:
>> export HADOOP_HOME_WARN_SUPPRESS=true
>>
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Thank you Mohammad
>>> I still get the same error, with this msg:
>>>
>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>> I searched ~/.bashrc but only what I wrote is there.
>>>
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> you can do that using these commands:
>>>>
>>>> sudo gedit ~/.bashrc
>>>>
>>>> then go to the end of the file and add this line:
>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>
>>>> after that, source the file to apply the changes:
>>>> source ~/.bashrc
>>>>
>>>> to check it:
>>>> echo $HADOOP_HOME
>>>>
>>>> This will permanently set your HADOOP_HOME.
>>>>
>>>> HTH
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I can't
>>>>> find it in the hadoop-env.sh.
>>>>>
>>>>> Thank you Shashwat,
>>>>> this is the output; it is already configured, but Hadoop doesn't read
>>>>> the configuration from here.
>>>>>
>>>>> /usr/share/maven-repo/org/apache
>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>> /commons-parent-debian-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>> /usr/share/compiz/composite.xml
>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> try
>>>>>> find / -type f -iname "*site.xml"
>>>>>> it will show you wherever those files are.
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> The problem is I tried to make it read the configuration file by changing
>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> but I think Hadoop doesn't get the configuration from this dir; I
>>>>>>> tried and searched the system, and the only conf dir is this one,
>>>>>>> which I changed.
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>
>>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if it
>>>>>>>> is picking up its configuration from some other location...
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ∞
>>>>>>>> Shashwat Shriparv
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>
>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list
>>>>>>>>> of all
>>>>>>>>> the deprecated properties here:
>>>>>>>>>
>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>> ).
>>>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>>>> change in properties names.
>>>>>>>>>
>>>>>>>>> Regards,
>>>>>>>>>
>>>>>>>>> Sourygna
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>> >
>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I might
>>>>>>>>> > not have configured it right. The conf dir is under templates in
>>>>>>>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>>>> > files to give
>>>>>>>>> > <property>
>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>> > </property>
>>>>>>>>> > and for mapred
>>>>>>>>> > <property>
>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>> > </property>
>>>>>>>>> >
>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>> > cannot read the configuration file.
>>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>> Does not
>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>> tracker
>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>> valid
>>>>>>>>> > host:port authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>> >
>>>>>>>>> >
>>>>>>>>> > Regards,
>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
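
For reference, Sourygna's rename as a concrete snippet: on releases where
fs.default.name is deprecated, the equivalent core-site.xml entry is
fs.defaultFS with the same value (a sketch; host and port are taken from the
original post):

<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>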

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
OK what the Hadoop home should be in ubuntu because the binary files in
/usr/bin
the hadoop-env.sh and othe xml file in /etc/hadoop
the conf files in /usr/share/hadoop/templates/conf

shall I use /usr as hadoop path because it is the dir that contain the bin
files

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:

> log out from the user. log in again and see if it works.
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can avoid the warning by setting the following prop to true in the
>> hadoop-env.sh file :
>> export HADOOP_HOME_WARN_SUPPRESS=true
>>
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Thank you Mohammad
>>> I still get the same error with this msg
>>>
>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>> I searched ~/.bashrc but only what I wrote is there.
>>>
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> you can do that using these command :
>>>>
>>>> sudo gedit ~/.bashrc
>>>>
>>>> then go to the end of the file and add this line :
>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>
>>>> after that use it to freeze the changes :
>>>> source ~/.bashrc
>>>>
>>>> to check it :
>>>> echo $HADOOP_HOME
>>>>
>>>> This will permanently set your HADOOP_HOME.
>>>>
>>>> HTH
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>>>> don't find it in the hadoop-env.sh
>>>>>
>>>>> Thank you Shashwat
>>>>> this is the output and it is already configured but hadoop don't read
>>>>> the configuration from here.
>>>>>
>>>>> /usr/share/maven-repo/org/apache
>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>> /commons-parent-debian-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>> /usr/share/compiz/composite.xml
>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> try
>>>>>> find / -type f -iname "*site.xml"
>>>>>> it will show you where ever those files are..
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> The problem is I tried I read the configuration file by changing
>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> but I think Hadoop dosen't get the configration from this dir, I
>>>>>>> trid and searched the system for conf dir the only dir is this one which I
>>>>>>> changed.
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>
>>>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it
>>>>>>>> is taking setting configuration from other location...
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ∞
>>>>>>>> Shashwat Shriparv
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>
>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list
>>>>>>>>> of all
>>>>>>>>> the deprecated properties here:
>>>>>>>>>
>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>> ).
>>>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>>>> change in properties names.
>>>>>>>>>
>>>>>>>>> Regards,
>>>>>>>>>
>>>>>>>>> Sourygna
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>> >
>>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might
>>>>>>>>> could not
>>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>>> > <property>
>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>> > </property>
>>>>>>>>> > and for mapred
>>>>>>>>> > <property>
>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>> > </property>
>>>>>>>>> >
>>>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>>>> cannot read
>>>>>>>>> > the configuration file.
>>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>>> > export
>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>> > but dosen't solve the problem.
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>> Does not
>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>> tracker
>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>> valid
>>>>>>>>> > host:port authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>> >
>>>>>>>>> >
>>>>>>>>> > Regards,
>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
OK, what should the Hadoop home be on Ubuntu? The binary files are in
/usr/bin,
the hadoop-env.sh and the other xml files are in /etc/hadoop,
and the conf files are in /usr/share/hadoop/templates/conf.

Shall I use /usr as the Hadoop path, since it is the dir that contains
the bin files? Something like the sketch below is what I have in mind.
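A minimal sketch of what I would put in ~/.bashrc, assuming the Debian
package layout above (whether HADOOP_CONF_DIR should point at
/etc/hadoop or at the templates copy is exactly what I am unsure
about):

export HADOOP_HOME=/usr
export HADOOP_CONF_DIR=/etc/hadoop

then reload the shell and check:

source ~/.bashrc
echo $HADOOP_HOME $HADOOP_CONF_DIR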

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <do...@gmail.com> wrote:

> log out from the user. log in again and see if it works.
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can avoid the warning by setting the following prop to true in the
>> hadoop-env.sh file :
>> export HADOOP_HOME_WARN_SUPPRESS=true
>>
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Thank you Mohammad
>>> I still get the same error with this msg
>>>
>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>> I searched ~/.bashrc but only what I wrote is there.
>>>
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> you can do that using these command :
>>>>
>>>> sudo gedit ~/.bashrc
>>>>
>>>> then go to the end of the file and add this line :
>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>
>>>> after that use it to freeze the changes :
>>>> source ~/.bashrc
>>>>
>>>> to check it :
>>>> echo $HADOOP_HOME
>>>>
>>>> This will permanently set your HADOOP_HOME.
>>>>
>>>> HTH
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>>>> don't find it in the hadoop-env.sh
>>>>>
>>>>> Thank you Shashwat
>>>>> this is the output and it is already configured but hadoop don't read
>>>>> the configuration from here.
>>>>>
>>>>> /usr/share/maven-repo/org/apache
>>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>>> /commons-parent-debian-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>> /usr/share/compiz/composite.xml
>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> try
>>>>>> find / -type f -iname "*site.xml"
>>>>>> it will show you where ever those files are..
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> The problem is I tried I read the configuration file by changing
>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> but I think Hadoop dosen't get the configration from this dir, I
>>>>>>> trid and searched the system for conf dir the only dir is this one which I
>>>>>>> changed.
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>
>>>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it
>>>>>>>> is taking setting configuration from other location...
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ∞
>>>>>>>> Shashwat Shriparv
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>
>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list
>>>>>>>>> of all
>>>>>>>>> the deprecated properties here:
>>>>>>>>>
>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>> ).
>>>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>>>> change in properties names.
>>>>>>>>>
>>>>>>>>> Regards,
>>>>>>>>>
>>>>>>>>> Sourygna
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>> >
>>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might
>>>>>>>>> could not
>>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>>> /usr/shar/hadoop. I
>>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>>> > <property>
>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>> > </property>
>>>>>>>>> > and for mapred
>>>>>>>>> > <property>
>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>> > </property>
>>>>>>>>> >
>>>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>>>> cannot read
>>>>>>>>> > the configuration file.
>>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>>> > export
>>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>> > but dosen't solve the problem.
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>> host:port
>>>>>>>>> > authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>> > at
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>> Does not
>>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>>>> > at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>> >
>>>>>>>>> > ________________________________
>>>>>>>>> >
>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>>> tracker
>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>>> valid
>>>>>>>>> > host:port authority: local at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>>> >
>>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>> >
>>>>>>>>> >
>>>>>>>>> > Regards,
>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
Log out from the user, log in again, and see if it works. A quick check
is sketched below.
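Once you are logged back in, one way to confirm the shell picked up the
new values (a sketch; the exports come from the earlier messages in
this thread):

echo $HADOOP_HOME
hadoop version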

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com> wrote:

> you can avoid the warning by setting the following prop to true in the
> hadoop-env.sh file :
> export HADOOP_HOME_WARN_SUPPRESS=true
>
>
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> Thank you Mohammad
>> I still get the same error with this msg
>>
>> localhost: Warning: $HADOOP_HOME is deprecated.
>> I searched ~/.bashrc but only what I wrote is there.
>>
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> you can do that using these command :
>>>
>>> sudo gedit ~/.bashrc
>>>
>>> then go to the end of the file and add this line :
>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>
>>> after that use it to freeze the changes :
>>> source ~/.bashrc
>>>
>>> to check it :
>>> echo $HADOOP_HOME
>>>
>>> This will permanently set your HADOOP_HOME.
>>>
>>> HTH
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>>> don't find it in the hadoop-env.sh
>>>>
>>>> Thank you Shashwat
>>>> this is the output and it is already configured but hadoop don't read
>>>> the configuration from here.
>>>>
>>>> /usr/share/maven-repo/org/apache
>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>> /commons-parent-debian-site.xml
>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>> /usr/share/compiz/composite.xml
>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>> dwivedishashwat@gmail.com> wrote:
>>>>
>>>>> try
>>>>> find / -type f -iname "*site.xml"
>>>>> it will show you where ever those files are..
>>>>>
>>>>>
>>>>>
>>>>> ∞
>>>>> Shashwat Shriparv
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> The problem is I tried I read the configuration file by changing
>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>> but I think Hadoop dosen't get the configration from this dir, I trid
>>>>>> and searched the system for conf dir the only dir is this one which I
>>>>>> changed.
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>
>>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>>>>>> taking setting configuration from other location...
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ∞
>>>>>>> Shashwat Shriparv
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>
>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>>> all
>>>>>>>> the deprecated properties here:
>>>>>>>>
>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>> ).
>>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>>> change in properties names.
>>>>>>>>
>>>>>>>> Regards,
>>>>>>>>
>>>>>>>> Sourygna
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>> >
>>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might
>>>>>>>> could not
>>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>>> /usr/shar/hadoop. I
>>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>>> > <property>
>>>>>>>> > <name>fs.default.name</name>
>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>> > </property>
>>>>>>>> > and for mapred
>>>>>>>> > <property>
>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>> > <value>localhost:9001</value>
>>>>>>>> > </property>
>>>>>>>> >
>>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>>> cannot read
>>>>>>>> > the configuration file.
>>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>>> > export
>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>> > but dosen't solve the problem.
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: local at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>> Does not
>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>> tracker
>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>> valid
>>>>>>>> > host:port authority: local at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > Regards,
>>>>>>>> > Mohammad Alkahtani
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
log out from the user. log in again and see if it works.

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <do...@gmail.com> wrote:

> you can avoid the warning by setting the following prop to true in the
> hadoop-env.sh file :
> export HADOOP_HOME_WARN_SUPPRESS=true
>
>
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> Thank you Mohammad
>> I still get the same error with this msg
>>
>> localhost: Warning: $HADOOP_HOME is deprecated.
>> I searched ~/.bashrc but only what I wrote is there.
>>
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> you can do that using these command :
>>>
>>> sudo gedit ~/.bashrc
>>>
>>> then go to the end of the file and add this line :
>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>
>>> after that use it to freeze the changes :
>>> source ~/.bashrc
>>>
>>> to check it :
>>> echo $HADOOP_HOME
>>>
>>> This will permanently set your HADOOP_HOME.
>>>
>>> HTH
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>>> don't find it in the hadoop-env.sh
>>>>
>>>> Thank you Shashwat
>>>> this is the output and it is already configured but hadoop don't read
>>>> the configuration from here.
>>>>
>>>> /usr/share/maven-repo/org/apache
>>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>>> /commons-parent-debian-site.xml
>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>> /usr/share/compiz/composite.xml
>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>> dwivedishashwat@gmail.com> wrote:
>>>>
>>>>> try
>>>>> find / -type f -iname "*site.xml"
>>>>> it will show you where ever those files are..
>>>>>
>>>>>
>>>>>
>>>>> ∞
>>>>> Shashwat Shriparv
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> The problem is that I tried to make Hadoop read the configuration file by
>>>>>> changing
>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>>>>> and searched the system for a conf dir; the only one is this, which I
>>>>>> changed.
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>
>>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>>>>>>> is picking up its configuration from another location...
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ∞
>>>>>>> Shashwat Shriparv
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>
>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>>> all
>>>>>>>> the deprecated properties here:
>>>>>>>>
>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>> ).
>>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>>> change in property names.
>>>>>>>>
>>>>>>>> Regards,
>>>>>>>>
>>>>>>>> Sourygna
>>>>>>>>
>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>> <m....@gmail.com> wrote:
>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>> >
>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I may not
>>>>>>>> > have configured it correctly. The conf dir is under templates in
>>>>>>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>>> > files to set
>>>>>>>> > <property>
>>>>>>>> > <name>fs.default.name</name>
>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>> > </property>
>>>>>>>> > and for mapred
>>>>>>>> > <property>
>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>> > <value>localhost:9001</value>
>>>>>>>> > </property>
>>>>>>>> >
>>>>>>>> > but I get these errors; I assume the problem is that Hadoop cannot
>>>>>>>> > read the configuration file.
>>>>>>>> > I changed the hadoop-env.sh to
>>>>>>>> > export
>>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>> > but that doesn't solve the problem.
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: local at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>> host:port
>>>>>>>> > authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>> > at
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>> Does not
>>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>> > at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>> >
>>>>>>>> > ________________________________
>>>>>>>> >
>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>>> tracker
>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>>> valid
>>>>>>>> > host:port authority: local at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>>>>> >
>>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > Regards,
>>>>>>>> > Mohammad Alkahtani
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
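
For reference, a minimal bash sketch combining the exports suggested in this
thread; the /usr/share/hadoop paths are assumptions based on the .deb layout
described above, not confirmed values:

cat >> ~/.bashrc <<'EOF'
export HADOOP_HOME=/usr/share/hadoop                        # assumed install root
export HADOOP_CONF_DIR=/usr/share/hadoop/templates/conf     # dir with the edited *-site.xml
export HADOOP_HOME_WARN_SUPPRESS=true                       # silences the deprecation warning
EOF
source ~/.bashrc     # apply to the current shell (or log out and back in)
echo "$HADOOP_HOME"  # should print /usr/share/hadoop, not a blank line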

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can avoid the warning by setting the following property to true in the
hadoop-env.sh file:
export HADOOP_HOME_WARN_SUPPRESS=true



Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani
<m....@gmail.com> wrote:

> Thank you Mohammad,
> I still get the same error, with this message:
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> you can do that using these commands:
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line:
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that, source the file to apply the changes:
>> source ~/.bashrc
>>
>> to check it:
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I can't
>>> find it in the hadoop-env.sh.
>>>
>>> Thank you Shashwat,
>>> this is the output, and it is already configured, but Hadoop doesn't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you wherever those files are.
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is that I tried to make Hadoop read the configuration file by
>>>>> changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>>>> and searched the system for a conf dir; the only one is this, which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>>>>>> is picking up its configuration from another location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in property names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I may not
>>>>>>> > have configured it correctly. The conf dir is under templates in
>>>>>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>> > files to set
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but I get these errors; I assume the problem is that Hadoop cannot
>>>>>>> > read the configuration file.
>>>>>>> > I changed the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but that doesn't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
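
Sourygna's renamed-property point can be covered defensively: Hadoop carries
along property names it does not recognize, so setting both the 1.x key and
its 2.x replacement should be harmless. A minimal bash sketch, assuming
$HADOOP_CONF_DIR points at the directory Hadoop actually reads:

cat > "$HADOOP_CONF_DIR/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>          <!-- Hadoop 1.x name -->
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>fs.defaultFS</name>             <!-- Hadoop 2.x replacement -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF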

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I tried echo $HADOOP_HOME

and I got a blank result

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:37 PM, Mohammad Alkahtani
<m....@gmail.com> wrote:

> Thank you Mohammad,
> I still get the same error, with this message:
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> you can do that using these commands:
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line:
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that, source the file to apply the changes:
>> source ~/.bashrc
>>
>> to check it:
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I can't
>>> find it in the hadoop-env.sh.
>>>
>>> Thank you Shashwat,
>>> this is the output, and it is already configured, but Hadoop doesn't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you wherever those files are.
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is that I tried to make Hadoop read the configuration file by
>>>>> changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>>>> and searched the system for a conf dir; the only one is this, which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>>>>>> is picking up its configuration from another location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in property names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I may not
>>>>>>> > have configured it correctly. The conf dir is under templates in
>>>>>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>> > files to set
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but I get these errors; I assume the problem is that Hadoop cannot
>>>>>>> > read the configuration file.
>>>>>>> > I changed the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but that doesn't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
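
A blank result from echo $HADOOP_HOME means the export is not visible in the
current shell, which is exactly what the log-out-and-back-in advice addresses.
A quick bash check, assuming the exports were added to ~/.bashrc as above:

grep HADOOP ~/.bashrc   # confirm the export lines were actually saved
source ~/.bashrc        # apply them without logging out
env | grep HADOOP       # HADOOP_HOME and HADOOP_CONF_DIR should both appear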

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I tried echo $HADOOP_HOME

and I got blank

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:37 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Mohammad
> I still get the same error with this msg
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can do that using these command :
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line :
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that use it to freeze the changes :
>> source ~/.bashrc
>>
>> to check it :
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>> don't find it in the hadoop-env.sh
>>>
>>> Thank you Shashwat
>>> this is the output and it is already configured but hadoop don't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you where ever those files are..
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is I tried I read the configuration file by changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop dosen't get the configration from this dir, I trid
>>>>> and searched the system for conf dir the only dir is this one which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>>>>> taking setting configuration from other location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in properties names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>>>> not
>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>> /usr/shar/hadoop. I
>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>> cannot read
>>>>>>> > the configuration file.
>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but dosen't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can avoid the warning by setting the following prop to true in the
hadoop-env.sh file :
export HADOOP_HOME_WARN_SUPPRESS=true



Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Mohammad
> I still get the same error with this msg
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can do that using these command :
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line :
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that use it to freeze the changes :
>> source ~/.bashrc
>>
>> to check it :
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>> don't find it in the hadoop-env.sh
>>>
>>> Thank you Shashwat
>>> this is the output and it is already configured but hadoop don't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you where ever those files are..
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is I tried I read the configuration file by changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop dosen't get the configration from this dir, I trid
>>>>> and searched the system for conf dir the only dir is this one which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>>>>> taking setting configuration from other location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in properties names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>>>> not
>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>> /usr/shar/hadoop. I
>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>> cannot read
>>>>>>> > the configuration file.
>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but dosen't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can avoid the warning by setting the following prop to true in the
hadoop-env.sh file :
export HADOOP_HOME_WARN_SUPPRESS=true



Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Mohammad
> I still get the same error with this msg
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can do that using these command :
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line :
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that use it to freeze the changes :
>> source ~/.bashrc
>>
>> to check it :
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I
>>> don't find it in the hadoop-env.sh
>>>
>>> Thank you Shashwat
>>> this is the output and it is already configured but hadoop don't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you where ever those files are..
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is I tried I read the configuration file by changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop dosen't get the configration from this dir, I trid
>>>>> and searched the system for conf dir the only dir is this one which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>>>>> taking setting configuration from other location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in properties names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>>>> not
>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>> /usr/shar/hadoop. I
>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>> cannot read
>>>>>>> > the configuration file.
>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but dosen't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can avoid the warning by setting the following property to true in the
hadoop-env.sh file:
export HADOOP_HOME_WARN_SUPPRESS=true
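
For example, a minimal sketch, assuming the daemons read hadoop-env.sh from
the templates conf dir mentioned in the quoted messages below (adjust the
path if your layout differs):

# append the setting, then restart so the running daemons pick it up
echo 'export HADOOP_HOME_WARN_SUPPRESS=true' >> /usr/share/hadoop/templates/conf/hadoop-env.sh
stop-all.sh && start-all.sh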



Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Thank you Mohammad
> I still get the same error with this message:
>
> localhost: Warning: $HADOOP_HOME is deprecated.
> I searched ~/.bashrc, but only what I wrote is there.
>
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> you can do that using these commands:
>>
>> sudo gedit ~/.bashrc
>>
>> then go to the end of the file and add this line:
>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>
>> after that, source it to apply the changes:
>> source ~/.bashrc
>>
>> to check it:
>> echo $HADOOP_HOME
>>
>> This will permanently set your HADOOP_HOME.
>>
>> HTH
>>
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>>> don't find it in the hadoop-env.sh.
>>>
>>> Thank you Shashwat,
>>> this is the output; it is already configured, but Hadoop doesn't read
>>> the configuration from here.
>>>
>>> /usr/share/maven-repo/org/apache
>>> /commons/commons-parent/22/commons-parent-22-site.xml
>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>>> /commons-parent-debian-site.xml
>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>> /usr/share/compiz/composite.xml
>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>> /usr/share/hadoop/templates/conf/core-site.xml
>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> try
>>>> find / -type f -iname "*site.xml"
>>>> it will show you wherever those files are.
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> The problem is that I tried to make Hadoop read the configuration file by changing
>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>>>> and searched the system for a conf dir; the only one is this, which I
>>>>> changed.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>
>>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if it is
>>>>>> taking its configuration from some other location...
>>>>>>
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>> luangsay@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> What is the version of Hadoop you use?
>>>>>>>
>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>> all
>>>>>>> the deprecated properties here:
>>>>>>>
>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>> ).
>>>>>>> I remember I once had a similar error message and it was due to the
>>>>>>> change in property names.
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sourygna
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>> <m....@gmail.com> wrote:
>>>>>>> > Hi to all users of Hadoop,
>>>>>>> >
>>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>>>> not
>>>>>>> > configure it right. The conf dir is under templates in
>>>>>>> /usr/shar/hadoop. I
>>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>>> > <property>
>>>>>>> > <name>fs.default.name</name>
>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>> > </property>
>>>>>>> > and for mapred
>>>>>>> > <property>
>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>> > <value>localhost:9001</value>
>>>>>>> > </property>
>>>>>>> >
>>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>>> cannot read
>>>>>>> > the configuration file.
>>>>>>> > I chaned the hadoop-env.sh to
>>>>>>> > export
>>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>> > but dosen't solve the problem.
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>> host:port
>>>>>>> > authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>> > at
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>> Does not
>>>>>>> > contain a valid host:port authority: file:/// at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>>> > at
>>>>>>> >
>>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>> >
>>>>>>> > ________________________________
>>>>>>> >
>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>>> tracker
>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a
>>>>>>> valid
>>>>>>> > host:port authority: local at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>> at
>>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>> at
>>>>>>> >
>>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>> >
>>>>>>> >
>>>>>>> > Regards,
>>>>>>> > Mohammad Alkahtani
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Thank you Mohammad
I still get the same error with this message:

localhost: Warning: $HADOOP_HOME is deprecated.
I searched ~/.bashrc, but only what I wrote is there.
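
A fuller search along these lines might help pin down where it gets set (a
sketch, assuming a stock Ubuntu shell setup; adjust the file list to yours):

grep -n HADOOP_HOME ~/.bashrc ~/.profile /etc/profile /etc/environment 2>/dev/null
grep -rn HADOOP_HOME /etc/profile.d 2>/dev/null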


Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <do...@gmail.com> wrote:

> you can do that using these commands:
>
> sudo gedit ~/.bashrc
>
> then go to the end of the file and add this line:
> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>
> after that, source it to apply the changes:
> source ~/.bashrc
>
> to check it:
> echo $HADOOP_HOME
>
> This will permanently set your HADOOP_HOME.
>
> HTH
>
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> Hi Tariq, could you please tell me how to set HADOOP_HOME? I
>> don't find it in the hadoop-env.sh.
>>
>> Thank you Shashwat,
>> this is the output; it is already configured, but Hadoop doesn't read
>> the configuration from here.
>>
>> /usr/share/maven-repo/org/apache
>> /commons/commons-parent/22/commons-parent-22-site.xml
>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
>> /commons-parent-debian-site.xml
>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>> /usr/share/compiz/composite.xml
>> /usr/share/hadoop/templates/conf/mapred-site.xml
>> /usr/share/hadoop/templates/conf/core-site.xml
>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>> dwivedishashwat@gmail.com> wrote:
>>
>>> try
>>> find / -type f -iname "*site.xml"
>>> it will show you wherever those files are.
>>>
>>>
>>>
>>> ∞
>>> Shashwat Shriparv
>>>
>>>
>>>
>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> The problem is that I tried to make Hadoop read the configuration file by changing
>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>>> and searched the system for a conf dir; the only one is this, which I
>>>> changed.
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>> dwivedishashwat@gmail.com> wrote:
>>>>
>>>>> Yes, it is asking for file:/// instead of hdfs://; just check if it is
>>>>> taking its configuration from some other location...
>>>>>
>>>>>
>>>>>
>>>>> ∞
>>>>> Shashwat Shriparv
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>> luangsay@gmail.com> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> What is the version of Hadoop you use?
>>>>>>
>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>> all
>>>>>> the deprecated properties here:
>>>>>>
>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>> ).
>>>>>> I remember I once had a similar error message and it was due to the
>>>>>> change in property names.
>>>>>>
>>>>>> Regards,
>>>>>>
>>>>>> Sourygna
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>> <m....@gmail.com> wrote:
>>>>>> > Hi to all users of Hadoop,
>>>>>> >
>>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>>> not
>>>>>> > configure it right. The conf dir is under templates in
>>>>>> /usr/shar/hadoop. I
>>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>>> > <property>
>>>>>> > <name>fs.default.name</name>
>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>> > </property>
>>>>>> > and for mapred
>>>>>> > <property>
>>>>>> > <name>mapred.job.tracker</name>
>>>>>> > <value>localhost:9001</value>
>>>>>> > </property>
>>>>>> >
>>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>>> cannot read
>>>>>> > the configuration file.
>>>>>> > I chaned the hadoop-env.sh to
>>>>>> > export
>>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>> > but dosen't solve the problem.
>>>>>> >
>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>> host:port
>>>>>> > authority: file:/// at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>> at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>> > at
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>> > at
>>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>> >
>>>>>> > ________________________________
>>>>>> >
>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>> host:port
>>>>>> > authority: local at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>> at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>> at
>>>>>> >
>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>>> >
>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>>> >
>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>>> >
>>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>> >
>>>>>> > ________________________________
>>>>>> >
>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>> host:port
>>>>>> > authority: file:/// at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>> at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>> > at
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>> > at
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>> >
>>>>>> > ________________________________
>>>>>> >
>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does
>>>>>> not
>>>>>> > contain a valid host:port authority: file:/// at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>> at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>>> > at
>>>>>> >
>>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>> >
>>>>>> > ________________________________
>>>>>> >
>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>>> tracker
>>>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>>>> > host:port authority: local at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>> at
>>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>> at
>>>>>> >
>>>>>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>> >
>>>>>> >
>>>>>> > Regards,
>>>>>> > Mohammad Alkahtani
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can do that using these commands:

sudo gedit ~/.bashrc

then go to the end of the file and add this line:
export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH

after that, source it to apply the changes:
source ~/.bashrc

to check it:
echo $HADOOP_HOME

This will permanently set your HADOOP_HOME.
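
For example, a minimal sketch, assuming the /usr/share/hadoop layout shown by
the find output quoted below (replace the path if yours differs):

echo 'export HADOOP_HOME=/usr/share/hadoop' >> ~/.bashrc
source ~/.bashrc
echo $HADOOP_HOME    # should print /usr/share/hadoop

Keep in mind that ~/.bashrc is only read by interactive shells, so daemons
started another way may still not see the variable.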

HTH


Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Hi Tariq, could you please tell me how to set HADOOP_HOME? I don't
> find it in the hadoop-env.sh.
>
> Thank you Shashwat,
> this is the output; it is already configured, but Hadoop doesn't read the
> configuration from here.
>
> /usr/share/maven-repo/org/apache
> /commons/commons-parent/22/commons-parent-22-site.xml
> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
> /commons-parent-debian-site.xml
> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
> /usr/share/compiz/composite.xml
> /usr/share/hadoop/templates/conf/mapred-site.xml
> /usr/share/hadoop/templates/conf/core-site.xml
> /usr/share/hadoop/templates/conf/hdfs-site.xml
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
> dwivedishashwat@gmail.com> wrote:
>
>> try
>> find / -type f -iname "*site.xml"
>> it will show you wherever those files are.
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> The problem is that I tried to make Hadoop read the configuration file by changing
>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>> and searched the system for a conf dir; the only one is this, which I
>>> changed.
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> Yes, it is asking for file:/// instead of hdfs://; just check if it is
>>>> taking its configuration from some other location...
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <luangsay@gmail.com
>>>> > wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> What is the version of Hadoop you use?
>>>>>
>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>> the deprecated properties here:
>>>>>
>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>> ).
>>>>> I remember I once had a similar error message and it was due to the
>>>>> change in property names.
>>>>>
>>>>> Regards,
>>>>>
>>>>> Sourygna
>>>>>
>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>> <m....@gmail.com> wrote:
>>>>> > Hi to all users of Hadoop,
>>>>> >
>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>> not
>>>>> > configure it right. The conf dir is under templates in
>>>>> /usr/shar/hadoop. I
>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>> > <property>
>>>>> > <name>fs.default.name</name>
>>>>> > <value>hdfs://localhost:9000</value>
>>>>> > </property>
>>>>> > and for mapred
>>>>> > <property>
>>>>> > <name>mapred.job.tracker</name>
>>>>> > <value>localhost:9001</value>
>>>>> > </property>
>>>>> >
>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>> cannot read
>>>>> > the configuration file.
>>>>> > I chaned the hadoop-env.sh to
>>>>> > export
>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> > but dosen't solve the problem.
>>>>> >
>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: local at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>> at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>> >
>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>> >
>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does
>>>>> not
>>>>> > contain a valid host:port authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>> tracker
>>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>>> > host:port authority: local at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>> at
>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>> >
>>>>> >
>>>>> > Regards,
>>>>> > Mohammad Alkahtani
>>>>>
>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
you can do that using these commands:

sudo gedit ~/.bashrc

then go to the end of the file and add this line:
export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH

after that, source the file to apply the changes:
source ~/.bashrc

to check it:
echo $HADOOP_HOME

This will permanently set your HADOOP_HOME.
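
For example, assuming the .deb layout discussed in this thread (the paths
are a guess, adjust them to wherever the package put hadoop), the end of
~/.bashrc could look like:

export HADOOP_HOME=/usr/share/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/templates/conf
export PATH=$PATH:$HADOOP_HOME/bin

Note that daemons started from init scripts do not read ~/.bashrc, so the
same exports may also be needed in hadoop-env.sh.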

HTH


Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani
<m....@gmail.com>wrote:

> Hi Tariq, Could you please tell me how to set HADOOP_HOME because I don't
> find it in the hadoop-env.sh
>
> Thank you Shashwat
> this is the output and it is already configured but hadoop don't read the
> configuration from here.
>
> /usr/share/maven-repo/org/apache
> /commons/commons-parent/22/commons-parent-22-site.xml
> /usr/share/maven-repo/org/apache/commons/commons-parent/debian
> /commons-parent-debian-site.xml
> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
> /usr/share/compiz/composite.xml
> /usr/share/hadoop/templates/conf/mapred-site.xml
> /usr/share/hadoop/templates/conf/core-site.xml
> /usr/share/hadoop/templates/conf/hdfs-site.xml
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
> dwivedishashwat@gmail.com> wrote:
>
>> try
>> find / -type f -iname "*site.xml"
>> it will show you where ever those files are..
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>> m.alkahtani@gmail.com> wrote:
>>
>>> The problem is I tried I read the configuration file by changing
>>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>>> DIR:-"/usr/shar/hadoop/templates/conf"}
>>> but I think Hadoop dosen't get the configration from this dir, I trid
>>> and searched the system for conf dir the only dir is this one which I
>>> changed.
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>> dwivedishashwat@gmail.com> wrote:
>>>
>>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>>> taking setting configuration from other location...
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <luangsay@gmail.com
>>>> > wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> What is the version of Hadoop you use?
>>>>>
>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>> the deprecated properties here:
>>>>>
>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>> ).
>>>>> I remember I once had a similar error message and it was due to the
>>>>> change in properties names.
>>>>>
>>>>> Regards,
>>>>>
>>>>> Sourygna
>>>>>
>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>> <m....@gmail.com> wrote:
>>>>> > Hi to all users of Hadoop,
>>>>> >
>>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could
>>>>> not
>>>>> > configure it right. The conf dir is under templates in
>>>>> /usr/shar/hadoop. I
>>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>>> > <property>
>>>>> > <name>fs.default.name</name>
>>>>> > <value>hdfs://localhost:9000</value>
>>>>> > </property>
>>>>> > and for mapred
>>>>> > <property>
>>>>> > <name>mapred.job.tracker</name>
>>>>> > <value>localhost:9001</value>
>>>>> > </property>
>>>>> >
>>>>> > but i get these errors, I assume that there is problem, Hadoop
>>>>> cannot read
>>>>> > the configuration file.
>>>>> > I chaned the hadoop-env.sh to
>>>>> > export
>>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> > but dosen't solve the problem.
>>>>> >
>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: local at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>> at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>>> >
>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>>>>> >
>>>>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>> host:port
>>>>> > authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>> > at
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does
>>>>> not
>>>>> > contain a valid host:port authority: file:/// at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>>> > at
>>>>> >
>>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>> >
>>>>> > ________________________________
>>>>> >
>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task
>>>>> tracker
>>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>>> > host:port authority: local at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>> at
>>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>> >
>>>>> >
>>>>> > Regards,
>>>>> > Mohammad Alkahtani
>>>>>
>>>>
>>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
Hi Tariq, could you please tell me how to set HADOOP_HOME? I don't find it
in hadoop-env.sh.

Thank you Shashwat,
this is the output; the files are already configured, but Hadoop doesn't
read the configuration from there.

/usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
/usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
/usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
/usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
/usr/share/compiz/composite.xml
/usr/share/hadoop/templates/conf/mapred-site.xml
/usr/share/hadoop/templates/conf/core-site.xml
/usr/share/hadoop/templates/conf/hdfs-site.xml
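
One way to check which conf dir hadoop actually resolves (just a sketch; it
relies on the stock bin/hadoop script putting $HADOOP_CONF_DIR first on the
classpath):

hadoop classpath | tr ':' '\n' | head -n 1

If that prints anything other than /usr/share/hadoop/templates/conf, the
export in hadoop-env.sh is not being picked up.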

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
dwivedishashwat@gmail.com> wrote:

> try
> find / -type f -iname "*site.xml"
> it will show you where ever those files are..
>
>
>
> ∞
> Shashwat Shriparv
>
>
>
> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
> m.alkahtani@gmail.com> wrote:
>
>> The problem is I tried I read the configuration file by changing
>> export HADOOP_CONF_DIR=${HADOOP_CONF_
>> DIR:-"/usr/shar/hadoop/templates/conf"}
>> but I think Hadoop dosen't get the configration from this dir, I trid and
>> searched the system for conf dir the only dir is this one which I changed.
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>> dwivedishashwat@gmail.com> wrote:
>>
>>> Ye its is asking for file:/// instead of hdfs:// just check if it is
>>> taking setting configuration from other location...
>>>
>>>
>>>
>>> ∞
>>> Shashwat Shriparv
>>>
>>>
>>>
>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <lu...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> What is the version of Hadoop you use?
>>>>
>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>> the deprecated properties here:
>>>>
>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>> ).
>>>> I remember I once had a similar error message and it was due to the
>>>> change in properties names.
>>>>
>>>> Regards,
>>>>
>>>> Sourygna
>>>>
>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>> <m....@gmail.com> wrote:
>>>> > Hi to all users of Hadoop,
>>>> >
>>>> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could not
>>>> > configure it right. The conf dir is under templates in
>>>> /usr/shar/hadoop. I
>>>> > edit the core-site.xml, mapred-site.xml files to give
>>>> > <property>
>>>> > <name>fs.default.name</name>
>>>> > <value>hdfs://localhost:9000</value>
>>>> > </property>
>>>> > and for mapred
>>>> > <property>
>>>> > <name>mapred.job.tracker</name>
>>>> > <value>localhost:9001</value>
>>>> > </property>
>>>> >
>>>> > but i get these errors, I assume that there is problem, Hadoop cannot
>>>> read
>>>> > the configuration file.
>>>> > I chaned the hadoop-env.sh to
>>>> > export
>>>> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>> > but dosen't solve the problem.
>>>> >
>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>> > authority: file:/// at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>> > at
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>>>> >
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>> > at
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>> >
>>>> > ________________________________
>>>> >
>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>> > authority: local at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>> at
>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>>>> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
>>>> at
>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
>>>> at
>>>> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>> >
>>>> > ________________________________
>>>> >
>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>> > authority: file:/// at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>> > at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>> > at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>> >
>>>> > ________________________________
>>>> >
>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does
>>>> not
>>>> > contain a valid host:port authority: file:/// at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>>>> > at
>>>> >
>>>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>> >
>>>> > ________________________________
>>>> >
>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>> > host:port authority: local at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>>>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>>>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>> at
>>>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>>>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>> >
>>>> >
>>>> > Regards,
>>>> > Mohammad Alkahtani
>>>>
>>>
>>>
>>
>

Re: Hadoop Debian Package

Posted by shashwat shriparv <dw...@gmail.com>.
Try:
find / -type f -iname "*site.xml"
It will show you wherever those files are.
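
And as a quick cross-check, assuming the stock bin/hadoop wrapper from the
1.x packages (it puts HADOOP_CONF_DIR first on the classpath, so the first
entry shows which conf dir the scripts actually resolved):

# print the conf dir the hadoop wrapper script settled on
hadoop classpath | cut -d: -f1
# compare it with the locations the find above turns up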



∞
Shashwat Shriparv



On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <m....@gmail.com> wrote:

> The problem is that I tried to make Hadoop read the configuration file by
> changing
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but I think Hadoop doesn't get the configuration from this dir. I tried and
> searched the system for a conf dir; the only one is this one, which I changed.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <dwivedishashwat@gmail.com> wrote:
>
>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>> is picking up its configuration from some other location...
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <lu...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> What is the version of Hadoop you use?
>>>
>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>> the deprecated properties here:
>>>
>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>> ).
>>> I remember I once had a similar error message and it was due to the
>>> change in property names.
>>>
>>> Regards,
>>>
>>> Sourygna
>>>
>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>> <m....@gmail.com> wrote:
>>> > Hi to all users of Hadoop,
>>> >
>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I may not have
>>> > configured it right. The conf dir is under templates in /usr/shar/hadoop.
>>> > I edited the core-site.xml and mapred-site.xml files to set
>>> > <property>
>>> > <name>fs.default.name</name>
>>> > <value>hdfs://localhost:9000</value>
>>> > </property>
>>> > and for mapred
>>> > <property>
>>> > <name>mapred.job.tracker</name>
>>> > <value>localhost:9001</value>
>>> > </property>
>>> >
>>> > but I get these errors, so I assume there is a problem: Hadoop cannot
>>> > read the configuration file.
>>> > I changed hadoop-env.sh to
>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>> > but that doesn't solve the problem.
>>> >
>>> > Regards,
>>> > Mohammad Alkahtani
>>>
>>
>>
>

Re: Hadoop Debian Package

Posted by Mohammad Tariq <do...@gmail.com>.
Hello Mohammad,

      Have you set your HADOOP_HOME properly?
Please check it once.
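
For instance, something like this in the shell you start the daemons from
(plain environment checks, nothing Hadoop-specific beyond the variable
names):

# both should print existing paths
echo "HADOOP_HOME=$HADOOP_HOME"
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
# the conf dir should contain the files you edited
ls -l "$HADOOP_CONF_DIR"/core-site.xml "$HADOOP_CONF_DIR"/mapred-site.xml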

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <m....@gmail.com> wrote:

> The problem is that I tried to make Hadoop read the configuration file by
> changing
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but I think Hadoop doesn't get the configuration from this dir. I tried and
> searched the system for a conf dir; the only one is this one, which I changed.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
The problem is that I tried to make Hadoop read the configuration file by
changing
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
but I think Hadoop doesn't get the configuration from this dir. I tried and
searched the system for a conf dir; the only one is this one, which I changed.
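
One thing I notice now: hadoop-env.sh is only sourced from the conf dir
after the scripts have already chosen that dir, so exporting
HADOOP_CONF_DIR inside it probably cannot redirect them. If I read the
wrapper scripts right, the variable has to be set in the calling shell, or
the dir passed with --config. A sketch of what I will try next; the
/etc/hadoop target is my own guess, not something the package documents:

# copy the templates somewhere stable (path spelled as it is on my system)
sudo cp -r /usr/shar/hadoop/templates/conf /etc/hadoop
# point the calling shell at the copy before starting anything
export HADOOP_CONF_DIR=/etc/hadoop
# or pass the dir explicitly, e.g.
hadoop --config /etc/hadoop namenode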

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <dwivedishashwat@gmail.com> wrote:

> Yes, it is asking for file:/// instead of hdfs://; just check whether it
> is picking up its configuration from some other location...
>
>
>
> ∞
> Shashwat Shriparv

>> > at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>> >
>> > ________________________________
>> >
>> > Exception in thread "main" java.lang.IllegalArgumentException: Does not
>> > contain a valid host:port authority: file:/// at
>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> >
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> > at
>> >
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> > at
>> >
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>> > at
>> >
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>> > at
>> >
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>> > at
>> >
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>> >
>> > ________________________________
>> >
>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>> > because java.lang.IllegalArgumentException: Does not contain a valid
>> > host:port authority: local at
>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>> >
>> >
>> > Regards,
>> > Mohammad Alkahtani
>>
>
>

Re: Hadoop Debian Package

Posted by shashwat shriparv <dw...@gmail.com>.
Yes, it is asking for file:/// instead of hdfs://; just check whether it is
picking up its configuration from another location...
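
One way to see which settings the daemons actually load (a sketch; it assumes the hadoop launcher is on the PATH and relies on Configuration.main, which in Hadoop 1.x dumps the merged configuration as XML):

hadoop org.apache.hadoop.conf.Configuration > /tmp/effective-conf.xml
# The dump may be a single long line; search it for the filesystem URI.
grep fs.default.name /tmp/effective-conf.xml
# A value of file:/// here means the edited core-site.xml was never read.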



∞
Shashwat Shriparv



On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <lu...@gmail.com> wrote:

> Hi,
>
> What is the version of Hadoop you use?
>
> Try using fs.defaultFS instead of fs.default.name (see the list of all
> the deprecated properties here:
>
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
> ).
> I remember I once had a similar error message and it was due to the
> change in properties names.
>
> Regards,
>
> Sourygna
>
> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
> <m....@gmail.com> wrote:
> > Hi to all users of Hadoop,
> >
> > I installed Hadoop the .deb file on Ubuntu 12.04 but I might could not
> > configure it right. The conf dir is under templates in /usr/shar/hadoop.
> I
> > edit the core-site.xml, mapred-site.xml files to give
> > <property>
> > <name>fs.default.name</name>
> > <value>hdfs://localhost:9000</value>
> > </property>
> > and for mapred
> > <property>
> > <name>mapred.job.tracker</name>
> > <value>localhost:9001</value>
> > </property>
> >
> > but i get these errors, I assume that there is problem, Hadoop cannot
> read
> > the configuration file.
> > I chaned the hadoop-env.sh to
> > export
> HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> > but dosen't solve the problem.
> >
> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port
> > authority: file:/// at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> > at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> > at org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> > at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> > at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> > at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> > at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
> >
> > ________________________________
> >
> > FATAL org.apache.hadoop.mapred.JobTracker:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port
> > authority: local at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
> > org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
> > org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
> >
> > ________________________________
> >
> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port
> > authority: file:/// at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
> > at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
> > at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
> >
> > ________________________________
> >
> > Exception in thread "main" java.lang.IllegalArgumentException: Does not
> > contain a valid host:port authority: file:/// at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
> > at
> >
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
> >
> > ________________________________
> >
> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
> > because java.lang.IllegalArgumentException: Does not contain a valid
> > host:port authority: local at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> > org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> > org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> > org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
> > org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
> >
> >
> > Regards,
> > Mohammad Alkahtani
>

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I use hadoop-1.1.2; I will try that and get back to you.
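
A quick way to confirm which release the installed binaries actually are, in case the .deb pulled in something different:

hadoop version   # prints the running Hadoop release, e.g. Hadoop 1.1.2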

Regards,
Mohammad Alkahtani


On 17 Mar 2013, at 08:37 PM, Luangsay Sourygna <lu...@gmail.com> wrote:

> Hi,
> 
> What is the version of Hadoop you use?
> 
> Try using fs.defaultFS instead of fs.default.name (see the list of all
> the deprecated properties here:
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
> I remember I once had a similar error message and it was due to the
> change in properties names.
> 
> Regards,
> 
> Sourygna
> 
> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
> <m....@gmail.com> wrote:
>> Hi to all users of Hadoop,
>> 
>> I installed Hadoop the .deb file on Ubuntu 12.04 but I might could not
>> configure it right. The conf dir is under templates in /usr/shar/hadoop. I
>> edit the core-site.xml, mapred-site.xml files to give
>> <property>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>> </property>
>> and for mapred
>> <property>
>> <name>mapred.job.tracker</name>
>> <value>localhost:9001</value>
>> </property>
>> 
>> but i get these errors, I assume that there is problem, Hadoop cannot read
>> the configuration file.
>> I chaned the hadoop-env.sh to
>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>> but dosen't solve the problem.
>> 
>> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>> at org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>> 
>> ________________________________
>> 
>> FATAL org.apache.hadoop.mapred.JobTracker:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: local at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
>> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
>> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>> 
>> ________________________________
>> 
>> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>> at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>> at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>> 
>> ________________________________
>> 
>> Exception in thread "main" java.lang.IllegalArgumentException: Does not
>> contain a valid host:port authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.(SecondaryNameNode.java:135)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>> 
>> ________________________________
>> 
>> ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>> because java.lang.IllegalArgumentException: Does not contain a valid
>> host:port authority: local at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>> org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1532) at
>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>> 
>> 
>> Regards,
>> Mohammad Alkahtani

Re: Hadoop Debian Package

Posted by Luangsay Sourygna <lu...@gmail.com>.
Hi,

Which version of Hadoop are you using?

Try using fs.defaultFS instead of fs.default.name (see the list of all
the deprecated properties here:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
I remember I once had a similar error message and it was due to the
change in property names.
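
With the newer name, the core-site.xml entry from the original message would become (a sketch; the localhost:9000 value is simply carried over from the thread):

<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>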

Regards,

Sourygna

On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
<m....@gmail.com> wrote:
> Hi to all users of Hadoop,
>
> I installed Hadoop the .deb file on Ubuntu 12.04 but I might could not
> configure it right. The conf dir is under templates in /usr/shar/hadoop. I
> edit the core-site.xml, mapred-site.xml files to give
> <property>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
> </property>
> and for mapred
> <property>
> <name>mapred.job.tracker</name>
> <value>localhost:9001</value>
> </property>
>
> but i get these errors, I assume that there is problem, Hadoop cannot read
> the configuration file.
> I chaned the hadoop-env.sh to
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but dosen't solve the problem.
>
> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java:309) at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>
> ________________________________
>
> FATAL org.apache.hadoop.mapred.JobTracker:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:2070) at
> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1889) at
> org.apache.hadoop.mapred.JobTracker.(JobTracker.java:1883) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>
> ________________________________
>
> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:536) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>
> ________________________________
>
> Exception in thread "main" java.lang.IllegalArgumentException: Does not
> contain a valid host:port authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>
> ________________________________
>
> ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
> because java.lang.IllegalArgumentException: Does not contain a valid
> host:port authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>
>
> Regards,
> Mohammad Alkahtani
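
For reference, the <property> snippets quoted above only take effect if they
sit inside a <configuration> root element and Hadoop actually reads the files.
A minimal sketch of the two files being edited, using the host and ports from
the message above, is:

core-site.xml:

<?xml version="1.0"?>
<!-- core-site.xml: where the default filesystem (HDFS) lives -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

mapred-site.xml:

<?xml version="1.0"?>
<!-- mapred-site.xml: where the JobTracker listens -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

When these files are missing, unreadable, or malformed, Hadoop falls back to
its built-in defaults, fs.default.name=file:/// and mapred.job.tracker=local,
which is exactly what the "Does not contain a valid host:port authority:
file:///" and "authority: local" errors above show.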

Re: Hadoop Debian Package

Posted by Mohammad Alkahtani <m....@gmail.com>.
I followed it, but that tutorial uses the binary package, not the Debian one.

Regards,
Mohammad Alkahtani


On 17 Mar 2013, at 08:11 PM, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Mohammad,
> 
> Maybe you can take a look here and see if anything applies to you?
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
> you might be able to directly jump to the configuration section...
> 
> JM
> 
> 2013/3/17 Mohammad Alkahtani <m....@gmail.com>:
>> Hi to all users of Hadoop,
>> 
>> I installed Hadoop from the .deb file on Ubuntu 12.04, but I might not have
>> configured it right. The conf dir is under templates in /usr/shar/hadoop. I
>> edited the core-site.xml and mapred-site.xml files to set
>> <property>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>> </property>
>> and for mapred
>> <property>
>> <name>mapred.job.tracker</name>
>> <value>localhost:9001</value>
>> </property>
>> 
>> but I get these errors, so I assume the problem is that Hadoop cannot read
>> the configuration files.
>> I changed hadoop-env.sh to
>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>> but that doesn't solve the problem.
>> 
>> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>> at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>> at
>> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>> 
>> ________________________________
>> 
>> FATAL org.apache.hadoop.mapred.JobTracker:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: local at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
>> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
>> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
>> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>> 
>> ________________________________
>> 
>> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>> java.lang.IllegalArgumentException: Does not contain a valid host:port
>> authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>> at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>> at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>> 
>> ________________________________
>> 
>> Exception in thread "main" java.lang.IllegalArgumentException: Does not
>> contain a valid host:port authority: file:/// at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>> at
>> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>> 
>> ________________________________
>> 
>> ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>> because java.lang.IllegalArgumentException: Does not contain a valid
>> host:port authority: local at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
>> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
>> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
>> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>> 
>> 
>> Regards,
>> Mohammad Alkahtani
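
Since the Debian package ships its configuration under a templates directory,
one plausible fix is to copy the template conf to a stable location and point
HADOOP_CONF_DIR at it from the shell before starting the daemons, instead of
editing hadoop-env.sh inside the template tree. A sketch only; the paths below
are assumptions based on the messages above, so adjust them to the actual
install:

# Sketch: copy the packaged template configuration to /etc/hadoop
# and make the Hadoop scripts use it (assumes /etc/hadoop does not
# already exist; paths are assumptions, not confirmed by the thread).
sudo cp -r /usr/share/hadoop/templates/conf /etc/hadoop
export HADOOP_CONF_DIR=/etc/hadoop
# verify the daemons will pick up the edited files
ls $HADOOP_CONF_DIR/core-site.xml $HADOOP_CONF_DIR/mapred-site.xml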

Re: Hadoop Debian Package

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Mohammad,

Maybe you can take a look here and see if anything applies to you?
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
you might be able to directly jump to the configuration section...

JM

2013/3/17 Mohammad Alkahtani <m....@gmail.com>:
> Hi to all users of Hadoop,
>
> I installed Hadoop from the .deb file on Ubuntu 12.04, but I might not have
> configured it right. The conf dir is under templates in /usr/shar/hadoop. I
> edited the core-site.xml and mapred-site.xml files to set
> <property>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
> </property>
> and for mapred
> <property>
> <name>mapred.job.tracker</name>
> <value>localhost:9001</value>
> </property>
>
> but I get these errors, so I assume the problem is that Hadoop cannot read
> the configuration files.
> I changed hadoop-env.sh to
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but that doesn't solve the problem.
>
> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>
> ________________________________
>
> FATAL org.apache.hadoop.mapred.JobTracker:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>
> ________________________________
>
> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>
> ________________________________
>
> Exception in thread "main" java.lang.IllegalArgumentException: Does not
> contain a valid host:port authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>
> ________________________________
>
> ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
> because java.lang.IllegalArgumentException: Does not contain a valid
> host:port authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>
>
> Regards,
> Mohammad Alkahtani
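
For a single-node setup, the operational part of tutorials like the one linked
above reduces to a few commands once HADOOP_CONF_DIR points at a valid
configuration. A sketch, assuming the Hadoop 1.x scripts that the stack traces
above suggest:

# Sketch of the usual single-node bring-up (Hadoop 1.x style)
hadoop namenode -format   # initialize HDFS metadata (first run only)
start-dfs.sh              # start NameNode, DataNode, SecondaryNameNode
start-mapred.sh           # start JobTracker and TaskTracker
jps                       # confirm the five daemons are running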

Re: Hadoop Debian Package

Posted by Luangsay Sourygna <lu...@gmail.com>.
Hi,

Which version of Hadoop are you using?

Try using fs.defaultFS instead of fs.default.name (see the list of all
the deprecated properties here:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
I remember I once had a similar error message, and it was due to the
change in property names.

Regards,

Sourygna

On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
<m....@gmail.com> wrote:
> Hi to all users of Hadoop,
>
> I installed Hadoop from the .deb file on Ubuntu 12.04, but I might not have
> configured it right. The conf dir is under templates in /usr/shar/hadoop. I
> edited the core-site.xml and mapred-site.xml files to set
> <property>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
> </property>
> and for mapred
> <property>
> <name>mapred.job.tracker</name>
> <value>localhost:9001</value>
> </property>
>
> but I get these errors, so I assume the problem is that Hadoop cannot read
> the configuration files.
> I changed hadoop-env.sh to
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but that doesn't solve the problem.
>
> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309) at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>
> ________________________________
>
> FATAL org.apache.hadoop.mapred.JobTracker:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at
> org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303) at
> org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>
> ________________________________
>
> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> java.lang.IllegalArgumentException: Does not contain a valid host:port
> authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
> at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>
> ________________________________
>
> Exception in thread "main" java.lang.IllegalArgumentException: Does not
> contain a valid host:port authority: file:/// at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
> at
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>
> ________________________________
>
> ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
> because java.lang.IllegalArgumentException: Does not contain a valid
> host:port authority: local at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164) at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130) at
> org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312) at
> org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532) at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>
>
> Regards,
> Mohammad Alkahtani
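
With the newer names from the deprecated-properties list linked above, the
same two settings would read as follows. A sketch: fs.defaultFS replaces
fs.default.name, and mapreduce.jobtracker.address is the listed replacement
for mapred.job.tracker.

core-site.xml:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

mapred-site.xml:

<configuration>
  <property>
    <name>mapreduce.jobtracker.address</name>
    <value>localhost:9001</value>
  </property>
</configuration>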
