Posted to common-user@hadoop.apache.org by Anand Murali <an...@yahoo.com> on 2015/04/22 10:46:08 UTC

Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Dear All:
Has anyone encountered this error and, if so, how have you fixed it, other than re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only fix seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


 Anand Murali
 11/7, 'Anand Vihar', Kandasamy St, Mylapore
 Chennai - 600 004, India
 Ph: (044)- 28474593/ 43526162 (voicemail)
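[Editor's note: before editing network files, it is worth checking whether the NameNode is actually up and listening on the endpoint named in the error (localhost:9000 here, which comes from fs.defaultFS in core-site.xml). A minimal sketch, assuming standard Linux tools and the Hadoop paths shown in the transcript above:]

```shell
# Is a NameNode process running at all? (jps ships with the JDK)
jps | grep -i namenode || echo "no NameNode process found"

# Is anything listening on the NameNode RPC port (9000 in this setup)?
ss -ltn 2>/dev/null | grep ':9000' || echo "nothing listening on port 9000"

# If nothing is listening, the NameNode log usually names the real cause,
# e.g. an unformatted dfs.namenode.name.dir after a fresh install:
tail -n 40 "$HOME"/hadoop-2.6.0/logs/hadoop-*-namenode-*.log 2>/dev/null || true
```

[One common cause of exactly this symptom is a NameNode that starts and then exits (for example, when `hdfs namenode -format` was never run): start-dfs.sh still reports "starting namenode", but clients then get Connection refused.]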

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Can you send me your hosts file?

On Thu, Apr 23, 2015 at 11:55 AM, Anand Murali <an...@yahoo.com>
wrote:

> Hi:
>
> I tried and was successful in changing /etc/hosts. I shut down and
> re-started and get the same error.
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> Last login: Thu Apr 23 11:18:43 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From java.net.UnknownHostException: Latitude-E5540:
> Latitude-E5540 to localhost:9000 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
> Very strange.
>
> Regards
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 11:22 AM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Many thanks my friend. Shall try it right away.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 10:51 AM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Run this command in a terminal:
>
> $ sudo nano /etc/hosts  (it will prompt for your password)
>
> Then comment out the 127.0.1.1 line in the hosts file (prefix it with #),
>
> add this line: 127.0.0.1     localhost
>
> and save the hosts file and exit.
>
>
>
> On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Sudo what, my friend? There are so many options to sudo.
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Anand,
>
> Try sudo; it will work.
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear Sandeep:
>
> many thanks. I did find hosts, but I do not have write privileges,
> even though I am administrator. This is strange. Can you please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should search the /etc directory at the filesystem root, not the Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I don't see an etc/hosts. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file will be available in the /etc directory; please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment out the IP address 127.0.1.1 in /etc/hosts and
> add the following line: 127.0.0.1  localhost
>
> Restart your Hadoop cluster after making changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error and, if so, how have you fixed it, other
> than re-installing Hadoop or re-running start-dfs.sh when it has already
> been started after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem there; rather, it seems to be an Ubuntu network
> problem. I have many times killed the namenode/datanode/secondary
> namenode, shut down and restarted, but this error still appears. The only
> fix seems to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
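[Editor's note: Sandeep's suggestion amounts to commenting out Ubuntu's default 127.0.1.1 hostname entry so that the client and the NameNode agree on an address. A sketch of that edit applied to a throwaway copy of the file (the real target is /etc/hosts and needs sudo; the hostname is taken from the thread):]

```shell
# Work on a throwaway copy; the real edit targets /etc/hosts (needs sudo).
hosts_copy=$(mktemp)
cat > "$hosts_copy" <<'EOF'
127.0.0.1       localhost
127.0.1.1       Latitude-E5540
EOF

# Comment out the 127.0.1.1 line, as suggested in the thread
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' "$hosts_copy"
cat "$hosts_copy"
```

[Note that the follow-up error in this thread (java.net.UnknownHostException: Latitude-E5540) suggests the edit left the machine's hostname with no mapping at all. Keeping the hostname on the loopback line, e.g. `127.0.0.1  localhost  Latitude-E5540` (an assumption, not something confirmed in the thread), avoids that; the cluster should be restarted after the change.]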

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
can you send me ur hosts file

On Thu, Apr 23, 2015 at 11:55 AM, Anand Murali <an...@yahoo.com>
wrote:

> Hi:
>
> I tried and was succesfull in changing etc/hosts. I shutdown and
> re-started and get the same error.
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> Last login: Thu Apr 23 11:18:43 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From java.net.UnknownHostException: Latitude-E5540:
> Latitude-E5540 to localhost:9000 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
> Very strange.
>
> Regards
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 11:22 AM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Many thanks my friend. Shall try it right away.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 10:51 AM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> run this command in the terminal from root directory
>
> $ sudo nano /etc/hosts (( It will prompt to enter root password))
>
> Later you can comment those lines in hosts files #127.0.1.1
>
> add this line 127.0.0.1     localhost
>
> save the host file and exit
>
>
>
> On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Sudo what my friend. There are so many options to sudo
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Ananad,
>
> Try sudo it will work
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear Sandeep:
>
> many thanks. I did find hosts, but I do not have write priveleges,
> eventhough I am administrator. This is strange. Can you please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should search /etc directory in root not Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I dont see a etc/host. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> hosts file will be available in /etc directory please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your hadoop cluster after made changes in /etc/hosts
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem rather it seems to be a Ubuntu network problem. I
> have many times killed nanenode/datanode/secondary data note, shutdown and
> restarted, but this error still appears. The only way seems to be
> re-installing hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>
>
>
>
>
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
can you send me ur hosts file

On Thu, Apr 23, 2015 at 11:55 AM, Anand Murali <an...@yahoo.com>
wrote:

> Hi:
>
> I tried and was succesfull in changing etc/hosts. I shutdown and
> re-started and get the same error.
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> Last login: Thu Apr 23 11:18:43 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From java.net.UnknownHostException: Latitude-E5540:
> Latitude-E5540 to localhost:9000 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
> Very strange.
>
> Regards
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 11:22 AM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Many thanks my friend. Shall try it right away.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 10:51 AM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> run this command in the terminal from root directory
>
> $ sudo nano /etc/hosts (( It will prompt to enter root password))
>
> Later you can comment those lines in hosts files #127.0.1.1
>
> add this line 127.0.0.1     localhost
>
> save the host file and exit
>
>
>
> On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Sudo what my friend. There are so many options to sudo
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Ananad,
>
> Try sudo it will work
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear Sandeep:
>
> many thanks. I did find hosts, but I do not have write priveleges,
> eventhough I am administrator. This is strange. Can you please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should search /etc directory in root not Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I dont see a etc/host. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file is in the /etc directory; please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have /etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your Hadoop cluster after making changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than re-installing Hadoop or re-running start-dfs.sh when you have already
> started it after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem; rather, it seems to be an Ubuntu network problem.
> I have many times killed the namenode/datanode/secondary namenode, shut
> down and restarted, but this error still appears. The only way out seems
> to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Can you send me your hosts file?

On Thu, Apr 23, 2015 at 11:55 AM, Anand Murali <an...@yahoo.com>
wrote:

> Hi:
>
> I tried and was successful in changing /etc/hosts. I shut down and
> restarted and got the same error.
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> Last login: Thu Apr 23 11:18:43 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From java.net.UnknownHostException: Latitude-E5540:
> Latitude-E5540 to localhost:9000 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
> Very strange.
>
> Regards
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 11:22 AM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Many thanks my friend. Shall try it right away.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Thursday, April 23, 2015 10:51 AM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Run this command in a terminal (it will prompt for your password):
>
> $ sudo nano /etc/hosts
>
> Then comment out the 127.0.1.1 line in the hosts file (#127.0.1.1),
> add this line: 127.0.0.1     localhost,
> and save the hosts file and exit.
>
>
>
> On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Sudo what, my friend? There are so many options to sudo.
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Anand,
>
> Try sudo; it will work.
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear Sandeep:
>
> Many thanks. I did find hosts, but I do not have write privileges, even
> though I am an administrator. This is strange. Can you please advise?
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should look in the /etc directory at the filesystem root, not in the Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I don't see an etc/hosts file. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file is in the /etc directory; please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have /etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your Hadoop cluster after making changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than re-installing Hadoop or re-running start-dfs.sh when you have already
> started it after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem; rather, it seems to be an Ubuntu network problem.
> I have many times killed the namenode/datanode/secondary namenode, shut
> down and restarted, but this error still appears. The only way out seems
> to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Hi:
I tried and was successful in changing /etc/hosts. I shut down and restarted and got the same error.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

Last login: Thu Apr 23 11:18:43 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From java.net.UnknownHostException: Latitude-E5540: Latitude-E5540 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 

Very strange.
Regards
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)
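[Editor's note] The error above actually shows two distinct failures: the hostname Latitude-E5540 no longer resolves (UnknownHostException) and nothing is answering on the NameNode port. A sketch of quick diagnostics using standard Linux tools (not Hadoop-specific commands) that separates the two:

```shell
# 1. UnknownHostException: check whether the local hostname resolves at all.
getent hosts "$(hostname)" || echo "hostname does not resolve - check /etc/hosts"

# 2. Connection refused: check whether anything is listening on the
#    NameNode RPC port (9000, per the error message above).
ss -tln 2>/dev/null | grep -q ':9000' && echo "port 9000 is open" \
  || echo "nothing on port 9000 - NameNode is not running"
```

If the first check fails, the /etc/hosts edit removed the only line mapping the machine's hostname; if the second fails, the NameNode process itself is down regardless of hosts entries.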


     On Thursday, April 23, 2015 11:22 AM, Anand Murali <an...@yahoo.com> wrote:
   

 Many thanks my friend. Shall try it right away.
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Thursday, April 23, 2015 10:51 AM, sandeep vura <sa...@gmail.com> wrote:
   

Run this command in a terminal (it will prompt for your password):

$ sudo nano /etc/hosts

Then comment out the 127.0.1.1 line in the hosts file (#127.0.1.1),
add this line: 127.0.0.1     localhost,
and save the hosts file and exit.
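[Editor's note] For reference, a minimal sketch of what the resulting /etc/hosts could look like; the hostname Latitude-E5540 is taken from the shell prompts in this thread, and the contents are written to a scratch file here rather than the real /etc/hosts:

```shell
# Sketch of the suggested hosts contents, written to a scratch file so it
# can be reviewed before being copied (with sudo) over the real /etc/hosts.
cat <<'EOF' > hosts.example
127.0.0.1   localhost
127.0.0.1   Latitude-E5540
# 127.0.1.1 Latitude-E5540   (commented out, as suggested above)
EOF

# Both localhost and the machine's hostname now map to 127.0.0.1:
grep -c '^127.0.0.1' hosts.example
```

Keeping the machine's own hostname on a 127.0.0.1 line matters here: removing the 127.0.1.1 entry without re-adding the hostname elsewhere is what produces the UnknownHostException seen later in the thread.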



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

Sudo what, my friend? There are so many options to sudo.
Sent from my iPhone
On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:


Anand,
Try sudo; it will work.
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:

Can you try sudo?
https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise?
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should look in the /etc directory at the filesystem root, not in the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an etc/hosts file. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file is in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have /etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
comment the ip address - 127.0.1.1 in /etc/hosts
add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
Restart your Hadoop cluster after making changes to /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it other than re-installing Hadoop or re-running start-dfs.sh when you have already started it after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix to the problem; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)
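[Editor's note] Before re-installing, the NameNode log named in the start-dfs.sh output above usually states why the daemon exited. "Connection refused" on port 9000 immediately after a clean-looking start-dfs.sh often means the NameNode died during startup, commonly because HDFS was never formatted. A sketch of that check; the log directory is assumed from the session transcript above:

```shell
# Look for the NameNode's last logged error instead of re-installing.
LOG_DIR="$HOME/hadoop-2.6.0/logs"   # assumed from the start-dfs.sh output above
if ls "$LOG_DIR"/*namenode*.log >/dev/null 2>&1; then
  # Print the first line mentioning an error or exception, if any.
  grep -i -m1 -E 'ERROR|Exception' "$LOG_DIR"/*namenode*.log \
    || echo "no errors logged"
else
  echo "no namenode log found under $LOG_DIR"
fi
# If the log says the NameNode is not formatted, format it once:
#   hdfs namenode -format
```

Note that formatting erases any existing HDFS data, so it is only appropriate on a fresh pseudo-distributed install like the one described here.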







   



   








   

  

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Hi:
I tried and was succesfull in changing etc/hosts. I shutdown and re-started and get the same error.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

Last login: Thu Apr 23 11:18:43 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From java.net.UnknownHostException: Latitude-E5540: Latitude-E5540 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 

Very strange.
Regards
 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Thursday, April 23, 2015 11:22 AM, Anand Murali <an...@yahoo.com> wrote:
   

 Many thanks my friend. Shall try it right away.
 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Thursday, April 23, 2015 10:51 AM, sandeep vura <sa...@gmail.com> wrote:
   

 run this command in the terminal from root directory 

$ sudo nano /etc/hosts (( It will prompt to enter root password)) 

Later you can comment those lines in hosts files #127.0.1.1 

add this line 127.0.0.1     localhost

save the host file and exit 



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

Sudo what my friend. There are so many options to sudo 
Sent from my iPhone
On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:


Ananad,
Try sudo it will work 
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:

Can you try sudo?https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

Dear Sandeep:
many thanks. I did find hosts, but I do not have write priveleges, eventhough I am administrator. This is strange. Can you please advise.
Thanks
 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should search /etc directory in root not Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I dont see a etc/host. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


hosts file will be available in /etc directory please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have /etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts and add the following entry: 127.0.0.1  localhost.
Restart your Hadoop cluster after making the changes in /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it, other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu networking problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
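Not part of the original mail, but as a generic first check before touching /etc/hosts, it is worth confirming whether a NameNode is actually running and listening on port 9000. The port comes from the error message above; the commands below are standard Linux tools, a triage sketch rather than a Hadoop-specific recipe:

```shell
#!/bin/sh
# Generic "Connection refused" triage for a pseudo-distributed HDFS.
# Port 9000 is taken from the error message in this thread; everything
# else is standard Linux tooling, not specific to this installation.

diagnose() {
    # 1. Is a NameNode JVM running at all?
    if pgrep -f "NameNode" >/dev/null 2>&1; then
        echo "NameNode process found"
    else
        echo "no NameNode process - check the namenode .log file under logs/"
    fi

    # 2. Is anything listening on the fs.defaultFS port?
    if command -v ss >/dev/null 2>&1; then
        if ss -tln | grep -q ':9000 '; then
            echo "port 9000 is listening"
        else
            echo "nothing is listening on port 9000"
        fi
    fi
}

diagnose
```

If the NameNode process is missing, the .log file (not the .out file) under the logs/ directory usually names the real cause, e.g. an unformatted or unwritable dfs.namenode.name.dir.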
Many thanks,
Regards,


Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Hi:
I tried and was successful in changing /etc/hosts. I shut down, restarted, and get the same error.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

Last login: Thu Apr 23 11:18:43 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From java.net.UnknownHostException: Latitude-E5540: Latitude-E5540 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 

Very strange.
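(A side note, not from the original thread: the switch from "Connection refused" at 127.0.1.1 to java.net.UnknownHostException for Latitude-E5540 suggests the 127.0.1.1 line was commented out without keeping any mapping for the machine's hostname. A minimal /etc/hosts for this setup, assuming the hostname really is Latitude-E5540, would keep both entries:)

```
127.0.0.1   localhost
127.0.1.1   Latitude-E5540
```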
Regards
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Thursday, April 23, 2015 11:22 AM, Anand Murali <an...@yahoo.com> wrote:
   

Many thanks, my friend. Shall try it right away.
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Thursday, April 23, 2015 10:51 AM, sandeep vura <sa...@gmail.com> wrote:
   

Run this command in the terminal (it will prompt for your password):

$ sudo nano /etc/hosts

Then comment out the 127.0.1.1 line in the hosts file by prefixing it with #,

add this line: 127.0.0.1     localhost

and save the hosts file and exit.
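After saving, the edit can be sanity-checked without rebooting (a generic sketch; getent is part of glibc and present on Ubuntu):

```shell
# Confirm that localhost resolves, and that the machine's own hostname
# still resolves (a missing hostname mapping is what later produces
# java.net.UnknownHostException).
getent hosts localhost
getent hosts "$(hostname)" || echo "hostname does not resolve - keep a line for it in /etc/hosts"
```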



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

Sudo what, my friend? There are so many options to sudo.
Sent from my iPhone
On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:


Anand,
Try sudo; it will work.
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:

Can you try sudo? https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though I am administrator. This is strange. Can you please advise?
Thanks
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should search the /etc directory under root, not the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an /etc/hosts file. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file will be available in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have /etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts and add the following entry: 127.0.0.1  localhost.
Restart your Hadoop cluster after making the changes in /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it, other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu networking problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)



Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Many thanks, my friend. Shall try it right away.
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Thursday, April 23, 2015 10:51 AM, sandeep vura <sa...@gmail.com> wrote:
   

Run this command in the terminal (it will prompt for your password):

$ sudo nano /etc/hosts

Then comment out the 127.0.1.1 line in the hosts file by prefixing it with #,

add this line: 127.0.0.1     localhost

and save the hosts file and exit.



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

Sudo what, my friend? There are so many options to sudo.
Sent from my iPhone
On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:


Anand,
Try sudo; it will work.
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:

Can you try sudo? https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though I am administrator. This is strange. Can you please advise?
Thanks
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should search the /etc directory under root, not the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an /etc/hosts file. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file will be available in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have /etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts and add the following entry: 127.0.0.1  localhost.
Restart your Hadoop cluster after making the changes in /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it, other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix for the problem there; rather, it seems to be an Ubuntu networking problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
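For what it's worth, the localhost:9000 in the failed call comes from fs.defaultFS in etc/hadoop/core-site.xml; a typical pseudo-distributed setting (a sketch of a common configuration, not necessarily the exact file in use here) looks like:

```xml
<!-- core-site.xml: fs.defaultFS gives clients the NameNode RPC address.
     "Connection refused" on this port usually means the NameNode process
     is not running or the host name resolves to the wrong interface. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```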
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Many thanks my friend. Shall try it right away.
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Thursday, April 23, 2015 10:51 AM, sandeep vura <sa...@gmail.com> wrote:
   

Run this command in a terminal:

$ sudo nano /etc/hosts    (it will prompt for your password)

Then comment out the 127.0.1.1 line in the hosts file: #127.0.1.1

and add this line: 127.0.0.1     localhost

Save the hosts file and exit.



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

Sudo what, my friend? There are so many options to sudo.
Sent from my iPhone
On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:


Anand,
Try sudo; it will work.
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:

Can you try sudo? https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise?
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should search the /etc directory under the filesystem root, not the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an etc/hosts. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file will be available in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts and add the following line: 127.0.0.1  localhost.
Restart your Hadoop cluster after making the changes in /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it other than re-installing Hadoop or re-running start-dfs.sh when you have already started it after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix for the problem there; rather, it seems to be an Ubuntu networking problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Run this command in a terminal:

$ sudo nano /etc/hosts    (it will prompt for your password)

Then comment out the 127.0.1.1 line in the hosts file: #127.0.1.1

and add this line: 127.0.0.1     localhost

Save the hosts file and exit.
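A quick way to confirm the edit took effect (an extra check, not part of the original instructions) is to query the resolver:

```shell
# localhost should resolve to the loopback address after the edit.
getent hosts localhost
```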



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

> Sudo what, my friend? There are so many options to sudo.
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Anand,
>
> Try sudo; it will work.
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
>> Can you try sudo?
>>
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>>
>> Regards,
>> Shahab
>>
>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>>> Dear Sandeep:
>>>
>>> many thanks. I did find hosts, but I do not have write privileges,
>>> even though I am an administrator. This is strange. Can you please advise.
>>>
>>> Thanks
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>>> sandeepvura@gmail.com> wrote:
>>>
>>>
>>> Hi Anand,
>>>
>>> You should search /etc directory in root not Hadoop directory.
>>>
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>>> wrote:
>>>
>>> Dear All:
>>>
>>> I dont see a etc/host. Find below.
>>>
>>>
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>>> capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>>> container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>>> hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>>> hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>>> httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>>> httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>>> kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>>> mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>>> mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>>> ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>>> ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>>
>>> Thanks.
>>>
>>> Regards,
>>>
>>>
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
>>> anand_vihar@yahoo.com> wrote:
>>>
>>>
>>> Ok thanks will do
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>
>>> The hosts file is in the /etc directory; please check there.
>>>
>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
>>> wrote:
>>>
>>> I don't seem to have etc/host
>>>
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>
>>> Hi Anand,
>>>
>>> Comment out the line with the IP address 127.0.1.1 in /etc/hosts,
>>> and add the line "127.0.0.1  localhost" to /etc/hosts.
>>>
>>> Restart your Hadoop cluster after making these changes to /etc/hosts.
>>>
>>> Regards,
>>> Sandeep.v
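[Editor's note: to make the suggestion above concrete — Ubuntu maps the machine's hostname to 127.0.1.1 by default, while the client's own hostname lookup and `localhost` can then disagree. The small parser below is only a sketch to illustrate the before/after state of /etc/hosts (it is not a Hadoop API; the hostname is taken from the session in this thread):]

```python
def hosts_mapping(text):
    """Parse /etc/hosts-style text into a {name: ip} dict, skipping comments."""
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        ip, *names = line.split()
        for name in names:
            mapping[name] = ip
    return mapping

# Default Ubuntu layout: the hostname resolves to 127.0.1.1.
before = "127.0.0.1 localhost\n127.0.1.1 Latitude-E5540\n"
# Suggested fix: comment out the 127.0.1.1 line, keep localhost on 127.0.0.1.
after = "127.0.0.1 localhost\n#127.0.1.1 Latitude-E5540\n"

print(hosts_mapping(before))  # {'localhost': '127.0.0.1', 'Latitude-E5540': '127.0.1.1'}
print(hosts_mapping(after))   # {'localhost': '127.0.0.1'}
```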
>>>
>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
>>> wrote:
>>>
>>> Dear All:
>>>
>>> Has anyone encountered this error, and if so, how have you fixed it other
>>> than re-installing Hadoop or re-running start-dfs.sh when you have already
>>> started it after boot? Details below:
>>>
>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>
>>>  * Documentation:  https://help.ubuntu.com/
>>>
>>> 1 package can be updated.
>>> 1 update is a security update.
>>>
>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>> /home/anand_vihar/hadoop-2.6.0
>>> /home/anand_vihar/jdk1.7.0_75
>>> /home/anand_vihar/hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>> Hadoop 2.6.0
>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>> Compiled by jenkins on 2014-11-13T21:10Z
>>> Compiled with protoc 2.5.0
>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>> This command was run using
>>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>> Starting namenodes on [localhost]
>>> localhost: starting namenode, logging to
>>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>> localhost: starting datanode, logging to
>>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>> Starting secondary namenodes [0.0.0.0]
>>> 0.0.0.0: starting secondarynamenode, logging to
>>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>>> connection exception: java.net.ConnectException: Connection refused; For
>>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>>
>>>
>>>
>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but
>>> there is no fix to the problem there; rather, it seems to be an Ubuntu network
>>> problem. I have many times killed the namenode/datanode/secondary namenode,
>>> shut down and restarted, but this error still appears. The only fix seems to
>>> be re-installing Hadoop. Please advise or refer.
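[Editor's note: "Connection refused" simply means nothing was listening on the NameNode's RPC port, so the problem can be isolated from Hadoop entirely with a plain TCP probe. The sketch below is an assumption-laden illustration, not a Hadoop tool; port 9000 comes from the fs.defaultFS shown in the error message above:]

```python
import socket

def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds,
    False on refusal or timeout (what the HDFS client sees as ConnectException)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If this prints False even though start-dfs.sh reported success, the
# NameNode process died after launch -- check the namenode .log file in
# the logs directory, not just the .out file named by start-dfs.sh.
print(can_connect("localhost", 9000))
```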
>>>
>>> Many thanks,
>>>
>>> Regards,
>>>
>>>
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>


Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
run this command in the terminal from root directory

$ sudo nano /etc/hosts (( It will prompt to enter root password))

Later you can comment those lines in hosts files #127.0.1.1

add this line 127.0.0.1     localhost

save the host file and exit



On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <an...@yahoo.com> wrote:

> Sudo what my friend. There are so many options to sudo
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
>
> Ananad,
>
> Try sudo it will work
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
> wrote:
>
>> Can you try sudo?
>>
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>>
>> Regards,
>> Shahab
>>
>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>>> Dear Sandeep:
>>>
>>> many thanks. I did find hosts, but I do not have write priveleges,
>>> eventhough I am administrator. This is strange. Can you please advise.
>>>
>>> Thanks
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>>> sandeepvura@gmail.com> wrote:
>>>
>>>
>>> Hi Anand,
>>>
>>> You should search /etc directory in root not Hadoop directory.
>>>
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>>> wrote:
>>>
>>> Dear All:
>>>
>>> I dont see a etc/host. Find below.
>>>
>>>
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>>> capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>>> container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>>> hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>>> hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>>> httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>>> httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>>> kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>>> mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>>> mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>>> ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>>> ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>>
>>> Thanks.
>>>
>>> Regards,
>>>
>>>
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>>
>>>
>>> Ok thanks, will do
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>
>>> The hosts file will be available in the /etc directory; please check once.
>>>
>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>>>
>>> I don't seem to have /etc/hosts
>>>
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>
>>> Hi Anand,
>>>
>>> comment out the IP address 127.0.1.1 in /etc/hosts and
>>> add the following entry - 127.0.0.1  localhost - in /etc/hosts.
>>>
>>> Restart your Hadoop cluster after making the changes in /etc/hosts.
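The change suggested above can be rehearsed on a copy of the file before touching the real one. This is a sketch assuming the stock Ubuntu layout, where 127.0.1.1 maps the machine's hostname (on the poster's box, Latitude-E5540, which is exactly the address shown in the "Call From Latitude-E5540/127.0.1.1" error):

```shell
# Work on a copy of /etc/hosts so the edit can be reviewed before installing it.
cp /etc/hosts /tmp/hosts.new

# Comment out the 127.0.1.1 line Ubuntu adds for the local hostname.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.new

# Make sure localhost resolves to 127.0.0.1.
grep -q '^127\.0\.0\.1[[:space:]]\+localhost' /tmp/hosts.new || \
    printf '127.0.0.1\tlocalhost\n' >> /tmp/hosts.new

# After reviewing /tmp/hosts.new, install it and restart HDFS:
#   sudo cp /tmp/hosts.new /etc/hosts
#   stop-dfs.sh && start-dfs.sh
```

The NameNode binds using the address that `localhost` resolves to at startup, which is why a stale 127.0.1.1 mapping can leave clients dialing an address nothing listens on.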
>>>
>>> Regards,
>>> Sandeep.v
>>>
>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>
>>> Dear All:
>>>
>>> Has anyone encountered this error, and if so, how have you fixed it other
>>> than re-installing Hadoop or re-running start-dfs.sh when it has already
>>> been started after boot? Find below:
>>>
>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>
>>>  * Documentation:  https://help.ubuntu.com/
>>>
>>> 1 package can be updated.
>>> 1 update is a security update.
>>>
>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>> /home/anand_vihar/hadoop-2.6.0
>>> /home/anand_vihar/jdk1.7.0_75
>>> /home/anand_vihar/hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>> Hadoop 2.6.0
>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>> Compiled by jenkins on 2014-11-13T21:10Z
>>> Compiled with protoc 2.5.0
>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>> Starting namenodes on [localhost]
>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>> Starting secondary namenodes [0.0.0.0]
>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>>
>>>
>>>
>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
>>> is no fix for the problem; rather, it seems to be an Ubuntu network
>>> problem. I have killed the namenode/datanode/secondary namenode many
>>> times, shut down and restarted, but this error still appears. The only
>>> way out seems to be re-installing Hadoop. Please advise or refer.
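"Connection refused" means nothing is listening on localhost:9000, i.e. the NameNode is not actually up even though start-dfs.sh reported launching it. Before reinstalling anything, the port can be probed directly. A sketch using bash's /dev/tcp redirection; the host and port (localhost, 9000) come from fs.defaultFS in this thread's core-site.xml, so substitute your own values:

```shell
#!/bin/bash
# Probe a TCP port the same way the HDFS client does when dialing the NameNode.
check_port() {    # usage: check_port HOST PORT -> prints "open" or "closed"
    # The subshell opens the socket on fd 3 and closes it automatically on exit;
    # a refused connection makes the subshell fail.
    if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
        echo "open"
    else
        echo "closed"
    fi
}

check_port localhost 9000
# "closed" reproduces the ConnectException. In that case check whether the
# NameNode JVM is running at all (jps), and read the startup failure in
# $HADOOP_HOME/logs/hadoop-*-namenode-*.log - a common cause is a metadata
# directory that was never formatted with 'hdfs namenode -format'.
```

If the port is closed, the /etc/hosts advice earlier in the thread is one candidate fix, but the NameNode log states the real reason it exited.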
>>>
>>> Many thanks,
>>>
>>> Regards,
>>>
>>>
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Sudo what, my friend? There are so many options to sudo.

Sent from my iPhone

> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
> 
> Anand,
> 
> Try sudo; it will work.
> 
>> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:
>> Can you try sudo?
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>> 
>> Regards,
>> Shahab
>> 
>>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear Sandeep:
>>> 
>>> Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise.
>>> 
>>> Thanks
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
>>> 
>>> 
>>> Hi Anand,
>>> 
>>> You should search the /etc directory under the filesystem root, not the Hadoop directory.
>>> 
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear All:
>>> 
>>> I don't see an /etc/hosts. Find below.
>>> 
>>> 
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 
>>> 
>>> Thanks.
>>> 
>>> Regards,
>>> 
>>> 
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
>>> 
>>> 
>>> Ok thanks will do
>>> 
>>> Sent from my iPhone
>>> 
>>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>> 
>>>> The hosts file will be available in the /etc directory; please check once.
>>>> 
>>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>>>> I don't seem to have /etc/hosts
>>>> 
>>>> 
>>>> Sent from my iPhone
>>>> 
>>>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>>> 
>>>>> Hi Anand,
>>>>> 
>>>>> comment out the IP address 127.0.1.1 in /etc/hosts and
>>>>> add the following entry - 127.0.0.1  localhost - in /etc/hosts.
>>>>>
>>>>> Restart your Hadoop cluster after making the changes in /etc/hosts.
>>>>> 
>>>>> Regards,
>>>>> Sandeep.v
>>>>> 
>>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>>> Dear All:
>>>>> 
>>>>> Has anyone encountered this error, and if so, how have you fixed it other than re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
>>>>> 
>>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>> 
>>>>>  * Documentation:  https://help.ubuntu.com/
>>>>> 
>>>>> 1 package can be updated.
>>>>> 1 update is a security update.
>>>>> 
>>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> /home/anand_vihar/jdk1.7.0_75
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>>> Hadoop 2.6.0
>>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>>> Compiled with protoc 2.5.0
>>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>>> Starting namenodes on [localhost]
>>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>>> Starting secondary namenodes [0.0.0.0]
>>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>>>>> 
>>>>> 
>>>>> 
>>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix for the problem; rather, it seems to be an Ubuntu network problem. I have killed the namenode/datanode/secondary namenode many times, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
>>>>> 
>>>>> Many thanks,
>>>>> 
>>>>> Regards,
>>>>> 
>>>>> 
>>>>>  
>>>>> Anand Murali  
>>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>>> Chennai - 600 004, India
>>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Sudo what my friend. There are so many options to sudo 

Sent from my iPhone

> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
> 
> Ananad,
> 
> Try sudo it will work 
> 
>> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:
>> Can you try sudo?
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>> 
>> Regards,
>> Shahab
>> 
>>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear Sandeep:
>>> 
>>> many thanks. I did find hosts, but I do not have write priveleges, eventhough I am administrator. This is strange. Can you please advise.
>>> 
>>> Thanks
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
>>> 
>>> 
>>> Hi Anand,
>>> 
>>> You should search /etc directory in root not Hadoop directory.
>>> 
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear All:
>>> 
>>> I dont see a etc/host. Find below.
>>> 
>>> 
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 
>>> 
>>> Thanks.
>>> 
>>> Regards,
>>> 
>>> 
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
>>> 
>>> 
>>> Ok thanks will do
>>> 
>>> Sent from my iPhone
>>> 
>>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>> 
>>>> hosts file will be available in /etc directory please check once.
>>>> 
>>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>>>> I don't seem to have etc/host
>>>> 
>>>> 
>>>> Sent from my iPhone
>>>> 
>>>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>>> 
>>>>> Hi Anand,
>>>>> 
>>>>> comment the ip address - 127.0.1.1 in /etc/hosts
>>>>> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>>>>> 
>>>>> Restart your hadoop cluster after made changes in /etc/hosts
>>>>> 
>>>>> Regards,
>>>>> Sandeep.v
>>>>> 
>>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>>> Dear All:
>>>>> 
>>>>> Has anyone encountered this error and if so how have you fixed it other then re-installing Hadoop or re-starting start-dfs.sh when you have already started after boot. Find below
>>>>> 
>>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>> 
>>>>>  * Documentation:  https://help.ubuntu.com/
>>>>> 
>>>>> 1 package can be updated.
>>>>> 1 update is a security update.
>>>>> 
>>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> /home/anand_vihar/jdk1.7.0_75
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>>> Hadoop 2.6.0
>>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>>> Compiled with protoc 2.5.0
>>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>>> Starting namenodes on [localhost]
>>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>>> Starting secondary namenodes [0.0.0.0]
>>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>>>>> 
>>>>> 
>>>>> 
>>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix to the problem rather it seems to be a Ubuntu network problem. I have many times killed nanenode/datanode/secondary data note, shutdown and restarted, but this error still appears. The only way seems to be re-installing hadoop. Please advise or refer.
>>>>> 
>>>>> Many thanks,
>>>>> 
>>>>> Regards,
>>>>> 
>>>>> 
>>>>>  
>>>>> Anand Murali  
>>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>>> Chennai - 600 004, India
>>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Sudo what my friend. There are so many options to sudo 

Sent from my iPhone

> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
> 
> Ananad,
> 
> Try sudo it will work 
> 
>> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:
>> Can you try sudo?
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>> 
>> Regards,
>> Shahab
>> 
>>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear Sandeep:
>>> 
>>> many thanks. I did find hosts, but I do not have write priveleges, eventhough I am administrator. This is strange. Can you please advise.
>>> 
>>> Thanks
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
>>> 
>>> 
>>> Hi Anand,
>>> 
>>> You should search /etc directory in root not Hadoop directory.
>>> 
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear All:
>>> 
>>> I dont see a etc/host. Find below.
>>> 
>>> 
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 
>>> 
>>> Thanks.
>>> 
>>> Regards,
>>> 
>>> 
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
>>> 
>>> 
>>> Ok thanks will do
>>> 
>>> Sent from my iPhone
>>> 
>>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>> 
>>>> hosts file will be available in /etc directory please check once.
>>>> 
>>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>>>> I don't seem to have etc/host
>>>> 
>>>> 
>>>> Sent from my iPhone
>>>> 
>>>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>>> 
>>>>> Hi Anand,
>>>>> 
>>>>> comment the ip address - 127.0.1.1 in /etc/hosts
>>>>> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>>>>> 
>>>>> Restart your hadoop cluster after made changes in /etc/hosts
>>>>> 
>>>>> Regards,
>>>>> Sandeep.v
>>>>> 
>>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>>> Dear All:
>>>>> 
>>>>> Has anyone encountered this error and if so how have you fixed it other then re-installing Hadoop or re-starting start-dfs.sh when you have already started after boot. Find below
>>>>> 
>>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>> 
>>>>>  * Documentation:  https://help.ubuntu.com/
>>>>> 
>>>>> 1 package can be updated.
>>>>> 1 update is a security update.
>>>>> 
>>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> /home/anand_vihar/jdk1.7.0_75
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>>> Hadoop 2.6.0
>>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>>> Compiled with protoc 2.5.0
>>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>>> Starting namenodes on [localhost]
>>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>>> Starting secondary namenodes [0.0.0.0]
>>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>>>>> 
>>>>> 
>>>>> 
>>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix to the problem rather it seems to be a Ubuntu network problem. I have many times killed nanenode/datanode/secondary data note, shutdown and restarted, but this error still appears. The only way seems to be re-installing hadoop. Please advise or refer.
>>>>> 
>>>>> Many thanks,
>>>>> 
>>>>> Regards,
>>>>> 
>>>>> 
>>>>>  
>>>>> Anand Murali  
>>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>>> Chennai - 600 004, India
>>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Sudo what, my friend? There are so many options to sudo.

Sent from my iPhone

> On 23-Apr-2015, at 8:20 am, sandeep vura <sa...@gmail.com> wrote:
> 
> Anand,
> 
> Try sudo; it will work.
> 
>> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com> wrote:
>> Can you try sudo?
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>> 
>> Regards,
>> Shahab
>> 
>>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear Sandeep:
>>> 
>>> Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise?
>>> 
>>> Thanks
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
>>> 
>>> 
>>> Hi Anand,
>>> 
>>> You should search the /etc directory at the filesystem root, not the Hadoop directory.
>>> 
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:
>>> Dear All:
>>> 
>>> I don't see an etc/hosts. Find below.
>>> 
>>> 
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 
>>> 
>>> Thanks.
>>> 
>>> Regards,
>>> 
>>> 
>>>  
>>> Anand Murali  
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>> 
>>> 
>>> 
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
>>> 
>>> 
>>> Ok thanks will do
>>> 
>>> Sent from my iPhone
>>> 
>>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>>> 
>>>> The hosts file will be available in the /etc directory; please check once.
>>>> 
>>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>>>> I don't seem to have etc/hosts
>>>> 
>>>> 
>>>> Sent from my iPhone
>>>> 
>>>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>>>> 
>>>>> Hi Anand,
>>>>> 
>>>>> Comment out the IP address 127.0.1.1 in /etc/hosts and
>>>>> add the following entry: 127.0.0.1  localhost.
>>>>> 
>>>>> Restart your Hadoop cluster after making changes to /etc/hosts.
>>>>> 
>>>>> Regards,
>>>>> Sandeep.v
>>>>> 
>>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>>> Dear All:
>>>>> 
>>>>> Has anyone encountered this error and, if so, how have you fixed it other than re-installing Hadoop or re-starting start-dfs.sh when you have already started after boot? Find below.
>>>>> 
>>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>> 
>>>>>  * Documentation:  https://help.ubuntu.com/
>>>>> 
>>>>> 1 package can be updated.
>>>>> 1 update is a security update.
>>>>> 
>>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> /home/anand_vihar/jdk1.7.0_75
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>>> Hadoop 2.6.0
>>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>>> Compiled with protoc 2.5.0
>>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>>> Starting namenodes on [localhost]
>>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>>> Starting secondary namenodes [0.0.0.0]
>>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>>>>> 
>>>>> 
>>>>> 
>>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there is no fix to the problem; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
>>>>> 
>>>>> Many thanks,
>>>>> 
>>>>> Regards,
>>>>> 
>>>>> 
>>>>>  
>>>>> Anand Murali  
>>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>>> Chennai - 600 004, India
>>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 
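The /etc/hosts change suggested in the quoted thread can be rehearsed on a scratch copy before touching the real file with sudo. A sketch (the two entries mirror a default Ubuntu hosts file; the hostname is taken from the shell prompt above):

```shell
# Reproduce the relevant default-Ubuntu entries in a scratch file,
# so nothing real is modified while checking the edit.
printf '127.0.0.1\tlocalhost\n127.0.1.1\tLatitude-E5540\n' > /tmp/hosts.demo

# Comment out the 127.0.1.1 entry so the machine hostname stops
# resolving to an address the NameNode is not bound to.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.demo

cat /tmp/hosts.demo
# Apply for real with:  sudo sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /etc/hosts
# then restart HDFS:    stop-dfs.sh && start-dfs.sh
```

The sudo step is what the write-permission complaint earlier in the thread is about: /etc/hosts is root-owned, so even an administrator account must edit it through sudo.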

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Anand,

Try sudo; it will work.

On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
wrote:

> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
>> Dear Sandeep:
>>
>> Many thanks. I did find hosts, but I do not have write privileges,
>> even though I am an administrator. This is strange. Can you please advise?
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>> sandeepvura@gmail.com> wrote:
>>
>>
>> Hi Anand,
>>
>> You should search the /etc directory at the filesystem root, not the Hadoop directory.
>>
>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> I don't see an etc/hosts. Find below.
>>
>>
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>> total 76
>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>> total 12
>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>> total 176
>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>> capacity-scheduler.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>> container-executor.cfg
>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>> hadoop-metrics2.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>> hadoop-metrics.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>> httpfs-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>> httpfs-signature.secret
>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>> kms-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>> mapred-queues.xml.template
>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>> mapred-site.xml.template~
>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>> ssl-client.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>> ssl-server.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>> localhost
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>
>> Thanks.
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
>> anand_vihar@yahoo.com> wrote:
>>
>>
>> Ok thanks will do
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> The hosts file will be available in the /etc directory; please check once.
>>
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> I don't seem to have etc/hosts
>>
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> Hi Anand,
>>
>> Comment out the IP address 127.0.1.1 in /etc/hosts and
>> add the following entry: 127.0.0.1  localhost.
>>
>> Restart your Hadoop cluster after making changes to /etc/hosts.
>>
>> Regards,
>> Sandeep.v
>>
>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> Has anyone encountered this error and, if so, how have you fixed it other
>> than re-installing Hadoop or re-starting start-dfs.sh when you have already
>> started after boot? Find below.
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
>> is no fix to the problem; rather, it seems to be an Ubuntu network problem. I
>> have many times killed the namenode/datanode/secondary namenode, shut down and
>> restarted, but this error still appears. The only way out seems to be
>> re-installing Hadoop. Please advise or refer.
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>
>>
>>
>>
>>
>>
>
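Two further checks worth running before blaming Ubuntu networking: confirm where the client is told to connect, and read the NameNode's own log. A diagnostic sketch, assuming the install layout shown in the thread ($HOME/hadoop-2.6.0); both steps degrade gracefully when no Hadoop install is present:

```shell
#!/usr/bin/env bash
# Sketch only: paths follow the ~/hadoop-2.6.0 layout from the thread.
HADOOP_HOME="$HOME/hadoop-2.6.0"

# 1. The client dials localhost:9000 because fs.defaultFS in
#    core-site.xml says so; show the configured value.
conf=$(grep -A1 'fs.defaultFS' "$HADOOP_HOME/etc/hadoop/core-site.xml" 2>/dev/null \
       || echo "core-site.xml not found here")
echo "$conf"

# 2. If the NameNode exits right after start-dfs.sh, its log says why
#    (unformatted namenode dir, bind failure, hostname resolution, ...).
log=$(tail -n 20 "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log 2>/dev/null \
      || echo "no namenode log found here")
echo "$log"
```

A NameNode that keeps dying on start-up is the usual reason the error survives every restart; the log, not a Hadoop re-install, is where that shows up.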

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Ananad,

Try sudo it will work

On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
wrote:

> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
>> Dear Sandeep:
>>
>> many thanks. I did find hosts, but I do not have write priveleges,
>> eventhough I am administrator. This is strange. Can you please advise.
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>> sandeepvura@gmail.com> wrote:
>>
>>
>> Hi Anand,
>>
>> You should search /etc directory in root not Hadoop directory.
>>
>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> I dont see a etc/host. Find below.
>>
>>
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>> total 76
>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>> total 12
>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>> total 176
>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>> capacity-scheduler.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>> container-executor.cfg
>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>> hadoop-metrics2.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>> hadoop-metrics.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>> httpfs-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>> httpfs-signature.secret
>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>> kms-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>> mapred-queues.xml.template
>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>> mapred-site.xml.template~
>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>> ssl-client.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>> ssl-server.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>> localhost
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>
>> Thanks.
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
>> anand_vihar@yahoo.com> wrote:
>>
>>
>> Ok thanks will do
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> hosts file will be available in /etc directory please check once.
>>
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> I don't seem to have etc/host
>>
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> Hi Anand,
>>
>> comment the ip address - 127.0.1.1 in /etc/hosts
>> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>>
>> Restart your hadoop cluster after made changes in /etc/hosts
>>
>> Regards,
>> Sandeep.v
>>
>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> Has anyone encountered this error and if so how have you fixed it other
>> then re-installing Hadoop or re-starting start-dfs.sh when you have already
>> started after boot. Find below
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
>> is no fix to the problem rather it seems to be a Ubuntu network problem. I
>> have many times killed nanenode/datanode/secondary data note, shutdown and
>> restarted, but this error still appears. The only way seems to be
>> re-installing hadoop. Please advise or refer.
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>
>>
>>
>>
>>
>>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Ananad,

Try sudo it will work

On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
wrote:

> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
>> Dear Sandeep:
>>
>> many thanks. I did find hosts, but I do not have write priveleges,
>> eventhough I am administrator. This is strange. Can you please advise.
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>> sandeepvura@gmail.com> wrote:
>>
>>
>> Hi Anand,
>>
>> You should search /etc directory in root not Hadoop directory.
>>
>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> I dont see a etc/host. Find below.
>>
>>
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>> total 76
>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>> total 12
>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>> total 176
>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>> capacity-scheduler.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>> container-executor.cfg
>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>> hadoop-metrics2.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>> hadoop-metrics.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>> httpfs-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>> httpfs-signature.secret
>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>> kms-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>> mapred-queues.xml.template
>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>> mapred-site.xml.template~
>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>> ssl-client.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>> ssl-server.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>> localhost
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>
>> Thanks.
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
>> anand_vihar@yahoo.com> wrote:
>>
>>
>> Ok thanks will do
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> The hosts file will be in the /etc directory; please check once.
>>
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> I don't seem to have etc/host
>>
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> Hi Anand,
>>
>> Comment out the IP address 127.0.1.1 in /etc/hosts
>> and add the entry 127.0.0.1  localhost to /etc/hosts.
>>
>> Restart your Hadoop cluster after making the changes in /etc/hosts.
>>
>> Regards,
>> Sandeep.v
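
[Editor's note: the /etc/hosts change suggested above can be previewed safely on a copy before touching the real file. The sketch below builds a sample file mirroring the default Ubuntu layout; the hostname Latitude-E5540 is taken from the poster's prompt and should be adjusted for your machine.]

```shell
# Preview of the suggested /etc/hosts fix, applied to a sample copy first.
# The hostname (Latitude-E5540) comes from this thread; adjust for your box.
printf '127.0.0.1\tlocalhost\n127.0.1.1\tLatitude-E5540\n' > /tmp/hosts.sample

# Comment out the 127.0.1.1 line so the hostname no longer resolves to an
# address the NameNode is not bound to:
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.sample

cat /tmp/hosts.sample
# Once the preview looks right, apply the same edit to the real file:
#   sudo sed -i.bak 's/^127\.0\.1\.1/# 127.0.1.1/' /etc/hosts
```

After the real file is edited, restart the daemons (stop-dfs.sh, then start-dfs.sh) so they pick up the new resolution.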
>>
>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> Has anyone encountered this error, and if so, how have you fixed it other
>> than re-installing Hadoop or re-running start-dfs.sh when it has already
>> been started after boot? Details below.
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
>> is no fix to the problem; rather, it seems to be an Ubuntu networking
>> problem. I have killed the namenode/datanode/secondary namenode many
>> times, shut down and restarted, but the error still appears. The only fix
>> seems to be re-installing Hadoop. Please advise or refer.
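
[Editor's note: before re-installing, it is worth checking whether a NameNode process is actually up and listening on port 9000. A rough diagnostic sketch follows; the log path assumes the directory layout shown in this thread, and `ss` may need to be swapped for `netstat` on older systems.]

```shell
# Rough diagnostic for "Connection refused ... localhost:9000".
# Assumes fs.defaultFS is hdfs://localhost:9000, as in this thread.

# 1. Is a NameNode JVM running at all? (jps ships with the JDK)
command -v jps >/dev/null && jps || echo "jps not found"

# 2. Is anything listening on port 9000?
(command -v ss >/dev/null && ss -ltn || netstat -ltn) 2>/dev/null \
    | grep ':9000' || echo "nothing listening on 9000"

# 3. If the NameNode died right after start-dfs.sh, its log says why
#    (a missing or un-formatted name directory is a common cause):
tail -n 20 "${HADOOP_HOME:-$HOME/hadoop-2.6.0}"/logs/hadoop-*-namenode-*.log \
    2>/dev/null || echo "no namenode log found"
```

If step 2 shows nothing listening, the NameNode never came up (or crashed immediately), and the log in step 3 is the place to look.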
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>
>>
>>
>>
>>
>>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Anand,

Try sudo; it will work.
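
[Editor's note: sudo is needed here because /etc/hosts is owned by root even for desktop "administrator" accounts. A quick way to confirm the permission problem, plus the usual edit commands, is sketched below; this is generic Ubuntu advice, not Hadoop-specific.]

```shell
# /etc/hosts is root-owned, so an administrator account still cannot
# write it directly -- sudo elevates privileges for a single command.
ls -l /etc/hosts            # typically: -rw-r--r-- 1 root root ...
if [ -w /etc/hosts ]; then
    echo "hosts file is writable by this user"
else
    echo "hosts file is read-only for this user; use sudo"
fi
# Interactive ways to edit it (run from a terminal, not this script):
#   sudoedit /etc/hosts         # safest: edits a temp copy as your user
#   sudo nano /etc/hosts        # or any editor of your choice
```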

On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <sh...@gmail.com>
wrote:

> Can you try sudo?
> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>
> Regards,
> Shahab
>
> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com>
> wrote:
>
>> Dear Sandeep:
>>
>> Many thanks. I did find hosts, but I do not have write privileges, even
>> though I am an administrator. This is strange. Can you please advise?
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
>> sandeepvura@gmail.com> wrote:
>>
>>
>> Hi Anand,
>>
>> You should look in the /etc directory at the filesystem root, not in the
>> Hadoop directory.
>>
>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> I don't see an etc/hosts. Find below.
>>
>>
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>> total 76
>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>> total 12
>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>> total 176
>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
>> capacity-scheduler.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
>> container-executor.cfg
>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
>> hadoop-metrics2.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
>> hadoop-metrics.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
>> httpfs-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
>> httpfs-signature.secret
>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
>> kms-log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
>> mapred-queues.xml.template
>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
>> mapred-site.xml.template~
>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
>> ssl-client.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
>> ssl-server.xml.example
>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>> localhost
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>
>> Thanks.
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
>> anand_vihar@yahoo.com> wrote:
>>
>>
>> Ok thanks will do
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> The hosts file will be in the /etc directory; please check once.
>>
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> I don't seem to have etc/host
>>
>>
>> Sent from my iPhone
>>
>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>
>> Hi Anand,
>>
>> Comment out the IP address 127.0.1.1 in /etc/hosts
>> and add the entry 127.0.0.1  localhost to /etc/hosts.
>>
>> Restart your Hadoop cluster after making the changes in /etc/hosts.
>>
>> Regards,
>> Sandeep.v
>>
>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
>> wrote:
>>
>> Dear All:
>>
>> Has anyone encountered this error, and if so, how have you fixed it other
>> than re-installing Hadoop or re-running start-dfs.sh when it has already
>> been started after boot? Details below.
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
>> is no fix to the problem; rather, it seems to be an Ubuntu networking
>> problem. I have killed the namenode/datanode/secondary namenode many
>> times, shut down and restarted, but the error still appears. The only fix
>> seems to be re-installing Hadoop. Please advise or refer.
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>>
>>
>>
>>
>>
>>
>>
>>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Shahab Yunus <sh...@gmail.com>.
Can you try sudo?
https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab

On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

> Dear Sandeep:
>
> Many thanks. I did find hosts, but I do not have write privileges, even
> though I am an administrator. This is strange. Can you please advise?
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should look in the /etc directory at the filesystem root, not in the
> Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I don't see an etc/hosts. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file will be in the /etc directory; please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the IP address 127.0.1.1 in /etc/hosts
> and add the entry 127.0.0.1  localhost to /etc/hosts.
>
> Restart your Hadoop cluster after making the changes in /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than re-installing Hadoop or re-running start-dfs.sh when it has already
> been started after boot? Details below.
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem; rather, it seems to be an Ubuntu networking
> problem. I have killed the namenode/datanode/secondary namenode many
> times, shut down and restarted, but the error still appears. The only fix
> seems to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Shahab Yunus <sh...@gmail.com>.
Can you try sudo?
https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab

On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

> Dear Sandeep:
>
> many thanks. I did find hosts, but I do not have write priveleges,
> eventhough I am administrator. This is strange. Can you please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should search /etc directory in root not Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I dont see a etc/host. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> hosts file will be available in /etc directory please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your hadoop cluster after made changes in /etc/hosts
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem rather it seems to be a Ubuntu network problem. I
> have many times killed nanenode/datanode/secondary data note, shutdown and
> restarted, but this error still appears. The only way seems to be
> re-installing hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Shahab Yunus <sh...@gmail.com>.
Can you try sudo?
https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo

Regards,
Shahab

On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <an...@yahoo.com> wrote:

> Dear Sandeep:
>
> many thanks. I did find hosts, but I do not have write priveleges,
> eventhough I am administrator. This is strange. Can you please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 4:43 PM, sandeep vura <
> sandeepvura@gmail.com> wrote:
>
>
> Hi Anand,
>
> You should look in the /etc directory at the filesystem root, not in the Hadoop directory.
>
> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> I don't see an /etc/hosts. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file is in the /etc directory; please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have /etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the IP address 127.0.1.1 in /etc/hosts and
> add the following entry: 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your Hadoop cluster after making the changes in /etc/hosts
>
> Regards,
> Sandeep.v
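[Editor's note: the hosts-file change described above can be rehearsed safely on a copy before touching /etc/hosts itself. A sketch — the sample contents below are illustrative of a default Ubuntu hosts file, with the hostname taken from this thread:]

```shell
# Rehearse the edit on a sample file with the shape of an Ubuntu /etc/hosts.
cat > hosts.sample <<'EOF'
127.0.0.1 localhost
127.0.1.1 Latitude-E5540
EOF

# Comment out the 127.0.1.1 line so the hostname no longer resolves there.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' hosts.sample
cat hosts.sample
# Apply for real by making the same edit to /etc/hosts with sudo,
# then restart HDFS: stop-dfs.sh; start-dfs.sh
```

The 127.0.1.1 entry matters because the daemons can end up binding to it while clients connect to 127.0.0.1, producing exactly this "Connection refused".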
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than by re-installing Hadoop or re-running start-dfs.sh when it has already
> been started after boot? Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
> is no fix there; it seems to be an Ubuntu networking problem. I have many
> times killed the namenode/datanode/secondary namenode, shut down and
> restarted, but this error still appears. The only remedy seems to be
> re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Dear Sandeep:
many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise.
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should look in the /etc directory at the filesystem root, not in the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an /etc/hosts. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file is in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have /etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts and
add the following entry: 127.0.0.1  localhost  in /etc/hosts.

Restart your Hadoop cluster after making the changes in /etc/hosts

Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix there; it seems to be an Ubuntu networking problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only remedy seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)







   



  

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Dear Sandeep:
many thanks. I did find hosts, but I do not have write priveleges, eventhough I am administrator. This is strange. Can you please advise.
Thanks
 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should search /etc directory in root not Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I dont see a etc/host. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


hosts file will be available in /etc directory please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have etc/host

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
comment the ip address - 127.0.1.1 in /etc/hostsadd the following ip address - 127.0.0.1  localhost  in /etc/hosts.
Restart your hadoop cluster after made changes in /etc/hosts
Regards,Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error and if so how have you fixed it other then re-installing Hadoop or re-starting start-dfs.sh when you have already started after boot. Find below
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
>From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have killed the namenode/datanode/secondary namenode many times, shut down and restarted, but this error still appears. The only fix seems to be re-installing Hadoop. Please advise or refer.
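One way to narrow this down before reinstalling: "Connection refused" means nothing is listening on the port at all, so the NameNode process most likely exited shortly after start-dfs.sh launched it. A quick probe of the port from the error message (port 9000 is assumed from that message; the snippet relies on bash's /dev/tcp and GNU timeout):

```shell
# Probe the NameNode RPC port. Connection refused = no listener, which
# usually means the NameNode crashed on startup; the reason is in its log.
if timeout 2 bash -c 'exec 3<>/dev/tcp/localhost/9000' 2>/dev/null; then
    NN_STATUS="listening"
else
    NN_STATUS="not listening"
fi
echo "localhost:9000 is $NN_STATUS"
# If not listening, read the .log file (not the .out file), e.g.:
#   tail -50 ~/hadoop-2.6.0/logs/hadoop-*-namenode-*.log
```

The .out files that start-dfs.sh prints only capture stdout; the actual startup exception normally lands in the corresponding .log file.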
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)







   



  

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise?
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)
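The permissions behaviour described above is expected: /etc/hosts is owned by root, and an "administrator" account on Ubuntu only gains root rights through sudo, one command at a time. A minimal check (the sudo line in the comment is illustrative, not run here):

```shell
# /etc/hosts belongs to root; regular users, even members of the sudo
# group, cannot write it directly. Check ownership and writability:
ls -l /etc/hosts
if [ -w /etc/hosts ]; then HOSTS_WRITABLE=yes; else HOSTS_WRITABLE=no; fi
echo "directly writable: $HOSTS_WRITABLE"
# To actually edit it, elevate:   sudo nano /etc/hosts
```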


     On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sa...@gmail.com> wrote:
   

 Hi Anand,
You should look in the /etc directory at the filesystem root, not in the Hadoop install directory.
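The confusion in the thread is between two unrelated directories that both happen to be named etc. A sketch of the distinction (the Hadoop install path is taken from the thread):

```shell
# System configuration, an absolute path -- this is where the hosts file lives:
SYSTEM_ETC=/etc
# Hadoop's own config directory, relative to the install tree -- it holds
# core-site.xml, hdfs-site.xml, and friends, but never hosts:
HADOOP_CONF="$HOME/hadoop-2.6.0/etc/hadoop"

ls "$SYSTEM_ETC/hosts"   # present on any Linux system
# Running "cd ~/hadoop-2.6.0; ls etc" only ever shows the Hadoop one.
```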
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
I don't see an etc/host. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


hosts file will be available in /etc directory please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have etc/host

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
comment out the IP address 127.0.1.1 in /etc/hosts
add the following entry - 127.0.0.1  localhost - in /etc/hosts.
Restart your Hadoop cluster after making the changes to /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have killed the namenode/datanode/secondary namenode many times, shut down and restarted, but this error still appears. The only fix seems to be re-installing Hadoop. Please advise or refer.
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)







   



  


Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

You should search /etc directory in root not Hadoop directory.

On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> I don't see an etc/host. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> hosts file will be available in /etc directory please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your Hadoop cluster after making the changes in /etc/hosts
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than by re-installing Hadoop or re-running start-dfs.sh when it has already
> been started after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
> is no fix to the problem there; rather, it seems to be an Ubuntu network
> problem. I have killed the namenode/datanode/secondary namenode many times,
> shut down and restarted, but this error still appears. The only fix seems
> to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

You should search /etc directory in root not Hadoop directory.

On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> I dont see a etc/host. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> hosts file will be available in /etc directory please check once.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> comment the ip address - 127.0.1.1 in /etc/hosts
> add the following ip address - 127.0.0.1  localhost  in /etc/hosts.
>
> Restart your hadoop cluster after made changes in /etc/hosts
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but it
> offers no fix; the problem seems to be an Ubuntu networking issue. I have
> many times killed the namenode/datanode/secondary namenode, shut down and
> restarted, but the error still appears. The only way out seems to be
> re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

You should look in the /etc directory at the filesystem root, not in the Hadoop installation's etc directory.
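Concretely, the /etc/hosts change suggested earlier in the thread can be sketched as below. This is a hypothetical sketch run against a scratch copy (the sample file contents and the hostname Latitude-E5540 are assumptions based on the default Ubuntu layout); review the result, then apply the same edit to the real /etc/hosts with sudo.

```shell
# Work on a scratch copy so the real /etc/hosts stays untouched until reviewed.
hosts=/tmp/hosts.demo
printf '127.0.0.1\tlocalhost\n127.0.1.1\tLatitude-E5540\n' > "$hosts"

# Comment out the 127.0.1.1 entry, which makes the machine's hostname
# resolve to 127.0.1.1 instead of 127.0.0.1.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' "$hosts"

# Make sure localhost still maps to 127.0.0.1.
grep -q '^127\.0\.0\.1[[:space:]]' "$hosts" || printf '127.0.0.1\tlocalhost\n' >> "$hosts"

cat "$hosts"
```

After copying the reviewed file over /etc/hosts (with sudo), restart the daemons with stop-dfs.sh and start-dfs.sh so the NameNode binds under the corrected name resolution.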

On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> I don't see an etc/hosts file. Find below.
>
>
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
> total 76
> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
> total 12
> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
> total 176
> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50
> capacity-scheduler.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50
> container-executor.cfg
> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50
> hadoop-metrics2.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50
> hadoop-metrics.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50
> httpfs-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50
> httpfs-signature.secret
> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50
> kms-log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50
> mapred-queues.xml.template
> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50
> mapred-site.xml.template~
> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50
> ssl-client.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50
> ssl-server.xml.example
> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
> localhost
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>
> Thanks.
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>   On Wednesday, April 22, 2015 2:41 PM, Anand Murali <
> anand_vihar@yahoo.com> wrote:
>
>
> Ok thanks will do
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
>
> The hosts file will be in the /etc directory; please check there.
>
> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> I don't seem to have etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the 127.0.1.1 entry in /etc/hosts, and
> add the entry "127.0.0.1  localhost" to /etc/hosts.
>
> Restart your Hadoop cluster after making the changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than re-installing Hadoop or re-running start-dfs.sh when it has already
> been started after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but it
> offers no fix; the problem seems to be an Ubuntu networking issue. I have
> many times killed the namenode/datanode/secondary namenode, shut down and
> restarted, but the error still appears. The only way out seems to be
> re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>
>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Dear All:
I don't see an etc/hosts file. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


The hosts file will be in the /etc directory; please check there.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have etc/hosts

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
Comment out the 127.0.1.1 entry in /etc/hosts, and add the entry "127.0.0.1  localhost" to /etc/hosts.
Restart your Hadoop cluster after making the changes to /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error, and if so, how have you fixed it other than re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but it offers no fix; the problem seems to be an Ubuntu networking issue. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but the error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
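The refused connection can be reproduced outside Hadoop with a plain TCP probe, which helps separate a client-side misconfiguration from a NameNode that never bound the port. A diagnostic sketch, assuming bash (for its /dev/tcp feature) and the port 9000 that fs.defaultFS uses here:

```shell
# Report whether anything is listening on host:port, using bash's
# built-in /dev/tcp redirection so no netstat or nc is required.
probe_port() {
  local host=$1 port=$2
  if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "open"      # something accepted the connection
  else
    echo "refused"   # reproduces the ConnectException symptom
  fi
}

probe_port localhost 9000
```

If this prints "refused" immediately after start-dfs.sh, the NameNode never bound the port, and the logs under logs/hadoop-*-namenode-*.log usually show why (in pseudo-distributed setups, often the 127.0.1.1 resolution issue or an unformatted namenode).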
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)







  

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Dear All:
I dont see a etc/host. Find below.

anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
-rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
-rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
-rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
-rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
total 12
drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
total 176
drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
-rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
-rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
-rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
-rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
-rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
-rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
-rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
-rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
-rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
-rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
-rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
-rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
-rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
-rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
-rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
-rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
-rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
localhost
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ 

Thanks.
Regards,

 Anand Murali  11/7, 'Anand Vihar', Kandasamy St, MylaporeChennai - 600 004, IndiaPh: (044)- 28474593/ 43526162 (voicemail) 


     On Wednesday, April 22, 2015 2:41 PM, Anand Murali <an...@yahoo.com> wrote:
   

 Ok thanks will do

Sent from my iPhone
On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:


hosts file will be available in /etc directory please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

I don't seem to have etc/host

Sent from my iPhone
On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:


Hi Anand,
comment the ip address - 127.0.1.1 in /etc/hostsadd the following ip address - 127.0.0.1  localhost  in /etc/hosts.
Restart your hadoop cluster after made changes in /etc/hosts
Regards,Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

Dear All:
Has anyone encountered this error and if so how have you fixed it other then re-installing Hadoop or re-starting start-dfs.sh when you have already started after boot. Find below
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

1 package can be updated.
1 update is a security update.

Last login: Wed Apr 22 13:33:26 2015 from localhost
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
/home/anand_vihar/hadoop-2.6.0
/home/anand_vihar/jdk1.7.0_75
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
>From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
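The localhost:9000 endpoint named in that error comes from the fs.defaultFS setting. In a pseudo-distributed setup this is typically declared in etc/hadoop/core-site.xml along these lines (a sketch of the common configuration, not the actual file from this machine):

```
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```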



I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
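Independent of Hadoop, one can confirm whether anything is listening on the NameNode port at all (9000, per the error above): "Connection refused" simply means no process is accepting connections there. A minimal probe in Python:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Attempt a plain TCP connect; True means some process is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS failures
        return False

if __name__ == "__main__":
    # False here means the NameNode never bound the port, so the problem is
    # on the server side, not the client side.
    print(port_open("localhost", 9000))
```

If the probe fails immediately after start-dfs.sh reports success, the NameNode most likely died during startup; the .log file (not the .out file) under the logs directory usually shows why.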
Many thanks,
Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)







  


Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
Ok thanks will do

Sent from my iPhone

> On 22-Apr-2015, at 2:39 pm, sandeep vura <sa...@gmail.com> wrote:
> 
> hosts file will be available in /etc directory please check once.
> 
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:
>> I don't seem to have etc/host
>> 
>> 
>> Sent from my iPhone
>> 
>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>>> 
>>> Hi Anand,
>>> 
>>> Comment out the IP address 127.0.1.1 in /etc/hosts.
>>> Add the following entry to /etc/hosts: 127.0.0.1  localhost
>>> 
>>> Restart your Hadoop cluster after making the changes to /etc/hosts.
>>> 
>>> Regards,
>>> Sandeep.v
>>> 
>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>>>> Dear All:
>>>> 
>>>> Has anyone encountered this error, and if so, how have you fixed it other than by re-installing Hadoop or re-running start-dfs.sh when it has already been started after boot? Find below:
>>>> 
>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>> 
>>>>  * Documentation:  https://help.ubuntu.com/
>>>> 
>>>> 1 package can be updated.
>>>> 1 update is a security update.
>>>> 
>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>> /home/anand_vihar/hadoop-2.6.0
>>>> /home/anand_vihar/jdk1.7.0_75
>>>> /home/anand_vihar/hadoop-2.6.0
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>> Hadoop 2.6.0
>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>> Compiled with protoc 2.5.0
>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>> Starting namenodes on [localhost]
>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>> Starting secondary namenodes [0.0.0.0]
>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>>>> 
>>>> 
>>>> 
>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
>>>> 
>>>> Many thanks,
>>>> 
>>>> Regards,
>>>> 
>>>> 
>>>>  
>>>> Anand Murali  
>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>> Chennai - 600 004, India
>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 


Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
The hosts file will be available in the /etc directory. Please check once.

On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <an...@yahoo.com> wrote:

> I don't seem to have etc/host
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the IP address 127.0.1.1 in /etc/hosts.
> Add the following entry to /etc/hosts: 127.0.0.1  localhost
>
> Restart your Hadoop cluster after making the changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com>
> wrote:
>
>> Dear All:
>>
>> Has anyone encountered this error, and if so, how have you fixed it other
>> than by re-installing Hadoop or re-running start-dfs.sh when it has
>> already been started after boot? Find below:
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
>> is no fix for the problem there; rather, it seems to be an Ubuntu network
>> problem. I have many times killed the namenode/datanode/secondary namenode,
>> shut down and restarted, but this error still appears. The only way out
>> seems to be re-installing Hadoop. Please advise or refer.
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by Anand Murali <an...@yahoo.com>.
I don't seem to have etc/host


Sent from my iPhone

> On 22-Apr-2015, at 2:30 pm, sandeep vura <sa...@gmail.com> wrote:
> 
> Hi Anand,
> 
> Comment out the IP address 127.0.1.1 in /etc/hosts and
> add the following entry instead: 127.0.0.1  localhost.
> 
> Restart your Hadoop cluster after making the changes in /etc/hosts.
> 
> Regards,
> Sandeep.v
> 
>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:
>> Dear All:
>> 
>> Has anyone encountered this error, and if so, how have you fixed it other than re-installing Hadoop or re-running start-dfs.sh when you have already started it after boot? Find below:
>> 
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>> 
>>  * Documentation:  https://help.ubuntu.com/
>> 
>> 1 package can be updated.
>> 1 update is a security update.
>> 
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
>> 
>> 
>> 
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix for the problem there; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
>> 
>> Many thanks,
>> 
>> Regards,
>> 
>> 
>>  
>> Anand Murali  
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
> 

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

Comment out the IP address 127.0.1.1 in /etc/hosts and
add the following entry instead: 127.0.0.1  localhost.

Restart your Hadoop cluster after making the changes in /etc/hosts.

Regards,
Sandeep.v
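[Editor's note] The two edits described above can be sketched as shell commands. This non-authoritative sketch prepares the corrected file in /tmp so it can be reviewed before being copied over /etc/hosts with sudo (the 127.0.1.1 line is the one Ubuntu adds for the machine's own hostname):

```shell
# Work on a copy of /etc/hosts rather than editing it in place.
cp /etc/hosts /tmp/hosts.fixed

# Comment out the 127.0.1.1 entry (e.g. "127.0.1.1  Latitude-E5540").
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.fixed

# Ensure localhost maps to 127.0.0.1; append the entry if it is missing.
grep -q '^127\.0\.0\.1[[:space:]]' /tmp/hosts.fixed || \
  printf '127.0.0.1\tlocalhost\n' >> /tmp/hosts.fixed

cat /tmp/hosts.fixed   # review, then: sudo cp /tmp/hosts.fixed /etc/hosts
```

After installing the corrected file, restart HDFS (stop-dfs.sh, then start-dfs.sh) so the daemons re-resolve localhost.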

On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> Has anyone encountered this error, and if so, how have you fixed it other
> than re-installing Hadoop or re-running start-dfs.sh when you have already
> started it after boot? Find below:
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
> is no fix for the problem there; rather, it seems to be an Ubuntu network
> problem. I have many times killed the namenode/datanode/secondary namenode,
> shut down and restarted, but this error still appears. The only way out
> seems to be re-installing Hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

comment the ip address - 127.0.1.1 in /etc/hosts
add the following ip address - 127.0.0.1  localhost  in /etc/hosts.

Restart your hadoop cluster after made changes in /etc/hosts

Regards,
Sandeep.v

On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem rather it seems to be a Ubuntu network problem. I
> have many times killed nanenode/datanode/secondary data note, shutdown and
> restarted, but this error still appears. The only way seems to be
> re-installing hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

comment the ip address - 127.0.1.1 in /etc/hosts
add the following ip address - 127.0.0.1  localhost  in /etc/hosts.

Restart your hadoop cluster after made changes in /etc/hosts

Regards,
Sandeep.v

On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix for the problem there; rather, it seems to be an Ubuntu
> networking problem. I have killed the namenode/datanode/secondary
> namenode many times, shut down and restarted, but this error still
> appears. The only way out seems to be re-installing Hadoop. Please
> advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

Posted by sandeep vura <sa...@gmail.com>.
Hi Anand,

Comment out the 127.0.1.1 entry in /etc/hosts, and make sure the file
contains the line "127.0.0.1   localhost".

Restart your Hadoop cluster after making the changes to /etc/hosts.
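A minimal sketch of that edit, demonstrated on a scratch copy of the file first. The hostname "Latitude-E5540" is taken from the session above; substitute your own machine's hostname:

```shell
# Build a scratch copy mimicking a stock Ubuntu /etc/hosts.
printf '127.0.0.1\tlocalhost\n127.0.1.1\tLatitude-E5540\n' > hosts.tmp

# Comment out the 127.0.1.1 line so the machine's hostname no longer
# resolves to 127.0.1.1; the NameNode is then bound and reached via
# 127.0.0.1, which matches fs.defaultFS = hdfs://localhost:9000.
sed 's/^127\.0\.1\.1/# 127.0.1.1/' hosts.tmp > hosts.fixed
cat hosts.fixed

# To apply for real (needs root), then restart HDFS:
#   sudo sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /etc/hosts
#   stop-dfs.sh && start-dfs.sh
```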

Regards,
Sandeep.v

On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <an...@yahoo.com> wrote:

> Dear All:
>
> Has anyone encountered this error and if so how have you fixed it other
> then re-installing Hadoop or re-starting start-dfs.sh when you have already
> started after boot. Find below
>
> anand_vihar@Latitude-E5540:~$ ssh localhost
> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>
>  * Documentation:  https://help.ubuntu.com/
>
> 1 package can be updated.
> 1 update is a security update.
>
> Last login: Wed Apr 22 13:33:26 2015 from localhost
> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
> /home/anand_vihar/hadoop-2.6.0
> /home/anand_vihar/jdk1.7.0_75
> /home/anand_vihar/hadoop-2.6.0
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
> Hadoop 2.6.0
> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
> Compiled by jenkins on 2014-11-13T21:10Z
> Compiled with protoc 2.5.0
> From source with checksum 18e43357c8f927c0695f1e9522859d6a
> This command was run using
> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
> Starting namenodes on [localhost]
> localhost: starting namenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
> localhost: starting datanode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: starting secondarynamenode, logging to
> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
> connection exception: java.net.ConnectException: Connection refused; For
> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>
>
>
> I have checked http://wiki.apache.org/hadoop/ConnectionRefused but there
> is no fix to the problem rather it seems to be a Ubuntu network problem. I
> have many times killed nanenode/datanode/secondary data note, shutdown and
> restarted, but this error still appears. The only way seems to be
> re-installing hadoop. Please advise or refer.
>
> Many thanks,
>
> Regards,
>
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>