Posted to user@whirr.apache.org by a b <au...@yahoo.com> on 2013/08/07 19:30:24 UTC

how to login as hadoop after cluster is launched

i ran "launch-cluster" on ec2, but hadoop did not start. i would like to log in as hadoop, run start-all.sh, and see what error i get.

i tried ssh -i bootstrap-private-key.pem hadoop@ec2-public-ip
and that failed

i tried ssh -i bootstrap-private-key.pem ubuntu@ec2-public-ip
which worked. then i tried to switch users to hadoop:
su - hadoop
but it asked me for a password i do not have.

i could use a suggestion.

Re: how to login as hadoop after cluster is launched

Posted by Sean Zhang <zs...@gmail.com>.
Maybe hadoop was never set up as a login user. You can execute any command as
hadoop with "sudo -u hadoop <command>". You can try sudo -u hadoop start-all.sh.
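
For example, something like this might work (the JAVA_HOME path below is my
guess based on the Oracle JDK 1.7 in your java -version output; point it at
wherever java actually lives on the instance):

sudo -u hadoop env JAVA_HOME=/usr/lib/jvm/java-7-oracle \
    /usr/local/hadoop-1.2.1/bin/start-all.sh

One caveat: start-all.sh launches the datanode/tasktracker over ssh to
localhost, so an environment variable passed on the command line only reaches
the daemons started locally. The durable fix for the "JAVA_HOME is not set"
error is an export JAVA_HOME=... line in
/usr/local/hadoop-1.2.1/conf/hadoop-env.sh (assuming the standard conf layout
for that install).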




On Wed, Aug 7, 2013 at 11:10 AM, a b <au...@yahoo.com> wrote:

> it looks like hadoop on ubuntu did not get a home directory: /home/hadoop
>
> [...]

Re: how to login as hadoop after cluster is launched

Posted by a b <au...@yahoo.com>.
it looks like hadoop on ubuntu did not get a home directory: /home/hadoop


You can log into instances using the following ssh commands:
[hadoop-datanode+hadoop-tasktracker]: ssh -i /home/ab/whirr/ab-ubuntu-12-64.pem -o "UserKnownHostsFile /dev/null" -o StrictHostKeyChecking=no ab@54.225.34.149
[hadoop-namenode+hadoop-jobtracker]: ssh -i /home/ab/whirr/ab-ubuntu-12-64.pem -o "UserKnownHostsFile /dev/null" -o StrictHostKeyChecking=no ab@50.19.198.81
To destroy cluster, run 'whirr destroy-cluster' with the same options used to launch it.
ab@ubuntu12-64:~$ ssh -i /home/ab/whirr/ab-ubuntu-12-64.pem -o "UserKnownHostsFile /dev/null" -o StrictHostKeyChecking=no ab@54.225.34.149
Warning: Permanently added '54.225.34.149' (ECDSA) to the list of known hosts.
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.2.0-48-virtual x86_64)

 * Documentation:  https://help.ubuntu.com/

  System information as of Wed Aug  7 17:55:20 UTC 2013

  System load:  0.27              Processes:           58
  Usage of /:   18.0% of 7.87GB   Users logged in:     0
  Memory usage: 27%               IP address for eth0: 10.164.72.201
  Swap usage:   0%

  Graph this data and manage this system at https://landscape.canonical.com/

  Get cloud support with Ubuntu Advantage Cloud Guest:
    http://www.ubuntu.com/business/services/cloud

  Use Juju to deploy your cloud instances and workloads:
    https://juju.ubuntu.com/#cloud-precise

42 packages can be updated.
24 updates are security updates.

Last login: Wed Aug  7 17:54:04 2013 from 50.0.160.200
ab@ip-10-164-72-201:~$ java -version
java version "1.7.0_25"
Java(TM) SE Runtime Environment (build 1.7.0_25-b15)
Java HotSpot(TM) 64-Bit Server VM (build 23.25-b01, mixed mode)
ab@ip-10-164-72-201:~$ ps -ef | grep hadoop
ab     2312  2252  0 17:55 pts/0    00:00:00 grep hadoop
ab@ip-10-164-72-201:~$ sudo su - hadoop
No directory, logging in with HOME=/
$ which start-all.sh
/usr/local/hadoop-1.2.1/bin/start-all.sh
$ start-all.sh
Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /var/log/hadoop/logs/hadoop-hadoop-namenode-ip-10-164-72-201.out
Error: JAVA_HOME is not set.
localhost: Could not create directory '/home/hadoop/.ssh'.
localhost: Failed to add the host to the list of known hosts (/home/hadoop/.ssh/known_hosts).
localhost: Permission denied (publickey).
localhost: Could not create directory '/home/hadoop/.ssh'.
localhost: Failed to add the host to the list of known hosts (/home/hadoop/.ssh/known_hosts).
localhost: Permission denied (publickey).
starting jobtracker, logging to /var/log/hadoop/logs/hadoop-hadoop-jobtracker-ip-10-164-72-201.out
Error: JAVA_HOME is not set.
localhost: Could not create directory '/home/hadoop/.ssh'.
localhost: Failed to add the host to the list of known hosts (/home/hadoop/.ssh/known_hosts).
localhost: Permission denied (publickey).
$ 


i created a home directory to see if that was enough - it wasn't:


$ exit
ab@ip-10-164-72-201:~$ sudo mkdir /home/hadoop
ab@ip-10-164-72-201:~$ chown -R hadoop:hadoop /home/hadoop
chown: changing ownership of `/home/hadoop': Operation not permitted
ab@ip-10-164-72-201:~$ sudo chown -R hadoop:hadoop /home/hadoop
ab@ip-10-164-72-201:~$ sudo su - hadoop
$ start-all.sh
Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /var/log/hadoop/logs/hadoop-hadoop-namenode-ip-10-164-72-201.out
Error: JAVA_HOME is not set.
localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
localhost: Permission denied (publickey).
localhost: Permission denied (publickey).
starting jobtracker, logging to /var/log/hadoop/logs/hadoop-hadoop-jobtracker-ip-10-164-72-201.out
Error: JAVA_HOME is not set.
localhost: Permission denied (publickey).
$ ps -ef |grep hadoop
root      2582  2252  0 18:03 pts/0    00:00:00 sudo su - hadoop
hadoop    2583  2582  0 18:03 pts/0    00:00:00 su - hadoop
hadoop    2584  2583  0 18:03 pts/0    00:00:00 -su
hadoop    2857  2584  0 18:04 pts/0    00:00:00 ps -ef
hadoop    2858  2584  0 18:04 pts/0    00:00:00 grep hadoop


it looks like i need the ssh keys - i don't know which ones - but, anyway, i'm wasting time. any idea why it might have skipped this provisioning step? any idea what i should try next?
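
if i had to guess, the skipped step was supposed to give hadoop a home
directory plus a passwordless key for ssh'ing to localhost, and to set
JAVA_HOME for the daemons. something like this, maybe (the JAVA_HOME path is
a guess on my part; the hadoop paths are from the output above):

sudo mkdir -p /home/hadoop/.ssh
sudo ssh-keygen -t rsa -N "" -f /home/hadoop/.ssh/id_rsa   # empty passphrase
sudo cp /home/hadoop/.ssh/id_rsa.pub /home/hadoop/.ssh/authorized_keys
sudo chown -R hadoop:hadoop /home/hadoop
sudo chmod 700 /home/hadoop/.ssh
sudo chmod 600 /home/hadoop/.ssh/authorized_keys
# export JAVA_HOME so the daemons can find java (path is a guess):
echo 'export JAVA_HOME=/usr/lib/jvm/java-7-oracle' | \
    sudo tee -a /usr/local/hadoop-1.2.1/conf/hadoop-env.sh

does that look like what the bootstrap normally does?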



________________________________
 From: Andrew Bayer <an...@gmail.com>
To: user@whirr.apache.org; a b <au...@yahoo.com> 
Sent: Wednesday, August 7, 2013 10:39 AM
Subject: Re: how to login as hadoop after cluster is launched
 


Login as ubuntu, sudo su - hadoop.

A.


On Wed, Aug 7, 2013 at 10:30 AM, a b <au...@yahoo.com> wrote:

> i ran "launch-cluster" on ec2, but hadoop did not start. [...]

Re: how to login as hadoop after cluster is launched

Posted by Andrew Bayer <an...@gmail.com>.
Login as ubuntu, sudo su - hadoop.
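
i.e., something like this (substitute your own key file and public IP):

ssh -i bootstrap-private-key.pem ubuntu@ec2-public-ip
sudo su - hadoop   # su is run by root here, so it won't ask for a password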

A.

On Wed, Aug 7, 2013 at 10:30 AM, a b <au...@yahoo.com> wrote:

> i ran "launch-cluster" on ec2, but hadoop did not start. [...]