Posted to user@hadoop.apache.org by "Kartashov, Andy" <An...@mpac.ca> on 2012/11/06 21:07:31 UTC

RE: starting Daemons remotely, through ssh - SOLVED

Harsh/Ravi,

I wrote my own scripts to [start|stop|restart] the [hdfs|mapred] daemons. The content of the script is:

sudo /etc/init.d/hadoop-[hdfs-*|mapred-*] [start|stop|restart]

I have no problem starting the daemons locally on each node. I have sudo access granted; I run the script and enter my password.
I have passwordless ssh enabled, but when I try to start the slave daemons from the NN I get:
"<remotehost>:  sudo: sorry, you must have a tty to run sudo".

I just solved this issue by:

a. Modifying the slaves.sh script to add -t after ssh: ssh -t <host-ip> <command>

a1. Someone suggested instead editing /etc/sudoers (via sudo visudo) and commenting out the "Defaults requiretty" line.

But a. worked just fine.
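A minimal sketch of what the modified slaves.sh loop amounts to. The slaves file, hostnames, and service name below are placeholders for illustration; the script only prints the per-slave commands rather than running ssh.

```shell
# Sketch of the slaves.sh change: ssh -t forces pseudo-tty allocation, which
# satisfies sudoers' "Defaults requiretty" on the remote host.
# The slaves file, hostnames, and service name are illustrative only.
printf 'node1\nnode2\n' > /tmp/slaves.demo

build_cmd() {
    # Build (but do not run) the per-slave command, with -t right after ssh.
    echo "ssh -t $1 sudo /etc/init.d/hadoop-hdfs-datanode start"
}

while read -r slave; do
    build_cmd "$slave"
done < /tmp/slaves.demo
```

Without -t, ssh runs the remote command with no controlling terminal, which is exactly what a "Defaults requiretty" sudoers policy rejects.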



Cheers,

AK47

From: Ravi Mutyala [mailto:ravi@hortonworks.com]
Sent: Tuesday, November 06, 2012 2:31 PM
To: user@hadoop.apache.org
Subject: Re: starting Daemons remotely, through ssh

Andy,

Which specific scripts are these? As far as I know, all daemons on the hadoop side can be started with hadoop/hdfs or any non-sudo username. You will require passwordless ssh for that user if you are using slaves/regionservers configured to start the daemons remotely from the NameNode/HBase Master.
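For reference, passwordless ssh for such a non-sudo user is typically set up as below. The key is generated into a temp directory purely for demonstration (normally it lives in ~/.ssh), and the usernames and hostnames are placeholders; the ssh-copy-id commands are printed rather than executed.

```shell
# Generate a key pair (temp dir for this demo; normally ~/.ssh/id_rsa)
# and print the ssh-copy-id command you would run for each slave.
# Usernames and hostnames are placeholders.
tmp=$(mktemp -d)
ssh-keygen -q -t rsa -N "" -f "$tmp/id_rsa"

for slave in datanode1 datanode2; do
    echo "ssh-copy-id -i $tmp/id_rsa.pub hdfs@$slave"
done
```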


On Tue, Nov 6, 2012 at 1:25 PM, Kartashov, Andy <An...@mpac.ca> wrote:
Hadoopers,

How does one start Daemons remotely when scripts normally require root user to start them? Do you modify scripts?

Thanks,
NOTICE: This e-mail message and any attachments are confidential, subject to copyright and may be privileged. Any unauthorized use, copying or disclosure is prohibited. If you are not the intended recipient, please delete and contact the sender immediately. Please consider the environment before printing this e-mail. AVIS : le présent courriel et toute pièce jointe qui l'accompagne sont confidentiels, protégés par le droit d'auteur et peuvent être couverts par le secret professionnel. Toute utilisation, copie ou divulgation non autorisée est interdite. Si vous n'êtes pas le destinataire prévu de ce courriel, supprimez-le et contactez immédiatement l'expéditeur. Veuillez penser à l'environnement avant d'imprimer le présent courriel


Re: Hadoop web interface

Posted by Bharati <bh...@mparallelo.com>.
Thanks,
Will take a look. 
Bharati

Sent from my iPhone

On Nov 6, 2012, at 6:15 PM, Bing Jiang <ji...@gmail.com> wrote:

> if you want to look up the source code of webapps, please refer to HADOOP_HOME/src/webapps. 
> 
> 2012/11/7 Bharati <bh...@mparallelo.com>
>> Hi,
>> I am very new to Hadoop. I have run tutorials for map and reduce.
>> I would like more information on hadoop's web interface and details
>> 
>> Thanks,
>> Bharati
>> 
>> Fortigate Filtered
> 
> 
> 
> -- 
> Bing Jiang
> Tel:(86)134-2619-1361
> weibo: http://weibo.com/jiangbinglover
> BLOG: http://blog.sina.com.cn/jiangbinglover
> National Research Center for Intelligent Computing Systems
> Institute of Computing technology
> Graduate University of Chinese Academy of Science
> 

Re: Hadoop web interface

Posted by Bing Jiang <ji...@gmail.com>.
if you want to look up the source code of webapps, please refer to
HADOOP_HOME/src/webapps.

2012/11/7 Bharati <bh...@mparallelo.com>

> Hi,
> I am very new to Hadoop. I have run tutorials for map and reduce.
> I would like more information on hadoop's web interface and details
>
> Thanks,
> Bharati
>
> Fortigate Filtered
>
>


-- 
Bing Jiang
Tel:(86)134-2619-1361
weibo: http://weibo.com/jiangbinglover
BLOG: http://blog.sina.com.cn/jiangbinglover
National Research Center for Intelligent Computing Systems
Institute of Computing technology
Graduate University of Chinese Academy of Science

Hadoop web interface

Posted by Bharati <bh...@mparallelo.com>.
Hi,
I am very new to Hadoop. I have run tutorials for map and reduce.
I would like more information on hadoop's web interface and details

Thanks,
Bharati

RE: starting Daemons remotely, through ssh - SOLVED

Posted by "Kartashov, Andy" <An...@mpac.ca>.
Forrest,

I must admit, I am very new myself. Two months ago I didn't know what sudo was. :) So do not take my comments too seriously; I could be wrong about many issues.


There are 5 daemon scripts  in /etc/init.d/. I am using MRv1, so mine are:

hadoop-hdfs-datanode
hadoop-hdfs-namenode
hadoop-hdfs-secondarynamenode
<specific to MRv1>
hadoop-0.20-mapreduce-jobtracker
hadoop-0.20-mapreduce-tasktracker

For scalability I chose to:

- have only the NN/JT on my master node, so there I need to start|stop|restart only hadoop-hdfs-namenode and hadoop-0.20-mapreduce-jobtracker
- use a different node for the SNN, so there I only need to start|stop|restart hadoop-hdfs-secondarynamenode
- run hadoop-hdfs-datanode and hadoop-0.20-mapreduce-tasktracker on all of my DNs

My script is VERY simple for now:

#!/bin/bash
# myHadoop.sh - start|stop|restart the daemons for this node
dir=/etc/init.d
if [ $# -lt 1 ]; then
  echo "USAGE: `basename $0` [start|stop|restart]"
  exit 1
fi
sudo ${dir}/hadoop-hdfs-namenode "$1"
sudo ${dir}/hadoop-0.20-mapreduce-jobtracker "$1"
exit 0

For the SNN node I altered the script to run instead:
sudo ${dir}/hadoop-hdfs-secondarynamenode "$1"

and for the DNs:
sudo ${dir}/hadoop-hdfs-datanode "$1"
sudo ${dir}/hadoop-0.20-mapreduce-tasktracker "$1"
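The three per-role variants could be folded into one hypothetical wrapper that takes the node role as a second argument. The service names are the MRv1 ones from this post; the role names and the DRY_RUN switch (which defaults to printing the commands instead of invoking sudo) are assumptions for illustration.

```shell
#!/bin/sh
# Hypothetical role-based wrapper around the init scripts above.
# DRY_RUN=1 (default here) prints the commands instead of running sudo.
dir=/etc/init.d
DRY_RUN="${DRY_RUN:-1}"

services_for_role() {
    case "$1" in
        master) echo "hadoop-hdfs-namenode hadoop-0.20-mapreduce-jobtracker" ;;
        snn)    echo "hadoop-hdfs-secondarynamenode" ;;
        dn)     echo "hadoop-hdfs-datanode hadoop-0.20-mapreduce-tasktracker" ;;
    esac
}

run_role() {  # $1 = start|stop|restart, $2 = master|snn|dn
    for s in $(services_for_role "$2"); do
        if [ "$DRY_RUN" = "1" ]; then
            echo "sudo $dir/$s $1"
        else
            sudo "$dir/$s" "$1"
        fi
    done
}

run_role start master
```

The same file can then be deployed unchanged to every node, with only the role argument differing per host.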

p.s. Criticism on my post is welcomed.

AK47


From: Forrest Aldrich [mailto:forrie@gmail.com]
Sent: Tuesday, November 06, 2012 3:09 PM
To: user@hadoop.apache.org
Subject: Re: starting Daemons remotely, through ssh - SOLVED

Andy, I noticed you are splitting up your /etc/init.d startup. We're new-ish to Hadoop and found that the supplied init.d script doesn't necessarily stop services properly, so I end up doing it manually. If you could share your info that would be great... I can see where they can be time-delayed in the startup with chkconfig (RHEL).


Thanks.
On 11/6/12 3:07 PM, Kartashov, Andy wrote:
Harsh/Ravi,

I wrote my own scripts to [start|stop|restart] the [hdfs|mapred] daemons. The content of the script is:

sudo /etc/init.d/hadoop-[hdfs-*|mapred-*] [start|stop|restart]

I have no problem starting the daemons locally on each node. I have sudo access granted; I run the script and enter my password.
I have passwordless ssh enabled, but when I try to start the slave daemons from the NN I get:
"<remotehost>:  sudo: sorry, you must have a tty to run sudo".

I just solved this issue by:

a. Modifying the slaves.sh script to add -t after ssh: ssh -t <host-ip> <command>

a1. Someone suggested instead editing /etc/sudoers (via sudo visudo) and commenting out the "Defaults requiretty" line.

But a. worked just fine.



Cheers,

AK47

From: Ravi Mutyala [mailto:ravi@hortonworks.com]
Sent: Tuesday, November 06, 2012 2:31 PM
To: user@hadoop.apache.org
Subject: Re: starting Daemons remotely, through ssh

Andy,

Which specific scripts are these? As far as I know, all daemons on the hadoop side can be started with hadoop/hdfs or any non-sudo username. You will require passwordless ssh for that user if you are using slaves/regionservers configured to start the daemons remotely from the NameNode/HBase Master.


On Tue, Nov 6, 2012 at 1:25 PM, Kartashov, Andy <An...@mpac.ca> wrote:
Hadoopers,

How does one start Daemons remotely when scripts normally require root user to start them? Do you modify scripts?

Thanks,

Re: starting Daemons remotely, through ssh - SOLVED

Posted by Forrest Aldrich <fo...@gmail.com>.
Andy, I noticed you are splitting up your /etc/init.d startup. We're
new-ish to Hadoop and found that the supplied init.d script doesn't
necessarily stop services properly, so I end up doing it manually. If
you could share your scripts, that would be great; I can also see how
they could be time-delayed at startup with chkconfig (RHEL).


Thanks.

On 11/6/12 3:07 PM, Kartashov, Andy wrote:
>
> Harsh/Ravi,
>
> I wrote my own scripts to [start|stop|restart] the [hdfs|mapred] daemons. 
> The content of the script is
>
> $ sudo /etc/init.d/hadoop-[hdfs-*|mapred-*] [start|stop|restart]
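
A minimal dry-run sketch of such a wrapper might look like the following. The CDH-style service names under /etc/init.d and the echo-only dry run are assumptions for illustration, not Andy's actual script:

```shell
# Dry-run sketch of a [start|stop|restart] [hdfs|mapred] wrapper.
# It validates its arguments, then prints (rather than runs) the
# sudo commands the real script would execute for each init script.
hadoop_ctl() {
  action="$1"; group="$2"
  case "$action" in
    start|stop|restart) ;;
    *) echo "usage: hadoop_ctl {start|stop|restart} {hdfs|mapred}" >&2; return 2 ;;
  esac
  case "$group" in
    hdfs|mapred) ;;
    *) echo "usage: hadoop_ctl {start|stop|restart} {hdfs|mapred}" >&2; return 2 ;;
  esac
  # The real script would run: sudo "$svc" "$action"
  for svc in /etc/init.d/hadoop-"$group"-*; do
    echo sudo "$svc" "$action"
  done
}

hadoop_ctl start hdfs
```

Dropping the echo turns the dry run into the real wrapper.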
>
> I have no problem starting Daemons locally on each node. I have 
> sudo-access granted. I run script and enter password.
>
> I have passwordless ssh enabled. But when I am trying to start slave 
> Daemons from NN I get
>
> "<remotehost>: sudo: sorry, you must have a tty to run sudo".
>
> I just solved this issue by:
>
> a. Modifying the slaves.sh script to add -t after ssh: ssh -t <host-ip> <command>
>
> a1. Someone also suggested:
>
> $ sudo visudo and commenting out Defaults requiretty
>
> But a. worked just fine.
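
The change in a. amounts to forcing pseudo-terminal allocation, so sudo on the remote host passes its requiretty check. A small sketch of the resulting command; the helper name and example host are illustrative assumptions:

```shell
# Sketch of the slaves.sh fix: -t makes ssh allocate a tty on the
# remote host, which satisfies sudo's "Defaults requiretty" setting.
# build_remote_cmd and slave1.example.com are illustrative, not from
# the real slaves.sh.
build_remote_cmd() {
  host="$1"; shift
  echo ssh -t "$host" "$@"
}

build_remote_cmd slave1.example.com sudo service hadoop-hdfs-datanode start
```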
>
> Cheers,
>
> AK47
>
> *From:*Ravi Mutyala [mailto:ravi@hortonworks.com]
> *Sent:* Tuesday, November 06, 2012 2:31 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: starting Daemons remotely, through ssh
>
> Andy,
>
> Which specific scripts are these? As far as I know, all daemons on the 
> hadoop side can be started with hadoop/hdfs or any non-sudo username. 
> You will require passwordless ssh for that user if you are using 
> slaves/regionservers configured to start the daemons remotely from the 
> NameNode/HBase Master.
>
> On Tue, Nov 6, 2012 at 1:25 PM, Kartashov, Andy 
> <Andy.Kartashov@mpac.ca> wrote:
>
> Hadoopers,
>
> How does one start Daemons remotely when scripts normally require root 
> user to start them? Do you modify scripts?
>
> Thanks,
>
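
For reference, the alternative in a1. means editing /etc/sudoers (always via visudo) along these lines; the per-user variant shown is an assumption about how one might scope the change rather than disabling requiretty globally:

```
# In /etc/sudoers (edit with: sudo visudo)
# Defaults    requiretty        <- comment this line out, or scope it:
Defaults:hadoop !requiretty
```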

