Posted to user@hadoop.apache.org by Chengwei Yang <ch...@gmail.com> on 2014/03/03 06:37:11 UTC

Need help to understand hadoop.tmp.dir

Hi List,

I'm currently confused by hadoop.tmp.dir because its default value,
"/tmp/hadoop-${user.name}", lives under /tmp, which on Linux is typically
a tmpfs mount that is cleared on reboot. So after the name node machine
reboots, the directory is gone and the name node fails to start.

I found this was reported here.
http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/%3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E

As I found at http://hadoop.apache.org/docs/r2.3.0/, a lot of properties
are derived from hadoop.tmp.dir, for example:
dfs.namenode.name.dir	file://${hadoop.tmp.dir}/dfs/name

I'm wondering whether we could change the default value of hadoop.tmp.dir
to a non-tmpfs directory, since a real tmpfs directory doesn't work for
this at all?
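
In the meantime, I suppose the workaround is to point such directories at
persistent storage explicitly, for example in hdfs-site.xml (the path below
is only a placeholder, not a recommendation):

  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///var/lib/hadoop/dfs/name</value>
  </property>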

--
Thanks,
Chengwei

Re: Need help to understand hadoop.tmp.dir

Posted by Chengwei Yang <ch...@gmail.com>.
On Mon, Mar 03, 2014 at 09:03:28AM -0500, JCAD Cell 1 wrote:
> With the services stopped you would change the setting in core-site.xml:
>   <property>
>     <name>hadoop.tmp.dir</name>
>     <value>/var/hadoop/tmp</value>
>   </property>
> 
> Then move your /tmp/hadoop folder over to the new location:
> mv /tmp/hadoop /var/hadoop/tmp

Thank you. Since the machine rebooted, all the files in
/tmp/hadoop-user, the previous default directory of hadoop.tmp.dir, were
gone.

So that's the problem: there are no files left. If the namenode can't
re-create its namespace, then I think the only option is to format the
namenode again?
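
If it comes to that, I assume the step itself is just the following, run
with the HDFS daemons stopped (it wipes whatever namenode metadata is
left, which in my case is nothing anyway):

  bin/hdfs namenode -format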

--
Thanks,
Chengwei

> 
> 
> 
> On Mon, Mar 3, 2014 at 5:55 AM, Chengwei Yang <ch...@gmail.com>
> wrote:
> 
>     On Mon, Mar 03, 2014 at 01:57:49PM +0530, shashwat shriparv wrote:
>     > NO need to format just change the value and restart the cluster;
> 
>     Hmm, seems it doesn't work for me, if the only need to do is to change
>     to another directory, then why it can not re-init the directory in /tmp
>     just as another directory?
> 
>     If I changed to another directory, a new directory, the same error
>     happen.
> 
>     --
>     Thanks,
>     Chengwei
> 
>     >
>     >
>     > Warm Regards_∞_
>     > Shashwat Shriparv
>     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
>     > https://twitter.com/shriparv
>     > https://www.facebook.com/shriparv
>     > http://google.com/+ShashwatShriparv
>     > http://www.youtube.com/user/sShriparv/videos
>     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >
>     >
>     >
>     > On Mon, Mar 3, 2014 at 1:55 PM, Chengwei Yang <chengwei.yang.cn@gmail.com
>     >
>     > wrote:
>     >
>     >     On Mon, Mar 03, 2014 at 11:56:08AM +0530, shashwat shriparv wrote:
>     >     > Ya its always better to change the temp dir path in hadoop, as it
>     will
>     >     prevent
>     >     > deletion of file while the server reboots.
>     >
>     >     Thanks, so is there anyway to recovery from this state? Or I have to
>     format
>     >     namenode again?
>     >
>     >     --
>     >     Thanks,
>     >     Chengwei
>     >
>     >     >
>     >     >
>     >     > Warm Regards_∞_
>     >     > Shashwat Shriparv
>     >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
>     >     > https://twitter.com/shriparv
>     >     > https://www.facebook.com/shriparv
>     >     > http://google.com/+ShashwatShriparv
>     >     > http://www.youtube.com/user/sShriparv/videos
>     >     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >     >
>     >     >
>     >     >
>     >     > On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang <
>     >     chengwei.yang.cn@gmail.com>
>     >     > wrote:
>     >     >
>     >     >     On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv
>     wrote:
>     >     >     > You can use any directory you like beside permissions are
>     right.
>     >     >
>     >     >     I mean if it's better if we change the default hadoop.tmp.dir?
>     >     Because it
>     >     >     can not work cross reboot in default Linux environment.
>     >     >
>     >     >     --
>     >     >     Thanks,
>     >     >     Chengwei
>     >     >
>     >     >     >
>     >     >     >
>     >     >     > Warm Regards_∞_
>     >     >     > Shashwat Shriparv
>     >     >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
>     >     >     > https://twitter.com/shriparv
>     >     >     > https://www.facebook.com/shriparv
>     >     >     > http://google.com/+ShashwatShriparv
>     >     >     > http://www.youtube.com/user/sShriparv/videos
>     >     >     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >     >     >
>     >     >     >
>     >     >     >
>     >     >     > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
>     >     >     chengwei.yang.cn@gmail.com>
>     >     >     > wrote:
>     >     >     >
>     >     >     >     Hi List,
>     >     >     >
>     >     >     >     I'm confusing by hadoop.tmp.dir currently because its
>     default
>     >     value
>     >     >     >     "/tmp/hadoop-${user.name}" always means a directory in
>     tmpfs in
>     >     >     Linux.
>     >     >     >     So after the name node machine reboot, it gone away and
>     then
>     >     name
>     >     >     node
>     >     >     >     fail to start.
>     >     >     >
>     >     >     >     I found this was reported here.
>     >     >     http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/%3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
>     >     >     >
>     >     >     >     As I found from http://hadoop.apache.org/docs/r2.3.0/,
>     there
>     >     are a
>     >     >     lot
>     >     >     >     properties are based on hadoop.tmp.dir, like
>     >     >     >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
>     >     >     >
>     >     >     >     I'm wondering, if we can set the default value of
>     >     hadoop.tmp.dir to
>     >     >     >     a non-tmpfs direcotry if it doesn't work at all by using
>     a real
>     >     tmpfs
>     >     >     >     directory?
>     >     >     >
>     >     >     >     --
>     >     >     >     Thanks,
>     >     >     >     Chengwei
>     >     >     >
>     >     >     >
>     >     >
>     >     >
>     >
>     >
> 
> 

Re: Need help to understand hadoop.tmp.dir

Posted by JCAD Cell 1 <jc...@gmail.com>.
With the services stopped you would change the setting in core-site.xml:
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/var/hadoop/tmp</value>
  </property>

Then move your /tmp/hadoop folder over to the new location:
mv /tmp/hadoop /var/hadoop/tmp
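
A note in case /var/hadoop does not exist yet: create the parent directory
first, and after the move give the account that runs the Hadoop daemons
ownership of the new location (the "hadoop" user and group below are only
examples, adjust to your setup):
mkdir -p /var/hadoop
chown -R hadoop:hadoop /var/hadoop/tmp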



On Mon, Mar 3, 2014 at 5:55 AM, Chengwei Yang <ch...@gmail.com>wrote:

> On Mon, Mar 03, 2014 at 01:57:49PM +0530, shashwat shriparv wrote:
> > NO need to format just change the value and restart the cluster;
>
> Hmm, seems it doesn't work for me, if the only need to do is to change
> to another directory, then why it can not re-init the directory in /tmp
> just as another directory?
>
> If I changed to another directory, a new directory, the same error
> happen.
>
> --
> Thanks,
> Chengwei
>
> >
> >
> > Warm Regards_∞_
> > Shashwat Shriparv
> > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
> > https://twitter.com/shriparv
> > https://www.facebook.com/shriparv
> > http://google.com/+ShashwatShriparv
> > http://www.youtube.com/user/sShriparv/videos
> > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
> >
> >
> >
> > On Mon, Mar 3, 2014 at 1:55 PM, Chengwei Yang <
> chengwei.yang.cn@gmail.com>
> > wrote:
> >
> >     On Mon, Mar 03, 2014 at 11:56:08AM +0530, shashwat shriparv wrote:
> >     > Ya its always better to change the temp dir path in hadoop, as it
> will
> >     prevent
> >     > deletion of file while the server reboots.
> >
> >     Thanks, so is there anyway to recovery from this state? Or I have to
> format
> >     namenode again?
> >
> >     --
> >     Thanks,
> >     Chengwei
> >
> >     >
> >     >
> >     > Warm Regards_∞_
> >     > Shashwat Shriparv
> >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
> >     > https://twitter.com/shriparv
> >     > https://www.facebook.com/shriparv
> >     > http://google.com/+ShashwatShriparv
> >     > http://www.youtube.com/user/sShriparv/videos
> >     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
> >     >
> >     >
> >     >
> >     > On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang <
> >     chengwei.yang.cn@gmail.com>
> >     > wrote:
> >     >
> >     >     On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv
> wrote:
> >     >     > You can use any directory you like beside permissions are
> right.
> >     >
> >     >     I mean if it's better if we change the default hadoop.tmp.dir?
> >     Because it
> >     >     can not work cross reboot in default Linux environment.
> >     >
> >     >     --
> >     >     Thanks,
> >     >     Chengwei
> >     >
> >     >     >
> >     >     >
> >     >     > Warm Regards_∞_
> >     >     > Shashwat Shriparv
> >     >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
> >     >     > https://twitter.com/shriparv
> >     >     > https://www.facebook.com/shriparv
> >     >     > http://google.com/+ShashwatShriparv
> >     >     > http://www.youtube.com/user/sShriparv/videos
> >     >     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
> >     >     >
> >     >     >
> >     >     >
> >     >     > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
> >     >     chengwei.yang.cn@gmail.com>
> >     >     > wrote:
> >     >     >
> >     >     >     Hi List,
> >     >     >
> >     >     >     I'm confusing by hadoop.tmp.dir currently because its
> default
> >     value
> >     >     >     "/tmp/hadoop-${user.name}" always means a directory in
> tmpfs in
> >     >     Linux.
> >     >     >     So after the name node machine reboot, it gone away and
> then
> >     name
> >     >     node
> >     >     >     fail to start.
> >     >     >
> >     >     >     I found this was reported here.
> >     >     >     http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/%3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
> >     >     >
> >     >     >     As I found from http://hadoop.apache.org/docs/r2.3.0/,
> there
> >     are a
> >     >     lot
> >     >     >     properties are based on hadoop.tmp.dir, like
> >     >     >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
> >     >     >
> >     >     >     I'm wondering, if we can set the default value of
> >     hadoop.tmp.dir to
> >     >     >     a non-tmpfs direcotry if it doesn't work at all by using
> a real
> >     tmpfs
> >     >     >     directory?
> >     >     >
> >     >     >     --
> >     >     >     Thanks,
> >     >     >     Chengwei
> >     >     >
> >     >     >
> >     >
> >     >
> >
> >
>

Re: Need help to understand hadoop.tmp.dir

Posted by Chengwei Yang <ch...@gmail.com>.
On Mon, Mar 03, 2014 at 01:57:49PM +0530, shashwat shriparv wrote:
> NO need to format just change the value and restart the cluster;

Hmm, that doesn't seem to work for me. If all that's needed is to switch
to another directory, then why can't it re-initialize the directory in
/tmp just like any other directory?

If I change to another directory, even a brand-new one, the same error
happens.

--
Thanks,
Chengwei

> 
> 
> Warm Regards_∞_
> Shashwat Shriparv
> http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
> https://twitter.com/shriparv
> https://www.facebook.com/shriparv
> http://google.com/+ShashwatShriparv
> http://www.youtube.com/user/sShriparv/videos
> http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
> 
> 
> 
> On Mon, Mar 3, 2014 at 1:55 PM, Chengwei Yang <ch...@gmail.com>
> wrote:
> 
>     On Mon, Mar 03, 2014 at 11:56:08AM +0530, shashwat shriparv wrote:
>     > Ya its always better to change the temp dir path in hadoop, as it will
>     prevent
>     > deletion of file while the server reboots.
> 
>     Thanks, so is there anyway to recovery from this state? Or I have to format
>     namenode again?
> 
>     --
>     Thanks,
>     Chengwei
> 
>     >
>     >
>     > Warm Regards_∞_
>     > Shashwat Shriparv
>     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
>     > https://twitter.com/shriparv
>     > https://www.facebook.com/shriparv
>     > http://google.com/+ShashwatShriparv
>     > http://www.youtube.com/user/sShriparv/videos
>     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >
>     >
>     >
>     > On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang <
>     chengwei.yang.cn@gmail.com>
>     > wrote:
>     >
>     >     On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv wrote:
>     >     > You can use any directory you like beside permissions are right.
>     >
>     >     I mean if it's better if we change the default hadoop.tmp.dir?
>     Because it
>     >     can not work cross reboot in default Linux environment.
>     >
>     >     --
>     >     Thanks,
>     >     Chengwei
>     >
>     >     >
>     >     >
>     >     > Warm Regards_∞_
>     >     > Shashwat Shriparv
>     >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
>     >     > https://twitter.com/shriparv
>     >     > https://www.facebook.com/shriparv
>     >     > http://google.com/+ShashwatShriparv
>     >     > http://www.youtube.com/user/sShriparv/videos
>     >     > http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >     >
>     >     >
>     >     >
>     >     > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
>     >     chengwei.yang.cn@gmail.com>
>     >     > wrote:
>     >     >
>     >     >     Hi List,
>     >     >
>     >     >     I'm confusing by hadoop.tmp.dir currently because its default
>     value
>     >     >     "/tmp/hadoop-${user.name}" always means a directory in tmpfs in
>     >     Linux.
>     >     >     So after the name node machine reboot, it gone away and then
>     name
>     >     node
>     >     >     fail to start.
>     >     >
>     >     >     I found this was reported here.
>     >     >     http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/%3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
>     >     >
>     >     >     As I found from http://hadoop.apache.org/docs/r2.3.0/, there
>     are a
>     >     lot
>     >     >     properties are based on hadoop.tmp.dir, like
>     >     >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
>     >     >
>     >     >     I'm wondering, if we can set the default value of
>     hadoop.tmp.dir to
>     >     >     a non-tmpfs direcotry if it doesn't work at all by using a real
>     tmpfs
>     >     >     directory?
>     >     >
>     >     >     --
>     >     >     Thanks,
>     >     >     Chengwei
>     >     >
>     >     >
>     >
>     >
> 
> 

Re: Need help to understand hadoop.tmp.dir

Posted by shashwat shriparv <dw...@gmail.com>.
No need to format; just change the value and restart the cluster.


Warm Regards_∞_
Shashwat Shriparv
http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
https://twitter.com/shriparv
https://www.facebook.com/shriparv
http://google.com/+ShashwatShriparv
http://www.youtube.com/user/sShriparv/videos
http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
<sh...@yahoo.com>
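
A concrete sketch of that change, assuming Hadoop 2.x and using /data/hadoop
purely as a placeholder path: rather than relocating hadoop.tmp.dir alone,
the HDFS directories can be pointed at persistent storage explicitly in
hdfs-site.xml, so they no longer default to locations under ${hadoop.tmp.dir}:

  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///data/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///data/hadoop/dfs/data</value>
  </property>

Then restart HDFS (stop-dfs.sh / start-dfs.sh) for the new paths to take
effect.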



On Mon, Mar 3, 2014 at 1:55 PM, Chengwei Yang <ch...@gmail.com> wrote:

> On Mon, Mar 03, 2014 at 11:56:08AM +0530, shashwat shriparv wrote:
> > Ya its always better to change the temp dir path in hadoop, as it will
> prevent
> > deletion of file while the server reboots.
>
> Thanks, so is there anyway to recovery from this state? Or I have to format
> namenode again?
>
> --
> Thanks,
> Chengwei
>
> >
> >
> > Warm Regards_∞_
> > Shashwat Shriparv
> >
> http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://twitter.com/
> > shriparvhttps://
> www.facebook.com/shriparvhttp://google.com/+ShashwatShriparv
> > http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
> > SWXSTW3DVSDTF2HHSRM47AV6DI/
> >
> >
> >
> > On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang <
> chengwei.yang.cn@gmail.com>
> > wrote:
> >
> >     On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv wrote:
> >     > You can use any directory you like beside permissions are right.
> >
> >     I mean if it's better if we change the default hadoop.tmp.dir?
> Because it
> >     can not work cross reboot in default Linux environment.
> >
> >     --
> >     Thanks,
> >     Chengwei
> >
> >     >
> >     >
> >     > Warm Regards_∞_
> >     > Shashwat Shriparv
> >     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://
> >     twitter.com/
> >     > shriparvhttps://www.facebook.com/shriparvhttp://google.com/
> >     +ShashwatShriparv
> >     >
> http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
> >     > SWXSTW3DVSDTF2HHSRM47AV6DI/
> >     >
> >     >
> >     >
> >     > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
> >     chengwei.yang.cn@gmail.com>
> >     > wrote:
> >     >
> >     >     Hi List,
> >     >
> >     >     I'm confusing by hadoop.tmp.dir currently because its default
> value
> >     >     "/tmp/hadoop-${user.name}" always means a directory in tmpfs
> in
> >     Linux.
> >     >     So after the name node machine reboot, it gone away and then
> name
> >     node
> >     >     fail to start.
> >     >
> >     >     I found this was reported here.
> >     >
> http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox
> >     /
> >     >     %3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
> >     >
> >     >     As I found from http://hadoop.apache.org/docs/r2.3.0/, there
> are a
> >     lot
> >     >     properties are based on hadoop.tmp.dir, like
> >     >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
> >     >
> >     >     I'm wondering, if we can set the default value of
> hadoop.tmp.dir to
> >     >     a non-tmpfs direcotry if it doesn't work at all by using a
> real tmpfs
> >     >     directory?
> >     >
> >     >     --
> >     >     Thanks,
> >     >     Chengwei
> >     >
> >     >
> >
> >
>

Re: Need help to understand hadoop.tmp.dir

Posted by Chengwei Yang <ch...@gmail.com>.
On Mon, Mar 03, 2014 at 11:56:08AM +0530, shashwat shriparv wrote:
> Ya its always better to change the temp dir path in hadoop, as it will prevent
> deletion of file while the server reboots.

Thanks, so is there any way to recover from this state? Or do I have to
format the namenode again?

--
Thanks,
Chengwei
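
Before formatting, it may be worth checking whether any NameNode metadata
survived; a quick check, assuming the default layout under hadoop.tmp.dir:

  # a healthy name directory contains a VERSION file plus fsimage/edits files
  ls -l /tmp/hadoop-$(whoami)/dfs/name/current

If that directory is missing or empty (the usual case after a reboot with
/tmp on tmpfs), formatting again is the only way to bring the NameNode back.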

> 
> 
> Warm Regards_∞_
> Shashwat Shriparv
> http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://twitter.com/
> shriparvhttps://www.facebook.com/shriparvhttp://google.com/+ShashwatShriparv
> http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
> SWXSTW3DVSDTF2HHSRM47AV6DI/
> 
> 
> 
> On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang <ch...@gmail.com>
> wrote:
> 
>     On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv wrote:
>     > You can use any directory you like beside permissions are right.
> 
>     I mean if it's better if we change the default hadoop.tmp.dir? Because it
>     can not work cross reboot in default Linux environment.
> 
>     --
>     Thanks,
>     Chengwei
> 
>     >
>     >
>     > Warm Regards_∞_
>     > Shashwat Shriparv
>     > http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://
>     twitter.com/
>     > shriparvhttps://www.facebook.com/shriparvhttp://google.com/
>     +ShashwatShriparv
>     > http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
>     > SWXSTW3DVSDTF2HHSRM47AV6DI/
>     >
>     >
>     >
>     > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
>     chengwei.yang.cn@gmail.com>
>     > wrote:
>     >
>     >     Hi List,
>     >
>     >     I'm confusing by hadoop.tmp.dir currently because its default value
>     >     "/tmp/hadoop-${user.name}" always means a directory in tmpfs in
>     Linux.
>     >     So after the name node machine reboot, it gone away and then name
>     node
>     >     fail to start.
>     >
>     >     I found this was reported here.
>     >     http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox
>     /
>     >     %3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
>     >
>     >     As I found from http://hadoop.apache.org/docs/r2.3.0/, there are a
>     lot
>     >     properties are based on hadoop.tmp.dir, like
>     >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
>     >
>     >     I'm wondering, if we can set the default value of hadoop.tmp.dir to
>     >     a non-tmpfs direcotry if it doesn't work at all by using a real tmpfs
>     >     directory?
>     >
>     >     --
>     >     Thanks,
>     >     Chengwei
>     >
>     >
> 
> 

Re: Need help to understand hadoop.tmp.dir

Posted by shashwat shriparv <dw...@gmail.com>.
Yes, it's always better to change the temp dir path in Hadoop, as it will
prevent the files from being deleted when the server reboots.


Warm Regards_∞_
Shashwat Shriparv
http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
https://twitter.com/shriparv
https://www.facebook.com/shriparv
http://google.com/+ShashwatShriparv
http://www.youtube.com/user/sShriparv/videos
http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
<sh...@yahoo.com>
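
Whether /tmp on a given box is really tmpfs, or simply purged at boot, is
easy to confirm; a small sketch:

  # prints the filesystem type backing /tmp; "tmpfs" means it lives in RAM
  df -T /tmp
  # note: even a disk-backed /tmp is often cleaned at boot or on a schedule
  # (systemd-tmpfiles, tmpwatch), so it is a poor home for HDFS data either way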



On Mon, Mar 3, 2014 at 11:52 AM, Chengwei Yang
<ch...@gmail.com> wrote:

> On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv wrote:
> > You can use any directory you like beside permissions are right.
>
> I mean if it's better if we change the default hadoop.tmp.dir? Because it
> can not work cross reboot in default Linux environment.
>
> --
> Thanks,
> Chengwei
>
> >
> >
> > Warm Regards_∞_
> > Shashwat Shriparv
> >
> http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://twitter.com/
> > shriparvhttps://
> www.facebook.com/shriparvhttp://google.com/+ShashwatShriparv
> > http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
> > SWXSTW3DVSDTF2HHSRM47AV6DI/
> >
> >
> >
> > On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <
> chengwei.yang.cn@gmail.com>
> > wrote:
> >
> >     Hi List,
> >
> >     I'm confusing by hadoop.tmp.dir currently because its default value
> >     "/tmp/hadoop-${user.name}" always means a directory in tmpfs in
> Linux.
> >     So after the name node machine reboot, it gone away and then name
> node
> >     fail to start.
> >
> >     I found this was reported here.
> >
> http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/
> >     %3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
> >
> >     As I found from http://hadoop.apache.org/docs/r2.3.0/, there are a
> lot
> >     properties are based on hadoop.tmp.dir, like
> >     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
> >
> >     I'm wondering, if we can set the default value of hadoop.tmp.dir to
> >     a non-tmpfs direcotry if it doesn't work at all by using a real tmpfs
> >     directory?
> >
> >     --
> >     Thanks,
> >     Chengwei
> >
> >
>

Re: Need help to understand hadoop.tmp.dir

Posted by Chengwei Yang <ch...@gmail.com>.
On Mon, Mar 03, 2014 at 11:25:59AM +0530, shashwat shriparv wrote:
> You can use any directory you like beside permissions are right.

I mean, wouldn't it be better if we changed the default hadoop.tmp.dir?
Because it cannot survive a reboot in a default Linux environment.

--
Thanks,
Chengwei

> 
> 
> Warm Regards_∞_
> Shashwat Shriparv
> http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9https://twitter.com/
> shriparvhttps://www.facebook.com/shriparvhttp://google.com/+ShashwatShriparv
> http://www.youtube.com/user/sShriparv/videoshttp://profile.yahoo.com/
> SWXSTW3DVSDTF2HHSRM47AV6DI/
> 
> 
> 
> On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang <ch...@gmail.com>
> wrote:
> 
>     Hi List,
> 
>     I'm confusing by hadoop.tmp.dir currently because its default value
>     "/tmp/hadoop-${user.name}" always means a directory in tmpfs in Linux.
>     So after the name node machine reboot, it gone away and then name node
>     fail to start.
> 
>     I found this was reported here.
>     http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/
>     %3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
> 
>     As I found from http://hadoop.apache.org/docs/r2.3.0/, there are a lot
>     properties are based on hadoop.tmp.dir, like
>     dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
> 
>     I'm wondering, if we can set the default value of hadoop.tmp.dir to
>     a non-tmpfs direcotry if it doesn't work at all by using a real tmpfs
>     directory?
> 
>     --
>     Thanks,
>     Chengwei
> 
> 

Re: Need help to understand hadoop.tmp.dir

Posted by shashwat shriparv <dw...@gmail.com>.
You can use any directory you like, as long as the permissions are right.


Warm Regards_∞_
Shashwat Shriparv
http://www.linkedin.com/pub/shashwat-shriparv/19/214/2a9
https://twitter.com/shriparv
https://www.facebook.com/shriparv
http://google.com/+ShashwatShriparv
http://www.youtube.com/user/sShriparv/videos
http://profile.yahoo.com/SWXSTW3DVSDTF2HHSRM47AV6DI/
<sh...@yahoo.com>
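
For example, a sketch of preparing such a directory, assuming the daemons
run as a user named hdfs in group hadoop (substitute the actual service
account) and that /data/hadoop is just a placeholder path:

  sudo mkdir -p /data/hadoop/tmp            # persistent location off /tmp
  sudo chown -R hdfs:hadoop /data/hadoop    # owned by the daemon user
  sudo chmod -R 750 /data/hadoop            # daemon user rwx, group rx

Point hadoop.tmp.dir (or the individual dfs.* directories) there afterwards
and restart the cluster.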



On Mon, Mar 3, 2014 at 11:07 AM, Chengwei Yang
<ch...@gmail.com> wrote:

> Hi List,
>
> I'm confusing by hadoop.tmp.dir currently because its default value
> "/tmp/hadoop-${user.name}" always means a directory in tmpfs in Linux.
> So after the name node machine reboot, it gone away and then name node
> fail to start.
>
> I found this was reported here.
>
> http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201205.mbox/%3CBAY148-W22BF95C5FBE2C40BF7CD9F86020@phx.gbl%3E
>
> As I found from http://hadoop.apache.org/docs/r2.3.0/, there are a lot
> properties are based on hadoop.tmp.dir, like
> dfs.namenode.name.dir   file://${hadoop.tmp.dir}/dfs/name
>
> I'm wondering, if we can set the default value of hadoop.tmp.dir to
> a non-tmpfs direcotry if it doesn't work at all by using a real tmpfs
> directory?
>
> --
> Thanks,
> Chengwei
>
