Posted to common-user@hadoop.apache.org by Geelong Yao <ge...@gmail.com> on 2013/04/23 16:25:42 UTC

Need help for hadoop

Hi Everyone,


I have run into two issues while changing my Hadoop configuration.

1. When I execute start-all.sh, it shows some errors, but they do not seem to
affect the cluster:

[image: inline image 1]

Does anyone know what happened? Am I missing some jars?

2. I need to change the data.dfs.dir setting in hdfs-site.xml to add
another volume (mounted on /sda) to my disks:
 <property>
        <name>data.dfs.dir</name>
        <value>/usr/hadoop/tmp/dfs/data,/sda</value>
    </property>
I found that I cannot start DFS; when I check the datanodes, there is no
datanode process running.

Does anyone know how to make the change take effect after I edit
hdfs-site.xml on the datanodes? Details would be much appreciated.


BRs
Geelong

-- 
From Good To Great

Re: Need help for hadoop

Posted by sudhakara st <su...@gmail.com>.
Hi,

*data.dfs.dir* in hdfs-site.xml determines where on the local filesystem a
DFS datanode should store its blocks. These are the local directories where
HDFS data is written. The value is a comma-delimited list of directories,
and data will be stored in all of the named directories, typically on
different mounts. Directories that do not exist are ignored.
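For reference, a sketch of the intended hdfs-site.xml entry. (As an aside,
the stock property name in Hadoop 1.x is dfs.data.dir, not data.dfs.dir,
and it was renamed dfs.datanode.data.dir in Hadoop 2.x; a misspelled name
is silently ignored, so it is worth checking the spelling against your
hdfs-default.xml. The /sda/data subdirectory below is an assumption, not
something from your mail.)

```xml
<!-- hdfs-site.xml on each DataNode -->
<property>
  <!-- Hadoop 1.x name; dfs.datanode.data.dir on Hadoop 2.x and later -->
  <name>dfs.data.dir</name>
  <!-- Comma-delimited list; blocks are spread across all listed dirs -->
  <value>/usr/hadoop/tmp/dfs/data,/sda/data</value>
</property>
```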
There is no need to format the Hadoop filesystem after editing the
directory list; just restart the datanode daemon (or all the daemons).

*hadoop.tmp.dir* in core-site.xml: the local directory where HDFS
operations store intermediate data files.
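To pick up the new directory list, a restart along these lines should work
on a Hadoop 1.x layout (the $HADOOP_HOME path is an assumption; adjust to
your install):

```shell
# On each DataNode, bounce only the DataNode daemon so the new
# directory list is picked up. Do NOT run "hadoop namenode -format":
# that recreates the namespace and orphans all existing blocks.
$HADOOP_HOME/bin/hadoop-daemon.sh stop datanode
$HADOOP_HOME/bin/hadoop-daemon.sh start datanode

# Confirm the daemon came back:
jps | grep DataNode
```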



On Tue, Apr 23, 2013 at 9:14 PM, Geelong Yao <ge...@gmail.com> wrote:

> Thank for your reply.
> 1.For the first issue, I think this is mainly caused by missing jar in
> hadoop lib.
> I am confused about the data.dfs.dir with more than one values:
> /usr/hadoop/tmp/dfs/data, /sda
>
> 2.when we running job,which one should be firstly used ? or both used for
> temp file?
> Besides,after changing the values of hdfs-site.xml in datanodes, how can I
> immediately make this change effective? should I use start-all.sh or hadoop
> namenode -format?
>
>
> BRs
> Geelong
>
>
>
>
> 2013/4/23 shashwat shriparv <dw...@gmail.com>
>
>> 1: check for sl4j jar file, set HADOOP_CLASSPATH
>> 2. check permission on /sda
>> better make a directory in /sda/data or somethig like that and then try
>>
>> *Thanks & Regards    *
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Tue, Apr 23, 2013 at 7:55 PM, Geelong Yao <ge...@gmail.com>wrote:
>>
>>> Hi  Everyone
>>>
>>>
>>> I have met two issues when changing the configuration for hadoop
>>>
>>> 1.When I execute the start-all.sh, it shows some errors but it won't
>>> affect the cluster
>>>
>>> [image: inline image 1]
>>>
>>> Does anyone know what happened?missing some jars ?
>>>
>>> 2.When I need to change the setting of data.dfs.dir in hdfs-site.xml to
>>> add aonther volume (mounted on /sda)of my disk
>>>  <property>
>>>         <name>data.dfs.dir</name>
>>>         <value>/usr/hadoop/tmp/dfs/data,/sda</value>
>>>     </property>
>>> I fonud I can't start dfs, when I checking the datanode, there is no
>>> datanode running
>>>
>>> Could anyone know how to make the change works after I change the
>>> hdfs-site.xml on datanode,the details would be better
>>>
>>>
>>> BRs
>>> Geelong
>>>
>>> --
>>> From Good To Great
>>>
>>
>>
>
>
> --
> From Good To Great
>



-- 

Regards,
.....  Sudhakara.st

Re: Need help for hadoop

Posted by Geelong Yao <ge...@gmail.com>.
Thanks for your reply.
1. For the first issue, I think it is mainly caused by a missing jar in the
Hadoop lib directory.
I am still confused about data.dfs.dir having more than one value:
/usr/hadoop/tmp/dfs/data, /sda

2. When we run a job, which one is used first? Or are both used for temp
files?
Besides, after changing the values in hdfs-site.xml on the datanodes, how
can I make this change take effect immediately? Should I use start-all.sh
or hadoop namenode -format?


BRs
Geelong




2013/4/23 shashwat shriparv <dw...@gmail.com>

> 1: check for sl4j jar file, set HADOOP_CLASSPATH
> 2. check permission on /sda
> better make a directory in /sda/data or somethig like that and then try
>
> *Thanks & Regards    *
>
> ∞
> Shashwat Shriparv
>
>
>
> On Tue, Apr 23, 2013 at 7:55 PM, Geelong Yao <ge...@gmail.com> wrote:
>
>> Hi  Everyone
>>
>>
>> I have met two issues when changing the configuration for hadoop
>>
>> 1.When I execute the start-all.sh, it shows some errors but it won't
>> affect the cluster
>>
>> [image: inline image 1]
>>
>> Does anyone know what happened?missing some jars ?
>>
>> 2.When I need to change the setting of data.dfs.dir in hdfs-site.xml to
>> add aonther volume (mounted on /sda)of my disk
>>  <property>
>>         <name>data.dfs.dir</name>
>>         <value>/usr/hadoop/tmp/dfs/data,/sda</value>
>>     </property>
>> I fonud I can't start dfs, when I checking the datanode, there is no
>> datanode running
>>
>> Could anyone know how to make the change works after I change the
>> hdfs-site.xml on datanode,the details would be better
>>
>>
>> BRs
>> Geelong
>>
>> --
>> From Good To Great
>>
>
>


-- 
From Good To Great

Re: Need help for hadoop

Posted by shashwat shriparv <dw...@gmail.com>.
1. Check for the slf4j jar file and set HADOOP_CLASSPATH accordingly.
2. Check the permissions on /sda.
Better still, make a directory such as /sda/data and then try again.
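Point 2 can be sketched as a small shell helper (the /sda mount and the
data subdirectory name are assumptions from this thread; adjust them, and
run as the user that owns the DataNode process):

```shell
# Prepare a DataNode storage directory on a newly mounted volume.
# Using a subdirectory (e.g. /sda/data) rather than the mount point itself
# avoids two common failures: the mount root being owned by root, and
# lost+found confusing the DataNode.
prepare_dn_dir() {
    mount_point="$1"
    mkdir -p "$mount_point/data" || return 1
    # Match the permissions the DataNode expects (755 by default).
    chmod 755 "$mount_point/data"
    printf '%s\n' "$mount_point/data"
}

# Usage:
#   prepare_dn_dir /sda
# then add /sda/data to the directory list in hdfs-site.xml and restart
# the DataNode daemon.
```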

*Thanks & Regards*

∞
Shashwat Shriparv



On Tue, Apr 23, 2013 at 7:55 PM, Geelong Yao <ge...@gmail.com> wrote:

> Hi  Everyone
>
>
> I have met two issues when changing the configuration for hadoop
>
> 1.When I execute the start-all.sh, it shows some errors but it won't
> affect the cluster
>
> [image: inline image 1]
>
> Does anyone know what happened?missing some jars ?
>
> 2.When I need to change the setting of data.dfs.dir in hdfs-site.xml to
> add aonther volume (mounted on /sda)of my disk
>  <property>
>         <name>data.dfs.dir</name>
>         <value>/usr/hadoop/tmp/dfs/data,/sda</value>
>     </property>
> I fonud I can't start dfs, when I checking the datanode, there is no
> datanode running
>
> Could anyone know how to make the change works after I change the
> hdfs-site.xml on datanode,the details would be better
>
>
> BRs
> Geelong
>
> --
> From Good To Great
>
