Posted to common-user@hadoop.apache.org by sandeep vura <sa...@gmail.com> on 2015/04/08 19:53:00 UTC

Unable to load file from local to HDFS cluster

Hi,

When loading a file from local to the HDFS cluster using the command below:

hadoop fs -put sales.txt /sales_dept.

I get the following exception. Please let me know how to resolve this issue
as soon as possible. Attached are the logs displayed on the namenode.
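
A minimal reproduction, with a basic reachability check first (a sketch only;
the default ports and the checks below are assumptions, not values taken from
the attached logs):

# which namenode URI the client uses (set in core-site.xml)
hdfs getconf -confKey fs.defaultFS
# a simple listing fails the same way if HDFS itself is unreachable
hadoop fs -ls /
# the failing copy: -put reads a local file and writes it into HDFS
hadoop fs -put sales.txt /sales_dept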

Regards,
Sandeep.v

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
It's not a port conflict. Our network team had changed settings in the core
switch of the VLAN.
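
To rule a port conflict in or out, a quick check along these lines helps (a
sketch; 8020, 50070, and 50010 are the assumed default namenode RPC, namenode
web UI, and datanode transfer ports):

# list listeners on the Hadoop ports; a non-Hadoop PID here means a conflict
netstat -tlnp | grep -E ':(8020|50070|50010)\s'
# the PIDs above should belong to the running Hadoop daemons
jps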

On Sun, Apr 12, 2015 at 8:26 AM, 杨浩 <ya...@gmail.com> wrote:

> Oh, I see. Had you configured a conflicting port before?
>
> 2015-04-09 18:36 GMT+08:00 sandeep vura <sa...@gmail.com>:
>
>> Hi Yanghaogn,
>>
>> Sure. We couldn't load the file from local to HDFS; it kept failing with a
>> DFSOutputStream connection refused exception, which means packets were not
>> being received properly between the namenode and the datanodes. Also, when
>> we started the cluster, the datanodes were not starting properly and were
>> getting connection closed exceptions.
>>
>> The Hadoop web UI was also opening very slowly, and ssh connections were
>> slow too. We finally changed our network ports, checked the cluster's
>> performance again, and it works well now.
>>
>> The fix was in the namenode server's network port.
>>
>> Regards,
>> Sandeep.v
>>
>>
>> On Thu, Apr 9, 2015 at 12:30 PM, 杨浩 <ya...@gmail.com> wrote:
>>
>>> Root cause: a network-related issue?
>>> Can you tell us about it in more detail? Thank you.
>>>
>>> 2015-04-09 13:51 GMT+08:00 sandeep vura <sa...@gmail.com>:
>>>
>>>> Our issue has been resolved.
>>>>
>>>> Root cause: Network related issue.
>>>>
>>>> Thanks to everyone who spent time and replied to my questions.
>>>>
>>>> Regards,
>>>> Sandeep.v
>>>>
>>>> On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura <sa...@gmail.com>
>>>> wrote:
>>>>
>>>>> Can anyone offer a solution for my issue?
>>>>>
>>>>> On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sa...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Exactly, but every time it picks one at random. Our datanodes are
>>>>>> 192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, and 192.168.2.85.
>>>>>>
>>>>>> Namenode: 192.168.2.80
>>>>>>
>>>>>> If I restart the cluster, the next time it will show 192.168.2.81:50010
>>>>>> connection closed.
>>>>>>
>>>>>> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <
>>>>>> Huat.Liaw@ontario.ca> wrote:
>>>>>>
>>>>>>>  You cannot start 192.168.2.84:50010… it was closed by (192.168.2.x
>>>>>>> - datanode).
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 2:39 PM
>>>>>>>
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> We have been using this setup for a very long time. We were able to run
>>>>>>> all the jobs successfully, but suddenly something went wrong with the
>>>>>>> namenode.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> I have also noticed another issue when starting the Hadoop cluster
>>>>>>> with the start-all.sh command.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> The namenode and datanode daemons start, but sometimes one of the
>>>>>>> datanodes drops its connection and the message connection closed by
>>>>>>> (192.168.2.x - datanode) appears. Every time the cluster is restarted,
>>>>>>> the affected datanode changes.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> For example: the first time I start the cluster, 192.168.2.1 shows
>>>>>>> connection closed.
>>>>>>>
>>>>>>> The second time, 192.168.2.2 shows connection closed, and this time
>>>>>>> 192.168.2.1 starts successfully without any errors.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I haven't been able to figure out the issue exactly. Does it relate to
>>>>>>> the network or to the Hadoop configuration?
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <
>>>>>>> Huat.Liaw@ontario.ca> wrote:
>>>>>>>
>>>>>>> hadoop fs -put <source> <destination> copies from the local
>>>>>>> filesystem to HDFS.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 2:24 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sandeep.V
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <
>>>>>>> Huat.Liaw@ontario.ca> wrote:
>>>>>>>
>>>>>>> Should be hadoop dfs -put
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 1:53 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> When loading a file from local to the HDFS cluster using the command
>>>>>>> below:
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> hadoop fs -put sales.txt /sales_dept.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I get the following exception. Please let me know how to resolve this
>>>>>>> issue as soon as possible. Attached are the logs displayed on the
>>>>>>> namenode.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sandeep.v
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Unable to load file from local to HDFS cluster

Posted by 杨浩 <ya...@gmail.com>.
Oh, I see. Had you configured a conflicting port before?

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
Hi Yanghaogn,

Sure. We couldn't load the file from local to HDFS; it kept failing with a
DFSOutputStream connection refused exception, which means packets were not
being received properly between the namenode and the datanodes. Also, when
we started the cluster, the datanodes were not starting properly and were
getting connection closed exceptions.

The Hadoop web UI was also opening very slowly, and ssh connections were
slow too. We finally changed our network ports, checked the cluster's
performance again, and it works well now.

The fix was in the namenode server's network port.
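
For anyone hitting the same symptoms, here is a sketch of checks that can
isolate this kind of network problem (addresses as in this thread; 50010 and
8020 are the assumed default datanode transfer and namenode RPC ports):

# from the namenode (192.168.2.80), probe each datanode's transfer port
for dn in 192.168.2.81 192.168.2.82 192.168.2.83 192.168.2.84 192.168.2.85; do
  nc -z -w 5 "$dn" 50010 && echo "$dn ok" || echo "$dn unreachable"
done
# from a datanode, probe the namenode's RPC port
nc -z -w 5 192.168.2.80 8020
# ask the namenode which datanodes it currently considers live
hdfs dfsadmin -report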

Regards,
Sandeep.v


Re: Unable to load file from local to HDFS cluster

Posted by 杨浩 <ya...@gmail.com>.
Root cause: a network-related issue? Can you tell us about it in more
detail? Thank you.

2015-04-09 13:51 GMT+08:00 sandeep vura <sa...@gmail.com>:

> Our issue has been resolved.
>
> Root cause: Network related issue.
>
> Thanks for each and everyone spent sometime and replied to my questions.
>
> Regards,
> Sandeep.v
>
> On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura <sa...@gmail.com>
> wrote:
>
>> Can anyone give solution for my issue?
>>
>> On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sa...@gmail.com>
>> wrote:
>>
>>> Exactly but every time it picks randomly. Our datanodes are
>>> 192.168.2.81,192.168.2.82,192.168.2.83,192.168.2.84,192.168.2.85
>>>
>>> Namenode  : 192.168.2.80
>>>
>>> If i restarts the cluster next time it will show 192.168.2.81:50010
>>> connection closed
>>>
>>> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>>> wrote:
>>>
>>>>  You can not start 192.168.2.84:50010…. closed by ((192.168.2.x
>>>> -datanode))
>>>>
>>>>
>>>>
>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>> *Sent:* April 8, 2015 2:39 PM
>>>>
>>>> *To:* user@hadoop.apache.org
>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>
>>>>
>>>>
>>>> We are using this setup from a very long time.We are able to run all
>>>> the jobs successfully but suddenly went wrong with namenode.
>>>>
>>>>
>>>>
>>>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>
>>>> wrote:
>>>>
>>>> I have also noticed another issue when starting hadoop cluster
>>>> start-all.sh command
>>>>
>>>>
>>>>
>>>> namenode and datanode daemons are starting.But sometimes one of the
>>>> datanode would drop the connection and it shows the message connection
>>>> closed by ((192.168.2.x -datanode)) everytime when it restart the hadoop
>>>> cluster datanode will keeps changing .
>>>>
>>>>
>>>>
>>>> for example 1st time when i starts hadoop cluster - 192.168.2.1 -
>>>> connection closed
>>>>
>>>> 2nd time when i starts hadoop cluster - 192.168.2.2-connection closed
>>>> .This point again 192.168.2.1 will starts successfuly without any errors.
>>>>
>>>>
>>>>
>>>> I couldn't able to figure out the issue exactly.Is issue relates to
>>>> network or Hadoop configuration.
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>>>> wrote:
>>>>
>>>> hadoop fs -put <source> <destination> Copy from remote location to HDFS
>>>>
>>>>
>>>>
>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>> *Sent:* April 8, 2015 2:24 PM
>>>> *To:* user@hadoop.apache.org
>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>
>>>>
>>>>
>>>> Sorry Liaw,I tried same command but its didn't resolve.
>>>>
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Sandeep.V
>>>>
>>>>
>>>>
>>>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>>>> wrote:
>>>>
>>>> Should be hadoop dfs -put
>>>>
>>>>
>>>>
>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>> *Sent:* April 8, 2015 1:53 PM
>>>> *To:* user@hadoop.apache.org
>>>> *Subject:* Unable to load file from local to HDFS cluster
>>>>
>>>>
>>>>
>>>> Hi,
>>>>
>>>>
>>>>
>>>> When loading a file from local to HDFS cluster using the below command
>>>>
>>>>
>>>>
>>>> hadoop fs -put sales.txt /sales_dept.
>>>>
>>>>
>>>>
>>>> Getting the following exception.Please let me know how to resolve this
>>>> issue asap.Please find the attached is the logs that is displaying on
>>>> namenode.
>>>>
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Sandeep.v
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
Our issue has been resolved.

Root cause: Network related issue.

Thanks to everyone who spent time on this and replied to my questions.

Regards,
Sandeep.v

On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura <sa...@gmail.com> wrote:

> Can anyone give solution for my issue?
>
> On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sa...@gmail.com>
> wrote:
>
>> Exactly but every time it picks randomly. Our datanodes are
>> 192.168.2.81,192.168.2.82,192.168.2.83,192.168.2.84,192.168.2.85
>>
>> Namenode  : 192.168.2.80
>>
>> If i restarts the cluster next time it will show 192.168.2.81:50010
>> connection closed
>>
>> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>> wrote:
>>
>>>  You can not start 192.168.2.84:50010…. closed by ((192.168.2.x
>>> -datanode))
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>> *Sent:* April 8, 2015 2:39 PM
>>>
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> We are using this setup from a very long time.We are able to run all the
>>> jobs successfully but suddenly went wrong with namenode.
>>>
>>>
>>>
>>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>
>>> wrote:
>>>
>>> I have also noticed another issue when starting hadoop cluster
>>> start-all.sh command
>>>
>>>
>>>
>>> namenode and datanode daemons are starting.But sometimes one of the
>>> datanode would drop the connection and it shows the message connection
>>> closed by ((192.168.2.x -datanode)) everytime when it restart the hadoop
>>> cluster datanode will keeps changing .
>>>
>>>
>>>
>>> for example 1st time when i starts hadoop cluster - 192.168.2.1 -
>>> connection closed
>>>
>>> 2nd time when i starts hadoop cluster - 192.168.2.2-connection closed
>>> .This point again 192.168.2.1 will starts successfuly without any errors.
>>>
>>>
>>>
>>> I couldn't able to figure out the issue exactly.Is issue relates to
>>> network or Hadoop configuration.
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>>> wrote:
>>>
>>> hadoop fs -put <source> <destination> Copy from remote location to HDFS
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>> *Sent:* April 8, 2015 2:24 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> Sorry Liaw,I tried same command but its didn't resolve.
>>>
>>>
>>>
>>> Regards,
>>>
>>> Sandeep.V
>>>
>>>
>>>
>>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>>> wrote:
>>>
>>> Should be hadoop dfs -put
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>> *Sent:* April 8, 2015 1:53 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> Hi,
>>>
>>>
>>>
>>> When loading a file from local to HDFS cluster using the below command
>>>
>>>
>>>
>>> hadoop fs -put sales.txt /sales_dept.
>>>
>>>
>>>
>>> Getting the following exception.Please let me know how to resolve this
>>> issue asap.Please find the attached is the logs that is displaying on
>>> namenode.
>>>
>>>
>>>
>>> Regards,
>>>
>>> Sandeep.v
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
Can anyone suggest a solution for my issue?

On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sa...@gmail.com> wrote:

> Exactly but every time it picks randomly. Our datanodes are
> 192.168.2.81,192.168.2.82,192.168.2.83,192.168.2.84,192.168.2.85
>
> Namenode  : 192.168.2.80
>
> If i restarts the cluster next time it will show 192.168.2.81:50010
> connection closed
>
> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Hu...@ontario.ca>
> wrote:
>
>>  You can not start 192.168.2.84:50010…. closed by ((192.168.2.x
>> -datanode))
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 2:39 PM
>>
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>>
>>
>> We are using this setup from a very long time.We are able to run all the
>> jobs successfully but suddenly went wrong with namenode.
>>
>>
>>
>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>
>> wrote:
>>
>> I have also noticed another issue when starting hadoop cluster
>> start-all.sh command
>>
>>
>>
>> namenode and datanode daemons are starting.But sometimes one of the
>> datanode would drop the connection and it shows the message connection
>> closed by ((192.168.2.x -datanode)) everytime when it restart the hadoop
>> cluster datanode will keeps changing .
>>
>>
>>
>> for example 1st time when i starts hadoop cluster - 192.168.2.1 -
>> connection closed
>>
>> 2nd time when i starts hadoop cluster - 192.168.2.2-connection closed
>> .This point again 192.168.2.1 will starts successfuly without any errors.
>>
>>
>>
>> I couldn't able to figure out the issue exactly.Is issue relates to
>> network or Hadoop configuration.
>>
>>
>>
>>
>>
>>
>>
>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>> wrote:
>>
>> hadoop fs -put <source> <destination> Copy from remote location to HDFS
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 2:24 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>>
>>
>> Sorry Liaw,I tried same command but its didn't resolve.
>>
>>
>>
>> Regards,
>>
>> Sandeep.V
>>
>>
>>
>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>> wrote:
>>
>> Should be hadoop dfs -put
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 1:53 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Unable to load file from local to HDFS cluster
>>
>>
>>
>> Hi,
>>
>>
>>
>> When loading a file from local to HDFS cluster using the below command
>>
>>
>>
>> hadoop fs -put sales.txt /sales_dept.
>>
>>
>>
>> Getting the following exception.Please let me know how to resolve this
>> issue asap.Please find the attached is the logs that is displaying on
>> namenode.
>>
>>
>>
>> Regards,
>>
>> Sandeep.v
>>
>>
>>
>>
>>
>>
>>
>
>

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
Exactly, but each time it picks a node at random. Our datanodes are
192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, and 192.168.2.85.

Namenode: 192.168.2.80

If I restart the cluster, the next time it will show 192.168.2.81:50010
connection closed.
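
To catch which node it is on a given start, each datanode's transfer port
can be probed right after start-all.sh finishes, for example (a rough
sketch, assuming a POSIX shell and netcat; the addresses are the ones
listed above):

for dn in 192.168.2.81 192.168.2.82 192.168.2.83 192.168.2.84 192.168.2.85; do
  # -w 2 gives up after two seconds so one dead node does not stall the loop
  nc -vz -w 2 "$dn" 50010 || echo "$dn: port 50010 not reachable"
done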

On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Hu...@ontario.ca>
wrote:

>  You can not start 192.168.2.84:50010…. closed by ((192.168.2.x
> -datanode))
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 2:39 PM
>
> *To:* user@hadoop.apache.org
> *Subject:* Re: Unable to load file from local to HDFS cluster
>
>
>
> We are using this setup from a very long time.We are able to run all the
> jobs successfully but suddenly went wrong with namenode.
>
>
>
> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>
> wrote:
>
> I have also noticed another issue when starting hadoop cluster
> start-all.sh command
>
>
>
> namenode and datanode daemons are starting.But sometimes one of the
> datanode would drop the connection and it shows the message connection
> closed by ((192.168.2.x -datanode)) everytime when it restart the hadoop
> cluster datanode will keeps changing .
>
>
>
> for example 1st time when i starts hadoop cluster - 192.168.2.1 -
> connection closed
>
> 2nd time when i starts hadoop cluster - 192.168.2.2-connection closed
> .This point again 192.168.2.1 will starts successfuly without any errors.
>
>
>
> I couldn't able to figure out the issue exactly.Is issue relates to
> network or Hadoop configuration.
>
>
>
>
>
>
>
> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
> wrote:
>
> hadoop fs -put <source> <destination> Copy from remote location to HDFS
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 2:24 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: Unable to load file from local to HDFS cluster
>
>
>
> Sorry Liaw,I tried same command but its didn't resolve.
>
>
>
> Regards,
>
> Sandeep.V
>
>
>
> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
> wrote:
>
> Should be hadoop dfs -put
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 1:53 PM
> *To:* user@hadoop.apache.org
> *Subject:* Unable to load file from local to HDFS cluster
>
>
>
> Hi,
>
>
>
> When loading a file from local to HDFS cluster using the below command
>
>
>
> hadoop fs -put sales.txt /sales_dept.
>
>
>
> Getting the following exception.Please let me know how to resolve this
> issue asap.Please find the attached is the logs that is displaying on
> namenode.
>
>
>
> Regards,
>
> Sandeep.v
>
>
>
>
>
>
>

RE: Unable to load file from local to HDFS cluster

Posted by "Liaw, Huat (MTO)" <Hu...@ontario.ca>.
You cannot start 192.168.2.84:50010; the connection is closed by the datanode (192.168.2.x).
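
On the node that fails, it is worth checking whether anything is actually
listening on that port, for example (a sketch, assuming a Linux datanode
with net-tools installed; run as root to see the owning process name):

# Is the datanode process bound to its transfer port on 192.168.2.84?
netstat -tlnp | grep 50010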

From: sandeep vura [mailto:sandeepvura@gmail.com]
Sent: April 8, 2015 2:39 PM
To: user@hadoop.apache.org
Subject: Re: Unable to load file from local to HDFS cluster

We are using this setup from a very long time.We are able to run all the jobs successfully but suddenly went wrong with namenode.

On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com>> wrote:
I have also noticed another issue when starting hadoop cluster start-all.sh command

namenode and datanode daemons are starting.But sometimes one of the datanode would drop the connection and it shows the message connection closed by ((192.168.2.x -datanode)) everytime when it restart the hadoop cluster datanode will keeps changing .

for example 1st time when i starts hadoop cluster - 192.168.2.1 - connection closed
2nd time when i starts hadoop cluster - 192.168.2.2-connection closed .This point again 192.168.2.1 will starts successfuly without any errors.

I couldn't able to figure out the issue exactly.Is issue relates to network or Hadoop configuration.



On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>> wrote:
hadoop fs -put <source> <destination> Copy from remote location to HDFS

From: sandeep vura [mailto:sandeepvura@gmail.com<ma...@gmail.com>]
Sent: April 8, 2015 2:24 PM
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: Re: Unable to load file from local to HDFS cluster

Sorry Liaw,I tried same command but its didn't resolve.

Regards,
Sandeep.V

On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>> wrote:
Should be hadoop dfs -put

From: sandeep vura [mailto:sandeepvura@gmail.com<ma...@gmail.com>]
Sent: April 8, 2015 1:53 PM
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: Unable to load file from local to HDFS cluster

Hi,

When loading a file from local to HDFS cluster using the below command

hadoop fs -put sales.txt /sales_dept.

Getting the following exception.Please let me know how to resolve this issue asap.Please find the attached is the logs that is displaying on namenode.

Regards,
Sandeep.v




Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
We have been using this setup for a very long time. We were able to run all
the jobs successfully, but something suddenly went wrong with the namenode.

On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sa...@gmail.com> wrote:

> I have also noticed another issue when starting the Hadoop cluster with
> the start-all.sh command.
>
> The namenode and datanode daemons start, but sometimes one of the
> datanodes drops the connection and shows the message "connection closed by
> (192.168.2.x - datanode)". Every time the Hadoop cluster is restarted, the
> datanode that fails keeps changing.
>
> For example, the first time I start the Hadoop cluster, 192.168.2.1
> reports "connection closed". The second time, 192.168.2.2 reports
> "connection closed", and at that point 192.168.2.1 starts successfully
> without any errors.
>
> I couldn't figure out the issue exactly. Is the issue related to the
> network or to the Hadoop configuration?
>
>
>
> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
> wrote:
>
>>  hadoop fs -put <source> <destination> copies a file from the local filesystem to HDFS
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 2:24 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>>
>>
>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>
>>
>>
>> Regards,
>>
>> Sandeep.V
>>
>>
>>
>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
>> wrote:
>>
>> Should be hadoop dfs -put
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 1:53 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Unable to load file from local to HDFS cluster
>>
>>
>>
>> Hi,
>>
>>
>>
>> When loading a file from the local filesystem to the HDFS cluster using the command below
>>
>>
>>
>> hadoop fs -put sales.txt /sales_dept.
>>
>>
>>
>> I am getting the following exception. Please let me know how to resolve
>> this issue as soon as possible. Attached are the logs shown on the
>> namenode.
>>
>>
>>
>> Regards,
>>
>> Sandeep.v
>>
>>
>>
>
>
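
When the namenode "suddenly goes wrong" as described above, a reasonable first check is whether the expected daemons are running at all. A minimal sketch, assuming jps from the JDK is on the PATH:

  # On the namenode host: is the NameNode JVM alive?
  jps | grep -i namenode

  # On each datanode host: is the DataNode JVM alive?
  jps | grep -i datanode

  # Live/dead datanode summary from the namenode's point of view
  hdfs dfsadmin -report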

Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
I have also noticed another issue when starting the Hadoop cluster with the
start-all.sh command.

The namenode and datanode daemons start, but sometimes one of the datanodes
drops the connection and shows the message "connection closed by
(192.168.2.x - datanode)". Every time the Hadoop cluster is restarted, the
datanode that fails keeps changing.

For example, the first time I start the Hadoop cluster, 192.168.2.1 reports
"connection closed". The second time, 192.168.2.2 reports "connection
closed", and at that point 192.168.2.1 starts successfully without any
errors.

I couldn't figure out the issue exactly. Is the issue related to the network
or to the Hadoop configuration?



On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
wrote:

>  hadoop fs -put <source> <destination> copies a file from the local filesystem to HDFS
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 2:24 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: Unable to load file from local to HDFS cluster
>
>
>
> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>
>
>
> Regards,
>
> Sandeep.V
>
>
>
> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
> wrote:
>
> Should be hadoop dfs -put
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 1:53 PM
> *To:* user@hadoop.apache.org
> *Subject:* Unable to load file from local to HDFS cluster
>
>
>
> Hi,
>
>
>
> When loading a file from the local filesystem to the HDFS cluster using the command below
>
>
>
> hadoop fs -put sales.txt /sales_dept.
>
>
>
> I am getting the following exception. Please let me know how to resolve
> this issue as soon as possible. Attached are the logs shown on the
> namenode.
>
>
>
> Regards,
>
> Sandeep.v
>
>
>
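
Because the failing datanode changes on every restart, the datanode's own log on the affected host is usually more telling than the console output. A sketch, assuming logs live under the default $HADOOP_HOME/logs directory (the real location depends on HADOOP_LOG_DIR):

  # Inspect the most recent datanode log on the host that failed
  tail -n 100 "$HADOOP_HOME"/logs/hadoop-*-datanode-*.log

  # Search it for the connection errors described above
  grep -i 'connection' "$HADOOP_HOME"/logs/hadoop-*-datanode-*.log | tail -n 20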

RE: Unable to load file from local to HDFS cluster

Posted by "Liaw, Huat (MTO)" <Hu...@ontario.ca>.
hadoop fs -put <source> <destination> copies a file from the local filesystem to HDFS

From: sandeep vura [mailto:sandeepvura@gmail.com]
Sent: April 8, 2015 2:24 PM
To: user@hadoop.apache.org
Subject: Re: Unable to load file from local to HDFS cluster

Sorry Liaw, I tried the same command but it didn't resolve the issue.

Regards,
Sandeep.V

On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca> wrote:
Should be hadoop dfs -put

From: sandeep vura [mailto:sandeepvura@gmail.com]
Sent: April 8, 2015 1:53 PM
To: user@hadoop.apache.org
Subject: Unable to load file from local to HDFS cluster

Hi,

When loading a file from the local filesystem to the HDFS cluster using the command below

hadoop fs -put sales.txt /sales_dept.

I am getting the following exception. Please let me know how to resolve this issue as soon as possible. Attached are the logs shown on the namenode.

Regards,
Sandeep.v
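
For completeness, the whole sequence around the disputed command might look like the sketch below. The /sales_dept path comes from the thread; -mkdir -p assumes Hadoop 2.x. If the put still fails with the same exception after this, command syntax is unlikely to be the problem.

  # Make sure the target directory exists, then copy and verify
  hdfs dfs -mkdir -p /sales_dept
  hdfs dfs -put sales.txt /sales_dept/
  hdfs dfs -ls /sales_dept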


Re: Unable to load file from local to HDFS cluster

Posted by sandeep vura <sa...@gmail.com>.
Sorry Liaw, I tried the same command but it didn't resolve the issue.

Regards,
Sandeep.V

On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Hu...@ontario.ca>
wrote:

>  Should be hadoop dfs -put
>
>
>
> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
> *Sent:* April 8, 2015 1:53 PM
> *To:* user@hadoop.apache.org
> *Subject:* Unable to load file from local to HDFS cluster
>
>
>
> Hi,
>
>
>
> When loading a file from the local filesystem to the HDFS cluster using the command below
>
>
>
> hadoop fs -put sales.txt /sales_dept.
>
>
>
> I am getting the following exception. Please let me know how to resolve
> this issue as soon as possible. Attached are the logs shown on the
> namenode.
>
>
>
> Regards,
>
> Sandeep.v
>

RE: Unable to load file from local to HDFS cluster

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
I would say hdfs dfs …, as in the examples below, run from a remote host:

 

hdfs dfs -mkdir hdfs://rhes564:9000/some_directory

hdfs dfs -put hadoop-hduser-datanode-rhes5.log hdfs://rhes564:9000/some_directory

hduser@rhes5::/home/hduser/hadoop/hadoop-2.6.0/logs> hdfs dfs -ls hdfs://rhes564:9000/some_directory

Found 1 items

-rw-r--r--   2 hduser supergroup    1274532 2015-04-08 19:25 hdfs://rhes564:9000/some_directory/hadoop-hduser-datanode-rhes5.log

 

HTH

 

Mich Talebzadeh

 

http://talebzadehmich.wordpress.com

 

Publications due shortly:

Creating in-memory Data Grid for Trading Systems with Oracle TimesTen and Coherence Cache

 

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

 

From: Liaw, Huat (MTO) [mailto:Huat.Liaw@ontario.ca] 
Sent: 08 April 2015 19:08
To: user@hadoop.apache.org
Subject: RE: Unable to load file from local to HDFS cluster

 

Should be hadoop dfs -put

 

From: sandeep vura [mailto:sandeepvura@gmail.com] 
Sent: April 8, 2015 1:53 PM
To: user@hadoop.apache.org
Subject: Unable to load file from local to HDFS cluster

 

Hi,

 

When loading a file from the local filesystem to the HDFS cluster using the command below

 

hadoop fs -put sales.txt /sales_dept.

 

I am getting the following exception. Please let me know how to resolve this issue as soon as possible. Attached are the logs shown on the namenode.

 

Regards,

Sandeep.v
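
Continuing the examples above, the upload can be verified end to end. This sketch reuses the rhes564:9000 URI and the log file name from the examples; both are placeholders for whatever the cluster actually uses.

  # List the uploaded file, then sample its first lines
  hdfs dfs -ls hdfs://rhes564:9000/some_directory
  hdfs dfs -cat hdfs://rhes564:9000/some_directory/hadoop-hduser-datanode-rhes5.log | head -n 5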


RE: Unable to load file from local to HDFS cluster

Posted by "Liaw, Huat (MTO)" <Hu...@ontario.ca>.
Should be hadoop dfs -put

From: sandeep vura [mailto:sandeepvura@gmail.com]
Sent: April 8, 2015 1:53 PM
To: user@hadoop.apache.org
Subject: Unable to load file from local to HDFS cluster

Hi,

When loading a file from the local filesystem to the HDFS cluster using the command below

hadoop fs -put sales.txt /sales_dept.

I am getting the following exception. Please let me know how to resolve this issue as soon as possible. Attached are the logs shown on the namenode.

Regards,
Sandeep.v
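
As an aside, the thread mixes three spellings of the filesystem shell. To the best of my knowledge they behave the same when the default filesystem is HDFS, with hadoop dfs being the deprecated form:

  hadoop fs -put sales.txt /sales_dept     # generic FileSystem shell: HDFS, local, S3, ...
  hadoop dfs -put sales.txt /sales_dept    # HDFS-specific form, deprecated in Hadoop 2.x
  hdfs dfs -put sales.txt /sales_dept      # the replacement for 'hadoop dfs'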
