Posted to common-user@hadoop.apache.org by Vinodh Nagaraj <vi...@gmail.com> on 2016/03/04 13:59:02 UTC

Error While copying file from local to dfs

Hi All,

I am a newbie to Hadoop.

I installed Hadoop 2.7.1 on a 32-bit Windows 7 machine for learning purposes.

I can execute start-all.cmd successfully.

When I execute jps, I get the output below.
28544 NameNode
35728
36308 DataNode
43828 Jps
40688 NodeManager
33820 ResourceManager

My configuration files are:

core-site.xml
---------------------
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.219.149.100:50075/</value>
    <description>NameNode URI</description>
  </property>
</configuration>



hdfs-site.xml
---------------------
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>D:\Hadoop_TEST\Hadoop\Data</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>D:\Hadoop_TEST\Hadoop\Secondary</value>
  </property>
  <property>
    <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
    <value>false</value>
  </property>
</configuration>

I tried to copy a text file from my local drive to the HDFS file system, but
I got the error below.

D:\Hadoop_TEST\Hadoop\ts>hadoop fs -copyFromLocal 4300.txt hdfs://10.219.149.100:50010/a.txt
copyFromLocal: End of File Exception between local host is:
"PC205172/10.219.149.100"; destination host is: "PC205172.cts.com":50010; :
java.io.EOFException; For more details see:
http://wiki.apache.org/hadoop/EOFException


Please share your suggestions.

How can I verify whether I have installed Hadoop properly?
How can I find the DataNode location, DataNode port, and related details with
an hdfs or hadoop command?
How can I find the NameNode location, NameNode port, and its configuration
details (such as the replication factor) with an hdfs or hadoop command?
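
For reference, the stock hdfs utilities below report most of this, assuming
the Hadoop bin directory is on the PATH:

hadoop version                         # confirm the installed build
hdfs getconf -namenodes                # NameNode host(s)
hdfs getconf -confKey fs.defaultFS     # NameNode URI (host and RPC port)
hdfs getconf -confKey dfs.replication  # configured replication factor
hdfs dfsadmin -report                  # live DataNodes with their addresses and ports
hdfs fsck /                            # overall file system health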

Thanks & Regards,
Vinodh.N

Re: Error While copying file from local to dfs

Posted by Naresh Jangra <ha...@gmail.com>.
Hi Vinodh,

After looking at your configs, here are my answers:

1. Use a different port in the fs.defaultFS property: 50075 is the default
port for the DataNode's web UI. The port in this property is used for the
NameNode's metadata (RPC) traffic, which is 8020 by default. You can use 8020
or any other port not already in use; the link below lists Hadoop's default
port assignments (see also the example after point 3):

https://ambari.apache.org/1.2.3/installing-hadoop-using-ambari/content/reference_chap2_1.html

2. You have set the replication factor to 2 but have ONLY ONE DataNode;
change it to 1.

3. Lastly, when you run the copyFromLocal command, you can do it like this:

hadoop fs -copyFromLocal filename hdfs_path_name

Or if you want to use the namenode address and port -

hadoop fs -copyFromLocal 4300.txt hdfs://10.219.149.100:<PORT>/a.txt

Note that the port must be the same one you used in fs.defaultFS.
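
Putting points 1 and 2 together, the corrected configs would look something
like this (8020 is shown as one common choice; any free port works as long as
the client URI matches it):

core-site.xml
---------------------
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.219.149.100:8020/</value>
    <description>NameNode URI</description>
  </property>
</configuration>

hdfs-site.xml (replication entry only)
---------------------
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

After restarting HDFS, the copy becomes:

hadoop fs -copyFromLocal 4300.txt hdfs://10.219.149.100:8020/a.txt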

Let me know if it helped.

Regards,
Naresh Jangra

Re: Error While copying file from local to dfs

Posted by Mallanagouda Patil <ma...@gmail.com>.
Hi Vinodh,

Can you try this:
1. In core-site.xml, set fs.defaultFS to hdfs://localhost (with no port
given, the client uses the default NameNode RPC port, 8020).
2. Restart Hadoop: stop-dfs.sh and start-dfs.sh (stop-dfs.cmd and
start-dfs.cmd on Windows).
3. Try this command:
hadoop fs -copyFromLocal sourcefile /
It copies the source file to the HDFS root.
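
A minimal core-site.xml for this single-node setup would look like:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost/</value>
  </property>
</configuration>

and the copy, using the same file as an example:

hadoop fs -copyFromLocal 4300.txt /
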
I hope it helps.

Thanks
Mallan

Re: Error While copying file from local to dfs

Posted by Vinodh Nagaraj <vi...@gmail.com>.
Hi All,

Please help me.

Thanks & Regards,
Vinodh.N
