Posted to common-user@hadoop.apache.org by Rajshekar <ra...@excelindia.com> on 2009/02/05 07:01:38 UTC

Not able to copy a file to HDFS after installing

Hello,
I am new to Hadoop and I just installed it on Ubuntu 8.04 LTS, following the
guidance of a web site. I tested it and found it working fine, but when I
tried to copy a file it gave the following error. Please help me out.

hadoop@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1$  bin/hadoop jar
hadoop-0.17.2.1-examples.jar wordcount /home/hadoop/Download\ URLs.txt
download-output
09/02/02 11:18:59 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 1 time(s).
09/02/02 11:19:00 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 2 time(s).
09/02/02 11:19:01 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 3 time(s).
09/02/02 11:19:02 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 4 time(s).
09/02/02 11:19:04 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 5 time(s).
09/02/02 11:19:05 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 6 time(s).
09/02/02 11:19:06 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 7 time(s).
09/02/02 11:19:07 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 8 time(s).
09/02/02 11:19:08 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 9 time(s).
09/02/02 11:19:09 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:9000. Already tried 10 time(s).
java.lang.RuntimeException: java.net.ConnectException: Connection refused
at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:356)
at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:331)
at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:304)
at org.apache.hadoop.examples.WordCount.run(WordCount.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:155)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:6
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:11
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:174)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:623)
at org.apache.hadoop.ipc.Client.call(Client.java:546)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
at org.apache.hadoop.dfs.$Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:313)
at org.apache.hadoop.dfs.DFSClient.createRPCNamenode(DFSClient.java:102)
at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:17
at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:6
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:10
at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:352)
-- 
View this message in context: http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21845768.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.
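[Editorial note: the repeated "Retrying connect to server: localhost/127.0.0.1:9000 ... Connection refused" lines above mean that nothing was listening on port 9000, i.e. the NameNode daemon was not running when the job was submitted. A minimal bash-only probe for that condition (it assumes nothing about the Hadoop layout; the function name is made up):]

```shell
#!/usr/bin/env bash
# Probe a TCP port using bash's built-in /dev/tcp pseudo-device.
# Prints "open" if something accepts the connection, "closed" otherwise.
check_port() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null && echo open || echo closed
}

check_port localhost 9000
```

If this prints "closed", start the daemons first (e.g. bin/start-all.sh from the Hadoop install directory) and check the NameNode log before retrying the job.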


Re: Not able to copy a file to HDFS after installing

Posted by Rasit OZDAS <ra...@gmail.com>.
Rajshekar,
I also have some earlier threads about this ;)

http://mail-archives.apache.org/mod_mbox/hadoop-core-user/200803.mbox/%3CPine.LNX.4.64.0803132200480.5549@localhost.localdomain%3E
http://www.mail-archive.com/hadoop-dev@lucene.apache.org/msg03226.html

Please try the following:

- Give a local file path for the jar.
- Give an absolute path, not one relative to hadoop/bin.
- Make sure the HADOOP_HOME environment variable is set correctly.

Hope this helps,
Rasit
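[Editorial note: the three points above could be put together like this; the install path is hypothetical, and readlink -f is used only to turn a relative jar path into an absolute one:]

```shell
#!/usr/bin/env bash
# Point HADOOP_HOME at the install root (hypothetical path; adjust it).
export HADOOP_HOME=/usr/local/hadoop/hadoop-0.17.2.1

# Resolve the examples jar to an absolute local file path instead of a
# path relative to hadoop/bin.
JAR="$(readlink -f "$HADOOP_HOME/hadoop-0.17.2.1-examples.jar")"
echo "$JAR"

# Then submit the job with the absolute paths:
#   "$HADOOP_HOME"/bin/hadoop jar "$JAR" wordcount <input> <output>
```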

2009/2/6 Rajshekar <ra...@excelindia.com>:
>
> Hi,
> Thanks Rasit,
>
> Since yesterday evening I have been able to start the NameNode; I made a few
> changes in hadoop-site.xml and it is working now. But there is a new problem:
> I am not able to run map/reduce jobs using .jar files. It gives the following
> error:
>
> hadoop@excel-desktop:/usr/local/hadoop$ bin/hadoop jar
> hadoop-0.19.0-examples.jar wordcount gutenberg gutenberg-output
> java.io.IOException: Error opening job jar: hadoop-0.19.0-examples.jar
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
>        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
> Caused by: java.util.zip.ZipException: error in opening zip file
>        at java.util.zip.ZipFile.open(Native Method)
>        at java.util.zip.ZipFile.<init>(ZipFile.java:131)
>        at java.util.jar.JarFile.<init>(JarFile.java:150)
>        at java.util.jar.JarFile.<init>(JarFile.java:87)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
>        ... 4 more
>
> Please help me out.
>
>
>
> Rasit OZDAS wrote:
>>
>> Rajshekar,
>> It seems that your namenode isn't able to load its FsImage file.
>>
>> Here is a thread about a similar issue:
>> http://www.nabble.com/Hadoop-0.17.1-%3D%3E-EOFException-reading-FSEdits-file,-what-causes-this---how-to-prevent--td21440922.html
>>
>> Rasit
>>
>> 2009/2/5 Rajshekar <ra...@excelindia.com>:
>>>
>>> The NameNode is localhost, with an IP address. Now, when I run
>>> bin/hadoop namenode, I get the following error:
>>>
>>> root@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1# bin/hadoop namenode
>>> 09/02/05 13:27:43 INFO dfs.NameNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = excel-desktop/127.0.1.1
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 0.17.2.1
>>> STARTUP_MSG:   build =
>>> https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17 -r
>>> 684969;
>>> compiled by 'oom' on Wed Aug 20 22:29:32 UTC 2008
>>> ************************************************************/
>>> 09/02/05 13:27:43 INFO metrics.RpcMetrics: Initializing RPC Metrics with
>>> hostName=NameNode, port=9000
>>> 09/02/05 13:27:43 INFO dfs.NameNode: Namenode up at:
>>> localhost/127.0.0.1:9000
>>> 09/02/05 13:27:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with
>>> processName=NameNode, sessionId=null
>>> 09/02/05 13:27:43 INFO dfs.NameNodeMetrics: Initializing NameNodeMeterics
>>> using context object:org.apache.hadoop.metrics.spi.NullContext
>>> 09/02/05 13:27:43 INFO fs.FSNamesystem: fsOwner=root,root
>>> 09/02/05 13:27:43 INFO fs.FSNamesystem: supergroup=supergroup
>>> 09/02/05 13:27:43 INFO fs.FSNamesystem: isPermissionEnabled=true
>>> 09/02/05 13:27:44 INFO ipc.Server: Stopping server on 9000
>>> 09/02/05 13:27:44 ERROR dfs.NameNode: java.io.EOFException
>>>        at java.io.RandomAccessFile.readInt(RandomAccessFile.java:776)
>>>        at org.apache.hadoop.dfs.FSImage.isConversionNeeded(FSImage.java:488)
>>>        at org.apache.hadoop.dfs.Storage$StorageDirectory.analyzeStorage(Storage.java:283)
>>>        at org.apache.hadoop.dfs.FSImage.recoverTransitionRead(FSImage.java:149)
>>>        at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:80)
>>>        at org.apache.hadoop.dfs.FSNamesystem.initialize(FSNamesystem.java:274)
>>>        at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:255)
>>>        at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:133)
>>>        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:178)
>>>        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:164)
>>>        at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:846)
>>>        at org.apache.hadoop.dfs.NameNode.main(NameNode.java:855)
>>>
>>> 09/02/05 13:27:44 INFO dfs.NameNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at excel-desktop/127.0.1.1
>>> ************************************************************/
>>>  Rajshekar
>>>
>>>
>>>
>>>
>>>
>>> Sagar Naik-3 wrote:
>>>>
>>>>
>>>> where is the namenode running ? localhost or some other host
>>>>
>>>> -Sagar
>>>> Rajshekar wrote:
>>>>> [...]
>>>>
>>>>
>>>
>>> --
>>> View this message in context:
>>> http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21846923.html
>>> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>>>
>>>
>>
>>
>>
>> --
>> M. Raşit ÖZDAŞ
>>
>>
>
> --
> View this message in context: http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21867199.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>



-- 
M. Raşit ÖZDAŞ

Re: Not able to copy a file to HDFS after installing

Posted by Rajshekar <ra...@excelindia.com>.
Hi,
Thanks Rasit,

Since yesterday evening I have been able to start the NameNode; I made a few
changes in hadoop-site.xml and it is working now. But there is a new problem:
I am not able to run map/reduce jobs using .jar files. It gives the following
error:

hadoop@excel-desktop:/usr/local/hadoop$ bin/hadoop jar
hadoop-0.19.0-examples.jar wordcount gutenberg gutenberg-output
java.io.IOException: Error opening job jar: hadoop-0.19.0-examples.jar
        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
Caused by: java.util.zip.ZipException: error in opening zip file
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:131)
        at java.util.jar.JarFile.<init>(JarFile.java:150)
        at java.util.jar.JarFile.<init>(JarFile.java:87)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
        ... 4 more

Please help me out.
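[Editorial note: two things seem worth checking here, both guesses from the log rather than certainties. The command is run from /usr/local/hadoop but the jar is named for 0.19.0 while the install earlier in the thread was 0.17.2.1, so the jar may simply not exist at that relative path; and if it does exist, the file may be truncated or corrupt. A cheap validity check (the helper name is made up; jar/zip files begin with the two-byte signature "PK"):]

```shell
#!/usr/bin/env bash
# A ZipException from RunJar usually means the path is wrong relative to
# the current directory, or the file is not a real jar. is_jar checks for
# the "PK" signature that every jar/zip file starts with.
is_jar() {
  [ -f "$1" ] && [ "$(head -c 2 "$1" 2>/dev/null)" = "PK" ] \
    && echo valid || echo invalid
}

is_jar hadoop-0.19.0-examples.jar   # run from the directory holding the jar
```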



Rasit OZDAS wrote:
> 
> Rajshekar,
> It seems that your namenode isn't able to load its FsImage file.
> 
> Here is a thread about a similar issue:
> http://www.nabble.com/Hadoop-0.17.1-%3D%3E-EOFException-reading-FSEdits-file,-what-causes-this---how-to-prevent--td21440922.html
> 
> Rasit
> 
> 2009/2/5 Rajshekar <ra...@excelindia.com>:
>>
>> [...]
> 
> 
> 
> -- 
> M. Raşit ÖZDAŞ
> 
> 

-- 
View this message in context: http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21867199.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.


Re: Not able to copy a file to HDFS after installing

Posted by Rasit OZDAS <ra...@gmail.com>.
Rajshekar,
It seems that your namenode isn't able to load its FsImage file.

Here is a thread about a similar issue:
http://www.nabble.com/Hadoop-0.17.1-%3D%3E-EOFException-reading-FSEdits-file,-what-causes-this---how-to-prevent--td21440922.html

Rasit
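[Editorial note: the EOFException shown earlier in the thread is thrown while reading the fsimage file in dfs.name.dir, and on a fresh single-node install it typically means that file is empty or truncated, e.g. after an unclean shutdown or a half-finished format. A small sketch for spotting that case; the path is the 0.17-era single-node default and is an assumption, so check hadoop-site.xml. On an empty cluster the usual remedy is bin/hadoop namenode -format, which erases all HDFS metadata:]

```shell
#!/usr/bin/env bash
# Flag an empty or missing fsimage, the typical cause of the EOFException
# during NameNode startup. The path below is the default dfs.name.dir
# layout for Hadoop 0.17 on a single-node setup (an assumption).
check_image() {
  if [ -s "$1" ]; then
    echo "ok: $1"
  else
    echo "empty or missing: $1"
  fi
}

check_image /tmp/hadoop-hadoop/dfs/name/image/fsimage
```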

2009/2/5 Rajshekar <ra...@excelindia.com>:
>
> [...]



-- 
M. Raşit ÖZDAŞ

Re: Not able to copy a file to HDFS after installing

Posted by Rajshekar <ra...@excelindia.com>.
The name node is localhost, with an IP address. Now, when I run
bin/hadoop namenode, I am getting this error:

root@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1# bin/hadoop namenode
09/02/05 13:27:43 INFO dfs.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = excel-desktop/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.17.2.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17 -r 684969;
compiled by 'oom' on Wed Aug 20 22:29:32 UTC 2008
************************************************************/
09/02/05 13:27:43 INFO metrics.RpcMetrics: Initializing RPC Metrics with
hostName=NameNode, port=9000
09/02/05 13:27:43 INFO dfs.NameNode: Namenode up at:
localhost/127.0.0.1:9000
09/02/05 13:27:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=NameNode, sessionId=null
09/02/05 13:27:43 INFO dfs.NameNodeMetrics: Initializing NameNodeMeterics
using context object:org.apache.hadoop.metrics.spi.NullContext
09/02/05 13:27:43 INFO fs.FSNamesystem: fsOwner=root,root
09/02/05 13:27:43 INFO fs.FSNamesystem: supergroup=supergroup
09/02/05 13:27:43 INFO fs.FSNamesystem: isPermissionEnabled=true
09/02/05 13:27:44 INFO ipc.Server: Stopping server on 9000
09/02/05 13:27:44 ERROR dfs.NameNode: java.io.EOFException
        at java.io.RandomAccessFile.readInt(RandomAccessFile.java:776)
        at
org.apache.hadoop.dfs.FSImage.isConversionNeeded(FSImage.java:488)
        at
org.apache.hadoop.dfs.Storage$StorageDirectory.analyzeStorage(Storage.java:283)
        at
org.apache.hadoop.dfs.FSImage.recoverTransitionRead(FSImage.java:149)
        at
org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:80)
        at
org.apache.hadoop.dfs.FSNamesystem.initialize(FSNamesystem.java:274)
        at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:255)
        at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:133)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:178)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:164)
        at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:846)
        at org.apache.hadoop.dfs.NameNode.main(NameNode.java:855)

09/02/05 13:27:44 INFO dfs.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at excel-desktop/127.0.1.1
************************************************************/
 Rajshekar
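The EOFException above is thrown by RandomAccessFile.readInt() inside FSImage.isConversionNeeded, which means the namenode's stored image file ended before the 4 bytes that readInt needs, i.e. it is empty or truncated. A quick way to spot such a file (illustrative Python, not a Hadoop tool; the commented path is a hypothetical dfs.name.dir location, adjust to your configuration):

```python
import os

def looks_truncated(path, min_bytes=4):
    """True if the file is too short for RandomAccessFile.readInt(),
    which is exactly the condition that produces the EOFException
    in the NameNode startup log above."""
    return os.path.getsize(path) < min_bytes

# Hypothetical location -- adjust to your dfs.name.dir setting:
# looks_truncated("/tmp/hadoop-root/dfs/name/image/fsimage")
```

If the image file really is truncated on a fresh test install, reformatting the namenode is the usual (data-destroying) remedy.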





Sagar Naik-3 wrote:
> 
> 
> where is the namenode running ? localhost or some other host
> 
> -Sagar
> Rajshekar wrote:
>> Hello, 
>> I am new to Hadoop and I jus installed on Ubuntu 8.0.4 LTS as per
>> guidance
>> of a web site. I tested it and found working fine. I tried to copy a file
>> but it is giving some error pls help me out
>>
>> hadoop@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1$  bin/hadoop jar
>> hadoop-0.17.2.1-examples.jar wordcount /home/hadoop/Download\ URLs.txt
>> download-output
>> [connection-retry log and stack trace snipped]
> 
> 



Re: Not able to copy a file to HDFS after installing

Posted by Sagar Naik <sn...@attributor.com>.
where is the namenode running ? localhost or some other host

-Sagar
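The repeated "Connection refused" retries mean that nothing is accepting TCP connections on localhost:9000 at all, so confirming where (and whether) the namenode is listening comes before any Hadoop-level debugging. A plain-socket probe shows the idea (illustrative Python; port_open is a made-up helper, not part of Hadoop):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, timed out, host unreachable, ...
        return False

# With no namenode running, port_open("localhost", 9000) returns False,
# matching the ConnectException in the job output.
```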
Rajshekar wrote:
> Hello, 
> I am new to Hadoop and I jus installed on Ubuntu 8.0.4 LTS as per guidance
> of a web site. I tested it and found working fine. I tried to copy a file
> but it is giving some error pls help me out
>
> hadoop@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1$  bin/hadoop jar
> hadoop-0.17.2.1-examples.jar wordcount /home/hadoop/Download\ URLs.txt
> download-output
> [connection-retry log and stack trace snipped]

Hadoop IO performance, prefetch etc

Posted by Songting Chen <ke...@yahoo.com>.
Hi,
   Most of our map jobs are IO bound. However, on a given node, the IO throughput during the map phase is only about 20% of the node's real sequential IO capability (we measured the sequential IO throughput with iozone).
   I think the reason is that although each map issues sequential IO requests, many maps run concurrently on the same node, and switching between their streams causes quite expensive IO (seeks).
   Prefetching may be a good solution here, especially since a map job is supposed to scan through exactly one block, no more and no less. Any idea how to enable it?

Thanks,
-Songting
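The prefetch idea can be sketched at the OS level: on Linux, posix_fadvise with POSIX_FADV_WILLNEED asks the kernel to start reading a region asynchronously, so a subsequent sequential scan of that block hits the page cache instead of seeking against competing streams. A minimal illustration (plain Python, not a Hadoop API; read_block_with_readahead is a made-up helper, and whether it helps under real concurrent map load is an assumption to benchmark):

```python
import os

def read_block_with_readahead(path, offset, length, chunk=1 << 20):
    """Hint the kernel to prefetch one block-sized region, then
    stream it sequentially in chunk-sized reads."""
    fd = os.open(path, os.O_RDONLY)
    try:
        if hasattr(os, "posix_fadvise"):  # Python 3.3+, POSIX systems only
            os.posix_fadvise(fd, offset, length, os.POSIX_FADV_WILLNEED)
        os.lseek(fd, offset, os.SEEK_SET)
        remaining, parts = length, []
        while remaining > 0:
            data = os.read(fd, min(chunk, remaining))
            if not data:  # hit end of file
                break
            parts.append(data)
            remaining -= len(data)
        return b"".join(parts)
    finally:
        os.close(fd)
```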