Posted to general@hadoop.apache.org by Deepak Halale <de...@gmail.com> on 2009/09/03 04:53:09 UTC
hadoop configuration help
Hi,
I am trying to configure a Hadoop distributed environment.

I downloaded the Hadoop packages from Cloudera and installed the RPM on MACHINE2 (Fedora Linux) and the Debian packages on MACHINE1 (Ubuntu Linux).

I edited the etc/hadoop/conf.pseudo/masters file on MACHINE1, adding the MACHINE2 name to the masters file.

I added the MACHINE1 name to the slaves file of MACHINE2 under etc/hadoop/conf.my_cluster/slaves (conf.my_cluster is the active alternative on MACHINE2).

I was able to SSH from MACHINE2 to MACHINE1.
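For reference, a minimal sketch of the two topology files described above, assuming MACHINE2 runs the master daemons (NameNode/JobTracker) and MACHINE1 runs the worker daemons; the hostnames here are placeholders for the machines' real DNS names:

```shell
# On MACHINE1: etc/hadoop/conf.pseudo/masters
# (one master hostname per line)
MACHINE2

# On MACHINE2: etc/hadoop/conf.my_cluster/slaves
# (one worker hostname per line)
MACHINE1
```

Whatever names go in these files must resolve identically on both machines (via DNS or /etc/hosts), since the daemons address each other by them.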
Am I missing something? I am getting the following error while running a MapReduce job from MACHINE2:
09/09/02 22:42:15 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /var/lib/hadoop/cache/hadoop/mapred/system/job_200909021816_0003/job.jar could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1123)
        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:330)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:481)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:890)
        at org.apache.hadoop.ipc.Client.call(Client.java:716)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2450)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2333)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1800(DFSClient.java:1745)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1922)
Thanks
Deepak
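[Editor's note: "could only be replicated to 0 nodes" generally means the NameNode has no live DataNodes registered when the client tries to write the block. A hedged sketch of a first diagnostic pass, using the Hadoop 0.18/0.20-era CLI that matches this stack trace (the log path is an assumption based on typical Cloudera packaging defaults):]

```shell
# Ask the NameNode how many DataNodes it can currently see.
# If the report shows "Datanodes available: 0", the write above cannot succeed.
hadoop dfsadmin -report

# Confirm a DataNode JVM is actually running on the worker machine:
jps

# Look at the DataNode log for registration/connection errors
# (path assumed from Cloudera package defaults; adjust to your install):
tail -n 100 /var/log/hadoop/*datanode*.log
```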