Posted to common-user@hadoop.apache.org by Rajgopal Vaithiyanathan <ra...@gmail.com> on 2010/12/28 13:33:28 UTC
Topology : Script Based Mapping
I wrote a script to map the IPs to racks. The script is as follows:

#!/bin/bash
# Join the first three octets of each IP with dashes to form the rack name,
# e.g. 172.21.1.2 -> /rack-172-21-1
for i in $* ; do
topo=`echo $i | cut -d"." -f1,2,3 | sed 's/\./-/g'`
topo=/rack-$topo" "
final=$final$topo
done
echo $final
I also did ` chmod +x topology_script.sh`
I tried it on some sample data:
[joa@localhost bin]$ ./topology_script.sh 172.21.1.2 172.21.3.4
/rack-172-21-1 /rack-172-21-3
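As an aside, a topology script is expected to emit exactly one rack entry per argument, and a common convention is to fall back to /default-rack for anything it cannot parse. A sketch of that variant (function name is my own, for illustration) might look like:

```shell
#!/bin/bash
# Sketch: resolve each argument to a rack, one entry per argument,
# falling back to /default-rack for arguments that are not dotted quads.
resolve_racks() {
  out=""
  for i in "$@"; do
    case "$i" in
      # dotted quad: rack name from the first three octets
      *.*.*.*) rack="/rack-$(echo "$i" | cut -d. -f1,2,3 | sed 's/\./-/g')" ;;
      # anything else (e.g. a hostname): default rack
      *)       rack="/default-rack" ;;
    esac
    out="$out$rack "
  done
  # trim the trailing space before printing
  echo "${out% }"
}

resolve_racks 172.21.1.2 host1
# prints: /rack-172-21-1 /default-rack
```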
I also made the change in core-site.xml as follows.
<property>
<name>topology.script.file.name</name>
<value>$HOME/sw/hadoop-0.20.2/bin/topology_script.sh</value>
</property>
But while starting the cluster, the namenode log shows the error listed
below, and every IP gets mapped to /default-rack.
Kindly help. :)
Thanks in advance.
2010-12-28 17:30:50,549 WARN org.apache.hadoop.net.ScriptBasedMapping:
java.io.IOException: Cannot run program "$HOME/sw/hadoop-0.20.2/bin/topology_script.sh" (in directory "/home/joa/sw/hadoop-0.20.2"): java.io.IOException: error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:474)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
at org.apache.hadoop.util.Shell.run(Shell.java:134)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:148)
at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:94)
at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:59)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.resolveNetworkLocation(FSNamesystem.java:2158)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:2129)
at org.apache.hadoop.hdfs.server.namenode.NameNode.register(NameNode.java:687)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.<init>(UNIXProcess.java:164)
at java.lang.ProcessImpl.start(ProcessImpl.java:81)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:467)
... 19 more
--
Thanks and Regards,
Rajgopal Vaithiyanathan.
Re: Topology : Script Based Mapping
Posted by Edward Capriolo <ed...@gmail.com>.
On Tue, Dec 28, 2010 at 11:36 PM, Hemanth Yamijala <yh...@gmail.com> wrote:
> Hi,
>
> On Tue, Dec 28, 2010 at 6:03 PM, Rajgopal Vaithiyanathan
> <ra...@gmail.com> wrote:
>> I also made the change in core-site.xml as follows.
>>
>> <property>
>> <name>topology.script.file.name</name>
>> <value>$HOME/sw/hadoop-0.20.2/bin/topology_script.sh</value>
>> </property>
>>
>
> I am not sure if $HOME gets expanded automatically. Can you try it as
> ${HOME}, or in the worst case specify the expanded path.
>
> Thanks
> Hemanth
Variables like $HOME are not expanded from the shell or the environment;
only other Hadoop configuration variables are substituted. Use a full path.
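For reference, a corrected property using a literal absolute path (the home directory shown in the error log above; adjust to your installation) would be:

```xml
<property>
  <name>topology.script.file.name</name>
  <!-- literal absolute path: Hadoop substitutes only configuration
       variables here, not shell variables like $HOME -->
  <value>/home/joa/sw/hadoop-0.20.2/bin/topology_script.sh</value>
</property>
```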
Re: Topology : Script Based Mapping
Posted by Hemanth Yamijala <yh...@gmail.com>.
Hi,
On Tue, Dec 28, 2010 at 6:03 PM, Rajgopal Vaithiyanathan
<ra...@gmail.com> wrote:
> I also made the change in core-site.xml as follows.
>
> <property>
> <name>topology.script.file.name</name>
> <value>$HOME/sw/hadoop-0.20.2/bin/topology_script.sh</value>
> </property>
>
I am not sure if $HOME gets expanded automatically. Can you try it as
${HOME}, or in the worst case specify the expanded path.
Thanks
Hemanth