Posted to common-user@hadoop.apache.org by Jimmy Wan <ji...@indeed.com> on 2008/03/14 18:24:57 UTC

Configuring Nodes on Windows in Distributed Environment

Has anyone succeeded in doing this? I've successfully got a cluster with a
few Linux nodes running, but I'd like to add my Windows desktop machine
(JIMMY) to the mix for some spare compute cycles. I can happily run
standalone apps with the loopback config, but I can't quite get the machine
to play nicely in a distributed configuration. The log output is nonsensical
to me: it appears that carriage returns (\r) are being inserted where they
don't belong. Any ideas?

Configuration details and output from start-all.sh follow. The lines printed
by the Windows node came out mangled on my terminal because the stray
carriage returns overwrote the start of each line; I've untangled those
lines below, with the \r shown literally where it lands in each file name
(and ... where the overwritten text is unrecoverable).

I've got the conf/logs/tmp/hadoop directories created in C:/home/hadoop.
I've also got symlinks from /home/hadoop/* to C:/home/hadoop (I wasn't sure
how well the Unix-style paths would play with Windows paths, so I was trying
to cover all the bases).
I created C:/tmp just for kicks (it's empty). A rough sketch of that setup
is below.
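
For reference, here's approximately what that setup looks like from a Cygwin
shell. This is a reconstruction of my layout, not a recipe; the drive letter
and directory names are just what I happen to use:

    # Create the working directories under C:/home/hadoop
    # (Cygwin exposes the C: drive as /cygdrive/c).
    mkdir -p /cygdrive/c/home/hadoop/{conf,logs,tmp,hadoop}

    # Symlink each entry under /home/hadoop to the same tree so the
    # Unix-style paths resolve too. Note that Cygwin symlinks are only
    # visible to programs that go through the Cygwin DLL.
    mkdir -p /home/hadoop
    for d in conf logs tmp hadoop; do
        ln -s /cygdrive/c/home/hadoop/$d /home/hadoop/$d
    done

    # An empty C:/tmp, in case something hardcodes it.
    mkdir -p /cygdrive/c/tmp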

My .bashrc contains:
export JAVA_HOME="/cygdrive/c/Dev/jdk1.6.0_04"
export HADOOP_IDENT_STRING=`hostname`
export HADOOP_INSTALL=/cygdrive/c/home/hadoop
export HADOOP_HOME=/cygdrive/c/home/hadoop/hadoop
export HADOOP_LOG_DIR=/cygdrive/c/home/hadoop/logs
export HADOOP_CONF_DIR=/cygdrive/c/home/hadoop/conf
export HADOOP_HEAPSIZE=1000
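
As a sanity check (this isn't part of the config itself), cygpath can
confirm that the Cygwin-style paths above actually land on the Windows
directories I created:

    # cygpath -w converts a POSIX path to its Windows form.
    cygpath -w "$HADOOP_HOME"      # expect C:\home\hadoop\hadoop
    cygpath -w "$HADOOP_LOG_DIR"   # expect C:\home\hadoop\logs
    ls -ld "$HADOOP_CONF_DIR"      # should resolve through the symlink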

[hadoop@mainserver ~]$ hadoop/bin/start-all.sh
starting namenode, logging to  
/home/hadoop/logs/hadoop-mainserver-namenode-mainserver.out
mainserver: starting datanode, logging to  
/home/hadoop/logs/hadoop-mainserver-datanode-mainserver.out
slavenode1: starting datanode, logging to  
/home/hadoop/logs/hadoop-slavenode1-datanode-slavenode1.out
slavenode2: starting datanode, logging to  
/home/hadoop/hadoop/bin/../logs/hadoop-hadoop-datanode-slavenode2.out
jimmy: starting datanode, logging to
/home/hadoop/logs/hadoop-JIMMY\r-datanode-JIMMY.out
jimmy: .../hadoop-daemon.sh: line 117:
/tmp/hadoop-JIMMY\r-datanode.pid: No such file or directory
jimmy: .../hadoop-daemon.sh: line 116:
/home/hadoop/logs/hadoop-JIMMY\r-datanode-JIMMY.out: No such file or directory
jimmy: head: cannot open
`/home/hadoop/logs/hadoop-JIMMY\r-datanode-JIMMY.out' for reading: No such
file or directory
mainserver: starting secondarynamenode, logging to  
/home/hadoop/logs/hadoop-mainserver-secondarynamenode-mainserver.out
starting jobtracker, logging to  
/home/hadoop/logs/hadoop-mainserver-jobtracker-mainserver.out
slavenode2: starting tasktracker, logging to  
/home/hadoop/hadoop/bin/../logs/hadoop-hadoop-tasktracker-slavenode2.out
slavenode1: starting tasktracker, logging to  
/home/hadoop/logs/hadoop-slavenode1-tasktracker-slavenode1.out
mainserver: starting tasktracker, logging to  
/home/hadoop/logs/hadoop-mainserver-tasktracker-mainserver.out
jimmy: starting tasktracker, logging to
/home/hadoop/logs/hadoop-JIMMY\r-tasktracker-JIMMY.out
jimmy: .../hadoop-daemon.sh: line 117:
/tmp/hadoop-JIMMY\r-tasktracker.pid: No such file or directory
jimmy: .../hadoop-daemon.sh: line 116:
/home/hadoop/logs/hadoop-JIMMY\r-tasktracker-JIMMY.out: No such file or
directory
jimmy: head: cannot open
`/home/hadoop/logs/hadoop-JIMMY\r-tasktracker-JIMMY.out' for reading: No
such file or directory
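
My best guess so far: the \r lands immediately after the first JIMMY in each
file name, which is the HADOOP_IDENT_STRING I set from `hostname` in my
.bashrc. If the Windows hostname.exe is the one on the node's PATH, its
output ends in CRLF, and backtick substitution strips only the trailing
newline, leaving the \r behind. A quick check, plus a workaround assuming
that really is the cause:

    # Does hostname output end in a carriage return?
    # od -c will show it as \r (octal 015) before the \n.
    hostname | od -c | head -1

    # If so, strip the CR when setting the ident string in .bashrc:
    export HADOOP_IDENT_STRING=`hostname | tr -d '\r'`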

-- 
Jimmy