Posted to user@hadoop.apache.org by Alexander Hristov <al...@planetalia.com> on 2012/09/29 16:27:26 UTC

Errors getting Hadoop 0.23.3 running under Cygwin / Windows 7

Hi

I'm trying to set up a development environment with Hadoop 0.23.3 under 
Windows 7 (x64) and Java 7, to no avail.

I've set up everything more or less the same way I did it in a Linux 
cluster.
hdfs namenode -format works properly and creates the corresponding
subdirectories and files.
To avoid the problems I have been reading about with spaces in the Java
path, I reinstalled Java and placed it under c:\Java.
Both HADOOP_HOME and JAVA_HOME are set and exported in .bashrc:

Alexander@dell-PC /hadoop/sbin
$ echo $JAVA_HOME
/cygdrive/c/Java
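
For reference, the exports in .bashrc are along these lines (JAVA_HOME is
exactly as shown above; the HADOOP_HOME and PATH values here are just
illustrative of my layout):

# .bashrc (Cygwin) -- HADOOP_HOME and PATH values below are illustrative
export JAVA_HOME=/cygdrive/c/Java
export HADOOP_HOME=/hadoop
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"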


Still, when I try to run start-dfs.sh, this is what happens:

cygpath: can't convert empty path
which: no hdfs in (./C:\cygwin\hadoop/bin)
dirname: missing operand
Try 'dirname --help' for more information.
which: no hdfs in (./C:\cygwin\hadoop/bin)
cygpath: can't convert empty path
]tarting namenodes on [localhost
cygpath: can't convert empty path
cygpath: can't convert empty path
: hostname nor servname provided, or not known
cygpath: can't convert empty path
cygpath: can't convert empty path
localhost: bash: line 0: cd: C:cygwinhadoop: No such file or directory
localhost: Error: JAVA_HOME is not set and could not be found.
]tarting secondary namenodes [0.0.0.0
cygpath: can't convert empty path
cygpath: can't convert empty path
: hostname nor servname provided, or not known
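
One thing I'm wondering about: does JAVA_HOME also have to be set explicitly
in etc/hadoop/hadoop-env.sh so that the ssh-launched daemons pick it up?
Something like the line below is what I had in mind (the path is simply my
local Java install, as above):

# etc/hadoop/hadoop-env.sh -- path below is just my local Java install
export JAVA_HOME=/cygdrive/c/Java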

Any help would be much appreciated, as I'm a bit out of ideas.

Regards