Posted to common-user@hadoop.apache.org by Vivek K <ha...@gmail.com> on 2011/09/27 00:19:22 UTC

libhdfs and libjvm.so on distributed cache

Hi all,

I have written a hadoop pipes program that uses libhdfs to read files from
HDFS. The program runs fine in the pseudo-distributed setup on the cloudera
virtual machine. But when I tried to test it on a cluster, it failed.
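For reference, the libhdfs part of the program follows the usual
hdfsConnect / hdfsOpenFile / hdfsRead pattern. A minimal sketch (the path and
buffer size below are just placeholders, not my actual values) looks roughly
like this:

    #include <fcntl.h>   /* O_RDONLY */
    #include <stdio.h>
    #include "hdfs.h"    /* libhdfs C API, usable from the C++ pipes program */

    int main(void) {
        /* connect to the namenode configured in core-site.xml */
        hdfsFS fs = hdfsConnect("default", 0);
        if (fs == NULL) {
            fprintf(stderr, "hdfsConnect failed\n");
            return 1;
        }

        /* placeholder path for the side file read from HDFS */
        hdfsFile in = hdfsOpenFile(fs, "/user/vivek/side-data.txt",
                                   O_RDONLY, 0, 0, 0);
        if (in == NULL) {
            fprintf(stderr, "hdfsOpenFile failed\n");
            hdfsDisconnect(fs);
            return 1;
        }

        char buf[4096];
        tSize n = hdfsRead(fs, in, buf, sizeof(buf));
        fprintf(stderr, "read %d bytes\n", (int) n);

        hdfsCloseFile(fs, in);
        hdfsDisconnect(fs);
        return 0;
    }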

It turns out the cluster machines didn't have libhdfs installed. For the time
being (instead of installing it on all the workers), I am shipping it through
the distributed cache via the -files option of hadoop pipes (with a soft link
named libhdfs.so.0). The problem I am having is that the cluster nodes have
AMD processors with 64-bit Java installed, so loading libjvm.so on the
workers gives the following error:
"error while loading shared libraries: libjvm.so: wrong ELF class:
ELFCLASS64".
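
For reference, I launch the job roughly like this (the paths are placeholders;
the #libhdfs.so.0 fragment is how I request the soft link in the task's
working directory):

    hadoop pipes \
        -files /path/to/libhdfs.so.0.0.0#libhdfs.so.0,/path/to/libjvm.so \
        -input /user/vivek/input \
        -output /user/vivek/output \
        -program hdfs:///user/vivek/bin/my_pipes_program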

I also tried shipping the libjvm.so from my own machine (which is 32-bit).
That avoids the "wrong ELF class" error, but for some reason the library then
fails to pick up the CLASSPATH variable, and hdfsConnect fails.
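
(My understanding is that libhdfs brings up a JVM through libjvm and picks up
the Hadoop jars from the CLASSPATH environment variable of the task process,
i.e. the environment would need something along these lines; this is an
assumption about my setup, not something I have verified on the workers:

    # assumption: every Hadoop jar listed explicitly, since libhdfs does
    # not expand wildcard classpath entries
    CLASSPATH=$HADOOP_HOME/conf
    for jar in $HADOOP_HOME/*.jar $HADOOP_HOME/lib/*.jar; do
        CLASSPATH=$CLASSPATH:$jar
    done
    export CLASSPATH

On the worker nodes that does not seem to be happening.)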

Any suggestions on how I can fix this?

Thanks.

Best regards
Vivek
--