Posted to common-dev@hadoop.apache.org by "Pete Wyckoff (JIRA)" <ji...@apache.org> on 2008/09/07 00:15:44 UTC

[jira] Issue Comment Edited: (HADOOP-3344) libhdfs: always builds 32bit, even when x86_64 Java used

    [ https://issues.apache.org/jira/browse/HADOOP-3344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12628916#action_12628916 ] 

wyckoff edited comment on HADOOP-3344 at 9/6/08 3:15 PM:
--------------------------------------------------------------

I'm getting this error:

relocation  against `a local symbol' can not be used when making a shared object; recompile with -fPIC

This seems to have been a problem for others on amd64 too. I don't know of a fix other than adding -fPIC to the library build, but that has unwanted side effects on non-amd64 platforms.
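One possible workaround (an untested sketch, not a committed fix; it assumes the libhdfs Makefile already receives OS_ARCH from the Ant build) would be to add -fPIC only where the linker demands it:

{noformat}
# Sketch only: the amd64 linker requires position-independent code
# for shared objects, so enable -fPIC there and leave the flags for
# other platforms untouched.
ifeq ($(OS_ARCH),amd64)
  CPPFLAGS += -fPIC
endif
{noformat}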


      was (Author: wyckoff):
    I tried using this on an amd64 machine and hit two problems. One is that I only have autoconf 2.59 - is 2.61 really needed? The other is that when running with 2.59 I get the following error:

relocation  against `a local symbol' can not be used when making a shared object; recompile with -fPIC

This seems to have been a problem for others on amd64 too. I don't know of a fix other than adding -fPIC to the library build, but that has unwanted side effects on non-amd64 platforms.

  
> libhdfs: always builds 32bit, even when x86_64 Java used
> --------------------------------------------------------
>
>                 Key: HADOOP-3344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3344
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: build, libhdfs
>         Environment: x86_64 linux, x86_64 Java installed
>            Reporter: Craig Macdonald
>         Attachments: HADOOP-3344.v0.patch, HADOOP-3344.v1.patch
>
>
> The makefile for libhdfs is hard-coded to build 32-bit libraries. It should perhaps pick the word size based on which Java is in use.
> The relevant lines are:
> {noformat}
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
> CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> {noformat}
> $(OS_ARCH) can be, for example, amd64 if you're using a 64-bit Java on the x86_64 platform. So while gcc looks in the correct directory for libjvm.so, linking fails because libhdfs itself is being built 32-bit (because of -m32):
> {noformat}
>      [exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
>      [exec] /usr/bin/ld: cannot find -ljvm
>      [exec] collect2: ld returned 1 exit status
>      [exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
> {noformat}
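> A quick way to confirm the word-size mismatch (illustrative; the path is the one from the log above, and the exact output of file varies by system):
> {noformat}
> $ file /usr/java64/latest/jre/lib/amd64/server/libjvm.so
> libjvm.so: ELF 64-bit LSB shared object, x86-64, ...
> {noformat}
> gcc invoked with -m32 emits 32-bit objects, and ld refuses to link those against a 64-bit libjvm.so, hence the "skipping incompatible" message.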
> The solution should be to specify -m32 or -m64 depending on the os.arch detected; a sketch follows the list below.
> There are 3 cases to check:
>  * 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64
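> A minimal sketch of that logic (GNU make; it assumes the Ant build passes Java's os.arch property into the Makefile as OS_ARCH, and since a 32-bit JVM reports a 32-bit os.arch even on a 64-bit OS, one test covers all three cases):
> {noformat}
> # Pick the word size from the JVM's architecture, not the OS's.
> ifeq ($(OS_ARCH),amd64)
>   M_ARCH = -m64
> else
>   M_ARCH = -m32
> endif
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared $(M_ARCH) -Wl,-x
> CPPFLAGS = $(M_ARCH) -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> {noformat}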

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.