Posted to common-dev@hadoop.apache.org by "Giridharan Kesavan (JIRA)" <ji...@apache.org> on 2008/10/31 04:42:44 UTC

[jira] Issue Comment Edited: (HADOOP-3344) libhdfs: always builds 32bit, even when x86_64 Java used

    [ https://issues.apache.org/jira/browse/HADOOP-3344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12644201#action_12644201 ] 

gkesavan edited comment on HADOOP-3344 at 10/30/08 8:41 PM:
----------------------------------------------------------------------

Here is the v2 version of the patch, which also *requires autoconf-2.61*. It is an improved version of the v1 patch submitted by Craig. Many thanks to Craig.

I've used a small piece of Java code along with the m4 macros to detect the JVM architecture:
{noformat}
class getArch {
  // Prints the JVM data model ("32" or "64"); falls back to "32" if the
  // sun.arch.data.model property is not set.
  public static void main(String[] args) {
    System.out.println(System.getProperty("sun.arch.data.model", "32"));
  }
}
{noformat}
If somebody can suggest a better way, I would be more than happy to implement it.
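
For illustration, the detector can be exercised by hand; on a 64-bit JVM the run would look like this (just a sanity check, not part of the build):
{noformat}
$ javac getArch.java
$ java getArch
64
{noformat}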

This patch addresses all three scenarios (a sketch of the corresponding configure logic follows the list):
    * 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
    * 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
    * 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64
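
A minimal sketch of the configure-side logic, assuming the macro and variable names below (they are illustrative, not the actual names used in the patch):
{noformat}
dnl Hypothetical autoconf macro: compile and run the getArch helper,
dnl then pick -m32/-m64 from the data model the JVM itself reports.
AC_DEFUN([AP_JVM_DATA_MODEL],
[AC_MSG_CHECKING([the JVM data model])
 "$JAVA_HOME/bin/javac" getArch.java
 JVM_DATA_MODEL=`"$JAVA_HOME/bin/java" getArch`
 AC_MSG_RESULT([$JVM_DATA_MODEL-bit])
 case "$JVM_DATA_MODEL" in
   64) CFLAGS="$CFLAGS -m64"; LDFLAGS="$LDFLAGS -m64" ;;
   *)  CFLAGS="$CFLAGS -m32"; LDFLAGS="$LDFLAGS -m32" ;;
 esac])
{noformat}
Whatever the exact macro looks like, the point is that the JVM's own report of its data model, not the host OS, drives the flag choice, which is what makes the 64-bit-OS/32-bit-JVM case come out right.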

{quote}
To *build* libhdfs.so, use: *ant compile-c++-libhdfs -Dcompile.c++=true*
To *test* libhdfs.so, use: *ant test-c++-libhdfs -Dcompile.c++=true*
{quote}
I have tested this patch on amd64 with both 32-bit and 64-bit JVMs.
Please help by testing on other platforms as necessary, and let me know your comments.

Thanks





> libhdfs: always builds 32bit, even when x86_64 Java used
> --------------------------------------------------------
>
>                 Key: HADOOP-3344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3344
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: build, libhdfs
>         Environment: x86_64 linux, x86_64 Java installed
>            Reporter: Craig Macdonald
>            Assignee: Giridharan Kesavan
>         Attachments: HADOOP-3344.v0.patch, HADOOP-3344.v1.patch
>
>
> The makefile for libhdfs is hard-coded to compile 32-bit libraries. It should instead compile depending on which Java is in use.
> The relevant lines are:
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
> CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> $OS_ARCH can be (e.g.) amd64 if you're using a 64-bit Java on the x86_64 platform. So while gcc will try to link against the correct libjvm.so, it will fail because libhdfs is built 32-bit (because of -m32):
> {noformat}
>      [exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
>      [exec] /usr/bin/ld: cannot find -ljvm
>      [exec] collect2: ld returned 1 exit status
>      [exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
> {noformat}
> The solution should be to specify -m32 or -m64 depending on the os.arch detected.
> There are 3 cases to check:
>  * 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.