Posted to common-dev@hadoop.apache.org by "Tsz Wo (Nicholas), SZE (JIRA)" <ji...@apache.org> on 2009/01/03 01:45:44 UTC

[jira] Commented: (HADOOP-4949) Native compilation is broken

    [ https://issues.apache.org/jira/browse/HADOOP-4949?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12660431#action_12660431 ] 

Tsz Wo (Nicholas), SZE commented on HADOOP-4949:
------------------------------------------------

- After compiling the native library, some generated files are overwritten.  For example,
{noformat}
M      src/native/configure
M      src/native/Makefile.in
M      src/native/src/org/apache/hadoop/io/compress/zlib/Makefile.in
M      src/native/lib/Makefile.in
M      src/c++/pipes/aclocal.m4
M      src/c++/pipes/configure
M      src/c++/utils/configure
?      src/c++/libhdfs/autom4te.cache
M      src/c++/libhdfs/Makefile.in
{noformat}
The changes depend on which ant command parameters are used.  Should we add these files to the ignore list?  (A cleanup sketch follows.)
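For what it's worth, a hedged way to get back to a pristine tree after a native build, assuming an svn working copy, is simply to revert the regenerated autoconf output (a sketch only, not a proposal for the build itself):
{noformat}
# Sketch only: discard the regenerated files listed in the status output above.
# Note the M entries are versioned files, so svn:ignore by itself would only
# help for the unversioned autom4te.cache directory.
svn revert src/native/configure src/native/Makefile.in \
           src/native/lib/Makefile.in \
           src/native/src/org/apache/hadoop/io/compress/zlib/Makefile.in \
           src/c++/pipes/aclocal.m4 src/c++/pipes/configure \
           src/c++/utils/configure src/c++/libhdfs/Makefile.in
{noformat}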

- The compilation depends on the order in which the targets are run (a workaround sketch follows this list).  On my machine,
-* it works if
-*# ant compile -Dcompile.c++=true -Dlibhdfs=true
-*# ant compile-native
-* but it fails if
-*# ant compile-native
-*# ant compile -Dcompile.c++=true -Dlibhdfs=true
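
If the failing ordering has already been run, a hedged recovery (assuming the build file's usual clean target) is to wipe the build output and redo the working ordering:
{noformat}
# Sketch only: start from a clean build directory, then use the ordering that works.
ant clean
ant compile -Dcompile.c++=true -Dlibhdfs=true
ant compile-native
{noformat}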

- There are several different ways to compile the native library, and it is not clear to me which one should be used (see the note after this list), e.g.
-* ant -Dcompile.c++=yes -Dcompile.native=yes -Dlibhdfs=yes compile
-* ant compile-native -Dcompile.c++=true -Dlibhdfs=true
-* ant compile -Dcompile.c++=true -Dlibhdfs=true
-* ant compile-native
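
Note also that the variants above mix yes and true for the same properties.  Purely as an illustration (property names taken from the variants above; I have not verified which spellings the build file actually tests), a single invocation that enables everything might look like:
{noformat}
# Guess only -- combines the compile and compile-native targets with all three
# properties set; adjust yes/true to whatever build.xml really checks.
ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true compile compile-native
{noformat}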



> Native compilation is broken
> ----------------------------
>
>                 Key: HADOOP-4949
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4949
>             Project: Hadoop Core
>          Issue Type: Bug
>    Affects Versions: 0.20.0
>            Reporter: Chris Douglas
>            Assignee: Chris Douglas
>            Priority: Blocker
>             Fix For: 0.20.0
>
>         Attachments: 4949-0.patch, 4949_20081230.patch, 4949_20081231.patch, HADOOP-4949.patch
>
>
> Compilation of the native libs is broken:
> {noformat}
> compile-core-native:
>     [javah] [Search path = /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/resources.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/rt.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/sunrsasign.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/jsse.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/jce.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/charsets.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/classes: \
>                            /hadoophome/build/classes]
>     [javah] [Loaded /hadoophome/build/classes/org/apache/hadoop/io/compress/zlib/ZlibCompressor.class]
>     [javah] [Loaded /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/rt.jar(java/lang/Object.class)]
>     [javah] [Forcefully writing file /hadoophome/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]
>     [javah] [Loaded /hadoophome/build/classes/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.class]
>     [javah] [Forcefully writing file /hadoophome/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]
>     [javah] [Search path = /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/resources.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/rt.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/sunrsasign.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/jsse.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/jce.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/lib/charsets.jar: \
>                            /toolshome/build/Linux_2.6_rh4_x86_64/tools/java/jdk1.6.0_i586/jre/classes: \
>                            /hadoophome/build/classes]
>     [javah] Error: Class org.apache.hadoop.io.compress.lzo.LzoCompressor could not be found.
> {noformat}

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.