You are viewing a plain text version of this content. The canonical link for it is here.
Posted to user@hadoop.apache.org by Paula Logan <pm...@verizon.net.INVALID> on 2021/09/10 14:12:30 UTC

hadoop-hdfs-native-client Help

Hello,
I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.

(1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?


(2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue      with Native Test Case #35 (all other 39 Native Test Cases succeed).
First here is my maven command:
mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true
This is what I get for Test Case #35:
     [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec     [exec] testRecursiveJvmMutex error:     [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException     [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException     [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)     [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)     [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1     [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID     [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true     [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null     [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup       ...       [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1             ...     [exec] 98% tests passed, 1 tests failed out of 40     [exec]     [exec] Total Test time (real) = 270.30 sec     [exec]     [exec] The following tests FAILED:     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)     [exec] Errors while running CTest[INFO] ------------------------------------------------------------------------[INFO] Reactor Summary:[INFO][INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s][INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s][INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s][INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s][INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s][INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s][INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s][INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s][INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s][INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s][INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s][INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s][INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s][INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s][INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s][INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s][INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s][INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min][INFO] Apache Hadoop HttpFS ............................... SKIPPED...[INFO] ------------------------------------------------------------------------[INFO] BUILD FAILURE[INFO] ------------------------------------------------------------------------[INFO] Total time: 06:49 min[INFO] Finished at: 2021-09-02T22:34:18Z[INFO] ------------------------------------------------------------------------[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml[ERROR] -> [Help 1]

This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.
Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skippedand once the NoClassDefFoundError is fixed, these files will be generated.
The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues. I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.  
  export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64  export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"  export JRE_HOME=${JAVA_HOME}/jre  export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"  export MAVEN_HOME=/usr/share/maven  export MAVEN_OPTS="-Xms256m -Xmx1536m"  export PROTOBUF_HOME=/usr/local  export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.
I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.
I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.
    ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml
      <properties>        <native_cmake_args></native_cmake_args>        <native_ctest_args></native_ctest_args>        <native_make_args></native_make_args>
  Do I need to supply anything for any of these three properties?

(3)  Here is some more snippets of information from the maven log that I captured.
     [exec] 98% tests passed, 1 tests failed out of 40     [exec]     [exec] Total Test time (real) = 270.30 sec     [exec]     [exec] The following tests FAILED:     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)     [exec] Errors while running CTest[INFO] ------------------------------------------------------------------------[INFO] Reactor Summary:[INFO][INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s][INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s][INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s][INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s][INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s][INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s][INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s][INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s][INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s][INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s][INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s][INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s][INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s][INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s][INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s][INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s][INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s][INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min][INFO] Apache Hadoop HttpFS ............................... SKIPPED...[INFO] ------------------------------------------------------------------------[INFO] BUILD FAILURE[INFO] ------------------------------------------------------------------------[INFO] Total time: 06:49 min[INFO] Finished at: 2021-09-02T22:34:18Z[INFO] ------------------------------------------------------------------------[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml[ERROR] -> [Help 1]

(4) What are the following properties?
      require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers      bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?

Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included.Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at allnor is it installed in my RHEL 8.4 AWS instance.
I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.
Any help you can give me would be very much appreciated.
Paula

Re: hadoop-hdfs-native-client Help

Posted by Masatake Iwasaki <iw...@oss.nttdata.co.jp>.
> Was wondering if you were using an IDE when you ran the Maven commands?

I ran commands from terminal right after checking out rel/release-3.3.1 tag with no modification on source tree.
I used maven-3.4.5-5 and cmake-3.18.2-11 provided by AppStream repo.
# You need newer cmake (and gcc9 and boost) if you want to build trunk (3.4.0-SNAPSHOT) with native profile.

On 2021/09/14 2:06, Paula Logan wrote:
> Was wondering if you were using an IDE when you ran the Maven commands?
> 
> I am not using an IDE; am running on command line.
> 
> Looks like maven.test.classpath doesn't include rt.jar;  does, however, include tools.jar which seems odd to me.
> 
> 
> -----Original Message-----
> From: Paula Logan <pm...@verizon.net>
> To: iwasakims@oss.nttdata.co.jp <iw...@oss.nttdata.co.jp>
> Sent: Mon, Sep 13, 2021 9:08 am
> Subject: Re: hadoop-hdfs-native-client Help
> 
> Thank you for taking the time to do a check.  RHEL 8.x must have some incompatibilities.
> 
> Will check with Oracle Java next to see if that Test Case #35 passes or fails.
> 
> 
> -----Original Message-----
> From: Masatake Iwasaki <iw...@oss.nttdata.co.jp>
> To: Paula Logan <pm...@verizon.net>; user@hadoop.apache.org <us...@hadoop.apache.org>
> Sent: Sun, Sep 12, 2021 11:34 pm
> Subject: Re: hadoop-hdfs-native-client Help
> 
>  > (2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue
>  >      with Native Test Case #35 (all other 39 Native Test Cases succeed).
> 
> I could not reproduce the issue on CentOS 8 by java-1.8.0-openjdk-devel-1.8.0.302.b08-0.el8_4::
> 
>    $ java -version
>    openjdk version "1.8.0_302"
>    OpenJDK Runtime Environment (build 1.8.0_302-b08)
>    OpenJDK 64-Bit Server VM (build 25.302-b08, mixed mode)
> 
>    $ echo $JAVA_HOME
>    /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64
> 
>    $ mvn clean install -Pnative -DskipTests -DskipShade
>    $ cd hadoop-hdfs-project/hadoop-hdfs-native-client
>    $ mvn test -Pnative -Dtest=x
> 
> 
>  > (4) What are the following properties?
>  >
>  >      require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers
>  >      bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?
> 
> bundle.*.in.bin seems to be for Windows.
> https://issues.apache.org/jira/browse/HADOOP-12892 <https://issues.apache.org/jira/browse/HADOOP-12892>
> 
> Those bundle.* properties are used for creating distribution binary tarball of Hadoop (activated by -Pdist on build).
> It should not be related to the test failure of hadoop-hdfs-native-client.
> 
> 
> On 2021/09/10 23:12, Paula Logan wrote:
>  > Hello,
>  >
>  > I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.
>  >
>  >
>  > (1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?
>  >
>  >
>  > (2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue
>  >       with Native Test Case #35 (all other 39 Native Test Cases succeed).
>  >
>  > First here is my maven command:
>  >
>  > mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true
>  >
>  > This is what I get for Test Case #35:
>  >
>  >       [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec
>  >       [exec] testRecursiveJvmMutex error:
>  >       [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException
>  >       [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException
>  >       [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>  >       [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>  >       [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>  >       [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>  >       [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1
>  >       [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID
>  >       [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true
>  >       [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null
>  >       [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true
>  >       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false
>  >       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)
>  >       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup
>  >       ...
>  >       [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1
>  >       ...
>  >       [exec] 98% tests passed, 1 tests failed out of 40
>  >       [exec]
>  >       [exec] Total Test time (real) = 270.30 sec
>  >       [exec]
>  >       [exec] The following tests FAILED:
>  >       [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
>  >       [exec] Errors while running CTest
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] Reactor Summary:
>  > [INFO]
>  > [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
>  > [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
>  > [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
>  > [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
>  > [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
>  > [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
>  > [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
>  > [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
>  > [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
>  > [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
>  > [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
>  > [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
>  > [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
>  > [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
>  > [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
>  > [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
>  > [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
>  > [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
>  > [INFO] Apache Hadoop HttpFS ............................... SKIPPED
>  > ...
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] BUILD FAILURE
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] Total time: 06:49 min
>  > [INFO] Finished at: 2021-09-02T22:34:18Z
>  > [INFO] ------------------------------------------------------------------------
>  > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
>  > [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
>  > [ERROR] -> [Help 1]
>  >
>  >
>  > This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.
>  >
>  > Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skipped
>  > and once the NoClassDefFoundError is fixed, these files will be generated.
>  >
>  > The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues.
>  > I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.
>  >
>  >    export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64
>  >    export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"
>  >    export JRE_HOME=${JAVA_HOME}/jre
>  >    export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"
>  >    export MAVEN_HOME=/usr/share/maven
>  >    export MAVEN_OPTS="-Xms256m -Xmx1536m"
>  >    export PROTOBUF_HOME=/usr/local
>  >    export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
>  >
>  > It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.
>  >
>  > I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.
>  >
>  > I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.
>  >
>  >      ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml
>  >
>  >        <properties>
>  >          <native_cmake_args></native_cmake_args>
>  >          <native_ctest_args></native_ctest_args>
>  >          <native_make_args></native_make_args>
>  >
>  >    Do I need to supply anything for any of these three properties?
>  >
>  >
>  > (3)  Here is some more snippets of information from the maven log that I captured.
>  >
>  >       [exec] 98% tests passed, 1 tests failed out of 40
>  >       [exec]
>  >       [exec] Total Test time (real) = 270.30 sec
>  >       [exec]
>  >       [exec] The following tests FAILED:
>  >       [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
>  >       [exec] Errors while running CTest
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] Reactor Summary:
>  > [INFO]
>  > [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
>  > [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
>  > [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
>  > [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
>  > [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
>  > [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
>  > [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
>  > [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
>  > [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
>  > [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
>  > [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
>  > [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
>  > [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
>  > [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
>  > [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
>  > [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
>  > [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
>  > [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
>  > [INFO] Apache Hadoop HttpFS ............................... SKIPPED
>  > ...
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] BUILD FAILURE
>  > [INFO] ------------------------------------------------------------------------
>  > [INFO] Total time: 06:49 min
>  > [INFO] Finished at: 2021-09-02T22:34:18Z
>  > [INFO] ------------------------------------------------------------------------
>  > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
>  > [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
>  > [ERROR] -> [Help 1]
>  >
>  >
>  > (4) What are the following properties?
>  >
>  >        require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers
>  >        bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?
>  >
>  >
>  > Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included.
>  > Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at all
>  > nor is it installed in my RHEL 8.4 AWS instance.
>  >
>  > I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.
>  >
>  > Any help you can give me would be very much appreciated.
>  >
>  > Paula
> 
>  >
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org <ma...@hadoop.apache.org>
> For additional commands, e-mail: user-help@hadoop.apache.org <ma...@hadoop.apache.org>
> 
> 

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org
For additional commands, e-mail: user-help@hadoop.apache.org


Re: hadoop-hdfs-native-client Help

Posted by Masatake Iwasaki <iw...@oss.nttdata.co.jp>.
> (2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue 
>      with Native Test Case #35 (all other 39 Native Test Cases succeed).

I could not reproduce the issue on CentOS 8 by java-1.8.0-openjdk-devel-1.8.0.302.b08-0.el8_4::

   $ java -version
   openjdk version "1.8.0_302"
   OpenJDK Runtime Environment (build 1.8.0_302-b08)
   OpenJDK 64-Bit Server VM (build 25.302-b08, mixed mode)
   
   $ echo $JAVA_HOME
   /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64
   
   $ mvn clean install -Pnative -DskipTests -DskipShade
   $ cd hadoop-hdfs-project/hadoop-hdfs-native-client
   $ mvn test -Pnative -Dtest=x


> (4) What are the following properties?
> 
>       require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers
>       bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?

bundle.*.in.bin seems to be for Windows.
https://issues.apache.org/jira/browse/HADOOP-12892

Those bundle.* properties are used for creating distribution binary tarball of Hadoop (activated by -Pdist on build).
It should not be related to the test failure of hadoop-hdfs-native-client.


On 2021/09/10 23:12, Paula Logan wrote:
> Hello,
> 
> I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.
> 
> 
> (1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?
> 
> 
> (2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue
>       with Native Test Case #35 (all other 39 Native Test Cases succeed).
> 
> First here is my maven command:
> 
> mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true
> 
> This is what I get for Test Case #35:
> 
>       [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec
>       [exec] testRecursiveJvmMutex error:
>       [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException
>       [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException
>       [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>       [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>       [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>       [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>       [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1
>       [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID
>       [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true
>       [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null
>       [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true
>       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false
>       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)
>       [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup
>       ...
>       [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1
>       ...
>       [exec] 98% tests passed, 1 tests failed out of 40
>       [exec]
>       [exec] Total Test time (real) = 270.30 sec
>       [exec]
>       [exec] The following tests FAILED:
>       [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
>       [exec] Errors while running CTest
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
> [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
> [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
> [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
> [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
> [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
> [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
> [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
> [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
> [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
> [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
> [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
> [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
> [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
> [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
> [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
> [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
> [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
> [INFO] Apache Hadoop HttpFS ............................... SKIPPED
> ...
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 06:49 min
> [INFO] Finished at: 2021-09-02T22:34:18Z
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
> [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
> [ERROR] -> [Help 1]
> 
> 
> This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.
> 
> Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skipped
> and once the NoClassDefFoundError is fixed, these files will be generated.
> 
> The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues.
> I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.
> 
>    export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64
>    export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"
>    export JRE_HOME=${JAVA_HOME}/jre
>    export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"
>    export MAVEN_HOME=/usr/share/maven
>    export MAVEN_OPTS="-Xms256m -Xmx1536m"
>    export PROTOBUF_HOME=/usr/local
>    export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
> 
> It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.
> 
> I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.
> 
> I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.
> 
>      ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml
> 
>        <properties>
>          <native_cmake_args></native_cmake_args>
>          <native_ctest_args></native_ctest_args>
>          <native_make_args></native_make_args>
> 
>    Do I need to supply anything for any of these three properties?
> 
> 
> (3)  Here is some more snippets of information from the maven log that I captured.
> 
>       [exec] 98% tests passed, 1 tests failed out of 40
>       [exec]
>       [exec] Total Test time (real) = 270.30 sec
>       [exec]
>       [exec] The following tests FAILED:
>       [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
>       [exec] Errors while running CTest
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
> [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
> [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
> [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
> [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
> [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
> [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
> [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
> [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
> [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
> [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
> [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
> [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
> [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
> [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
> [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
> [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
> [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
> [INFO] Apache Hadoop HttpFS ............................... SKIPPED
> ...
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 06:49 min
> [INFO] Finished at: 2021-09-02T22:34:18Z
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
> [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
> [ERROR] -> [Help 1]
> 
> 
> (4) What are the following properties?
> 
>        require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers
>        bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?
> 
> 
> Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included.
> Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at all
> nor is it installed in my RHEL 8.4 AWS instance.
> 
> I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.
> 
> Any help you can give me would be very much appreciated.
> 
> Paula
> 

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org
For additional commands, e-mail: user-help@hadoop.apache.org


Re: hadoop-hdfs-native-client Help

Posted by Paula Logan <pm...@verizon.net.INVALID>.
Jonathan,
My company has their own server suite.  We will not be using Amazon for the final solution.  I am only using AWS EC2 instance to compile, test, and package Hadoop 3.3.1using RHEL 8.4 while our SAs load RHEL 8.4 onto the servers.  Mostly concerned with the *Native* functions, modules, and methods.  I removed the -Dtest=allNative tosee which other test cases fail.  Several test cases fail with errors that I would have thought would have been fixed before releasing.  I assume at this point that these maybe specific to RHEL 8.4 and wouldn't have been caught with Ubuntu and CentOS 7.  Not sure.  I did sent e-mail to hdfs-dev@hadoop.apache.org.  Waiting to see whatthe developers recommend.
Appreciate your feedback.
Paula

-----Original Message-----
From: Jonathan Aquilina <ja...@eagleeyet.net.INVALID>
To: Paula Logan <pm...@verizon.net>; user@hadoop.apache.org <us...@hadoop.apache.org>
Sent: Fri, Sep 10, 2021 5:36 pm
Subject: RE: hadoop-hdfs-native-client Help

<!--#yiv8393330214 _filtered {} _filtered {}#yiv8393330214 #yiv8393330214 p.yiv8393330214MsoNormal, #yiv8393330214 li.yiv8393330214MsoNormal, #yiv8393330214 div.yiv8393330214MsoNormal {margin:0cm;font-size:11.0pt;font-family:"Calibri", sans-serif;}#yiv8393330214 span.yiv8393330214EmailStyle18 {font-family:"Calibri", sans-serif;color:windowtext;}#yiv8393330214 .yiv8393330214MsoChpDefault {font-family:"Calibri", sans-serif;} _filtered {}#yiv8393330214 div.yiv8393330214WordSection1 {}-->Hi Paula,    I am not sure how to answer your questions but is there a reason why you are using an EC2 instance instead of amazonz EMR (elastic Map reduce) Hadoop cluster. As far as I know you can set that up to work with HDFS setup as well as S3 buckets if you don’t need a long term cluster to stay online.    Regards, Jonathan    From: Paula Logan <pm...@verizon.net.INVALID>
Sent: 10 September 2021 16:13
To: user@hadoop.apache.org
Subject: hadoop-hdfs-native-client Help    Hello,    I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.       (1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?       (2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue       with Native Test Case #35 (all other 39 Native Test Cases succeed).    First here is my maven command:    mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true    This is what I get for Test Case #35:         [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec      [exec] testRecursiveJvmMutex error:      [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException      [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException      [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)      [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)      [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)      [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)      [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1      [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID      [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true      [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null      [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true      [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false      [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)      [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup        ...        [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1              ...      [exec] 98% tests passed, 1 tests failed out of 40      [exec]      [exec] Total Test time (real) = 270.30 sec      [exec]      [exec] The following tests FAILED:      [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)      [exec] Errors while running CTest [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s] [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s] [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s] [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s] [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s] [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s] [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s] [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s] [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s] [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s] [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s] [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s] [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s] [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s] [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s] [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s] [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s] [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min] [INFO] Apache Hadoop HttpFS ............................... SKIPPED ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 06:49 min [INFO] Finished at: 2021-09-02T22:34:18Z [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8 [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml [ERROR] -> [Help 1]       This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.    Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skipped and once the NoClassDefFoundError is fixed, these files will be generated.    The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues.   I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.        export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64   export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"   export JRE_HOME=${JAVA_HOME}/jre   export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"   export MAVEN_HOME=/usr/share/maven   export MAVEN_OPTS="-Xms256m -Xmx1536m"   export PROTOBUF_HOME=/usr/local   export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin    It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.    I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.    I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.        ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml          <properties>         <native_cmake_args></native_cmake_args>         <native_ctest_args></native_ctest_args>         <native_make_args></native_make_args>      Do I need to supply anything for any of these three properties?       (3)  Here is some more snippets of information from the maven log that I captured.         [exec] 98% tests passed, 1 tests failed out of 40      [exec]      [exec] Total Test time (real) = 270.30 sec      [exec]      [exec] The following tests FAILED:      [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)      [exec] Errors while running CTest [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s] [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s] [INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s] [INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s] [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s] [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s] [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s] [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s] [INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s] [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s] [INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s] [INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s] [INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s] [INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s] [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s] [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s] [INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s] [INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min] [INFO] Apache Hadoop HttpFS ............................... SKIPPED ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 06:49 min [INFO] Finished at: 2021-09-02T22:34:18Z [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8 [ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml [ERROR] -> [Help 1]       (4) What are the following properties?          require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers       bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?       Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included. Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at all nor is it installed in my RHEL 8.4 AWS instance.    I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.    Any help you can give me would be very much appreciated.    Paula    

RE: hadoop-hdfs-native-client Help

Posted by Jonathan Aquilina <ja...@eagleeyet.net.INVALID>.
Hi Paula,

I am not sure how to answer your questions but is there a reason why you are using an EC2 instance instead of amazonz EMR (elastic Map reduce) Hadoop cluster. As far as I know you can set that up to work with HDFS setup as well as S3 buckets if you don’t need a long term cluster to stay online.

Regards,
Jonathan

From: Paula Logan <pm...@verizon.net.INVALID>
Sent: 10 September 2021 16:13
To: user@hadoop.apache.org
Subject: hadoop-hdfs-native-client Help

Hello,

I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.


(1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?


(2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue
     with Native Test Case #35 (all other 39 Native Test Cases succeed).

First here is my maven command:

mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true

This is what I get for Test Case #35:

     [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec
     [exec] testRecursiveJvmMutex error:
     [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException
     [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException
     [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
     [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
     [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1
     [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID
     [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true
     [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null
     [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true
     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false
     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)
     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup
     ...
       [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1
     ...
     [exec] 98% tests passed, 1 tests failed out of 40
     [exec]
     [exec] Total Test time (real) = 270.30 sec
     [exec]
     [exec] The following tests FAILED:
     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
     [exec] Errors while running CTest
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:49 min
[INFO] Finished at: 2021-09-02T22:34:18Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
[ERROR] -> [Help 1]


This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.

Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skipped
and once the NoClassDefFoundError is fixed, these files will be generated.

The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues.

I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.

  export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64
  export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"
  export JRE_HOME=${JAVA_HOME}/jre
  export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"
  export MAVEN_HOME=/usr/share/maven
  export MAVEN_OPTS="-Xms256m -Xmx1536m"
  export PROTOBUF_HOME=/usr/local
  export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.

I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.

I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.

    ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml

      <properties>
        <native_cmake_args></native_cmake_args>
        <native_ctest_args></native_ctest_args>
        <native_make_args></native_make_args>

  Do I need to supply anything for any of these three properties?


(3)  Here is some more snippets of information from the maven log that I captured.

     [exec] 98% tests passed, 1 tests failed out of 40
     [exec]
     [exec] Total Test time (real) = 270.30 sec
     [exec]
     [exec] The following tests FAILED:
     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)
     [exec] Errors while running CTest
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s]
[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:49 min
[INFO] Finished at: 2021-09-02T22:34:18Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml
[ERROR] -> [Help 1]


(4) What are the following properties?

      require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers
      bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?


Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included.
Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at all
nor is it installed in my RHEL 8.4 AWS instance.

I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.

Any help you can give me would be very much appreciated.

Paula


Fwd: hadoop-hdfs-native-client Help

Posted by Paula Logan <pm...@verizon.net.INVALID>.
I think this e-mail should have been sent here instead of user@hadoop.apache.org.
Any suggestions, recommendations, and changes to get the failed test case to work would be greatly appreciated.


-----Original Message-----
From: Paula Logan <pm...@verizon.net>
To: user@hadoop.apache.org <us...@hadoop.apache.org>
Sent: Fri, Sep 10, 2021 10:12 am
Subject: hadoop-hdfs-native-client Help

Hello,
I am new to building Hadoop locally, and am having some issues.  Please let me know if this information should be sent to a different distro.

(1) Can Hadoop 3.3.1 be compiled and run with OpenJDK 11 or is OpenJDK 1.8 needed for compile while 1.8 or 11 can be used to run hadoop?


(2) I am compiling and testing Hadoop 3.3.1 on RHEL 8.4 on the command line not via any IDE inside an AWS instance.  I have encountered an issue      with Native Test Case #35 (all other 39 Native Test Cases succeed).
First here is my maven command:
mvn -e -X test -Pnative,parallel-tests,shelltest,yarn-ui -Dtest=allNative -Dparallel-tests=true -Drequire.bzip2=true -Drequire.fuse=true -Drequire.isal=true -Disal.prefix=/usr/local -Disal.lib=/usr/local/lib64 -Dbundle.isal=true -Drequire.openssl=true -Dopenssl.prefix=/usr -Dopenssl.include=/usr/include -Dopenssl.lib=/usr/lib64 -Dbundle.openssl=true -Dbundle.openssl.in.bin=true -Drequire.pmdk=true -Dpmdk.lib=/usr/lib64 -Dbundle.pmdk=true -Drequire.snappy=true -Dsnappy.prefix=/usr -Dsnappy.include=/usr/include -Dsnappy.lib=/usr/lib64 -Dbundle.snappy=true -Drequire.valgrind=true -Dhbase.profile=2.0 -Drequire.zstd=true -Dzstd.prefix=/usr -Dzstd.include=/usr/include -Dzstd.lib=/usr/lib64 -Dbundle.zstd=true -Dbundle.zstd.in.bin=true -Drequire.test.libhadoop=true
This is what I get for Test Case #35:
     [exec] 35/40 Test #35: test_libhdfs_threaded_hdfspp_test_shim_static ..............***Failed   31.58 sec     [exec] testRecursiveJvmMutex error:     [exec] ClassNotFoundException: RuntimeExceptionjava.lang.NoClassDefFoundError: RuntimeException     [exec] Caused by: java.lang.ClassNotFoundException: RuntimeException     [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)     [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)     [exec] 2021-09-02 22:31:09,706 INFO  hdfs.MiniDFSCluster (MiniDFSCluster.java:<init>(529)) - starting cluster: numNameNodes=1, numDataNodes=1     [exec] 2021-09-02 22:31:10,134 INFO  namenode.NameNode (NameNode.java:format(1249)) - Formatting using clusterid: testClusterID     [exec] 2021-09-02 22:31:10,156 INFO  namenode.FSEditLog (FSEditLog.java:newInstance(229)) - Edit logging is async:true     [exec] 2021-09-02 22:31:10,182 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(814)) - KeyProvider: null     [exec] 2021-09-02 22:31:10,184 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(141)) - fsLock is fair: true     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystemLock.java:<init>(159)) - Detailed lock hold time metrics enabled: false     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(847)) - fsOwner                = ec2-user (auth:SIMPLE)     [exec] 2021-09-02 22:31:10,185 INFO  namenode.FSNamesystem (FSNamesystem.java:<init>(848)) - supergroup       ...       [exec] 2021-09-02 22:31:13,204 INFO  ipc.Server (Server.java:logException(3020)) - IPC Server handler 7 on default port 44945, call Call#6 Retry#-1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:37362: java.io.FileNotFoundException: File does not exist: /tlhData0001/file1             ...     [exec] 98% tests passed, 1 tests failed out of 40     [exec]     [exec] Total Test time (real) = 270.30 sec     [exec]     [exec] The following tests FAILED:     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)     [exec] Errors while running CTest[INFO] ------------------------------------------------------------------------[INFO] Reactor Summary:[INFO][INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s][INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s][INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s][INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s][INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s][INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s][INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s][INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s][INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s][INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s][INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s][INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s][INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s][INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s][INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s][INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s][INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s][INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min][INFO] Apache Hadoop HttpFS ............................... SKIPPED...[INFO] ------------------------------------------------------------------------[INFO] BUILD FAILURE[INFO] ------------------------------------------------------------------------[INFO] Total time: 06:49 min[INFO] Finished at: 2021-09-02T22:34:18Z[INFO] ------------------------------------------------------------------------[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml[ERROR] -> [Help 1]

This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail.
Also, see a lot of "File not found" messages.  Assume at this point that the NoClassDefFoundError causes the code that creates the files to be skippedand once the NoClassDefFoundError is fixed, these files will be generated.
The compile of Hadoop 3.3.1 on RHEL 8.4 succeeded without issues. I have JAVA_HOME and JRE_HOME set in .bashrc to OpenJDK 1.8 and have added these into the $PATH.  
  export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64  export JAVA_OPTS="-Xms2048m -Xmx4096m -XX:+UseZGC"  export JRE_HOME=${JAVA_HOME}/jre  export LIBHDFS_OPTS="-Xms2048m -Xmx4096m"  export MAVEN_HOME=/usr/share/maven  export MAVEN_OPTS="-Xms256m -Xmx1536m"  export PROTOBUF_HOME=/usr/local  export PATH=/home/ec2-user/.local/bin:/home/ec2-user/bin:${JAVA_HOME}/bin:${JRE_HOME}/bin:${MAVEN_HOME}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
It appears the $JRE_HOME/lib/rt.jar isn't being included in the maven.test.classpath or in the native module tests.
I tried setting CLASSPATH and JAVA_LIBRARY_PATH in .bashrc and tried passing via 'mvn' command, but still no success.
I followed the procedures in the BUILDING.txt file for the CentOS 8 as that was the closest to RHEL 8.4.
    ${HADOOP_SRC_HOME}/hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml
      <properties>        <native_cmake_args></native_cmake_args>        <native_ctest_args></native_ctest_args>        <native_make_args></native_make_args>
  Do I need to supply anything for any of these three properties?

(3)  Here is some more snippets of information from the maven log that I captured.
     [exec] 98% tests passed, 1 tests failed out of 40     [exec]     [exec] Total Test time (real) = 270.30 sec     [exec]     [exec] The following tests FAILED:     [exec]      35 - test_libhdfs_threaded_hdfspp_test_shim_static (Failed)     [exec] Errors while running CTest[INFO] ------------------------------------------------------------------------[INFO] Reactor Summary:[INFO][INFO] Apache Hadoop Main 3.3.1 ........................... SUCCESS [  0.707 s][INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.743 s][INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.692 s][INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.955 s][INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  0.106 s][INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.101 s][INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.194 s][INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  0.806 s][INFO] Apache Hadoop Auth ................................. SUCCESS [  4.192 s][INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.452 s][INFO] Apache Hadoop Common ............................... SUCCESS [ 54.493 s][INFO] Apache Hadoop NFS .................................. SUCCESS [  2.123 s][INFO] Apache Hadoop KMS .................................. SUCCESS [  2.087 s][INFO] Apache Hadoop Registry ............................. SUCCESS [  2.538 s][INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.055 s][INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 16.283 s][INFO] Apache Hadoop HDFS ................................. SUCCESS [ 24.263 s][INFO] Apache Hadoop HDFS Native Client ................... FAILURE [04:49 min][INFO] Apache Hadoop HttpFS ............................... SKIPPED...[INFO] ------------------------------------------------------------------------[INFO] BUILD FAILURE[INFO] ------------------------------------------------------------------------[INFO] Total time: 06:49 min[INFO] Finished at: 2021-09-02T22:34:18Z[INFO] ------------------------------------------------------------------------[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (native_tests) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8[ERROR] around Ant part ...<exec failonerror="true" dir="/home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/" executable="ctest">... @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml[ERROR] -> [Help 1]

(4) What are the following properties?
      require.test.libhadoop - not sure of the purpose of this property or if I need it or is it just for hadoop project developers      bundle.<type> -vs-  bundle.<type>.in.bin  - What is the difference of bundle vs bundle.in.bin?

Please let me know what I might be missing or if a (or some) native files need to be modified to have the rt.jar (RuntimeException class contained within) be included.Wasn't sure if this was an OpenJDK 1.8 vs OpenJDK 11 issue as the JRE binary is located in a different directory in OpenJDK 11.  I am not using OpenJDK 11 at allnor is it installed in my RHEL 8.4 AWS instance.
I didn't submit a ticket as I assume there is something that I am not doing correctly or forgetting to include/do.
Any help you can give me would be very much appreciated.
Paula