Posted to issues@spark.apache.org by "Sonia Garudi (JIRA)" <ji...@apache.org> on 2017/03/17 09:29:42 UTC

[jira] [Created] (SPARK-20000) Spark Hive tests aborted due to lz4-java on ppc64le

Sonia Garudi created SPARK-20000:
------------------------------------

             Summary: Spark Hive tests aborted due to lz4-java on ppc64le
                 Key: SPARK-20000
                 URL: https://issues.apache.org/jira/browse/SPARK-20000
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.2.0
         Environment: Ubuntu 14.04 ppc64le 
$ java -version
openjdk version "1.8.0_111"
OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-3~14.04.1-b14)
OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
            Reporter: Sonia Garudi


The tests are getting aborted in the Spark Hive project with the following error:

{code:borderStyle=solid}
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00003fff94dbf114, pid=6160, tid=0x00003fff6efef1a0
#
# JRE version: OpenJDK Runtime Environment (8.0_111-b14) (build 1.8.0_111-8u111-b14-3~14.04.1-b14)
# Java VM: OpenJDK 64-Bit Server VM (25.111-b14 mixed mode linux-ppc64 compressed oops)
# Problematic frame:
# V  [libjvm.so+0x56f114]
{code}

In the thread log file, I found the following trace:

{code:borderStyle=solid}
Event: 3669.042 Thread 0x00003fff89976800 Exception <a 'java/lang/NoClassDefFoundError': Could not initialize class net.jpountz.lz4.LZ4JNI> (0x000000079fcda3b8) thrown at [/build/openjdk-8-fVIxxI/openjdk-8-8u111-b14/src/hotspot/src/share/vm/oops/instanceKlass.cpp, line 890]
{code}

This error is caused by lz4-java (version 1.3.0), which does not support ppc64le. Please find attached the thread log file.
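For context, the "Could not initialize class" message in the trace is the JVM's standard behavior when a class's static initializer has already failed once (for LZ4JNI, while loading its native library): the first use throws, and every later use of the same class throws NoClassDefFoundError. A minimal, self-contained sketch of that mechanism, with purely illustrative class names (this is not lz4-java's actual code):

```java
// InitFailureDemo.java -- illustrates the JVM class-initialization
// failure pattern seen in the LZ4JNI trace above.
public class InitFailureDemo {
    static class NativeBacked {
        static { fail(); }
        private static void fail() {
            // Stand-in for a native-library load that fails on an
            // unsupported platform (assumption: mirrors the mechanism
            // only, not lz4-java's real initializer).
            throw new RuntimeException("native binding unavailable on this platform");
        }
        static int touch() { return 0; }
    }

    // Try to use the class and report which error the JVM raised.
    static Class<? extends Throwable> use() {
        try {
            NativeBacked.touch();
            return null;
        } catch (Throwable t) {
            return t.getClass();
        }
    }

    public static void main(String[] args) {
        // First use: the static initializer runs and fails.
        System.out.println("first use:  " + use().getSimpleName());
        // Later uses: the JVM refuses re-initialization with
        // NoClassDefFoundError "Could not initialize class ...".
        System.out.println("second use: " + use().getSimpleName());
    }
}
```

This is why the thread log shows NoClassDefFoundError rather than the underlying native-load failure: by the time that event was logged, LZ4JNI's initializer had already failed earlier in the run.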



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
