Posted to issues@spark.apache.org by "hoelog (JIRA)" <ji...@apache.org> on 2014/11/20 15:02:33 UTC

[jira] [Comment Edited] (SPARK-4354) 14/11/12 09:39:00 WARN TaskSetManager: Lost task 5.0 in stage 0.0 (TID 5, HYD-RNDNW-VFRCO-RCORE2): java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy

    [ https://issues.apache.org/jira/browse/SPARK-4354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14219370#comment-14219370 ] 

hoelog edited comment on SPARK-4354 at 11/20/14 2:02 PM:
---------------------------------------------------------

In my case, I compiled Spark 1.1 on a system that has GLIBCXX_3.4.9 and then ran Spark on the cluster. Some executors failed while others succeeded. I think the problem is that the servers hosting the failed executors do not have GLIBCXX_3.4.9. Of course it should fail, because the assembled Spark jar contains a snappy native library compiled against GLIBCXX_3.4.9!

I checked the pre-built Spark for Hadoop 2.4.0, and its bundled snappy library also requires GLIBCXX_3.4.9 or higher.

{panel}
$ ldd native/Linux/amd64/libsnappyjava.so
ldd: warning: you do not have execution permission for `native/Linux/amd64/libsnappyjava.so'
        linux-vdso.so.1 =>  (0x00007fff2bfff000)
        libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00007f3c2c896000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f3c2c612000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f3c2c27e000)
        /lib64/ld-linux-x86-64.so.2 (0x00000034c4a00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f3c2c068000)

$ strings Linux/amd64/libsnappyjava.so | grep GLIB
GLIBC_2.2.5
GLIBCXX_3.4.9
GLIBCXX_3.4.11
GLIBCXX_3.4
{panel}
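The panel above shows which GLIBCXX versions the bundled libsnappyjava.so requires. To see which versions a given worker's libstdc++ actually provides, a minimal sketch (the candidate paths are assumptions covering common distros; add your own if the file lives elsewhere):

```shell
# List the GLIBCXX symbol versions provided by the system libstdc++,
# to compare against what libsnappyjava.so requires.
# NOTE: the paths below are assumptions for RHEL-like, Debian-like,
# and other common layouts; adjust for your distro.
for f in /usr/lib64/libstdc++.so.6 \
         /usr/lib/x86_64-linux-gnu/libstdc++.so.6 \
         /usr/lib/libstdc++.so.6; do
    if [ -e "$f" ]; then
        echo "$f:"
        # grep -a treats the binary as text; -o prints only the matches
        grep -ao 'GLIBCXX_[0-9.]*' "$f" | sort -u
    fi
done
```

On the failing workers described above, the expected symptom would be that GLIBCXX_3.4.9 (and anything newer) is missing from this list.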

So you need to check which GLIBCXX versions the libstdc++ on each of your worker machines provides.
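If upgrading libstdc++ on the failing workers is not practical, another route (my suggestion, not confirmed in this thread; double-check the option name against the Spark 1.1 configuration docs) is to switch the block compression codec to the pure-Java LZF implementation so the snappy native library is never loaded:

```shell
# Hypothetical variant of the reporter's command: same job, but with
# spark.io.compression.codec set to lzf (pure Java), which avoids
# loading libsnappyjava.so on the executors.
spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://MasterIP:7077 --deploy-mode client \
  --conf spark.io.compression.codec=lzf \
  /FS/lib/spark-examples-1.1.0-hadoop2.4.0.jar 10
```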


was (Author: hoelog):
In my case, I compiled Spark 1.1 on a system that has GLIBCXX_3.4.9 and then ran Spark on the cluster. Some executors failed while others succeeded. I think the problem is that the servers hosting the failed executors do not have GLIBCXX_3.4.9. Of course it should fail, because the assembled Spark jar contains a snappy native library compiled against GLIBCXX_3.4.9!

I checked the pre-built Spark for Hadoop 2.4.0, and its bundled snappy library also requires GLIBCXX_3.4.9 or higher.

{panel}
$ ldd native/Linux/amd64/libsnappyjava.so
ldd: warning: you do not have execution permission for `native/Linux/amd64/libsnappyjava.so'
        linux-vdso.so.1 =>  (0x00007fff2bfff000)
        libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00007f3c2c896000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f3c2c612000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f3c2c27e000)
        /lib64/ld-linux-x86-64.so.2 (0x00000034c4a00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f3c2c068000)
{panel}

So you need to check your libstdc++ version.

> 14/11/12 09:39:00 WARN TaskSetManager: Lost task 5.0 in stage 0.0 (TID 5, HYD-RNDNW-VFRCO-RCORE2): java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
> --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4354
>                 URL: https://issues.apache.org/jira/browse/SPARK-4354
>             Project: Spark
>          Issue Type: Question
>          Components: Examples
>    Affects Versions: 1.1.0
>         Environment: Linux
>            Reporter: Shyam
>              Labels: newbie
>         Attachments: client-exception.txt
>
>
> Prebuilt Spark for Hadoop 2.4 installed on four Red Hat Linux machines,
> standalone cluster mode:
> Machine 1(Master)
> Machine 2(Worker node 1)
> Machine 3(Worker node 2)
> Machine 4(Client for executing spark examples)
> I ran the command below on Machine 4 and got the exception mentioned in the summary of this issue:
> sh spark-submit  --class org.apache.spark.examples.SparkPi --jars /FS/lib/spark-assembly-1.1.0-hadoop2.4.0.jar  --master spark://MasterIP:7077 --deploy-mode client /FS/lib/spark-examples-1.1.0-hadoop2.4.0.jar 10
> java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org