Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/03/04 18:00:00 UTC

[jira] [Assigned] (SPARK-26995) Running Spark in Docker image with Alpine Linux 3.9.0 throws errors when using snappy

     [ https://issues.apache.org/jira/browse/SPARK-26995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin reassigned SPARK-26995:
--------------------------------------

    Assignee: Luca Canali

> Running Spark in Docker image with Alpine Linux 3.9.0 throws errors when using snappy
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-26995
>                 URL: https://issues.apache.org/jira/browse/SPARK-26995
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.3.0, 2.4.0
>            Reporter: Luca Canali
>            Assignee: Luca Canali
>            Priority: Minor
>
> Running Spark in a Docker image based on Alpine Linux 3.9.0 throws errors when using snappy.  
> The issue can be reproduced for example as follows: `Seq(1,2).toDF("id").write.format("parquet").save("DELETEME1")`  
> The key part of the error stack is as follows: `Caused by: java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.7-2b4872f1-7c41-4b84-bda1-dbcb8dd0ce4c-libsnappyjava.so: Error loading shared library ld-linux-x86-64.so.2: No such file or directory (needed by /tmp/snappy-1.1.7-2b4872f1-7c41-4b84-bda1-dbcb8dd0ce4c-libsnappyjava.so)`  
> The error appears to occur because libsnappyjava.so needs ld-linux-x86-64.so.2 and looks for it in /lib, while in Alpine Linux 3.9.0 with libc6-compat version 1.1.20-r3, ld-linux-x86-64.so.2 is located in /lib64.
> Note: this issue is not present with Alpine Linux 3.8 and libc6-compat version 1.1.19-r10.
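Given the description above, one plausible workaround (an assumption, not the fix adopted in the issue) is to symlink the loader from /lib64 into /lib, where libsnappyjava.so expects it. The sketch below demonstrates the symlink against a scratch directory rather than the real root filesystem; in an Alpine-based Dockerfile the equivalent would be a single `RUN ln -s ...` line.

```shell
#!/bin/sh
# Sketch: libsnappyjava.so looks for ld-linux-x86-64.so.2 in /lib, but
# libc6-compat 1.1.20-r3 on Alpine 3.9.0 installs it under /lib64.
# A symlink can bridge the gap. Demonstrated in a temp dir for safety.
root=$(mktemp -d)
mkdir -p "$root/lib64" "$root/lib"
touch "$root/lib64/ld-linux-x86-64.so.2"   # stand-in for the real loader
ln -s /lib64/ld-linux-x86-64.so.2 "$root/lib/ld-linux-x86-64.so.2"
# Dockerfile equivalent (hypothetical):
#   RUN apk add --no-cache libc6-compat && \
#       ln -s /lib64/ld-linux-x86-64.so.2 /lib/ld-linux-x86-64.so.2
ls -l "$root/lib/ld-linux-x86-64.so.2"
```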



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org