Posted to issues@spark.apache.org by "Lars Francke (Jira)" <ji...@apache.org> on 2019/12/29 23:16:00 UTC

[jira] [Commented] (SPARK-30196) Bump lz4-java version to 1.7.0

    [ https://issues.apache.org/jira/browse/SPARK-30196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17005066#comment-17005066 ] 

Lars Francke commented on SPARK-30196:
--------------------------------------

FYI: This seems to have broken Spark 3 on Mac OS for me due to
{code:java}
dyld: lazy symbol binding failed: Symbol not found: ____chkstk_darwin
  Referenced from: /private/var/folders/1v/ckh8py712_n_5r628_16w0l40000gn/T/liblz4-java-820584040681098780.dylib (which was built for Mac OS X 10.15)
  Expected in: /usr/lib/libSystem.B.dylib
dyld: Symbol not found: ____chkstk_darwin
  Referenced from: /private/var/folders/1v/ckh8py712_n_5r628_16w0l40000gn/T/liblz4-java-820584040681098780.dylib (which was built for Mac OS X 10.15)
  Expected in: /usr/lib/libSystem.B.dylib
 {code}
 

I did a bit of googling but I'm not sure what's going on. Reverting to lz4-java 1.6.0 works for me. I'm on macOS 10.13. Any hints are appreciated.

If the lz4-java native library really only works on macOS 10.15, that would be unfortunate, but I can't quite believe that.
Has anyone tried Spark 3 Preview 2 on a Mac running 10.15, 10.14 or 10.13?
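
For what it's worth, here is a minimal sketch (outside Spark) that forces lz4-java to load and exercise its bundled native library, which is how I isolated it from Spark itself. It only uses the public lz4-java API (LZ4Factory, LZ4Compressor); the Lz4NativeCheck class name is just something I made up for the example. With 1.7.0 on the classpath on 10.13 I'd expect this to die with the same dyld message, while it runs cleanly against 1.6.0.
{code:java}
import net.jpountz.lz4.LZ4Compressor;
import net.jpountz.lz4.LZ4Factory;

import java.nio.charset.StandardCharsets;

// Hypothetical class name, used only for this standalone reproducer.
public class Lz4NativeCheck {
  public static void main(String[] args) {
    // nativeInstance() forces the JNI-backed implementation, which extracts
    // the bundled liblz4-java dylib into the temp directory and loads it.
    LZ4Factory factory = LZ4Factory.nativeInstance();

    byte[] input = "hello lz4-java native binding".getBytes(StandardCharsets.UTF_8);
    LZ4Compressor compressor = factory.fastCompressor();

    // The first compress() call is where lazy symbol binding kicks in; with
    // the 1.7.0 dylib on macOS 10.13 this is where dyld aborts the process
    // in my case, so a Java try/catch does not help here.
    byte[] compressed = compressor.compress(input);
    System.out.println("native LZ4 OK, compressed " + input.length
        + " bytes to " + compressed.length + " bytes");
  }
}
{code}
Running it once with the 1.7.0 jar on the classpath and once with 1.6.0 should show whether the regression is in the bundled dylib rather than in Spark.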

> Bump lz4-java version to 1.7.0
> ------------------------------
>
>                 Key: SPARK-30196
>                 URL: https://issues.apache.org/jira/browse/SPARK-30196
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Takeshi Yamamuro
>            Assignee: Takeshi Yamamuro
>            Priority: Major
>             Fix For: 3.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org