Posted to issues@spark.apache.org by "Yang Jie (Jira)" <ji...@apache.org> on 2021/07/05 12:06:00 UTC

[jira] [Comment Edited] (SPARK-36019) Cannot run leveldb related UTs on Mac OS of M1 architecture

    [ https://issues.apache.org/jira/browse/SPARK-36019?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17374781#comment-17374781 ] 

Yang Jie edited comment on SPARK-36019 at 7/5/21, 12:05 PM:
------------------------------------------------------------

cc [~dongjoon]

It seems that the leveldb-jni module has not been maintained for a long time. Is it still possible for us to develop and test Spark on M1? Or should we add configuration on M1 to skip the corresponding UTs?
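If we go the skip route, a minimal sketch could look like the following (the helper name and placement are my own for illustration, not existing Spark code):

```java
// Hypothetical helper (not in Spark's code base): detect Apple Silicon so the
// leveldb suites can be skipped with a JUnit assumption instead of failing
// with UnsatisfiedLinkError.
public class NativeLibSupport {

  /** True when the given os.name/os.arch pair identifies an Apple Silicon (M1) Mac. */
  public static boolean isAppleSilicon(String osName, String osArch) {
    String os = osName.toLowerCase();
    String arch = osArch.toLowerCase();
    // The JVM may report the architecture as either "aarch64" or "arm64".
    return os.contains("mac") && (arch.contains("aarch64") || arch.contains("arm64"));
  }

  /** Convenience overload that reads the current JVM's system properties. */
  public static boolean isAppleSilicon() {
    return isAppleSilicon(System.getProperty("os.name"), System.getProperty("os.arch"));
  }
}
```

Each affected suite's setup method could then call `org.junit.Assume.assumeFalse("leveldbjni has no M1 native library", NativeLibSupport.isAppleSilicon())`, so on M1 the UTs are reported as skipped rather than as errors.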

> Cannot run leveldb related UTs on Mac OS of M1 architecture
> -----------------------------------------------------------
>
>                 Key: SPARK-36019
>                 URL: https://issues.apache.org/jira/browse/SPARK-36019
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.3.0
>            Reporter: Yang Jie
>            Priority: Major
>
> When running leveldb-related UTs on macOS with the M1 architecture, some tests fail as follows:
> {code:java}
> [INFO] Running org.apache.spark.util.kvstore.LevelDBSuite
> [ERROR] Tests run: 10, Failures: 0, Errors: 10, Skipped: 0, Time elapsed: 0.18 s <<< FAILURE! - in org.apache.spark.util.kvstore.LevelDBSuite
> [ERROR] org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete  Time elapsed: 0.146 s  <<< ERROR!
> java.lang.UnsatisfiedLinkError: 
> Could not load library. Reasons: [no leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in java.library.path, no leveldbjni in java.library.path, /Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: dlopen(/Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8, 1): no suitable image found.  Did find:
> 	/Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: no matching architecture in universal wrapper
> 	/Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: no matching architecture in universal wrapper]
> 	at org.apache.spark.util.kvstore.LevelDBSuite.setup(LevelDBSuite.java:55)
> [ERROR] org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete  Time elapsed: 0 s  <<< ERROR!
> java.lang.NoClassDefFoundError: Could not initialize class org.fusesource.leveldbjni.JniDBFactory
> 	at org.apache.spark.util.kvstore.LevelDBSuite.setup(LevelDBSuite.java:55)
> ....
> [ERROR] Tests run: 105, Failures: 0, Errors: 48, Skipped: 0{code}
> There seems to be a lack of JNI native library support for the M1 (aarch64) architecture.


