Posted to dev@spark.apache.org by ranamitabh <ra...@gmail.com> on 2014/12/25 13:34:08 UTC

MapR distribution Spark throwing “no MapRClient in java.library.path” error

I am a big data developer, and for some time I have been using the Cloudera
distribution of Apache Spark. All of my code runs successfully on
Cloudera Spark, but when I try the MapR distribution of Spark I get the
following error.

===============================================
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:308)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:197)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:54)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Unknown Source)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1822)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2037)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2238)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2190)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2107)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:967)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:941)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:102)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:136)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:151)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:67)
    at word.count.WordCount.main(WordCount.java:28)
Caused by: java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path
    at java.lang.ClassLoader.loadLibrary(Unknown Source)
    at java.lang.Runtime.loadLibrary0(Unknown Source)
    at java.lang.System.loadLibrary(Unknown Source)
    at com.mapr.fs.shim.LibraryLoader.loadLibrary(LibraryLoader.java:41)
    ... 30 more
==========Unable to find library in jar due to exception. ==============
java.lang.RuntimeException: no native library is found for os.name=Windows and os.arch=x86_64
    at com.mapr.fs.ShimLoader.findNativeLibrary(ShimLoader.java:496)
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:318)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:197)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:54)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Unknown Source)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1822)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2037)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2238)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2190)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2107)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:967)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:941)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:102)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:136)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:151)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:67)
    at word.count.WordCount.main(WordCount.java:28)
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:308)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:197)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:54)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Unknown Source)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1822)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2037)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2238)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2190)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2107)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:967)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:941)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:102)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:136)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:151)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:67)
    at word.count.WordCount.main(WordCount.java:28)
Caused by: java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path
    at java.lang.ClassLoader.loadLibrary(Unknown Source)
    at java.lang.Runtime.loadLibrary0(Unknown Source)
    at java.lang.System.loadLibrary(Unknown Source)
    at com.mapr.fs.shim.LibraryLoader.loadLibrary(LibraryLoader.java:41)
    ... 30 more
Exception in thread "main" java.lang.ExceptionInInitializerError
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:214)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:54)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Unknown Source)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1822)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2037)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2238)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2190)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2107)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:967)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:941)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:102)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:136)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:151)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:67)
    at word.count.WordCount.main(WordCount.java:28)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:308)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:197)
    ... 24 more
Caused by: java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path
    at java.lang.ClassLoader.loadLibrary(Unknown Source)
    at java.lang.Runtime.loadLibrary0(Unknown Source)
    at java.lang.System.loadLibrary(Unknown Source)
    at com.mapr.fs.shim.LibraryLoader.loadLibrary(LibraryLoader.java:41)
    ... 30 more

====================================
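The root cause in every trace above is the same `UnsatisfiedLinkError`, and the second trace explains it: the MapR shim ships no native library for os.name=Windows / os.arch=x86_64, so `System.loadLibrary("MapRClient")` cannot find a `MapRClient.dll` anywhere on `java.library.path`. The following is a minimal JDK-only sketch of that mechanism (it is not the MapR shim code; the class name `LoadLibraryDemo` is mine), which reproduces the same kind of error message on any machine without the MapR client installed:

```java
public class LoadLibraryDemo {
    public static void main(String[] args) {
        // System.loadLibrary resolves a bare library name against the
        // directories listed in the java.library.path system property,
        // looking for the platform-specific file name
        // (MapRClient.dll on Windows, libMapRClient.so on Linux).
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            System.loadLibrary("MapRClient");
            System.out.println("MapRClient loaded");
        } catch (UnsatisfiedLinkError e) {
            // Without the MapR native client on java.library.path this
            // prints a message beginning "no MapRClient in java.library.path",
            // exactly as in the traces above.
            System.out.println(e.getMessage());
        }
    }
}
```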

I have written a simple word count program. *When I remove the MapR Spark
jar and replace it with the Cloudera Spark jar, the same code runs
perfectly.* Kindly help.



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/MapR-distribution-Spark-throwing-no-MapRClient-in-java-library-path-error-tp9922.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org