Posted to issues@spark.apache.org by "Fi (JIRA)" <ji...@apache.org> on 2015/05/22 07:02:17 UTC
[jira] [Created] (SPARK-7819) Isolated Hive Client Loader appears to cause Native Library libMapRClient.4.0.2-mapr.so already loaded in another classloader error
Fi created SPARK-7819:
-------------------------
Summary: Isolated Hive Client Loader appears to cause Native Library libMapRClient.4.0.2-mapr.so already loaded in another classloader error
Key: SPARK-7819
URL: https://issues.apache.org/jira/browse/SPARK-7819
Project: Spark
Issue Type: Bug
Affects Versions: 1.4.1
Reporter: Fi
Attachments: stacktrace.txt, test.py
In reference to the pull request:
https://github.com/apache/spark/pull/5876
I have been running the Spark 1.3 branch for some time with no major hiccups, and recently switched to the Spark 1.4 branch.
I build my Spark distribution with the following command:
make-distribution.sh --tgz --skip-java-test --with-tachyon -Phive -Phive-0.13.1 -Pmapr4 -Pspark-ganglia-lgpl -Pkinesis-asl -Phive-thriftserver
When running a Python script containing a series of smoke tests I use to validate the build, I encountered an error under the following conditions (sketched in code below):
* start a SparkContext
* start a HiveContext
* run any Hive query
* stop the SparkContext
* start a second SparkContext
* run any Hive query
*** ERROR
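For illustration, here is a minimal sketch of those steps (the app names and the query are placeholders; the attached test.py is the actual reproduction I run):

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import HiveContext

    # First context: creating a HiveContext and running a query works fine.
    sc = SparkContext(conf=SparkConf().setAppName("hive-smoke-1"))
    hc = HiveContext(sc)
    hc.sql("SHOW TABLES").collect()
    sc.stop()

    # Second context in the same JVM: the first Hive query now fails with
    # "Native Library libMapRClient.4.0.2-mapr.so already loaded in another classloader".
    sc2 = SparkContext(conf=SparkConf().setAppName("hive-smoke-2"))
    hc2 = HiveContext(sc2)
    hc2.sql("SHOW TABLES").collect()   # <-- error raised here
    sc2.stop()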
From what I can tell, the Isolated Class Loader is hitting a MapR class that loads its native library (presumably in a static initializer).
Unfortunately, the JVM only allows a native library to be loaded by one classloader at a time, so the second load fails.
I would have thought that shutting down the SparkContext would clear out any vestiges from the JVM, so I'm surprised that this is even a problem.
Note: all other smoke tests we are running pass fine.
I will attach the stacktrace and a python script reproducing the issue (at least for my environment and build).