Posted to issues@spark.apache.org by "Jukka Vaaramaki (JIRA)" <ji...@apache.org> on 2016/08/26 12:23:21 UTC

[jira] [Commented] (SPARK-17255) Spark queries inside Futures occasionally fail due to missing class definitions

    [ https://issues.apache.org/jira/browse/SPARK-17255?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15438863#comment-15438863 ] 

Jukka Vaaramaki commented on SPARK-17255:
-----------------------------------------

Pull request https://github.com/apache/spark/pull/14831

> Spark queries inside Futures occasionally fail due to missing class definitions
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-17255
>                 URL: https://issues.apache.org/jira/browse/SPARK-17255
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0
>         Environment: OSX El Capitan, Scala 2.11.8, JDK 1.8
>            Reporter: Jukka Vaaramaki
>
> Wrapping Spark queries inside Futures occasionally causes them to run in threads that have a very limited context class loader. The class loader returned by Thread.currentThread.getContextClassLoader can be minimal regardless of any classpath definitions, missing all of the Spark libraries. As a result, a ClassNotFoundException or ScalaReflectionException is thrown while building or executing the Spark query; sometimes an IllegalArgumentException is thrown instead because none of the compression codecs can be found.
> An easy fix is to replace the Thread.currentThread.getContextClassLoader calls with getClass.getClassLoader.
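The failure mode described above can be reproduced without Spark. The sketch below is illustrative only (the object name, the executor setup, and the probe class `scala.Option` are assumptions, not taken from the issue or the pull request): a pool thread whose context class loader has been stripped cannot resolve application classes via `Thread.currentThread.getContextClassLoader`, while `getClass.getClassLoader` still resolves them.

```scala
import java.util.concurrent.{Executors, ThreadFactory}
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object ContextLoaderDemo {
  // Try to resolve a class on the application classpath via the given loader.
  // A null loader delegates to the bootstrap loader, which cannot see it.
  def lookup(loader: ClassLoader): String =
    try { Class.forName("scala.Option", false, loader); "found" }
    catch { case _: ClassNotFoundException => "missing" }

  def run(): (String, String) = {
    // Simulate an ExecutionContext whose threads carry a minimal context
    // class loader, as reported for Futures in the issue.
    val pool = Executors.newFixedThreadPool(1, new ThreadFactory {
      def newThread(r: Runnable): Thread = {
        val t = new Thread(r)
        t.setContextClassLoader(null) // bootstrap-only context loader
        t
      }
    })
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)

    // Failing pattern: depends on the current thread's context class loader.
    val viaContext = Future(lookup(Thread.currentThread.getContextClassLoader))
    // Proposed fix: use the loader that loaded this class instead.
    val viaClass = Future(lookup(getClass.getClassLoader))

    val result = (Await.result(viaContext, 10.seconds),
                  Await.result(viaClass, 10.seconds))
    pool.shutdown()
    result
  }

  def main(args: Array[String]): Unit = println(run())
}
```

Running it prints `(missing,found)`: the context-loader lookup fails on the stripped thread while the class-loader lookup succeeds, which is the substitution the fix makes.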



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org