Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/10/14 03:30:05 UTC

[jira] [Commented] (SPARK-11093) ChildFirstURLClassLoader#getResources should return all found resources, not just those in the child classloader

    [ https://issues.apache.org/jira/browse/SPARK-11093?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14956099#comment-14956099 ] 

Apache Spark commented on SPARK-11093:
--------------------------------------

User 'alewando' has created a pull request for this issue:
https://github.com/apache/spark/pull/9106

> ChildFirstURLClassLoader#getResources should return all found resources, not just those in the child classloader
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-11093
>                 URL: https://issues.apache.org/jira/browse/SPARK-11093
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Adam Lewandowski
>
> Currently, when using a child-first classloader (spark.{driver|executor}.userClassPathFirst = true), the getResources method does not return any matching resources from the parent classloader if the child classloader contains any. This is not child-first behavior; it is child-only, and it is inconsistent with how the default parent-first classloaders in the JDK work (all found resources are returned from both classloaders). It is also inconsistent with how child-first classloaders work in other environments (servlet containers, for example). 
> ChildFirstURLClassLoader#getResources() should return resources found by both the child and the parent classloaders, placing those found by the child classloader first (see the sketch below). 
> For reference, the specific use case where I encountered this problem was running Spark on AWS EMR in a child-first arrangement (due to Guava version conflicts). Akka's configuration file (reference.conf) was available in the parent classloader but was not visible to the Typesafe Config library, which uses ClassLoader.getResources() on the thread's context classloader to find it. This resulted in a fatal error from the Config library: "com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'".
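A minimal sketch of the merged lookup described above, written in Scala. This is an illustration only, not Spark's actual ChildFirstURLClassLoader or the change in the linked pull request; the class name and constructor here are hypothetical, while URLClassLoader#findResources and ClassLoader#getResources are standard JDK calls.

    import java.net.{URL, URLClassLoader}
    import java.util.Enumeration

    import scala.collection.JavaConverters._

    // Hypothetical child-first loader; only the resource lookup is shown.
    // Passing null as the JDK parent keeps findResources limited to this
    // loader's own URLs, so the merge below controls the ordering.
    class ChildFirstResourceLoader(urls: Array[URL], parent: ClassLoader)
      extends URLClassLoader(urls, null) {

      override def getResources(name: String): Enumeration[URL] = {
        // Matches on the child's own classpath come first (child-first)...
        val fromChild = findResources(name).asScala
        // ...followed by everything the parent can see, so no match is dropped.
        val fromParent = parent.getResources(name).asScala
        (fromChild ++ fromParent).asJavaEnumeration
      }
    }

With a merge like this, a lookup for reference.conf through the thread's context classloader would include the copy visible to the parent classloader instead of silently dropping it, which is the failure described above.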



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org