Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/09/27 21:56:04 UTC

[jira] [Commented] (SPARK-10846) Stray META-INF in directory spark-shell is launched from causes problems

    [ https://issues.apache.org/jira/browse/SPARK-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14909872#comment-14909872 ] 

Sean Owen commented on SPARK-10846:
-----------------------------------

It's a standard Java mechanism, yes. It would have to be on the classpath to do anything, though. I don't think your cwd is on the classpath, or shouldn't be. Can you check whether that's somehow true? Then the question is how to get rid of it. Barring that, yeah, it's just how the services discovery mechanism works in the JVM, and it's outside of Spark.
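The services discovery mechanism in question can be reproduced in isolation. The sketch below (hypothetical class and provider names, not taken from this thread) plants a META-INF/services file that names a nonexistent provider class, puts that directory on a classloader's path, and shows that iterating the ServiceLoader fails with the same kind of ServiceConfigurationError reported above:

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ServiceConfigurationError;
import java.util.ServiceLoader;

public class StrayMetaInfDemo {
    // Stand-in service interface; the actual report involved
    // org.apache.hadoop.fs.FileSystem.
    public interface Api {}

    static String run() throws IOException {
        // Simulate a stray META-INF left behind by some other build.
        Path dir = Files.createTempDirectory("stray");
        Path services = dir.resolve("META-INF").resolve("services");
        Files.createDirectories(services);
        // The services file names a provider class that is not on the
        // classpath, like the S3FileSystem entry in the report.
        Files.write(services.resolve(Api.class.getName()),
                    "com.example.MissingProvider".getBytes());

        // If that directory ends up on the classpath, ServiceLoader
        // discovers the file and fails when it tries to load the
        // missing provider class.
        try (URLClassLoader cl =
                 new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            for (Api unused : ServiceLoader.load(Api.class, cl)) {
                // Iteration is what triggers provider loading.
            }
            return "no error";
        } catch (ServiceConfigurationError e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(run());
    }
}
```

The point of the sketch is that nothing here is Spark-specific: any directory on the classpath containing META-INF/services entries participates in provider discovery.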

> Stray META-INF in directory spark-shell is launched from causes problems
> ------------------------------------------------------------------------
>
>                 Key: SPARK-10846
>                 URL: https://issues.apache.org/jira/browse/SPARK-10846
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>            Reporter: Ryan Williams
>            Priority: Minor
>
> I observed some perplexing errors while running {{$SPARK_HOME/bin/spark-shell}} yesterday (with {{$SPARK_HOME}} pointing at a clean 1.5.0 install):
> {code}
> java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3.S3FileSystem not found
> {code}
> while initializing {{HiveContext}}; full example output is [here|https://gist.github.com/ryan-williams/34210ad640687113e5c3#file-1-5-0-failure].
> The issue was that a stray {{META-INF}} directory from some other project I'd built months ago was sitting in the directory that I'd run {{spark-shell}} from (*not* in my {{$SPARK_HOME}}, just in the directory I happened to be in when I ran {{$SPARK_HOME/bin/spark-shell}}). 
> That {{META-INF}} had a {{services/org.apache.hadoop.fs.FileSystem}} file specifying some provider classes ({{S3FileSystem}} in the example above) that were unsurprisingly not resolvable by Spark.
> I'm not sure if this is purely my fault for attempting to run Spark from a directory with another project's config files lying around, but I find it somewhat surprising that, given a {{$SPARK_HOME}} pointing to a clean Spark install, {{$SPARK_HOME/bin/spark-shell}} picks up detritus from the {{cwd}} it is called from, so I wanted to at least document it here.
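One way to answer the "is the cwd somehow on the classpath?" question from the comment above is to ask a classloader directly for the stray resource. This is a hypothetical helper, not something from the thread; the resource path follows the services file named in the report:

```java
import java.net.URL;

public class CwdOnClasspathCheck {
    // Report whether the stray services file from the report is visible
    // to this class's classloader. A non-null URL means some classpath
    // entry (possibly the working directory) is exposing it.
    static String check() {
        URL res = CwdOnClasspathCheck.class.getClassLoader()
            .getResource("META-INF/services/org.apache.hadoop.fs.FileSystem");
        return res == null
            ? "stray services file not visible on the classpath"
            : "stray services file visible at: " + res;
    }

    public static void main(String[] args) {
        System.out.println(check());
    }
}
```

Running a snippet like this from the same directory used to launch {{spark-shell}} (with the same classpath) would show whether the stray {{META-INF}} is actually reachable by provider discovery.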



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org