Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/06 06:45:40 UTC

[jira] [Created] (SPARK-12666) spark-shell --packages cannot load artifacts which are publishLocal'd by SBT

Josh Rosen created SPARK-12666:
----------------------------------

             Summary: spark-shell --packages cannot load artifacts which are publishLocal'd by SBT
                 Key: SPARK-12666
                 URL: https://issues.apache.org/jira/browse/SPARK-12666
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 1.6.0, 1.5.1
            Reporter: Josh Rosen


Symptom:

I cloned the latest master of {{spark-redshift}}, then used {{sbt publishLocal}} to publish it to my Ivy cache. When I tried running {{./bin/spark-shell --packages com.databricks:spark-redshift_2.10:0.5.3-SNAPSHOT}} to load this dependency into {{spark-shell}}, I received the following cryptic error:

{code}
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: configuration not found in com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: 'default'. It was required from org.apache.spark#spark-submit-parent;1.0 default]
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1009)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
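For reference, the repro steps can be sketched as follows (the clone URL and snapshot version match my local setup and may differ for others; {{/path/to/spark}} is a placeholder):

{code}
# Publish spark-redshift to the local Ivy cache (~/.ivy2/local)
git clone https://github.com/databricks/spark-redshift.git
cd spark-redshift
sbt publishLocal

# From a Spark checkout, try to load the locally published artifact
cd /path/to/spark
./bin/spark-shell --packages com.databricks:spark-redshift_2.10:0.5.3-SNAPSHOT
{code}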

I think the problem here is that Spark declares its dependency on the spark-redshift artifact using the {{default}} Ivy configuration. Ivy only synthesizes a {{default}} configuration for artifacts that define no configurations of their own. For Maven artifacts this works out fine, since the synthesized {{default}} configuration maps to Maven's regular JAR dependency, but Ivy artifacts that explicitly define their own configurations (as SBT's {{publishLocal}} does) may not contain a configuration named {{default}}, in which case resolution fails with the error above.
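For illustration, an {{ivy.xml}} published by SBT typically declares explicit configurations like the following (a hand-written sketch, not the actual spark-redshift descriptor), none of which is named {{default}}:

{code:xml}
<ivy-module version="2.0">
  <info organisation="com.databricks" module="spark-redshift_2.10" revision="0.5.3-SNAPSHOT"/>
  <configurations>
    <!-- SBT publishes Maven-style scopes as explicit Ivy configurations -->
    <conf name="compile" visibility="public"/>
    <conf name="runtime" visibility="public" extends="compile"/>
    <conf name="test" visibility="private" extends="runtime"/>
    <conf name="provided" visibility="public"/>
    <!-- no configuration named "default", so a dependency asking for
         'default' fails with "configuration not found" -->
  </configurations>
  ...
</ivy-module>
{code}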

I spent a bit of time playing around with the SparkSubmit code to see if I could fix this but wasn't able to completely resolve the issue.
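One direction that might work (an untested sketch based on my reading of {{SparkSubmitUtils.addDependenciesToIvy}}): instead of mapping the configuration straight across, use Ivy's fallback mapping syntax so resolution falls back to {{runtime}} when the target module has no {{default}} configuration:

{code}
// Sketch only -- roughly where SparkSubmitUtils builds dependency descriptors:
val ri = ModuleRevisionId.newInstance(mvn.groupId, mvn.artifactId, mvn.version)
val dd = new DefaultDependencyDescriptor(ri, /* force = */ false, /* changing = */ false)
// Instead of dd.addDependencyConfiguration(ivyConfName, ivyConfName),
// use Ivy's fallback syntax: try 'default', else fall back to 'runtime',
// which SBT-published descriptors do define.
dd.addDependencyConfiguration(ivyConfName, "default(runtime)")
md.addDependency(dd)
{code}

I haven't verified that this resolves the snapshot correctly in all cases, which is why I didn't get further with it.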

/cc [~brkyvz] (ping me offline and I can walk you through the repro in person, if you'd like)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
