Posted to user@spark.apache.org by kamatsuoka <ke...@gmail.com> on 2014/01/15 23:29:27 UTC

libraryDependencies configuration is different for sbt assembly vs sbt run

When I run "sbt assembly", I use the "provided" configuration in the
build.sbt library dependency, to avoid conflicts in the fat jar: 

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.1-incubating" % "provided"

But if I want to do "sbt run", I have to remove the "provided"; otherwise
it doesn't find the Spark classes.

Is there a way to set up my build.sbt so that it does the right thing in
both cases, without monkeying with my build.sbt each time?


Re: libraryDependencies configuration is different for sbt assembly vs sbt run

Posted by kamatsuoka <ke...@gmail.com>.
It turns out there's a way to make this work! Here's a project/Build.scala
adapted from Eugene Yokota's answer on StackOverflow
(http://stackoverflow.com/questions/18838944/sbt-how-can-i-add-provided-dependencies-back-to-run-test-tasks-classpath).
I have most of my settings in my build.sbt, so this is a minimalist
version of Eugene's answer.
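
A minimal sketch along the lines of the linked answer, assuming sbt 0.13.x
(the object and project names below are placeholders): the trick is to
rewire "run" to use the Compile classpath, which still includes "provided"
dependencies, rather than the Runtime classpath, which excludes them.

  import sbt._
  import Keys._

  object SparkAppBuild extends Build {
    lazy val root = Project(id = "root", base = file(".")).settings(
      // By default, `run` uses the Runtime classpath, which drops
      // "provided" dependencies. Point it at the Compile classpath
      // instead, so `sbt run` can still see the Spark classes.
      run in Compile <<= Defaults.runTask(
        fullClasspath in Compile,
        mainClass in (Compile, run),
        runner in (Compile, run)
      )
    )
  }

With this, "sbt run" resolves the Spark classes from the compile classpath,
while "sbt assembly" still leaves the "provided" dependency out of the fat
jar.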