Posted to user@spark.apache.org by Pierre B <pi...@realimpactanalytics.com> on 2014/06/01 11:34:09 UTC

Using sbt-pack with Spark 1.0.0

Hi all!

We've been using the sbt-pack sbt plugin
(https://github.com/xerial/sbt-pack) to build our standalone Spark
application for a while now, and until Spark 1.0.0 it worked nicely.

For those who don't know the sbt-pack plugin: it basically copies all the
dependency JARs from your local ivy/maven cache to your target folder
(in target/pack/lib), and creates launch scripts (in target/pack/bin) for
your application, notably putting all these jars on the classpath.
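
For reference, our setup is basically the standard sbt-pack one; a minimal
sketch looks like this (the plugin version and exact settings keys depend on
the sbt-pack release, and the main class is just a placeholder):

    // project/plugins.sbt -- add the sbt-pack plugin (version is illustrative)
    addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.5.1")

    // build.sbt -- bring in the pack settings and declare a launch script,
    // mapping a script name to a (placeholder) main class
    packSettings

    packMain := Map("my-app" -> "com.example.Main")

Running "sbt pack" then produces the target/pack/lib and target/pack/bin
layout described above.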

Now, since Spark 1.0.0 was released, we have been hitting a weird problem:
running our project with "sbt run" works fine, but running our app with the
launch scripts generated by sbt-pack fails.

After a (quite painful) investigation, it turns out that some JARs are NOT
copied from the local ivy2 cache to the lib folder. I noticed that all the
missing jars contain "shaded" in their file name (though not all jars with
such a name are missing).
One of the missing JARs is explicitly declared in the Spark build definition
(SparkBuild.scala, line 350): mesos-0.18.1-shaded-protobuf.jar.

This file is clearly present in my local ivy cache, but is not copied by
sbt-pack.

Is there an obvious reason for this?

I don't know much about the shading mechanism, so maybe I'm missing something
here?


Any help would be appreciated!

Cheers

Pierre




Re: Using sbt-pack with Spark 1.0.0

Posted by lbustelo <gi...@bustelos.com>.
Are there any workarounds for this? Seems to be a dead end so far.





Re: Using sbt-pack with Spark 1.0.0

Posted by Pierre Borckmans <pi...@realimpactanalytics.com>.
You're right, Patrick!

Just had a chat with the sbt-pack creator, and indeed dependencies with classifiers are ignored, to avoid problems with a dirty cache...

It should be fixed in the next version of the plugin.

Cheers

Pierre 

Message sent from a mobile device - excuse typos and abbreviations 

> On 1 Jun 2014, at 20:04, Patrick Wendell <pw...@gmail.com> wrote:
> 
> https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L350
> 
>> On Sun, Jun 1, 2014 at 11:03 AM, Patrick Wendell <pw...@gmail.com> wrote:
>> One potential issue here is that Mesos is now using classifiers to
>> publish its jars. It might be that sbt-pack has trouble with
>> dependencies that are published using classifiers. I'm pretty sure
>> Mesos is the only dependency in Spark that is using classifiers, so
>> that's why I mention it.

Re: Using sbt-pack with Spark 1.0.0

Posted by Patrick Wendell <pw...@gmail.com>.
https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L350

On Sun, Jun 1, 2014 at 11:03 AM, Patrick Wendell <pw...@gmail.com> wrote:
> One potential issue here is that Mesos is now using classifiers to
> publish its jars. It might be that sbt-pack has trouble with
> dependencies that are published using classifiers. I'm pretty sure
> Mesos is the only dependency in Spark that is using classifiers, so
> that's why I mention it.

Re: Using sbt-pack with Spark 1.0.0

Posted by Patrick Wendell <pw...@gmail.com>.
One potential issue here is that Mesos is now using classifiers to
publish its jars. It might be that sbt-pack has trouble with
dependencies that are published using classifiers. I'm pretty sure
Mesos is the only dependency in Spark that is using classifiers, so
that's why I mention it.
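
To make the classifier point concrete, here is a rough sketch (not copied
verbatim from SparkBuild.scala) of how such a dependency is declared in sbt;
the classifier becomes part of the artifact file name, so tooling that only
expects <name>-<version>.jar can overlook it:

    // A dependency published with a classifier (illustrative sketch).
    // The resolved artifact is named mesos-0.18.1-shaded-protobuf.jar
    // rather than mesos-0.18.1.jar, which matches the jar reported
    // missing from target/pack/lib.
    libraryDependencies +=
      "org.apache.mesos" % "mesos" % "0.18.1" classifier "shaded-protobuf"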
