Posted to user@spark.apache.org by Sidney Feiner <si...@startapp.com> on 2017/02/01 07:23:58 UTC

RE: Jars directory in Spark 2.0

Is this done on purpose? Because it really makes it hard to deploy applications. Is there a reason they didn't shade the jars they use to begin with?

Sidney Feiner   /  SW Developer
M: +972.528197720  /  Skype: sidney.feiner.startapp


From: Koert Kuipers [mailto:koert@tresata.com]
Sent: Tuesday, January 31, 2017 7:26 PM
To: Sidney Feiner <si...@startapp.com>
Cc: user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0

You basically have to keep your versions of dependencies in line with Spark's, or shade your own dependencies.

You cannot just replace the jars in Spark's jars folder. If you want to update them, you have to build Spark yourself with the updated dependencies and confirm it compiles, passes tests, etc.
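That shading of your own dependencies can be sketched with the sbt-assembly plugin (assuming an sbt build; the package names below are only illustrative, not something Spark requires):

```scala
// build.sbt fragment -- a minimal sketch assuming the sbt-assembly plugin.
// Renames your copy of protobuf so it cannot clash with the one Spark ships.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "my_shaded.protobuf.@1").inAll
)
```

After `sbt assembly`, your relocated classes live under `my_shaded.protobuf` inside the fat jar, so the copy Spark ships on the driver and executors is never shadowed.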

On Tue, Jan 31, 2017 at 3:40 AM, Sidney Feiner <si...@startapp.com> wrote:
Hey,
While migrating from Spark 1.6 to 2.x, I've had many issues with the jars that come preloaded with Spark in the "jars/" directory, and I had to shade most of my packages.
Can I replace the jars in this folder with more up-to-date versions? Are those jars used for anything internal in Spark, which would mean I can't blindly replace them?

Thanks ☺



RE: Jars directory in Spark 2.0

Posted by Sidney Feiner <si...@startapp.com>.
Ok, good to know ☺
Shading every Spark app it is, then…
Thanks!


From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
Sent: Wednesday, February 1, 2017 7:41 PM
To: Sidney Feiner <si...@startapp.com>
Cc: Koert Kuipers <ko...@tresata.com>; user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0

Spark has never shaded dependencies (in the sense of renaming the classes), with a couple of exceptions (Guava and Jetty). So that behavior is nothing new. Spark's dependencies themselves have a lot of other dependencies, so doing that would have limited benefits anyway.

On Tue, Jan 31, 2017 at 11:23 PM, Sidney Feiner <si...@startapp.com>> wrote:
Is this done on purpose? Because it really makes it hard to deploy applications. Is there a reason they didn't shade the jars they use to begin with?

Sidney Feiner   /  SW Developer
M: +972.528197720<tel:+972%2052-819-7720>  /  Skype: sidney.feiner.startapp

[StartApp]<http://www.startapp.com/>

From: Koert Kuipers [mailto:koert@tresata.com<ma...@tresata.com>]
Sent: Tuesday, January 31, 2017 7:26 PM
To: Sidney Feiner <si...@startapp.com>>
Cc: user@spark.apache.org<ma...@spark.apache.org>
Subject: Re: Jars directory in Spark 2.0

you basically have to keep your versions of dependencies in line with sparks or shade your own dependencies.

you cannot just replace the jars in sparks jars folder. if you wan to update them you have to build spark yourself with updated dependencies and confirm it compiles, passes tests etc.

On Tue, Jan 31, 2017 at 3:40 AM, Sidney Feiner <si...@startapp.com>> wrote:
Hey,
While migrating to Spark 2.X from 1.6, I've had many issues with jars that come preloaded with Spark in the "jars/" directory and I had to shade most of my packages.
Can I replace the jars in this folder to more up to date versions? Are those jar used for anything internal in Spark which means I can't blindly replace them?

Thanks ☺


Sidney Feiner   /  SW Developer
M: +972.528197720<tel:+972%2052-819-7720>  /  Skype: sidney.feiner.startapp

[StartApp]<http://www.startapp.com/>

 <http://www.startapp.com/press/#events_press>



--
Marcelo

Re: Jars directory in Spark 2.0

Posted by Marcelo Vanzin <va...@cloudera.com>.
Spark has never shaded dependencies (in the sense of renaming the classes),
with a couple of exceptions (Guava and Jetty). So that behavior is nothing
new. Spark's dependencies themselves have a lot of other dependencies, so
doing that would have limited benefits anyway.
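Since shading here means renaming classes, a quick way to see which (possibly relocated) copy of a class actually wins on your classpath is a lookup like the following. This is a plain-Scala sketch for debugging version conflicts, not a Spark API:

```scala
// Diagnostic sketch: report which jar (or JDK image) a class is loaded from.
// Useful when a jar bundled with Spark conflicts with one of your own.
object WhichJar {
  def locate(className: String): String = {
    val resource = className.replace('.', '/') + ".class"
    val url = getClass.getClassLoader.getResource(resource)
    if (url == null) s"$className: not found" else s"$className: $url"
  }

  def main(args: Array[String]): Unit = {
    // java.lang.String always resolves (to a jar: or jrt: URL, depending on JDK).
    println(locate("java.lang.String"))
  }
}
```

Running `locate` on a class you suspect is duplicated (e.g. a Jackson or Guava class) tells you whose copy the JVM picked up first.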




-- 
Marcelo