Posted to user@spark.apache.org by "Mendelson, Assaf" <As...@rsa.com> on 2017/02/27 10:57:08 UTC

handling dependency conflicts with spark

Hi,

I have a project that uses Jackson 2.8.5, while Spark appears to depend on Jackson 2.6.5.
I am building with Maven.

My original solution to the problem has been to set the Spark dependencies to the "provided" scope and use the Maven shade plugin to relocate Jackson in my build.
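
For reference, the relocation setup I mean looks roughly like this (the plugin version and the relocated package prefix are placeholders, not my exact values):

```xml
<!-- Sketch: relocate Jackson so my 2.8.5 copy cannot clash with Spark's 2.6.5. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version> <!-- placeholder version -->
  <executions>
    <execution>
      <!-- shade binds to the package phase, i.e. AFTER unit tests run -->
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <shadedPattern>myproject.shaded.com.fasterxml.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```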

The problem is that when I run the Maven tests, the shading has not yet occurred (shade runs in the package phase, after test), so I get the conflict.

I thought of creating an uber jar of my dependencies and shading that; however, since my main dependency uses Spring heavily, doing so prevents the Spring configuration from being created.
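
(If the Spring breakage comes from shade overwriting the per-jar `META-INF/spring.handlers` and `META-INF/spring.schemas` resource files when merging jars — I have not confirmed this is my case — the shade plugin's AppendingTransformer is supposed to concatenate them instead:)

```xml
<!-- Sketch, assuming the problem is clobbered Spring resource files -->
<transformers>
  <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
    <resource>META-INF/spring.handlers</resource>
  </transformer>
  <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
    <resource>META-INF/spring.schemas</resource>
  </transformer>
</transformers>
```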

I was hoping someone has a better idea.
Thanks,
                Assaf.