Posted to dev@spark.apache.org by Jack Kolokasis <ko...@ics.forth.gr> on 2018/11/20 21:21:31 UTC

Maven

Hello,

    is there any way to use my locally built custom Spark as a dependency
while I am using Maven to compile my applications?

Thanks for your reply,
--Iacovos

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Maven

Posted by Sean Owen <sr...@gmail.com>.
Sure: if you publish the Spark artifacts to a local repository (even one
on your local file system) as com.foo:spark-core_2.12, etc., just depend
on those artifacts instead of the org.apache ones.
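For illustration, one way to follow this suggestion (the commands below are a
sketch; the groupId com.foo and version 3.0.0-custom are hypothetical examples,
not coordinates the custom build necessarily uses): first install the modified
Spark build into the local Maven repository, then reference those artifacts
from the application's pom.xml.

```
# From the root of the modified Spark source tree: install the built
# artifacts into the local Maven repository (~/.m2/repository),
# skipping tests to speed up the build.
./build/mvn -DskipTests clean install
```

Then in the application's pom.xml, depend on the locally installed
coordinates rather than the official org.apache.spark ones:

```
<!-- Example only: substitute whatever groupId/artifactId/version
     your custom build actually publishes. -->
<dependency>
  <groupId>com.foo</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.0.0-custom</version>
</dependency>
```

Using a distinct groupId or version string keeps the custom artifacts from
being confused with official releases pulled from Maven Central.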
