Posted to dev@spark.apache.org by Ismaël Mejía <ie...@gmail.com> on 2016/07/15 12:29:18 UTC

spark-packages with maven

Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user.

https://github.com/databricks/sbt-spark-package

One more question: is there a formal specification or documentation of what
you need to include in a spark-package (any special file, manifest, etc.)?
I have not found any doc on the website.

Thanks,
Ismael

Re: spark-packages with maven

Posted by Jakob Odersky <ja...@odersky.com>.
Luciano,
AFAIK the spark-package tool also makes it easy to upload packages to the
spark-packages website. You are of course free to include any Maven
coordinate in the --packages parameter.

--jakob

On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía <ie...@gmail.com> wrote:

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: spark-packages with maven

Posted by Ismaël Mejía <ie...@gmail.com>.
Thanks for the info, Burak. I will check the repo you mention. Do you know
concretely what 'magic' spark-packages need, or whether there is any
document with info about it?

On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende <lu...@gmail.com>
wrote:


Re: spark-packages with maven

Posted by Luciano Resende <lu...@gmail.com>.
On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski <ja...@japila.pl> wrote:


I was under the impression that spark-packages was more of a place for one
to list/advertise their extensions, but when you do spark-submit with
--packages, Spark uses Maven to resolve your package, and as long as
resolution succeeds, it will use it (e.g. you can run mvn clean install for
your local packages, and use --packages with a Spark server running on that
same machine).
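As a sketch of that local workflow (the com.example coordinates, class name,
and jar name below are placeholders, not anything from this thread):

```shell
# Install the package jar into the local Maven repository (~/.m2/repository).
mvn clean install

# --packages takes Maven coordinates (groupId:artifactId:version); Spark
# resolves them via Ivy, checking the local Maven repository among others,
# so a locally installed artifact can be picked up without a remote deploy.
spark-submit \
  --packages com.example:my-spark-package:0.1.0 \
  --class com.example.MyApp \
  target/my-app-0.1.0.jar
```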

From sbt, I think you can just use publishTo and define a local repository,
something like

publishTo := Some("Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository")

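A slightly fuller build.sbt sketch of that idea (the organization/name/version
coordinates are placeholders; Path.userHome comes from sbt's own API):

```scala
// build.sbt -- placeholder coordinates; adjust to your own package.
organization := "com.example"
name := "my-spark-package"
version := "0.1.0"

// Emit Maven-style metadata (a POM) rather than an Ivy descriptor, and
// publish into the local Maven repository so that
// spark-submit --packages com.example:my-spark-package:0.1.0 can resolve it.
publishMavenStyle := true
publishTo := Some("Local Maven Repository" at
  "file://" + Path.userHome.absolutePath + "/.m2/repository")
```

If I recall correctly, newer sbt versions (0.13+) also ship a built-in
publishM2 task that targets ~/.m2/repository without this configuration;
worth verifying for your sbt version.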


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/

Re: spark-packages with maven

Posted by Burak Yavuz <br...@gmail.com>.
Hi Ismael and Jacek,

If you use Maven for building your applications, you may use the
spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar with Maven first, and then does all the
extra magic that Spark Packages requires.

Please contact me directly if you have any issues.

Best,
Burak

On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski <ja...@japila.pl> wrote:


Re: spark-packages with maven

Posted by Jacek Laskowski <ja...@japila.pl>.
+1000

Thanks, Ismael, for bringing this up! I meant to send it earlier too, since
I've been struggling with an sbt-based Scala project for a Spark package
myself this week and haven't yet figured out how to do local publishing.

If such a guide existed for Maven, I could easily use it for sbt too :-)

Ping me, Ismael, if you don't hear back from the group, so I feel invited
to dig into the plugin's sources.

Best,
Jacek
