Posted to user@spark.apache.org by Martin Somers <so...@gmail.com> on 2016/07/26 12:54:38 UTC

sbt build under scala

Just wondering,

What is the correct way of building a Spark job using Scala - are there
any changes coming with Spark v2?

I've been following this post:

http://www.infoobjects.com/spark-submit-with-sbt/

Then again, I've mainly been using Docker locally - what is a decent container
for submitting these jobs locally?

I'm getting to the stage where I need to submit jobs remotely, and I'm
thinking about the best way of doing so.


tks

M

Re: sbt build under scala

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

I don't think there are any sbt-related changes in Spark 2.0. Just
different versions in libraryDependencies.
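For reference, a minimal build.sbt for a Spark 2.0 job might look like the sketch below (the project name and version numbers are illustrative, not from the thread - check the current releases):

```scala
// build.sbt - minimal sbt build for a Spark 2.0 job (versions are illustrative)
name := "my-spark-job"
version := "0.1.0"
scalaVersion := "2.11.8"  // Spark 2.0 is built against Scala 2.11

libraryDependencies ++= Seq(
  // "provided" because the Spark runtime supplies these jars at submit time
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.0.0" % "provided"
)
```

Marking the Spark artifacts "provided" keeps them out of the assembled jar, which is what spark-submit expects.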

As to the article, I'm surprised it didn't mention sbt-assembly
[1] for building a single deployable fat jar, or sbt-native-packager [2],
which can create a Docker image.

[1] https://github.com/sbt/sbt-assembly
[2] http://www.scala-sbt.org/sbt-native-packager/
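A hedged sketch of the sbt-assembly route mentioned above: add the plugin to project/plugins.sbt, build the fat jar, and hand it to spark-submit (the plugin version, class name, jar name, and master URL below are illustrative assumptions, not from the thread):

```scala
// project/plugins.sbt - enable sbt-assembly (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

```shell
# Build the fat jar, then submit it to a remote standalone master
# (class name, jar path, and master URL are placeholders)
sbt assembly
spark-submit \
  --class com.example.MyJob \
  --master spark://master-host:7077 \
  target/scala-2.11/my-spark-job-assembly-0.1.0.jar
```

The same assembled jar also works for local testing with --master local[*] before you point it at the remote cluster.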

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org