Posted to user@spark.apache.org by joshuata <jo...@gmail.com> on 2016/07/19 18:25:08 UTC

Re: how to setup the development environment of spark with IntelliJ on ubuntu

I have found that the easiest way to set up a development environment is to use the
databricks sbt-spark-package plugin
<https://github.com/databricks/sbt-spark-package> (assuming you are using
Scala + sbt). You simply add the plugin to your <project>/project/plugins.sbt
file and set sparkVersion in your build.sbt; the plugin then pulls in the Spark
dependencies needed to build your application.
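
As a rough sketch (the plugin, Scala, and Spark version numbers below are just
placeholders from around this era, and "my-spark-app" is a made-up project name;
check the plugin's README for the current coordinates and setting names),
project/plugins.sbt would look something like:

    // resolver for the Spark Packages repository hosting the plugin
    resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
    addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

and build.sbt then only needs the plugin's settings:

    name := "my-spark-app"            // hypothetical project name
    scalaVersion := "2.11.8"          // placeholder Scala version
    sparkVersion := "2.0.0"           // plugin resolves the matching spark-core for you
    sparkComponents ++= Seq("sql")    // optional extra Spark modules, e.g. spark-sql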

It also provides an sbt console command that sets up a local Spark REPL to
prototype code against.
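
For example, a minimal session might look like the following (if your version of
the plugin does not pre-create a SparkContext in the console, you can make one by
hand, as shown here):

    $ sbt console
    scala> import org.apache.spark.{SparkConf, SparkContext}
    scala> val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("console"))
    scala> sc.parallelize(1 to 100).map(_ * 2).sum()
    res0: Double = 10100.0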



